A brief history of cryptography and codes

Matthew Jayes from the School of Business, Economics and Informatics reports on a recent conference on the history of mathematics, looking at the evolution of cryptography and the future of ground-breaking ideas.

On the sunny Saturday of 19 May 2018, London was a very special place to be – Chelsea FC defeated Manchester United FC in the FA Cup Final at Wembley; Saracens beat rivals Wasps in a high-octane encounter before home fans in Barnet to qualify for Rugby Union’s Aviva Premiership Final; and there was the matter of royal matrimony in Windsor. At Birkbeck, the Department of Economics, Mathematics and Statistics and the British Society for the History of Mathematics (BSHM) celebrated the 70th anniversary of Claude Shannon’s influential 1948 paper, ‘A Mathematical Theory of Communication’, at the annual History of Maths conference. The event featured six speakers, including Sir John Dermot Turing, nephew of Alan Turing, who took the audience on a journey from early cryptography to the present day, and into the future!

Dr Elizabeth Quaglia, Lecturer in Information Security at Royal Holloway, University of London, opened the presentations with a journey through classical cryptography and the art of secrecy. The word “cryptography” is derived from the Greek kryptos (hidden; secret) and graphein (to write). Quaglia suggested that secret writing is arguably as old as written language itself, and early forms are to be found on Babylonian tablets dating back to 2500 BC. Today, cryptography is a science that blends mathematics, statistics, computer science and engineering; however, things were not always so complex. Early examples of concealing information (steganography) were chronicled by Herodotus (c. 484–425 BC). In 499 BC, Histiaeus had the head of a trusted slave shaved, tattooed a message onto his scalp, and waited for the hair to grow back to conceal it. Demaratus removed the wax from a writing tablet, scored a message directly into the wooden surface, and then covered it with fresh wax. Steganography is still practised today – Oliver Stone’s 2016 film about Edward Snowden, for instance, depicts classified information being smuggled out hidden under a tile of a Rubik’s Cube.

Information has been hidden and smuggled, discovered or not, throughout the history of civilisations. Encryption, therefore, was developed to change information into something that could be understood only by the sender and the intended receiver. The role of the unintended receiver, or eavesdropper, has been to make sense of this code through decryption. Various systems have been used, from the Spartan scytale to Caesar’s Cipher, a substitution algorithm that shifts each letter three positions along the alphabet. For example, ‘BIRKBECK’ may become ‘ELUNEHFN’ shifting forward or ‘YFOHYBZH’ shifting backwards. You may already notice that the security of this type of cipher is rather weak: there are only 25 usable keys – the number of places one can shift along the alphabet – plus one defunct key (the zero shift, which would leave the plaintext unchanged). Given a relatively short period of time, the eavesdropper could try every key and break the code easily.
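A minimal sketch of the shift cipher in Python makes that weakness concrete (the function name is mine, for illustration):

```python
import string

ALPHABET = string.ascii_uppercase  # 'A'..'Z'

def caesar(text: str, shift: int) -> str:
    """Shift each letter of an upper-case message along the alphabet."""
    return "".join(ALPHABET[(ALPHABET.index(c) + shift) % 26] for c in text)

print(caesar("BIRKBECK", 3))   # ELUNEHFN (shift forward)
print(caesar("BIRKBECK", -3))  # YFOHYBZH (shift backward)

# The brute-force attack: only 25 usable keys, so simply try them all.
for key in range(1, 26):
    print(key, caesar("ELUNEHFN", -key))
```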

It was soon discovered that to make the trial of keys by an eavesdropper too time-consuming to be practical, it is crucial for the key space to be large. The general substitution cipher solved this problem: instead of a fixed shift, the key became an arbitrary rearrangement of the alphabet – imagine mapping the alphabet to another alphabet sequenced entirely at random – bringing the number of keys to 26! (factorial), or about 4 x 10^26. This method became ubiquitous and Quaglia provided a number of historic cases, including Mary, Queen of Scots’ exchange with her co-conspirators in the Babington Plot.
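A brief sketch of such a monoalphabetic substitution in Python (the variable names are mine; the key is drawn at random, so output will differ from run to run):

```python
import math
import random
import string

alphabet = string.ascii_uppercase

# The key is a random permutation of the 26 letters.
key = "".join(random.sample(alphabet, 26))
table = str.maketrans(alphabet, key)

print("BIRKBECK".translate(table))  # ciphertext depends on the random key

# The key space is 26! -- far too large to try every key by brute force.
print(math.factorial(26))  # 403291461126605635584000000, about 4 x 10^26
```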

As with much innovation and discovery, the decryption method for this technique began with academics. In this case, Koranic scholars – notably Al-Kindi – were using a technique to date texts by looking at the frequency with which words appeared within them (a practice still used, for example, in Google Books Ngram Viewer). Cryptanalysts (eavesdroppers) learned they could apply this method to the frequency of letters and words within a language – especially common letters and significant words such as the names of leaders or places (see, for example, the table of the frequency of letter use in the English language published by Cornell Maths).
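As a rough sketch, frequency analysis amounts to little more than counting – here in Python, with the comparison to typical English frequencies left as a comment:

```python
from collections import Counter

def letter_frequencies(ciphertext: str) -> list[tuple[str, float]]:
    """Relative frequency of each letter, most common first."""
    letters = [c for c in ciphertext.upper() if c.isalpha()]
    counts = Counter(letters)
    return [(letter, n / len(letters)) for letter, n in counts.most_common()]

# In English, 'E' (~12.7%), 'T' (~9.1%) and 'A' (~8.2%) dominate, so in a
# monoalphabetic cipher the most common ciphertext letter is a strong
# candidate for 'E', the next for 'T', and so on.
sample = "LIPPS ASVPH"  # 'HELLO WORLD' shifted by four places
print(letter_frequencies(sample))
```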

Quaglia highlighted that the Babington Plot was foiled through frequency analysis. This technique works well on monoalphabetic ciphers, even those with an enormous key space; so, the next step to deceive eavesdroppers was to introduce polyalphabetic ciphers, credited to Leon Battista Alberti in the 15th century. This method uses more than one cipher alphabet within a single message, switching between them as encryption proceeds to mask the linguistic patterns that frequency analysis exploits. Blaise de Vigenère added a keyword to this method: each letter of the keyword selects which alphabet encrypts the corresponding letter of the plaintext, cycling through the keyword for the length of the message, creating le chiffre indéchiffrable (later known as the Vigenère cipher).
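A minimal Python sketch of the Vigenère scheme (the function name is mine; upper-case letters only, with the keyword ‘KEY’ chosen purely for illustration):

```python
def vigenere(text: str, keyword: str, decrypt: bool = False) -> str:
    """Encrypt (or decrypt) an upper-case message with a repeating keyword."""
    out = []
    for i, c in enumerate(text):
        # Each keyword letter selects one of 26 shifted alphabets.
        shift = ord(keyword[i % len(keyword)]) - ord("A")
        if decrypt:
            shift = -shift
        out.append(chr((ord(c) - ord("A") + shift) % 26 + ord("A")))
    return "".join(out)

ciphertext = vigenere("BIRKBECK", "KEY")
print(ciphertext)                         # LMPUFCMO
print(vigenere(ciphertext, "KEY", True))  # BIRKBECK
```

Because the same plaintext letter can encrypt to different ciphertext letters depending on its position, simple single-letter frequency counts no longer reveal the key.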

The Vigenère cipher held strong for roughly three centuries until it was finally beaten by the Englishman Charles Babbage, although he kept the solution secret. Quaglia described how one might break this cipher: first identify the length of the keyword using the Kasiski method, then split the ciphertext into that many interleaved Caesar ciphers and apply the faithful frequency analysis to each to recover the keyword. Quaglia acknowledged there are many other ciphers to discuss and discover but, to keep to time, suggested that there are still many mysteries and unbroken secrets to uncover in history through the study of codes and ciphers.
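The Kasiski step rests on a simple observation: repeated fragments of plaintext encrypted under the same stretch of keyword produce repeated fragments of ciphertext, and the distances between those repeats must be multiples of the keyword length. A sketch in Python, using a contrived string rather than a genuine ciphertext:

```python
from collections import defaultdict
from functools import reduce
from math import gcd

def kasiski_distances(ciphertext: str, n: int = 3) -> list[int]:
    """Distances between repeated n-grams in the ciphertext."""
    positions = defaultdict(list)
    for i in range(len(ciphertext) - n + 1):
        positions[ciphertext[i:i + n]].append(i)
    distances = []
    for pos in positions.values():
        distances += [b - a for a, b in zip(pos, pos[1:])]
    return distances

demo = "ABCDQRSTABCDUVWXABCD"   # 'ABCD' repeats eight letters apart
dists = kasiski_distances(demo)
print(dists)                     # [8, 8, 8, 8]
print(reduce(gcd, dists))        # 8 -> the keyword length likely divides 8
```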

Klaus Schmeh accepted this baton and provided a thorough account of the types of challenges one might face when looking at cryptograms (defined as an encrypted text from the point of view of someone who wishes to break the code – the eavesdropper), especially those dating from around 1400 to 1970. These cryptograms are often still complex and require time and effort, especially as they pre-date the computer age and one can encounter difficulties with the quality of the source document itself. Think of a damaged or stained manuscript in an earlier form of modern language, or one using obsolete abbreviations. Schmeh clarified the concept of nomenclators – coding entire words rather than one letter at a time. He then described the complexity and impracticality of early codebooks, which were like dictionaries with a specific code symbol for each word. The audience was also given a robust overview of Enigma, with a key space of around 2^76 keys.
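One common back-of-the-envelope calculation for a figure of that order (assuming the military three-rotor Enigma: three rotors chosen from five, ring settings on two rotors, and ten plugboard leads – conventions vary between accounts) looks like this:

```python
import math

rotor_orders    = 5 * 4 * 3   # 3 rotors chosen from 5, in order: 60
rotor_positions = 26 ** 3     # starting positions of the rotors: 17,576
ring_settings   = 26 ** 2     # ring settings that matter (two rotors): 676

# Ten plugboard leads pair up 20 of the 26 letters:
plugboard = math.factorial(26) // (math.factorial(6) * math.factorial(10) * 2**10)

keys = rotor_orders * rotor_positions * ring_settings * plugboard
print(f"{keys:.2e}")    # ~1.07e+23 keys
print(math.log2(keys))  # ~76.5, i.e. roughly 2^76
```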

Schmeh provides many wonderful examples of historic ciphers and codes through his blog and has blogged about this event too (‘Why this crypto history conference in London was better than the royal wedding’). For cryptography enthusiasts, Schmeh highlighted the Kryptos artwork at CIA headquarters, the fourth and final section of which still requires breaking.

The last of the morning’s speakers was Sir John Dermot Turing, who discussed the combined effort required by codebreakers at Bletchley Park and beyond. Turing argued that the contributions of some linguists, mathematicians, and engineers working on cryptography during World War 2 have been sadly under-celebrated. Choosing to use his platform to rectify this, Turing provided a warm and thorough account of the codebreakers Knox, Tiltman, Welchman, and Clarke. Joan Clarke was a pioneering cryptanalyst – a role reserved for men at the time – and was asked to apply for the role of linguist, then deemed the most suitable equivalent role and pay grade available to women. Turing went on to highlight the work of Marian Rejewski, who remarkably reverse-engineered one of the most complex machines of the day without ever having seen one. Rejewski had been studying permutation group theory and dedicated himself to deducing the wiring of the Enigma machine’s rotors; once he had succeeded, this made it possible to build a machine (the bomba) to discover the daily settings in use by Enigma. This information was passed to the Allies in July 1939, prior to the German invasion of Poland and the declaration of war in September 1939.

Alan Turing was able to combine the knowledge generated by Rejewski and his Polish colleagues with Knox’s cribbing methods to build the prototype bombe machine, which took around ten and a half minutes to run through all 17,576 (26³) rotor positions of the Enigma set-up. While this was impressive, Sir John Dermot Turing suggested that the value of the intelligence resulting from Enigma decryptions was relatively low. Instead, it was the Lorenz cipher machine used by the Nazi German High Command that yielded the highest-value intelligence – the decryption of which was developed by Tiltman, Tutte and Flowers (for more, see Colossus by Jack Copeland). Turing used the rest of his presentation to discuss and dispel some myths surrounding Bletchley Park, and he answered audience questions, including one from a gentleman who had studied under Max Newman in Manchester.

At this point, I should admit that my personal journey in mathematics ended after a below-average grade in AS Level Decision Maths back in 2001. Thus, when Professor János Körner from Sapienza University of Rome delved into the history of Claude Shannon’s work on information theory, I feared the complexity would be beyond my grasp. It is a testament to Körner’s delivery and the diagrammatic nature of Shannon’s work – criticised at the time in peer review – that the concepts were accessible to a layperson like myself (for more, see Waldrop in MIT Technology Review, 2001). Körner hinted at the management of creativity at Bell Laboratories – Shannon was often found juggling while riding a unicycle – and the benefits of interdisciplinary research. As a communications professional, I also found the concept of noise in a channel rather interesting in relation to spoken interpersonal communication. However, for the stronger mathematicians in the audience, Körner embarked on a journey through the still-unsolved zero-error capacity problem, related work by Lovász, and the Berge conjecture on perfect graphs and its eventual proof by Chudnovsky, Robertson, Seymour and Thomas in 2006.

The political nature of cryptography and its history was discussed further by Professor Keith Martin, also of Royal Holloway, University of London, as he wove through the 20th century to contemporary issues and beyond. Focusing on the development of standards, first the Data Encryption Standard (DES) and later the Advanced Encryption Standard (AES), Martin provided a bold narrative of the conflict around regulation, from trade controls on cryptographic hardware to the modern dilemmas faced as cryptography has become more fluid as software. Citing Snowden as an interesting example, Martin argued that cryptography has become so ubiquitous, and global information systems so large and complex, that it is now incredibly difficult for individuals to grasp their security in its entirety. Today’s eavesdropper, therefore, is just as likely to be looking for points of vulnerability within the wider system as to be attempting decryption per se. Martin concluded that the ethics of cryptanalysis (who performs it and why) will continue to be relevant for years to come, as will cryptography in relation to the Cloud, the Internet of Things (including human-embedded technologies), and Quantum Computing. On this latter topic, Martin suggested that he was looking forward to discussing the cryptography community’s response at future history of mathematics conferences.

The speakers had provided rich context on the history of cryptography and codes, from the political to the technical, and discussed secrecy in its many related forms; so when Clifford Cocks CB, FRS delivered the final presentation on his discovery of Public Key Cryptography – made in secret at GCHQ and later arrived at independently by Diffie, Hellman, and Merkle – the audience was very well aware of just how important this breakthrough was (for more, see Levy in Wired).

The encryption methods of Cocks – and later Rivest, Shamir and Adleman (RSA) – are metaphorically similar to padlock-and-key security systems: anyone can snap the padlock shut, but only the holder of the key can open it. The padlock, though, is delivered through beautiful mathematics.
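A toy illustration of that mathematics in Python, using the widely-quoted textbook numbers (far too small to be secure, and purely for illustration):

```python
# Toy RSA with tiny numbers -- insecure, purely illustrative.
p, q = 61, 53                # two secret primes
n = p * q                    # 3233: the public modulus (the 'padlock')
phi = (p - 1) * (q - 1)      # 3120
e = 17                       # public exponent, coprime to phi
d = pow(e, -1, phi)          # 2753: private exponent, e*d = 1 (mod phi)

message = 65                       # any number smaller than n
ciphertext = pow(message, e, n)    # anyone can snap the padlock shut...
recovered = pow(ciphertext, d, n)  # ...only the key holder can open it
print(ciphertext, recovered)       # 2790 65
```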

What I enjoy most when hearing directly from those responsible for major breakthroughs is the humble sense of human accomplishment and pride they have in their work. Cocks was no exception to this, clearly proud of his accomplishments and the successes of his colleagues, while considering them within the broader context of human knowledge discovery (see also films such as Particle Fever and AlphaGo). This prompts us to ask: who do we celebrate, how and why? Studying the history of mathematics, science and scholarship, in general, affords the opportunity to learn about the individuals and groups who achieved great things, whether or not they were acknowledged at the time. It allows us to look at the social constructs and infrastructures in place at the time – for example during times of war – helping us to question the kind of environment we make available to encourage further discoveries.

For this reason, I would like to take this opportunity to thank the British Society for the History of Mathematics (BSHM) and the Department of Economics, Mathematics and Statistics at Birkbeck for organising this insightful and accessible conference. On behalf of the audience, I extend gratitude to the speakers and the organiser, Professor Sarah Hart.

To attend upcoming BSHM events, including ‘Mathematics in War and Peace’, Wednesday 24 October 2018, please visit the website for more information.

Birkbeck offers courses in mathematics and history.
