Claude Shannon's two papers, his
1948 paper on
information theory, and especially his
1949 paper on cryptography, laid the foundations of modern cryptography and gave the field a rigorous mathematical basis. His 1949 paper has been noted as having provided a "solid theoretical basis for cryptography and for cryptanalysis", and as having turned cryptography from an "art to a science". As a result of his contributions and work, he has been described as the "founding father of modern cryptography". Prior to the early 20th century, cryptography was mainly concerned with
linguistic and
lexicographic patterns. Since then cryptography has broadened in scope, and now makes extensive use of mathematical subdisciplines, including information theory,
computational complexity, statistics,
combinatorics,
abstract algebra,
number theory, and
finite mathematics. Cryptography is also a branch of engineering, but an unusual one since it deals with active, intelligent, and malevolent opposition; other kinds of engineering (e.g., civil or chemical engineering) need deal only with neutral natural forces. There is also active research examining the relationship between cryptographic problems and
quantum physics. Just as the development of digital computers and electronics helped in cryptanalysis, it made possible much more complex ciphers. Furthermore, computers allowed for the encryption of any kind of data representable in any binary format, unlike classical ciphers which only encrypted written language texts; this was new and significant. Computer use has thus supplanted linguistic cryptography, both for cipher design and cryptanalysis. Many computer ciphers can be characterized by their operation on
binary bit sequences (sometimes in groups or blocks), unlike classical and mechanical schemes, which generally manipulate traditional characters (i.e., letters and digits) directly. However, computers have also assisted cryptanalysis, which has compensated to some extent for increased cipher complexity. Nonetheless, good modern ciphers have stayed ahead of cryptanalysis; it is typically the case that use of a quality cipher is very efficient (i.e., fast and requiring few resources, such as memory or CPU capability), while breaking it requires an effort many orders of magnitude larger, and vastly larger than that required for any classical cipher, making cryptanalysis so inefficient and impractical as to be effectively impossible. Research into
post-quantum cryptography (PQC) has intensified because practical quantum computers would break widely deployed public-key systems such as RSA, Diffie–Hellman and ECC. A 2017 review in
Nature surveys the leading PQC families—lattice-based, code-based, multivariate-quadratic and hash-based schemes—and stresses that standardisation and deployment should proceed well before large-scale quantum machines become available.
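The hash-based family mentioned above can be illustrated with a toy Lamport one-time signature, one of the simplest hash-based schemes. This is a minimal sketch for illustration only, not a deployable design: real hash-based systems (e.g., SPHINCS+) add many layers on top, and a Lamport key must never sign more than one message.

```python
import hashlib
import secrets

def keygen():
    # Private key: two random 32-byte preimages for each of the 256 message-digest bits.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    # Public key: the hash of each preimage.
    pk = [(hashlib.sha256(a).digest(), hashlib.sha256(b).digest()) for a, b in sk]
    return sk, pk

def _bits(message):
    digest = hashlib.sha256(message).digest()
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(message, sk):
    # Reveal one preimage per digest bit; reusing the key leaks the other preimages.
    return [sk[i][b] for i, b in enumerate(_bits(message))]

def verify(message, sig, pk):
    # Hash each revealed preimage and compare against the public key.
    return all(hashlib.sha256(sig[i]).digest() == pk[i][b]
               for i, b in enumerate(_bits(message)))

sk, pk = keygen()
sig = sign(b"hello", sk)
```

Security rests only on the one-wayness of the hash function, which is why such schemes are considered plausible candidates against quantum adversaries.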
Symmetric-key cryptography Symmetric-key cryptography refers to encryption methods in which both the sender and receiver share the same key (or, less commonly, in which their keys are different, but related in an easily computable way). This was the only kind of encryption publicly known until June 1976. Symmetric key ciphers are implemented as either
block ciphers or
stream ciphers. A block cipher enciphers input in blocks of plaintext as opposed to individual characters, the input form used by a stream cipher. The
Data Encryption Standard (DES) and the
Advanced Encryption Standard (AES) are block cipher designs that have been designated
cryptography standards by the US government (though DES's designation was finally withdrawn after the AES was adopted). Despite its deprecation as an official standard, DES (especially its still-approved and much more secure
triple-DES variant) remains quite popular; it is used across a wide range of applications, from ATM encryption to
e-mail privacy and
secure remote access. Many other block ciphers have been designed and released, with considerable variation in quality. Many, even some designed by capable practitioners, have been thoroughly broken, such as
FEAL. Stream ciphers, in contrast to the 'block' type, create an arbitrarily long stream of key material, which is combined with the plaintext bit-by-bit or character-by-character, somewhat like the
one-time pad. In a stream cipher, the output stream is created based on a hidden internal state that changes as the cipher operates. That internal state is initially set up using the secret key material.
RC4 is a widely used stream cipher.
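The stream-cipher idea can be sketched as follows. This toy example (not RC4, and not secure for real use) derives a keystream from a hidden internal state consisting of a key, a nonce, and a block counter, then XORs it with the data; the hash-based keystream construction here is purely illustrative.

```python
import hashlib

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Internal state: key, nonce, and a counter that advances as the cipher operates.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_cipher(key: bytes, nonce: bytes, data: bytes) -> bytes:
    # XOR combines keystream with the data bit-by-bit; the same call decrypts.
    ks = keystream(key, nonce, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

ciphertext = xor_cipher(b"key", b"nonce", b"attack at dawn")
plaintext = xor_cipher(b"key", b"nonce", ciphertext)
```

As with the one-time pad, reusing the same key and nonce for two messages is catastrophic, since XORing the two ciphertexts cancels the keystream.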
Message authentication codes (MACs) are much like
cryptographic hash functions, except that a secret key can be used to authenticate the hash value upon receipt. A
hash function design competition was held to select a new U.S. national standard, to be called
SHA-3, by 2012. The competition ended on October 2, 2012, when NIST announced that
Keccak would be the new SHA-3 hash algorithm. Unlike block and stream ciphers that are invertible, cryptographic hash functions produce a hashed output that cannot be used to retrieve the original input data. Cryptographic hash functions are used to verify the authenticity of data retrieved from an untrusted source or to add a layer of security.
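Both ideas can be shown with Python's standard library: a plain SHA-3 digest is verified by recomputation, while a MAC additionally binds the digest to a secret key so that only key holders can produce a valid tag. The key and message values here are illustrative.

```python
import hashlib
import hmac

data = b"message from an untrusted source"

# A cryptographic hash is a deterministic fingerprint: anyone can recompute
# it to check integrity, but the original input cannot be recovered from it.
digest = hashlib.sha3_256(data).hexdigest()

# A MAC (here HMAC over SHA-3) also requires a shared secret key, so a
# matching tag authenticates the sender as well as the data.
key = b"shared secret"          # illustrative key; real keys should be random
tag = hmac.new(key, data, hashlib.sha3_256).digest()

# Verification recomputes the tag with the same key and compares in constant time.
valid = hmac.compare_digest(tag, hmac.new(key, data, hashlib.sha3_256).digest())
```

An attacker who alters `data` in transit cannot produce a matching tag without the key.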
Public-key cryptography Symmetric-key cryptosystems use the same key for encryption and decryption of a message, although a message or group of messages can have a different key than others. A significant disadvantage of symmetric ciphers is the
key management necessary to use them securely. Each distinct pair of communicating parties must, ideally, share a different key, and perhaps for each ciphertext exchanged as well. The number of keys required increases as the
square of the number of network members, which very quickly requires complex key management schemes to keep them all consistent and secret. In a groundbreaking 1976 paper, Whitfield Diffie and Martin Hellman proposed the notion of
public-key (also, more generally, called
asymmetric key) cryptography in which two different but mathematically related keys are used—a
public key and a
private key. A public key system is so constructed that calculation of one key (the 'private key') is computationally infeasible from the other (the 'public key'), even though they are necessarily related. Instead, both keys are generated secretly, as an interrelated pair. The historian
David Kahn described public-key cryptography as "the most revolutionary new concept in the field since polyalphabetic substitution emerged in the Renaissance". In public-key cryptosystems, the public key may be freely distributed, while its paired private key must remain secret. The
public key is used for encryption, while the
private or
secret key is used for decryption. While Diffie and Hellman could not find such a system, they showed that public-key cryptography was indeed possible by presenting the
Diffie–Hellman key exchange protocol, a solution that is now widely used in secure communications to allow two parties to secretly agree on a
shared encryption key. Diffie and Hellman's publication sparked widespread academic efforts in finding a practical public-key encryption system. This race was finally won in 1978 by
Ronald Rivest,
Adi Shamir, and
Len Adleman, whose solution has since become known as the
RSA algorithm. The
Diffie–Hellman and
RSA algorithms, in addition to being the first publicly known examples of high-quality public-key algorithms, have been among the most widely used. Other
asymmetric-key algorithms include the
Cramer–Shoup cryptosystem,
ElGamal encryption, and various
elliptic curve techniques. A document published in 1997 by the Government Communications Headquarters (
GCHQ), a British intelligence organization, revealed that cryptographers at GCHQ had anticipated several academic developments. Reportedly, around 1970,
James H. Ellis had conceived the principles of asymmetric key cryptography. In 1973,
Clifford Cocks invented a solution that was very similar in design rationale to RSA. In 1974,
Malcolm J. Williamson is claimed to have developed the Diffie–Hellman key exchange. Public-key cryptography is also used for implementing
digital signature schemes. A digital signature is reminiscent of an ordinary signature; they both have the characteristic of being easy for a user to produce, but difficult for anyone else to
forge. Digital signatures can also be permanently tied to the content of the message being signed; they cannot then be 'moved' from one document to another, for any attempt will be detectable. In digital signature schemes, there are two algorithms: one for
signing, in which a secret key is used to process the message (or a hash of the message, or both), and one for
verification, in which the matching public key is used with the message to check the validity of the signature. RSA and
DSA are two of the most popular digital signature schemes. Digital signatures are central to the operation of
public key infrastructures and many network security schemes (e.g.,
SSL/TLS, many
VPNs, etc.). Most
ciphers, apart from the one-time pad, can be broken with enough computational effort by
brute force attack, but the amount of effort needed may be
exponentially dependent on the key size, as compared to the effort needed to make use of the cipher. In such cases, effective security could be achieved if it is proven that the effort required (i.e., "work factor", in Shannon's terms) is beyond the ability of any adversary. This means it must be shown that no efficient method (as opposed to the time-consuming brute force method) can be found to break the cipher. Since no such proof has been found to date, the one-time-pad remains the only theoretically unbreakable cipher. Although well-implemented one-time-pad encryption cannot be broken, traffic analysis is still possible. There are a wide variety of cryptanalytic attacks, and they can be classified in any of several ways. A common distinction turns on what Eve (an attacker) knows and what capabilities are available. In a
ciphertext-only attack, Eve has access only to the ciphertext (good modern cryptosystems are usually effectively immune to ciphertext-only attacks). In a
known-plaintext attack, Eve has access to a ciphertext and its corresponding plaintext (or to many such pairs). In a
chosen-plaintext attack, Eve may choose a plaintext and learn its corresponding ciphertext (perhaps many times); an example is
gardening, used by the British during WWII. In a
chosen-ciphertext attack, Eve may be able to
choose ciphertexts and learn their corresponding plaintexts. Also important, often overwhelmingly so, are mistakes (generally in the design or use of one of the
protocols involved). Cryptanalysis of symmetric-key ciphers typically involves looking for attacks against the block ciphers or stream ciphers that are more efficient than any attack that could be mounted against a perfect cipher. For example, a simple brute force attack against DES requires one known plaintext and 2^55 decryptions, trying approximately half of the possible keys, to reach a point at which chances are better than even that the key sought will have been found. But this may not be enough assurance; a
linear cryptanalysis attack against DES requires 2^43 known plaintexts (with their corresponding ciphertexts) and approximately 2^43 DES operations. This is a considerable improvement over brute force attacks. Public-key algorithms are based on the computational difficulty of various problems. The most famous of these are the difficulty of
integer factorization of
semiprimes and the difficulty of calculating
discrete logarithms, both of which are not yet proven to be solvable in
polynomial time (
P) using only a classical
Turing-complete computer. Much public-key cryptanalysis concerns designing algorithms in
P that can solve these problems, or using other technologies, such as
quantum computers. For instance, the best-known algorithms for solving the
elliptic curve-based version of discrete logarithm are much more time-consuming than the best-known algorithms for factoring, at least for problems of more or less equivalent size. Thus, to achieve an equivalent strength of encryption, techniques that depend upon the difficulty of factoring large composite numbers, such as the RSA cryptosystem, require larger keys than elliptic curve techniques. For this reason, public-key cryptosystems based on elliptic curves have become increasingly popular since the mid-1990s. While pure cryptanalysis uses weaknesses in the algorithms themselves, other attacks on cryptosystems are based on actual use of the algorithms in real devices, and are called
side-channel attacks. If a cryptanalyst has access to, for example, the amount of time the device took to encrypt a number of plaintexts or report an error in a password or PIN character, they may be able to use a
timing attack to break a cipher that is otherwise resistant to analysis. An attacker might also study the pattern and length of messages to derive valuable information; this is known as
traffic analysis and can be quite useful to an alert adversary. Poor administration of a cryptosystem, such as permitting keys that are too short, will make any system vulnerable, regardless of other virtues.
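Timing attacks of the kind described above often exploit something as simple as an early-exit string comparison. The sketch below contrasts a naive comparison, whose running time reveals how long a prefix of the secret an attacker has guessed correctly, with the constant-time comparison Python provides; the secret value is illustrative.

```python
import hmac

def naive_equal(a: bytes, b: bytes) -> bool:
    # Returns at the FIRST mismatching byte, so the running time leaks
    # how many leading bytes of the guess were correct.
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:
            return False
    return True

secret = b"s3cret-tag"   # illustrative value, e.g. a MAC tag or password hash

# hmac.compare_digest examines every byte regardless of where inputs differ,
# so its running time does not depend on the position of the first mismatch.
ok = hmac.compare_digest(secret, b"s3cret-tag")
bad = hmac.compare_digest(secret, b"s3cret-taX")
```

Real timing attacks statistically average many measurements, so even sub-microsecond differences in a comparison loop can be exploited over a network.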
Social engineering and other attacks against humans (e.g., bribery,
extortion,
blackmail, espionage,
rubber-hose cryptanalysis or torture) are often employed because they can be far more cost-effective, and far quicker, than pure cryptanalysis.
Cryptographic primitives Much of the theoretical work in cryptography concerns
cryptographic primitives—algorithms with basic cryptographic properties—and their relationship to other cryptographic problems. More complicated cryptographic tools are then built from these basic primitives. These primitives provide fundamental properties, which are used to develop more complex tools called
cryptosystems or
cryptographic protocols, which guarantee one or more high-level security properties. Note, however, that the distinction between cryptographic
primitives and cryptosystems is quite arbitrary; for example, the RSA algorithm is sometimes considered a cryptosystem, and sometimes a primitive. Typical examples of cryptographic primitives include
pseudorandom functions,
one-way functions, etc.
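A classic candidate one-way function is modular exponentiation, which also underlies the discrete-logarithm problems discussed earlier. The forward direction is fast even at large sizes, while inverting it is believed hard; the generator and exponent values below are illustrative, and the tiny second group exists only to show why real parameters must be enormous.

```python
# Forward direction: modular exponentiation is cheap even for huge numbers.
p = 2**127 - 1      # a Mersenne prime (still a toy size by modern standards)
g = 3               # assumed generator, chosen for illustration
x = 91919191        # secret exponent
y = pow(g, x, p)    # easy to compute

# Inverse direction: recovering x from (g, y, p) is the discrete-logarithm
# problem, for which no polynomial-time classical algorithm is known.

# At toy sizes the inverse CAN still be brute-forced, which is exactly why
# deployed systems use groups with hundreds or thousands of bits:
q, h = 101, 2
target = pow(h, 57, q)
recovered = next(e for e in range(q) if pow(h, e, q) == target)
```

The asymmetry between the cheap forward computation and the (conjecturally) expensive inversion is what makes such a function usable as a primitive.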
Cryptosystems One or more cryptographic primitives are often used to develop a more complex algorithm, called a cryptographic system, or
cryptosystem. Cryptosystems (e.g.,
El-Gamal encryption) are designed to provide particular functionality (e.g., public key encryption) while guaranteeing certain security properties (e.g.,
chosen-plaintext attack (CPA) security in the
random oracle model). Cryptosystems use the properties of the underlying cryptographic primitives to support the system's security properties. As the distinction between primitives and cryptosystems is somewhat arbitrary, a sophisticated cryptosystem can be derived from a combination of several more primitive cryptosystems. In many cases, the cryptosystem's structure involves back and forth communication among two or more parties in space (e.g., between the sender of a secure message and its receiver) or across time (e.g., cryptographically protected
backup data). Such cryptosystems are sometimes called
cryptographic protocols. Some widely known cryptosystems include RSA,
Schnorr signature,
ElGamal encryption, and
Pretty Good Privacy (PGP). More complex cryptosystems include
electronic cash systems,
signcryption systems, etc. Some more 'theoretical' cryptosystems include
interactive proof systems (like
zero-knowledge proofs) and systems for
secret sharing.
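The secret-sharing idea can be sketched with the simplest possible construction: n-of-n XOR sharing, in which all shares are required for reconstruction and any proper subset is uniformly random and so reveals nothing. This is a toy illustration; practical threshold schemes (e.g., Shamir's) allow reconstruction from any k of n shares.

```python
import secrets

def xor_all(chunks):
    # XOR a list of equal-length byte strings together.
    out = bytes(len(chunks[0]))
    for c in chunks:
        out = bytes(a ^ b for a, b in zip(out, c))
    return out

def split(secret: bytes, n: int):
    # n-1 shares are uniformly random; the last share is chosen so that
    # the XOR of all n shares equals the secret.
    shares = [secrets.token_bytes(len(secret)) for _ in range(n - 1)]
    last = bytes(a ^ b for a, b in zip(secret, xor_all(shares)))
    return shares + [last]

def reconstruct(shares):
    return xor_all(shares)

shares = split(b"launch code", 3)
```

Because each individual share is indistinguishable from random bytes, the scheme is information-theoretically secure against any coalition missing even one share.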
Lightweight cryptography Lightweight cryptography (LWC) concerns cryptographic algorithms developed for strictly constrained environments. The growth of the
Internet of Things (IoT) has spurred research into the development of lightweight algorithms better suited for such environments. An IoT environment imposes strict constraints on power consumption and processing power while still requiring adequate security. Algorithms such as
Ascon and
SPECK are examples of the many LWC algorithms that have been developed to match the criteria for the
CAESAR Competition and the standard set by the
National Institute of Standards and Technology. == Applications ==