Encryption perspectives in a world of quantum computers

Feature articles | By eeNews Europe

Academia and industry are therefore intensely researching new cryptographic algorithms (usually public-key algorithms) that are expected to remain secure against attacks mounted with a quantum computer.

Various internet and industry standards use asymmetric cryptography based on RSA or elliptic curve cryptography (ECC) to protect data communication between smart cards, smartphones, computers, servers, and industrial control systems. With the RSA algorithm, for example, a public-key encryption (PKE) scheme can be realized that makes it possible to send an encrypted email (e.g., with PGP/GPG or S/MIME) to a recipient without first exchanging a symmetric key over a secured channel – the recipient's public key is sufficient to achieve confidentiality. Digital signatures, also based on RSA or ECC, are another application of asymmetric cryptography. They can be used to sign and verify data, where the public key is used to check the validity of a signature. If a digitally signed contract or long-term archive is modified after signing, even by a single bit, the signature check will fail.
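A toy sketch of why a single-bit change breaks a signature: signature schemes sign a cryptographic hash of the document, and flipping any one bit yields a completely different hash, so verification against the original value fails. A minimal Python illustration using SHA-256 (the contract text is, of course, made up):

```python
import hashlib

contract = b"Pay 100 EUR to Alice on 2030-01-01."  # hypothetical signed document
original_digest = hashlib.sha256(contract).digest()

# Flip a single bit of the document after "signing".
tampered = bytearray(contract)
tampered[0] ^= 0x01
tampered_digest = hashlib.sha256(bytes(tampered)).digest()

# The digests no longer match, so any signature over the original
# digest fails to verify for the tampered document.
print(original_digest == tampered_digest)  # prints: False
```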

Together, PKE and digital signatures are crucial in the Transport Layer Security (TLS) protocol, the backbone of secured communication on the Internet, used by browsers, smartphones, and increasingly also Internet of Things (IoT) devices.

However, since the seminal work of Peter Shor in 1994 it has been known that RSA and ECC are susceptible to attacks by quantum computers. A quantum computer powerful enough to run Shor's algorithm could factor the public modulus of the RSA cryptosystem in polynomial time, and the algorithm can also be adapted to break ECC-based public keys. Another quantum algorithm, proposed by Grover, can be used to accelerate brute-force attacks on symmetric block ciphers like AES. It does not lead to a complete break but roughly halves the bit-security level of symmetric algorithms. As a consequence, the key length of algorithms with a 128-bit key, and thus 128-bit security, has to be doubled (moving to AES-256).
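The effect of Grover's algorithm can be captured in one line of arithmetic: a brute-force search over 2^n keys takes roughly 2^(n/2) quantum evaluations, so the effective bit security of a symmetric key is halved. A small sketch:

```python
def post_quantum_symmetric_security(key_bits: int) -> int:
    """Grover's search needs ~2^(n/2) evaluations to find an n-bit key,
    so the effective security level against a quantum attacker is n/2."""
    return key_bits // 2

# AES-128 drops to ~64-bit quantum security; AES-256 retains ~128 bits,
# which is why doubling the key length restores the security margin.
print(post_quantum_symmetric_security(128))  # prints: 64
print(post_quantum_symmetric_security(256))  # prints: 128
```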


Post-Quantum versus Quantum Cryptography

Post-quantum cryptography (PQC) refers to new cryptographic algorithms executed on classical computers that are expected to remain secure against attackers using quantum computers. In contrast, quantum cryptography (QC), or more specifically quantum key distribution (QKD), uses properties of quantum mechanics to establish a secret key between two parties. This secret key can then be used to encrypt bulk data using classic symmetric ciphers like AES. The security of QKD is not based on a computational assumption but on physical principles: a measurement of a quantum state (e.g., of the spin of a photon) destroys the state, and quantum states cannot be copied. As a consequence, if a passive attacker performs a measurement to obtain the key, the attack can be detected by the receiver. However, QKD is not a universal replacement for RSA and ECC.

In contrast, post-quantum cryptosystems are executed on classical computers and have the same high-level behavior as currently available ciphers, so they can act as drop-in replacements. Currently, the main concern of researchers in PQC is asymmetric cryptography (i.e., replacing RSA and ECC), as the threat of Grover's algorithm to symmetric schemes can be mitigated by moving to already available 256-bit algorithms such as AES-256.

Pros and Cons of post-quantum and quantum cryptography.

To be protected against attacks by Shor's algorithm, PQC schemes rely on fundamentally different hardness assumptions that are expected to be intractable even for algorithms executed on quantum computers. However, there is no known way to ultimately prove that such hardness assumptions will hold and remain intractable for a quantum or even a classical computer. Five common hardness assumptions, or families of algorithms, are used to realize asymmetric post-quantum cryptography.

Hash-based signatures: Signature schemes whose security can be reduced to the hardness of breaking a symmetric hash function (e.g., SHA-256).

Code-based cryptography: Cryptosystems whose security is based on computationally hard problems in coding theory. However, to provide a sufficient level of post-quantum security, recommended parameter sets lead to public key sizes of roughly one megabyte.

Multivariate cryptography (MQ): Refers to cryptosystems whose security is based on the hardness of solving systems of multivariate polynomial equations. Most MQ schemes are signature schemes and usually feature very short signatures but very large public keys.

Lattice-based cryptography: A lattice is a regular structure of points and the security of lattice-based cryptosystems is related or reduced to the hardness of some mathematical problems in such lattices, e.g., finding short vectors.

Isogeny-based cryptography: Uses mappings (isogenies) between elliptic curves to design key exchange and digital signature schemes.
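The first of these families can be made concrete with a short sketch of a Lamport one-time signature, the classic building block behind hash-based schemes such as XMSS. Its security rests only on the one-wayness of the underlying hash (SHA-256 here). This is an illustration, not a standardized scheme, and each key pair must sign at most one message:

```python
import hashlib
import os

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def keygen():
    # Private key: 256 pairs of random 32-byte values (one pair per hash bit).
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
    # Public key: the hashes of all private values.
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def _bits(msg: bytes):
    digest = H(msg)
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(sk, msg: bytes):
    # Reveal one private value per bit of the message hash.
    # A Lamport key must be used for ONE message only.
    return [sk[i][b] for i, b in enumerate(_bits(msg))]

def verify(pk, msg: bytes, sig) -> bool:
    # Hashing each revealed value must reproduce the matching public entry.
    return all(H(s) == pk[i][b] for (i, b), s in zip(enumerate(_bits(msg)), sig))
```

Modifying even one bit of a signed message flips bits of its hash, so the revealed values no longer match the public key and verification fails.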


NewHope – a lattice-based key exchange scheme

The lattice-based NewHope key exchange scheme, developed by Alkim, Ducas, Pöppelmann, and Schwabe, is designed to replace (or complement) current key-exchange mechanisms based on Diffie-Hellman or elliptic curve Diffie-Hellman. It offers about 256 bits of security and is very fast (about as fast as or faster than ECC) with a ciphertext size of 2048 bytes. When implementing NewHope or other lattice-based cryptography, a clear difference is that many computations on small numbers (e.g., 14 bits) are required, while RSA and ECC need relatively few computations on very large numbers (e.g., 2048 bits).
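To illustrate the "many operations on small numbers" point: the core operation in NewHope-like lattice schemes is polynomial multiplication in the ring Z_q[x]/(x^n + 1), with NewHope using n = 1024 and the 14-bit modulus q = 12289. A naive schoolbook sketch (real implementations use the number-theoretic transform for speed):

```python
Q = 12289  # NewHope's 14-bit prime modulus

def poly_mul(a, b, n, q=Q):
    """Schoolbook multiplication in Z_q[x]/(x^n + 1): every coefficient
    operation involves only small (< 14-bit) numbers, unlike the
    multi-thousand-bit arithmetic of RSA."""
    res = [0] * n
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            if i + j < n:
                res[i + j] = (res[i + j] + ai * bj) % q
            else:
                # x^n = -1 in this ring, so wrapped terms are subtracted.
                res[(i + j) - n] = (res[(i + j) - n] - ai * bj) % q
    return res

# Tiny example with n = 4: x * x^3 = x^4 = -1 mod (x^4 + 1)
print(poly_mul([0, 1, 0, 0], [0, 0, 0, 1], n=4))  # prints: [12288, 0, 0, 0]
```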

This poses new challenges and renders current ECC and RSA accelerators largely useless, but it also provides opportunities for vectorized and parallel processing, e.g., using the Intel Advanced Vector Extensions (AVX) or the ARM NEON architecture extension often found in smartphones. In 2016 the NewHope algorithm was tested by Google in a public test version of its Chrome browser, and the NewHope authors were awarded the Internet Defense Prize 2016 by Facebook.

Infineon recently implemented NewHope on a contactless, power-constrained smart card. This shows that some PQC schemes are practical and able to replace RSA and ECC. Another approach is to augment the security of RSA and ECC by running new PQC schemes in parallel with them. However, more research is required to develop better schemes and to optimize implementations of existing schemes for energy consumption, low memory footprint, and high security.

Applications, outlook, and standardization

Currently, a large number of academic proposals for PQC exist, but no standards have been developed so far. Standards will allow implementation of the interoperable protocols required to establish secured communication across all types of devices, services, and cloud infrastructures. However, it may take several years before the first PQC standards appear, followed by a transition phase that will naturally also take some years. An exception are stateful hash-based signature schemes (e.g., XMSS or LMS) that are well understood and currently in a standardization process at the IETF.

In practice, PQC will most likely first be rolled out for high-value communications that require long-term confidentiality. Still, it is advisable to analyze long-lived infrastructures like electronic passports or long-term signed digital archives now, to identify possible threats and to plan for mitigations, or at least algorithm agility, in the future.

A post-quantum key exchange scheme might be introduced into the TLS protocol to obtain long-term confidentiality rather soon. This is possible because most devices executing TLS are rather powerful (e.g., smartphones or PCs) and currently have good update mechanisms. Technically, implementing PQC in TLS has already been shown to be possible with lattice-based schemes, as the result of Google's experiment with NewHope was positive.

For highly critical communication lines – governmental communication, telecommunication companies, datacenter connections, or enterprise intranets with long-term confidentiality requirements – it is expected that PQC will be deployed soon, likely even before standards exist, due to the risk of eavesdroppers storing the communication for later decryption. When both endpoints of a communication line are controlled by one party, the setup also becomes simpler because there are fewer interoperability and maintenance concerns. Additional immediate measures could be the use of pre-placed 256-bit symmetric keys and key rotation schemes at each communication endpoint for additional symmetric encryption of all the data.
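One simple way such a key rotation could be realized (a hypothetical sketch, not a description of any deployed product) is a one-way hash ratchet: each epoch's key is derived from the previous one, and the old key is then erased, so a compromise of the current key does not expose traffic from earlier epochs.

```python
import hashlib

def next_epoch_key(current_key: bytes) -> bytes:
    """Derive the next 256-bit symmetric key via a one-way ratchet.
    After rotation the old key is erased; SHA-256's one-wayness means
    past keys cannot be recomputed from the current one."""
    return hashlib.sha256(b"key-rotation" + current_key).digest()

# Hypothetical pre-placed 256-bit key shared by both endpoints;
# both sides rotate in lockstep at agreed intervals.
k0 = bytes(32)
k1 = next_epoch_key(k0)
k2 = next_epoch_key(k1)
```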

The increasing connectivity of cars via mobile networks enables many new services and business models. At the same time, it introduces security risks into the car's systems. Complex security architectures based on asymmetric cryptography are already being implemented in cars. Due to the long lifetime of cars in the field, the industry must find answers to the threat of quantum computers, or at least plan a response to their possible arrival. Thus the development of mechanisms for secured firmware updates, which can be used to roll out PQC algorithms in the future, is of utmost importance.

The increasing number of attacks on IoT devices shows the need for further security measures, including protection against future attacks by quantum computers. This holds especially true for devices with a longer lifetime, such as those used in industrial automation but also in smart home systems.

Infineon as a leading supplier of security ICs is committed to PQC and actively performs research in this field. To better respond to security threats that are yet to come, we continuously collaborate with the academic community, customers and partners. We have demonstrated the world’s first PQC implementation on a commercially available contactless security chip. And we push for future standards that can be executed efficiently and securely on small and embedded devices.


About the authors:

Dr. Thomas Pöppelmann is Security and Cryptography Engineer for Chip Card & Security at Infineon Technologies AG. Dr. Josef Haid is Head of Technical Marketing for Embedded Security Solutions at Infineon Technologies Austria AG.



1) NewHope implementation on a contactless security chip.

2) Pros and Cons of post-quantum and quantum cryptography.

