Fixing Cryptography Before It Is Broken

By Carmen Kempka, Head of Corporate Technology, WIBU-SYSTEMS AG

Ever since Peter Shor showed how a quantum computer could factorize large numbers exponentially faster than any known classical algorithm, it has been known that quantum computers could become a threat to most cryptographic algorithms currently in use. Today, we have cloud access to several functioning intermediate-scale quantum computers.

But how big is this threat? Which kinds of cryptographic algorithms are affected, and to what extent? When do we need to act, and most importantly, what can we do, and which obstacles are to be expected? This article intends to give decision-makers a brief overview of the current situation and some advice for possible next steps.

What is broken by quantum computers?

Quantum computers function inherently differently from the computers we work with every day. In some cases, a quantum computer can achieve what is termed an exponential speedup. This allows it to solve, in a matter of hours, certain hard mathematical problems for which the device you are reading this article on would need decades or centuries.

Unfortunately, today’s cybersecurity relies on the difficulty of exactly these mathematical problems. Affected crypto algorithms include, among others, the well-known encryption schemes RSA and ECIES, the digital signature algorithms DSA and ECDSA, and the Diffie-Hellman key agreement. In other words: Virtually every protocol in current use that involves a certificate with a public key of some kind is likely to be broken by a quantum computer in the future.

The attacker won’t even need to break into a system first. A user’s public key, contained for example in a public certificate, would be enough to compute a matching “secret” key, with which an attacker could decrypt any message for this user, forge arbitrary signatures, or falsely authenticate as the key’s owner. Using only the information in the root certificate of a Certificate Authority (CA), the attacker could even compute the CA’s secret key and create new valid certificates. To prevent this from happening, quantum-safe alternatives need to be in place before the threat becomes real.

What about symmetric cryptography?

Symmetric encryption schemes like AES and hash functions like the SHA family would not be fully broken by a quantum computer. While a quantum algorithm proposed by Lov Grover could find a secret key for an AES encryption or the pre-image of a hash value, it offers only a quadratic speedup over conventional computing, far less drastic than Shor's. To achieve roughly the same security as in the pre-quantum world, these algorithms need to be adapted so that the bit length of the secret key or hash value is doubled. In essence, this means an upgrade from AES-128 to AES-256 and from SHA-256 to SHA-384 or SHA-512. While this is good news, we cannot do entirely without asymmetric schemes, which were invented to solve the problem of securely exchanging the secret key needed for symmetric encryption, and to ensure integrity and authenticity.
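For the symmetric side, the upgrade is largely a matter of parameter choices. A minimal sketch in Python, using the standard hashlib module and the pyca/cryptography package, purely to illustrate the parameter change (not a statement about any particular product's implementation):

```python
# Illustrative sketch: quantum-conscious symmetric parameter choices.
import hashlib
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# AES-256 instead of AES-128: a 256-bit key retains roughly 128-bit
# security even against Grover's quadratic speedup.
key = AESGCM.generate_key(bit_length=256)
aesgcm = AESGCM(key)

nonce = os.urandom(12)  # 96-bit nonce, must be unique per message
ciphertext = aesgcm.encrypt(nonce, b"example message", None)
plaintext = aesgcm.decrypt(nonce, ciphertext, None)

# SHA-512 instead of SHA-256: doubling the digest length restores the
# pre-quantum security margin against Grover-accelerated preimage search.
digest = hashlib.sha512(plaintext).hexdigest()
```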

Where are we today?

Today’s quantum computers have reached the so-called NISQ stage (noisy intermediate-scale quantum). This means they have become too large to be simulated efficiently by conventional computers, and large enough to enable some basic error correction and perform some simple computations, but not large enough (in fact, far from large enough) to break our current cryptosystems. NISQ devices range from a capacity of 50 to a few hundred quantum bits (“qubits”). At the time of writing, IBM offers cloud access to a quantum computer with 127 qubits.

Luckily for the IT security world, quantum computers are very error-prone. The quantum circuit implementing Shor’s algorithm to break RSA-1024 or ECDSA/ECIES-256 would require around 2000 “logical” (already error-corrected) qubits. However, more than a million “physical” (actual) qubits would be necessary to reliably execute a circuit of this size. Moreover, even if we built a large enough quantum computer with today’s technology, such a circuit would have a runtime ranging from several hours to a hundred days, require rare resources (superconducting qubits, for example, need to be cooled down to almost absolute zero by a dilution refrigerator mixing helium-3 and helium-4 isotopes), and the result would be just one broken key. This doesn’t seem very profitable.

Nevertheless, technology in the quantum world keeps evolving, and expert groups like the National Institute of Standards and Technology (NIST) PQC group or the German BSI (Federal Office for Information Security) recommend having a plan B ready within the next 10-15 years.

Let’s talk about Plan B

Meanwhile, in the conventional computing world, something else has been evolving: post-quantum cryptography (PQC). Cryptographic algorithms that run on conventional computers yet withstand all currently known quantum attacks appeared soon after the invention of public-key cryptography. The well-known McEliece cryptosystem, for example, was published in 1978, three years before Richard Feynman came up with the first inkling of a quantum computer.

After the publication of Shor’s quantum algorithm, more and more approaches for quantum-safe cryptography were identified, along with the usual cat-and-mouse game of breaking and improving cryptosystems. Today, we have quite a variety of cryptosystems at our disposal, based on different approaches. This variety is important, since we don’t yet know what other conventional or quantum algorithms great minds like Shor might come up with in the future.

In 2016, NIST started the process for standardizing quantum-secure algorithms. It began with a call for candidates, followed by several rounds of candidate selection. At the time of writing, the third round is about to end, finalists have been named, and candidates for standardization will soon be selected. According to NIST (https://csrc.nist.gov/Projects/post-quantum-cryptography/workshops-and-timeline), the first standardization drafts are expected to be finished in 2024.

One goal in the standardization process is to choose a selection of algorithms based on different mathematical approaches. This means that, should one of the new schemes be broken, we have a plan B. And a plan C. It also means that we have a lot to learn.

The importance of crypto agility

The current situation poses a bit of a dilemma: on the one hand, we have well-known cryptographic algorithms like RSA and ECDSA, based on mathematical problems that have been studied for centuries, with one of the first known occurrences of the factoring problem appearing in Euclid’s “Elements” more than 2000 years ago. We know how to implement them. We know what to do to make them resistant against side-channel attacks. We know how to implement them efficiently. And we trust them. Unfortunately, once a cryptographically relevant quantum computer exists, all bets are off.

Post-quantum cryptography, on the other hand, is based on mathematical problems that have been studied intensively in recent years but are still comparatively young, and so have had far less time to earn our trust.

This situation has given rise to the notion of crypto agility. Cryptographers recommend designing or adapting software architectures so that any cryptographic algorithm can easily be replaced once it is broken. This holds for the migration from current crypto to quantum-safe approaches, as well as for substituting one quantum-safe cryptosystem with another, should one of the new schemes turn out to be less secure than we thought.
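What such agility can look like in code: a hypothetical sketch in Python, where application code depends only on an abstract signer interface and the concrete algorithm is selected by name from a registry. All class and registry names here are illustrative, not taken from any particular product:

```python
# Hypothetical crypto-agility pattern: callers depend on an abstract
# interface; the concrete algorithm is a configuration choice.
from abc import ABC, abstractmethod

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec


class Signer(ABC):
    @abstractmethod
    def sign(self, message: bytes) -> bytes: ...

    @abstractmethod
    def verify(self, message: bytes, signature: bytes) -> bool: ...


class EcdsaP256Signer(Signer):
    """Today's well-understood choice, wrapped behind the interface."""

    def __init__(self) -> None:
        self._key = ec.generate_private_key(ec.SECP256R1())

    def sign(self, message: bytes) -> bytes:
        return self._key.sign(message, ec.ECDSA(hashes.SHA256()))

    def verify(self, message: bytes, signature: bytes) -> bool:
        try:
            self._key.public_key().verify(
                signature, message, ec.ECDSA(hashes.SHA256())
            )
            return True
        except InvalidSignature:
            return False


class PqcSigner(Signer):
    """Placeholder: plug in a vetted post-quantum library once chosen."""

    def sign(self, message: bytes) -> bytes:
        raise NotImplementedError("select a standardized PQC scheme")

    def verify(self, message: bytes, signature: bytes) -> bool:
        raise NotImplementedError


# Replacing an algorithm becomes a configuration change, not a redesign.
SIGNERS = {"ecdsa-p256": EcdsaP256Signer, "pqc": PqcSigner}


def make_signer(algorithm_id: str) -> Signer:
    return SIGNERS[algorithm_id]()
```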

So, we just need to be flexible, right?

Given that the quantum-safe cryptosystems are based on different mathematical approaches, this is clearly easier said than done. There are things developers (and crypto experts, too) need to learn anew for each approach. This includes optimizations, possibilities for memory/performance trade-offs, side-channel-robust implementations, and questions like which data is safe to keep in memory.

Flexibility can also be impeded by technical limitations: the new cryptosystems differ strongly in key size and the runtime required for key generation, encryption, decryption, signature generation, or signature verification. On a system with less memory capacity or computing power, the choice of quantum-safe crypto algorithms that can reasonably be run might be very limited.

Often enough, we cannot or do not want to switch from current crypto straight to PQC alone. In such cases, new concepts need to be developed that combine old (but well-understood) and new crypto algorithms, either for reasons of backward compatibility, or to provide at least as much security as the stronger of the two schemes.
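One common pattern for such a combination is hybrid key establishment: the session key is derived from both a classical and a post-quantum key exchange, so it stays secure as long as either scheme holds. A minimal sketch, assuming the pyca/cryptography package for the classical X25519 part; the post-quantum shared secret below is a clearly marked placeholder for a real KEM library:

```python
# Hedged sketch of hybrid key establishment: bind a classical and a
# post-quantum shared secret together with a KDF.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF


def derive_hybrid_key(classical_secret: bytes, pqc_secret: bytes) -> bytes:
    """Derive one session key from both secrets; breaking it requires
    breaking both input schemes."""
    return HKDF(
        algorithm=hashes.SHA256(),
        length=32,
        salt=None,
        info=b"hybrid-kem-demo",  # illustrative context string
    ).derive(classical_secret + pqc_secret)


# Classical part: an ordinary X25519 key agreement.
alice = X25519PrivateKey.generate()
bob = X25519PrivateKey.generate()
classical_secret = alice.exchange(bob.public_key())

# Post-quantum part (placeholder): in practice, encapsulate/decapsulate
# with a NIST candidate KEM via a library of your choice.
pqc_secret = b"\x00" * 32  # stand-in for the KEM's shared secret

session_key = derive_hybrid_key(classical_secret, pqc_secret)
```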

When should we act?

The threat of quantum computers still seems quite far off, so asking whether it is worthwhile to react now seems a valid question. In fact, if we do nothing, there is a significant probability that we will still be safe for the next 30 years or more. But there is also a significant probability that this all ends badly.

Michele Mosca proposed a simple theorem that has since become a staple in the PQC community:

Let X be the time that crypto currently in use must remain secure.

Let Y be the time needed to migrate your product/infrastructure to PQC algorithms.

Let Z be the time we have left until a cryptographically relevant quantum computer exists.

Mosca’s theorem now simply says: “If X + Y > Z, then worry.” In other words, X + Y is the time during which current crypto is still in use and supposed to offer some security. If Z is shorter, secret keys are revealed while they still matter: encrypted messages can be read and signatures can be forged.

It is hard to predict what Z will be in real terms. Expert groups like the NIST PQC group or the German BSI recommend assuming Z to be around 10-15 years in order to stay on the safe side.

Mosca’s theorem leaves out one important detail: we are all part of a value chain, starting with chip manufacturers offering hardware acceleration for certain crypto operations and vendors of crypto libraries, through middleware and applications offering and using cryptographic functionality, to end users interacting with cryptography in arbitrary ways, be it through authentication, by actively protecting their data, or by setting up a secure connection. It is easy to see that migration to post-quantum security is not a one-man show. We need to deal with dependencies, extending the theorem to “If X + Y1 + Y2 + … > Z, then worry”.
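As a toy illustration of this extended inequality, here is a short Python sketch; all numbers are hypothetical placeholders, not predictions:

```python
# Toy illustration of Mosca's extended inequality across a value chain.
def should_worry(x_years: float, migration_years: list[float],
                 z_years: float) -> bool:
    """True if X + Y1 + Y2 + ... > Z, i.e. data must stay secure for
    longer than the time left until a quantum attacker arrives."""
    return x_years + sum(migration_years) > z_years


# Example: data must remain confidential for 10 years (X); the chip
# vendor, the library vendor, and the application each need 3 years to
# migrate (Yi); and we conservatively assume a cryptographically
# relevant quantum computer in 15 years (Z).
print(should_worry(10, [3, 3, 3], 15))  # True -> start preparing now
```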

Given the complexity of the migration to PQC, and considering the whole value chain, ten years might be over all too soon. Therefore, now is a good time to prepare for the migration to a more crypto agile approach.

Where to start?

On an organizational level, interdependencies with the migration processes of suppliers and customers need to be identified. Start rethinking your requirements now and adapt concepts where necessary. One important decision is whether to go for a soft transition, where old and new crypto work in parallel or offer double protection, at least for a certain timespan.

On the development side, a good start would be to make an inventory of where cryptographic algorithms are used and which of them might be affected by a quantum attack. Hardware requirements and hardware limitations also need to be considered.
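A hypothetical starting point for such an inventory is a small script that searches a source tree for identifiers of quantum-vulnerable algorithms. The patterns and file suffixes below are illustrative; a real inventory would also have to cover configuration files, certificates, protocols, and hardware dependencies:

```python
# Illustrative first pass at a crypto inventory: grep a source tree for
# names of quantum-vulnerable algorithms.
import pathlib
import re

# Rough, illustrative pattern; expect false positives to triage by hand.
QUANTUM_VULNERABLE = re.compile(r"\b(RSA|ECDSA|ECIES|ECDH|DSA|DH)\b",
                                re.IGNORECASE)


def scan(root: str) -> None:
    # Adjust the suffix list to the languages used in your stack.
    for path in pathlib.Path(root).rglob("*.py"):
        text = path.read_text(errors="ignore")
        for lineno, line in enumerate(text.splitlines(), start=1):
            if QUANTUM_VULNERABLE.search(line):
                print(f"{path}:{lineno}: {line.strip()}")


scan("src")  # hypothetical project directory
```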

There are several open-source or public domain references and sample implementations of quantum-safe algorithms, most of which can easily be found following links on the website of the NIST PQC standardization: https://csrc.nist.gov/Projects/post-quantum-cryptography/round-3-submissions.

Benchmarks for most of the candidates for NIST PQC standardization were published in the paper at https://eprint.iacr.org/2019/844, alongside sample implementations for the ARM Cortex-M4 at https://github.com/mupq/pqm4.

About the Author

Carmen Kempka is the Head of Corporate Technology of WIBU-SYSTEMS AG. She studied computer science at the University of Karlsruhe (TH) with a focus on cryptography and quantum computing. After completing her PhD in 2014, she joined the Secure Platform Laboratories of NTT in Japan for two years of postdoctoral research in cryptography. She moved to WIBU-SYSTEMS AG in 2016, where she is now responsible for R&D projects and supports her colleagues with all questions about cryptography and security. Carmen can be reached online at carmen.kempka@wibu.com and at our company website http://www.wibu.com/.


Source: www.cyberdefensemagazine.com