Quantum-safe cryptography — also known as post-quantum or quantum-resistant cryptography — refers to public-key cryptographic algorithms that are believed to be secure against attacks utilizing quantum computers. The public-key cryptography used today is not quantum safe. It is quantum vulnerable.
The security of any public-key cryptosystem is based on the difficulty of solving some underlying math problem. Current public-key cryptosystems, such as RSA and elliptic curve-based systems, depend on the difficulty of computing the prime factors of large integers and of solving the (elliptic curve) discrete logarithm problem, respectively. For big enough numbers (i.e., for cryptographically large numbers), these problems are believed to be computationally infeasible for any conventional computer or supercomputer to solve. In fact, confidence in the security of these algorithms is so high that their usage is effectively ubiquitous: trillions of dollars in global digital economic activity are protected by these public-key algorithms.
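To make the factoring connection concrete, here is a toy RSA example using the classic textbook parameters (p = 61, q = 53). This is an illustration only: real RSA uses primes thousands of bits long, and this toy modulus can be factored instantly.

```python
# Toy RSA -- illustration only. Real deployments use vetted libraries
# and moduli of 2048 bits or more.
p, q = 61, 53
n = p * q                      # public modulus: 3233
phi = (p - 1) * (q - 1)        # Euler's totient: 3120
e = 17                         # public exponent, coprime to phi
d = pow(e, -1, phi)            # private exponent: modular inverse of e mod phi

message = 65
ciphertext = pow(message, e, n)     # encrypt with the public key (e, n)
recovered = pow(ciphertext, d, n)   # decrypt with the private key (d, n)
assert recovered == message

# The security rests on factoring: anyone who factors n learns p and q,
# and can therefore compute phi and the private exponent d. Factoring
# n = 3233 is trivial; factoring a cryptographically large n is believed
# infeasible on any conventional computer.
```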
In 1994, Peter Shor, an American mathematician, devised a quantum algorithm that efficiently solves the integer factorization and discrete logarithm problems, theoretically breaking current public-key cryptography. However, running Shor's algorithm in practice requires an adequately powerful quantum computer. Quantum computers powerful enough to threaten today's public-key cryptography are called Cryptographically Relevant Quantum Computers (CRQCs), sometimes also called Cryptanalytically Relevant Quantum Computers. As of today, there is no reason to believe a CRQC exists. Yet.
It is impossible to predict exactly when a cryptography-breaking quantum computer will arrive (a moment sometimes referred to as Q-Day or Y2Q). However, current best estimates suggest the likelihood of one emerging within 5-10 years is materially high, with more conservative estimates suggesting a reasonably high chance within roughly 15 years.
To some, 5-15 years might sound like a long time. Unfortunately, in terms of migrating our digital infrastructure to quantum-safe cryptography, 15 years might not be enough. The quantum-safe migration will be deeply complex and will have far-reaching effects, with many large organizations potentially taking decades to fully migrate. All organizations are strongly encouraged to begin investigating and planning their quantum-safe migrations today, if they have not already started.
Just like current public-key cryptosystems, the security of a quantum-safe system depends on the difficulty of solving some underlying math problem. The critical difference is that the problems underlying quantum-safe systems are believed to be intractable even for quantum computers.
Without getting into the details, there are five primary areas of mathematics which are believed to yield quantum-safe cryptosystems: multivariate quadratic polynomials, coding theory, cryptographic hash functions, isogenies between supersingular elliptic curves, and lattice theory.
Thankfully, quantum-safe cryptography can be implemented in much the same way as current (traditional) public-key cryptography. That is, quantum-safe algorithms can be implemented on conventional computers (i.e., a normal laptop, server, mobile device, etc.); no quantum hardware is required. Moreover, depending on the application, certain quantum-safe algorithms are effectively a "drop-in" replacement for current algorithms.
However, this will not always be the case.
Quantum-safe algorithms tend to have very different performance characteristics than RSA and ECC, such as key size, signature/ciphertext size, algorithm runtime, and other resource requirements. This means different algorithms offer different advantages and disadvantages, and there will no longer be a single "one-size-fits-all" algorithm like RSA or ECC. Each application utilizing public-key cryptography should be examined to understand its requirements and to discern which quantum-safe algorithms are most appropriate for it. Considering the global scope of applications that use public-key cryptography, the enormity of this task cannot be overstated.
The US National Institute of Standards and Technology (NIST) ran a years-long process to identify, analyze, and eventually standardize a suite of quantum-safe algorithms. This process was known as the NIST Post-Quantum Cryptography (PQC) Standardization Process. In August 2024, NIST announced the publication of the first set of post-quantum Federal Information Processing Standards: FIPS 203, FIPS 204, and FIPS 205; and announced their intent to release more in the future. These publications formally standardized the ML-KEM, ML-DSA, and SLH-DSA cryptosystems. Check out the ISARA blog for a more comprehensive overview of the NIST PQC process.
NIST's intent is to standardize multiple digital signature algorithms to replace the signatures specified in FIPS 186-5 (such as RSA, DSA, and ECDSA), as well as multiple key-encapsulation mechanism (KEM) algorithms to replace the key-establishment algorithms specified in NIST SP 800-56 A/B (such as DH, ECDH, MQV, and RSA-OAEP).
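For readers unfamiliar with KEMs, the sketch below shows the three-operation interface (key generation, encapsulation, decapsulation) that standards like FIPS 203 define. This mock is deliberately insecure, a hash-based placeholder whose "shared secret" could be computed by anyone holding the public values; it exists only to show the shape of the API. A real deployment would call a vetted PQC library.

```python
import hashlib
import os

# INSECURE mock KEM -- illustrates only the keygen/encapsulate/decapsulate
# interface shape. Provides no actual security: the shared secret here is
# derivable from the public key and ciphertext alone.

def keygen():
    sk = os.urandom(32)
    pk = hashlib.sha256(b"pk" + sk).digest()
    return pk, sk

def encapsulate(pk):
    """Sender: derive a fresh shared secret plus a ciphertext from the public key."""
    ct = os.urandom(32)
    ss = hashlib.sha256(pk + ct).digest()
    return ct, ss

def decapsulate(sk, ct):
    """Receiver: recover the same shared secret using the private key."""
    pk = hashlib.sha256(b"pk" + sk).digest()
    return hashlib.sha256(pk + ct).digest()

pk, sk = keygen()
ct, ss_sender = encapsulate(pk)
ss_receiver = decapsulate(sk, ct)
assert ss_sender == ss_receiver  # both parties now hold a 32-byte secret
```

The shared secret would then typically feed a key-derivation function to produce symmetric session keys, which is how KEMs replace classical key-establishment schemes like (EC)DH.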
It is also worth noting that other organizations are pursuing post-quantum standards as well, including international standards development organizations such as ISO and IEC; certain governments are also making their own recommendations.
Further, quantum-safe roots of trust using stateful hash-based signatures are trusted, mature, and available today. Such algorithms have been standardized by NIST in SP 800-208 (based on the RFC 8391 and RFC 8554 specifications produced by the Crypto Forum Research Group) and are recommended for applications such as code-signing. Moreover, stateful hash-based signatures have been included in NSA's Commercial National Security Algorithm (CNSA) suite 2.0. NSA recommends that National Security Systems adopt stateful hash-based signatures for firmware- and software-signing applications by as early as 2025 and requires their usage by 2030. CNSA 2.0 also has requirements for the adoption of the post-quantum FIPS publications.
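To illustrate the idea behind hash-based signatures, here is a minimal Lamport one-time signature built from SHA-256. The standardized stateful schemes (LMS and XMSS in SP 800-208) assemble Merkle trees of many such one-time key pairs; the "state" is the record of which one-time keys have already been used, since reusing one leaks secret material.

```python
import hashlib
import os

H = lambda b: hashlib.sha256(b).digest()

def lamport_keygen():
    # 256 pairs of random preimages; the public key is their hashes.
    sk = [[os.urandom(32), os.urandom(32)] for _ in range(256)]
    pk = [[H(pair[0]), H(pair[1])] for pair in sk]
    return sk, pk

def lamport_sign(sk, msg):
    digest = int.from_bytes(H(msg), "big")
    # Reveal one preimage per bit of the message digest. Signing a second
    # message would reveal additional preimages -- hence "one-time", and
    # why the standardized stateful schemes must track key usage.
    return [sk[i][(digest >> i) & 1] for i in range(256)]

def lamport_verify(pk, msg, sig):
    digest = int.from_bytes(H(msg), "big")
    return all(H(sig[i]) == pk[i][(digest >> i) & 1] for i in range(256))

sk, pk = lamport_keygen()
sig = lamport_sign(sk, b"firmware image v1.2")
assert lamport_verify(pk, b"firmware image v1.2", sig)
assert not lamport_verify(pk, b"tampered image", sig)
```

The security of the whole construction reduces to the preimage resistance of the hash function, which is why hash-based signatures are considered among the most conservative post-quantum options.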
Algorithms based on different areas of math have distinct advantages. For example, it is generally thought that hash-based cryptography can provide the most secure algorithms for digital signatures. On the other hand, lattice-based key exchanges can be the fastest, while code-based key exchanges can have the shortest ciphertexts.
Another reason for developing algorithms from multiple areas of math is that if a vulnerability is found in one type of algorithm, it does not doom the other classes of post-quantum cryptography. Developers can even use a hybrid approach by combining algorithms from different areas to create even stronger cryptosystems. Standards already exist for certain hybrid schemes, with more currently under development.
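The combining step of a hybrid scheme can be sketched as follows. This is a simplified illustration, not a standardized construction: the two input secrets below are random stand-ins for, say, an ECDH shared secret and an ML-KEM shared secret, and real protocols (such as the TLS 1.3 hybrid key-exchange design) pin down the exact concatenation order, labels, and KDF.

```python
import hashlib
import hmac
import os

# Placeholders for secrets produced by two independent key exchanges.
classical_secret = os.urandom(32)   # stand-in for an ECDH shared secret
pq_secret = os.urandom(32)          # stand-in for an ML-KEM shared secret

def hybrid_kdf(secret_a, secret_b, salt=b"hybrid-demo"):
    # HKDF-Extract-style combiner: an attacker must recover BOTH inputs
    # to reconstruct the output, so the derived key stays safe as long
    # as either underlying algorithm remains unbroken.
    return hmac.new(salt, secret_a + secret_b, hashlib.sha256).digest()

session_key = hybrid_kdf(classical_secret, pq_secret)
assert len(session_key) == 32
```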
An interesting quirk of public-key cryptography is that we generally cannot prove mathematically that a system is invulnerable. While we often have considerable mathematical evidence to support the security claims, definitive proofs are not currently possible. Proofs do exist for very specific types of cryptosystems (this is known as information-theoretic security), but such systems tend to have limited practical utility, and we still cannot prove that their implementations are invulnerable.
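The textbook example of information-theoretic security is the one-time pad, shown below. Its security proof is unconditional, but its limitations (a truly random key as long as the message, shared in advance, and never reused) are exactly the "limited practical utility" mentioned above.

```python
import os

# One-time pad: information-theoretically secure encryption.
# With a uniformly random key as long as the message, used exactly once,
# the ciphertext reveals nothing about the plaintext -- every plaintext
# of the same length is equally consistent with it.
message = b"attack at dawn"
key = os.urandom(len(message))   # one random key byte per message byte

ciphertext = bytes(m ^ k for m, k in zip(message, key))
recovered = bytes(c ^ k for c, k in zip(ciphertext, key))
assert recovered == message
```

Note that reusing the key even once destroys the guarantee, which is why practical systems rely on computational hardness assumptions instead.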
Our trust in the security of current and quantum-safe public-key cryptography is rooted in the fact that they’ve stood the test of time against decades of cryptanalysis, deployment, and continual improvement. Experts from all over the world have analyzed, studied, and attempted to break these algorithms over and over again. When new analysis weakens a system, improvements are proposed, the algorithms are strengthened, and the cycle begins again.
NIST defines cryptanalysis as "the study of mathematical techniques for attempting to defeat cryptographic techniques and information system security. This includes the process of looking for errors or weaknesses in the implementation of an algorithm or in the algorithm itself."
Confidence in quantum-safe cryptography grows with the volume of study, or cryptanalysis, each algorithm undergoes as part of, for example, the NIST PQC Standardization Process. While some of the algorithms utilize areas of math that are mature and trusted, such as hash-based cryptography, which is over 40 years old, other areas are relatively new. For example, supersingular isogenies were only proposed for use in cryptography about a decade ago. Indeed, the isogeny-based cryptosystem SIKE, a leading contender for standardization, suffered a complete break late in the NIST process.
ISARA believes it is vital to take a diversified and agile approach to quantum-safe cryptography. Our strategy is to support many post-quantum algorithms in the ISARA Radiate™ Quantum-safe Toolkit so that in the unlikely event that a future theoretical breakthrough leads to an attack on one algorithm, others will be available to not only replace the broken scheme but to do so with as small of a switching cost as possible.