Cybersecurity in the Age of Quantum Computing: A Cryptographer’s Call to Action

(This is a joint blog post between Accenture Federal Services and ISARA)

Introduction
Computing power has advanced enormously since the invention of the transistor. Today’s digital infrastructures rely on public-key cryptography to safeguard commerce, state secrets and personal privacy. But a new paradigm is now in view. Quantum computers use superposition and entanglement to perform computations that are infeasible for classical machines. Unlike classical bits, which are either 0 or 1, superposition allows quantum bits (qubits) to be put into states that are "combinations" of 0 and 1, where the relative sizes of the constituent "parts" carry important meaning. When multiple qubits are entangled, they act as one physical system regardless of the distance between them; n entangled qubits together occupy a state space of 2^n dimensions. This exponential state space allows quantum algorithms such as Shor’s and Grover’s to threaten the cryptographic foundations we take for granted. While no "cryptographically relevant" quantum computer exists today, experts estimate that one capable of breaking today’s public-key encryption could arrive within the next decade. This threat demands urgent action from cryptographers, security professionals and business leaders.

The Quantum Threat: Understanding Shor’s and Grover’s Algorithms

Classical vs. quantum computing
Classical computers manipulate bits using logic gates. Their performance against cryptographic problems such as factoring large integers, solving discrete logarithms or performing exhaustive searches is fundamentally limited because each operation evaluates a single state at a time. Quantum computers operate on entangled, superposed qubits. By performing operations on these "combinations" of states rather than on individual discrete states, they can solve specific problems exponentially or quadratically faster than classical systems can. This speedup threatens public-key cryptography and, to a lesser extent, symmetric cryptography.

Shor’s algorithm: a dagger to public-key cryptography
In 1994, Peter Shor devised an algorithm that factors integers and computes discrete logarithms in polynomial time on a quantum computer. These two problems underpin RSA, Diffie-Hellman and elliptic-curve cryptography (ECC). According to security analyses, Shor’s algorithm could reduce the time to factor a 2048-bit RSA key from billions of years to hours or days once sufficiently large fault-tolerant quantum computers exist. For equivalent classical security levels, it is generally accepted that ECDSA and other elliptic-curve systems are more vulnerable to Shor’s algorithm than RSA, since their smaller key sizes mean fewer logical qubits are needed to attack them. Without new algorithms, digital signatures, key-exchange protocols and public-key infrastructures (PKIs) will be rendered obsolete.
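To make the threat concrete, here is a minimal, purely classical sketch of the structure of Shor’s algorithm in Python (our illustration, with toy-sized numbers; not from the sources above). A quantum computer’s only job is the period-finding step; everything else is ordinary arithmetic. The brute-force find_period below is exactly the part that is infeasible for 2048-bit moduli.

```python
# Toy illustration of the classical half of Shor's algorithm. A quantum
# computer's only job is to find the period r of f(x) = a^x mod N; the
# factors then fall out classically. find_period() brute-forces the
# period, which is the step that is infeasible for 2048-bit moduli.
from math import gcd

def find_period(a: int, n: int) -> int:
    """Order of a modulo n; a quantum computer replaces this brute force."""
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_part(n: int, a: int):
    g = gcd(a, n)
    if g != 1:                      # lucky guess: a already shares a factor
        return g, n // g
    r = find_period(a, n)
    if r % 2 == 1:                  # odd period: retry with a different a
        return None
    y = pow(a, r // 2, n)
    if y == n - 1:                  # trivial square root: retry
        return None
    return gcd(y - 1, n), gcd(y + 1, n)

print(shor_classical_part(15, 7))   # -> (3, 5); the period of 7 mod 15 is 4
```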

Grover’s algorithm: squeezing symmetric key security
Grover’s algorithm gives a quadratic speedup for unstructured search. Although (due to computational overhead) Grover’s algorithm does not cut the security of symmetric-key schemes in half, as is often said, it does erode their security. While 128-bit symmetric keys might provide sufficient quantum security in certain applications in the near or mid term, it is strongly recommended that applications transition to longer key lengths where feasible. The fix is straightforward: double the key length (e.g., use AES-256 rather than AES-128), but hash functions and message-authentication codes must also account for the reduced security margin. Grover’s attack does not completely break symmetric schemes, but it underscores the need for larger key sizes and stronger cryptographic designs.
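In most modern libraries, the key-doubling mitigation is a one-parameter change. A minimal sketch using the third-party Python cryptography package (our choice of library; any AEAD-capable library works similarly):

```python
# Minimal sketch of the "double the key length" mitigation, using the
# third-party 'cryptography' package (pip install cryptography).
# Moving from AES-128 to AES-256 is a one-parameter change here.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # was: bit_length=128
nonce = os.urandom(12)                      # 96-bit nonce; never reuse per key
aead = AESGCM(key)

ciphertext = aead.encrypt(nonce, b"long-lived archive record", b"header")
plaintext = aead.decrypt(nonce, ciphertext, b"header")
assert plaintext == b"long-lived archive record"
```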

The "Harvest Now, Decrypt Later" (HNDL) threat
Even without a large quantum computer today, adversaries can collect encrypted traffic now and store it until quantum capabilities mature. Researchers call this tactic "Harvest Now, Decrypt Later". A Federal Reserve study warns that obtaining a ledger replica and holding encrypted data allows attackers to decrypt it in the future, compromising the confidentiality of transactions and personal data. NIST similarly notes that while no cryptographically relevant quantum computer exists today, one could appear in less than 10 years, and that data encrypted under today’s algorithms "remains under threat" because attackers may harvest and later decrypt it. Sensitive data with long retention periods, such as medical records, state secrets and critical infrastructure telemetry, must be protected now against future quantum attacks.

The Solution Space: Post-Quantum Cryptography

Post-Quantum Cryptography (PQC) refers to algorithms that resist both classical and quantum attacks. Unlike Quantum Key Distribution (QKD), PQC does not require specialized hardware. It runs on existing systems but relies on mathematical problems believed to be infeasible for both quantum and classical computers to solve. NIST warns that the transition from standardization to full adoption can take 10-20 years because vendors must update protocols and products. Therefore, algorithms need to be standardized long before the quantum threat fully materializes.

NIST’s PQC standardization process
To prepare for the post-quantum era, NIST launched the PQC Standardization Process in 2016. After three rounds of evaluation, NIST selected CRYSTALS-Kyber for key encapsulation and CRYSTALS-Dilithium, Falcon and SPHINCS+ for digital signatures. These selections were based on security analyses, performance and implementation considerations. Formal standards for ML-KEM (based on Kyber), ML-DSA (based on Dilithium) and SLH-DSA (based on SPHINCS+) were published in August 2024.

ML-KEM’s security is based on the Module Learning with Errors (MLWE) lattice problem; its standard describes three parameter sets (ML-KEM-512, -768 and -1024) with increasing security strength. ML-DSA is a lattice-based signature scheme that provides authentication and non-repudiation and is believed to remain secure against quantum adversaries; it also offers three parameter sets (ML-DSA-44, -65 and -87) with increasing security strength. SLH-DSA is a hash-based signature scheme that does not require managing state (unlike stateful hash-based schemes such as HSS and XMSS) and offers strong security assurance, though its signatures are significantly longer than ML-DSA’s or Falcon’s. The SLH-DSA standard approves 12 parameter sets offering different performance trade-offs, and NIST intends to approve more in the future.
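For readers who want to experiment, here is a minimal ML-KEM-768 round trip using liboqs-python (the oqs module). This is our illustration, not part of the standards; it assumes a liboqs build recent enough to expose the "ML-KEM-768" identifier (older builds used the pre-standard name "Kyber768").

```python
# ML-KEM-768 encapsulation round trip with liboqs-python
# (pip install liboqs-python). Assumes ML-KEM identifiers are available.
import oqs

with oqs.KeyEncapsulation("ML-KEM-768") as receiver:
    public_key = receiver.generate_keypair()        # receiver publishes this

    with oqs.KeyEncapsulation("ML-KEM-768") as sender:
        ciphertext, ss_sender = sender.encap_secret(public_key)

    ss_receiver = receiver.decap_secret(ciphertext)
    assert ss_sender == ss_receiver                 # both sides agree
```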

NIST continued evaluating code-based and supersingular isogeny-based schemes to provide algorithmic diversity. In July 2022 the fourth round began with four KEM candidates: BIKE, Classic McEliece, HQC and SIKE. NIST’s fourth-round status report notes that, after an efficient classical key-recovery attack was published, the SIKE team acknowledged the scheme’s insecurity and withdrew the submission. After reviewing feedback and research, NIST selected Hamming Quasi-Cyclic (HQC) in March 2025. HQC, whose security rests on the hardness of decoding quasi-cyclic codes, serves as a backup KEM for the general-purpose ML-KEM and offers strong security but uses larger keys and ciphertexts. NIST expects to release a draft standard for FN-DSA (based on Falcon) in late 2025 or early 2026. Falcon/FN-DSA offers small signatures and fast verification but is more complex to implement correctly.

Underlying mathematics: lattice, code, and hash-based schemes

  1. Lattice-based: Both ML-KEM and ML-DSA rely on the hardness of solving systems of noisy linear equations. ML-KEM derives its security from the MLWE problem, while ML-DSA uses the Fiat-Shamir with aborts paradigm (with security based on MLWE and a nonstandard variant of MSIS called SelfTargetMSIS). FN-DSA’s security is based on the SIS problem over NTRU lattices.
  2. Code-based: HQC and Classic McEliece use error-correcting codes. HQC encrypts messages using quasi-cyclic codes and has well-understood decoding failure rates. Classic McEliece, though not selected, remains of interest due to its long-studied security and smaller ciphertexts, but it has very large public keys.
  3. Hash-based: SPHINCS+ underlies SLH-DSA. Its security stems solely from cryptographic hash functions, avoiding number-theoretic assumptions. Signatures are large, but the scheme offers simplicity and strong security (see the one-time-signature sketch after this list).
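To see why hash-based signatures need no number theory, consider a Lamport one-time signature, the simplest ancestor of SPHINCS+. This sketch is purely illustrative; SLH-DSA is far more elaborate and, unlike this toy, supports many signatures per key pair.

```python
# Lamport one-time signature: the simplest hash-based scheme. Security
# rests entirely on the hash function. A key pair must sign ONE message.
import os, hashlib

def H(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

def keygen():
    sk = [[os.urandom(32), os.urandom(32)] for _ in range(256)]
    pk = [[H(pair[0]), H(pair[1])] for pair in sk]   # publish hashes only
    return sk, pk

def bits(msg: bytes):
    digest = int.from_bytes(H(msg), "big")
    return [(digest >> i) & 1 for i in range(256)]

def sign(sk, msg: bytes):
    # Reveal one preimage per message bit; revealing both halves of a
    # position (by signing twice) would break security.
    return [sk[i][b] for i, b in enumerate(bits(msg))]

def verify(pk, msg: bytes, sig) -> bool:
    return all(H(s) == pk[i][b] for (i, b), s in zip(enumerate(bits(msg)), sig))

sk, pk = keygen()
sig = sign(sk, b"hello pqc")
assert verify(pk, b"hello pqc", sig)
```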

Implementation matters: verifiable decapsulation and secure integration
PQC algorithms are not only mathematical constructs; they must be implemented correctly. A 2025 pre-print warns that practical implementations of KEMs derived from the Fujisaki-Okamoto (FO) transform can be vulnerable if they skip the re-encryption check. The authors show that omitted checks allow adversaries to recover private keys, and they propose a verifiable decapsulation method that embeds a confirmation code into the key derivation function. They demonstrated the attack on an HQC implementation in which the check had been missing for 19 months. This underscores that migrating to PQC is not just about algorithms; it requires rigorous testing, secure coding practices and vendor scrutiny.
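The shape of the missing check is easy to see in code. Below is a structural sketch of FO-style decapsulation with implicit rejection; the XOR "PKE" is our stand-in with no security whatsoever, present only so the skeleton runs, and the re-encryption comparison is the point.

```python
# Structural sketch of decapsulation under the Fujisaki-Okamoto transform.
# The "PKE" is a deterministic toy (hash-pad XOR) with NO security; G and
# the KDF are modeled as domain-separated SHA-256 calls.
import hashlib, hmac, os

def H(tag: bytes, *parts: bytes) -> bytes:
    h = hashlib.sha256(tag)
    for p in parts:
        h.update(p)
    return h.digest()

def pke_encrypt(pk: bytes, m: bytes, coins: bytes) -> bytes:
    pad = H(b"enc", pk, coins)          # deterministic given the coins
    return coins + bytes(a ^ b for a, b in zip(m, pad))

def pke_decrypt(sk: bytes, pk: bytes, ct: bytes) -> bytes:
    coins, body = ct[:32], ct[32:]
    pad = H(b"enc", pk, coins)
    return bytes(a ^ b for a, b in zip(body, pad))

def encapsulate(pk: bytes):
    m = os.urandom(16)
    coins = H(b"G", m, pk)              # FO: coins derived from the message
    ct = pke_encrypt(pk, m, coins)
    return ct, H(b"K", m, ct)

def decapsulate(sk: bytes, pk: bytes, ct: bytes) -> bytes:
    m = pke_decrypt(sk, pk, ct)
    coins = H(b"G", m, pk)
    # The critical re-encryption check. Skipping it (as the HQC code
    # discussed above did for 19 months) enables chosen-ciphertext
    # attacks that can recover the private key.
    if not hmac.compare_digest(pke_encrypt(pk, m, coins), ct):
        return H(b"reject", sk, ct)     # implicit rejection
    return H(b"K", m, ct)

sk, pk = os.urandom(32), os.urandom(32)
ct, key_sender = encapsulate(pk)
assert decapsulate(sk, pk, ct) == key_sender
```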

Hybrid mode: a belt and suspenders approach
Because PQC standards are new, many cryptographers need time to gain confidence in their security. A hybrid key-exchange mode performs a classical key exchange and a post-quantum key exchange in parallel; the resulting session key depends on both secrets. Microsoft researchers explain that hybrid mode combines classical and post-quantum techniques so that an attacker would need to break both to compromise confidentiality. An analysis from Post-Quantum.com notes that hybrid encryption provides defense in depth and mitigates the Harvest-Now-Decrypt-Later threat while allowing organizations to remain compliant with existing standards. Hybrid signatures similarly combine classical and PQC signatures to ensure interoperability and trust during the transition.
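The idea reduces to a few lines of code: derive the session key from the concatenation of both shared secrets, so that breaking either primitive alone reveals nothing. A minimal sketch, assuming the third-party cryptography package for X25519 and HKDF, with the ML-KEM secret stubbed by random bytes (in practice it would come from a KEM library such as liboqs):

```python
# Hybrid key derivation: the session key depends on BOTH an X25519 secret
# and a post-quantum KEM secret. 'pq_shared_secret' is a placeholder here.
import os
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

# Classical leg: ephemeral X25519 exchange.
alice_priv, bob_priv = X25519PrivateKey.generate(), X25519PrivateKey.generate()
classical_ss = alice_priv.exchange(bob_priv.public_key())

# Post-quantum leg: stand-in for an ML-KEM shared secret (see earlier sketch).
pq_shared_secret = os.urandom(32)

# Concatenate-then-KDF, in the spirit of draft hybrid TLS designs.
session_key = HKDF(
    algorithm=hashes.SHA256(),
    length=32,
    salt=None,
    info=b"hybrid-kex-demo",
).derive(classical_ss + pq_shared_secret)
```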

The Cryptographic Engineer’s Playbook: Migration Strategy

  1. Take inventory and develop crypto agility
    The first step toward quantum readiness is understanding what cryptography you use. NIST’s migration guidance stresses creating a cryptographic inventory: a detailed map of all algorithms, protocols and keys used across applications, networks and hardware. This inventory supports risk management, policy enforcement and rapid response to cryptographic vulnerabilities. NIST refers to crypto agility as "the capabilities needed to replace algorithms in protocols, applications, software, hardware, firmware, and infrastructures while preserving security and ongoing operations". Practical steps include (a starter inventory sketch follows the list):
    ·    Cataloging applications and data flows: Identify every use of cryptography such as TLS, VPNs, secure messaging, encrypted databases and hardware security modules.
    ·    Recording algorithms and key sizes: Note whether RSA, ECC, or AES-128 is used, and document certificate expiration dates and trust anchors.
    ·    Assessing crypto agility: Determine how easily each component can be updated. Systems with hard-coded algorithms or proprietary protocols will require more effort.
    ·    Engaging vendors: Request roadmaps for PQC support and hybrid modes; evaluate firmware update paths and compliance with FIPS 203/204/205 and forthcoming standards.
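Inventory collection can be partially automated. The sketch below is our illustration (the hostname and record fields are examples, and it assumes cryptography >= 42 for not_valid_after_utc); it records the public-key algorithm and certificate expiry presented by a TLS endpoint, one small slice of a full inventory:

```python
# Record the public-key algorithm and expiry of a TLS endpoint's
# certificate: one automatable slice of a cryptographic inventory.
import socket, ssl
from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import rsa, ec

def tls_cert_record(host: str, port: int = 443) -> dict:
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            der = tls.getpeercert(binary_form=True)
    cert = x509.load_der_x509_certificate(der)
    key = cert.public_key()
    if isinstance(key, rsa.RSAPublicKey):
        algo = f"RSA-{key.key_size}"           # quantum-vulnerable (Shor)
    elif isinstance(key, ec.EllipticCurvePublicKey):
        algo = f"ECDSA-{key.curve.name}"       # quantum-vulnerable (Shor)
    else:
        algo = type(key).__name__
    return {"host": host, "algorithm": algo,
            "not_after": cert.not_valid_after_utc.isoformat()}

print(tls_cert_record("example.com"))          # illustrative hostname
```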

  2. Prioritize based on data value and lifespan
    Not all data is equal. Long-term secrets such as health records, intellectual property, classified documents and financial transactions remain sensitive for years or decades. Organizations should prioritize systems protecting long-lived data (7-20 years or more) and rotate keys or re-encrypt periodically. Short-lived session keys may be less urgent, provided the data they protect loses its value before quantum computers are expected to arrive. Risk prioritization therefore involves (a back-of-the-envelope check follows the list):
    ·    Classifying data: Data must be categorized by confidentiality and retention requirements.
    ·    Identifying where HNDL exposures exist: Data transmitted over public networks, stored in cloud environments or recorded on blockchains is at risk because adversaries can archive it for later decryption.
    ·    Planning re-encryption schedules: Periodically re-encrypt archives with stronger keys; adopt protocols offering forward-secrecy to limit exposure.
    ·    Segmentation and minimization: Limit the amount of sensitive data stored and ensure it is compartmentalized.
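A useful back-of-the-envelope test for this prioritization is Mosca’s inequality, a widely cited heuristic (our addition, not from the guidance above): if the data’s shelf life x plus the migration time y exceeds the time z until a CRQC arrives, the system is already exposed to harvest-now-decrypt-later. All figures below are illustrative assumptions:

```python
# Mosca's inequality: worry if x + y > z, where x is the data's shelf
# life, y the migration time, and z the years until a CRQC. The systems
# and numbers below are illustrative assumptions, not real estimates.

def at_risk(x_shelf_life: float, y_migration: float, z_crqc_eta: float) -> bool:
    return x_shelf_life + y_migration > z_crqc_eta

Z_CRQC_ETA = 10  # assume ~10 years to a CRQC, per the estimates cited above

systems = [                        # (name, shelf life in years, migration years)
    ("health-records-db", 20, 3),
    ("web-session-tls",  0.1, 2),
    ("firmware-signing",  15, 5),
]
for name, x, y in systems:
    verdict = "PRIORITIZE" if at_risk(x, y, Z_CRQC_ETA) else "monitor"
    print(f"{name}: {verdict}")
```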

  3. Adopt and test post-quantum/hybrid primitives
    Once you know your exposure, begin integrating PQC. Consider the following practical challenges (a payload-size check follows the list):
    ·    Key and signature size: PQC keys and signatures are larger than their classical equivalents. ML-KEM offers three parameter sets with increasing security and decreasing performance. SLH-DSA signatures are long. Ensure protocols (e.g., TLS, IPsec) and databases can handle larger payloads.
    ·    Performance trade-offs: Lattice-based schemes like ML-KEM and ML-DSA are generally fast, but code-based schemes such as HQC have larger keys and may tax constrained devices. Benchmark implementations on your platforms.
    ·    Protocol integration: Use standardized libraries that implement the FIPS 203/204/205 algorithms. For TLS, select libraries supporting hybrid ECDH+ML-KEM and hybrid ECDSA+ML-DSA. Verify certificate chains and handshake modifications.
    ·    Implementation security: Ensure decapsulation procedures verify ciphertexts correctly. Follow constant-time coding practices and monitor side-channel resistance.
    ·    Testing and certification: Take advantage of NIST’s validation programs. Conduct interoperability tests across vendors and systems.
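Before enabling PQC in a protocol, it helps to quantify the payload growth. A quick check using liboqs-python (assuming ML-KEM identifiers are available in your build); for comparison, an X25519 public key is 32 bytes:

```python
# Print ML-KEM key/ciphertext sizes to sanity-check protocol and database
# payload limits. Requires liboqs-python with ML-KEM support.
import oqs

for alg in ("ML-KEM-512", "ML-KEM-768", "ML-KEM-1024"):
    d = oqs.KeyEncapsulation(alg).details
    print(f"{alg}: public key {d['length_public_key']} B, "
          f"ciphertext {d['length_ciphertext']} B, "
          f"shared secret {d['length_shared_secret']} B")
```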

  4. Plan for continuous agility
    Quantum resilience is not a one-time upgrade; cryptography will evolve. Gartner and other analysts recommend embedding crypto agility into development pipelines to automate algorithm updates, maintain up-to-date cryptographic inventories and monitor NIST announcements. For legacy systems that cannot be updated, employ network-level protections (e.g., quantum-safe VPNs) or decommission them. Document migration plans, train staff on PQC concepts and engage with standards bodies.
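One concrete pattern for that agility is to route cryptographic calls through a small registry selected by configuration, so an algorithm swap is a config change rather than a code change. A hypothetical sketch (the names and entries are ours, not from any cited guidance):

```python
# Crypto-agility pattern: callers never name an algorithm directly; they go
# through a registry keyed by configuration. Migrating later means
# registering a PQC-backed entry and flipping ACTIVE_ALG.
import hashlib
import hmac as hmac_mod
from typing import Callable, Dict

SIGNERS: Dict[str, Callable[[bytes, bytes], bytes]] = {}

def register(name: str):
    def wrap(fn: Callable[[bytes, bytes], bytes]):
        SIGNERS[name] = fn
        return fn
    return wrap

@register("hmac-sha256")   # classical placeholder (a MAC standing in here)
def _hmac_sha256(key: bytes, msg: bytes) -> bytes:
    return hmac_mod.new(key, msg, hashlib.sha256).digest()

# A real deployment would also register e.g. an "ml-dsa-65" signer backed
# by a PQC library, then change ACTIVE_ALG (read from config) to migrate.
ACTIVE_ALG = "hmac-sha256"

def sign(key: bytes, msg: bytes) -> bytes:
    return SIGNERS[ACTIVE_ALG](key, msg)

print(sign(b"k" * 32, b"example payload").hex())
```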

Conclusion: The Path Forward

Consensus on timelines and urgency
Experts disagree on the exact date when a cryptographically relevant quantum computer (CRQC) will arrive, but the consensus is narrowing. NIST cautions that it could be less than 10 years before quantum computers can break today’s encryption. The Global Risk Institute’s 2024 Quantum Threat Timeline Report, summarized at the SANS Emerging Threats Summit, estimates that within 5-15 years a CRQC could break standard public-key encryption in under 24 hours. Presenters at the summit warned that adversaries could break RSA and ECC by the early 2030s. Waiting is not an option; migration and crypto-agility efforts must begin now.

The cryptographic engineer as the "Quantum Bridge"
Security executives often view the coming quantum shift as abstract physics. Cryptographic engineers must bridge this gap by translating quantum threats into concrete engineering actions. They must understand the implications of Shor’s and Grover’s algorithms, master PQC algorithms and implement hybrid strategies, all while communicating risk to leadership. This "Quantum Bridge" role ensures that organizational strategy keeps pace with quantum advancements.

Call to action
The single most important action for security leaders in 2025 is to initiate a comprehensive cryptographic inventory and mitigation plan. This plan should map existing cryptographic use, prioritize systems based on data value and lifespan, adopt hybrid PQC in critical paths and build crypto agility into procurement and development processes. Organizations that begin now will be ready to protect sensitive data when CRQCs emerge. Those that delay risk watching decades of security disappear overnight. The quantum era is coming, and by acting today, cryptographers can secure the next generation of digital systems.

At ISARA, we help organizations navigate the complexities of cryptographic transformations with purpose-built tools for quantum readiness. Whether you are already thinking about quantum security or just want to start, we’re here to help.

Let’s get your organization quantum ready.

Authors
Raymond P. Beecham, M.Sc.
Security Delivery Specialist, 
Quantum Community of Practice,
Accenture Federal Services.

Philip Lafrance, MMath, CISSP.
Standards Manager,
ISARA Corporation.