
Quantum Computing Is Coming for Your Data -- Are You Ready?

Quantum physics has been around since the 1920s, but applying its unique characteristics in a computer had to wait until the late 1990s. Recent standards from NIST and other government programs point firmly to one reality -- your organization must start planning for a post-quantum future now.

In this article I'll cover the basics of quantum computing, the risks it poses, the steps you can take to mitigate those risks, what others in the industry are doing, and how you can adopt the recommendations outlined by several different governments.

Quantum Physics & Quantum Computers
"I think I can safely say that nobody understands Quantum Mechanics." -- Richard P. Feynman.

I'm not a physicist of any type, but I've gleaned a basic understanding of the weirdness of quantum phenomena through reading and watching lots of presentations on the topic. In the quantum world (the real one, not the MCU variety), the building blocks of atoms can behave either as waves or as particles, and precisely measuring one property (such as position) means you can't simultaneously know its complementary property (momentum).

In 1998 a two-qubit computer proved that harnessing these weird characteristics of quantum physics in a computing device was possible. A qubit (quantum bit) is the equivalent of the bit (0 or 1) in a classical computer, but unlike a bit, a qubit can exist in a linear combination of the two states, known as quantum superposition. A quantum computer manipulates qubits so that wave interference amplifies the probability of measuring the desired result, and quantum algorithms are the procedures that perform this amplification across many qubits.
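To make superposition and interference a little more concrete, here's a minimal Python/numpy sketch of my own (not tied to any quantum SDK) that models a single qubit as a vector of two amplitudes:

    import numpy as np

    # A qubit's state is a vector of two complex amplitudes; the squared
    # magnitudes are the probabilities of measuring 0 or 1.
    zero = np.array([1, 0], dtype=complex)

    # The Hadamard gate puts the qubit into an equal superposition of 0 and 1.
    H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

    superposed = H @ zero
    print(np.abs(superposed) ** 2)  # [0.5 0.5] -- equal chance of 0 or 1

    # Applying H again makes the amplitudes interfere: the two paths to the
    # "1" outcome cancel, and the qubit returns deterministically to 0.
    interfered = H @ superposed
    print(np.abs(interfered) ** 2)  # [1. 0.]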

IBM, Google, Microsoft and many others have been working on different approaches to quantum computers for decades at this point, but it's slow going, with the "size" of today's machines measured in hundreds (or, at best, a bit over a thousand) physical qubits. This is progress, but one of the big challenges is noise in the system: if you have 1,000 qubits but a lot of noise, many of those qubits must be spent on error correction, leaving a much smaller number of logical qubits you can actually rely on when running your calculations.
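The arithmetic is sobering. As a back-of-the-envelope illustration (the overhead ratio below is an assumption for the sketch, not a property of any real machine):

    # Error correction "spends" many noisy physical qubits to produce one
    # reliable logical qubit. The 100:1 ratio here is purely illustrative.
    physical_qubits = 1_000
    physical_per_logical = 100  # assumed error-correction overhead

    logical_qubits = physical_qubits // physical_per_logical
    print(f"{physical_qubits} physical qubits -> ~{logical_qubits} logical qubits")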

Another big misunderstanding is that quantum computers will replace classical computers. For one thing, they're currently insanely expensive, and most of them need to be supercooled, relegating them to labs or public clouds where you can rent access. More fundamentally, quantum computers are not better at every type of calculation; they're useful for very specific types of problems, where they're expected to significantly outperform classical computers once they have enough qubits.

The class of problems where they'll shine includes, for example, simulating chemical processes, because these fundamentally come down to electron interactions, a quantum phenomenon. Complex problems with many, many potential "paths" but only one or a few correct ones are also good candidates. A classical computer must try each path, one after another, whereas a quantum computer running the right algorithm can use superposition and interference to home in on the correct ones. Our current cryptographic algorithms rely on that classical limitation: having to try each possible path serially makes it computationally too expensive to crack encrypted or signed data. And that brings us to the challenge quantum computers bring to cryptography.

Cryptographically Relevant Quantum Computer
A Cryptographically Relevant Quantum Computer (CRQC) is a quantum computing system powerful enough to break modern cryptographic algorithms. Timeline estimates for when one might become available vary from the early 2030s to the 2040s, to much more distant. That day is often referred to as Q-Day.

As mentioned, the cryptography we rely on today (TLS, VPNs, digital signatures) relies on the fact that certain mathematical problems are intractable: too costly and time-consuming to break. A CRQC with thousands of error-corrected qubits could undermine this by solving those problems very quickly. The central algorithm is Shor's, from 1994, which a CRQC could theoretically use to break the RSA and Diffie-Hellman public-key schemes. Another is Grover's algorithm, from 1996, which speeds up attacks on symmetric cryptography, effectively halving the strength of an AES key (AES-128 would offer only 64-bit security against it). That's why you'll often see AES-256 recommended as the minimum, since its effective 128 bits of post-quantum strength is still considered secure.
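Here's Grover's square-root speedup in rough numbers (a simple illustration that ignores the very real engineering constants involved):

    # Grover's algorithm searches an n-bit keyspace in roughly 2^(n/2)
    # steps instead of the 2^n steps classical brute force needs.
    for key_bits in (128, 192, 256):
        print(f"AES-{key_bits}: ~2^{key_bits} classical tries, "
              f"~2^{key_bits // 2} Grover steps")

The last line of output is the reassuring one: even against Grover, AES-256 retains roughly 128 bits of effective strength.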

So, with all this in mind, you might say -- "this is nerd stuff, and I can see how it could be dangerous to my business when a CRQC is available, but I don't need to worry about it now -- I'll just deal with it when that happens." The problem is that by then it'll be way too late, kind of like human-caused climate change mitigation efforts.

A Now Problem -- Not a Future Problem
There are two parts to the risk that necessitate planning for post-quantum cryptography (PQC) now. The first is "Harvest Now, Decrypt Later" (HNDL). Several spy agencies around the world, including the NSA, are collecting vast amounts of encrypted data that they can't decrypt now but will be able to once CRQCs are available. This means that if you store or transmit data today that contains PII, PHI or other information you want to (or are compelled by regulation to) keep private for longer than, say, five years (and certainly 10), you need to use quantum-resistant algorithms for the encryption now.
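A useful way to reason about this urgency is Mosca's inequality: if the time your data must stay confidential plus the time your migration will take exceeds the time until a CRQC arrives, that data is already exposed. A minimal sketch, where all three durations are assumptions you'd replace with your own estimates:

    def mosca_at_risk(shelf_life_years: float,
                      migration_years: float,
                      years_to_crqc: float) -> bool:
        """Mosca's inequality: data is at risk if x + y > z, where
        x = how long the data must stay confidential,
        y = how long the PQC migration will take,
        z = years until a CRQC exists."""
        return shelf_life_years + migration_years > years_to_crqc

    # Illustrative numbers only -- substitute your own estimates.
    if mosca_at_risk(shelf_life_years=10, migration_years=5, years_to_crqc=10):
        print("Data encrypted today can be harvested now, decrypted later.")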

The second problem is the sheer scale of the fix, akin to the Y2K challenge. Finding every system, device and part of your network that uses encryption, identifying which algorithms they use, understanding the type of data each one stores or transmits, and producing a prioritized action plan is a big project. Some systems will be relatively easy to upgrade, but many won't be, and in some cases it'll be impossible to upgrade the software in embedded systems to support stronger algorithms, requiring outright replacement. And you don't want to be the organization that starts this project in the early 2030s, hunting for replacement hardware at the same time as everyone else. This is especially important if you have specialized devices that encrypt data inline for transmission over high-bandwidth links; these will require extra attention for PQC readiness.
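Discovery tooling can start small. As one example, here's a short Python sketch that probes an endpoint and records the negotiated TLS version and cipher suite; it only covers externally visible TLS, so data at rest, VPNs and embedded devices will need other approaches:

    import socket
    import ssl

    def probe_tls(host: str, port: int = 443) -> dict:
        """Report the negotiated TLS version and cipher suite for one host --
        a small input into a wider crypto-inventory project."""
        context = ssl.create_default_context()
        with socket.create_connection((host, port), timeout=5) as sock:
            with context.wrap_socket(sock, server_hostname=host) as tls:
                name, _, bits = tls.cipher()  # e.g. ('TLS_AES_256_GCM_SHA384', ...)
                return {"host": host, "tls_version": tls.version(),
                        "cipher": name, "key_bits": bits}

    # Hypothetical inventory list -- replace with your own endpoints.
    for endpoint in ("example.com",):
        print(probe_tls(endpoint))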

Quantum-Resistant Algorithms
NIST in the US has led the work of soliciting proposals for quantum-resistant algorithms, and over the last year several of these have become standards:

  • FIPS 203 defines a cryptographic scheme called Module-Lattice-Based Key-Encapsulation Mechanism (ML-KEM), which is derived from the CRYSTALS-KYBER submission.
  • FIPS 204 is the Module-Lattice-Based Digital Signature Algorithm (ML-DSA), based on the CRYSTALS-Dilithium submission.
  • FIPS 205 specifies the Stateless Hash-Based Digital Signature Algorithm (SLH-DSA), which is derived from the SPHINCS+ submission.

NIST is also developing a FIPS for digital signatures based on the fourth submission, FALCON (expected to become FIPS 206, FN-DSA). ML-DSA is used to verify identity, integrity or authenticity using digital signatures, while ML-KEM is used where public key encapsulation or key exchange is desired. Initially the transition will be hybrid, with existing encryption schemes used in parallel with the PQC ones. At the core, each of these new standards uses a different type of mathematics (mostly lattice-based, hence the "CRYSTALS" in the names of the original submissions) that is resistant to the kinds of calculations a CRQC performs.
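If you want to get hands-on with ML-KEM today, the Open Quantum Safe project publishes liboqs-python bindings. A minimal sketch, assuming liboqs-python is installed with ML-KEM enabled (the algorithm name varies by liboqs version; older builds call it Kyber768):

    import oqs  # liboqs-python, from the Open Quantum Safe project

    alg = "ML-KEM-768"  # FIPS 203's middle security level

    with oqs.KeyEncapsulation(alg) as receiver:
        public_key = receiver.generate_keypair()

        # The sender encapsulates against the public key, producing a
        # ciphertext to transmit and a shared secret to keep.
        with oqs.KeyEncapsulation(alg) as sender:
            ciphertext, sender_secret = sender.encap_secret(public_key)

        # Only the holder of the private key can recover the same secret.
        receiver_secret = receiver.decap_secret(ciphertext)
        assert sender_secret == receiver_secret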

A prerequisite for adopting these new algorithms for traffic on the web is Transport Layer Security (TLS) version 1.3, so requiring TLS 1.3 for all traffic is a good starting point.
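On the client side, most TLS stacks let you enforce this with a minimum-version setting. In Python, for example (a client-side sketch; servers and load balancers need the equivalent knob in their own configuration):

    import ssl
    import urllib.request

    # Refuse anything older than TLS 1.3 on outbound HTTPS connections.
    context = ssl.create_default_context()
    context.minimum_version = ssl.TLSVersion.TLSv1_3

    with urllib.request.urlopen("https://example.com", context=context) as resp:
        print(resp.status)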

The European Union outlined its implementation roadmap back in April 2024, and the U.S. required all government organizations to implement PQC, though the current administration has downgraded that requirement to a recommendation.

Various government organizations around the world have also outlined their plans. The Australian Signals Directorate (ASD, the equivalent of the U.K.'s NCSC or the U.S.'s NSA) recently published an article for businesses and government organizations, outlining the need to start planning for CRQCs now. The NCSC has its own guidance, with a set of timelines: by 2028, discover all affected services and infrastructure and have an initial plan in place; by 2031, complete the highest-priority migrations; and by 2035, complete the full migration.

PQC Transition Timeline Against the Increasing Risk of a CRQC Becoming Available (source: ASD).

Microsoft, one of the contenders to build the first CRQC, is aiming to have all of its services transitioned to PQC by 2033 through its Quantum Safe Program (QSP). The Post-Quantum Cryptography Coalition has published its own migration roadmap as well.

Microsoft QSP Strategy and Timeline (source: Microsoft).

As for operating systems, preview versions of Windows 11 and Windows Server include updated versions of SymCrypt, the same cryptographic library that's used across Azure and Microsoft 365. ML-KEM and ML-DSA are already available in SymCrypt, on both Windows and Linux, and SymCrypt-OpenSSL brings the same support to OpenSSL. Apple is also including PQC in its CryptoKit framework for developers, and iMessage in iOS and TLS 1.3 in iOS 26 already incorporate PQC.

Cloudflare is ahead of the curve here, boasting that "over 45% of human-generated internet traffic sent to their network is already post-quantum encrypted."

One of the core concepts to keep in mind when you're looking at your own applications and updating their code is crypto agility. As much as possible, build your software so that you can swap out cipher suites, or entire algorithms, as newer ones become available. Having to rely on outdated cryptography because updating your code is too difficult isn't good enough anymore.

Many of the new PQC algorithms might be found to have flaws, or require changes in the next few years, making crypto agility even more important.
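One pattern that helps is hiding the concrete algorithm behind a registry keyed by configuration, so a cipher swap becomes a config change rather than a hunt through every call site. A hypothetical sketch (the names here are my own, not from any particular library):

    from typing import Callable, Dict

    # Key-exchange mechanisms selectable by a configuration string.
    KEM_REGISTRY: Dict[str, Callable[[], object]] = {}

    def register_kem(name: str):
        """Decorator that makes a KEM implementation selectable by name."""
        def wrap(factory):
            KEM_REGISTRY[name] = factory
            return factory
        return wrap

    @register_kem("x25519")      # today's classical default (placeholder)
    class X25519KEM:
        pass

    @register_kem("ml-kem-768")  # post-quantum replacement (placeholder)
    class MLKEM768:
        pass

    def kem_from_config(config: dict) -> object:
        # Which algorithm runs is a deployment decision, read at runtime.
        return KEM_REGISTRY[config["kem_algorithm"]]()

    kem = kem_from_config({"kem_algorithm": "ml-kem-768"})
    print(type(kem).__name__)  # MLKEM768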

Conclusion
With so many contemporary cyber security risks and attacks vying for attention, taking a step back to look at the wider impact of CRQCs, Q-Day and PQC might seem difficult. If you're a small business using mostly SaaS services, this will generally be taken care of by your providers (make sure to inventory all your systems, though). But if you're a larger enterprise, this project will only get harder and more complex (with less talent around to hire and retain as time goes on) the longer you leave it, so start your planning now.
