TrueSolvers is an independent technology publisher with a professional editorial team. Every article is independently researched, sourced from primary documentation, and cross-checked before publication.
Google's 2029 post-quantum cryptography deadline surprised cryptography experts and arrived years ahead of government targets. But the date isn't a prediction that quantum computers will break encryption by 2029. Understanding the two distinct threats it addresses, one already active, is what determines whether your organization needs to act now.

Post-quantum cryptography (PQC) refers to encryption algorithms designed to resist attacks from quantum computers, which operate on fundamentally different mathematics than today's systems. The transition from current encryption to these new standards is one of the most significant infrastructure projects in the history of digital security, and the window for an orderly migration is narrowing.
Google's Willow chip landed in late 2024, Craig Gidney's RSA-2048 paper dropped in May 2025, and the 2029 migration target arrived March 25, 2026. Each step followed a deliberate sequence from Google's own research bench to its security planning desk.
Google's March 25, 2026 announcement, written by Heather Adkins, VP of Security Engineering, and Sophie Schmieg, Senior Staff Cryptography Engineer, commits the company to completing its post-quantum cryptography migration by 2029. Google cited three drivers: advances in quantum computing hardware, improvements in error correction, and revised estimates of the quantum resources actually needed to break widely used encryption. The announcement states explicitly that the 2029 target does not assume a cryptographically relevant quantum computer (CRQC) will exist by that year.
That distinction matters enormously. A CRQC is a quantum machine powerful enough to run Shor's algorithm at scale, defeating RSA and elliptic-curve encryption in practical time. No such machine exists today. The 2029 date is not Google's forecast for when one will arrive. It is the deadline Google calculated for completing a multi-year migration before one plausibly could.
No authoritative source, including Google, predicts that a CRQC will exist by 2029. Expert estimates range from the early 2030s to beyond 2050, and Google's own blog post acknowledges this explicitly. The expert community's genuine disagreement on the timeline does not invalidate the urgency; it explains why a logistics-backward target makes more sense than a threat-forward one.
Brian LaMacchia, who led Microsoft's post-quantum cryptography transition from 2015 to 2022 and now works at Farcaster Consulting Group, told Ars Technica that Google's deadline is "a significant acceleration/tightening of the public transition timelines we've seen to date, and is accelerated over even what we've seen the US government ask for." His follow-up observation, that the timeline "raises the question of what's motivating them," is the key signal. When a credentialed expert is surprised by urgency from a company that builds both the attack vector and the defense, that surprise is meaningful.
This marks the first time Google has attached a specific migration deadline to its post-quantum work. The earlier call to action from February 2026 framed the issue in general terms; the March 25 announcement added a number to the conversation. By committing publicly to 2029, Google is doing two things simultaneously: managing its own internal infrastructure risk and applying industry pressure. Both motivations are legitimate, and both are real.
Most reporting on the Google announcement treats the quantum threat as a single problem with a future date. It is two problems, one of which has no future date at all.
The first threat is the one most people picture: a sufficiently powerful quantum computer running Shor's algorithm, factoring the large composite numbers, products of two primes, that RSA encryption depends on, and decrypting traffic or stored data. This requires a CRQC, which does not yet exist. The Global Risk Institute's Quantum Threat Timeline 2024 found that most specialists place this event in the 2030s or later, and even optimistic forecasts put the probability of a CRQC arriving within ten years at under 20%. The uncertainty is genuine, and experts disagree because they are projecting across independent challenges: hardware stability, error correction, and algorithmic efficiency each progress on different timelines.
The second threat does not require a CRQC. It is already underway. Nation-state adversaries and sophisticated threat actors are collecting encrypted network traffic today, including financial records, diplomatic communications, healthcare data, and corporate trade secrets, and storing it. The strategy, known as harvest-now-decrypt-later (HNDL), assumes that a sufficiently capable quantum computer will eventually arrive and that the stored data will then be decryptable. When adversaries began this collection cannot be determined with certainty, but intelligence agencies have formally documented HNDL as an active operational threat model, not a theoretical one.
This distinction changes everything about how to evaluate urgency. Any organization holding data that must remain confidential for a decade or more is already in the HNDL exposure window. Medical records, government communications, intellectual property, financial transaction histories, and long-lived cryptographic key material all qualify. The question for these organizations is not "when will quantum computers arrive?" The relevant question is: when was this data collected, and how long must it stay secret?
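The collected-when/secret-how-long framing has a standard formalization in the security literature, often called Mosca's inequality: data is exposed when its required secrecy lifetime plus the organization's migration time exceeds the time until a CRQC arrives. A minimal sketch; the numbers in the usage example are illustrative inputs, not forecasts:

```python
def hndl_exposed(shelf_life_years: float,
                 migration_years: float,
                 years_to_crqc: float) -> bool:
    """Mosca's inequality: data encrypted today is at risk if the time it
    must stay secret, plus the time needed to migrate, exceeds the time
    until a cryptographically relevant quantum computer exists."""
    return shelf_life_years + migration_years > years_to_crqc

# Illustrative: records that must stay secret for 10 years, a 4-year
# migration, and a CRQC assumed 12 years out -- already inside the window.
print(hndl_exposed(10, 4, 12))  # -> True
print(hndl_exposed(2, 2, 12))   # -> False
```

The point of the inequality is that the defender's clock starts at collection time, not at decryption time.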
Symmetric encryption, which protects data at rest using shared keys, is far less vulnerable to quantum attack than public-key cryptography. The immediate focus for most organizations is the public-key infrastructure underlying TLS handshakes, digital certificates, code signing, VPNs, and authentication systems. These are the systems where both the HNDL threat and the future signature-verification threat concentrate.
Understanding that Google's 2029 announcement is primarily a response to the second threat, not a prediction about the first, reframes what it means for organizations outside Google's infrastructure. The migration deadline is about protecting data being generated and transmitted today, not about guessing when a CRQC will exist.
The skeptic's rebuttal to all quantum threat warnings runs approximately as follows: current quantum computers have hundreds to thousands of noisy qubits, breaking RSA-2048 would require millions of error-free ones, and the engineering gap between those two numbers makes the threat distant. This rebuttal was more credible in 2019 than it is in 2026, for a specific reason that has nothing to do with hardware.
Craig Gidney's May 2025 arXiv paper found that a 2048-bit RSA integer could be factored in under a week by a quantum computer with fewer than one million noisy qubits. In 2019, Gidney co-authored a paper estimating the same task would require 20 million qubits, making the 2025 revision a roughly 20-fold reduction in the estimated resource requirement. The hardware assumptions in the two papers are identical: a gate error rate of 0.1%, a surface code cycle time of one microsecond, and a control system reaction time of ten microseconds. The reduction came entirely from better algorithms, not from better chips.
A March 2026 Caltech/Oratomic paper pushed the trend further, estimating that a comparable attack on ECC-256 could run with as few as 10,000 physical qubits on neutral-atom hardware.
A companion Google whitepaper on breaking elliptic curve cryptography found the threshold for ECC-256 may be under 500,000 physical qubits with approximately 1,200 to 1,450 logical qubits. The historical compression of RSA-2048 qubit requirements runs from approximately one billion in 2012, to 20 million in 2019, to under one million in 2025. Each reduction came through published algorithms and error-correction improvements, not through hardware milestones.
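The scale of that compression is easy to check directly. The figures below are the approximate published estimates cited above, nothing more:

```python
# Published qubit estimates for factoring RSA-2048, as cited in the text.
rsa2048_estimates = {2012: 1_000_000_000, 2019: 20_000_000, 2025: 1_000_000}

years = sorted(rsa2048_estimates)
for prev, cur in zip(years, years[1:]):
    factor = rsa2048_estimates[prev] // rsa2048_estimates[cur]
    print(f"{prev} -> {cur}: ~{factor}x fewer qubits required")
# -> 2012 -> 2019: ~50x fewer qubits required
# -> 2019 -> 2025: ~20x fewer qubits required
```

A thousand-fold reduction over thirteen years, none of it attributable to better hardware.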
This is the mechanism that makes the "we're years away from dangerous quantum computers" argument systematically unreliable. Because qubit compression comes from algorithmic innovation, it advances on a software timeline. A paper submitted to arXiv can reduce the apparent gap between today's hardware and encryption-breaking capability in ways that a hardware roadmap cannot anticipate. The encryption-breaking threshold does not move only when quantum computers get bigger; it moves whenever someone finds a more efficient algorithm, and those algorithms are published continuously.
How quickly this compression trend will continue is genuinely unknown; Gidney's own 2025 paper notes he does not see a further 10x reduction under current assumptions, but that same caveat appeared before every previous breakthrough. Security planning built on a static qubit threshold is planning built on an assumption that has been invalidated repeatedly. The prudent response is not to abandon timeline estimates entirely but to add a margin for the next algorithmic improvement that no one has published yet.
The combination of these technical data points and the HNDL threat model is what connects the research pipeline to organizational decision-making. It is not necessary to believe Google's implicit suggestion that a CRQC is near. It is sufficient to observe that the resource requirements keep shrinking, that the direction of travel is consistent, and that the data most at risk is being generated right now.
Google's 2029 target is not the only deadline in this landscape, but it is currently the most aggressive one from a major technology company. Understanding where it sits relative to government mandates helps organizations calibrate their own response.
NIST published three finalized post-quantum cryptography standards on August 13, 2024: FIPS 203 (ML-KEM, for general key encapsulation and encryption), FIPS 204 (ML-DSA, for digital signatures), and FIPS 205 (SLH-DSA, a hash-based digital signature backup). These are the algorithms that Google, Apple, Signal, and Cloudflare are deploying. They are not experimental candidates; they are finalized federal standards produced by an eight-year international evaluation process, and NIST is calling on organizations to begin adoption immediately. NIST selected HQC as an additional backup key encapsulation mechanism in March 2025.
Google is not alone in converging on 2029 as a planning horizon. IBM has staked a public claim to fault-tolerant quantum systems on a similar timeline, and understanding what that claim actually means for error correction efficiency versus raw qubit counts is essential context for enterprise planners. Our analysis of IBM's 2029 fault-tolerance roadmap examines why architecture matters more than headline qubit numbers, and how that distinction should shape your organization's quantum security assumptions.
The NSA's CNSA 2.0 framework sets the mandatory migration schedule for US National Security Systems, and its structure reveals how the government thinks about the phasing problem. All new NSS must support CNSA 2.0 algorithms by January 1, 2027. Software and firmware signing must use quantum-resistant algorithms exclusively by 2030. Traditional networking equipment, such as VPNs and routers, follows the same 2030 deadline. Web services, cloud infrastructure, and operating systems must reach full compliance by 2033. Legacy and niche systems have until 2033 as the final hard cutoff. The overall goal, aligned with National Security Memorandum 10, is that all US national security systems will be fully quantum-resistant by 2035.
The NSA previously targeted 2031 as a milestone for national security systems, while broader US government guidance has consistently pointed to 2035 as the agency-wide endpoint. Google's 2029 deadline sits two to six years ahead of the government's own planning horizons.
That gap likely reflects something real about the difference between a company managing its own infrastructure and a regulatory framework designed for thousands of heterogeneous government systems. Google can set a hard internal deadline because it controls every layer of its stack. The NSA's mandate has to account for decades-old legacy systems in defense agencies, contractor networks, and classified environments with long procurement cycles. This difference in agility does not mean the government thinks the threat is less urgent; it means the transition path is structurally different.
Google's Android 17 implementation illustrates what early adoption looks like in practice. Android Verified Boot is being updated to use ML-DSA, adding quantum-resistant signatures to the boot integrity process. Remote attestation, the mechanism that lets corporate and cloud servers verify device integrity, is transitioning to a PQC-based architecture. Google Play will generate quantum-safe ML-DSA signing keys for new apps and for existing apps that opt in during the Android 17 cycle. These changes target the authentication layer first, reflecting Google's stated priority of securing digital signatures before encryption, because signature compromise enables impersonation of trusted systems before a CRQC can break stored data.
The interpretation that Google's unusual urgency likely reflects its dual position, building quantum hardware and defending against it, is consistent with the evidence. Google is simultaneously the organization most capable of estimating how close the threat is and the organization with the most institutional credibility to set a public example. Whether 2029 is the correct year for every organization is secondary to the question of whether the migration process should have already started.
The conversation about quantum threat timelines often ends with a vague call to "begin planning." This section is about what beginning actually means.
No meaningful migration can begin without knowing what needs to be migrated. A cryptographic inventory catalogs every system, application, protocol, and data store that uses public-key cryptography: TLS endpoints, certificate authorities, API gateways, code signing pipelines, hardware security modules (HSMs), identity and access management systems, and third-party vendor dependencies. This sounds straightforward; in practice, for any organization beyond small-to-medium scale, it is a substantial project requiring specialized tooling.
The Trusted Computing Group found that 91% of businesses lack a formal roadmap for quantum-safe migration; against that baseline, even the more generous industry survey finding of 40% "actively transitioning" likely counts organizations still in the planning and assessment stage. The inventory is what makes risk-based prioritization possible. Systems protecting data that must remain confidential for ten or more years should rise immediately to the top of that prioritized list, because those are the systems already inside the HNDL exposure window.
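The inventory-then-prioritize step can be sketched as a simple classification pass. The asset names, algorithm labels, and ten-year threshold below are illustrative assumptions for this example, not part of any standard:

```python
from dataclasses import dataclass

@dataclass
class CryptoAsset:
    name: str
    algorithm: str              # e.g. "RSA-2048", "ECDSA-P256", "ML-DSA-65"
    confidentiality_years: int  # how long the protected data must stay secret

# Public-key families breakable by Shor's algorithm on a future CRQC.
QUANTUM_VULNERABLE = ("RSA", "ECDSA", "ECDH", "DSA", "DH")

def prioritize(inventory: list[CryptoAsset]) -> list[CryptoAsset]:
    """Quantum-vulnerable assets first, ordered by data lifetime, so systems
    already inside the HNDL exposure window rise to the top of the queue."""
    at_risk = [a for a in inventory
               if a.algorithm.upper().startswith(QUANTUM_VULNERABLE)]
    return sorted(at_risk, key=lambda a: a.confidentiality_years, reverse=True)

queue = prioritize([
    CryptoAsset("patient-records-api", "RSA-2048", 25),
    CryptoAsset("internal-wiki-tls", "ECDSA-P256", 1),
    CryptoAsset("app-signing-service", "ML-DSA-65", 10),  # already migrated
])
print([a.name for a in queue])  # -> ['patient-records-api', 'internal-wiki-tls']
```

A real inventory is built by scanners and certificate-transparency tooling rather than hand-entered records, but the prioritization logic is the same: vulnerable algorithm first, then longest-lived data.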
Crypto-agility is the architectural practice of designing cryptographic choices as externally configurable parameters rather than hard-coded dependencies, so that algorithm upgrades become operational tasks instead of engineering rewrites. An authentication service built in 2026 with a fixed RSA-2048 implementation will require a code overhaul to migrate. The same service built with algorithm selection isolated from application logic can migrate by updating a configuration file.
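In code, the difference is whether the algorithm is a call site or a configuration value. The toy sketch below uses two HMAC constructions as stand-ins for signature schemes purely so it runs with the standard library; a real system would register wrappers around its RSA and ML-DSA implementations under these names, which are themselves invented for this example:

```python
import hashlib
import hmac

# Registry of signing backends keyed by name. The names and HMAC stand-ins
# are illustrative; production entries would wrap real RSA-2048 and
# ML-DSA libraries behind the same interface.
SIGNERS = {
    "rsa-demo":    lambda key, msg: hmac.new(key, msg, hashlib.sha256).digest(),
    "ml-dsa-demo": lambda key, msg: hmac.new(key, msg, hashlib.sha3_256).digest(),
}

def sign(config: dict, key: bytes, msg: bytes) -> bytes:
    # The application never names an algorithm directly: migrating from
    # "rsa-demo" to "ml-dsa-demo" is a config change, not a code change.
    return SIGNERS[config["sig_alg"]](key, msg)

classical = sign({"sig_alg": "rsa-demo"}, b"key", b"payload")
post_quantum = sign({"sig_alg": "ml-dsa-demo"}, b"key", b"payload")
print(classical != post_quantum)  # -> True
```

The hardcoded service described above corresponds to calling one backend inline everywhere; the agile service corresponds to the registry plus a config file.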
This distinction matters beyond the immediate PQC transition. NIST is still actively evaluating additional digital signature algorithms, and further standards updates will follow. Organizations that build crypto-agility into new systems now will be positioned to respond to those updates without treating each one as a replacement project. Current guidance does not yet specify a timeline for when crypto-agility must be demonstrated to regulators, but the trajectory of government mandates makes such a requirement a near-certainty within the CNSA 2.0 compliance window.
Only 14% of organizations have conducted a full assessment of their quantum-vulnerable systems, which means the vast majority have not yet built the foundation on which an agile architecture depends. Starting the inventory is not optional preparatory work; it is the prerequisite for everything else.
Organizations with mature cryptographic infrastructure still need two to three years for a comprehensive PQC migration; those with hardcoded cryptography in legacy applications need four to six years or more. These estimates are front-loaded in the discovery and architecture phases, which means organizations that begin inventory work now will reach production deployment significantly earlier than those who start in 2027. The effort is not linear; the hardest work comes first.
Hybrid deployments, running classical and post-quantum algorithms in parallel, are the standard transition architecture for most organizations. They provide defense-in-depth during the migration window, maintain backward compatibility with systems that have not yet upgraded, and give security teams practical experience with the new algorithms before they become the exclusive standard. Major vendors including Google, Apple, Signal, and Cloudflare have already deployed hybrid PQC across significant portions of their infrastructure.
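Hybrid key exchange typically concatenates the two shared secrets and feeds them through a single key-derivation step, so the session key is safe unless both exchanges are broken. A standard-library sketch of that combining step, using an HKDF-like extract-then-expand; the label string is an arbitrary example value:

```python
import hashlib
import hmac

def hybrid_session_key(classical_ss: bytes, pq_ss: bytes,
                       label: bytes = b"example-hybrid-v1") -> bytes:
    """Combine a classical (e.g. X25519) and a post-quantum (e.g. ML-KEM)
    shared secret. An attacker must recover BOTH inputs to learn the key."""
    # HKDF-style extract-then-expand over the concatenated secrets.
    prk = hmac.new(b"\x00" * 32, classical_ss + pq_ss, hashlib.sha256).digest()
    return hmac.new(prk, label + b"\x01", hashlib.sha256).digest()

k1 = hybrid_session_key(b"classical-secret", b"pq-secret")
k2 = hybrid_session_key(b"classical-secret", b"different-pq-secret")
print(len(k1), k1 != k2)  # -> 32 True
```

This mirrors the general shape of deployed hybrid TLS key exchange, where classical and ML-KEM shared secrets are concatenated into the key schedule; the exact derivation details differ per protocol.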
Fewer than 14% of organizations have completed a comprehensive cryptographic inventory, while the NSA's CNSA 2.0 framework sets its first hard requirements beginning January 2027; organizations that have not started discovery work are already behind the earliest federal compliance threshold. Most organizations will therefore face compliance pressure before they finish even the inventory phase, and the gap between the readiness curve and the compliance calendar widens with each month of delay. Every week without a cryptographic inventory shortens the runway for an orderly migration and raises the odds of a rushed, expensive, error-prone transition when an immovable deadline arrives.
The parallel with Y2K is imperfect but instructive. Y2K remediation succeeded because organizations had a specific, known, unmovable date. Quantum risk has no such date. It is possible that a CRQC will arrive in 2031. It is possible that algorithmic improvements and hardware advances will align in 2029, or in 2033, or in 2038. What is not possible is knowing in advance which of those scenarios is true in time to begin a four-to-six-year migration project after the fact.
Financial services, telecommunications, and government are currently leading on PQC migration, and they are doing so for a structural reason: these sectors hold exactly the data that the HNDL threat model targets most aggressively. Financial transaction records, classified government communications, telecommunications metadata, and healthcare records all carry long confidentiality lifetimes. A financial transaction from 2026 that must remain private for regulatory purposes over the next decade is inside the HNDL window today.
Defense contractors face the strictest deadlines through the NSA's CNSA 2.0 framework, which begins cascading compliance requirements through the defense supply chain starting with new system acquisitions in January 2027. Healthcare organizations face a parallel urgency driven by the sensitivity and longevity of patient data, even if formal regulatory mandates have not yet reached CNSA 2.0's specificity.
For organizations outside these sectors, the prioritization question turns on data lifetime. Any system protecting data that must remain confidential for ten or more years, regardless of industry, is exposed to HNDL risk on the same terms as a defense contractor. The sector label matters less than the answer to: how long does this data need to stay secret?
No, encryption is not already broken. RSA and ECC remain secure against every machine that exists today. Today's most advanced quantum systems operate with hundreds to thousands of noisy physical qubits, while breaking RSA-2048 even under the reduced assumptions in Gidney's 2025 paper still requires on the order of a million qubits operating at error rates and cycle times that no current system achieves.
Google's warning is not a disclosure that encryption has been broken. It is a migration planning deadline. The distinction is precise: a company that wants to be protected before a threat materializes must begin migration years in advance, because migration takes years. Setting a 2029 target does not mean the threat arrives in 2029. It means the migration needs to be complete by the time the threat plausibly could.
What the research trajectory does establish is that the resource requirements for breaking encryption keep declining, and they decline through algorithmic papers rather than hardware milestones. That trajectory makes the margin between today's hardware and tomorrow's capability harder to quantify reliably, which is exactly why Google explicitly declined to wait for certainty before setting an internal deadline.
Crypto-agility is the architectural practice of isolating cryptographic choices from the business logic that surrounds them, so that algorithms can be changed without rewriting applications. A system is crypto-agile if updating its signature scheme from RSA to ML-DSA requires changing a configuration parameter rather than auditing and rewriting application code.
The reason security architects emphasize this now is that no organization can be certain the standards landscape has stabilized. NIST is still evaluating additional digital signature algorithms beyond FIPS 203, 204, and 205. New vulnerabilities in any of the approved algorithms would trigger replacement cycles. Organizations that have built systems with crypto-agility treat those updates as operational tasks; organizations with hardcoded cryptography treat them as engineering projects that compete with every other development priority.
Crypto-agility is also the foundation of the hybrid deployment approach that makes the transition practical. Running classical and post-quantum algorithms in parallel requires systems that can negotiate which algorithm to use with a counterparty, which requires those systems to have been designed with algorithm selection as a first-class operational parameter. Building that flexibility in is far cheaper during initial development than retrofitting it during migration under compliance deadline pressure.
NIST's post-quantum standardization process ran for eight years beginning in 2016 and received 69 initial submissions from cryptographers worldwide. The three primary finalized standards, FIPS 203, 204, and 205, represent the algorithms that survived multiple rounds of public cryptanalysis. They are not the only output.
NIST selected HQC as an additional backup key encapsulation mechanism in March 2025. HQC is based on different mathematics than ML-KEM, making it a hedge against the possibility that lattice-based cryptography develops an unexpected vulnerability. NIST's project head Dustin Moody stated at the time that organizations should continue migrating to the 2024 standards and that HQC exists as insurance, not as a replacement candidate. A fourth algorithm, FALCON, to be published as FIPS 206 under the name FN-DSA, was selected for standardization and covers digital signatures using a different lattice structure than ML-DSA.
NIST is also evaluating a longer list of additional digital signature schemes submitted after the initial process. The overall architecture of the post-quantum standards landscape is designed with diversity in mind: multiple mathematical approaches provide resilience if cryptanalysts find weaknesses in any single family of algorithms. For most organizations, the actionable guidance is to begin implementing FIPS 203 and 204, the primary general-encryption and digital-signature standards, while designing systems that can accommodate algorithm updates without rearchitecting from scratch.