The Post-Quantum Reckoning Has Already Begun—And Your Adversary Knows It

The cryptographic lifespan of your organisation's most sensitive data was decided in 2022, when NIST selected the first post-quantum cryptography (PQC) algorithms for standardisation (finalised as FIPS 203, 204, and 205 in August 2024), and that decision has a shelf-life measured in months, not years. Yet the migration timeline that matters—not the comfortable 2030–2035 window published by government agencies and consultancies—is the one your adversaries are already executing: collect encrypted data today, break it with quantum capability tomorrow. This is called "harvest now, decrypt later", and it is not a theoretical exercise; it is an active threat against organisations holding sensitive intellectual property, healthcare records, financial instruments, and state secrets. Your migration window closed quietly while you were still running penetration tests.

The Industry Narrative: Measured Transition and Regulatory Handwaving

The orthodox industry position, articulated by bodies including NIST (in its PQC transition guidance, draft IR 8547) and CISA (in its PQC roadmap), and reinforced by the usual suspects in the big-four consulting ecosystem and legacy security vendors, frames quantum migration as a mid-to-long-term hardening exercise. The timeline is blandly reassuring: organisations have until 2030–2035 to transition critical infrastructure and high-value cryptographic material. The rationale is mathematically sound in isolation—running Shor's algorithm against RSA-2048 or ECC P-256 requires millions of physical qubits (thousands of error-corrected logical qubits), which do not yet exist in production form. IBM's largest announced system operates around 1,000 physical qubits and remains prone to decoherence. D-Wave operates in a different paradigm (quantum annealing) entirely, one with no bearing on Shor's algorithm. Near-term practical threats from "Q-day" feel speculative, almost comfortable.

But this framing ignores the evidence already in the open literature. Gidney and Ekerå's widely cited analysis estimated that a "cryptographically relevant quantum computer" (CRQC) could factor RSA-2048 in a matter of hours using roughly 20 million noisy physical qubits—a resource level that extrapolation of current hardware roadmaps (IBM, Google, IonQ, Atom Computing) could plausibly reach within a decade or two, possibly sooner under military or state funding; 256-bit elliptic-curve keys fall to comparable resources. NIST's transition guidance and concurrent guidance from the NSA and CISA explicitly identify "store now, decrypt later" as an active adversary posture; the NSA's Commercial National Security Algorithm Suite 2.0 (CNSA 2.0) already mandates quantum-resistant algorithms across national security systems by 2035 at the latest, with earlier exclusivity deadlines in the 2030–2033 range for categories such as software and firmware signing and networking equipment.

The regulatory picture is tightening. The European Union's NIS2 Directive (now transposed into national law for most in-scope sectors) and the Digital Operational Resilience Act (DORA) for financial services do not yet explicitly mandate PQC, but their risk-management provisions require organisations to maintain a documented cryptographic inventory and justify cryptographic choices against emerging threats. The SEC's cyber disclosure rule (effective December 2023, requiring disclosure within four business days of a materiality determination) and Australia's APRA CPS 234 similarly demand that material security incidents be disclosed; a breach of cryptographic material—or proof that legacy encryption has been compromised via quantum techniques—would trigger disclosure within days, not quarters. The Financial Conduct Authority's SM&CR regime and Singapore's MAS Technology Risk Management Guidelines extend liability to senior management for inadequate cryptographic governance.

Yet organisations remain unprepared. A 2024 Accenture survey found that 73% of Fortune 500 organisations had not yet initiated cryptographic-agility testing, and only 12% had a documented PQC migration roadmap. Most critically, in the same period, the 2024 Snowflake customer-tenant breaches—in which infostealer-harvested service-account credentials, unprotected by MFA, exposed data across roughly 165 downstream customers—revealed that even sophisticated organisations lack visibility into which customer data has been encrypted with legacy cryptography and which remains at risk of harvest-now collection. The incident exposed the architectural fallacy: encryption is not a feature of one layer; it is a system-wide assumption. When that assumption breaks, the entire data lifecycle becomes suspect.

The Structural Failure: Cryptographic Governance by Assumption, Not Architecture

The current industry approach treats PQC migration as a compliance project, typically delegated to AppSec or infrastructure teams as an inventory-and-remediate exercise. Organisations identify systems negotiating TLS with RSA certificates or classical elliptic-curve key exchange, schedule vendor communication, wait for PQC-capable TLS libraries, conduct limited testing in dev/staging, and stage rollout to production—the standard change-management theatre. This is cryptographic governance by assumption: assume that encryption is in place, assume vendors will provide compliant libraries, assume no adversary is collecting ciphertext today, assume migration can happen without operational disruption.

The PULSE reading of this failure is architectural, not procedural. The problem is not that migration is slow; it is that the organisation has built no structural visibility, control, or agility into its cryptographic substrate. Encryption is treated as a property of the transport layer (TLS) or the storage layer (at-rest encryption managed by database vendors or third-party key management services), not as a continuous, adaptive system-wide control. This design creates several distinct failure modes under harvest-now threat:

First: Unknown ciphertext inventories. Most organisations cannot answer the question "what proportion of our customer data at rest has been encrypted with post-2025-compliant algorithms?" Because encryption is delegated to third-party vendors (cloud storage providers, database platforms, hardware security modules), governance is fragmented. A data lake in AWS S3 might be encrypted with KMS, but the KMS key derivation, rotation schedule, and algorithm version are controlled by the cloud provider's default policies, not the organisation's cryptographic strategy. Lateral movement in identity (as with the Scattered Spider attack on M&S in 2025, where compromised credentials exposed cryptographic key material stored in privileged identity systems) immediately breaks this assumption.
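The inventory question can be made concrete with a simple coverage metric over whatever object metadata the storage and KMS layers expose. The sketch below is illustrative only: the algorithm labels, object paths, and `ObjectMeta` fields are hypothetical, and a real scan would pull this metadata from the cloud provider's or HSM vendor's APIs.

```python
from dataclasses import dataclass

# Families standardised by NIST in 2024 (FIPS 203/204/205); labels are illustrative.
PQC_APPROVED = {"ML-KEM-768", "ML-KEM-1024", "ML-DSA-65", "SLH-DSA-SHA2-128s"}

@dataclass
class ObjectMeta:
    path: str
    algorithm: str   # as reported by the storage layer or KMS (hypothetical labels)
    key_owner: str   # "self" or the vendor controlling key derivation and rotation

def pqc_coverage(inventory):
    """Proportion of objects whose reported algorithm is PQC-approved."""
    if not inventory:
        return 0.0
    return sum(o.algorithm in PQC_APPROVED for o in inventory) / len(inventory)

inventory = [
    ObjectMeta("s3://lake/customers.parquet", "AES-256-GCM+RSA-2048-wrap", "vendor"),
    ObjectMeta("s3://lake/genomes.bam", "ML-KEM-768", "self"),
]
print(f"PQC coverage: {pqc_coverage(inventory):.0%}")  # -> PQC coverage: 50%
```

The `key_owner` field matters as much as the algorithm: an object wrapped by a vendor-controlled key cannot be re-keyed on the organisation's own schedule.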

Second: No selective re-encryption architecture. Once harvest-now attacks are acknowledged as a material risk, the remediation strategy becomes "re-encrypt everything with post-quantum algorithms". But this creates operational chaos. A large financial institution with petabytes of historical transaction records cannot re-encrypt everything simultaneously without major downtime, and incremental re-encryption (new data flows under PQC, old data remains under RSA-wrapped keys) creates a two-tier cryptographic estate that persists for years. Standard re-encryption workflows (read with old key, decrypt, re-encrypt with new key, write) create transient plaintext windows that are themselves attack surface. Most organisations have no streaming re-encryption pipeline, no dual-algorithm decryption capability, and no staged re-keying system.
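One way to shrink both the downtime and the plaintext window is to re-key the envelope rather than the payload: AES-256 bulk ciphertext is weakened only quadratically by Grover's algorithm, so the urgent harvest-now exposure is usually the classical key wrap, and rewrapping the data-encryption key (DEK) under a post-quantum KEM leaves the bulk ciphertext untouched. A minimal sketch, with a deliberately toy XOR/HMAC wrap standing in for real KEM-based key wrapping (never use this construction for actual encryption):

```python
import hashlib
import hmac
import secrets

def _keystream(key: bytes, n: int) -> bytes:
    """Toy keystream (HMAC-SHA256 in counter mode); a stand-in, not a real cipher."""
    out, ctr = b"", 0
    while len(out) < n:
        out += hmac.new(key, ctr.to_bytes(8, "big"), hashlib.sha256).digest()
        ctr += 1
    return out[:n]

def wrap(kek: bytes, dek: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(dek, _keystream(kek, len(dek))))

unwrap = wrap  # XOR is its own inverse in this toy construction

def rewrap_record(record: dict, keks: dict, new_kek_id: str) -> dict:
    """Re-key without touching the bulk ciphertext: no plaintext data window."""
    dek = unwrap(keks[record["kek_id"]], record["wrapped_dek"])
    return {**record,
            "kek_id": new_kek_id,
            "wrapped_dek": wrap(keks[new_kek_id], dek)}

keks = {"rsa-2048-v1": secrets.token_bytes(32), "ml-kem-768-v1": secrets.token_bytes(32)}
dek = secrets.token_bytes(32)
rec = {"kek_id": "rsa-2048-v1",
       "wrapped_dek": wrap(keks["rsa-2048-v1"], dek),
       "ciphertext": b"...bulk AES-256-GCM payload, never decrypted here..."}
rec2 = rewrap_record(rec, keks, "ml-kem-768-v1")
assert unwrap(keks["ml-kem-768-v1"], rec2["wrapped_dek"]) == dek
```

Only the 32-byte DEK ever transits the re-keying engine, which is why this pattern can run as a streaming background job rather than a bulk rewrite.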

Third: Transport-layer PQC does not protect archived plaintext or key material. The Optus 2022 breach exposed 9.8 million customer records through an unauthenticated API endpoint followed by bulk exfiltration of plaintext data; even if Optus had implemented post-quantum TLS, the breach was a failure of access control and data-plane segmentation, not encryption agility. More recently, the Change Healthcare 2024 ransomware attack (where ALPHV/BlackCat actors extracted a reported $22 million payment despite claims of "encrypted" backups) revealed that organisations often store decryption keys adjacent to encrypted data—a design that assumes the encryption is the perimeter, when the real attack surface is privilege and lateral movement. Cryptographic isolation is an afterthought, not a substrate.

The PULSE Doctrine: Zero-Knowledge Substrate and Cryptographic Drift

The PULSE approach to post-quantum readiness reframes the entire problem. Rather than treating PQC migration as a time-boxed remediation, we embed cryptographic agility into the foundational data architecture itself. This rests on three non-negotiable principles:

Zero-knowledge substrate. Data should never exist in a state where the organisation holding it can decrypt it without explicit, timestamped authorisation.

This is not zero-knowledge proofs in the formal sense (though ZKPs can be a component). It is a design in which the organisation's primary asset—the data—is cryptographically opaque to the organisation itself. If an adversary compromises the data store, they get ciphertext. If they compromise the key management system, they get key material that does not correspond to data they can read. Cryptographic authority is separated from data custody.
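Separating cryptographic authority from data custody can be sketched as two services that never hold both halves: a key authority that releases key material only against an explicit, timestamped grant, while the data store (not shown) holds only ciphertext. All names are hypothetical, and the HMAC grant below is a simplified stand-in for a real signed authorisation token:

```python
import hashlib
import hmac
import secrets
import time

class KeyAuthority:
    """Holds key material and issues grants; never sees ciphertext."""
    def __init__(self):
        self._keys = {}
        self._mac_key = secrets.token_bytes(32)

    def register(self, key_id: str, key: bytes) -> None:
        self._keys[key_id] = key

    def authorise(self, principal: str, key_id: str):
        """Explicit, timestamped authorisation for one key release."""
        msg = f"{principal}|{key_id}|{int(time.time())}".encode()
        return msg, hmac.new(self._mac_key, msg, hashlib.sha256).digest()

    def release(self, key_id: str, grant) -> bytes:
        msg, tag = grant
        expected = hmac.new(self._mac_key, msg, hashlib.sha256).digest()
        if not hmac.compare_digest(tag, expected) or msg.split(b"|")[1].decode() != key_id:
            raise PermissionError("invalid or mismatched authorisation")
        return self._keys[key_id]

authority = KeyAuthority()
authority.register("dek-genomes", secrets.token_bytes(32))
grant = authority.authorise("svc-analytics", "dek-genomes")
key = authority.release("dek-genomes", grant)       # succeeds with a valid grant

forged_rejected = False
try:
    authority.release("dek-genomes", (grant[0], b"\x00" * 32))  # forged tag
except PermissionError:
    forged_rejected = True
assert forged_rejected
```

Compromising the data store alone yields ciphertext; compromising the authority alone yields keys without the data they unlock.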

Cryptographic drift and continuous algorithm migration. Rather than a single "Q-day migration," the architecture implements continuous cryptographic drift—a background process that monitors the threat assessment against the cryptographic inventory, selects appropriate post-quantum candidates (from the NIST PQC standards or emerging schemes), and re-encrypts data asynchronously, without synchronous locks or downtime. This requires exactly the capabilities most organisations lack today: a streaming re-encryption pipeline, dual-algorithm decryption, and staged re-keying.
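Under assumed inputs (a per-object algorithm family, a sensitivity lifetime, and a working estimate of CRQC arrival), the drift scheduler reduces to a priority queue over the inventory. A hypothetical sketch, with invented field names and example numbers:

```python
import heapq

def drift_queue(inventory, crqc_year_est, now_year=2025):
    """Order objects for asynchronous re-encryption by residual exposure.

    exposure = years the data stays sensitive beyond the estimated CRQC arrival;
    objects already on PQC, or whose sensitivity expires first, are skipped.
    """
    heap = []
    for obj in inventory:
        if obj["algorithm_family"] == "pqc":
            continue  # already migrated; nothing to drift
        exposure = (now_year + obj["sensitivity_years"]) - crqc_year_est
        if exposure > 0:
            heapq.heappush(heap, (-exposure, obj["path"]))  # max-exposure first
    return [heapq.heappop(heap)[1] for _ in range(len(heap))]

inventory = [
    {"path": "genomes", "algorithm_family": "classical", "sensitivity_years": 50},
    {"path": "tx-logs", "algorithm_family": "classical", "sensitivity_years": 7},
    {"path": "vault",   "algorithm_family": "pqc",       "sensitivity_years": 30},
]
print(drift_queue(inventory, crqc_year_est=2035))  # -> ['genomes']
```

Because the queue is recomputed as threat estimates move, pulling `crqc_year_est` earlier automatically promotes more of the estate into the migration backlog.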

Adaptive post-quantum threat assessment. The migration timeline is not fixed; it is driven by adversary posture and algorithm maturity. The architecture consumes ongoing intelligence on quantum hardware progress and on cryptanalysis of the PQC candidates themselves, and accelerates, re-prioritises, or re-targets migration accordingly.

Implementation Architecture: Data-Plane Isolation and Cryptographic Primitives

In practice, the PULSE approach to post-quantum readiness looks like this:

Cryptographic isolation by data classification. Every data object is assigned a residual risk profile (RRP): the probability that an adversary currently harvesting its ciphertext will possess quantum capability before the data's sensitivity expires. Genomic data, trade secrets, and long-dated financial derivatives have high RRP; short-lived transaction logs and marketing analytics have low RRP. High-RRP data is migrated first, to post-quantum or hybrid classical-plus-PQC algorithms, so that a weakness in either algorithm family does not expose the data. This is not a global migration; it is a targeted, asynchronous process.
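RRP can be formalised with Mosca's inequality: if the data's security shelf-life x plus the migration time y exceeds the estimated time z until a CRQC exists, ciphertext harvested today is already at risk. A small illustrative calculator, with invented example numbers:

```python
def residual_risk(shelf_life_yrs, migration_yrs, crqc_horizon_yrs):
    """Mosca's inequality: harvested ciphertext is already exposed when
    shelf-life x plus migration time y exceeds the CRQC horizon z."""
    slack = crqc_horizon_yrs - (shelf_life_yrs + migration_yrs)
    return {"exposed": slack < 0, "slack_years": slack}

# Genomic archive: 50-year sensitivity, 5-year migration, CRQC assumed ~15 years out
print(residual_risk(50, 5, 15))  # -> {'exposed': True, 'slack_years': -40}

# Marketing analytics: 2-year sensitivity under the same assumptions
print(residual_risk(2, 5, 15))   # -> {'exposed': False, 'slack_years': 8}
```

The asymmetry is the point: for long-lived data, no plausible CRQC horizon rescues a migration that has not yet started.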

Functional encryption and searchable-encryption primitives for queries. Instead of storing plaintext indexes, the architecture uses order-preserving encryption (OPE) or deterministic authenticated encryption (DAE) for fields that require sorting or equality lookups, and functional encryption (FE) candidates for aggregation. The trade-offs must be stated honestly: OPE and deterministic schemes deliberately leak ordering and equality respectively, so they are reserved for lower-sensitivity fields, while the FE constructions of interest are lattice-based and therefore plausibly quantum-resistant. Together they enable queries on ciphertext without bulk decryption.
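Where only equality lookups are needed, a common lower-leakage alternative is an HMAC-based blind index: a keyed hash stored alongside the ciphertext that reveals equality of values and nothing else. A stdlib sketch, with key handling simplified for illustration:

```python
import hashlib
import hmac
import secrets

# Held by the query layer, never by the data store (simplified for the sketch).
INDEX_KEY = secrets.token_bytes(32)

def blind_index(value: str) -> str:
    """Keyed hash enabling equality lookups on an encrypted field.

    An attacker holding the rows but not INDEX_KEY learns only which rows
    share a value, not the values themselves or their ordering.
    """
    normalised = value.strip().lower()
    return hmac.new(INDEX_KEY, normalised.encode(), hashlib.sha256).hexdigest()[:16]

rows = [
    {"email_bidx": blind_index("alice@example.com"), "ciphertext": b"...AEAD blob..."},
    {"email_bidx": blind_index("bob@example.com"),   "ciphertext": b"...AEAD blob..."},
]
matches = [r for r in rows if r["email_bidx"] == blind_index("Alice@Example.com")]
assert len(matches) == 1
```

Range queries and aggregation still need OPE- or FE-class primitives; the blind index only removes the equality-lookup excuse for keeping plaintext indexes.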

Hardware security module (HSM) and TEE integration with cryptographic agility. Instead of storing all keys in a single KMS, the architecture distributes key material across HSMs (which can be updated to PQC-capable hardware) and trusted execution environments (Intel SGX, ARM TrustZone) that run adaptive cryptographic libraries. The TEE acts as a cryptographic engine that selects the appropriate algorithm based on the data's migration status.

Audit and compliance as a cryptographic substrate. Instead of logging access and then auditing logs (which are themselves subject to tampering), the architecture uses cryptographic accumulators and forward-secure signatures to bind every data access to an immutable cryptographic proof of authorisation. This makes PQC compliance auditable at the cipher level, not the process level.
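A minimal form of this idea is a hash-chained, MAC'd audit log, in which each entry commits to its predecessor so any retroactive edit breaks verification. Real deployments would use forward-secure signatures rather than the single symmetric key this sketch simplifies to:

```python
import hashlib
import hmac
import json
import secrets

class ChainedAuditLog:
    """Append-only log where each entry commits to the previous one.

    Tampering with any past entry invalidates every later link, making the
    log tamper-evident rather than merely access-controlled.
    """
    def __init__(self, mac_key: bytes):
        self._key = mac_key
        self._head = b"\x00" * 32
        self.entries = []

    def append(self, event: dict) -> None:
        payload = json.dumps(event, sort_keys=True).encode()
        link = hmac.new(self._key, self._head + payload, hashlib.sha256).digest()
        self.entries.append((payload, link))
        self._head = link

    def verify(self) -> bool:
        head = b"\x00" * 32
        for payload, link in self.entries:
            expected = hmac.new(self._key, head + payload, hashlib.sha256).digest()
            if not hmac.compare_digest(link, expected):
                return False
            head = link
        return True

log = ChainedAuditLog(secrets.token_bytes(32))
log.append({"actor": "svc-etl", "object": "genomes", "op": "decrypt"})
log.append({"actor": "svc-bi",  "object": "tx-logs", "op": "read"})
assert log.verify()

log.entries[0] = (b'{"actor": "nobody"}', log.entries[0][1])  # retroactive edit
assert not log.verify()
```

Cryptographic accumulators extend the same property to efficient membership proofs, so an auditor can check a single access without replaying the whole chain.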

The Regulatory and Operational Imperative

The 2035 deadline published by NIST and the NSA is not a migration target; it is a regulatory cliff. By 2033, federal and critical infrastructure procurement will mandate CNSA 2.0 compliance, which means organisations that supply to government or critical infrastructure—defence contractors, energy operators, financial infrastructure providers—will lose contracts if they cannot demonstrate PQC readiness. The SEC's 4-day rule and APRA CPS 234 mean that a breach involving compromised RSA keys will trigger material disclosure; the financial and reputational cost is no longer deferrable.

Yet the critical error in current planning is assuming that "compliance by 2035" means "you can start real work in 2030." Organisations that begin PQC migration in 2030 will find that vendors have not released stable, production-grade PQC libraries; that their legacy infrastructure cannot be upgraded without major replatforming; and that they lack operational experience with dual-algorithm decryption, cryptographic drift, and post-quantum audit. The organisations that will navigate this transition successfully are those that have already embedded cryptographic agility into their substrate—that treat post-quantum readiness not as a project, but as a continuous, architectural property.

The Snowflake breaches of 2024 and the Change Healthcare attack demonstrated that when cryptographic assumptions fail, they fail completely. The organisations that will survive the cryptographic transition are those where cryptography is not an assumption; it is a visible, controllable, auditable layer of the data architecture itself.

For organisations operating in regulated sectors, holding sensitive data over long timescales, or supplying to critical infrastructure, the question is no longer whether to migrate to post-quantum cryptography—it is whether your current architecture can support continuous cryptographic drift without operational disruption. If the answer is no, the work begins now.

Qualified operators holding responsibility for cryptographic governance in regulated or critical infrastructure sectors are invited to request a technical briefing under executed NDA.

Engagement

Request a briefing under executed Mutual NDA.

PULSE engages only with verified counterparties. Strategic briefing material — reference architecture, regulatory mapping, deployment topology — is released after counter-execution of the NDA scoped to the recipient's evaluation purpose.

Request Briefing →
