Post-Quantum Cryptography (PQC): Current Risk
Published on October 28, 2025
In previous articles, we explored AI as a tactical weapon that is already reshaping attacks and Zero Trust as a strategic model for modern defence. Now, let's address a threat that does not aim to infiltrate our infrastructure, but to invalidate the cryptographic foundations on which it is built.
Analysts such as Gartner estimate that standard asymmetric encryption may not be secure by 2029 and could be completely broken by 2034. However, fixating on a specific date diverts attention from two much more pressing issues. Let's take a look at what they are:
- Harvest Now, Decrypt Later (HNDL): The risk is not future; it is present. Adversaries are already intercepting and storing huge volumes of encrypted data. They cannot read it today, but they are patiently storing it, waiting for a quantum computer (or another breakthrough) capable of decrypting it. If data (an industrial secret, a health record, a government document) must remain confidential for 10 years, it is already compromised today.
- The Complexity of Migration: The transition to Post-Quantum Cryptography (PQC) is not a “patch day”. It is perhaps the most complex infrastructural migration ever undertaken, a colossal undertaking involving the entire global IT infrastructure.
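The HNDL risk can be framed with Mosca's inequality: if the number of years data must stay secret, plus the number of years the migration will take, exceeds the number of years until a cryptographically relevant quantum computer (CRQC) arrives, that data is already at risk. A minimal sketch, with the three time horizons as illustrative inputs:

```python
def hndl_at_risk(shelf_life_years: float,
                 migration_years: float,
                 years_to_crqc: float) -> bool:
    """Mosca's inequality: data is at risk if the time it must stay
    secret plus the time needed to migrate exceeds the time until a
    cryptographically relevant quantum computer exists."""
    return shelf_life_years + migration_years > years_to_crqc

# Health records that must stay confidential for 10 years, a 5-year
# migration, and a CRQC assumed in 9 years (the 2034 horizon):
print(hndl_at_risk(10, 5, 9))  # True: effectively compromised today
```

The inputs are estimates, of course, but the exercise forces the right question: the clock starts from the data's required shelf life, not from the quantum computer's arrival.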
The Real Challenge: “Hybrid” Architecture
The National Institute of Standards and Technology (NIST) has already standardized its first set of PQC algorithms, such as CRYSTALS-Kyber (now ML-KEM) and CRYSTALS-Dilithium (now ML-DSA). The point, however, is not to “switch off” RSA and “switch on” PQC.
For at least a decade, we will live in a “hybrid” cryptographic world. Our systems will have to manage an uncomfortable coexistence:
- Interoperability: systems upgraded to PQC will still need to communicate with billions of legacy devices (IoT, OT, embedded systems) that cannot be updated.
- Performance: The new PQC algorithms have different characteristics. Larger keys and heavier digital signatures can introduce latency. This is irrelevant for an email, but it is a huge problem for high-frequency payment systems or low-latency communications.
- Key Management: The complexity of Key Management Systems (KMS) will explode, as they will have to manage the lifecycles of classical and quantum keys in parallel.
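In practice, the hybrid coexistence described above is usually handled by hybrid key exchange: the session key is derived from both a classical shared secret and a post-quantum one, so the connection stays safe as long as either algorithm holds. A minimal sketch of the combiner step, using placeholder byte strings where a real deployment would use X25519 and ML-KEM outputs:

```python
import hashlib
import hmac

def combine_shared_secrets(classical_ss: bytes, pqc_ss: bytes,
                           context: bytes = b"hybrid-kex-v1") -> bytes:
    """Derive one session key from both shared secrets (an
    HKDF-Extract-style step): an attacker must break BOTH the
    classical and the post-quantum algorithm to recover it."""
    return hmac.new(context, classical_ss + pqc_ss,
                    hashlib.sha256).digest()

# Placeholder secrets standing in for X25519 and ML-KEM outputs:
session_key = combine_shared_secrets(b"\x01" * 32, b"\x02" * 32)
print(len(session_key))  # 32
```

The `context` label and the exact concatenation order are illustrative; real protocols (e.g., the hybrid groups proposed for TLS 1.3) fix these details precisely.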
PQC as a Prerequisite for Zero Trust
As we have seen, the Zero Trust model is based on identity and continuous authentication. But what happens when the encryption that proves that identity (digital certificates, signatures) is no longer reliable? The entire Zero Trust architecture collapses if the mathematics on which it is based fails.
PQC is not a separate technological “silo.” It is the key prerequisite for ensuring that identity, access, and segmentation still have meaning in 2030. Quantum resilience is the natural evolution of “crypto-agility”: the ability to change cryptographic algorithms without having to redesign the entire architecture.
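Crypto-agility is, at heart, a design pattern: the algorithm becomes a configuration choice behind a stable interface instead of a hard-coded call. A minimal sketch with stub signers (the names and the `register` helper are illustrative, not a real library API):

```python
from typing import Callable, Dict

# Registry mapping algorithm names to signing callables, so that
# swapping algorithms is a configuration change, not a redesign.
SIGNERS: Dict[str, Callable[[bytes], bytes]] = {}

def register(name: str):
    def wrap(fn):
        SIGNERS[name] = fn
        return fn
    return wrap

@register("rsa-stub")
def sign_rsa(msg: bytes) -> bytes:      # placeholder classical signer
    return b"RSA:" + msg

@register("ml-dsa-stub")
def sign_ml_dsa(msg: bytes) -> bytes:   # placeholder PQC signer
    return b"MLDSA:" + msg

def sign(msg: bytes, algorithm: str = "rsa-stub") -> bytes:
    return SIGNERS[algorithm](msg)

# Moving to PQC becomes a one-line configuration change:
print(sign(b"payload", algorithm="ml-dsa-stub"))  # b'MLDSA:payload'
```

Architectures built this way can follow NIST's standards as they evolve, rather than being welded to one primitive.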
From Inventory to Action: A Pragmatic Approach
The question is no longer whether to migrate, but how to orchestrate this transition without disrupting business. The NSA roadmap (CNSA 2.0) sends a clear signal, recommending the start of adoption of quantum-resistant algorithms by 2025.
- Build the Cryptographic Inventory. We cannot protect what we cannot see. The first real hurdle is mapping every single library, certificate, and hard-coded protocol in the infrastructure.
- Quantify the Risk. Management does not react to “Shor's algorithm”; it reacts to risk. We need to map the data and then ask ourselves: “What is the value of this data if it is made public in 7 years?” This shifts the debate from the IT budget to the business continuity plan.
- Test the Impact. Architectures must start testing PQC algorithms now, in hybrid mode. Not (only) for security, but for performance. Will your VPN handle the overhead? Will your mobile apps experience delays?
- Check the Supply Chain. It is essential to ask your cloud, software, and hardware suppliers about their PQC roadmap.
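A cryptographic inventory usually starts crudely: scanning configuration files and source trees for mentions of quantum-vulnerable primitives, before moving on to specialized discovery tools. A first-pass sketch using only the standard library (the file extensions and the regex are illustrative choices, not a complete detection rule set):

```python
import re
from pathlib import Path

# Illustrative pattern flagging quantum-vulnerable asymmetric primitives.
VULNERABLE = re.compile(r"\b(RSA|ECDSA|ECDH|DSA|DH)\b", re.IGNORECASE)

def scan_tree(root: str, exts=(".conf", ".py", ".pem", ".yaml")):
    """First-pass cryptographic inventory: grep a directory tree for
    mentions of classical asymmetric algorithms."""
    findings = []
    for path in Path(root).rglob("*"):
        if path.suffix in exts and path.is_file():
            try:
                text = path.read_text(errors="ignore")
            except OSError:
                continue  # unreadable file: skip rather than abort
            for match in VULNERABLE.finditer(text):
                findings.append((str(path), match.group(0)))
    return findings

for path, algo in scan_tree("."):
    print(f"{path}: uses {algo}")
```

Text matching obviously misses compiled binaries and hardware, which is exactly why the inventory is the hardest step, not the easiest.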
The quantum transition has begun. It is not a future event; it is an engineering process that is already behind schedule.
The path to post-quantum security: where are we now?
The transition process towards post-quantum cryptography (PQC) is not only a technological issue, but also an organizational and strategic one. Every company is currently at a different point along this path, which we can ideally divide into three levels of maturity: Aware, Defined, and Managed.
Level 1 – “Aware” (Awareness)
In the first stage, the organization becomes aware of the threat posed by quantum computers but has not yet taken any concrete action. This is the stage in which Top Management and Risk Management are formally informed about the PQC risk, including the issue of Harvest Now, Decrypt Later (HNDL), i.e., the possibility that data encrypted today could be collected and decrypted in the future thanks to quantum computing capabilities.
During this initial phase, internal discussions often focus on fundamental concepts such as the “useful life” of data, that is, how long certain information must remain confidential. At the same time, the organization begins to monitor announcements from NIST and the roadmaps of its key suppliers—such as cloud providers, operating system manufacturers, or firewall manufacturers—to understand the direction in which the market is moving.
Level 2 – “Defined” (Inventory and Planning)
Once past simple awareness, a more operational phase begins. The organization starts to map what already exists and to plan the next steps. This involves starting a cryptographic inventory aimed at identifying where vulnerable algorithms such as RSA or ECC are used, in applications as well as in internal systems and libraries.
Furthermore, critical systems and data at higher risk are identified, i.e., those that require protection for a long period of time. Based on this analysis, a roadmap for migration to post-quantum solutions is defined, even if only in preliminary form. At the same time, crypto-agility—the ability to easily replace one cryptographic algorithm with another—becomes a fundamental feature in new contracts and in the required specifications for suppliers.
Level 3 – “Managed” (Test and Integration)
Finally, in the “Managed” phase, the organization moves on to the active testing of PQC technologies. In controlled environments, post-quantum algorithms—such as ML-KEM—are tested to assess their performance impact in terms of latency, CPU usage, and bandwidth on critical applications. Hybrid-mode interoperability is also verified, combining classical and post-quantum algorithms in popular protocols such as TLS 1.3, while DevSecOps pipelines begin to include automated testing for new cryptographic libraries.
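The bandwidth side of that performance impact can be estimated directly from published parameter sizes: FIPS 203 specifies a 1,184-byte encapsulation key and a 1,088-byte ciphertext for ML-KEM-768, versus 32-byte X25519 key shares. A back-of-the-envelope sketch for the key-exchange material in one hybrid handshake:

```python
# Public parameter sizes in bytes (FIPS 203 for ML-KEM-768,
# RFC 7748 for X25519).
X25519_SHARE = 32      # classical key share, one per direction
MLKEM768_EK = 1184     # ML-KEM-768 encapsulation key (client -> server)
MLKEM768_CT = 1088     # ML-KEM-768 ciphertext (server -> client)

classical = 2 * X25519_SHARE
hybrid = classical + MLKEM768_EK + MLKEM768_CT

print(f"classical handshake key material: {classical} B")   # 64 B
print(f"hybrid handshake key material:    {hybrid} B")      # 2336 B
print(f"overhead: +{hybrid - classical} B per handshake")   # +2272 B
```

A couple of extra kilobytes is negligible for a browser session, but multiplied across millions of short-lived connections, or squeezed through constrained IoT links, it is precisely the kind of figure these tests need to surface.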
At which level is your organization today? And above all, what do you think are the main obstacles slowing down the path to post-quantum security? Is it a question of technology, budget, or legacy complexity?
Raffaele Sarno
Head Pre-Sale Manager, NEVERHACK Security Operation Department, Italy

