[Recommend my two-volume book for more reading]: BIT & COIN: Merging Digitality and Physicality
The specter of quantum computing looms large in the public imagination, heralded as both a technological wonder and a cryptographic doomsday device. For over a decade, breathless headlines have promised that quantum computers will soon shatter the encryption that safeguards our digital world, from cryptocurrencies to national secrets.
This narrative is hyperbolic and misleading. It conflates theoretical breakthroughs with practical realities. In the particular case of Bitcoin blockchain security, the narrative is simply wrong, not only in practice but even in theory.
This essay, structured as a series of critical questions and answers, aims to dismantle the myth of an impending “cryptopocalypse” with clarity, rigor, and a touch of assertive pragmatism. Drawing on the latest scientific insights and engineering realities, we will navigate the chasm between quantum hype and truth.
Bitcoin blockchain is not based on encryption.
First of all, Bitcoin’s security model is based on hashing and digital signatures, not encryption. Even if a future computer were to break ECDSA, the signature scheme Bitcoin uses, it would not threaten the most essential part of the Bitcoin blockchain, which does not rely on encryption. (Note: the digital signature used in Bitcoin verifies possession of the key for an address; it is not a legal signature verifying a person’s identity, which is a very different concept.)
A standard “Pay-to-Public-Key-Hash” (P2PKH) address does not expose the public key used for the transaction.
The public key can be kept secret; only the hash of the public key is made public. Because the hash is much shorter than the public key itself and does not contain enough information to reverse-compute the public key (or the mathematical structure behind it, such as a factorable number or an ECDLP instance), it acts as a one-way function. No computer, however powerful, can break this specific usage: publish only the hash of the public key, and keep both the private key and the public key secret.
In other words, publishing only the hash of a public key, as in Bitcoin’s Pay-to-Public-Key-Hash (P2PKH) scheme, creates a one-way barrier. A hash (e.g., SHA-256) is shorter than the public key and mathematically irreversible, even by quantum computers, as it lacks the information and structure needed for Shor’s algorithm.
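The one-way step can be sketched in a few lines of Python. The key bytes below are a hypothetical stand-in, not a real key pair, and plain SHA-256 is used so the sketch runs anywhere; Bitcoin’s actual P2PKH scheme applies SHA-256 followed by RIPEMD-160.

```python
import hashlib

# Sketch of the one-way step in P2PKH (hypothetical key bytes, not a
# real key pair). Bitcoin actually hashes with SHA-256 then RIPEMD-160;
# plain SHA-256 is used here so the example runs on any Python build.
public_key = b"\x04" + b"\x11" * 64           # stand-in 65-byte uncompressed pubkey
digest = hashlib.sha256(public_key).digest()  # what (conceptually) goes on-chain

print(len(public_key), len(digest))           # 65 -> 32 bytes: information is discarded
# Recovering public_key from digest would require inverting SHA-256,
# which neither classical search nor Shor's algorithm can do.
```

Because the 32-byte digest cannot encode the full 65-byte key, inversion is not merely hard but information-theoretically underdetermined for this usage.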
One might ask: how can the public key be hidden in Bitcoin? Don’t miners need the public key to verify the validity of the signature the owner provides in a transaction?
The answer is that the public key is hidden only until a transaction takes place. The owner reveals the full public key, as part of the transaction data itself, only upon deciding to spend the bitcoin. For Bitcoin, this works brilliantly: the public key surfaces only at the moment its owner creates a signature to spend the bitcoin held at an address.
This scheme offers robust “at rest” protection. An unused key’s hash is immune to quantum attacks, as the public key itself is hidden. However, when the key is used, the public key is exposed, creating a race condition. A quantum computer could attempt to derive the private key before the transaction is confirmed (e.g., in minutes for Bitcoin). This vulnerability underscores the need for single-use keys and quantum-resistant algorithms.
The above is an extremely important difference between hashing and encryption, one widely misunderstood in the cryptocurrency field. For encryption, this one-way protection does not hold: it is fundamentally impossible to withhold the public key from the person who wants to send you a secret message, so encryption is at least theoretically subject to a decryption threat from a sufficiently powerful computer.
Read more: Bitcoin and the quantum threat.
Why Are People Worried About Quantum Computers Breaking Current Encryption Standards?
The fear that quantum computers could demolish modern encryption stems from Shor’s algorithm, a quantum method proposed by Peter Shor in 1994.
This is because Shor’s algorithm is fundamentally different from classical brute-force approaches.
Classical brute-force attacks cannot break modern encryption because the numbers involved are physically insurmountable.
- Private Keys (ECDSA): Your private key is a 256-bit number. A brute-force attack would need to guess from 2²⁵⁶ possibilities. This number is so astronomically large—comparable to finding one specific atom in the known universe—that the energy required to cycle through all options exceeds what humanity could ever produce. Its security is bound by the laws of physics, not just current technology.
- Hashing (SHA-256): Bitcoin’s hashing algorithm is a one-way street. It is easy to compute a hash from data, but computationally impossible for any classical computer to work backward and find the original data from the hash (see previous section “Bitcoin blockchain is not based on encryption”).
In short, Bitcoin’s security isn’t based on a clever secret, but on a search space so vast that it is physically impossible to explore with classical computers.
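A back-of-envelope calculation makes the scale concrete. All figures below are rough order-of-magnitude estimates, including the assumed attacker speed.

```python
# Back-of-envelope on the ECDSA-256 key space (rough order-of-magnitude
# figures, not precise physical constants).
key_space = 2 ** 256
atoms_in_universe = 10 ** 80        # commonly cited rough estimate
guesses_per_second = 10 ** 18       # a generous, exascale-class attacker
seconds_per_year = 3.15 * 10 ** 7

years = key_space / (guesses_per_second * seconds_per_year)
print(f"{key_space:.3e} possible keys")   # ~1.158e+77, near the atom count above
print(f"{years:.1e} years to enumerate")  # ~3.7e+51 years
```

Even granting the attacker a billion billion guesses per second, exhausting the key space takes more than 10^51 years, which is why the bound is physical, not technological.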
Unlike classical brute-force approaches, however, Shor’s algorithm exploits quantum parallelism and entanglement to transform integer factorization into a problem of finding a function’s period, solving it in polynomial time. For a number with n digits, classical algorithms take sub-exponential time, while Shor’s algorithm scales as ~n^2 log(n), an exponential speedup.
However, practical implementation is another matter. Current quantum computers have factored trivial two-digit numbers like 15 and 21 using Shor’s algorithm, feats achievable on a smartphone.
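The reduction that Shor’s algorithm accelerates, factoring via period finding, can be illustrated classically for tiny numbers like 15. The brute-force period search in the loop below is precisely the step a quantum computer would perform in polynomial time; everything else is cheap classical post-processing.

```python
from math import gcd

def factor_by_period(N: int, a: int):
    """Classically brute-force the period r of a^x mod N, then use it
    to split N -- the same reduction Shor's algorithm accelerates."""
    # Find the smallest r > 0 with a^r == 1 (mod N). This loop is the
    # step a quantum computer performs in polynomial time.
    r = 1
    while pow(a, r, N) != 1:
        r += 1
    if r % 2:                       # need an even period
        return None
    x = pow(a, r // 2, N)
    p = gcd(x - 1, N)
    if 1 < p < N:
        return p, N // p
    return None

print(factor_by_period(15, 7))      # period r = 4 -> factors (3, 5)
```

For N = 15 and base a = 7 the period is 4, yielding the factors 3 and 5; the classical loop becomes hopeless as N grows, which is exactly the gap Shor’s algorithm closes.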
The concern is future-oriented: a sufficiently powerful quantum computer could decrypt data collected today, a strategy dubbed “store now, decrypt later.” While today’s quantum hardware is embryonic, the proven efficiency of Shor’s algorithm fuels worry about long-term security.
If Shor’s Algorithm Is So Efficient, Why Haven’t Quantum Computers Factored Larger Numbers?
The disconnect between Shor’s theoretical elegance and practical success is stark. If factoring difficulty scales polynomially (e.g., on the order of ten thousand times more effort for a 200-digit number vs. a 2-digit one under ~n² log(n) scaling), and quantum computing power is growing exponentially, why are we stuck at numbers like 15 and 21?
The answer lies in the distinction between physical and logical qubits.
Physical qubits, which are fragile quantum states in hardware like superconducting circuits, are noisy, with error rates as high as 40%.
Logical qubits, the stable units needed for Shor’s algorithm, require bundling hundreds or thousands of physical qubits via Quantum Error Correction (QEC) to mitigate decoherence.
Given the necessity of multilayer error correction (see below), the number of physical qubits required for a practical quantum computer with enough logical qubits is likely to be far higher still, despite breakthroughs in error-correction techniques.
To factor a 200-digit number (approaching 256-bit encryption), estimates suggest ~2,300 logical qubits, requiring millions of physical qubits. Today’s best machines boast ~1,000 physical qubits, with logical qubit counts in the dozens.
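The arithmetic behind these estimates is simple. The overhead figure below is an illustrative assumption drawn from the text, not a rigorous surface-code calculation.

```python
# Rough resource arithmetic using the figures quoted above (all
# illustrative estimates, not a rigorous surface-code calculation).
logical_qubits_needed = 2_300       # ~ECDSA-256 estimate
physical_per_logical = 1_000        # plausible QEC overhead per logical qubit
available_physical = 1_000          # today's largest processors

required_physical = logical_qubits_needed * physical_per_logical
print(required_physical)                        # 2,300,000 physical qubits
print(required_physical // available_physical)  # ~2,300x beyond today's machines
```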
The “exponential growth” in qubit counts refers to physical qubits, not logical ones. Scaling to millions of high-quality physical qubits, all synchronized in a fault-tolerant system, is a physics and engineering challenge akin to building a starship from a go-kart.
How Many Logical Qubits Are Needed to Break ECDSA-256?
ECDSA (Elliptic Curve Digital Signature Algorithm), with its 256-bit keys, is theoretically easier to break than RSA-2048. Its potential vulnerability lies in the Elliptic Curve Discrete Logarithm Problem (ECDLP), but the quantum resources required are not dramatically fewer than those for RSA-2048.
Breaking ECDSA-256 demands ~2,300 logical qubits, compared to ~4,000 for RSA-2048. Depending on qubit quality and runtime goals (e.g., one day vs. one hour), this translates to millions of physical qubits for both ECDSA and RSA.
Both targets remain far beyond current capabilities, requiring millions of pristine physical qubits. Neither is at risk in the foreseeable future.
What Is the State of Logical Qubits in 2025?
As of mid-2025, the most advanced quantum computers have demonstrated ~48 logical qubits in experimental settings, achieved by a Harvard-led team with QuEra Computing using a 280-physical-qubit neutral-atom processor. Another milestone is Atom Computing and Microsoft’s entanglement of 24 logical qubits, creating a Greenberger-Horne-Zeilinger state and running the Bernstein-Vazirani algorithm. IBM’s Quantum Starling (200 Logical Qubits) is planned for 2029.
These are scientific triumphs, not practical tools. The 24-qubit experiment had a logical error rate of 10.2%, improved from 42% for physical qubits, but still too high for complex computations. These benchmarks test system control, not real-world utility. Scaling to thousands of logical qubits remains a distant goal, hindered by the need for millions of physical qubits and robust error correction.
How Is Error Rate Defined, and Why Does 10% Matter?
Error rates in quantum computing are defined per gate operation (e.g., single- or two-qubit gates).
For example, the Atom Computing/Microsoft experiment reported a 10% error rate for logical qubit operations, where multiple physical qubits are bundled to reduce errors via QEC. The breakthrough of this system lies in proving that scaling QEC can reduce errors, paving the way for fault-tolerant systems.
However, a 10% error rate per gate would be catastrophic: a program with thousands of gate operations would almost certainly fail, and most practical programs require far more than thousands of gates. Algorithms like Shor’s require billions of operations, making such a machine useless for practical computation.
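A quick calculation shows why: the probability that a circuit of n gates completes without a single error is (1 − p)ⁿ.

```python
# Why a 10% logical error rate is fatal: the probability that a circuit
# of n gates finishes with no error is (1 - p) ** n.
p = 0.10
for n in (10, 100, 1_000):
    print(n, (1 - p) ** n)
# At n = 1,000 the success probability is already below 10^-45 --
# effectively zero long before any useful algorithm completes.
```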
What Error Rate Is Needed for Practical Quantum Computing?
For an algorithm requiring 30–100 billion logical operations (e.g., breaking ECDSA-256), the error rate per logical operation must be ~10^-12 (one in a trillion) to ensure a high probability of success. Even a 10^-6 (one in a million) error rate would guarantee failure, since tens of thousands of errors would accumulate over the run.
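These figures follow from the approximation that success probability ≈ exp(−p·n) for n operations at per-operation error rate p, taking the 30-billion-operation figure above as the assumed workload.

```python
import math

# Success probability ~ exp(-p * n) for n operations at error rate p
# (valid for small p), using the 30-billion-operation figure above.
n_ops = 3 * 10 ** 10
for p in (1e-12, 1e-6):
    print(p, math.exp(-p * n_ops))
# p = 1e-12 -> ~0.97 chance of success
# p = 1e-6  -> exp(-30000), indistinguishable from zero
```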
Achieving 10^-12 is not possible through linearly improving hardware. It must leverage QEC’s exponential scaling and multilayer error correction schemes.
Is a One-in-a-Trillion Error Rate Realistic?
The leap from 10% to 10^-12 seems astronomical, but QEC’s scaling laws say it is theoretically feasible. Experiments like those from Harvard and Microsoft validate that larger QEC codes reduce logical error rates, confirming the theory. The challenge is two-fold:
- Improve Physical Qubits: Reduce physical error rates from ~1% to 0.01% through advances in materials and control.
- Scale Systems: Engineer machines with millions of physical qubits, synchronized via complex classical computing infrastructure.
Companies like Google, IBM, and Microsoft are investing billions, betting on incremental breakthroughs in qubit quality and system integration.
The promise of QEC is that, using concatenated quantum error correction, each layer of error correction squares the reliability (i.e., squares the error rate downwards).
However, to do that, it must also square the resources (the number of physical qubits). The method uses the redundancy of the previous layer’s logical qubits to reduce the error rate, but in doing so the number of physical qubits required grows exponentially, potentially canceling out any advantage gained from concatenation. For this not to become a merely mathematical game with no physical payoff, hope rests on the fault-tolerance threshold theorem, which promises a kind of “break-even point”: if the error rate of the individual physical components (qubits and gates) is below a certain critical value (the threshold), then Quantum Error Correction can make the logical error rate arbitrarily low.
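The squaring dynamic can be sketched with a toy model. The threshold, physical error rate, and per-level code size below are illustrative assumptions (e.g., the 7-qubit Steane code), not measurements of any real device.

```python
# Toy model of concatenated QEC (standard textbook scaling with
# illustrative numbers): below threshold, each concatenation level
# squares the normalized error rate but multiplies the qubit count.
p_threshold = 1e-2        # assumed fault-tolerance threshold
p_physical = 1e-3         # assumed physical error rate (below threshold)
code_size = 7             # e.g., the 7-qubit Steane code per level

p = p_physical
qubits_per_logical = 1
level = 0
while p > 1e-12:
    p = p_threshold * (p / p_threshold) ** 2  # error squares each level
    qubits_per_logical *= code_size           # resources multiply each level
    level += 1
    print(level, f"{p:.1e}", qubits_per_logical)
```

With these assumed numbers, four levels of concatenation reach below 10^-12 at a cost of 7⁴ = 2,401 physical qubits per logical qubit; above threshold, by contrast, the same recursion makes errors worse at every level.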
But all this remains at the mathematical level, without concrete demonstration in actual physics, much less engineering.
The Cryptopocalypse Is a Mirage
The quantum threat is at best distant. Current processors with ~1,000 physical qubits are useless for breaking encryption. The 48 logical qubits achieved in 2025 are a scientific marvel, but their 10% error rate is a non-starter for practical algorithms. Breaking ECDSA-256 or RSA-2048 requires 2,300–4,000 logical qubits, translating to millions of physical qubits, a scale orders of magnitude beyond today’s and tomorrow’s technology.
The quantum cryptopocalypse narrative is a distortion, born of speculative zeal. It downplays the significance of the error-correction challenge and conflates physical with logical qubits, experimental milestones with practical capability, and theoretical threats with imminent danger.