[Recommend my two-volume book for more reading]: BIT & COIN:  Merging Digitality and Physicality

The specter of quantum computing looms large in the public imagination, heralded as both a technological wonder and a cryptographic doomsday device. For over a decade, breathless headlines have promised that quantum computers will soon shatter the encryption that safeguards our digital world, from cryptocurrencies to national secrets.

This narrative is speculative, hyperbolic, and misleading. It conflates theoretical breakthroughs with practical realities. In the particular case of Bitcoin blockchain security, the narrative is simply wrong, even in theory, and much more so in reality.

This essay aims to dismantle the myth of an impending “cryptopocalypse.” Drawing on scientific insights and engineering realities, we will navigate the chasm between quantum hype and truth.

The focus of this essay is on the potential impact of quantum computing on the Bitcoin blockchain. It is complementary to a previous article: Bitcoin and the quantum threat.

For a more general discussion of quantum computing, see: The Overhyped Quantum Computing.

Bitcoin blockchain is not based on encryption

First of all, Bitcoin’s security model is based on hashing and digital signatures [Note], not encryption.

This is important because it means that, even if a future computer breaks encryption of strength equivalent to the ECDSA used in Bitcoin, it does not threaten the most essential part of the Bitcoin blockchain, which does not rely on encryption.

(Note: The digital signature used in Bitcoin verifies present possession of the key associated with an address. It is not a legal signature that verifies a person’s identity, which is a very different concept.)

In a fundamental sense, Bitcoin does not have encryption at all. Bitcoin transactions are transparent, not encrypted.

Although a user can always choose to encrypt their own data attached to a transaction, such custom encryption is not part of the Bitcoin protocol. It is compatible with the Bitcoin protocol, but not part of it.

Basic fact: Bitcoin’s standard “Pay-to-Public-Key-Hash” (P2PKH) address does not expose the public key used for a transaction.

With P2PKH, the public key is kept secret (as well as the private key, of course), but only a double hash of the public key is made public.

Applying the SHA-256 hash function to an Elliptic Curve Digital Signature Algorithm (ECDSA) public key yields an output that carries a smaller amount of information, measured in terms of entropy, than the key itself. This holds true for both a single hash and a subsequent double hash.

The core reason lies in the fundamental nature of cryptographic hashing and the comparative sizes of the public key and the hash output. An ECDSA public key is not merely a random string of bits; it is a specific point (x, y) on a given elliptic curve, derived from a private key. The size of this public key is determined by the curve’s parameters.

For a common and relevant example like the secp256k1 used in Bitcoin, the public key is significantly larger than the 256-bit output of SHA-256.
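To make the size comparison concrete, here is a minimal sketch with a dummy key. (The actual P2PKH address hash applies RIPEMD-160 after SHA-256, yielding 160 bits; double SHA-256 is used here to illustrate the same size-reduction principle.)

```python
import hashlib

# An uncompressed secp256k1 public key: a 0x04 prefix byte plus two
# 32-byte coordinates (x, y) -> 65 bytes = 520 bits.
# A dummy key is used here purely for illustration.
pubkey = b"\x04" + b"\x11" * 32 + b"\x22" * 32

# Bitcoin-style double hash of the public key (SHA-256 twice).
single = hashlib.sha256(pubkey).digest()
double = hashlib.sha256(single).digest()

print(len(pubkey) * 8)  # 520 bits in the public key
print(len(single) * 8)  # 256 bits after one SHA-256
print(len(double) * 8)  # 256 bits after the second hash (no further size reduction)
```

The second hash keeps the output at 256 bits; its purpose, as discussed below, is to remove structure rather than to shrink the space further.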

Because the hash is smaller than the public key itself, it does not contain enough information to reverse-compute the public key (or any equivalent representation of it, such as the underlying ECDLP instance). The result is a one-way function that no computer, however powerful, can invert.

The above is a basic mathematical conclusion based on the essence of information. To faithfully transform or decode the information from one space to another (say, from a public key to a corresponding private key), the amount of information must be equivalent, though not necessarily in the same form.

The double hash used in Bitcoin adds another layer of security. Although double hash does not further reduce the size of the information space, it removes any concern that the single hash might have some unknown structure that provides a somewhat shorter path to discover the corresponding point in the original information space, specifically the public key and subsequently the private key in the case of Bitcoin.

In other words, Bitcoin’s double hashing of a public key creates an impenetrable one-way barrier. It is mathematically irreversible, even by the imagined quantum computers, as it lacks the information and structure needed for Shor’s algorithm.

The distinction between hashing and encryption

For encryption, however, the above-described one-way protection does not hold because it is fundamentally impossible to withhold the public key from the person who wants to send you an encrypted message.

Therefore, unlike Bitcoin hashing and signatures, encryption is at least theoretically subject to a decryption threat from a powerful computer.

The distinction between hashing and encryption is a point widely misunderstood by people in the cryptocurrency field.

Can the public key actually be hidden in Bitcoin?

One might ask, how can the public key be hidden in Bitcoin? Don’t miners need the public key to verify the validity of a signature in a transaction provided by the owner?

The answer is that the public key is hidden until a spending takes place. The owner reveals the full public key only when he decides to spend the bitcoin, as part of the transaction data. For bitcoin transactions, this works brilliantly: the public key is exposed only when its owner uses it to spend the bitcoin in an address.
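The hide-until-spend scheme can be sketched as follows. This is a toy model, not the actual Bitcoin script machinery; `hash160_like` is a double-SHA-256 stand-in for the real address hash, and the key bytes are dummies.

```python
import hashlib

def hash160_like(pubkey: bytes) -> bytes:
    """Double-SHA-256 stand-in for Bitcoin's address hash (illustrative only)."""
    return hashlib.sha256(hashlib.sha256(pubkey).digest()).digest()

# At address creation: only the hash is published; the key stays private.
pubkey = b"\x04" + b"\xaa" * 64      # dummy public key
address_hash = hash160_like(pubkey)  # this is all the blockchain sees

# At spending time: the owner reveals the full key; verifiers recompute the hash.
def can_spend(revealed_pubkey: bytes, published_hash: bytes) -> bool:
    return hash160_like(revealed_pubkey) == published_hash

print(can_spend(pubkey, address_hash))                   # the genuine key matches
print(can_spend(b"\x04" + b"\xbb" * 64, address_hash))   # a different key fails
```

Until the owner reveals `pubkey`, an attacker holds only `address_hash`, which (per the argument above) does not contain enough information to recover the key.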

This scheme offers robust “at rest” protection. You know your bitcoin is safe when you are not spending it. A key’s hash is immune to quantum attacks, as long as the public key itself is hidden.

A potential “race condition”

However, when the key is being used, the public key is exposed, creating a potential race condition. An imagined quantum computer could attempt to derive the public key and then the private key before the transaction is confirmed (e.g., in minutes for Bitcoin).

But this concern is exaggerated.

First of all, note that the so-called “race condition” is not an unconditional, randomly pervasive condition, nor an event that someone outside of your control can initiate. It arises only in a very specific situation, triggered by the user’s own release of a public key when attempting to spend the UTXO. Furthermore, the condition does not persist: it lasts only briefly, until the Bitcoin blockchain system accepts and confirms the spend.

In other words, the potential “race condition” in Bitcoin only leaves a very brief window for a quantum computer to steal the corresponding coin. Because this particular window necessarily occurs with the true owner’s knowledge and intention, the owner has much control.

For example, if the amount of coin involved is large, the owner would be motivated to split the UTXO into multiple smaller UTXOs, each requiring a different public key to mitigate the threat (see below section “Single-use keys and the scale of Bitcoin”).

For smaller amounts, the owner may choose not to worry, knowing that no one would attempt to crack a UTXO of the sub-dollar value using a hypothetical quantum computer that could potentially cost many thousands of dollars.

The real Bitcoin design solves the “race condition” issue

In addition, Satoshi’s original design of Bitcoin, namely Bitcoin Satoshi Vision (BSV), has an effective and practical way to prevent double-spending even if the above “race condition” becomes a real threat.

Bitcoin Satoshi Vision (BSV), the genuine Bitcoin blockchain, applies a strict first-send rule as a defense against a double spend in the same-block race condition.

In the upcoming Teranode framework, the prevention of double-spends during block construction relies on the same foundational principles as Satoshi’s original design, but executed with a radically different architecture built for hyperscale. The “first-seen” rule is abstracted away from a simple mempool check and implemented within a distributed, parallelized validation service that queries a high-performance UTXO database. This allows Teranode to reject double-spend attempts deterministically, at massive speed and volume.

This core protocol function is further augmented by the symbiotic ARC transaction processor, which provides an essential layer of real-time risk assessment and fraud alerts for users. Ultimately, should any race condition persist, the immutable laws of Proof-of-Work and the longest chain rule provide the final, objective resolution.

Teranode, therefore, is not merely an effective defense against any double-spend attacks by an imagined or real quantum computer; it represents the fulfillment of Satoshi’s vision for an enterprise-grade and global-scale information system.

Just as a chasm exists between a mathematical structure and a physical machine, a practical separation also exists between the mathematically possible attack and real-life consequences.

Single-use keys and the scale of Bitcoin

However, what if the public key is accidentally or maliciously exposed?

The danger of this scenario is not a simple “race condition” but is more serious, because the window for hacking could be beyond the control of the owner and may exist for an indefinite period of time without the owner’s knowledge. In this scenario, a perpetrator has a much greater opportunity to use that imagined quantum computer to discover the private key and thus spend the associated UTXO.

This potential vulnerability underscores the need for single-use keys, another important aspect of Bitcoin advocated by Satoshi but misunderstood or ignored by most others, especially the BTC community.

Furthermore, the most effective and powerful practice is not only to use a key only once for a single transaction, but also to intentionally break a large transaction into many transactions, each involving very small amounts and utilizing a unique key.

To do this, however, the Bitcoin blockchain must be scalable, extremely efficient, and extremely low-cost.
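As an illustration of this practice, the hypothetical sketch below splits one large amount into many small outputs, each locked to the hash of a fresh, never-before-seen key. Random byte strings stand in for real public keys, and the helper names are invented for illustration.

```python
import hashlib
import secrets

def split_utxo(total_sats: int, chunk_sats: int):
    """Illustrative sketch: split one large amount into many small outputs,
    each locked to the hash of a fresh, single-use key."""
    outputs = []
    remaining = total_sats
    while remaining > 0:
        amount = min(chunk_sats, remaining)
        fresh_key = secrets.token_bytes(33)  # stand-in for a brand-new public key
        lock = hashlib.sha256(hashlib.sha256(fresh_key).digest()).digest()
        outputs.append((amount, lock.hex()))
        remaining -= amount
    return outputs

outs = split_utxo(1_000_000, 100_000)
print(len(outs))                            # 10 outputs
print(sum(a for a, _ in outs))              # 1000000: total value preserved
```

Each output exposes its key only at its own spending moment, so a quantum attacker would face many tiny, independently timed windows instead of one large one; this is practical only on a chain where the extra transactions cost almost nothing.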

Only BSV, with its unbounded scalability and transaction fees below 1/1000th of a cent, is up to this challenge.

BTC, a sabotaged version of Bitcoin, is totally inadequate.

Concerning Bitcoin’s imagined quantum vulnerability, please read more: Bitcoin and the quantum threat.

Below, we further discuss a more fundamental and technical topic related to encryption and the quantum cryptopocalypse.

Why Are People Worried About Quantum Computers Breaking the Current Encryption Standards?

As said above, Bitcoin is fundamentally immune to the potential threat of quantum computing due to its use of one-way functions in hashing and double hashing the public keys.

However, the concern of the quantum threat to general encryption is a separate but related topic and deserves more attention.

The fear that quantum computers could demolish modern encryption stems from Shor’s algorithm, a quantum method proposed by Peter Shor in 1994.

The reason is that Shor’s algorithm is fundamentally different from classical brute-force approaches.

Classical brute-force attacks cannot break modern encryption because the numbers involved are physically insurmountable.

  1. Private Keys (ECDSA): Your private key is a 256-bit number. A brute-force attack would need to guess from 2²⁵⁶ possibilities. This number is so astronomically large (comparable to finding one specific atom in the known universe) that the energy required to cycle through all options exceeds what humanity could ever produce. Its security is bound by the laws of physics, not just current technology.
  2. Hashing (SHA-256): Bitcoin’s hashing algorithm is a one-way street. It is easy to compute a hash from data, but computationally impossible for any classical computer to work backward and find the original data from the hash (see previous section “Bitcoin blockchain is not based on encryption”).
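The arithmetic behind the “one atom in the universe” comparison is easy to verify; the ~10^80 atom count is the usual order-of-magnitude estimate.

```python
# The 256-bit key space versus the estimated number of atoms
# in the observable universe (~10^80, an order-of-magnitude figure).
keyspace = 2 ** 256
atoms = 10 ** 80

print(len(str(keyspace)))   # the key space has 78 decimal digits
print(keyspace > 10 ** 77)  # True: more than 10^77 possibilities
print(atoms // keyspace)    # atoms outnumber keys by only ~863x
```

A brute-force search is thus comparable to checking a meaningful fraction of all atoms in existence, one by one.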

In short, Bitcoin’s security isn’t based on a clever secret that could be uncovered by computing, but on a search space so vast that it is physically impossible to explore with classical computers.

Unlike classical brute-force approaches, however, Shor’s algorithm exploits quantum parallelism and entanglement to transform integer factorization into a problem of finding a function’s period, solving it in polynomial time. For a number with n digits, classical algorithms take sub-exponential time, while Shor’s algorithm scales as ~n^2 log(n), an exponential speedup.
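The gap between the two scalings can be illustrated with a toy cost model. The constants are rough (GNFS-style sub-exponential cost for the classical side, ~n² log n for Shor’s); this is an order-of-magnitude sketch, not a precise complexity analysis.

```python
import math

def classical_gnfs_ops(bits: int) -> float:
    """Rough GNFS-style cost: exp(c * (ln N)^(1/3) * (ln ln N)^(2/3)), c ~ 1.9."""
    ln_n = bits * math.log(2)  # ln N for an n-bit number N
    return math.exp(1.9 * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3))

def shor_ops(bits: int) -> float:
    """Polynomial cost model for Shor's algorithm: ~n^2 * log n."""
    return bits ** 2 * math.log2(bits)

# Classical cost explodes with key size; Shor's cost grows only polynomially.
for bits in (64, 512, 2048):
    print(bits, f"{classical_gnfs_ops(bits):.2e}", f"{shor_ops(bits):.2e}")
```

At 2048 bits the classical model exceeds 10^30 operations while the quantum model stays under 10^9, which is exactly why Shor’s result alarmed cryptographers.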

However, the math is one thing, and practical implementation is another matter. Current quantum computers have factored trivial two-digit numbers like 15 and 21 using Shor’s algorithm, feats achievable on a smartphone.

The concern is future-oriented: a sufficiently powerful quantum computer could decrypt data collected today, a strategy dubbed “store now, decrypt later.” While today’s quantum hardware is embryonic, the proven efficiency of Shor’s algorithm fuels worry about long-term security.

If Shor’s Algorithm Is So Efficient, Why Haven’t Quantum Computers Factored Larger Numbers?

The disconnect between Shor’s theoretical elegance and practical success is stark. If factoring difficulty scales polynomially (theoretically, factoring a 200-digit number takes only on the order of ten thousand times the effort of factoring a 2-digit one, per the ~n² factor), and quantum computing power is growing exponentially, why are we stuck at numbers like 15 and 21?

The answer lies in the distinction between physical qubits and logical qubits.

Physical qubits, which are fragile quantum states in hardware like superconducting circuits, are noisy, with error rates as high as 40%.

Logical qubits, the stable units needed for Shor’s algorithm, require bundling hundreds or thousands of physical qubits via Quantum Error Correction (QEC) to mitigate decoherence.

Given the necessity of multilayer correction (see below), the number of physical qubits required for a practical quantum computer that has enough logical qubits is likely to be much higher in the future, despite the breakthroughs of error correction techniques.

To factor a 200-digit number (approaching 256-bit encryption), estimates suggest ~2,300 logical qubits, requiring millions of physical qubits. Today’s best machines boast ~1,000 physical qubits, with logical qubit counts in the dozens.

The impressive qubit counts refer to physical qubits, not logical ones. Scaling to millions of high-quality physical qubits, all synchronized in a fault-tolerant system, is a physics and engineering challenge that we cannot yet even map out, let alone meet.

How Many Logical Qubits Are Needed to Break ECDSA-256?

ECDSA (Elliptic Curve Digital Signature Algorithm), as used in Bitcoin, has 256-bit keys. ECDSA’s potential vulnerability lies in the Elliptic Curve Discrete Logarithm Problem (ECDLP), which requires quantum resources not dramatically less than those needed to breach RSA-2048.

Breaking ECDSA-256 is estimated to demand ~2,300 logical qubits with an extremely low error rate, compared to ~4,000 for RSA-2048. Depending on qubit quality and runtime goals (e.g., one day vs. one hour), this translates to millions or even billions of physical qubits.

The target remains far beyond current capabilities. Modern encryption is not at risk in the foreseeable future.

What Is the State of Logical Qubits in 2025?

As of mid-2025, the most advanced quantum computers have demonstrated ~48 logical qubits in experimental settings, achieved by a Harvard-led team with QuEra Computing using a 280-physical-qubit neutral-atom processor. Another milestone is Atom Computing and Microsoft’s entanglement of 24 logical qubits, creating a Greenberger-Horne-Zeilinger state and running the Bernstein-Vazirani algorithm. IBM’s Quantum Starling (200 logical qubits) is planned for 2029.

These are scientific feats, not practical tools. The 24-qubit experiment had a logical error rate of 10%, improved from 42% for physical qubits, but still many orders of magnitude too high for complex computations. These benchmarks do not represent real-world utility.

Scaling to thousands of logical qubits that have truly useful reliability (low error rate) remains a distant goal, hindered by the need for millions of physical qubits and robust error correction.

What about Error Rate?

Error rates in quantum computing are defined per gate operation (e.g., single- or two-qubit gates).

For example, the Atom Computing/Microsoft experiment reported a 10% error rate for logical qubit operations, where multiple physical qubits are bundled to reduce errors via QEC. The breakthrough of this system lies in its claim to show that scaling QEC can reduce errors, paving the way for fault-tolerant systems.

However, a 10% error rate per gate would be catastrophic: a program with just ten gate operations would already fail about two-thirds of the time (0.9^10 ≈ 0.35 probability of success). It is useless for practical computation, as most practical programs require far more than ten, or even thousands of, gate operations. Algorithms like Shor’s require on the order of 100 billion operations to break modern encryption.

What Error Rate Is Needed for Practical Quantum Computing?

For an algorithm requiring 100 billion logical operations (e.g., breaking ECDSA-256), the error rate per logical operation must be ~10^-12 (one in a trillion) to ensure a high probability of success. Even a 10^-6 (one in a million) error rate would guarantee failure, as errors would accumulate thousands of times over.
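The arithmetic behind these requirements follows from a simple model that assumes independent per-gate errors, so the probability that all N operations succeed is (1 − p)^N:

```python
import math

def success_probability(error_rate: float, n_ops: float) -> float:
    """P(all ops succeed) = (1 - p)^N, assuming independent per-gate errors."""
    return math.exp(n_ops * math.log1p(-error_rate))

N = 1e11  # ~100 billion logical operations (a Shor-scale workload)

print(f"{success_probability(0.10, 10):.2f}")   # 10% error, 10 gates: ~0.35
print(f"{success_probability(1e-6, N):.2e}")    # one-in-a-million: effectively 0
print(f"{success_probability(1e-12, N):.2f}")   # one-in-a-trillion: ~0.90
```

Even at one error per million operations the success probability underflows to zero; only at roughly one error per trillion does a 100-billion-operation run succeed with ~90% probability.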

Achieving 10^-12 is not possible through linear improvements in hardware alone. It must leverage QEC’s exponential scaling and multilayer error correction schemes.

Is a One-in-a-Trillion Error Rate Realistic?

The leap from 10% to 10^-12, or even 10^-9 for that matter, is astronomical. But QEC’s scaling hypothesis says it is theoretically feasible. Experiments like those from Harvard and Microsoft validate that larger QEC codes reduce logical error rates.

The challenge is two-fold:

  1. Improve Physical Qubits: Reduce physical error rates from ~1% to 0.01% through advances in materials and control.
  2. Scale Systems: Engineer machines with millions of physical qubits, synchronized via complex classical computing infrastructure.

Companies like Google, IBM, and Microsoft are investing billions, betting on incremental breakthroughs in qubit quality and system integration.

The promise of QEC is that, using concatenated quantum error correction, each layer of error correction squares the reliability (i.e., squares the error rate downwards).

However, in order to do that, it must also square the resources (the number of physical qubits) upwards even in ideal conditions. And that is a problem.

The method essentially uses the redundancy of the previous layer’s logical qubits to reduce the error rate. But in doing so, the number of physical qubits required also increases exponentially, canceling out any advantage gained from the concatenated quantum error correction.

To keep this from becoming a merely mathematical game that creates no physical advantage, hope has been placed in the fault-tolerance threshold theorem, which promises a kind of “break-even point”: if the error rate of individual physical components (qubits and gates) is below a certain critical value (the threshold), Quantum Error Correction (QEC) can make the error rate of the logical qubits arbitrarily low.
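The threshold theorem’s promise can be put in numbers with a toy model. The figures are assumed for illustration (a 1% threshold, a 0.1% physical error rate, a 7-qubit Steane-like code), and p_L = p_th·(p/p_th)^(2^L) is the standard concatenation estimate.

```python
def concatenated_qec(p_phys: float, p_threshold: float, code_size: int, layers: int):
    """Toy model of concatenated QEC in threshold-theorem form:
    logical error ~ p_th * (p/p_th)^(2^L); qubit overhead ~ code_size^L."""
    logical_error = p_threshold * (p_phys / p_threshold) ** (2 ** layers)
    physical_qubits = code_size ** layers
    return logical_error, physical_qubits

# Each added layer squares the (normalized) error rate downward,
# while the physical-qubit count multiplies by the code size.
for layers in range(1, 5):
    err, qubits = concatenated_qec(1e-3, 1e-2, 7, layers)
    print(layers, f"{err:.1e}", qubits)
```

In this idealized model, four layers push the logical error to ~10^-18 at a cost of 2,401 physical qubits per logical qubit; the model, however, bakes in the independence assumptions questioned below.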

Whether or not the so-called threshold can be achieved remains unknown. But QEC has an even more fundamental problem: the multilayered concatenated quantum error correction scheme described above rests on a huge assumption that is almost certainly false, namely that qubit failures are merely random, independent, linear phenomena.

The failure or error of a quantum qubit results from losing the coherence of quantum entanglement. This process is decidedly nonlinear in time: when the time span doubles, the chance of a qubit losing coherence does not merely double, but can be orders of magnitude higher. And the speed of such decay is determined by the overall condition of the system. As a result, the assumption that errors in two logical qubits occur completely independently of each other may not hold. They may be correlated, either through the shared environment and system condition, or even through some unknown physics.

It is one thing to have a mathematical theory to show scalability; it is entirely another in terms of physics and engineering.

In other words, all the above is still at a mathematical level with no concrete proof in actual physics, much less engineering. See more: The Overhyped Quantum Computing.

As further discussed in The Overhyped Quantum Computing, people often draw an analogy between quantum qubits and semiconductors, ignoring that the physics and engineering of quantum qubits are not remotely analogous to that of semiconductors. The fact that semiconductors are superbly scalable was obvious from the very beginning. The scalability of semiconductors is not a surprise discovery later. Even before the first semiconductor integrated circuit was made in 1958, physicists and engineers were especially excited about the apparent and unique scalability of semiconductors.

In contrast, from the very beginning, the excitement of quantum qubits was their unique computing characteristics, not the potential scalability. On the contrary, the extreme conditions required by the physics of qubits have always screamed “unscalable” in your face from the beginning, and little progress has been made in this respect.

The mathematical scalability of quantum error correction is important and necessary, but it is not the same as physical and engineering scalability. So far, despite the encouraging QEC theory in math, everything in physics and engineering stacks against the scalability of the quantum computer.

The Cryptopocalypse Is a Myth

As far as transactions on a scalable Bitcoin blockchain are concerned, the quantum threat is fundamentally nonexistent. This is an advantage for BSV, the genuine Bitcoin blockchain according to Satoshi’s vision. To BSV, the threat does not exist even if the imagined ECDSA-breaking quantum computer should become a reality in the future.

The threat to ECDSA encryption in general is at least theoretically possible but practically distant at best. Breaking ECDSA-256 or RSA-2048 requires 2,300–4,000 logical qubits, translating to millions of physical qubits, a scale orders of magnitude beyond today’s and tomorrow’s technology. More importantly, unlike semiconductors, whose scalability was obvious from the very beginning, the scalability of quantum computers has not been proven beyond mere mathematics. Quantum computing faces huge obstacles in engineering and possibly dead ends in physics. See more: The Overhyped Quantum Computing.

The quantum cryptopocalypse narrative is a distortion, born from people’s zeal for speculations, downplaying the significance of the error correction challenge, and conflating physical qubits with logical qubits, experimental milestones with practical capability, and theoretical threats with imminent danger.

Those who are building on blockchains, especially on the genuine Bitcoin blockchain BSV, may focus on building without distraction.

Read more: The Overhyped Quantum Computing.
