In our series on the convergence of blockchain technology and AI, we have explored the potential of these two revolutionary forces while essentially taking for granted that no black swan event could disrupt their continued expansion.
In our article on AI-driven risk assessment, we saw how AI can enhance cybersecurity through anomaly detection and predictive analysis in this new arena. But what would happen if a separate revolutionary technology were to come along and create a new technological paradigm shift within DeFi? What if that new technology could completely undermine the very foundations upon which DeFi sits by overcoming the cryptography that secures the entire industry?
That is precisely the future that many experts are concerned about with the advancement of quantum computing. Those concerns have been around since the earliest days of crypto but were further heightened with Google’s 2024 Q4 announcement of the creation of their newest quantum computer chip, Willow.
So, in this entry, we will explore quantum computing, its workings, its potential threat to DeFi, and whether the industry should be concerned about it moving forward.
When exploring the threat of quantum computing, it is important to first understand it as a radically different computing paradigm from the classical computing we are familiar with today. To begin comprehending it, we must drill down to the smallest unit of digital information: the bit. Bits are the granular building blocks of all modern computing and have historically been represented by either a zero or a one.
This foundational unit has allowed the structural development of all modern-day computing technology. The strength of the binary system enables a concrete foundation upon which larger, more complex systems can be built.
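The binary foundation is easy to see in practice. As a quick illustration in Python, here is how a single familiar character reduces to eight zero-or-one bits:

```python
# Every classical value ultimately reduces to bits. For example, the
# character "A" is stored as the eight 0/1 bits of its ASCII byte (65).
bits = format(ord("A"), "08b")
print(bits)  # -> 01000001
```

Everything larger, from documents to databases, is built by stacking more of these definite zeros and ones.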
Quantum computing challenges the very essence of this binary system by creating an alternative to this computational unit: the qubit. Unlike a bit, a qubit is not restricted to a single definite value; it can hold the states of one and zero at the same time.
Superposition is one of the cornerstones of quantum computing, but it is so abstract that it may be difficult for some people to process. The state of a classical bit is always known with one hundred percent certainty: it is either a one OR a zero. A qubit, by contrast, exists as both one and zero simultaneously until it is measured. Imagine that something can be both yes and no at the same time; it doesn't make much sense under traditional thinking and classical models.
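One way to make superposition concrete is the standard amplitude-vector picture. The sketch below is a toy classical model of that picture, not a simulation of real quantum hardware: a qubit is described by two amplitudes whose squared magnitudes give the measurement probabilities.

```python
import math

# A classical bit is definitely 0 or definitely 1.
classical_bit = 1

# A qubit (in the textbook amplitude model) holds two amplitudes (a, b)
# with |a|^2 + |b|^2 = 1. On measurement it yields 0 with probability
# |a|^2 and 1 with probability |b|^2. An equal superposition sets
# a = b = 1/sqrt(2), so each outcome is equally likely.
a = b = 1 / math.sqrt(2)

prob_zero = abs(a) ** 2  # probability of measuring 0
prob_one = abs(b) ** 2   # probability of measuring 1

assert math.isclose(prob_zero + prob_one, 1.0)
print(f"P(0) = {prob_zero:.2f}, P(1) = {prob_one:.2f}")
```

Until the measurement happens, neither outcome is "the" value of the qubit; both coexist in the amplitudes.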
The easiest way to explain this phenomenon is through twentieth-century physicist Erwin Schrödinger and his famous thought experiment on superposition in quantum mechanics. You may be more familiar with it as Schrödinger's Cat: imagine a cat placed inside a sealed box with a Geiger counter and a piece of radioactive substance; if the substance decays, the counter triggers the release of a poison. Because the decay is unpredictable, the cat is, in a sense, both alive and dead, since we cannot know for certain which is accurate until we open the box and look. Likewise, a qubit is both one and zero until it is measured.
If you are still reading, congratulations: the concept of quantum computing hasn't melted your brain just yet. Now that we understand what qubits are and how they are represented, we need to examine how qubits relate to one another. This is known as entanglement, the second cornerstone of quantum computing.
We have seen how quantum computing mirrors the Schrödinger's Cat thought experiment. Entanglement takes the analogy one step further. Rather than a single qubit holding two states simultaneously, entangled qubits are linked so that the state of each one influences, and is influenced by, the others. Imagine trying to navigate a maze. Under classical computing, if your first route ends in a dead end, it is recorded as a failure; a second attempt also fails, and the process continues in sequence until the correct route is found. In the same scenario, a quantum computer could explore all routes simultaneously, with every failed and successful path informing the evaluation of every other.
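The linkage between entangled qubits can also be sketched in the toy amplitude model. Below is the textbook two-qubit Bell state: only the outcomes 00 and 11 carry any probability, so measuring one qubit instantly tells you the other. This is a classical simulation of the statistics, not real entanglement.

```python
import math
import random

# Toy model of the Bell state (|00> + |11>) / sqrt(2): the four basis
# outcomes 00, 01, 10, 11 have these amplitudes.
amps = {"00": 1 / math.sqrt(2), "01": 0.0, "10": 0.0, "11": 1 / math.sqrt(2)}

def measure() -> str:
    # Sample an outcome with probability |amplitude|^2.
    outcomes, weights = zip(*((s, abs(a) ** 2) for s, a in amps.items()))
    return random.choices(outcomes, weights)[0]

# The two qubits always agree: 01 and 10 never occur, so learning one
# qubit's value fixes the other's.
results = [measure() for _ in range(1000)]
assert all(r in ("00", "11") for r in results)
```

In a real quantum computer this correlation holds even when the qubits are physically separated, which is what makes entanglement so counterintuitive.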
As difficult as it is to grapple with the theory of quantum computing, it is important to do so if we are to consider just how radical the improvement in computing power will be. The superposition and entanglement of qubits enable quantum computers to process problems orders of magnitude larger and faster than anything we can even comprehend today.
Our article on the limits to a technological revolution demonstrated the differing sizes of modern-day digital information, from how many KB it takes to hold an average five-page word-processor document to how many MB a three-minute MP3 audio file requires. Understanding the scale of digital data is key to understanding why rebuilding the fundamental unit from bit to qubit can have such a compounding effect.
With that frame of reference, we can begin to put into context just how powerful quantum computing can be. Google claims its new quantum chip, Willow, takes five minutes to process a benchmark problem that the fastest classical computers on the market today would take ten septillion YEARS to solve. Five minutes versus 10,000,000,000,000,000,000,000,000 years.
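The scale of that claim is hard to intuit, so here is the back-of-the-envelope arithmetic behind it, taking Google's figures (five minutes versus roughly ten septillion, i.e. 1e25, years) at face value:

```python
# Convert the claimed classical runtime into minutes and compare it
# with Willow's claimed five-minute runtime.
MINUTES_PER_YEAR = 365.25 * 24 * 60  # ~525,960 minutes in a year

classical_minutes = 1e25 * MINUTES_PER_YEAR  # ten septillion years, in minutes
quantum_minutes = 5

speedup = classical_minutes / quantum_minutes
print(f"Claimed speedup factor: roughly {speedup:.1e}x")
```

That works out to a speedup on the order of 10^30, a number with no meaningful analogue in everyday experience.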
You may see where quantum computing begins to pose a systemic threat to the DeFi industry. That scale of processing power is exactly the kind of technology that could usurp the cryptographic security underpinning the entire blockchain ecosystem: imagine attacks that reveal private key information and siphon user accounts almost instantly.
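To put the cryptographic stakes in perspective, here is a hedged sketch of the numbers involved. The Grover/Shor framing below is general cryptographic knowledge rather than a claim about any specific chain: Grover's quantum search only square-roots the work of a brute-force key search, while Shor's algorithm is the sharper threat because it can derive elliptic-curve private keys from public keys directly.

```python
# A 256-bit private key has 2**256 possible values. Classically, brute
# force means searching that entire space. Grover's algorithm reduces
# the work to roughly its square root (~2**128 steps), which is still
# infeasible; Shor's algorithm, by contrast, attacks the underlying
# elliptic-curve math itself rather than searching the keyspace.
keyspace = 2 ** 256
grover_exponent = 256 // 2  # effective security halves to 128 bits

print(f"Classical keyspace: 2^256 ≈ {float(keyspace):.2e}")
print(f"Grover-reduced work: ~2^{grover_exponent} steps")
```

In other words, the realistic quantum threat to blockchain keys comes less from raw search speed and more from algorithms that exploit the structure of today's public-key cryptography.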
In our earlier entries, we alluded to how integrating AI and blockchain interoperability could create openings for malicious actors. Combine those concerns with the potential of quantum computing, and it is not difficult to imagine a future where the entire DeFi space is unwound in a matter of minutes.
If you are worried about the threat that quantum computing poses to DeFi, there are a number of reasons to remain optimistic. First and foremost, as you may have gathered by now, quantum computing is extremely difficult, complex, and expensive. Very few quantum computers exist today, and most are developed in tightly regulated and controlled environments at IBM, Google, Amazon, and Alibaba. None of these companies has a vested interest in undermining DeFi's cryptographic security; doing so would undermine the same cryptography that secures legacy banks and national defense infrastructure, including nuclear reactors and weapons.
Secondly, it is easy to overestimate the importance of our own industry. With roughly $125 billion in TVL, DeFi is still a very young and small industry by comparison: money center banks have a combined market cap of over $909,647 billion, and integrated oil and gas over $109 billion. It may sound nihilistic, but if quantum computing accelerates and becomes a threat, the DeFi space will be one of the least attractive targets, because everything else in the world will be just as, if not more, susceptible to attack. Even if DeFi were targeted, we would already be dealing with the societal fallout of a quantum computational attack on other, more pressing areas, like disrupted global supply routes, AGI, or the deployment of nuclear weapons. DeFi and safeguarding private keys would be the least of our concerns.
Additionally, as technology progresses and new threats arrive, so do new solutions. Post-quantum cryptography is high on the agenda for those engaged in this space, and developers are acutely aware of the threats quantum computing could pose if it arrives before safeguards are in place. The National Institute of Standards and Technology (NIST) is leading the charge in establishing global standards for post-quantum cryptography.
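To make "post-quantum cryptography" less abstract, here is a minimal sketch of a Lamport one-time signature, a classic hash-based scheme whose security rests only on the hash function rather than on the number-theoretic problems Shor's algorithm breaks. This is a teaching toy, not NIST's work: the standards NIST has actually finalized (ML-KEM, ML-DSA, and SLH-DSA) are far more sophisticated, and the function names below are purely illustrative.

```python
import hashlib
import secrets

def keygen():
    # 256 pairs of random 32-byte secrets: one pair per message-hash bit.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    # Public key: the hash of every secret.
    pk = [(hashlib.sha256(x).digest(), hashlib.sha256(y).digest()) for x, y in sk]
    return sk, pk

def sign(sk, message: bytes):
    digest = hashlib.sha256(message).digest()
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]
    # Reveal one secret from each pair, chosen by the message bit.
    return [sk[i][bit] for i, bit in enumerate(bits)]

def verify(pk, message: bytes, sig) -> bool:
    digest = hashlib.sha256(message).digest()
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]
    # Each revealed secret must hash to the matching public-key entry.
    return all(hashlib.sha256(s).digest() == pk[i][bit]
               for i, (s, bit) in enumerate(zip(sig, bits)))

sk, pk = keygen()
sig = sign(sk, b"hello DeFi")
assert verify(pk, b"hello DeFi", sig)
assert not verify(pk, b"tampered", sig)
```

A key design caveat: each Lamport key pair can safely sign only one message, which is part of why production hash-based schemes layer additional structure on top of this idea.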
We should also recognize the emergence of quantum computing as a force for good. The benefits of having a computational force as powerful as the one enabled by the quantum models will lead to unforeseen discoveries in everything from space travel to global IoT and AI integration. Through our series, we have already seen the accelerated pace of innovation in smart vehicles, medical breakthroughs, and drug discovery that AI has enabled. Quantum computing would further enhance and improve the pace of discovery at a rate unfathomable to even the most optimistic of futurists.
Finally, we should acknowledge that quantum computing is still in its infancy. Even with the development of Willow, the practical applications of quantum computers in the real world remain limited. Making every industry robust against the threats of quantum computing, or merely ready to embrace its benefits, will require reworking much of today's digital infrastructure, a herculean task that society must overcome. Whatever the future of quantum computing holds, it is sure to be an interesting and disruptive path, and how DeFi, AI, and post-quantum cryptography respond will define one of the most important periods of the twenty-first century.