To Bit or Not to Bit? Qubit is the Answer

Today, I'd like to delve into a topic that feels a bit like venturing into uncharted territory for me: quantum computing. 

While I'm far from an expert, my curiosity has always been piqued by this fascinating technology. So, despite my limitations, I've tried to grasp its essence and potential, and I'll do my best to share it with you today. 

And while diving into this very technical and extremely interesting field has been illuminating in many ways, it's not just intellectual curiosity driving me. It's the conviction that quantum computing is not a distant fantasy but a looming reality, likely arriving sooner than many anticipate.

So. What is Quantum Computing and why can it be so hard to grasp?

The inherent complexity of quantum computing stems largely from the counterintuitive nature of quantum mechanics itself. Our brains, honed by evolution to navigate the macroscopic world, didn't require an understanding of the universe at its most fundamental level to ensure our survival. Classical mechanics, which elegantly describes the trajectory of an arrow, the workings of a car engine, and the necessity of an antenna for radio and television reception, sufficed for our everyday needs.

However, if we truly aspire to comprehend the universe around us, from the most distant galaxies to the humble light bulb illuminating our tables, we must confront the quantum realm. 

This isn't merely an academic pursuit. Unlocking the secrets of quantum physics presents us with unprecedented opportunities and capabilities.

As the brilliant American theoretical physicist Richard Feynman famously quipped: “Nature isn’t classical, dammit. And if you want to make a simulation of nature, you’d better make it quantum mechanical.”

Let's begin with the fundamentals. 

Our senses are attuned to the objects and phenomena governed by classical mechanics. These physical laws are based on the existence and interaction of elementary particles. This framework serves us well until we venture beyond a certain boundary: the atomic scale. 

As we approach the realm of atoms, the laws of classical mechanics begin to break down. This is where quantum mechanics steps in, extending our understanding into the atomic and subatomic world by introducing the concept of wave-particle duality. In essence, matter at this level exhibits the properties of both particles and waves, simultaneously.

This concept can be challenging to visualize, but an analogy might help. Imagine the ripples that spread across a pond when a stone is thrown into the water. Similarly, when a particle moves at the atomic level, it behaves like these waves. Crucially, particles at this scale are never truly still; they are always in motion, vibrating constantly.

Therefore, to grasp the principles of quantum physics, a basic understanding of wave behavior is essential. A key concept associated with waves is interference, the phenomenon that occurs when two or more waves interact. 

To understand this, let's consider some fundamental wave characteristics.

Two waves can differ in their phase and/or amplitude. The phase represents the horizontal shift of the wave, while the amplitude describes its maximum height. Additionally, frequency indicates how many complete oscillation cycles the wave goes through in one second.

A wave can be graphically represented as a sinusoidal curve on a Cartesian plane.

Same phase, same frequency, but different amplitude

Same amplitude, same frequency, but different phase

Same amplitude, same phase, but different frequency

As these representations illustrate, a wave can have both positive and negative values along the y-axis.
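
If you like to see the math behind the pictures, here is a tiny sketch (in Python, purely for illustration; the function name and the sample values are my own) of how such curves are generated, with amplitude, frequency, and phase as the only knobs:

```python
# Illustrative sketch: a wave as a sinusoid y(t) = A * sin(2*pi*f*t + phi)
import numpy as np

def wave(t, amplitude=1.0, frequency=1.0, phase=0.0):
    """Value of a sine wave at time t (in seconds)."""
    return amplitude * np.sin(2 * np.pi * frequency * t + phase)

t = np.linspace(0, 2, 500)           # two seconds, 500 sample points
base    = wave(t)                    # reference wave
taller  = wave(t, amplitude=2.0)     # same phase/frequency, larger amplitude
shifted = wave(t, phase=np.pi / 2)   # same amplitude/frequency, shifted phase
faster  = wave(t, frequency=2.0)     # same amplitude/phase, higher frequency
# Plotting these four arrays would reproduce the kinds of curves shown above.
```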

Wave interference is the result of the superposition of two or more waves. 

The resulting wave's amplitude at any point is determined by the sum of the amplitudes of the individual waves at that point. Think back to our pond analogy: if you throw two stones close together, the resulting water surface will exhibit a complex pattern of crests and troughs where the waves from each stone interact.

Specifically, at a given point, the amplitudes of the two waves can:

  • Add up if both amplitudes are positive or both are negative, resulting in a wave with a larger amplitude.

  • Subtract if one amplitude is positive and the other is negative, resulting in a wave with a smaller amplitude.

  • Cancel out completely if the amplitudes are equal in magnitude but opposite in sign, effectively resulting in no wave at that point.
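
For the numerically inclined, here is a small, purely illustrative sketch of that idea: the combined wave is simply the point-by-point sum of the individual waves, so the constructive and destructive cases above fall out automatically.

```python
# Illustrative sketch: interference is point-by-point addition of amplitudes.
import numpy as np

t = np.linspace(0, 1, 500)
wave_a = np.sin(2 * np.pi * 3 * t)            # a 3 Hz wave
wave_b = np.sin(2 * np.pi * 3 * t)            # identical wave -> constructive
wave_c = np.sin(2 * np.pi * 3 * t + np.pi)    # opposite phase -> destructive

constructive = wave_a + wave_b   # amplitudes add: peaks twice as high
destructive  = wave_a + wave_c   # amplitudes cancel: essentially zero everywhere

print(np.max(np.abs(constructive)))  # ~2.0
print(np.max(np.abs(destructive)))   # ~0.0 (up to floating-point noise)
```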

When we apply this concept to a quantum computer, complete cancellation of the waves (destructive interference) can represent a logical Zero, while the presence of a resulting wave (constructive interference) represents a logical One.

So, ones and zeros. Starting to sound familiar again?

OK. Enough physics for now, let's turn our attention back to computing.

The fundamental unit of information in classical computers is the Bit.

A Bit can exist in only one of two states: 0 or 1. This represents the physical state of a transistor within the computer's circuitry. 

A 0 typically signifies that the transistor is switched off and not conducting current, while a 1 indicates that it is switched on and conducting. Historically, these two states corresponded to voltage levels of approximately 0 Volts and 5 Volts, although modern computers use lower voltages, with the '1' state around 3.3 Volts or even less.

Bits are grouped into larger units like Bytes (8 bits) and words (16, 32, 64, or 128 bits) to represent more complex information than a simple Boolean True or False (where False = 0 and True = 1). 

Classical computers utilize this binary representation to encode numbers, strings of alphanumeric characters, and program instructions.

Let’s look at a few examples.

Here’s a string of 1s and 0s that represents an internal computer “translation” of the number 5 in an 8-bit memory location (a Byte):

0 0 0 0 0 1 0 1

Here’s an internal Intel x86 processor representation of the machine instruction INC AL (hexadecimal FEC0), which increments the processor's internal register AL by 1:

1 1 1 1 1 1 1 0 1 1 0 0 0 0 0 0

Lastly, here’s an internal computer representation of alphanumeric characters using ASCII coding. The string "Hi" (hexadecimal 4869) becomes:

0 1 0 0 1 0 0 0 0 1 1 0 1 0 0 1
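
You can reproduce these bit patterns yourself. Here is a short, illustrative Python snippet (assuming standard ASCII encoding for the text example):

```python
# Illustrative sketch: the same bit patterns shown above, produced in Python.
print(format(5, '08b'))                    # 00000101 -> the number 5 in one byte

instruction = bytes.fromhex('FEC0')        # x86 machine code for INC AL
print(' '.join(format(b, '08b') for b in instruction))   # 11111110 11000000

text = 'Hi'.encode('ascii')                # ASCII bytes 0x48 0x69
print(' '.join(format(b, '08b') for b in text))          # 01001000 01101001
```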

And on and on it goes: every possible word, number, and command gets translated into an expression of Ones and Zeros, and the result is binary language.

Now that we understand a bit more about how current computers talk to us, and to each other, how is that different in a quantum system?

The execution of a program in a classical computer follows a sequential process: fetching a command (instruction or function) and its associated operands from memory, executing the command, storing the result for later use, and then moving on to the next command. The program flow is a linear stream of operations.

In contrast, the fundamental unit of information in quantum computers is the Qubit. Like a classical Bit, a Qubit can also exist in the states 0 and 1. However, the crucial difference lies in the quantum mechanical phenomenon of superposition. A qubit can exist in a probabilistic combination of both 0 and 1 simultaneously.

This is possible because we are dealing with quantum entities that exhibit wave-like behavior, such as photons (light particles). 

A helpful, though simplified, analogy is to imagine a tiny sphere that can rotate. A '1' state could be represented by a clockwise rotation ("Up"), and a '0' by a counterclockwise rotation ("Down"). However, unlike a classical object that is definitively rotating in one direction or the other, a qubit can be in a state of both rotations at the same time, with a certain probability of being measured as either clockwise or counterclockwise.
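
To make that "probabilistic combination" a little more concrete, here is a toy sketch of a qubit as a pair of complex amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1. This is a hand-rolled illustration, not code from any real quantum SDK.

```python
# Illustrative sketch: a qubit as two complex amplitudes (alpha, beta)
# with |alpha|^2 + |beta|^2 = 1. Measuring gives 0 with probability |alpha|^2
# and 1 with probability |beta|^2.
import numpy as np

qubit = np.array([1, 1], dtype=complex) / np.sqrt(2)   # equal superposition of 0 and 1

prob_0 = abs(qubit[0]) ** 2   # 0.5
prob_1 = abs(qubit[1]) ** 2   # 0.5

rng = np.random.default_rng()
outcome = rng.choice([0, 1], p=[prob_0, prob_1])   # simulated measurement
print(prob_0, prob_1, outcome)
```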

Furthermore, because qubits behave like waves, they can also exhibit interference during the execution of an algorithm. This interference allows quantum computers to explore a spectrum of probabilistic possibilities as the outcome of a computation, much like the complex wave patterns formed by multiple stones thrown into a pond.

Another significant distinction from classical computers is that quantum computers perform computations in a fundamentally parallel manner. Instead of exploring one computational path at a time, a quantum computer can, probabilistically, explore all possible outcomes simultaneously. 

Imagine searching for the exit in a labyrinth: a classical computer would try each path one after the other, in a logical succession, until it finds the correct one, while a quantum computer could explore all possible paths concurrently and then identify the one with the highest probability of leading to the exit.

This leads to one of the major challenges in quantum computing: how to extract the desired, highly probable solution from the multitude of possibilities generated by quantum computation. 

Here, the principle of wave interference is again employed. By carefully manipulating the qubits, the probabilities of incorrect solutions can be made to destructively interfere (cancel each other out), while the probability of the correct solution is enhanced through constructive interference. 

This process allows the desired outcome to be "filtered out" and then measured, converting it back into the classical realm for us to interpret.
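
Here is a toy illustration of that filtering idea (again hand-rolled, not a real quantum framework): applying the Hadamard operation once puts a qubit into an equal superposition, and applying it a second time makes the amplitudes of one outcome cancel and the other reinforce, so the measurement result becomes certain again.

```python
# Illustrative sketch: constructive/destructive interference via the Hadamard gate.
import numpy as np

H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)      # Hadamard gate

state = np.array([1, 0], dtype=complex)   # start in |0>

state = H @ state   # equal superposition: amplitudes [0.707, 0.707]
state = H @ state   # interference: the |1> paths cancel, the |0> paths add up

print(np.round(abs(state) ** 2, 3))   # [1. 0.] -> measured as 0 with certainty
```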

Just as a single bit holds limited information, a single qubit alone doesn't offer immense computational power. Quantum computers, therefore, group qubits together. Crucially, the computational power of a quantum computer grows exponentially with the number of qubits it can manipulate simultaneously.

As the chart illustrates, the increase in computational capacity is not linear but exponential.

A quantum computer with just 300 qubits can, in theory, hold a superposition of more computational states than the estimated number of atoms in the observable universe (2^300 is roughly 10^90, while the universe contains an estimated 10^80 atoms).
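
The arithmetic behind that claim is easy to check. Here is a back-of-the-envelope sketch, using the commonly cited rough estimate of 10^80 atoms in the observable universe:

```python
# Illustrative sketch: how fast 2^n outruns everything else.
ATOMS_IN_OBSERVABLE_UNIVERSE = 10 ** 80   # commonly cited rough estimate

for n in (10, 50, 100, 300):
    states = 2 ** n
    print(f"{n:>3} qubits -> {states:.3e} basis states")

print(2 ** 300 > ATOMS_IN_OBSERVABLE_UNIVERSE)   # True: ~2.0e90 vs ~1e80
```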

So all of that sounds insane.
Is this merely science fiction?

Not anymore. While quantum computers have existed in physics laboratories for some time, only recently have we begun to see the development of sufficiently stable machines with significant computational power. We are likely not far from having industrially viable quantum computers ready for the market. Some experts predict this could happen within the next decade.

But will we be using quantum computers to send emails? No way.

I mean, have you seen a quantum computer? Here’s Google’s:

Google’s Quantum Computer

No way that’s fitting on anybody’s desk.

Setting aside that it would be a colossal waste of such a powerful resource on a trivial task, and even if the cost one day becomes accessible to the average household, quantum computers are not intended for general-purpose applications. Their strength lies in tackling specific types of computationally intensive problems. They are likely to become the next generation of supercomputers.

A quick search can easily reveal some of the fields poised to benefit significantly from this technology:

  • Drug Discovery: Quantum computers can model the intricate interactions between molecules, a critical aspect of drug development. By simulating these interactions, researchers can gain valuable insights into how well a drug candidate might bind to a target molecule, predicting its effectiveness and potential side effects.

  • Material Science: Beyond pharmaceuticals, quantum computing can simulate the behavior of materials at the quantum level, potentially leading to breakthroughs in diverse areas, including the development of advanced battery technology.

  • Financial Modeling: The complexity inherent in financial models and large-scale data analysis can be significantly addressed by quantum computing, potentially leading to more accurate risk assessment and optimized investment portfolios.

  • Artificial Intelligence: Quantum computing is being actively explored for its potential to accelerate artificial intelligence and machine learning algorithms, leading to faster and more accurate predictions and insights.

  • Cybersecurity: Quantum computing presents both challenges and opportunities in the realm of cybersecurity. While it could potentially break current encryption methods, it also opens the door to the development of new, more robust, quantum-resistant encryption algorithms.

The cybersecurity aspect is particularly noteworthy due to the concern that quantum computers, with their immense computational capacity, could easily crack existing encryption protocols. 

In a significant recent development, quantum computers have already demonstrated their ability to outperform classical supercomputers in generating certified random numbers, a crucial element in many encryption schemes.

Experts predict that mature quantum computers could factorize 2048-bit RSA encryption keys, a common standard, in a matter of hours, a task that would take classical supercomputers trillions of years. 

Currently, the security of much of our digital infrastructure relies not on the encryption being unbreakable in principle, but on the impractically long time classical computers would need to perform the calculations required to recover the encryption keys.
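
To give a flavor of why factoring is the target, here is a toy, purely classical sketch of the number-theoretic trick at the heart of Shor's algorithm, applied to the tiny number 15. The expensive step, finding the period r, is exactly the part a quantum computer would do exponentially faster; the choice of numbers here is mine and only for illustration.

```python
# Toy illustration of the period-finding idea behind Shor's algorithm,
# done entirely classically for a tiny number. On a quantum computer,
# finding the period r is the step that becomes exponentially faster.
from math import gcd

N = 15   # toy modulus (secretly 3 * 5)
a = 7    # a base chosen coprime to N

# Find the period r of f(x) = a^x mod N (slow classically, fast quantumly).
r = 1
while pow(a, r, N) != 1:
    r += 1

# If r is even and a^(r/2) is not congruent to -1 mod N,
# the factors of N fall out via greatest common divisors.
if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
    p = gcd(pow(a, r // 2) - 1, N)
    q = gcd(pow(a, r // 2) + 1, N)
    print(f"period r = {r}, factors of {N}: {p} and {q}")  # r = 4, factors 3 and 5
```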

While initial concerns about the immediate collapse of current cryptography have been somewhat tempered thanks to some recent additional research, the potential implications remain profound. 

Consider the security of military communications, cryptocurrencies, banking and e-commerce transactions, even your personal hard drive backups and WhatsApp messages. In theory, a quantum computer targeted for such a purpose could cut through that security like paper.

The current strategy involves proactively developing and implementing quantum-proof cryptography protocols, with the goal of phasing out today's vulnerable methods by 2030.

The progress in quantum computing is remarkable. For instance, Google reported that their newest quantum computer, released just six years after their previous 2019 model, is already 241 million times faster than its predecessor.

Additionally, researchers at the University of Science and Technology of China demonstrated a quantum computation that would have taken classical supercomputers 2.6 billion years, completing it in just 4 minutes.

So what’s the endgame here?

As mentioned, while significant progress has been made, further work is needed to stabilize these machines and prepare them for commercialization. However, a decade is not a long time in the grand scheme of technological advancement.

This powerful technology promises to accelerate progress across various fields, fundamentally reshaping our world. We are already experiencing the transformative wave of artificial intelligence, and the fusion of AI with quantum computing will likely propel AI capabilities to unprecedented levels. As everyone can imagine, that will bring both positives and negatives.

Quantum physics will not only deepen our understanding of the universe's fundamental secrets but also drive technological advancements that will inevitably impact our daily lives as new discoveries trickle down to society.

I envision a future where a wearable device, perhaps a pair of smart glasses or a bio-engineered implant, could collect data from our surroundings, transmit it to a remote quantum computer, and receive near-instantaneous results, even for incredibly complex computations. 

Imagine looking at a vast library and having the books alphabetically sorted and a brief biography of each author displayed in your field of vision in the blink of an eye.

It kind of does seem like science fiction, right?

So, buckle up and prepare for this new chapter. 

The choice is yours: will you be a passive observer or an active participant in this unfolding technological revolution?
