The Sky is the Limit: Quantum Computing Explained

For all of human civilization, we’ve continually upgraded our systems. Fire and sticks became rockets and nuclear weapons; hunting and gathering gave way to 3D-printed food. However, the biggest upgrade has happened to our brains.

Well, not really. Our computers, more precisely. These metallic brains won’t slip up on a calculus problem and are rapidly overtaking many functions we currently consider human.

One of Google's quantum computers | Stephen Shankland/CNET

How it Works Now

To understand quantum computers, we must first understand classical computers.

Computers manipulate data, expressed in bits (0 or 1), using transistors. Transistors are essentially on/off switches: current either flows through them or is blocked, representing the two bit values. These transistors form logic gates, which take certain inputs and produce a certain, predictable output. For example, an AND gate outputs 1 only when both inputs are 1; if the inputs are 0 and 1, it outputs 0.
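The gates above can be sketched as tiny Python functions; the gate names are standard, but the helper names are our own:

```python
# A minimal sketch of classical logic gates as functions on bits.

def and_gate(a: int, b: int) -> int:
    """Outputs 1 only when both inputs are 1."""
    return a & b

def or_gate(a: int, b: int) -> int:
    """Outputs 1 when at least one input is 1."""
    return a | b

def xor_gate(a: int, b: int) -> int:
    """Outputs 1 when exactly one input is 1."""
    return a ^ b

# Truth table for AND, matching the example in the text.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, and_gate(a, b))
```

Note how each gate is a pure function: the same inputs always give the same output, which is exactly the predictability quantum gates will later give up.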

The next level up is basic modules: combinations of many logic gates that perform simple calculations, such as adding two binary numbers.

And finally, these modules are combined onto a chip, powering computers and processors worldwide.

The Limits

However, as transistors become smaller and smaller, the limits of our physical world become all too clear. Current transistors are around 8-14 nanometers, far smaller than an HIV virus or a red blood cell. And the smaller we get, the more quantum effects like electron tunneling are going to make things tricky for us.

10nm Transistor | Toshiba

And it’s not just transistors, either. There’s a whole host of reasons why we can’t rely on hardware alone to solve our problems. Briefly summarized, they are…

Moore’s Law: The observation that the number of transistors on an integrated circuit doubles approximately every two years. The pace has already begun to slow, and experts believe the trend will end around 2030.

Pollack’s Rule: Microprocessor performance gains from microarchitecture advances are roughly proportional to the square root of the increase in complexity, where complexity refers to the amount of processor logic. Doubling the logic, in other words, buys only about a 40% performance improvement.

The Great Moore’s Law Compensator: Software advances more slowly than hardware, so software ends up limiting what the hardware can deliver.
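Pollack’s Rule is easy to see with a little arithmetic; here is a minimal sketch (the function name is our own):

```python
import math

# Pollack's Rule: performance gain from microarchitecture advances is
# roughly the square root of the increase in processor logic complexity.

def pollack_speedup(complexity_ratio: float) -> float:
    """Approximate performance ratio for a given increase in logic."""
    return math.sqrt(complexity_ratio)

print(pollack_speedup(4.0))  # 4x the logic yields only about 2x the speed
print(pollack_speedup(2.0))  # 2x the logic yields only about 1.41x
```

The diminishing returns are the point: throwing more transistors at a single core stops paying off long before Moore’s Law itself runs out.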

And to find a way around all this, let’s take a visit to the quantum realm.

The Quantum Realm

Cool, we're finally here.

So let’s talk about superposition. Unlike in a classical system, in a quantum computer, qubits replace bits. A qubit is any two-state quantum system whose states stand in for 0 and 1 -- a photon’s horizontal or vertical polarization, for example. While unobserved, however, a qubit can be in any proportion of both states at once, somewhere between 0 and 1. But as soon as you measure it, it collapses into one of the two states, either 0 or 1. A famous thought experiment about superposition is Schrödinger's cat, in which a hypothetical cat is simultaneously both alive and dead until the box is opened.

Superposition Explained | A2Apple

For computing, however, we can see that superposition is a game changer. A register of n qubits can represent all 2^n combinations of 0s and 1s at once, giving us many orders of magnitude more states to work with than a classical computer -- and with the right algorithms, engineers and scientists can extract the desired answer without performing each calculation separately.
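The exponential growth is worth seeing concretely. Below is a minimal sketch of a qubit register as a plain list of complex amplitudes, one per classical state (the representation and the function name are our own choices):

```python
# A register of n qubits is described by 2**n complex amplitudes,
# one for each classical bit string. The squared magnitudes of the
# amplitudes are the measurement probabilities, so they must sum to 1.

def uniform_superposition(n_qubits: int) -> list[complex]:
    """Equal superposition over all 2**n basis states."""
    size = 2 ** n_qubits
    amp = 1 / size ** 0.5  # equal amplitude for every basis state
    return [complex(amp, 0)] * size

state = uniform_superposition(10)
print(len(state))                        # 1024 amplitudes from just 10 qubits
print(sum(abs(a) ** 2 for a in state))   # total probability, ~1.0
```

Ten qubits already require 1,024 amplitudes; 300 qubits would require more amplitudes than there are atoms in the observable universe, which is why classical machines can’t simply simulate their way around this.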

Ok. Let’s talk about entanglement now. Entangled qubits are correlated so strongly that measuring one lets you immediately deduce its partner’s state, regardless of where the two qubits are in the universe. The qubits affect each other in ways that allow R&D engineers to exploit this odd quality to increase computing speeds drastically.
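The textbook example of entanglement is the Bell state (|00⟩ + |11⟩)/√2: each qubit looks random on its own, yet the two measurement outcomes always agree. A minimal sketch of sampling from that state (the helper name is our own):

```python
import random

# In the Bell state (|00> + |11>)/sqrt(2), only the outcomes 00 and 11
# have nonzero amplitude, each with probability 1/2. Measuring one
# qubit therefore fixes what the other will read.

def measure_bell_pair() -> tuple[int, int]:
    """Sample one joint measurement of an entangled Bell pair."""
    outcome = random.choice([0, 1])  # 00 or 11, with equal probability
    return outcome, outcome          # the partner always matches

for _ in range(5):
    a, b = measure_bell_pair()
    assert a == b  # perfectly correlated, every single time
```

This toy hides the physics, of course -- the point is only the statistics: individually random, jointly perfectly correlated.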

The last major principle relevant here is the quantum gate. Quantum gates manipulate superpositions, rotating the probability amplitudes of a qubit’s states. Unlike classical logic gates, they don’t map definite bits to definite bits -- they take a superposition as input and produce another superposition as output. Only a final measurement collapses the result into a sequence of 0s and 1s.

By cleverly manipulating superposition and entanglement, we get quantum computers: vastly more efficient machines that will easily be able to store and manipulate massive chunks of data quickly.

But Where to Set Up Shop

Anyone who hasn’t followed quantum computing closely is probably really confused by now. We’re used to understanding how our machines work, but not to worry -- quantum computers are not, and will not be, anywhere near your home. Instead, a quantum computer will probably be used for very specific applications that have a small input and output but an enormous space of possibilities in between. Several of these possible applications are outlined here.

High-Level Diagram of Shor's Algorithm
  1. Database searching. Using Grover’s algorithm, quantum computers need only about the square root of the number of steps a classical computer takes to search an unsorted database, making searching vast quantities of data far faster.

  2. IT security. Currently, deducing a private encryption key from the public key takes an exorbitant amount of time, which is what keeps encryption secure. With quantum computers running Shor’s algorithm, however, all of this can change, as the underlying factoring calculations can be performed dramatically faster than they currently are.

  3. Simulations and modeling. Current simulations are resource-intensive, painfully slow, and often inaccurate. With quantum computing, true quantum simulations could become a reality, possibly revolutionizing humanity forever. We may be able to model entire molecules, DNA, or even the human brain itself, systems whose behavior is quantum-mechanical at the smallest scales.

  4. NP-complete problems. These have small inputs and outputs but an astronomically large number of candidate solutions, so classical computers cannot solve large instances in any reasonable time. We may be able to leverage quantum computers to look at climate change, train artificial intelligence, or analyze financial market patterns much more efficiently than with classical computers.
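The search speedup in item 1 above is easy to quantify. A minimal sketch comparing lookup counts, assuming the standard iteration count for Grover's algorithm (the function names are our own):

```python
import math

# Searching an unsorted list of N items takes about N/2 lookups on
# average classically; Grover's algorithm needs roughly (pi/4)*sqrt(N)
# quantum iterations to find the target with high probability.

def classical_lookups(n: int) -> float:
    """Expected lookups for unsorted classical search."""
    return n / 2

def grover_iterations(n: int) -> float:
    """Standard Grover iteration count, ~(pi/4) * sqrt(N)."""
    return (math.pi / 4) * math.sqrt(n)

n = 1_000_000
print(round(classical_lookups(n)))   # 500000 classical lookups
print(round(grover_iterations(n)))   # ~785 quantum iterations
```

For a million items, that is roughly 500,000 lookups versus about 785 iterations -- a quadratic, not exponential, speedup, but dramatic at scale.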

Classical DNA modeling

Of Course There Are Problems

Quantum computing isn’t coming anytime soon. And here are some of the reasons why.

Quantum science at Yale
  1. Qubits are incredibly fragile and “picky”. They need to be shielded from virtually any radiation exposure and must be kept incredibly cold -- on the order of 10 millikelvin, colder than deep space. Building and running the hardware required to keep these qubits in a state of superposition is incredibly difficult and costly, and the most advanced quantum computers today have barely over 50 qubits.

  2. Probabilistic output. Quantum computers, unlike classical computers, do not give a single definite answer -- they give each possible answer with some probability. Engineers and scientists are working to make quantum computers more “accurate” by using wave interference to cancel out the incorrect possibilities, but reliably reaching the theoretical success rate is still incredibly far off.
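In practice this means a quantum program is run many times ("shots") and the answer is read off the resulting distribution. A minimal sketch, faking the device with a simple 50/50 qubit (all names here are our own):

```python
import random

# A quantum computer's output is a sample from a probability
# distribution, so programs are run for many "shots" and the
# answer is inferred from the histogram of outcomes.

def run_shot() -> str:
    """One measurement of an equal superposition of |0> and |1>."""
    return random.choice(["0", "1"])

shots = 10_000
counts = {"0": 0, "1": 0}
for _ in range(shots):
    counts[run_shot()] += 1

print(counts)  # roughly 5000 each; the exact split varies run to run
```

This repetition overhead is part of why raw qubit counts alone don’t tell you how useful a machine is.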

Phew, We’re Done: In Conclusion

Quantum computers are heavily hyped by the media right now, with some even saying that they will somehow solve climate change or break encryption within the next five years. I can confidently assure everyone that this will not happen; we are still in the early stages of understanding and developing the underlying technology, and practical machines remain a distant prospect.

And while they’re part of humanity’s incredibly exciting future, quantum computers will not replace classical computers anytime soon.

The famous paradox of Schrödinger's cat | CC0 Public Domain


©2020 by IgniteMinds