Quantum Computing: A New Technological Frontier

Aidan Bartholomew
5 min read · Dec 31, 2019


Quantum computing is a buzz-phrase that most will recognise but few could explain. Tech giants like IBM and Google are racing to build a quantum computer more reliable and more powerful than their rivals', as theory indicates that quantum computing could well be the next significant technological frontier.

In computers as we know them, computations are done using "classical" bits, which can be either zero or one. In a quantum computer, quantum bits, or "qubits", are used instead. Several different particles can serve as a qubit: a photon, a nucleus or an electron, since each of these exhibits quantum phenomena. The quality that makes them useful in computing is their ability to be in two states simultaneously; we can only determine the probability that the particle is in one state or the other. This is known as superposition. Once measured, however, the qubit will be found in one definite state or the other.
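To make this concrete, here is a minimal sketch of a single qubit as a vector of two complex amplitudes, written in plain Python with NumPy rather than any quantum SDK; the names and the 50/50 superposition are illustrative choices, not anything from this article's sources.

```python
import numpy as np

# Basis states |0> and |1> as two-element vectors.
zero = np.array([1, 0], dtype=complex)
one = np.array([0, 1], dtype=complex)

# An equal superposition: the qubit is in both states at once.
qubit = (zero + one) / np.sqrt(2)

# Squared amplitudes give the measurement probabilities
# (the Born rule); here each outcome is 50/50.
probabilities = np.abs(qubit) ** 2
print(probabilities)  # [0.5 0.5]

# Measurement collapses the qubit to one definite outcome.
outcome = np.random.choice([0, 1], p=probabilities)
print(outcome)  # either 0 or 1, never both
```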

To illustrate the quantum computer's advantage in expressing more information, let's look at it in reverse: describe the system using pieces of information, then count how many pieces it took. Compare two classical bits with two qubits. For the classical bits there are 2² = 4 possible combinations, but the pair will only ever be in one of those four states. To express that state, I only need to give you two pieces of information: the value of the first bit and the value of the second. Now consider two qubits. They also have four possible combinations of states, but because of their quantum behaviour, these states are all in superposition. To express the state of this system, I must give you the probability of each of the four states occurring once measured. In this sense, two qubits express four pieces of information, whereas two classical bits express only two.
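The difference shows up directly in how much data it takes to describe each system. This sketch, under the same illustrative assumptions as above, just counts the numbers involved.

```python
import numpy as np

# Two classical bits: one definite state, two pieces of information.
classical_state = (0, 1)

# Two qubits: 2**2 = 4 complex amplitudes, one per basis state
# |00>, |01>, |10>, |11>. All four numbers are needed to describe
# the system before measurement.
amplitudes = np.full(4, 0.5, dtype=complex)  # an equal superposition
probabilities = np.abs(amplitudes) ** 2
for state, p in zip(["00", "01", "10", "11"], probabilities):
    print(f"|{state}>: probability {p:.2f}")
```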

By this logic, three qubits express 2³ = 8 pieces of information: the number of pieces of information grows exponentially with the number of qubits. IBM's biggest quantum computer at the time of writing has 53 qubits, and can thus express 2⁵³ pieces of information. Dario Gil, the head of IBM's research lab, has said: "Imagine you had 100 perfect qubits. You would need to devote every atom of planet Earth to store bits to describe that state of that quantum computer. By the time you had 280 perfect qubits, you would need every atom in the universe to store all the zeros and ones."
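That thought experiment is easy to check with back-of-the-envelope arithmetic: simulating n qubits classically means storing 2^n complex amplitudes. Assuming 16 bytes per amplitude (two 64-bit floats, an implementation detail rather than a physical constant), the memory requirement explodes.

```python
# Memory needed to store the full state of n qubits classically,
# assuming 16 bytes per complex amplitude.
for n in [10, 30, 53]:
    bytes_needed = (2 ** n) * 16
    print(f"{n} qubits: 2**{n} amplitudes, ~{bytes_needed:.2e} bytes")
# 10 qubits: ~1.6e+04 bytes (16 kilobytes)
# 30 qubits: ~1.7e+10 bytes (17 gigabytes)
# 53 qubits: ~1.4e+17 bytes (over 100 petabytes)
```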

Another quantum principle that comes into play is quantum entanglement, or, as Einstein memorably called it, "spooky action at a distance." When two quantum particles become entangled, their states become correlated no matter how far apart they are, which means that by observing the state of one qubit, you can make inferences about the other. In some hardware, scientists use microwave pulses to entangle qubits. The principle isn't limited to pairs, either: quantum computer scientists try to achieve an entangled state across many more than two qubits. As you can imagine, involving more qubits makes this entangled state more fragile.
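Here is a sketch of how two qubits end up entangled, again in bare NumPy: put one qubit in superposition with a Hadamard gate, then apply a controlled-NOT. (Real hardware realises these gates physically, for instance with the microwave pulses mentioned above.)

```python
import numpy as np

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
I = np.eye(2, dtype=complex)

state = np.array([1, 0, 0, 0], dtype=complex)  # both qubits in |00>
state = CNOT @ (np.kron(H, I) @ state)         # a Bell state

# Only |00> and |11> remain possible: the qubits are perfectly
# correlated, so measuring one tells you the other's value.
print(np.abs(state) ** 2)  # [0.5 0.  0.  0.5]
```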

Because qubits are built from subatomic particles, they are very sensitive to outside stimuli such as vibration and temperature. This interference can cause the qubits to lose their quantum state, an effect called decoherence, which in turn disrupts the entangled states the qubits share with each other.
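A toy model, not a physical simulation, makes the point about fragility: suppose a qubit's state survives to time t with probability exp(-t/T2), where T2 is a made-up coherence time of 100 microseconds.

```python
import numpy as np

T2 = 100.0  # illustrative coherence time in microseconds
for t in [1, 10, 100, 500]:
    print(f"after {t:3d} us: P(state intact) = {np.exp(-t / T2):.3f}")
# The longer a computation runs, the less likely the fragile
# quantum state has survived decoherence.
```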

Despite their immense power, quantum computers likely won’t become a replacement for classical computers. Instead, they will be employed to carry out calculations that would simply take too long on a classical computer. Some areas where lots of large computations are involved include cryptography, artificial intelligence, and molecular modelling.

Cryptography as it's used on the internet today works because a classical computer would take impossibly long to find the secret numbers that break the encryption. This is a prime example of where a quantum computer could be significant: running Shor's algorithm, it could factor the large numbers behind today's public-key encryption exponentially faster than any known classical method. However, this cuts both ways, and there is already a great deal of research into quantum encryption, which could usher in a new era of cyber security. This is just one area in which quantum computing could have a huge impact.
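As a hedged illustration of the classical side of this, here is the brute-force approach to factoring the product of two primes, the hard problem underlying RSA-style encryption; Shor's algorithm would solve the same problem in polynomial time on a quantum computer.

```python
def smallest_factor(n: int) -> int:
    """Find the smallest prime factor of n by trial division."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n  # n itself is prime

# A small semiprime (104729 * 1299709). Real RSA moduli are
# hundreds of digits long, far beyond this loop's reach.
print(smallest_factor(136117223861))  # 104729
```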

The technology certainly sounds exciting, so why don't we have it right now? The main answer to that question is that current quantum computers have a big problem with reliability. Putting all the qubits into a quantum state and then entangling them is difficult to achieve and even harder to maintain. With today's quantum computing technology, only so many computations can be done before this delicate quantum state breaks down and the information is lost.

Quantum computing is still relatively young, and it's facing many of the same problems that classical computing experienced on its rise to prominence. In the near future, researchers will be working to improve these machines' fault tolerance, reliability and capacity. The potential areas for application are just the tip of the iceberg. There's no doubt that this new kind of ultra-powerful computing will find applications that we could never predict, a bit like the state of a quantum particle.

References

1. Mark Jackson, 2017, "6 Things Quantum Computers Will Be Incredibly Useful For," Singularity Hub. https://singularityhub.com/2019/02/26/quantum-computing-now-and-in-the-not-too-distant-future/ (accessed 11 Sep. 2019).

2. Veritasium, 2013, "How Does a Quantum Computer Work?" [video], YouTube. https://www.youtube.com/watch?v=g_IaVepNDT4 (accessed 11 Sep. 2019).

3. Dennis Overbye, 2019, "Quantum Computing Is Coming, Bit by Qubit," The New York Times. https://www.nytimes.com/2019/10/21/science/quantum-computer-physics-qubits.html (accessed 10 Sep. 2019).

4. Andrea Morello, 2018, "Lunch & Learn: Quantum Computing" [video], YouTube. https://www.youtube.com/watch?v=7susESgnDv8 (accessed 11 Nov. 2019).
