
What Are The Advantages of Quantum Computing?

By Oliver Bean - Engineering Student @ Wadham College, Oxford


Quantum computers are often hailed as the future of computing, and their possible applications in physics simulations, cryptography and complex modelling could have wide-reaching impacts. However, quantum computers have been in development for decades (the first physical realisation of a quantum computing component was in 1988) and yet they still cannot outperform classical computers at most practical tasks. In this article I will investigate the theory and engineering behind quantum computing and compare it with traditional computers.


The computers we use today are based upon a physical component being in one of two states - ‘on’ or ‘off’. The Harvard Mark 1 computer (used during the 1940s) had mechanical switches which were electrically controlled. When an electrical signal was passed to a switch, a current flowed through a coil of wire. According to Ampère’s law, this generated a magnetic field which in turn attracted the metal switch, allowing the gate to be opened or closed. Due to the time delay for the physical component to move, this computer was only capable of performing 3 additions or subtractions per second. The machine was unreliable for a number of reasons - on one occasion a moth was found to be the cause of a fault in the Harvard Mark 2 computer, and this is why programmers still complain of computer ‘bugs’!


Computers have since advanced, progressing from vacuum tubes to the modern-day transistors that now serve as the internal switches. Transistors make use of electrodes and semiconductor materials to switch between ‘on’ and ‘off’ states far more rapidly than mechanical switches, as they contain no physically moving components. The engineering concepts behind transistors and vacuum tubes are equally ingenious, and you can explore more about them in the links below.


Contrastingly, current quantum computing is based on an entirely different system. The fundamental building block of quantum computing is the quantum bit - or qubit. A qubit is typically realised using a subatomic particle, such as an electron or a photon, and hence is subject to quantum mechanics as opposed to classical physics. A quantum computer measures a property of this particle to represent a computing state; this could be the spin of an electron or the polarisation of a photon. Interestingly, a qubit can exist in a state of ‘superposition’ - it can be neither definitely ‘on’ nor definitely ‘off’ but instead a combination - or superposition - of these two outcomes.
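
To make superposition a little more concrete, here is a minimal sketch in Python with numpy. It is only an illustrative classical simulation of a qubit’s state vector, not how a real quantum device works internally: a qubit state is represented as a normalised pair of amplitudes over the ‘off’ and ‘on’ basis states.

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)   # the 'off' basis state |0>
ket1 = np.array([0, 1], dtype=complex)   # the 'on' basis state |1>

# An equal superposition: the qubit is a combination of both outcomes.
psi = (ket0 + ket1) / np.sqrt(2)

# On measurement, each outcome occurs with probability |amplitude|^2.
probabilities = np.abs(psi) ** 2
print(probabilities)   # [0.5 0.5] -> equal chance of reading '0' or '1'
```

Until the qubit is measured, both amplitudes are carried along at once; measurement then yields a single ‘0’ or ‘1’ with the probabilities above.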


To illustrate the impact of this phenomenon, consider that a combination of 2 classical computing bits can hold only one of 4 possible values at any given time. Take a ‘0’ to equal ‘off’ and a ‘1’ to equal ‘on’ - the possible outcomes are 00, 01, 10, and 11. However, a combination of 2 qubits - due to superposition - can correspond to all 4 at once. This is because each qubit can be both ‘on’ and ‘off’ simultaneously: the first qubit can be both ‘0’ and ‘1’, and the second can also be both ‘0’ and ‘1’, so the two combined qubits can simultaneously represent 00, 01, 10 and 11. Computer scientists have written algorithms that make use of this to explore multiple computing branches at the same time, giving quantum computers an edge over classical computers. In situations where there are a large number of possible combinations, quantum computers can in effect consider many paths simultaneously. Examples where this could be revolutionary include simulating chemical reactions or finding the best route between two places.
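
As a rough sketch of the two-qubit example above (again only a classical numpy simulation of the state vector, so it carries no genuine quantum speed-up), putting each qubit into superposition with a Hadamard gate and combining them gives a joint state whose four amplitudes cover 00, 01, 10 and 11 at the same time:

```python
import numpy as np

# Hadamard gate: sends |0> to an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

ket0 = np.array([1, 0], dtype=complex)

# Put each qubit into superposition, then combine them with a tensor
# (Kronecker) product to form the joint two-qubit state.
qubit_a = H @ ket0
qubit_b = H @ ket0
state = np.kron(qubit_a, qubit_b)

# The four entries are the amplitudes of 00, 01, 10 and 11.
for label, amplitude in zip(["00", "01", "10", "11"], state):
    print(label, abs(amplitude) ** 2)   # each outcome has probability 0.25
```

The joint state has four non-zero amplitudes at once, which is the sense in which two qubits can represent all four combinations simultaneously, even though a measurement still returns only one of them.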


However, not all computing scenarios require this simultaneous branching ability. There are plenty of situations where classical computers will perform just as well as quantum computers - streaming a video online or other types of basic data transfer, for example. There is speculation that devices of the future may combine both classical and quantum computers. A further downside of quantum computing is that the quantum properties being measured are incredibly sensitive, so quantum computers must be kept at temperatures just above absolute zero. This requires large amounts of energy and currently prohibits more widespread use of quantum computers.


Despite this, as the scale of computer chips decreases we are beginning to reach the limits of classical computing. At the nanometre and picometre scale quantum mechanics dominates, and as our understanding of this area improves, the possibility of developing a fully functional quantum computer moves ever closer.


Further Reading:
