For certain tasks, a quantum computer can achieve far more computing power with qubits than a supercomputer can with bits. How this works, simply explained.
Traditional IT is based on bits – streams of electrical or optical pulses that represent ones and zeros. Our entire digital world, from tweets and emails to Netflix streams, consists of long strings of these binary digits. Quantum computing, in contrast, works with what are known as quantum bits (“qubits” for short).
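To see what these strings of bits look like, here is a minimal Python sketch (the two-character message is just an illustrative example) that prints the bits behind a short piece of text:

```python
# Every character in an email or tweet is ultimately stored as bits.
message = "Hi"  # illustrative example text
bits = " ".join(format(byte, "08b") for byte in message.encode("utf-8"))
print(bits)  # -> 01001000 01101001
```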
Qubits can inhabit a “superposition”: rather than being definitely 0 or definitely 1, a qubit holds a blend of both states at once. Only when it is measured does it settle on one value or the other. Think of tossing a coin – while it is in the air, the result could be heads or tails; only once it lands do we see the outcome. Because a group of qubits can be in a superposition of many combinations of 0s and 1s at the same time, a quantum computer can work on a large number of possible results in parallel.
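As a rough illustration, here is a small Python sketch (using the numpy library; the single simulated qubit is our own toy model, not a real quantum device) that puts a qubit into an equal superposition and then “measures” it repeatedly:

```python
import numpy as np

rng = np.random.default_rng(0)

# A single qubit as a 2-component state vector: |0> = [1, 0].
state = np.array([1.0, 0.0])

# The Hadamard gate puts the qubit into an equal superposition of 0 and 1.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)
state = H @ state

# Measurement probabilities are the squared amplitudes (Born rule).
probs = np.abs(state) ** 2
print("P(0), P(1):", probs)  # -> [0.5 0.5]

# Each measurement collapses the superposition to a definite 0 or 1,
# like the coin finally landing on one side.
samples = rng.choice([0, 1], size=10, p=probs)
print("Ten measurements:", samples)
```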
Think of a sudoku. While a conventional computer would try the possibilities one by one, a quantum computer can, in effect, “look” at many possibilities at once and home in on the solution far more quickly.
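The quantum algorithm that comes closest to this picture is Grover’s search. The following sketch (again plain Python with numpy, simulating the amplitudes classically; the list size of 8 and the marked index are arbitrary choices) shows how the probability of the entry we are looking for is amplified after just a couple of iterations:

```python
import numpy as np

# Toy Grover search over N = 8 entries; "marked" is the item we want to find.
N = 8
marked = 5

# Start in a uniform superposition: every index has the same amplitude.
amps = np.full(N, 1 / np.sqrt(N))

# Roughly pi/4 * sqrt(N) iterations are optimal for Grover's algorithm.
iterations = int(round(np.pi / 4 * np.sqrt(N)))

for _ in range(iterations):
    amps[marked] *= -1             # oracle: flip the sign of the marked amplitude
    amps = 2 * amps.mean() - amps  # diffusion: reflect all amplitudes about their mean

probabilities = amps ** 2
print(f"After {iterations} iterations, P(marked) = {probabilities[marked]:.3f}")
# -> about 0.945, versus 1/8 = 0.125 for a single random guess
```

For a list of N entries, roughly √N such steps are enough, instead of checking up to N entries one by one.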
What is the difference between this and supercomputing?
Quantum computing has the potential to carry out complex calculations that push a supercomputer to the limits of its capabilities. Supercomputers cannot solve mathematical problems with a very large number of possible solutions, or can do so only very slowly. A classic example is the “traveling salesman problem”: finding the shortest possible route that visits a number of cities before returning home. At first glance the task looks simple. The challenge is that the number of potential routes explodes with every additional stop the salesman plans to make. For three cities there are only two possible round trips – but for ten the number is already 362,880. For 50 cities, checking every route one by one is beyond the reach of even a supercomputer.
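To get a feel for this combinatorial explosion, here is a short Python sketch (the helper name route_count is our own, purely illustrative) that counts the possible round trips for a growing number of cities and reproduces the figures above:

```python
from itertools import permutations
from math import factorial

def route_count(n_cities: int) -> int:
    # With a fixed starting city, the remaining cities can be visited
    # in any order, giving (n - 1)! possible round trips.
    return factorial(n_cities - 1)

for n in (3, 10, 50):
    print(f"{n:>2} cities: {route_count(n):,} possible routes")

# Brute force means checking every one of those routes, e.g. for 10 cities:
cities = range(10)
start = 0
other_routes = permutations(c for c in cities if c != start)
print(sum(1 for _ in other_routes), "routes enumerated for 10 cities")  # 362,880
```

Already at 50 cities the count runs to 63 digits.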
This opens up the prospect of real industrial applications in simulation, cryptography, artificial intelligence and machine learning.