Quantum computers: How do they work, and what might we expect of them?
Quantum computing is a weird concept, yet it has become a reality - at least on a limited and experimental basis. So, what is it, what can it do that traditional computing cannot, and why are quantum computers not yet available as production items?
Quantum computing sounds like the stuff of science fiction, yet, to some extent, it’s a reality. While it’s not certain when – or if – commercial machines will appear, Google and IBM, plus other tech giants and start-ups, are competing to build the first actually useful quantum device. To understand why the prospect of quantum computing is so compelling, we can start by looking at the limitations of conventional computing, and at how quantum computing avoids these limitations. Next, we can look at applications that would benefit particularly well from the technology – and at why it hasn’t yet translated into everyday, usable products.
All computers today, whether for industrial, commercial, personal or any other applications, depend on binary logic, where all numbers and states are represented by strings of 1s and 0s. Irrespective of how large or complex a computer is, it depends on transistor gates to implement this binary logic. Each gate is either OPEN for a 1, or CLOSED for 0; no other state is recognized.
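As an illustration only (not tied to any particular hardware), the binary world described above can be sketched in a few lines of Python: every value is a string of bits, and a logic gate is simply a function from bits to bits.

```python
# A classical bit is strictly 0 or 1; a logic gate maps bits to bits.
def and_gate(a: int, b: int) -> int:
    # Mirrors a transistor gate: output is 1 only if both inputs are 1.
    return a & b

# Any number is ultimately just a string of 1s and 0s:
print(format(13, "b"))   # 13 in binary -> 1101
print(and_gate(1, 0))    # 0
print(and_gate(1, 1))    # 1
```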
As integration technology has steadily improved, transistors have become smaller, allowing greater processing density, better efficiency, and faster, more powerful machines. However, this miniaturisation has reduced transistors to sizes of only a few atoms across. With any further reduction in size, the transistors are likely to stop working: through an effect known as quantum tunnelling, electrons will start to leak past the gates whether they are open or closed.
This suggests that we will meet a limit to the amount of processing power available from this technology, and that a different approach is needed for further progress. One possibility being worked on is to use light photons instead of electricity to move information within and beyond integrated circuits. Another is to use quantum computing.
The quantum concept is valuable because it allows for more than just two states. It relates to interactions between particles at a tiny scale – so small that the rules of physics as we normally experience them no longer apply. However, some of the unusual – and counterintuitive – phenomena occurring at this level could be used to overcome the limitations of traditional computing.
In quantum computing, the smallest unit of data is not the bit, but the qubit, based on a quantum property such as the spin of an electron or the polarization of a photon. Like a bit, a qubit can be set to one of two states – 0 or 1 – but unlike a bit, it is not as simple as just being on or off. Thanks to the quirks of the quantum level, a qubit can also exist in any proportion of both states, called a superposition. Sometimes this is described as being both 0 and 1 simultaneously, although that isn't entirely accurate; rather, it can sit anywhere between completely 0 and completely 1. The catch is that as soon as we actually measure a qubit, it collapses into one of the two definite states.
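Superposition and its collapse on measurement can be mimicked with a toy classical simulation – a sketch in plain Python, not real quantum behaviour, with all names our own. The state is a pair of amplitudes, and measuring returns 0 with probability equal to the squared magnitude of the first amplitude.

```python
import random

def measure(alpha: complex, beta: complex) -> int:
    """Collapse a qubit state alpha|0> + beta|1> to a definite bit.

    Returns 0 with probability |alpha|^2 and 1 with probability |beta|^2
    (amplitudes are assumed normalised: |alpha|^2 + |beta|^2 = 1).
    """
    return 0 if random.random() < abs(alpha) ** 2 else 1

# An equal superposition, halfway between completely 0 and completely 1:
alpha = beta = 2 ** -0.5
counts = [0, 0]
for _ in range(10_000):
    counts[measure(alpha, beta)] += 1
print(counts)  # roughly half 0s and half 1s, e.g. close to [5000, 5000]
```

Note that each individual measurement yields a plain 0 or 1; the superposition only shows up in the statistics over many runs.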
Counterintuitive as superposition is, its consequence is clear: the amount of information that can be represented grows exponentially as the number of qubits increases. A group of 20 qubits can describe more than a million values at once. Quantum computing also involves a further concept, quantum entanglement, in which the states of two or more qubits become linked so that none can be described independently of the others. Together, superposition and entanglement allow a quantum computer to operate on many possible values simultaneously, whereas a traditional computer must work through them one at a time.
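The exponential growth is easy to check for yourself: an n-qubit register corresponds to 2 to the power n basis states, so 20 qubits already cover over a million. A quick back-of-the-envelope in Python:

```python
# Number of basis states an n-qubit register can hold in superposition:
for n in (1, 2, 10, 20):
    print(f"{n} qubits -> {2 ** n:,} states")
# 20 qubits -> 1,048,576 states: the "more than a million values" above.
```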
This strength in particular makes quantum computing an attractive candidate for database processing, as it can check many records simultaneously. Artificial intelligence is another application that can benefit significantly from simultaneous processing; it’s based on learning from experience, becoming more accurate as feedback is given. And this feedback is based on calculating the probabilities for many possible choices.
For example, Lockheed Martin plans to use its D-Wave quantum computer to test autopilot software that is currently too complex for classical computers, and Google is using a quantum computer to design software that can distinguish cars from landmarks. Quantum computing also has the potential to disrupt online security and cryptography applications. These depend on large, encrypted key numbers that cannot be decrypted by traditional computers trying every possible combination, because the process would take an unfeasibly long time. However, quantum’s simultaneous processing capabilities could remove that protection, so new, ‘quantum-resistant’ encryption methods will become necessary.
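The scale of the brute-force problem, and of the quantum threat to it, can be put in rough numbers. The sketch below uses an illustrative 128-bit symmetric key: a classical attacker trying every key faces about 2^k candidates, while Grover's quantum search algorithm needs only on the order of 2^(k/2) queries – a quadratic speedup. (Public-key schemes such as RSA face a separate, sharper threat from Shor's factoring algorithm.)

```python
# Keys a brute-force search must consider for a 128-bit symmetric key:
k = 128
classical_tries = 2 ** k          # every possible key
grover_queries = 2 ** (k // 2)    # Grover's algorithm: square root of that

print(f"classical: 2**{k}  = {classical_tries:.3e}")
print(f"grover:    2**{k // 2} = {grover_queries:.3e}")
# This quadratic speedup is one reason 'quantum-resistant' schemes
# respond by doubling key sizes.
```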
For quantum computers to become truly useful, robust quantum error correction is essential. Currently, quantum computers are highly sensitive: heat, electromagnetic fields and collisions with air molecules can cause a qubit to lose its quantum properties. This process, known as quantum decoherence, causes the system to crash, and it happens more quickly the more particles that are involved.
Quantum computers need to protect qubits from external interference, either by physically isolating them, keeping them cool or zapping them with carefully controlled pulses of energy. Additional qubits are needed to correct for errors that creep into the system.
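The idea of spending extra qubits to catch errors has a classical ancestor worth sketching: the repetition code, where one logical bit is stored as several physical copies and read back by majority vote. Real quantum codes are far more intricate – qubits cannot simply be copied – but the redundancy principle is the same. A minimal classical sketch:

```python
from collections import Counter

def encode(bit: int) -> list[int]:
    # Store one logical bit as three physical copies.
    return [bit] * 3

def decode(copies: list[int]) -> int:
    # Majority vote recovers the logical bit despite a single flip.
    return Counter(copies).most_common(1)[0][0]

codeword = encode(1)       # [1, 1, 1]
codeword[1] = 0            # noise flips one copy -> [1, 0, 1]
print(decode(codeword))    # 1 -- the single error is corrected
```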
Nevertheless, progress continues to be made. Google, for instance, has claimed to have achieved quantum supremacy – a milestone at which quantum computing outperforms the best of traditional computing. Google claimed that a quantum processor comprising 54 qubits was able to perform a random sampling calculation in just 3 minutes and 20 seconds; it also claimed that IBM’s Summit, at the time the world’s most powerful supercomputer, would have needed 10,000 years to complete the same task.
Not everyone believes in a viable future for quantum computers. Some think that the difficulties of eliminating quantum errors will be simply too great. Only time will tell if they, or the many corporations and start-up companies investing in quantum computer technology, turn out to be right.