Quantum computing is attracting a great deal of interest from governments around the world and from major companies such as IBM, Google, Fujitsu and others. The UK government is clearly confident in the predicted capabilities of the technology, and has invested £270m over five years into a National Quantum Technologies Programme to accelerate the translation of quantum technologies into the marketplace.
The advent of quantum computing promises to create accurate weather forecasts, enable the discovery of new drugs, facilitate the creation of novel materials and, perhaps most impactful of all, support the creation of ‘general’ AI. These are just some of the reasons why quantum computers are being pursued with such vigour. However, true ‘universal’ quantum computing is still in the physics lab, and a range of techniques is being used to create the ‘qubits’, the essential building blocks of a quantum computer. They range from superconducting circuits (Josephson junctions) to trapped ions, quantum dots and nitrogen vacancies in diamond, through to the truly mind-bending topological quantum computing, reportedly favoured by Microsoft.
We are commonly distracted by the ‘qubit race’, where progress is measured by the number of qubits that have been generated, generally a few tens. Don’t be confused by the high qubit counts reported for quantum adiabatic computing, or quantum annealing, championed by the likes of D-Wave and Fujitsu - these are related technologies, designed to solve complex optimisation problems, but they are not true universal quantum computers. The real measure of where quantum computers have got to is more complex: not only do you need a decent number of qubits, but they also have to be stable.
In the atomic-scale quantum world, keeping anything in a defined state is incredibly hard, requiring operating temperatures approaching absolute zero and complex arrangements of microwaves, lasers and other paraphernalia to interact with the qubits. It’s interesting to note that, although the concept of quantum computing is often attributed to the famous physicist Richard Feynman in 1982 (it is more correctly attributed to Oxford’s David Deutsch), the field only started to really receive attention when Peter Shor, now at MIT, formulated an approach to quantum error correction in 1995. You need qubits and stability.
Having enabled, or at least made more feasible, universal quantum computing, Peter Shor also formulated an algorithm to efficiently factorise integers and solve discrete logarithms over finite fields and elliptic curves. This algorithm would essentially allow a quantum computer to break public key encryption algorithms like RSA. This could mean all sorts of data, from financial information to medical records, currently protected by existing public key encryption, would become readable in a few years’ time.
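The number theory behind Shor’s approach can be sketched classically. This is an illustrative toy only - the period-finding step below runs in exponential time on a conventional computer, and it is precisely that step a quantum computer would accelerate; the function names are ours, not from any quantum library:

```python
from math import gcd

def find_period(a, n):
    """Find the order r of a modulo n, i.e. the smallest r with a^r = 1 (mod n).
    Done naively here; this is the step Shor's algorithm speeds up."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_demo(n, a):
    """Classical post-processing of Shor's algorithm for a toy modulus n."""
    assert gcd(a, n) == 1, "a must share no factor with n"
    r = find_period(a, n)
    if r % 2 != 0 or pow(a, r // 2, n) == n - 1:
        return None  # unlucky choice of a - pick another and retry
    # A non-trivial square root of 1 mod n yields factors via gcd
    p = gcd(pow(a, r // 2) - 1, n)
    q = gcd(pow(a, r // 2) + 1, n)
    return sorted({p, q})

print(shor_classical_demo(15, 7))  # [3, 5]
```

With n = 15 and a = 7, the period is 4, and the gcd step recovers the factors 3 and 5. RSA’s security rests on the assumption that finding that period (equivalently, factoring n) is infeasible at scale - an assumption Shor’s algorithm breaks once large, stable quantum computers exist.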
This has added a whole new impetus to the race to build practical quantum computers. It has also started a parallel race to create quantum resistant algorithms. In 2016, the US National Institute of Standards and Technology (NIST) took the unusual step of launching a public call for candidate ‘quantum resistant’ algorithms. After receiving nearly 100 schemes, it is now focusing on 26 approaches, with the intent to down-select to a small number next year. NIST will then update the standards for digital signature and public key encryption algorithms, which it cites as the most vulnerable.
So, although useable quantum computers are still some years away, the reality is that we are laying down data now that could be compromised in a few years’ time. Hence NIST’s interest - we need to have quantum resistant approaches as soon as possible, if for no other reason than it will take several years to properly integrate the new algorithms into our information processing and storage ecosystems.
It’s not all bad news - it turns out that private key (symmetric) encryption is much less vulnerable to attack: although quantum-enabled techniques such as Grover’s algorithm exist, they offer only a quadratic speed-up, which can be countered by doubling key lengths. However, the field is young, new ideas are being mooted, and this position of relative security may yet be challenged. It’s also apparent that quantum computers will not sound the death knell of conventional digital computers. There are certain types of problem for which quantum computers will be enormously useful; equally, there are many kinds of work for which conventional computers will continue to be the most practical tools.
So, there are enormous benefits to creating practical quantum computers but also worrying threats - the concerns about security are not confined to the field of cryptography.
Think of the economic and security dividend that will accrue to countries that have quantum computers. Think of the demands, in terms of skills, that quantum computing will make upon our workforces - the programming paradigms available to quantum computers are radically different from those of conventional computers; concepts like variables and loops have no direct equivalent in the quantum world. Think also about an emerging ‘quantum divide’, where those with access will be significantly advantaged compared to those without. Think of the potential to exploit quantum computing for criminal purposes, from hacking financial transactions to designing new forms of uncontrolled drugs.
As ever with a radically new technology, we have to think carefully about how it is used and controlled. The UK is well placed to be at the forefront of quantum computing, with many of our universities leading research in the area, often working closely with tech giants and startups. We have the education base, a national culture of innovation and the legal and professional frameworks to support the wider needs of this embryonic industry. The UK invented the conventional digital computer, perhaps we can also be at the forefront of the quantum computer.
Dr Andrew Rogoyski is Innovation Director for Roke Manor Research
Find out more about techUK's #QuantumFuture week by jumping to our landing page now or get in touch with Tom.Henderson@techUK.org or Sue.Daley@techUK.org today!