The growing relationship between quantum and cloud
The term quantum computing has been in circulation since the 1980s, when physicist Paul Benioff proposed a quantum mechanical model of the Turing machine. By 1988, Yoshihisa Yamamoto and Kazuhiro Igeta had proposed the first physical realisation of a quantum computer, so why are we still talking about quantum computing as a futuristic concept? The truth is that adoption of quantum computers has been slow. They have traditionally been prohibitively expensive to develop and, until now, the preserve of a few key players with adequate capital and R&D capabilities, such as Google and IBM.
For these reasons, McKinsey predicts that adoption will remain fairly slow, with only between 2,000 and 3,000 quantum computers in operation by 2030. However, the future of quantum is as bright as it has ever been. While predictions of the potential quantum computing market size vary, in June 2021 the UK government invested £210m in an artificial intelligence (AI) and quantum computing centre to help cement the UK’s status as a global leader in scientific research.
A new age of computing
Leaving aside the challenges faced in its 40-year history, the reason the future is bright for quantum computers is that they can do things digital computers can’t. Simply put, they work differently from digital computers. Quantum computers should also not be confused with supercomputers, which are more powerful versions of everyday digital computers that solve problems more quickly but using the same process. Even supercomputers follow the model of computing described by Alan Turing’s theoretical machine in 1936: solving problems step by step, working through possible scenarios one after another until a solution is found. Quantum computing, on the other hand, holds many possible states in superposition at once, which in effect lets it explore different candidate solutions to a problem simultaneously. The result is not just that quantum computers can solve certain problems quickly; they can also tackle problems too complex for even the most powerful supercomputers in existence.
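To make the idea of superposition concrete, the toy sketch below classically simulates the statevector of a small quantum register. It is illustrative only (a real quantum computer is not simulated this way at scale): after a Hadamard gate on each qubit, the register carries an amplitude for every possible bitstring at once, which is the sense in which quantum computers hold all scenarios simultaneously.

```python
# Toy statevector sketch (illustrative, not a real quantum device): a
# classical simulation showing how a quantum register holds an amplitude
# for every basis state at once after a Hadamard gate on each qubit.

def equal_superposition(n):
    """Return the statevector of n qubits after a Hadamard on each qubit:
    an equal superposition over all 2**n classical bitstrings."""
    amp = 1 / (2 ** (n / 2))  # 1/sqrt(2**n)
    return {format(i, f"0{n}b"): amp for i in range(2 ** n)}

state = equal_superposition(3)
# Eight bitstrings, each held simultaneously with amplitude 1/sqrt(8);
# a measurement returns each one with probability amp**2 = 1/8.
total_probability = sum(a ** 2 for a in state.values())
```

Note the catch this simple picture hides: measurement collapses the superposition to a single bitstring, so useful quantum algorithms must arrange for amplitudes to interfere so that correct answers become more likely to be observed.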
For this reason, quantum computers will be used to solve really huge problems. Industries such as healthcare, for example, will use quantum computing to accelerate medical research, potentially leading to new treatments and cures for serious diseases. The financial services industry stands to benefit by using this immense computing power to better calculate risk and price derivatives. Monte Carlo simulations, a computerised mathematical technique used for calculating risk in quantitative analysis, are traditionally run on supercomputers; they are among the first algorithms that would benefit from quantum computers. Large automotive vendors are already trialling the use of quantum computers to design better batteries, use robotic arms in manufacturing more efficiently, and optimise complex supply chains. The aviation industry is investigating how quantum computers can help improve the efficiency and ensure the safety of flight paths. Manufacturers, especially those making new materials and chemicals, will be among the first to use quantum computers in production in the next few years.
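To illustrate why Monte Carlo is so computationally hungry, here is a minimal classical sketch of the technique applied to risk: repeat a random scenario many thousands of times and read a risk figure off the resulting distribution. All figures (drift, volatility, horizon) are hypothetical, chosen purely for illustration; accuracy improves only with the square root of the number of trials, which is exactly where a quantum speed-up is hoped for.

```python
# Minimal classical Monte Carlo sketch of a risk calculation. All figures
# (drift, volatility, horizon) are hypothetical, chosen for illustration.
import random

random.seed(7)  # fixed seed so the sketch is reproducible

def monte_carlo_var(drift=0.0005, vol=0.02, horizon=10,
                    trials=20_000, level=0.95):
    """Estimate Value-at-Risk: simulate `trials` cumulative daily returns
    over `horizon` days and return the loss exceeded in only (1 - level)
    of scenarios."""
    losses = []
    for _ in range(trials):
        cumulative_return = sum(random.gauss(drift, vol)
                                for _ in range(horizon))
        losses.append(-cumulative_return)  # positive value = a loss
    losses.sort()
    return losses[int(level * trials)]

var_95 = monte_carlo_var()  # 95% Value-at-Risk as a fraction of the position
```

Each trial here is independent, so the workload parallelises well on supercomputers today; quantum amplitude-estimation approaches aim to reach the same accuracy with quadratically fewer samples.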
Increasingly, high-performance computing (HPC) use cases are drawing on the power of the public cloud. One such example is the partnership between Formula 1 and AWS, with HPC in the cloud being used to fine-tune the design of the current F1 car. Quantum computing will augment HPC and enable it to solve problems it currently can’t by finding solutions in a parallel rather than a linear fashion, at scale and speed. This will also help reduce some of the data debris created by these huge computational undertakings, cutting digital waste.
A quantum leap
As with HPC and supercomputing, the public cloud is key to democratising quantum. This emerging relationship with the public cloud is the reason quantum computing will attract investment and experience growth without a vast uptick in the number of physical quantum computers being manufactured. Quantum computing solutions are already available via the public cloud, with the hyperscalers vying for dominance in this space. Customers can now experiment with various cloud-based quantum computers for applications in quantum machine learning, optimisation and simulation of chemical molecules and materials. In the coming years we can expect this relationship to grow even further as businesses at the forefront of digital transformation (DX) move towards quantum transformation (QX), experimenting with complex workloads like HPC and high-performance AI for competitive advantage.
Data plays a critical role in HPC and AI. Whilst the volume of data consumed by quantum computers is currently low, with current projects using synthetic rather than real data, the real test will come in the next two to three years, when massively powerful quantum computers can use real enterprise data. It is clear that quantum computers are unlikely to be deployed in enterprise data centres and will instead be accessed via the public cloud, increasing the need to migrate data from on-premises environments to the cloud. NetApp’s data fabric can help businesses looking to migrate data and workloads to the public cloud, or indeed move between clouds. This will help businesses investing in AI, machine learning, the Internet of Things (IoT) and edge computing get more value from their data and give them freedom of choice no matter where their data resides. With a data fabric in place, businesses across all industries can reap the benefits of cloud, quantum computing, machine learning and IoT, and fundamentally transform their relationship with technology. After all, speed is the new scale, and the speeds that quantum computing offers will dramatically increase the scale of computing available to the enterprise.
Deepth Dinesan, Director, Artificial Intelligence, Robotics, Quantum Computing, NetApp
Quantum Commercialisation Week
Click here to read more insights published during techUK's Quantum Commercialisation Week