Every day, businesses are discovering the transformative power artificial intelligence (AI) offers their organizations. The AI Winter of the 1980s and '90s has thawed into spring, unleashing a torrent of machine learning (ML) software and hardware and sparking an AI renaissance.
While ML, and by extension artificial intelligence, has received a lot of press and attention in recent years, quantum computing (QC) has been steadily growing in utility without yet grabbing the media spotlight. The first quantum computing company, D-Wave, was founded over 15 years ago and has steadily increased the number of qubits on its chips in line with Rose's law; only recently have those chips become practical for use outside of pure research.
Quantum Computing 101
Before I jump into the relationship between AI and QC, let me provide a little background on quantum computing without getting too far into the weeds of quantum physics. A common misconception is that because quantum computers can be in all states simultaneously, they can break encryption or solve problems instantaneously. What is true is that quantum computers can solve specific types of complex problems orders of magnitude faster than their classical counterparts.
As we shrink classical computer chips, their feature sizes become so small that quantum effects, which cannot be observed at the macro scale but dominate at the atomic scale, start to interfere with their operation. For example, at the nanometer scale (one billionth of a meter), electrons can tunnel through a layer of insulation instead of being stopped by it. At the atomic scale, it is nearly impossible to keep electric currents from traveling where you don't want them to. Companies like Intel and TSMC have designed novel transistors to minimize current leakage, but with 5 nm chips arriving soon, we are very close to the limits of what conventional technology can achieve.
The field of quantum computing takes an entirely different approach. Instead of asking how we can suppress quantum effects in order to compute, what if we could harness the strange behavior at the quantum level to perform useful calculations? Most of us have heard of Schrödinger's cat, which is said to be in a "superposition" of alive and dead at the same time. You may also have heard of entanglement, or "spooky action at a distance," where two particles separated by a great distance take on correlated states the moment either one is measured.
Quantum computing makes use of these effects in qubits (quantum bits), the basic building blocks of a quantum computer. In traditional computing, binary "bits," the 0s and 1s, encode all the information a computer stores and processes. A qubit can hold a value of 0 or 1, but it can also be in a superposition, a combination of both that has no well-defined value in the classical sense until it is measured. Qubits can also be entangled with one another at varying strengths, so that if one qubit settles on a value of 0 or 1, a qubit entangled with it is more likely to take the same value.
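To make superposition and entanglement a little more concrete, here is a minimal sketch using plain NumPy (not real quantum hardware): a qubit's state is just a pair of complex amplitudes, and measurement probabilities are their squared magnitudes. The variable names (`zero`, `one`, `bell`) are illustrative, not from any quantum SDK.

```python
import numpy as np

# A classical bit is 0 or 1; a qubit's state is a vector of two complex
# amplitudes whose squared magnitudes sum to 1.
zero = np.array([1, 0], dtype=complex)  # the |0> state
one = np.array([0, 1], dtype=complex)   # the |1> state

# Superposition: equal amplitudes of |0> and |1>; measuring yields 0 or 1
# with probability 0.5 each.
plus = (zero + one) / np.sqrt(2)
print(np.abs(plus) ** 2)  # [0.5 0.5]

# Entanglement: the two-qubit Bell state (|00> + |11>) / sqrt(2).
# The four entries are the amplitudes of the outcomes 00, 01, 10, 11.
bell = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)

probs = np.abs(bell) ** 2
print(probs)  # [0.5 0.  0.  0.5] -- the two qubits always agree
```

The key point is the last line: outcomes 01 and 10 have zero probability, so once either qubit is measured, the other's value is fixed, which is the "spooky" correlation described above.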
By setting the strength and polarity of the connections between qubits, we can encode a challenging optimization problem, such as the traveling salesman problem, on a quantum annealing chip and let quantum physics "do the work" of solving it for us. The system starts in a "high-temperature," indeterminate state, and the temperature is slowly lowered until the qubits settle into a low-energy state, ideally the global minimum. Because today's quantum computers are noisy, this process is run many times, and the solutions are read out to a classical computer for analysis. Every quantum computing chip is attached to a classical computer and behaves much like an accelerator card such as a GPU.
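The annealing process above can be sketched classically. Below is a toy example, assuming a max-cut problem on a 4-node cycle written in QUBO form (the same encoding annealers use), solved with classical simulated annealing as a stand-in for the physical cooling; a real annealer does this in hardware. The names `edges`, `energy`, and `anneal` are illustrative, not from any annealing SDK.

```python
import math
import random

# Toy problem: max-cut on a 4-node cycle, in QUBO form over bits x_i in {0, 1}.
# For each edge (i, j), the term x_i + x_j - 2*x_i*x_j equals 1 exactly when
# the two nodes land on opposite sides of the cut; minimizing the negated sum
# maximizes the number of edges cut.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]

def energy(x):
    return -sum(x[i] + x[j] - 2 * x[i] * x[j] for i, j in edges)

def anneal(n_bits, steps=2000, t_start=2.0, t_end=0.01):
    # Start "hot": random state, uphill moves accepted freely.
    x = [random.randint(0, 1) for _ in range(n_bits)]
    for step in range(steps):
        # Cool slowly so the state settles into a low-energy configuration.
        t = t_start * (t_end / t_start) ** (step / steps)
        i = random.randrange(n_bits)
        old = energy(x)
        x[i] ^= 1  # propose flipping one bit
        new = energy(x)
        if new > old and random.random() >= math.exp((old - new) / t):
            x[i] ^= 1  # reject the uphill move
    return x, energy(x)

# Like a real annealer, run several times and keep the best read-out.
best = min((anneal(4) for _ in range(10)), key=lambda r: r[1])
print(best)  # e.g. ([0, 1, 0, 1], -4): an alternating cut severs all 4 edges
```

The repeated runs at the end mirror the multiple noisy read-outs described above: each run may land in a local minimum, so we keep the lowest-energy solution found.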
Quantum Computers vs. Classical Computers
A common question is whether quantum computers will replace classical computers. At this point, the majority of algorithms are still best suited to computation on a classical computer; however, for certain subsets of problems, there will be an immense economic advantage to running them on a quantum computing chip.
Quantum computers can be used to improve certain parts of the machine learning pipeline when those parts are posed as optimization problems. We are therefore likely to see machine learning run on hybrid systems composed of both classical and quantum computing units. Just as artificial intelligence currently runs on special-purpose chips controlled by classical general-purpose CPUs, we can imagine a near future in which computing workloads are distributed across a heterogeneous environment of CPUs, GPUs, ASICs, and QPUs (Quantum Processing Units), with each workload sent to the hardware best suited to that particular problem or sub-problem.
Companies are still in the process of moving AI workloads from CPU + GPU machines to CPU + ASIC machines built around chips like the Google Tensor Processing Unit, the NVIDIA A100, or the Cerebras Wafer Scale Engine, and this trend will likely accelerate over the next few years.
What the Future Looks Like
While teams all over the world have been working on quantum machine learning (QML) for at least five years, we are still on the cusp of seeing QML techniques employed in industry. In the meantime, hybrid systems that run ML on classical computers and optimization problems on quantum annealers are already producing interesting results (https://www.menten.ai/).
Quantum computing has moved from the chalkboard to reality but is still a very niche domain. We tend to agree with IDC's prediction that "by 2023, 25% of the Fortune Global 500 will gain competitive advantage from quantum computing," and we expect machine learning applications to follow not long after. The impact on the many fields that can exploit quantum computing's advantages, artificial intelligence among them, could be immense.
While the potential near-future applications for quantum computing are myriad, AI's potential can be experienced right now. While many companies still see artificial intelligence as a technology that will transform their business in the future, their competitors are already using it: in a 2019 Gartner survey of thousands of CIOs, over 37% reported that they had already implemented AI in one form or another.
Get in touch with me at firstname.lastname@example.org. I’m curious to hear about an interesting problem you’re trying to solve. Let’s see if we can transform your data into wisdom with machine learning!
P.S. Hartmut Neven is doing some fascinating work in the area of quantum machine learning.