I recently watched a Google Tech Talk with Eric Ladizinsky, who visited the Quantum AI Lab at Google to talk about his D-Wave quantum computer. The talk is called Evolving Scalable Quantum Computers and is great; I highly recommend it.
The talk starts out with a quick overview of quantum mechanics. He gives a mind-bending example of a CD holding about 10 billion bits whose contents could be encoded into the quantum state of a single photon. The problem is that the information is not in a form you can readily manipulate. Developing natural systems that can innately quantum compute is one problem area that interests him.
The heart of the talk is a description of how to perform classical linear algebra operations on a quantum computer in order to get an exponential speedup (operations that scale linearly with the vector dimension classically become logarithmic). This is desirable because simple, although computationally expensive, vector operations underlie a lot of computer science in general and machine learning algorithms specifically.
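To make the scaling claim concrete, here is a minimal sketch (my own, not from the talk) of the classical baseline: a dot product of N-dimensional vectors costs O(N) operations classically, while amplitude encoding lets a quantum state represent an N-dimensional vector in only log2(N) qubits.

```python
import numpy as np

# Classical cost of a basic vector operation grows linearly with dimension N.
N = 1 << 20  # a 2**20-dimensional vector
a = np.random.rand(N)
b = np.random.rand(N)

dot = a @ b  # O(N) multiply-adds classically

# Amplitude encoding stores the same vector in only log2(N) qubits,
# which is where the exponential compression comes from.
qubits_needed = int(np.log2(N))
print(qubits_needed)  # 20
```

The catch, as the talk notes, is reading the answer back out and preparing the state in the first place, which is part of why mapping useful algebra onto quantum hardware takes serious work.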
The quantum versions of the linear algebra operations he focuses on are:
- Lloyd's algorithm, which underlies k-means clustering
- Fourier Transform
- Inversion for sparse matrices
- Support Vector Machines (finding support vectors, mapping new points)
- Manifold learning (finding holes and connected components)
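For reference, the first item on that list is the classical Lloyd's algorithm, which alternates between assigning points to their nearest centroid and recomputing centroids as cluster means. A plain-NumPy sketch (my own illustration, not code from the talk; the nearest-centroid distance step is the part the quantum routine accelerates):

```python
import numpy as np

def lloyds_kmeans(X, k, iters=20, seed=0):
    """Classical Lloyd's algorithm for k-means clustering."""
    rng = np.random.default_rng(seed)
    # Initialize centroids as k randomly chosen data points.
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assignment step: distance from every point to every centroid.
        # This O(n * k * d) computation is the expensive part classically.
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Update step: each centroid becomes the mean of its cluster.
        for j in range(k):
            pts = X[labels == j]
            if len(pts):
                centroids[j] = pts.mean(axis=0)
    return centroids, labels

# Usage: two well-separated blobs recover two clusters.
X = np.vstack([np.zeros((10, 2)), np.full((10, 2), 10.0)])
centroids, labels = lloyds_kmeans(X, k=2)
```

The distance computations dominate the cost as dimension and dataset size grow, which is exactly where an exponential speedup would matter for machine learning.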
He comments that you don't get everything for free; it is taking serious work for them to map useful linear algebra into the crazy world of quantum mechanics. The explanations he offers appear quite intuitive (he's a good communicator), although I expect they are deceptively complex once you step into the detail. It's not really my area.
Patrick Rebentrost came up with the notion that machine learning and quantum mechanics are both fundamentally about manipulating large numbers of vectors in high-dimensional spaces, and with the idea of bringing the two fields together. The key papers on these ideas are Quantum algorithms for supervised and unsupervised machine learning and Quantum support vector machine for big feature and big data classification, both from 2013.