
- 1. Machine Learning and Quantum Computing: A Look at Quantum Support Vector Machines. Seminar at the Centre for Quantum Technologies. Peter Wittek, University of Borås. September 19, 2013.
- 2. What Machine Learning Is Not
  (Talk sections: Machine Learning; Quantum Computing and Machine Learning; Support Vector Machines; Quantum SVMs; Conclusions.)
  - It is not statistics: machine learning is data-driven, whereas statistics imposes strict assumptions on the underlying distributions.
  - It is not AI: AI is model-driven, while machine learning addresses uncertainty.
  - It is not data mining, although there is considerable overlap.
- 3. What Machine Learning Should Be About
  - Data-driven: looking for patterns, i.e., classes and groups of similar objects.
  - Mainly quantitative, but can also be qualitative.
  - Robust: tolerates noise.
  - Generalizes well beyond the training data.
- 4. Characteristics
  - A loose collection of algorithms with no common ground.
  - Few assumptions, but parameters can be a major obstacle.
  - Computationally intensive and not easy to parallelize: N:N access patterns are common, or N:K through a proxy.
- 5. Nature-Inspired Methods
  - Many nature-inspired methods in computational intelligence: neural networks, flocking algorithms, genetic algorithms, chemical reactions, etc.
  - Also methods inspired by quantum mechanics.
  - Others: manifold learning, density-based clustering, support vector machines, etc.
- 6. Learning Approach
  - Supervised: biomedical applications (recognizing cancer cells), handwriting recognition, spam detection.
  - Unsupervised: recommendation engines, finding groups of similar patents, identifying trends in a dynamic environment.
- 7. Ensembles (figure slide).
- 8. High-Performance Machine Learning
  - Petabytes of data: sparse, noisy, possibly with missing elements.
  - There should be as few assumptions as possible.
  - Large scale may not entail a need for quick learning methods.
- 9. Examples: Blood Pressure Monitoring
  A simple, SVM-based pipeline using a cell-phone camera achieved 5 % accuracy.
  [Figure: coefficient-versus-time plots for systolic blood pressures of (a) 92 mm Hg, (b) 107 mm Hg, and (c) 127 mm Hg.]
- 10. Examples: Self-Organizing Maps (figure slide).
- 11. Main Research Directions
  - Learning a unitary transformation.
  - Adiabatic quantum computing.
  - Other methods.
- 12. Learning a Unitary Transformation
  - A black-box approach to learning: observe inputs and outputs, and learn the mapping function.
  - A form of quantum process tomography: an unknown function corresponds to an unknown quantum channel.
- 13. Adiabatic Quantum Computing
  - Find the global minimum of a given function f : {0, 1}^n → (0, ∞), where min_x f(x) = f_0 and f(x) = f_0 iff x = x_0.
  - Consider the Hamiltonian H_1 = \sum_{x ∈ {0,1}^n} f(x) |x⟩⟨x|. Its ground state is |x_0⟩.
  - To find this ground state, interpolate with the Hamiltonian H(λ) = (1 − λ)H_0 + λH_1, where H_0 is an easily prepared initial Hamiltonian.
  - Already demonstrated for search-engine ranking and binary classification; it admits nonconvex loss functions.
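The interpolation above can be sketched numerically. The following is a minimal illustration for a hypothetical one-bit cost function f (the schedule, the choice of H_0 = −σ_x, and f itself are assumptions for the demo, not details from the talk); slowly evolving the ground state of H_0 should leave the system in the ground state |x_0⟩ of H_1.

```python
import math

def mat_mult(a, b):
    """Multiply two 2x2 complex matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_vec(a, v):
    return [sum(a[i][k] * v[k] for k in range(2)) for i in range(2)]

def mat_exp(m, terms=20):
    """Taylor-series matrix exponential, adequate for small-norm matrices."""
    result = [[1.0 + 0j, 0j], [0j, 1.0 + 0j]]
    power = [[1.0 + 0j, 0j], [0j, 1.0 + 0j]]
    fact = 1.0
    for k in range(1, terms):
        power = mat_mult(power, m)
        fact *= k
        result = [[result[i][j] + power[i][j] / fact for j in range(2)]
                  for i in range(2)]
    return result

# Hypothetical one-bit cost function f with f(1) < f(0), so x0 = 1.
f = [1.0, 0.2]
H1 = [[f[0], 0], [0, f[1]]]        # H1 = sum_x f(x)|x><x|
H0 = [[0, -1.0], [-1.0, 0]]        # -sigma_x; ground state |+>

# Start in the ground state of H0 and anneal slowly: H(l) = (1-l)H0 + l H1.
psi = [1 / math.sqrt(2), 1 / math.sqrt(2)]
steps, dt = 2000, 0.05             # total annealing time T = 100
for t in range(steps):
    lam = (t + 0.5) / steps
    H = [[(1 - lam) * H0[i][j] + lam * H1[i][j] for j in range(2)]
         for i in range(2)]
    U = mat_exp([[-1j * dt * H[i][j] for j in range(2)] for i in range(2)])
    psi = mat_vec(U, psi)

prob_x0 = abs(psi[1]) ** 2         # overlap with the target |x0> = |1>
print(round(prob_x0, 3))
```

With the spectral gap of this toy H(λ) bounded well away from zero, the adiabatic theorem predicts the final overlap with |x_0⟩ to be close to 1.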
- 14. Other Methods
  - Quantum Bayesian inference.
  - Pattern matching: matching an unknown state to a known target template state.
  - Quantum particle swarm optimization.
- 15. Support Vector Machines: Risk Minimization and Generalization
  - Training set {(x_1, y_1), ..., (x_M, y_M)}, where the x_i ∈ R^N are data points and the y_i ∈ {−1, 1} are binary class labels.
  - Minimize (1/2) u^T u + C \sum_{i=1}^M ξ_i subject to y_i (u^T x_i + b) ≥ 1 − ξ_i, ξ_i ≥ 0, i = 1, ..., M.
  - The output is a hyperplane; the decision function is y := sgn(u^T x + b).
  - Support vectors are the training data points that lie on the margin.
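The soft-margin objective above can be minimized classically in many ways. As a minimal sketch (a Pegasos-style subgradient method on hypothetical toy data, not the solver discussed in the talk), with the bias b folded into the weight vector via an appended constant feature:

```python
import random

def train_linear_svm(points, labels, lam=0.01, epochs=200, seed=0):
    """Pegasos-style stochastic subgradient descent for the soft-margin
    linear SVM; lam plays the role of 1/C."""
    rng = random.Random(seed)
    dim = len(points[0]) + 1
    w = [0.0] * dim
    t = 0
    for _ in range(epochs):
        for i in rng.sample(range(len(points)), len(points)):
            t += 1
            eta = 1.0 / (lam * t)
            x = points[i] + [1.0]          # append bias feature
            margin = labels[i] * sum(wj * xj for wj, xj in zip(w, x))
            w = [(1 - eta * lam) * wj for wj in w]
            if margin < 1:                 # hinge-loss subgradient step
                w = [wj + eta * labels[i] * xj for wj, xj in zip(w, x)]
    return w

def predict(w, x):
    s = sum(wj * xj for wj, xj in zip(w, x + [1.0]))
    return 1 if s >= 0 else -1

# Hypothetical toy data: two linearly separable clusters.
X = [[2.0, 2.0], [3.0, 1.0], [-2.0, -1.0], [-1.0, -3.0]]
y = [1, 1, -1, -1]
w = train_linear_svm(X, y)
print([predict(w, x) for x in X])
```

On separable data like this the learned hyperplane classifies all training points correctly.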
- 16. Support Vector Machines: Nonlinear Embedding
  - The problem is made linearly separable by embedding into a feature space with a nonlinear map φ.
  - Only the constraints change: y_i (u^T φ(x_i) + b) ≥ 1 − ξ_i, ξ_i ≥ 0, i = 1, ..., M.
  - The decision function becomes f(x) = sgn(u^T φ(x) + b).
- 17. Support Vector Machines: KKT Conditions and Dual Formulation
  - Introduce Lagrange multipliers; the partial derivatives in u, b, and ξ define a saddle point of the Lagrangian.
  - Maximize \sum_{i=1}^M α_i − (1/2) \sum_{i=1}^M \sum_{j=1}^M α_i y_i α_j y_j K(x_i, x_j) subject to \sum_{i=1}^M α_i y_i = 0, α_i ∈ [0, C], i = 1, ..., M.
  - K(x_i, x_j) is the kernel function; there is no need to know the embedding function φ explicitly.
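The "no need to know φ" point can be made concrete with the homogeneous quadratic kernel, whose feature map is known in closed form (the specific vectors below are illustrative):

```python
import math

def poly_kernel(x, z):
    """Homogeneous quadratic kernel K(x, z) = (x . z)^2."""
    return sum(a * b for a, b in zip(x, z)) ** 2

def phi(x):
    """Explicit quadratic feature map for 2-D inputs:
    phi(x) = (x1^2, sqrt(2) x1 x2, x2^2), so K(x, z) = phi(x) . phi(z)."""
    return [x[0] ** 2, math.sqrt(2) * x[0] * x[1], x[1] ** 2]

x, z = [1.0, 2.0], [3.0, -1.0]
via_kernel = poly_kernel(x, z)                              # no phi needed
via_features = sum(a * b for a, b in zip(phi(x), phi(z)))   # explicit phi
print(via_kernel, via_features)
```

Both routes give the same value, but the kernel evaluation never materializes the feature space; for kernels like the RBF the corresponding φ is infinite-dimensional, so the implicit route is the only practical one.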
- 18. Least Squares Support Vector Machines
  - Use the l2 norm in the regularization term: minimize (1/2) u^T u + (γ/2) \sum_{i=1}^M e_i^2 subject to the equality constraints y_i (u^T φ(x_i) + b) = 1 − e_i, i = 1, ..., M.
  - We obtain the following least-squares problem:
    [[0, 1^T], [1, K + γ^{-1} I]] (b; α) = (0; y).   (1)
  - The trade-off: we give up the sparsity of the α_i, which are nonzero for every nonzero error term e_i.
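Classically, system (1) is just a dense linear solve. A minimal sketch on hypothetical toy data (linear kernel, illustrative γ; the Gaussian-elimination solver stands in for any linear-algebra routine):

```python
def solve_linear_system(A, b):
    """Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))
        x[r] = s / M[r][r]
    return x

def linear_kernel(x, z):
    return sum(a * b for a, b in zip(x, z))

# Hypothetical toy data; gamma is the regularization parameter.
X = [[2.0, 2.0], [3.0, 1.0], [-2.0, -1.0], [-1.0, -3.0]]
y = [1.0, 1.0, -1.0, -1.0]
gamma = 10.0
m = len(X)

# Assemble F = [[0, 1^T], [1, K + gamma^{-1} I]] and right-hand side (0; y).
K = [[linear_kernel(X[i], X[j]) for j in range(m)] for i in range(m)]
F = [[0.0] + [1.0] * m]
for i in range(m):
    F.append([1.0] + [K[i][j] + (1.0 / gamma if i == j else 0.0)
                      for j in range(m)])
sol = solve_linear_system(F, [0.0] + y)
b, alpha = sol[0], sol[1:]

# Decision function f(x) = sgn(sum_j alpha_j K(x, x_j) + b).
preds = [1 if sum(a * linear_kernel(x, xj) for a, xj in zip(alpha, X)) + b >= 0
         else -1 for x in X]
print(preds)
```

Note that the first row of the system enforces \sum_i α_i = 0, and every α_i comes out nonzero in general, which is exactly the lost sparsity mentioned above.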
- 19. The Outline of Quantum SVMs
  - Computing the kernel matrix classically: O(M^2 N).
  - Solving the least-squares formulation classically: O(M^3).
  - The quantum variant: O(log(MN)).
- 20. Calculating the Gram Matrix
  - Generate two states, |ψ⟩ and |φ⟩, with an ancilla variable.
  - Estimate the parameter Z = |x_i|^2 + |x_j|^2, the sum of the squared norms of the two instances.
  - Perform a projective measurement on the ancilla alone, comparing the two states.
- 21. Calculating the Gram Matrix
  - We obtain the dot product in the linear kernel as x_i^T x_j = (Z − |x_i − x_j|^2) / 2.
  - |ψ⟩ = (1/√2)(|0⟩|x_i⟩ + |1⟩|x_j⟩), prepared from QRAM.
  - |φ⟩ = (1/√Z)(|x_i| |0⟩ − |x_j| |1⟩) is created simultaneously with Z: evolve (1/√2)(|0⟩ − |1⟩) ⊗ |0⟩ under the Hamiltonian H = (|x_i| |0⟩⟨0| + |x_j| |1⟩⟨1|) ⊗ σ_x, then measure the ancilla bit.
  - Perform a swap test on |ψ⟩ and |φ⟩.
  - Overall complexity: O(ε^{-1} log N).
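The classical bookkeeping behind this protocol can be checked directly: the overlap of |ψ⟩ and |φ⟩ encodes |x_i − x_j|^2 / (2Z), from which the dot product follows. A small sketch with illustrative vectors (the "success probability" is computed analytically rather than sampled):

```python
import math

def norm(v):
    return math.sqrt(sum(a * a for a in v))

# Hypothetical instances x_i and x_j.
xi, xj = [3.0, 4.0], [1.0, 2.0]
Z = norm(xi) ** 2 + norm(xj) ** 2

# |psi> = (|0>|xi_hat> + |1>|xj_hat>)/sqrt(2), |phi> = (|xi||0> - |xj||1>)/sqrt(Z).
# Projecting the ancilla of |psi> onto |phi> leaves (xi - xj)/sqrt(2 Z) in the
# data register, so the measurement succeeds with probability |xi - xj|^2/(2 Z).
diff = [a - b for a, b in zip(xi, xj)]
p_success = norm(diff) ** 2 / (2 * Z)

# Invert the protocol's statistics to recover the kernel entry.
dist_sq = 2 * Z * p_success
dot_from_protocol = (Z - dist_sq) / 2
dot_direct = sum(a * b for a, b in zip(xi, xj))
print(dot_from_protocol, dot_direct)
```

On a quantum device p_success would be estimated from repeated measurements, which is where the ε^{-1} factor in the complexity comes from.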
- 22. Solving the Linear Equation
  Core ideas:
  - Quantum matrix inversion is fast.
  - Simulation of sparse matrices is efficient.
  - Non-sparse density matrices reveal their eigenstructure exponentially faster than in classical algorithms.
- 23. Solving the Linear Equation
  Split F into a sparse part and the kernel part:
  F = [[0, 1^T], [1, K + γ^{-1} I]] = J + K_γ, where J = [[0, 1^T], [1, 0]] and K_γ = [[0, 0], [0, K + γ^{-1} I]].
- 24. Solving the Linear Equation
  1. We approximate the matrix exponential of F with the Baker–Campbell–Hausdorff formula:
     e^{-i F̂ Δt} = e^{-i J Δt / tr K_γ} e^{-i γ^{-1} I Δt / tr K_γ} e^{-i K Δt / tr K_γ} + O(Δt^2).   (2)
  2. We use quantum phase estimation with this exponential to obtain the eigenstructure. The sparse matrix J and the constant multiple of the identity matrix are easy to simulate.
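The O(Δt^2) error in (2) is the standard Lie–Trotter splitting error, and it can be verified numerically: halving Δt should roughly quarter the discrepancy. A sketch with two illustrative noncommuting 2x2 Hermitian terms standing in for J and K:

```python
def mat_mult(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_exp(m, terms=30):
    """Taylor-series matrix exponential (fine for small-norm matrices)."""
    n = len(m)
    result = [[1.0 + 0j if i == j else 0j for j in range(n)] for i in range(n)]
    power = [row[:] for row in result]
    fact = 1.0
    for k in range(1, terms):
        power = mat_mult(power, m)
        fact *= k
        result = [[result[i][j] + power[i][j] / fact for j in range(n)]
                  for i in range(n)]
    return result

def frob_norm(m):
    return sum(abs(m[i][j]) ** 2
               for i in range(len(m)) for j in range(len(m))) ** 0.5

def trotter_error(A, B, dt):
    """|| e^{-i(A+B)dt} - e^{-iA dt} e^{-iB dt} ||, expected to be O(dt^2)."""
    s = -1j * dt
    exact = mat_exp([[s * (A[i][j] + B[i][j]) for j in range(2)]
                     for i in range(2)])
    split = mat_mult(
        mat_exp([[s * A[i][j] for j in range(2)] for i in range(2)]),
        mat_exp([[s * B[i][j] for j in range(2)] for i in range(2)]))
    diff = [[exact[i][j] - split[i][j] for j in range(2)] for i in range(2)]
    return frob_norm(diff)

# Two noncommuting Hermitian terms (stand-ins for the J and K of the slide).
A = [[0.0, 1.0], [1.0, 0.0]]   # sigma_x
B = [[1.0, 0.0], [0.0, -1.0]]  # sigma_z
e1 = trotter_error(A, B, 0.1)
e2 = trotter_error(A, B, 0.05)
print(round(e1 / e2, 2))       # ratio near 4 confirms second-order scaling
```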
- 25. Solving the Linear Equation
  - The kernel matrix K is not sparse. Use quantum self analysis:
  - Take multiple copies of a density matrix ρ and perform e^{-iρt}. The state plays an active role in its own measurement: through exponentiation, it functions as a Hamiltonian.
  - K̂ is a normalized Hermitian matrix, which makes it a prime candidate for quantum self analysis.
  - The exponentiation is done in O(log N).
- 26. Open Questions
  - The kernel function is restricted. What about sparse data?
  - O(log(MN)) states are required.
  - The model is not sparse; we do not overcome this limitation of least squares SVMs.
- 27. Summary
  - Machine learning algorithms are diverse.
  - Growing data sets need both faster execution and a better fit to unseen instances.
  - Quantum approaches can help: nonconvex loss functions, nonclassical correlations, and exponential speedup.
