Quantum Machine Learning and QEM for Gaussian mixture models (Alessandro Luongo)
1. Quantum Machine Learning
and QEM for Gaussian mixture models
Alessandro Luongo
Machine Learning Data Science Meetup
July 1, 2020
Iordanis Kerenidis: 1,3,4
Anupam Prakash: 4
AL: 1,2
3. In short:
We propose a quantum algorithm for Expectation-Maximization that fits a Gaussian mixture model (using quantum linear algebra, QRAM, distance estimation, etc.) for a matrix V ∈ R^{n×d} in time complexity:
O( d² k^4.5 η^3.5 κ²(V) κ²(Σ) µ(V) log(n) / δ³ )
5. Notation & quantum information 101 - part 1
Qubit: any unit vector in C².
Basis for this space: |0⟩ = [1, 0]^T, |1⟩ = [0, 1]^T
|ψ⟩ = α|0⟩ + β|1⟩ s.t. |α|² + |β|² = 1
Tensor product between vectors:
[a, b] ⊗ [c, d] = [ac, ad, bc, bd]
Quantum states are vectors. Let x ∈ R^d, then:
|x⟩ = (1/‖x‖₂) x = (1/‖x‖₂) Σ_{i=1}^{d} x_i |i⟩ (with d = 2^n)
Note: |x⟩ has log d qubits!
Measuring quantum state |x⟩:
p(|i⟩) = x_i² / ‖x‖²
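A minimal classical sketch of this amplitude encoding (illustrative only, not from the talk; the vector x is made up):

```python
import numpy as np

# Amplitude encoding of a classical vector x in R^d: the "state" is just the
# normalized vector, and measuring index i occurs with probability x_i^2 / ||x||^2.
x = np.array([3.0, 1.0, 0.0, 2.0])            # d = 4, i.e. log2(d) = 2 qubits
state = x / np.linalg.norm(x)                 # |x> = x / ||x||_2

probs = state ** 2                            # p(|i>) = x_i^2 / ||x||^2
assert np.isclose(probs.sum(), 1.0)

# Simulate 1000 computational-basis measurements of |x>
samples = np.random.choice(len(x), size=1000, p=probs)
print(probs, np.bincount(samples, minlength=len(x)) / 1000)
```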
6. Notation & quantum information 101 - part 2
A quantum program is a unitary matrix U:
|y⟩ = U|x⟩
A quantum circuit is composed of elementary quantum gates:
NOT = [[0, 1], [1, 0]]
H = (1/√2) [[1, 1], [1, −1]]
Runtime of a quantum algorithm = depth of the circuit.
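A small state-vector sketch of these gates (again illustrative, not the speaker's code):

```python
import numpy as np

# A quantum program is a unitary applied to the state vector; these are the
# NOT and Hadamard gates from the slide.
NOT = np.array([[0, 1],
                [1, 0]])
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)

ket0 = np.array([1.0, 0.0])                   # |0>
print(NOT @ ket0)                             # -> |1>
print(H @ ket0)                               # -> (|0> + |1>)/sqrt(2)

# Multi-qubit states are built with the tensor (Kronecker) product
print(np.kron(H @ ket0, ket0))                # -> (|00> + |10>)/sqrt(2)
```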
7. Quantum Machine Learning Toolkit
1. Grover-like algorithms
2. Quantum access to classical data
3. Quantum linear algebra
4. Distance estimation
5. Tomography
(others: amplification techniques, Hamiltonian simulation, etc.)
9. Grover’s Algorithm
Problem: Given a function f : {0, 1}^n → {0, 1}, find the only x such that f(x) = 1.
Classically: it takes O(2^n).
Theorem [Grover algorithm (informal)]
There exists an algorithm that finds x such that f(x) = 1 in time O(√(2^n)).
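A toy state-vector simulation of Grover search (a sketch under made-up parameters, not the talk's material): the oracle flips the phase of the marked item and the diffusion step reflects about the mean, and after ~ (π/4)√N iterations the marked item is measured with probability close to 1.

```python
import numpy as np

n = 6
N = 2 ** n
marked = 42                                   # assumed: the unique x with f(x) = 1

state = np.full(N, 1 / np.sqrt(N))            # uniform superposition H^{⊗n}|0...0>
iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))

for _ in range(iterations):
    state[marked] *= -1                       # oracle: phase flip on the marked item
    state = 2 * state.mean() - state          # diffusion: reflection about the mean

print(iterations, int(np.argmax(state ** 2)), float(state[marked] ** 2))
```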
11. Quantum access to classical matrices
Let V ∈ R^{n×d}, with v_i the i-th row of V:
(1/‖V‖_F) Σ_i ‖v_i‖ |i⟩|v_i⟩    (1)
Facts:
Preparation (preprocessing) time: O(nd log nd)
Size: O(nd log nd)
Execution time: O(log nd)
Can be implemented with a generalization of the QRAM (PhD thesis of A. Prakash).
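A classical sketch of the amplitudes of state (1) (the QRAM-like data structure itself is not shown; V is a random example matrix): on basis state |i⟩|j⟩ the amplitude is V_ij / ‖V‖_F, so measuring the first register returns row i with probability ‖v_i‖² / ‖V‖_F².

```python
import numpy as np

V = np.random.randn(8, 4)                     # n = 8, d = 4

amplitudes = (V / np.linalg.norm(V, 'fro')).reshape(-1)   # amplitude on |i>|j>
assert np.isclose(np.sum(amplitudes ** 2), 1.0)

row_probs = np.linalg.norm(V, axis=1) ** 2 / np.linalg.norm(V, 'fro') ** 2
print(row_probs)
```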
13. Quantum linear algebra
“Solve” linear systems Ax = b
Breakthrough! - HHL algorithm in 2009
Assuming quantum access to matrix A ∈ R^{n×d}
Encode b ∈ R^d as |b⟩
We can prepare |A⁻¹b⟩ or |Ab⟩ in O(µ(A)κ(A))
The norm ‖A⁻¹b‖ is estimated with relative error ε with additional O(1/ε) time.
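For reference, a classical sketch of what this routine outputs (random example data): |A⁻¹b⟩ is the normalized solution of Ax = b, and ‖A⁻¹b‖ is the norm being estimated.

```python
import numpy as np

d = 4
A = np.random.randn(d, d) + d * np.eye(d)     # an example well-conditioned matrix
b = np.random.randn(d)

x = np.linalg.solve(A, b)                     # A^{-1} b
ket_x = x / np.linalg.norm(x)                 # the state |A^{-1} b>
norm = np.linalg.norm(x)                      # estimated to relative error ε in O(1/ε)

print(ket_x, norm)
```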
14. Distance estimation
What: with quantum access to matrices V, C:
|i⟩|j⟩|0⟩ → |i⟩|j⟩|d(v_i, c_j)⟩
the distance between row i of V and row j of C.
Error: relative
Time: O(η), where η = max_{i,j}(‖v_i‖² + ‖c_j‖²)
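A classical sketch of the quantities involved (random example matrices): all pairwise distances d(v_i, c_j) and the parameter η.

```python
import numpy as np

n, k, d = 100, 3, 5
V = np.random.randn(n, d)
C = np.random.randn(k, d)

dists = np.linalg.norm(V[:, None, :] - C[None, :, :], axis=2)     # shape (n, k)
eta = (np.linalg.norm(V, axis=1) ** 2).max() + (np.linalg.norm(C, axis=1) ** 2).max()
print(dists.shape, eta)
```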
15. Tomography
Retrieving data from quantum computers.
For a quantum state |x⟩ ∈ R^d we get a classical estimate x̄ with:
‖x − x̄‖₂ ≤ δ by generating |x⟩ O(d/δ²) times,
‖x − x̄‖_∞ ≤ δ by generating |x⟩ O(log(d)/δ²) times.
Also: sampling, amplitude estimation, ...
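A toy sketch of the sampling idea behind tomography (illustrative; real tomography also recovers signs/phases, which this omits): repeated measurements of |x⟩ give frequencies that estimate x_i², so the magnitudes |x_i| can be read off.

```python
import numpy as np

x = np.random.randn(16)
state = x / np.linalg.norm(x)

shots = 10_000                                # roughly O(log(d)/δ^2) for ℓ∞ error
counts = np.bincount(np.random.choice(len(x), size=shots, p=state ** 2),
                     minlength=len(x))
est_abs = np.sqrt(counts / shots)             # estimate of |x_i|, signs lost

print(np.max(np.abs(est_abs - np.abs(state))))   # ℓ∞ error on the magnitudes
```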
20. Gaussian mixture models (GMM)
Assumption: each data point (v_i, y_i) is generated by:
1. Sampling a label y_i ∈ [k] according to Mult(θ),
2. Sampling a vector v_i according to N(µ_{y_i}, Σ_{y_i}).
Mult(θ): multinomial distribution (a die) for θ ∈ R^k
N(µ_{y_i}, Σ_{y_i}): multivariate Gaussian distribution
GMM model: γ = (θ, µ_1, ..., µ_k, Σ_1, ..., Σ_k)
Robust GMM model: γ̄ = (θ̄, µ̄_1, ..., µ̄_k, Σ̄_1, ..., Σ̄_k) with
‖θ̄ − θ‖ ≤ δ_θ, ‖µ̄_j − µ_j‖ ≤ δ_µ, ‖Σ̄_j − Σ_j‖ ≤ δ_µ √η
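A sketch of this generative assumption (made-up parameters, for illustration only): sample a label y_i ~ Mult(θ), then a point v_i ~ N(µ_{y_i}, Σ_{y_i}).

```python
import numpy as np

rng = np.random.default_rng(0)
k, d, n = 3, 2, 500
theta = np.array([0.5, 0.3, 0.2])                             # mixing weights θ
mus = 5 * rng.normal(size=(k, d))                             # means µ_1..µ_k
Sigmas = np.array([s * np.eye(d) for s in (0.5, 1.0, 2.0)])   # covariances Σ_1..Σ_k

y = rng.choice(k, size=n, p=theta)                            # labels from Mult(θ)
V = np.array([rng.multivariate_normal(mus[j], Sigmas[j]) for j in y])   # rows v_i
print(V.shape, np.bincount(y) / n)
```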
27. Classical EM for GMM: O(tnd^ω k)
1: repeat
2:   Expectation:
       r_ij^t = θ_j^t N(v_i; µ_j^t, Σ_j^t) / Σ_{l=1}^{k} θ_l^t N(v_i; µ_l^t, Σ_l^t)    ∀i ∈ [n], j ∈ [k]
3:   Maximization:
     Update the parameters of the model as:
       θ_j^{t+1} ← (1/n) Σ_{i=1}^{n} r_ij^t
       µ_j^{t+1} ← (Σ_{i=1}^{n} r_ij^t v_i) / (Σ_{i=1}^{n} r_ij^t)
       Σ_j^{t+1} ← (Σ_{i=1}^{n} r_ij^t (v_i − µ_j^{t+1})(v_i − µ_j^{t+1})^T) / (Σ_{i=1}^{n} r_ij^t)
4:   t = t + 1
5: until |ℓ(γ^{t−1}; V) − ℓ(γ^t; V)| < τ, where ℓ(γ; V) is the log-likelihood.
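A minimal classical implementation of this pseudocode (a sketch: the random initialization and the small 1e-6 ridge on Σ are illustrative choices, not from the talk):

```python
import numpy as np
from scipy.stats import multivariate_normal

def em_gmm(V, k, tau=1e-6, max_iter=100, seed=0):
    """Minimal classical EM for a GMM, transcribing the pseudocode above."""
    n, d = V.shape
    rng = np.random.default_rng(seed)
    theta = np.full(k, 1.0 / k)
    mu = V[rng.choice(n, size=k, replace=False)]
    Sigma = np.array([np.cov(V.T) + 1e-6 * np.eye(d) for _ in range(k)])

    prev_ll = -np.inf
    for _ in range(max_iter):
        # Expectation: responsibilities r_ij ∝ θ_j N(v_i; µ_j, Σ_j)
        dens = np.column_stack([
            theta[j] * multivariate_normal.pdf(V, mean=mu[j], cov=Sigma[j])
            for j in range(k)
        ])                                              # shape (n, k)
        r = dens / dens.sum(axis=1, keepdims=True)

        # Maximization: update θ, µ, Σ as in step 3
        Nj = r.sum(axis=0)                              # Σ_i r_ij
        theta = Nj / n
        mu = (r.T @ V) / Nj[:, None]
        for j in range(k):
            diff = V - mu[j]
            Sigma[j] = (r[:, j, None] * diff).T @ diff / Nj[j] + 1e-6 * np.eye(d)

        # Stop when the log-likelihood ℓ(γ; V) changes by less than τ
        ll = np.log(dens.sum(axis=1)).sum()
        if abs(ll - prev_ll) < tau:
            break
        prev_ll = ll
    return theta, mu, Sigma
```

For instance, it can be run on the matrix V sampled in the GMM sketch above: `theta, mu, Sigma = em_gmm(V, k=3)`.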
29. Quantum Expectation
We want to estimate:
r_ij = p(v_i | j)
We build
U |i⟩|j⟩ → |i⟩|j⟩|r_ij⟩
and
U |j⟩|0⟩ → |j⟩ (1/R_j) Σ_{i=1}^{n} r_ij |i⟩
Error: additive
Time: Õ( k^1.5 η κ(Σ) / δ )
30. Quantum Maximization: θ^{t+1}
We want:
θ_j^{t+1} ← (1/n) Σ_{i=1}^{n} r_ij^t
Trick: use the Expectation step and amplitude amplification to build:
|√R⟩ := Σ_{j=1}^{k} √(θ_j^{t+1}) |√R_j⟩ |j⟩    (2)
Error: ‖θ̄^t − θ^t‖ < δ_θ
Runtime: T_θ = O( k^3.5 η^1.5 κ²(Σ) µ(Σ) log(n) / δ_θ² )
31. VoxForge dataset: Speaker recognition

          ‖Σ‖₂    |logdet(Σ)|   κ*(Σ)    µ(Σ)    µ(V)    κ(V)
MAP avg   0.244    58.56         4.21     3.82    2.14    23.82
MAP max   2.45     70.08        50        4.35    2.79    40.38
ML  avg   1.31     14.56        15.57     2.54    2.14    23.82
ML  max   3.44     92.3         50        3.67    2.79    40.38

η = 13
Classical accuracy: 98.8%
Quantum accuracy: 98.2%
Comparable number of iterations.
34. De-quantizations
"Sampling-based sublinear low-rank matrix arithmetic framework for dequantizing quantum machine learning" - Nai-Hui Chia, András Gilyén, Tongyang Li, Han-Hsuan Lin, Ewin Tang, Chunhao Wang [1910.06151]
"Quantum-Inspired Classical Algorithms for Singular Value Transformation" - Dhawal Jethwani, François Le Gall, Sanjay K. Singh [1910.05699]
... but also ...
Arrazola, Juan Miguel, et al. "Quantum-inspired algorithms in practice." arXiv preprint [1905.10415]
Dequantized recommendation system's runtime:
O( ‖A‖_F^24 / (ε^12 σ^24) )
35. Conclusions
We made well-clusterability assumptions,
... but we have runtime guarantees also on non-well-clusterable datasets!
We also have quantum initialization strategies!
QEM works for all base distributions in the exponential family!
Faster quantum algorithms for the log-determinant soon! :)