Tensor Networks and Their Applications on Machine Learning
1. Tensor Networks and Their
Applications on Machine Learning
Kwan-Yuet “Stephen” Ho
Leidos
DS Study Group (July 8, 2020)
2. Kwan-Yuet “Stephen” Ho
Experience:
● April 2020 - present: Data Scientist, Leidos
● September 2018 - April 2020: Machine Learning
Engineer, Capital One
● October 2012 - August 2018: Research Scientist,
General Dynamics Information Technology
● September 2009 - September 2012: Research
Assistant, University of Maryland
● June 2005 - August 2006: Guest Researcher, National
Institute of Standards and Technology
● June 2003 - August 2003: Research Assistant,
California Institute of Technology
Education:
● PhD (Physics), University of Maryland, 2012.
● BSc (Physics), Chinese University of Hong Kong, 2004.
Interests:
● Machine Learning
● Theoretical Physics
● Applied Mathematics
● Python Package Development
4. What is a tensor network?
● A mathematical tool from quantum many-body theory.
● “A tensor network is a collection of tensors with indices connected according to a network pattern. It can be used to efficiently represent a many-body wavefunction in an otherwise exponentially large Hilbert space.”
● It can be represented as a graph: tensors are nodes, and contracted indices are edges.
● Why? It facilitates the analysis of high-rank tensors.
● Tensor networks are useful for constructing machine learning algorithms.
● Useful introductory texts: arXiv:1708.00006, arXiv:1603.03039
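The definition above can be made concrete with a tiny example (not from the slides; the tensor shapes and index labels are invented for illustration). Three tensors whose indices are pairwise connected form a closed network; contracting every edge leaves a scalar:

```python
import numpy as np

# A tiny tensor network: three tensors A, B, C, each sharing one index
# with each neighbor.  Graphically: three nodes joined in a triangle.
A = np.random.rand(2, 3)   # indices (i, j)
B = np.random.rand(3, 4)   # indices (j, k)
C = np.random.rand(4, 2)   # indices (k, i)

# Contract all connected edges; no free indices remain, so the result
# is a scalar.
value = np.einsum('ij,jk,ki->', A, B, C)

# The same contraction performed pairwise, as matrix products plus a trace.
value2 = np.trace(A @ B @ C)
assert np.isclose(value, value2)
```

The point of the graphical notation is that contraction *order* is a free choice; for large networks, a good order is the difference between polynomial and exponential cost.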
7. Google publishes Python Package “tensornetwork”
Google AI Blog: https://ai.googleblog.com/2019/06/introducing-tensornetwork-open-source.html
GitHub: https://github.com/google/TensorNetwork
Article: arXiv:1905.01330
Built on TensorFlow.
Medium: https://medium.com/syncedreview/google-tensornetwork-library-dramatically-accelerates-ml-physics-tasks-8c7011e0f7b0
8. Why Tensor Networks? Here is some history...
Mehta, Schwab, “An exact mapping between the Variational Renormalization Group and Deep Learning,” arXiv:1410.3831.
Mathematical equivalence of Restricted Boltzmann Machines (RBM) and variational RG.
9. Stoudenmire, Schwab, “Supervised Learning With Quantum-Inspired Tensor Networks,” arXiv:1605.05775.
Supervised learning using ideas from DMRG and TNN; optimization uses a sweeping algorithm.
12. I am still very confused.
What are these things? How are they connected?
13. What is RG? What is DMRG? Why TNN?
● Renormalization group (RG) is a formalism for “zooming out” in scale-invariant systems, determining which terms to truncate in a model. (Good reference: Shang-keng Ma, Modern Theory of Critical Phenomena)
● Density matrix renormalization group (DMRG) is a variational real-space numerical technique that treats collections of quantum bits as zoomed-out blocks. Its encapsulation makes it a good tool for strongly correlated electronic systems. (First paper: Steven White, PRL 69 (19): 2863-2866 (1992); good reference: arXiv:cond-mat/0409292)
● DMRG can be expressed conveniently using TNN. (arXiv:1008.3477)
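The MPS form that makes DMRG convenient can be sketched directly (not from the slides; a minimal NumPy illustration): any 2^n-amplitude state vector can be decomposed into a matrix product state by repeated SVDs, truncating each bond to dimension `chi`. Here `chi` is chosen large enough that the reconstruction is exact; the function names are made up for this sketch.

```python
import numpy as np

def to_mps(psi, n, chi):
    """Decompose a length-2**n vector into n order-3 MPS cores by
    sweeping SVDs left to right, keeping at most chi singular values."""
    cores, rest = [], psi.reshape(1, -1)
    for _ in range(n - 1):
        rest = rest.reshape(rest.shape[0] * 2, -1)   # split off one site
        u, s, vh = np.linalg.svd(rest, full_matrices=False)
        k = min(chi, len(s))                         # truncate the bond
        cores.append(u[:, :k].reshape(-1, 2, k))     # (left, physical, right)
        rest = s[:k, None] * vh[:k]
    cores.append(rest.reshape(-1, 2, 1))
    return cores

def from_mps(cores):
    """Contract the MPS cores back into a dense state vector."""
    out = cores[0]
    for c in cores[1:]:
        out = np.tensordot(out, c, axes=(out.ndim - 1, 0))
    return out.reshape(-1)

n = 6
psi = np.random.default_rng(1).normal(size=2**n)
psi /= np.linalg.norm(psi)
approx = from_mps(to_mps(psi, n, chi=2**n))   # chi large enough: exact
assert np.allclose(approx, psi)
```

Lowering `chi` trades accuracy for memory: storage drops from 2^n amplitudes to roughly n·2·chi² parameters, which is the compression DMRG exploits.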
14. Is TNN related to quantum computing?
● Yes and No.
● Yes, these are good tools to study many-body quantum systems,
including those for the implementation of quantum computers.
● No, TNN is not quantum computing.
IBM-Q
15. Is TNN related to quantum machine learning?
● TNN is not used for quantum versions of well-known machine learning algorithms.
● TNN helps develop new quantum machine learning algorithms. (Huggins, Patil, Mitchell, Whaley, Stoudenmire, Quantum Sci. Technol. 4, 024001 (2019))
Peter Wittek, Quantum Machine Learning: https://www.amazon.com/Quantum-Machine-Learning-Computing-Mining/dp/0128100400
Biamonte, Wittek, Pancotti, Rebentrost, Wiebe, Lloyd, Nature 549, 195-202 (2017).
16. Are there ML applications of TNN?
Supervised Learning:
● TNML: a supervised classification algorithm (arXiv:1605.05775)
● arXiv:1906.06329
● Entanglement-guided ML (arXiv:1803.09111)
● Exponential machines (arXiv:1605.03795)
● TorchMPS (https://github.com/stephenhky/TorchMPS)
Unsupervised Learning:
● Probabilistic modeling with MPS (arXiv:1902.06888)
● Tensor Space Language Model (TSLM, arXiv:1901.11167)
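The probabilistic-modeling entry above rests on the “Born machine” idea: p(x) ∝ |ψ(x)|², with the wavefunction ψ parameterized by an MPS. A hedged NumPy sketch, not from the cited paper (the shapes, seed, and names here are illustrative only):

```python
import numpy as np
from itertools import product

# Born machine sketch: p(x) ∝ |psi(x)|^2, psi given by a random MPS.
n, d, D = 4, 2, 3   # sites, physical dimension, bond dimension
rng = np.random.default_rng(0)
# One order-3 core per site: (left bond, physical index, right bond),
# with trivial bonds of size 1 at the two ends.
cores = [rng.normal(size=(1 if i == 0 else D, d, 1 if i == n - 1 else D))
         for i in range(n)]

def amplitude(bits):
    """psi(x): contract the MPS along the chain for one configuration."""
    v = np.ones((1,))
    for core, b in zip(cores, bits):
        v = v @ core[:, b, :]   # absorb one site's matrix
    return v.item()

# Normalize by brute force over all d**n configurations (feasible here;
# in practice Z is computed by contracting the network, not enumerating).
Z = sum(amplitude(x) ** 2 for x in product(range(d), repeat=n))
probs = {x: amplitude(x) ** 2 / Z for x in product(range(d), repeat=n)}
assert np.isclose(sum(probs.values()), 1.0)
```

Training such a model means adjusting the cores so that high probability lands on observed data, which is where the DMRG-style sweeping optimization mentioned earlier comes back in.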