Slides from a webinar presented May 23, 2024 by Capitol Technology University and featuring faculty member Dr. Alexander Perry discussing hybrid quantum Machine Learning.
Strengths and limitations of quantum computing (Vinayak Sharma)
Quantum computing as a research field has been around for about 30 years. It is seen as a way to overcome the challenges that classical (Boolean-based) computers face as transistors shrink, due to the quantum tunneling effect. However, various theoretical and practical challenges must still be addressed before quantum computers can outperform classical computers (i.e., before achieving "quantum supremacy"). This seminar aims to shed light on the basics of quantum computing and its strengths and weaknesses.
Video Links
Part 1: https://www.youtube.com/watch?v=-WLD_HnUvy0
Part 2: https://www.youtube.com/watch?v=xXzUmpk8ztU
This document provides an overview of quantum computing presented by Dr. Marcus Doherty. It discusses how quantum computing works, important concepts in quantum physics, applications of quantum computing, and opportunities for software professionals. Some key points include:
- Quantum computing offers potential solutions to problems that are intractable for classical computers by exploiting properties of quantum mechanics like superposition and entanglement.
- It works by initializing qubit states, applying quantum gates to encode data and algorithms, then reading out and repeating to build statistics.
- Challenges include errors, limited connectivity, and lack of a quantum random access memory. Software plays a key role in error correction, compilation, and developing applications.
- Potential applications include optimization and machine learning.
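The initialize-gate-measure-repeat loop described in the bullets above can be sketched with a small state-vector simulation. This is an illustrative toy in plain NumPy, not code from the presentation or any vendor's SDK:

```python
import numpy as np

rng = np.random.default_rng(0)

# Step 1: initialize a single qubit in state |0>.
state = np.array([1.0, 0.0], dtype=complex)

# Step 2: apply a quantum gate -- here a Hadamard, which puts
# the qubit into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
state = H @ state

# Step 3: read out (measure) and repeat many times to build statistics.
probs = np.abs(state) ** 2          # Born rule: probability = |amplitude|^2
shots = 10_000
outcomes = rng.choice([0, 1], size=shots, p=probs)

counts = {0: int(np.sum(outcomes == 0)), 1: int(np.sum(outcomes == 1))}
print(counts)  # roughly 50/50 between 0 and 1
```

A single shot yields one random bit; only the repeated-measurement statistics reveal the underlying amplitudes, which is why the "repeat to build statistics" step is essential.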
Quantum Computing: Unleashing the Power of Quantum Mechanics (TechCyber Vision)
Quantum computing is an emerging field that utilizes principles of quantum mechanics to process information. While still in early stages, it has made progress in areas like quantum algorithms, error correction, and physical implementations. Major challenges remain around scaling up qubits, reducing errors, and developing practical applications. Continued research and collaboration are needed to realize quantum computing's full potential to solve problems beyond the capabilities of classical computers.
The document discusses quantum computing, providing an introduction, key differences from traditional computing, principles like superposition and entanglement, applications in areas like AI and chemistry, challenges around errors and cooling requirements, and India's initiatives including allocating funds for a National Mission on quantum technologies and applications. Advantages of quantum computing include exponentially faster calculations and lower power needs, while disadvantages include challenges with error correction and lack of widely available technology.
Quantum technology, a burgeoning field at the intersection of physics, engineering, and computer science, holds immense promise for revolutionizing various industries and transforming our understanding of the universe. This document serves as a comprehensive exploration of quantum technology, delving into its underlying principles, current advancements, potential applications, and societal implications.
At the heart of quantum technology lies the enigmatic realm of quantum mechanics, a branch of physics that describes the behavior of particles at the smallest scales. Unlike classical physics, which operates on deterministic principles, quantum mechanics introduces probabilistic phenomena such as superposition and entanglement. Superposition allows particles to exist in multiple states simultaneously, while entanglement links the properties of particles regardless of the distance between them. These fundamental principles form the foundation upon which quantum technologies are built.
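Both principles can be seen at once in the two-qubit Bell state. The following NumPy sketch is an illustration added here, not material from the document:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two-qubit state vector in the basis order |00>, |01>, |10>, |11>.
# The Bell state (|00> + |11>)/sqrt(2) exhibits superposition (neither
# qubit has a definite value before measurement) and entanglement
# (the two measurement results are perfectly correlated).
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

probs = np.abs(bell) ** 2                 # Born rule
shots = 5_000
draws = rng.choice(4, size=shots, p=probs)

# Decode each two-qubit outcome into the individual qubit values.
qubit_a = draws // 2
qubit_b = draws % 2

print(np.array_equal(qubit_a, qubit_b))   # True: the outcomes always agree
```

Each qubit on its own looks like a fair coin, yet the pair never disagrees; that correlation with individually random outcomes is the signature of entanglement.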
One of the most promising applications of quantum technology is quantum computing. Traditional computers rely on bits, units of information represented as either 0 or 1, to perform computations. In contrast, quantum computers employ quantum bits, or qubits, which can exist in a superposition of 0 and 1 simultaneously. This enables quantum computers to perform complex calculations exponentially faster than classical computers for certain tasks. Quantum algorithms, such as Shor's algorithm for integer factorization and Grover's algorithm for unstructured search, showcase the potential of quantum computing to revolutionize fields such as cryptography, optimization, and machine learning.
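Grover's quadratic speedup can be made concrete in a small simulation: for an unstructured database of N entries, roughly (pi/4)*sqrt(N) oracle queries suffice, versus about N/2 classically. A hedged NumPy sketch for N = 8 follows; the marked index is an arbitrary choice for illustration:

```python
import numpy as np

n_qubits = 3
N = 2 ** n_qubits          # 8 database entries
marked = 5                 # index of the item we are searching for

# Start in a uniform superposition over all N basis states.
state = np.full(N, 1 / np.sqrt(N))

def grover_iteration(state):
    """One Grover step: oracle (flip the sign of the marked amplitude)
    followed by the diffusion operator (inversion about the mean)."""
    state = state.copy()
    state[marked] *= -1                 # oracle
    mean = state.mean()
    return 2 * mean - state             # diffusion: 2|s><s| - I

# Optimal iteration count is about (pi/4) * sqrt(N).
iterations = int(round(np.pi / 4 * np.sqrt(N)))  # 2 for N = 8
for _ in range(iterations):
    state = grover_iteration(state)

print(np.abs(state[marked]) ** 2)  # probability of the marked item, ~0.94
```

After just two iterations the marked item is measured with roughly 94% probability, compared with 1/8 for a single random guess.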
Quantum communication offers another compelling application of quantum technology. Quantum key distribution (QKD) protocols leverage the principles of quantum mechanics to enable secure communication between parties. By encoding information in quantum states, such as the polarization of photons, QKD ensures that any attempt to intercept the communication will be immediately detected. This promises unprecedented levels of security for sensitive information, with applications ranging from financial transactions to government communications.
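The sifting stage of a BB84-style protocol can be sketched classically. This simplified model has no eavesdropper or channel noise, and random bits and bases stand in for photon polarizations:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000

# Alice picks random bits and random bases (0 = rectilinear, 1 = diagonal).
alice_bits = rng.integers(0, 2, n)
alice_bases = rng.integers(0, 2, n)

# Bob measures each photon in a randomly chosen basis.
bob_bases = rng.integers(0, 2, n)

# When the bases match, Bob reads Alice's bit exactly; when they differ,
# quantum mechanics gives him a uniformly random result.
same_basis = alice_bases == bob_bases
bob_bits = np.where(same_basis, alice_bits, rng.integers(0, 2, n))

# Sifting: both parties publicly compare bases (never the bits) and keep
# only the positions where the bases matched.
key_alice = alice_bits[same_basis]
key_bob = bob_bits[same_basis]

assert np.array_equal(key_alice, key_bob)   # shared secret key
print(len(key_alice))                        # roughly n / 2 key bits
```

In the full protocol, an eavesdropper measuring in the wrong basis disturbs the states, so comparing a sample of key bits reveals the intrusion as an elevated error rate.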
Quantum sensing and metrology harness the delicate quantum states of particles to achieve unparalleled levels of precision and sensitivity in measuring physical quantities. Quantum sensors, such as atomic clocks and magnetometers, offer applications in navigation, geology, and medical imaging. These sensors have the potential to revolutionize fields such as GPS technology, mineral exploration, and early disease detection, where traditional sensors fall short in terms of accuracy or resolution.
Quantum simulation represents yet another frontier of quantum technology, offering the ability to simulate complex quantum systems that are computationally intractable for classical computers. Quantum simulators, whether digital or analog, mimic the behavior of other quantum systems, allowing researchers to study them under controlled conditions.
Quantum Computing: The next new technology in computing (Data Con LA)
Data Con LA 2020
Description
Quantum computing is rapidly becoming commercially feasible. Many tech giants - Google, IBM, Honeywell, and Microsoft - are spending billions to far outpace Moore's Law. Last year saw the major milestone of quantum supremacy, in which a quantum computer was shown to greatly outperform a classical computer on a specific task. Quantum computing offers the promise of solving problems that would be impossible for a classical computer, including optimization, anomaly detection, and material design. It also allows unhackable communication.
In this presentation I will summarize what quantum computing is and why it is so important. I will sketch the landscape of the field including the hardware, software, and major customers at present. The tool most critical for data analysis - quantum machine learning - will be explained, along with the type of applications it is best suited for. Finally I will explain how you can take the first steps into leveraging quantum computing for your enterprise's benefit.
* What is quantum computing
* Who are the major players in the field
* What is quantum machine learning and what types of problems can it address
* How your company can take advantage of this
Speaker
Mark Jackson, Cambridge Quantum Computing, Scientific Lead of Business Development
Quantum computing in the cloud allows users to access quantum processors and run algorithms through online platforms. IBM and Alibaba currently offer cloud-based quantum computing, providing access to 5-qubit, 16-qubit, and 11-qubit quantum processors. Potential applications of quantum computing in the cloud include solving problems in medicine, logistics, finance, and AI. While it poses security threats, quantum computing could also speed up complex calculations and simulations to provide benefits across many fields.
Machine Learning and AI: Core Methods and Applications (QuantUniversity)
This session was presented at the CFA Institute on May 6th 2020
This deep-dive session discusses core methods and applications to provide an understanding of supervised and unsupervised machine learning. Participants will be introduced to advanced topics that include time series analysis, reinforcement learning, anomaly detection, and natural language processing. Case studies will also examine how to predict interest rates and credit risk with alternative data sets and how to analyze earnings calls from EDGAR using natural language processing techniques.
Data science and quantum computing are two rapidly growing fields that can be combined to lead to major advances. Quantum computing uses quantum bits and quantum mechanics to solve problems too complex for classical computers, enabling more efficient problem solving, new algorithms, and tools for data scientists. However, quantum computing also faces challenges like noisy qubits and error correction. Collaboration between data scientists, quantum physicists, and domain experts is needed to fully realize the potential of quantum computing in data science.
This summarizes my work during the first year of my PhD at the Institute for Manufacturing, University of Cambridge, where I investigate the feasibility of deploying machine learning under uncertainty for cyber-physical manufacturing systems.
Modern Computing: Cloud, Distributed, & High Performance (inside-BigData.com)
In this video, Dr. Umit Catalyurek from Georgia Institute of Technology presents: Modern Computing: Cloud, Distributed, & High Performance.
Ümit V. Çatalyürek is a Professor in the School of Computational Science and Engineering in the College of Computing at the Georgia Institute of Technology. He received his Ph.D. in 2000 from Bilkent University. He is a recipient of an NSF CAREER award and is the primary investigator of several awards from the Department of Energy, the National Institute of Health, and the National Science Foundation. He currently serves as an Associate Editor for Parallel Computing, and as an editorial board member for IEEE Transactions on Parallel and Distributed Computing, and the Journal of Parallel and Distributed Computing.
Learn more: http://www.bigdatau.org/data-science-seminars
Watch the video presentation: http://wp.me/p3RLHQ-ghU
Sign up for our insideHPC Newsletter: http://insidehpc.com/newsletter
The $5 Million Question: How Can We Make Quantum Computing Useful? (uolgdsc)
Google has recently announced a $5 million, 3-year competition to develop quantum algorithms that can solve real-world problems. At GDSC UoL we explored the realm of quantum computing and its potential with Petar Korponaić.
The slides of the first Meetup of the Quantum Technology community in Paris! Hosted on 10/16/2018 at WeWork Lafayette.
Contributors
- Chris Erven CEO of KETS Quantum Security
- Michael Marthaler CEO of Heisenberg Quantum Simulations
- Wojciech Burkot CPO of Beit, on quantum optimization
- Christophe Jurczak CEO of Quantonation, on VC funding
Unlocking the Power of Quantum Machine Learning with Azure Quantum (Kumton Suttiraksiri)
This document introduces quantum computing and machine learning with Azure Quantum. It defines quantum theory, technologies, and computing. It explains how qubits can represent information in superposition and how measurements cause collapse. It also outlines Azure Quantum's ecosystem for quantum algorithms, hardware, and solutions. Finally, it demonstrates a quantum machine learning example on Azure Quantum and provides additional learning resources.
Quantum computers are designed to perform certain tasks far more efficiently than conventional computers, providing developers with a new tool for specific applications.
It is clear in the short-term that quantum computers will not replace their traditional counterparts; instead, they will require classical computers to support their specialized abilities, such as systems optimization.
This document provides an introduction to quantum computing, including its history, principles, uses, applications, challenges and conclusions. It discusses how quantum computing leverages phenomena like superposition and entanglement to perform computations more powerful than classical computing. Potential applications include cryptography, artificial intelligence, big data analysis, medicine, banking, and transportation. Key challenges to address include developing reliable hardware, complex software, and managing regulation around its immense capabilities. In conclusion, quantum computing provides unprecedented computational power to solve previously intractable problems.
Quantum computing has the potential to solve certain problems exponentially faster than classical computers by exploiting principles like superposition, entanglement, and interference. Current quantum computers with 50-100 qubits operate in the Noisy Intermediate-Scale Quantum (NISQ) era and use algorithms like the Variational Quantum Eigensolver (VQE) that are hybrid quantum-classical and incorporate techniques like quantum error mitigation. Major players in the field include IBM, Google, and Rigetti who are developing quantum hardware and software for applications in optimization, simulation, and machine learning.
Sample Codes: https://github.com/davegautam/dotnetconfsamplecodes
Presentation on how you can get started with ML.NET. If you are an existing .NET stack developer and want to use the same technology for machine learning, these slides focus on how you can use ML.NET.
For the full video of this presentation, please visit: https://www.edge-ai-vision.com/2023/10/practical-approaches-to-dnn-quantization-a-presentation-from-magic-leap/
Dwith Chenna, Senior Embedded DSP Engineer for Computer Vision at Magic Leap, presents the “Practical Approaches to DNN Quantization” tutorial at the May 2023 Embedded Vision Summit.
Convolutional neural networks, widely used in computer vision tasks, require substantial computation and memory resources, making it challenging to run these models on resource-constrained devices. Quantization involves modifying CNNs to use smaller data types (e.g., switching from 32-bit floating-point values to 8-bit integer values).
Quantization is an effective way to reduce the computation and memory bandwidth requirements of these models, and their memory footprints, making it easier to run them on edge devices. However, quantization does degrade the accuracy of CNNs. In this talk, Chenna surveys practical techniques for CNN quantization and shares best practices, tools and recipes to enable you to get the best results from quantization, including ways to minimize accuracy loss.
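The float-to-8-bit conversion described above can be sketched as affine (asymmetric) quantization: a scale and zero point map the observed weight range onto the integer grid. This is illustrative NumPy, not Magic Leap's tooling:

```python
import numpy as np

rng = np.random.default_rng(0)
weights = rng.normal(0, 0.1, size=1024).astype(np.float32)  # stand-in CNN weights

# Asymmetric (affine) quantization to uint8: map [min, max] onto [0, 255].
qmin, qmax = 0, 255
w_min, w_max = float(weights.min()), float(weights.max())
scale = (w_max - w_min) / (qmax - qmin)
zero_point = int(round(qmin - w_min / scale))

q = np.clip(np.round(weights / scale) + zero_point, qmin, qmax).astype(np.uint8)

# Dequantize to measure how much accuracy the 8-bit representation loses.
w_hat = (q.astype(np.float32) - zero_point) * scale
max_err = float(np.abs(weights - w_hat).max())

print(max_err)  # on the order of the quantization step (scale)
```

The storage drops 4x (float32 to uint8) while the worst-case error stays on the order of one quantization step, which is the accuracy/efficiency trade-off the talk examines.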
Overview of quantum computing and its applications in artificial intelligence (BincySam2)
This document discusses the basic building blocks of quantum computing, including qubits, superposition, entanglement, and qubit gates. It explains how quantum computing differs from classical computing and reviews some applications in artificial intelligence like quantum principal component analysis and quantum support vector machines. While quantum computing could enable solving problems more efficiently, challenges remain in areas like algorithm creation, low temperature requirements, and building large-scale quantum computers.
Quantum algorithms like VQE and QAOA were used to analyze the impact of COVID-19 on optimal portfolio selection across different industries. Three time periods were considered - pre-COVID, during COVID, and post-COVID. Results found that COVID disrupted optimal portfolios, with sectors like retail, technology and automotive favored more pre-COVID, while oil/gas and airlines/hospitality favored post-COVID. Quantum algorithms provided comparable results to classical methods like Markowitz for portfolio optimization under changing market conditions from the pandemic.
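For reference, the classical Markowitz baseline mentioned above has a closed form for the global minimum-variance portfolio: w = C^(-1) 1 / (1^T C^(-1) 1), where C is the asset covariance matrix. A sketch with hypothetical covariance numbers, not data from the study:

```python
import numpy as np

# Toy covariance matrix for four assets (e.g. retail, tech, energy,
# airlines -- hypothetical numbers for illustration, not market data).
cov = np.array([
    [0.10, 0.02, 0.01, 0.02],
    [0.02, 0.12, 0.02, 0.03],
    [0.01, 0.02, 0.08, 0.02],
    [0.02, 0.03, 0.02, 0.15],
])

# Closed-form global minimum-variance weights: w = C^-1 1 / (1^T C^-1 1).
ones = np.ones(len(cov))
inv = np.linalg.inv(cov)
w = inv @ ones / (ones @ inv @ ones)

print(w, w.sum())  # weights sum to 1; lower-variance assets get more weight
```

Quantum approaches like VQE and QAOA tackle the constrained, discrete versions of this problem (e.g. fixed asset counts), which is where the classical closed form no longer applies.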
The 1st workshop on engineering processes and practices for quantum software ... (Mahdi_Fahmideh)
This document summarizes a presentation on developing quantum software engineering practices for quantum algorithm development for multiphysics simulations. It discusses Quanscient's work on developing quantum-native simulation algorithms like the Quantum Lattice-Boltzmann Method. It notes that quantum software engineering has some peculiarities due to the non-deterministic nature of quantum computations and immaturity of quantum hardware. It also describes an ongoing case study developing a flexible Quantum Lattice-Boltzmann module using an API and discusses some challenges of applying agile practices to quantum software development.
Quantum Machine Learning is all you Need – PhD Assistance.pdf (PhD Assistance)
Quantum computing can be used in deep learning and machine learning to reduce the time taken to train deep neural networks.
For #Enquiry:
Website: https://www.phdassistance.com/blog/quantum-machine-learning-is-all-you-need/
India: +91 91769 66446
Email: info@phdassistance.com
This presentation is a keynote from the AI4SE International Workshop exploring the challenges and opportunities of bringing Systems Engineering to the development of AI/ML functions for safety-critical systems.
Capitol Tech U Masters Presentation May 2024 (CapitolTechU)
Slides from a Virtual Information Session from Capitol Technology University covering accredited Master's degrees offered online by the university. Includes program details, costs, financial aid and the application process.
Slides from a Capitol Technology University presentation covering the doctoral programs offered by the university. Includes information on the degrees available, disciplines offered, modalities, tuition, financial aid and the application process. Presented by members of the faculty assisted by Admissions staff.
More Related Content
Similar to slides CapTechTalks Webinar May 2024 Alexander Perry.pptx
Machine Learning and AI: Core Methods and ApplicationsQuantUniversity
This session was presented at the CFA Institute on May 6th 2020
This deep-dive session discusses core methods and applications to provide an understanding of supervised and unsupervised machine learning. Participants will be introduced to advanced topics that include time series analysis, reinforcement learning, anomaly detection, and natural language processing. Case studies will also examine how to predict interest rates and credit risk with alternative data sets and how to analyze earning calls from EDGAR using Natural Language Processing Techniques.
Data science and quantum computing are two rapidly growing fields that can be combined to lead to major advances. Quantum computing uses quantum bits and quantum mechanics to solve problems too complex for classical computers, enabling more efficient problem solving, new algorithms, and tools for data scientists. However, quantum computing also faces challenges like noisy qubits and error correction. Collaboration between data scientists, quantum physicists, and domain experts is needed to fully realize the potential of quantum computing in data science.
This summarizes my work during my first year of PhD at Institute for Manufacturing, University of Cambridge where I investigate the feasibility of deploying machine learning under uncertainty for cyber-physical manufacturing systems.
Modern Computing: Cloud, Distributed, & High Performanceinside-BigData.com
In this video, Dr. Umit Catalyurek from Georgia Institute of Technology presents: Modern Computing: Cloud, Distributed, & High Performance.
Ümit V. Çatalyürek is a Professor in the School of Computational Science and Engineering in the College of Computing at the Georgia Institute of Technology. He received his Ph.D. in 2000 from Bilkent University. He is a recipient of an NSF CAREER award and is the primary investigator of several awards from the Department of Energy, the National Institute of Health, and the National Science Foundation. He currently serves as an Associate Editor for Parallel Computing, and as an editorial board member for IEEE Transactions on Parallel and Distributed Computing, and the Journal of Parallel and Distributed Computing.
Learn more: http://www.bigdatau.org/data-science-seminars
Watch the video presentation: http://wp.me/p3RLHQ-ghU
Sign up for our insideHPC Newsletter: http://insidehpc.com/newsletter
The $5 Million Question: How Can We Make Quantum Computing Useful?uolgdsc
Google has recently announced a $5 million dollar, 3-year competition to develop quantum algorithms that can solve real-world problems. At GDSC UoL we explored realm of quantum computing and its potential with Petar Korponaić.
The slides of the first Meetup of the Quantum Technology community in Paris ! Hosted on 10/16/2018 at WeWork Lafayette
Contribtutors
- Chris Erven CEO of KETS Quantum Security
- Michael Marthaler CEO of Heisenberg Quantum Simulations
- Wojciech Burkot CPO of Beit, on quantum optimization
- Christophe Jurczak CEO of Quantonation, on VC funding
Unlocking the Power of Quantum Machine Learning with Azure QuantumKumton Suttiraksiri
This document introduces quantum computing and machine learning with Azure Quantum. It defines quantum theory, technologies, and computing. It explains how qubits can represent information in superposition and how measurements cause collapse. It also outlines Azure Quantum's ecosystem for quantum algorithms, hardware, and solutions. Finally, it demonstrates a quantum machine learning example on Azure Quantum and provides additional learning resources.
Quantum computers are designed to perform tasks much more accurately and efficiently than conventional computers, providing developers with a new tool for specific applications.
It is clear in the short-term that quantum computers will not replace their traditional counterparts; instead, they will require classical computers to support their specialized abilities, such as systems optimization.
This document provides an introduction to quantum computing, including its history, principles, uses, applications, challenges and conclusions. It discusses how quantum computing leverages phenomena like superposition and entanglement to perform computations more powerful than classical computing. Potential applications include cryptography, artificial intelligence, big data analysis, medicine, banking, and transportation. Key challenges to address include developing reliable hardware, complex software, and managing regulation around its immense capabilities. In conclusion, quantum computing provides unprecedented computational power to solve previously intractable problems.
Quantum computing has the potential to solve certain problems exponentially faster than classical computers by exploiting principles like superposition, entanglement, and interference. Current quantum computers with 50-100 qubits operate in the Noisy Intermediate-Scale Quantum (NISQ) era and use algorithms like the Variational Quantum Eigensolver (VQE) that are hybrid quantum-classical and incorporate techniques like quantum error mitigation. Major players in the field include IBM, Google, and Rigetti who are developing quantum hardware and software for applications in optimization, simulation, and machine learning.
Sample Codes: https://github.com/davegautam/dotnetconfsamplecodes
Presentation on How you can get started with ML.NET. If you are existing .NET Stack Developer and Wanna use the same technology into Machine Learning, this slide focuses on how you can use ML.NET for Machine Learning.
For the full video of this presentation, please visit: https://www.edge-ai-vision.com/2023/10/practical-approaches-to-dnn-quantization-a-presentation-from-magic-leap/
Dwith Chenna, Senior Embedded DSP Engineer for Computer Vision at Magic Leap, presents the “Practical Approaches to DNN Quantization” tutorial at the May 2023 Embedded Vision Summit.
Convolutional neural networks, widely used in computer vision tasks, require substantial computation and memory resources, making it challenging to run these models on resource-constrained devices. Quantization involves modifying CNNs to use smaller data types (e.g., switching from 32-bit floating-point values to 8-bit integer values).
Quantization is an effective way to reduce the computation and memory bandwidth requirements of these models, and their memory footprints, making it easier to run them on edge devices. However, quantization does degrade the accuracy of CNNs. In this talk, Chenna surveys practical techniques for CNN quantization and shares best practices, tools and recipes to enable you to get the best results from quantization, including ways to minimize accuracy loss.
Overview of quantum computing and its application in artificial intelligenceBincySam2
This document discusses the basic building blocks of quantum computing, including qubits, superposition, entanglement, and qubit gates. It explains how quantum computing differs from classical computing and reviews some applications in artificial intelligence like quantum principal component analysis and quantum support vector machines. While quantum computing could enable solving problems more efficiently, challenges remain in areas like algorithm creation, low temperature requirements, and building large-scale quantum computers.
Quantum algorithms like VQE and QAOA were used to analyze the impact of COVID-19 on optimal portfolio selection across different industries. Three time periods were considered - pre-COVID, during COVID, and post-COVID. Results found that COVID disrupted optimal portfolios, with sectors like retail, technology and automotive favored more pre-COVID, while oil/gas and airlines/hospitality favored post-COVID. Quantum algorithms provided comparable results to classical methods like Markowitz for portfolio optimization under changing market conditions from the pandemic.
The 1st workshop on engineering processes and practices for quantum software ...Mahdi_Fahmideh
This document summarizes a presentation on developing quantum software engineering practices for quantum algorithm development for multiphysics simulations. It discusses Quanscient's work on developing quantum-native simulation algorithms like the Quantum Lattice-Boltzmann Method. It notes that quantum software engineering has some peculiarities due to the non-deterministic nature of quantum computations and immaturity of quantum hardware. It also describes an ongoing case study developing a flexible Quantum Lattice-Boltzmann module using an API and discusses some challenges of applying agile practices to quantum software development.
Quantum Machine Learning is all you Need – PhD Assistance.pdfPhD Assistance
Quantum computing can be used in deep learning and machine learning to reduce the time taken to train deep neural networks.
This presentation is a keynote at the AI4SE International Workshop exploring the challenges and opportunities of bringing Systems Engineering to the development of AI/ML functions for safety-critical systems.
Similar to slides CapTechTalks Webinar May 2024 Alexander Perry.pptx (20)
Capitol Tech U Masters Presentation May 2024CapitolTechU
Slides from a Virtual Information Session from Capitol Technology University covering accredited Master's degrees offered online by the university. Includes program details, costs, financial aid and the application process.
Slides from a Capitol Technology University presentation covering the doctoral programs offered by the university. Includes information on the degrees available, disciplines offered, modalities, tuition, financial aid and the application process. Presented by members of the faculty assisted by Admissions staff.
CapTechU Masters Presentation April 2024.pptxCapitolTechU
Slides from a Capitol Technology University virtual information session held April 24, 2024 and covering Online accredited master's degrees offered by the university. Includes details about degrees offered, tuition and fees, financial aid, and how the programs are set up.
Capitol Tech U Doctoral Presentation - April 2024.pptxCapitolTechU
Slides from a virtual open house held April 21, 2024 by Capitol Technology University and detailing online regionally accredited doctoral degree programs offered by the university. Features information about programs, modalities, tuition, application procedures, and the doctoral process.
Slides CapTechTalks Webinar April 2024 Ilia Kolochenko.pptxCapitolTechU
Slides from a webinar presented by Capitol Technology University on April 18, 2024. Features a presentation given by Dr. Ilia Kolochenko on Cyber Law, Cybercrime Investigations and Response.
CapTechU Masters Info Session March 2024.pptxCapitolTechU
Slides from a Master's Degree Virtual Information Session held March 27, 2024 by Capitol Technology University. Slides cover the history of the university, accreditation, degrees offered, modalities, the online format, tuition, financial aid and the application process.
Slides CapTechTalks Webinar March 2024 Joshua Sinai.pptxCapitolTechU
Slides from a Capitol Technology University webinar presented on March 21, 2024 by Dr. Joshua Sinai. The webinar detailed how to develop a framework to assess risk, examining the Maui fires of 2023 and the Hamas attack on Israel, also in 2023. Dr. Sinai, an expert on counterterrorism and risk management, looked at the causes of the failures to anticipate the catastrophes and how they should have been counteracted.
Slides from a Virtual Information Session from Capitol Technology University covering the doctoral programs offered by the university. Includes information on the university, its accreditation, doctoral academics, admissions, tuition and more. Presented March 20, 2024
Masters Presentation - February 2024.pptxCapitolTechU
Slides from Feb. 28, 2024 presentation by Capitol Technology University covering the online accredited master's degrees the University offers. Includes what programs are offered, how they are organized, and information on the application process, financial aid and more.
Slides from a webinar presented on Feb. 25, 2024 discussing doctoral programs at Capitol Technology University. Features information on degree programs, schedules, application, financial aid and more. Presenters include Dr. Ian McAndrew, Mr. Allen Exnor, Ms. Carmit Levin, and Mr. Bill Gibbs.
CapTechTalks Webinar Feb 2024 Darrell Burrell.pptxCapitolTechU
Slides from a Capitol Technology University webinar presented on Feb. 15, 2024 and featuring Dr. Darrell Burrell discussing "Finding Your Scholarly Voice: Using Peer-reviewed Publications to Showcase Your Expertise."
Masters Presentation - January 2024.pptxCapitolTechU
Slides from a Capitol Technology University presentation on accredited Master's degree programs offered online by the university. Includes information on degrees, program details, tuition, financial assistance and more.
CapTech Talks Webinar December 2023 Diane Janosek.pptxCapitolTechU
Slides from a webinar from Capitol Technology University presented in December 2023 and featuring Dr. Diane M. Janosek presenting on Data Governance. This session was part of the fall "Women in Cyber" Leadership Series.
CapTech Talks Webinar November 2023 Tom Vazdar slides.pptxCapitolTechU
Slides from a webinar presented Nov. 16, 2023 by Capitol Technology University and featuring Tom Vazdar, a noted banking cybersecurity expert from Europe.
Slides from a Master's Program Virtual Open House held on Nov. 15, 2023 by Capitol Technology University. Features information about online accredited Master's degree programs, application procedures, tuition and more.
Slides from a Capitol Technology University Doctoral Programs Virtual Open House-Information Session, held Nov. 12, 2023 and featuring information on programs, costs, and other details related to the doctoral degrees offered by the university.
CapTech Talks Webinar October 2023 Bill Butler.pptxCapitolTechU
Slides from a webinar presented Oct. 19, 2023 by Capitol Technology University and featuring Dr. William Butler, discussing cyber education challenges in 2024.
CapTechU Masters Presentation October 2023.pptxCapitolTechU
Slides from a Virtual Open House held Oct. 11, 2023 by Capitol Technology University. Covers all master's degrees offered by the University, plus information on tuition, financial aid and the application process.
This presentation includes basic of PCOS their pathology and treatment and also Ayurveda correlation of PCOS and Ayurvedic line of treatment mentioned in classics.
This slide is special for master students (MIBS & MIFB) in UUM. Also useful for readers who are interested in the topic of contemporary Islamic banking.
Hindi varnamala (alphabet) PPT presentation covering Hindi vowels (svar) and consonants (vyanjan), with alphabet practice for children learning Hindi, by Dr. Mulla Adam Ali. https://www.drmullaadamali.com
LAND USE LAND COVER AND NDVI OF MIRZAPUR DISTRICT, UPRAHUL
This Dissertation explores the particular circumstances of Mirzapur, a region located in the
core of India. Mirzapur, with its varied terrains and abundant biodiversity, offers an optimal
environment for investigating the changes in vegetation cover dynamics. Our study utilizes
advanced technologies such as GIS (Geographic Information Systems) and Remote sensing to
analyze the transformations that have taken place over the course of a decade.
The complex relationship between human activities and the environment has been the focus of extensive research and concern. As the global community grapples with swift urbanization,
population expansion, and economic progress, the effects on natural ecosystems are becoming
more evident. A crucial element of this impact is the alteration of vegetation cover, which plays a significant role in maintaining the ecological equilibrium of our planet. Land serves as the foundation for all human activities and provides the necessary materials for
these activities. As the most crucial natural resource, its utilization by humans results in different
'Land uses,' which are determined by both human activities and the physical characteristics of the
land.
The utilization of land is impacted by human needs and environmental factors. In countries
like India, rapid population growth and the emphasis on extensive resource exploitation can lead
to significant land degradation, adversely affecting the region's land cover.
Therefore, human intervention has significantly influenced land use patterns over many
centuries, evolving its structure over time and space. In the present era, these changes have
accelerated due to factors such as agriculture and urbanization. Information regarding land use and
cover is essential for various planning and management tasks related to the Earth's surface,
providing crucial environmental data for scientific, resource management, policy purposes, and
diverse human activities.
Accurate understanding of land use and cover is imperative for the development planning
of any area. Consequently, a wide range of professionals, including earth system scientists, land
and water managers, and urban planners, are interested in obtaining data on land use and cover
changes, conversion trends, and other related patterns. The spatial dimensions of land use and
cover support policymakers and scientists in making well-informed decisions, as alterations in
these patterns indicate shifts in economic and social conditions. Monitoring such changes with the
help of advanced technologies like Remote Sensing and Geographic Information Systems is crucial for coordinated efforts across different administrative levels.
Changes in vegetation cover refer to variations in the distribution, composition, and overall
structure of plant communities across different temporal and spatial scales. These changes can occur naturally.
How to Fix the Import Error in the Odoo 17Celine George
An import error occurs when a program fails to import a module or library, disrupting its execution. In languages like Python, this issue arises when the specified module cannot be found or accessed, hindering the program's functionality. Resolving import errors is crucial for maintaining smooth software operation and uninterrupted development processes.
This document provides an overview of wound healing, its functions, stages, mechanisms, factors affecting it, and complications.
A wound is a break in the integrity of the skin or tissues, which may be associated with disruption of the structure and function.
Healing is the body’s response to injury in an attempt to restore normal structure and functions.
Healing can occur in two ways: Regeneration and Repair
There are 4 phases of wound healing: hemostasis, inflammation, proliferation, and remodeling. This document also describes the mechanism of wound healing. Factors that affect healing include infection, uncontrolled diabetes, poor nutrition, age, anemia, the presence of foreign bodies, etc.
Complications of wound healing like infection, hyperpigmentation of scar, contractures, and keloid formation.
A workshop hosted by the South African Journal of Science aimed at postgraduate students and early career researchers with little or no experience in writing and publishing journal articles.
Main Java[All of the Base Concepts}.docxadhitya5119
This is part 1 of my Java learning journey. It contains custom methods, classes, constructors, packages, multithreading, try-catch blocks, finally blocks and more.
Walmart Business+ and Spark Good for Nonprofits.pdfTechSoup
Learn about all the ways Walmart supports nonprofit organizations.
You will hear from Liz Willett, the Head of Nonprofits, about what Walmart is doing to help nonprofits, including Walmart Business and Spark Good. Walmart Business+ is a new offer for nonprofits that provides discounts and also streamlines nonprofits' order and expense tracking, saving time and money.
The webinar may also give some examples on how nonprofits can best leverage Walmart Business+.
The event will cover the following:
Walmart Business+ (https://business.walmart.com/plus) is a new shopping experience for nonprofits, schools, and local business customers that connects an exclusive online shopping experience to stores. Benefits include free delivery and shipping, a 'Spend Analytics' feature, special discounts, deals and tax-exempt shopping.
Special TechSoup offer for a free 180 days membership, and up to $150 in discounts on eligible orders.
Spark Good (walmart.com/sparkgood) is a charitable platform that enables nonprofits to receive donations directly from customers and associates.
Answers about how you can do more with Walmart!
What is Digital Literacy? A guest blog from Andy McLaughlin, University of Ab...
slides CapTechTalks Webinar May 2024 Alexander Perry.pptx
1. Presented by Dr. Alexander Perry
May 23, 2024
Hybrid Quantum Machine Learning
Utilizing Limited-Scale Quantum Computing
2. Agenda
Bill Gibbs, Host
1. About Capitol Technology University
2. Session Pointers
3. About the Presenter
4. Presentation
5. Q and A
6. Upcoming Webinars
7. Recording, Slides, Certificate
3. About
Established in 1927, we are one
of the few private Universities in
the U.S. specifically dedicated to
STEM-Based
academic programs. The
University offers degrees at the
Associate, Bachelor, Master, and
Doctoral levels
3
4. Nonprofit, Private &
Accredited
Capitol is a nonprofit, private accredited university
located in Laurel, Maryland, USA
Capitol Technology University is
accredited by the Commission on
Higher Education of the Middle
States Association of Colleges and
Schools
The University is authorized by the
State of Maryland to confer
Associate’s (A.A.S.), Bachelor’s (B.S.),
Master’s (M.S., M.B.A., M.Ed, M.Res.,
T.M.B.A, M.Phil.), and Doctoral (D.Sc.,
Ph.D., D.B.A., Ed.D.) degrees.
5. Capitol offers 16 accredited
degrees from the Bachelor’s to
Doctoral levels related to this
webinar. For more information
about degrees and certificates
offered in related areas, visit
CapTechU.edu/fields-of-study
Join us for Master’s and Doctoral
Virtual Information Sessions. Held
monthly. To learn more:
Email: gradadmit@captechu.edu
Phone: 1-800-950-1992
6. Session Pointers
• We will answer questions at the conclusion of the presentation. At any time, you
can post a question in the text chat and we will answer as many as we can.
• Microphones and webcams are not activated for participants.
• A link to the recording and to the slides will be sent to all registrants and available
on our webinar web page.
• A participation certificate is available by request for both Live Session and On
Demand viewers.
7. Dr. Alexander Perry
• Adjunct Professor at Capitol
• Data Scientist: Hybrid Quantum-Classical Machine
Learning (HQML)
• Experience: Cyber, Data Science, AI/ML, Quantum
Computing
• 30-year career as software engineer, system
administrator, data scientist, technical director
• Doctor of Science (DSc) in Cybersecurity from Capitol
Technology University
8. Presented by Dr. Alexander Perry
May 23, 2024
Hybrid Quantum Machine Learning
Utilizing Limited-Scale Quantum Computing
9. Hybrid Quantum Machine Learning
Utilizing Limited-Scale Quantum Computing
Dr. Alexander Perry
Capitol Technology University
May 23, 2024
10. Modified Heilmeier Catechism
• What is HQML via Limited-Scale Quantum Computing?
• Who cares? If you are successful, what difference will it make?
• What are you trying to do?
• How is it done today, and what are the limits of current practice?
• What is new in your approach and why do you think it will be successful?
• What are the risks?
• How much will it cost?
• How long will it take?
• What are the mid-term and final “exams” to check for success?
11. Quantum Computing
• The goal of quantum computation is not a single output but rather to create a device that samples from a probability distribution.
• A qubit is the computational unit in quantum computers.
• Quantum Superposition: the notion that tiny objects can exist in multiple places or states simultaneously is a cornerstone of quantum physics.
• Knowing the quantum state of the system allows us to predict the outcomes of
experiments.
• The Two Golden Rules of Quantum Mechanics:
1. A particle can be in quantum superposition where it behaves as though it is in multiple
states at once.
2. When measured, the particle will be found in a single state.
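The two golden rules can be illustrated with a few lines of state-vector arithmetic. This is a NumPy-only sketch (not from the slides): a Hadamard gate puts a qubit in equal superposition, and each simulated measurement yields a single definite outcome, while repetition builds up the statistics.

```python
import numpy as np

rng = np.random.default_rng(0)

# A single qubit in equal superposition: |+> = (|0> + |1>) / sqrt(2),
# prepared by applying a Hadamard gate to |0>.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
state = H @ np.array([1.0, 0.0])          # amplitudes for |0> and |1>

# Rule 1: between measurements the qubit carries both amplitudes at once.
probs = np.abs(state) ** 2                # Born rule: |amplitude|^2
print("P(0), P(1) =", probs)              # [0.5, 0.5]

# Rule 2: each measurement finds the particle in a single state; repeating
# the prepare-and-measure cycle builds up the statistics.
shots = rng.choice([0, 1], size=10_000, p=probs)
print("fraction of 1s over 10,000 shots:", shots.mean())
```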
13. Quantum Machine Learning (QML)
• Quantum Machine Learning (QML) explores how to devise and
implement quantum software that could enable machine learning on
quantum computers (including noisy intermediate-scale quantum,
or NISQ) that is faster than classical computers.
• Hybrid quantum machine learning (HQML) explores how to implement
QML using quantum computers (including noisy intermediate-scale
quantum, or NISQ) in conjunction with classical computers to solve ML
problems faster than classical computers.
14. Modified Heilmeier Catechism
• What is HQML via Limited-Scale Quantum Computing?
• Who cares? If you are successful, what difference will it make?
• What are you trying to do?
• How is it done today, and what are the limits of current practice?
• What is new in your approach and why do you think it will be successful?
• What are the risks?
• How much will it cost?
• How long will it take?
• What are the mid-term and final “exams” to check for success?
17. Government and Industry Investments
• May 09, 2024: DOE Announces $60-70M Quantum Information Science Funding
Opportunity
• Aug 30, 2023: DOE Announces $24M for Research on Quantum Networks
• Aug 24, 2023: "The administration has requested $75 million for a new account focused
on near-term applications of quantum information science."
• Aug 17, 2023: NIST Issues Congressionally Mandated Report on Emerging Tech Areas
• Aug 16, 2023: NSF Invests $38M to Advance Quantum Information Science and
Engineering
• Aug 15, 2023: AFRL opens Extreme Computing centre for quantum computing research
• Jul 27, 2023: DOE Announces $11.7 Million for Research on Quantum Computing
• Jul 12, 2023: Truist and IBM Collaborate on Emerging Technology Innovation and Quantum Computing
• Jun 22, 2023: Expansion of National Quantum Initiative Pitched to Science Committee
18. Quantum Potential
• Quantum computing makes use of intrinsically quantum properties such as
entanglement and superposition to design algorithms that are faster than
classical ones for some class of problems.
• They can offer computational speed-ups that provably no classical system could ever exhibit.
• Some approaches are based on a parameterized quantum circuit (PQC, discussed
in detail later), using neural network-inspired algorithms to train them.
19. Modified Heilmeier Catechism
• What is HQML via Limited-Scale Quantum Computing?
• Who cares? If you are successful, what difference will it make?
• What are you trying to do?
• How is it done today, and what are the limits of current practice?
• What is new in your approach and why do you think it will be successful?
• What are the risks?
• How much will it cost?
• How long will it take?
• What are the mid-term and final “exams” to check for success?
20. Hybrid Quantum Machine Learning (HQML)
High-level depiction of hybrid algorithms used for machine learning.
Explore implementing a hybrid quantum machine learning (HQML) prototype
using noisy intermediate scale quantum (NISQ) computers (a type of LSQC) in
conjunction with classical computers to solve machine learning problems faster
than classical computers.
21. Outcome, Not Technology, Focused
• Goal: Use Data Classification via Machine Learning as a way to learn quantum thinking.
• Method:
• Variational Quantum Kernel-Based Classification (VQC):
• Operates by using a variational quantum circuit to classify a training set in direct analogy to conventional SVMs.119
• NISQ (a type of LSQC) computers via Parameterized Quantum Circuits (PQCs):
• PQCs offer a concrete way to implement algorithms in the NISQ era.2,102
• IBM’s Qiskit will be the quantum simulator of choice for prototyping.
• Currently a de-facto community standard.
• Offers a rich set of quantum computing examples.
• Offers backends that can run simulator code on multiple NISQ devices.
22. Parameterized Quantum Circuit
• A parameterized quantum circuit (PQC) is a type of ansatz (educated guess or starting point). The core idea is based on the Variational Quantum Eigensolver (VQE).
• The goal of a VQE is to find the ground state (the expected value of a quantum measurement, in this case) of a Hamiltonian H by minimizing the parameters 𝜃 of a PQC given by 𝑈(𝜃) with regard to an objective function that represents the energy of the given Hamiltonian (the classification of data, in this case).
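As a toy illustration of the VQE idea described above (not the presenter's code), the sketch below minimizes the energy ⟨ψ(θ)|H|ψ(θ)⟩ of the single-qubit Hamiltonian H = Pauli-Z using a one-parameter Ry ansatz; the exact ground-state energy is -1, reached at θ = π.

```python
import numpy as np

# Toy VQE: one qubit, Hamiltonian H = Pauli-Z (ground-state energy -1),
# ansatz U(theta) = Ry(theta) applied to |0>.
Z = np.array([[1, 0], [0, -1]], dtype=float)

def ansatz(theta):
    """Parameterized circuit: a single Ry rotation acting on |0>."""
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def energy(theta):
    """Objective: expectation value <psi(theta)| H |psi(theta)>."""
    psi = ansatz(theta)
    return psi @ Z @ psi

# Classical optimizer loop: gradient descent with a finite-difference
# gradient, standing in for the classical half of the hybrid algorithm.
theta, lr, eps = 0.1, 0.4, 1e-4
for _ in range(200):
    grad = (energy(theta + eps) - energy(theta - eps)) / (2 * eps)
    theta -= lr * grad

print(f"theta = {theta:.3f}, energy = {energy(theta):.4f}")
```

Here the "quantum" expectation value is computed exactly; on real NISQ hardware it would be estimated from repeated measurements, which is what the hybrid loop has to cope with.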
24. Modified Heilmeier Catechism
• What is HQML via Limited-Scale Quantum Computing?
• Who cares? If you are successful, what difference will it make?
• What are you trying to do?
• How is it done today, and what are the limits of current practice?
• What is new in your approach and why do you think it will be successful?
• What are the risks?
• How much will it cost?
• How long will it take?
• What are the mid-term and final “exams” to check for success?
25. Classical ML ClassificationToday
• Done via:
• Linear regression (univariate and multivariate)
• Support vector machine (SVM) for support vector classification (SVC).
• Deep Neural Networks (Deep Learning)
• Others…
• Limitations:
• Speed of processing the data at scale
• Certain categories of problems are intractable for classical computers in:
• Encryption and Cybersecurity
• Financial Services
• Drug Research and Development
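As a point of comparison for the classical methods listed above, a short scikit-learn sketch (illustrative only; assumes scikit-learn is installed) shows a support vector classifier failing with a linear kernel but succeeding with an RBF kernel on data that is not linearly separable:

```python
import numpy as np
from sklearn.datasets import make_circles
from sklearn.svm import SVC

# Two concentric rings: impossible to separate with a straight line.
X, y = make_circles(n_samples=400, factor=0.3, noise=0.05, random_state=0)

linear = SVC(kernel="linear").fit(X, y)
rbf = SVC(kernel="rbf").fit(X, y)

print("linear kernel accuracy:", linear.score(X, y))   # near chance
print("RBF kernel accuracy:   ", rbf.score(X, y))      # near 1.0
```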
26. Modified Heilmeier Catechism
• What is HQML via Limited-Scale Quantum Computing?
• Who cares? If you are successful, what difference will it make?
• What are you trying to do?
• How is it done today, and what are the limits of current practice?
• What is new in your approach and why do you think it will be successful?
• What are the risks?
• How much will it cost?
• How long will it take?
• What are the mid-term and final “exams” to check for success?
27. What’s New
This diagram gives a brief overview of the Variational Quantum Classification protocol.
28. Why it will work:
HQML in Feature Hilbert spaces
• The basic idea of quantum computing is surprisingly similar to that of kernel
methods in machine learning, namely to efficiently perform computations in an
intractably large Hilbert space.
• We interpret the process of encoding inputs in a quantum state as a nonlinear
feature map that maps data to quantum Hilbert space.
• PQCs can form Gaussian Kernels that can be used to derive adaptive learning
rates for gradient ascent.
• Even at low circuit depth, some classes of PQCs can generate highly non-trivial
outputs.
• PQCs may offer a concrete way to implement QML algorithms on NISQ devices.
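The idea of encoding inputs as a nonlinear feature map into quantum Hilbert space can be made concrete with a minimal single-qubit example. This is an illustrative sketch, not the presenter's protocol: a scalar x is encoded by an Ry(x) rotation, and the kernel is the fidelity (squared overlap) between two encoded states.

```python
import numpy as np

def feature_map(x):
    """Encode a scalar into a single-qubit state via an Ry(x) rotation:
    |phi(x)> = cos(x/2)|0> + sin(x/2)|1>."""
    return np.array([np.cos(x / 2), np.sin(x / 2)])

def quantum_kernel(x1, x2):
    """Kernel = fidelity |<phi(x1)|phi(x2)>|^2 between encoded states."""
    return np.abs(feature_map(x1) @ feature_map(x2)) ** 2

# For this encoding the kernel has the closed form cos^2((x1 - x2) / 2),
# so the simulation can be checked analytically.
x1, x2 = 0.4, 1.3
print(quantum_kernel(x1, x2), np.cos((x1 - x2) / 2) ** 2)
```

A single qubit gives nothing classically hard, of course; the interest arises when the feature map acts on many qubits and the resulting Hilbert space is intractably large to simulate.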
29. Kernel Functions
• The “kernel trick” maps input data into a higher dimensional space, making it
easier to solve non-linearly separable problems.
• Mathematically, a kernel function can be defined as:
k(xᵢ, xⱼ) = ⟨f(xᵢ), f(xⱼ)⟩
where k is the kernel function, xᵢ and xⱼ are n-dimensional inputs, f is a map from n-dimensional to m-dimensional space, and ⟨a, b⟩ denotes the inner product. When considering finite data, a kernel function can be represented as a matrix:
Kᵢⱼ = k(xᵢ, xⱼ)
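The kernel trick above can be verified numerically. The sketch below (an illustrative example, not from the slides) uses the degree-2 polynomial kernel k(x, y) = (x·y)², whose explicit feature map f sends an n-dimensional input to all n² pairwise products; the kernel matrix computed both ways agrees.

```python
import numpy as np

rng = np.random.default_rng(1)

def f(x):
    """Explicit feature map for the degree-2 polynomial kernel:
    all pairwise products x_i * x_j (n-dim -> n^2-dim)."""
    return np.outer(x, x).ravel()

def k(xi, xj):
    """Same inner product, computed without ever building f(x)."""
    return (xi @ xj) ** 2

X = rng.standard_normal((5, 3))           # five 3-dimensional inputs

# Kernel matrix K_ij = k(x_i, x_j), computed both ways.
K_trick = np.array([[k(a, b) for b in X] for a in X])
K_explicit = np.array([[f(a) @ f(b) for b in X] for a in X])
print(np.allclose(K_trick, K_explicit))   # True
```

The point of the trick is on the right-hand side: k costs O(n) per pair, while the explicit map costs O(n²) here and can be infinite-dimensional for kernels like the Gaussian.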
31. Modified Heilmeier Catechism
• What is HQML via Limited-Scale Quantum Computing?
• Who cares? If you are successful, what difference will it make?
• What are you trying to do?
• How is it done today, and what are the limits of current practice?
• What is new in your approach and why do you think it will be successful?
• What are the risks?
• How much will it cost?
• How long will it take?
• What are the mid-term and final “exams” to check for success?
32. Risks: Potentially Expensive Failure
• Hardware challenges:
• Quantum Decoherence: In quantum information processing, the term decoherence is often used loosely to describe any kind of noise that can affect or collapse quantum particles to a classical state, as if they were being measured, eliminating the quantum behavior of the particles.
• Algorithmic Challenges:
• Supervised HQML training often requires extensive amounts of time.
• HQML suffers from the barren plateau problem.
33. Modified Heilmeier Catechism
• What is HQML via Limited-Scale Quantum Computing?
• Who cares? If you are successful, what difference will it make?
• What are you trying to do?
• How is it done today, and what are the limits of current practice?
• What is new in your approach and why do you think it will be successful?
• What are the risks?
• How much will it cost?
• How long will it take?
• What are the mid-term and final “exams” to check for success?
34. Extremely Expensive
• Commercial usage of existing NISQ systems can easily reach into the $100,000 to $1,000,000+ range.
• Vendors offer researchers credits and/or free usage of their smaller NISQ
systems (often with time limits).
35. Modified Heilmeier Catechism
• What is HQML via Limited-Scale Quantum Computing?
• Who cares? If you are successful, what difference will it make?
• What are you trying to do?
• How is it done today, and what are the limits of current practice?
• What is new in your approach and why do you think it will be successful?
• What are the risks?
• How much will it cost?
• How long will it take?
• What are the mid-term and final “exams” to check for success?
36. Timeframes
• General Purpose Quantum Computers with millions of logical qubits are 10-15 years away (at best).
• NISQ systems with up to 1000 raw qubits and 48 logical qubits exist:
• Dec 4, 2023: IBM releases first-ever 1,000-qubit quantum chip
• Dec 7, 2023: Logical quantum processor based on reconfigurable atom arrays
37. Modified Heilmeier Catechism
• What is HQML via Limited-Scale Quantum Computing?
• Who cares? If you are successful, what difference will it make?
• What are you trying to do?
• How is it done today, and what are the limits of current practice?
• What is new in your approach and why do you think it will be successful?
• What are the risks?
• How much will it cost?
• How long will it take?
• What are the mid-term and final “exams” to check for success?
38. Success Checkpoints
• If HQML is going to work, we should see successful prototypes in the next 3-5 years.
• In 5-10 years, we should see production applications of HQML if the technology is successful.
42. Recording, Slides & Certificate
Links to the slides and recording will be
sent to all registrants. Watch for an
email
A Certificate of Completion is available
upon request to both live session and
On Demand viewers
Simply reply to the email
43. Thanks for Joining Us!
Thank You!
This concludes today’s webinar
Watch for a follow up email that contains:
1. How to get a Participation Certificate
(Available by request for both Live Session
and On Demand viewers)
2. Link to the webinar recording and slides
Editor's Notes
General Purpose Quantum Computers with millions of logical (fault-tolerant) qubits are 10-15 years away (at best).
Anyone who tells you they have a quantum computer that will solve your problems should be met with the GREATEST of skepticism.
This topic, HQML, is merely research in preparation for the future. Think of it as beginning to train learners of today to think in a “quantum way”.
Depending on the quantum computing platform, different approaches can be divided in two groups: digital approaches using gate-based quantum computers and analog approaches using analog quantum computing platforms.
NISQ (Noisy Intermediate Scale Quantum) devices contain a limited number of qubits that are stable for only a short period.25
This work takes a science-first approach that aligns with NQIAC efforts under the National Quantum Initiative (NQI).
The strategic goal of this work is to help accelerate technology development toward mission applications of Limited-Scale Quantum Computer(s) (LSQC).
As of December 16, 2022: https://www.quantum.gov/wp-content/uploads/2023/01/NQIAC-Slides-2022-12-16.pdf
High-level depiction of hybrid algorithms used for machine learning:2,25
The role of the human is to set up the model using prior information, assess the learning process, and exploit the forecasts (Quantum State Preparation).
Within the hybrid system, the quantum computer prepares quantum states according to a set of parameters (Quantum State Processing).
The outcomes of the quantum states are measured (Quantum State Measurement).
Using the measurement outcomes, the classical learning algorithm adjusts the parameters in order to minimize an objective function (Quantum State Preparation).
The updated parameters, now defining a new quantum circuit, are fed back to the quantum hardware in a closed loop.
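The closed loop described in these notes can be sketched end to end. In this toy version (not the presenter's implementation), the "quantum" half is a simulated single-qubit circuit measured over a finite number of shots, and the classical half updates the parameter using a parameter-shift gradient estimated from those noisy measurements.

```python
import numpy as np

rng = np.random.default_rng(2)

def prepare_and_measure(theta, shots=2000):
    """Quantum half (simulated): prepare Ry(theta)|0>, measure in the Z
    basis `shots` times, and estimate <Z> from the outcome counts."""
    p1 = np.sin(theta / 2) ** 2              # probability of measuring 1
    ones = rng.binomial(shots, p1)
    return (shots - 2 * ones) / shots        # <Z> = P(0) - P(1)

# Classical half: adjust theta to minimize the measured objective <Z>,
# feeding the updated parameter back to the (simulated) quantum hardware.
theta, lr = 0.3, 0.5
for _ in range(150):
    # Parameter-shift rule: for <Z> = cos(theta) the exact gradient is
    # (E(theta + pi/2) - E(theta - pi/2)) / 2, here estimated from shots.
    grad = (prepare_and_measure(theta + np.pi / 2)
            - prepare_and_measure(theta - np.pi / 2)) / 2
    theta -= lr * grad

print(f"theta = {theta:.2f}  (minimum of <Z> is at theta = pi = {np.pi:.2f})")
```

Shot noise makes the gradient estimates jitter, yet the loop still converges, which is exactly the regime the hybrid scheme is designed for.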
A Variational Quantum Algorithm (VQA) uses both quantum and classical computers to accomplish a task.
VQC: A VQA-based approach leveraging linear kernels (in this case)
VQNN: A VQA that uses qubits to emulate the hidden layers of a classical neural network (NN) to estimate the gradient of a function. This estimate from the measured quantum state is sent to a classical optimizer in epochs like classical NN.
VQR: A VQA using quantum reservoir computing, where quantum noise can be beneficial to the machine learning.
A Hamiltonian matrix is a 2n-by-2n matrix A such that JA is symmetric, where J is the skew-symmetric matrix
J = [ 0ₙ  Iₙ ; −Iₙ  0ₙ ]
and Iₙ is the n-by-n identity matrix.
In other words, A is Hamiltonian if and only if (JA)† = JA, where (·)† denotes the transpose over ℝ and the adjoint (complex conjugate transpose) over ℂ.65
A machine learning model composed of classical pre-/post-processing and a parameterized quantum circuit.
A data vector is sampled from the dataset distribution, x ~ P_D.
The pre-processing scheme maps it to the vector φ(x) that parameterizes the encoder circuit U_φ(x).
A variational circuit U_θ, parameterized by a vector θ, acts on the state prepared by the encoder circuit and possibly on an additional register of ancilla qubits, producing the state U_θ U_φ(x) |0⟩.
A set of observable quantities {⟨M_k⟩_(x,θ)}, k = 1, …, K, is estimated from the measurements.
These estimates are then mapped to the output space through a classical post-processing function f.
For a supervised model, this output is the forecast associated with input x.
Generative models can be expressed in this framework with small adaptations.
Due to the strong parallelism of quantum computing in Hilbert space, ordinarily intractable computational problems could be solved very efficiently by non-classical means. [https://www.sciencedirect.com/science/article/abs/pii/S0577907321001039]
Quantum machine learning in feature Hilbert spaces: https://arxiv.org/abs/1803.07128
Kernel methods use kernel functions to analyze patterns in high-dimensional feature spaces.
Support Vector Machines (SVMs) are a popular application for classification tasks in supervised learning, establishing decision boundaries to separate data into distinct classes.
Kernels are particularly useful when the data are not linearly separable in their original space: they implicitly map the data into a higher-dimensional feature space where a separating hyperplane can be found.
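To make the feature-Hilbert-space idea concrete, here is a toy quantum-style fidelity kernel for scalar data, k(x, x') = |⟨φ(x)|φ(x')⟩|², using a single-qubit RY encoding as an assumed feature map (for this map the kernel reduces to cos²((x − x')/2)):

```python
import math

def feature_state(x):
    """Encode a scalar into a single-qubit feature state RY(x)|0>."""
    return [math.cos(x / 2), math.sin(x / 2)]

def fidelity_kernel(x1, x2):
    """k(x, x') = |<phi(x)|phi(x')>|^2 -- similarity in the feature Hilbert space."""
    a, b = feature_state(x1), feature_state(x2)
    overlap = a[0] * b[0] + a[1] * b[1]
    return overlap ** 2

# Identical points are maximally similar (k ~ 1); the kernel decays with distance.
print(fidelity_kernel(0.3, 0.3), fidelity_kernel(0.0, math.pi))
```

A kernel machine such as an SVM never needs the feature states themselves, only the kernel values, which is what makes quantum feature maps compatible with classical kernel methods.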
General ML Risks:
https://hbr.org/2021/01/when-machine-learning-goes-off-the-rails
Technical Risk:
The algorithms typically rely on the probability of an event and may be wrong.
The operational environment may differ from the development environment.
The complexity of the overall systems it’s embedded in (see Agency Risk).
Agency Risk:
Risks stemming from things that aren’t under the control of a specific business or user.
Because machine learning is typically embedded within a complex system, it will often be unclear what led to a breakdown.
Moral Risk (Responsible Algorithm Design):
Products and services that make decisions autonomously will also need to resolve ethical dilemmas, raising regulatory and product-development challenges.
This is framed as "responsible algorithm design".
Noise: irregular fluctuations that accompany a transmitted signal and tend to obscure it, attributable to the discrete and probabilistic nature of physical phenomena and their interactions.
Quantum Barren Plateaus:
The magnitude of the gradients vanishes as the number of qubits increases. [1]
The gradients of a VQA do not vanish when the fidelity between the initial state and the state to be learned is bounded from below. [1]
Regardless of which optimization method is used, if the loss landscape is fairly flat it can be difficult for the method to determine which direction to search. This situation is called a barren plateau. For a wide class of reasonable parameterized quantum circuits, the probability that the gradient along any reasonable direction is non-zero to some fixed precision is exponentially small as a function of the number of qubits. [104]
One approach to overcoming this problem is to use structured initial guesses, such as those adopted in quantum simulation. Another is to treat the full quantum circuit as a sequence of shallow blocks, selecting some parameters randomly and choosing the rest so that every shallow block implements the identity, thereby restricting the effective depth. This is an area of current investigation. [104]
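A toy numerical illustration of the vanishing-gradient effect (a product-state model with RY rotations and a Z⊗…⊗Z observable, chosen as a simplifying assumption; the barren-plateau results cited above concern much more general circuits): for uniformly random angles, the average gradient magnitude with respect to one parameter decays like (2/π)ⁿ.

```python
import math
import random

def mean_grad_magnitude(n_qubits, samples=20000, seed=0):
    """For |psi> = RY(t_1) x ... x RY(t_n) |0...0> and M = Z x ... x Z,
    <M> = prod_i cos(t_i), so d<M>/dt_1 = -sin(t_1) * prod_{i>1} cos(t_i)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(samples):
        angles = [rng.uniform(0, 2 * math.pi) for _ in range(n_qubits)]
        grad = -math.sin(angles[0])
        for t in angles[1:]:
            grad *= math.cos(t)
        total += abs(grad)
    return total / samples

# The average |gradient| shrinks exponentially with qubit count: a flat
# landscape where gradient-based search struggles to pick a direction.
print(mean_grad_magnitude(2), mean_grad_magnitude(8))
```

With more qubits the printed value collapses toward zero, which is the flat-landscape behavior the barren-plateau results formalize for much richer circuit families.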