Quantum algorithms like VQE and QAOA were used to analyze the impact of COVID-19 on optimal portfolio selection across different industries. Three time periods were considered: pre-COVID, during COVID, and post-COVID. The results found that COVID disrupted optimal portfolios, with sectors like retail, technology, and automotive favored more pre-COVID, while oil/gas and airlines/hospitality were favored post-COVID. Quantum algorithms provided results comparable to classical methods like Markowitz for portfolio optimization under the changing market conditions of the pandemic.
Quantum computing uses quantum mechanics phenomena like superposition, entanglement, and interference to perform computation. According to Neven's Law, quantum computers are improving at a doubly exponential rate, gaining processing power far faster than classical computers. The basic unit of quantum information is the qubit, which can exist in superposition and represent '1' and '0' simultaneously. This lets quantum computers explore many computational paths at once, greatly increasing their speed over classical computers for certain problems.
The document provides an overview of fundamental concepts in quantum computing, including quantum properties like superposition, entanglement, and uncertainty principle. It discusses how quantum bits can represent more than classical bits by being in superpositions of states. Basic quantum gates like Hadamard, Pauli X, and phase shift gates are also introduced, along with pioneers in the field like Feynman, Deutsch, Shor, and Grover. Potential applications of quantum computing are listed.
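The gates named above act on a qubit's two-amplitude state vector by matrix multiplication. As a minimal NumPy sketch (purely illustrative, not from the document), the Hadamard, Pauli-X, and phase-shift gates can be written out and applied to the basis state |0⟩:

```python
import numpy as np

# Computational basis states |0> and |1> as column vectors.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# The single-qubit gates mentioned above.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard
X = np.array([[0, 1], [1, 0]], dtype=complex)                 # Pauli-X (bit flip)

def phase_shift(phi):
    """Phase-shift gate: leaves |0> unchanged, multiplies |1> by e^{i*phi}."""
    return np.array([[1, 0], [0, np.exp(1j * phi)]], dtype=complex)

# Hadamard puts |0> into an equal superposition of |0> and |1>.
psi = H @ ket0
probs = np.abs(psi) ** 2   # Born rule: measurement probabilities
print(probs)               # both outcomes occur with probability ~0.5

# Pauli-X acts as a NOT gate on basis states.
print(np.allclose(X @ ket0, ket1))
```

This is a simulation of the linear algebra only; a real device never exposes the amplitudes, only measurement outcomes.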
This document discusses the history and future of quantum computing. It begins with a brief history of quantum computing from the early-1900s work of Planck, Heisenberg, Schrödinger, and Einstein establishing quantum theory. It then discusses key developments like Feynman's proposal of a quantum computer in 1981 and Shor's algorithm in 1994. The document notes the first simple quantum computations in the late 1990s and D-Wave's first commercial 28-qubit quantum computer in 2007. It explains how quantum computers use superposition and parallelism differently than classical computers. While quantum computers may be useful for some problems like optimization, dynamic portfolio management, and trading strategy research, their advantages are still being explored.
Nanotechnology involves manipulating matter at the atomic scale, between 1 and 100 nanometers. It has applications in quantum computing, which operates at the quantum level using quantum bits that can represent both 1s and 0s through superposition and entanglement. While a quantum computer could solve certain problems much faster than classical computers by processing vast numbers of calculations simultaneously, quantum computers still face limitations such as unpredictability, difficulty retrieving data, and the need for near-total isolation from the environment to maintain fragile quantum states.
A quantum computer performs calculations based on quantum mechanics, the behavior of particles at the subatomic level. Unlike conventional computers that use bits of 0s and 1s, quantum computers use quantum bits or qubits that can be 0 and 1 simultaneously. This superposition allows quantum computers to manipulate enormous combinations of states at once, potentially performing calculations millions of times faster than classical computers. If built, quantum computers could revolutionize computing in the 21st century by tapping directly into the vast potential of quantum mechanics.
1) Quantum computing uses quantum mechanics and quantum states that can represent multiple values simultaneously, unlike classical computing which uses discrete binary states.
2) Some important concepts in quantum computing include quantum gates which perform operations on quantum bits (qubits), entanglement where quantum states of particles are linked, and the no-cloning theorem which prevents copying of unknown quantum states.
3) Quantum computing has potential applications in factoring integers, search algorithms, random number generation, and quantum key distribution, but challenges remain in building large-scale quantum computers and overcoming issues like short quantum coherence times.
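The entanglement described in point 2 can be illustrated with a small statevector sketch (NumPy, for illustration only): applying a Hadamard to the first qubit of |00⟩ and then a CNOT produces a Bell state whose two measurement outcomes, 00 and 11, are perfectly correlated.

```python
import numpy as np

# Single-qubit Hadamard, identity, and the two-qubit CNOT (control = qubit 0).
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
I = np.eye(2, dtype=complex)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Start in |00>, apply H to the first qubit, then CNOT.
ket0 = np.array([1, 0], dtype=complex)
psi = np.kron(ket0, ket0)
psi = CNOT @ (np.kron(H, I) @ psi)

# Resulting Bell state (|00> + |11>)/sqrt(2): outcomes 00 and 11 each
# occur with probability 1/2, while 01 and 10 never occur.
probs = np.abs(psi) ** 2
print(probs)
```

Measuring one qubit of this state immediately fixes the other, which is the "linked states" behavior the summary refers to.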
Quantum Computers: New Generation of Computers, Part 1, by Prof. Lili Saghafi
This lecture is intended to introduce the concepts and terminology used in Quantum Computing, to provide an overview of what a Quantum Computer is, and why you would want to program one.
The material uses very high-level concepts and is designed to be accessible to both technical and non-technical audiences.
Some background in physics, mathematics and programming is useful to help understand the concepts presented.
- Exploits quantum mechanical effects
- Built around "qubits" rather than "bits"
- Operates in an extreme environment
- Enables quantum algorithms to solve very hard problems
- A good presentation for beginners to learn about quantum computers.
- Quantum computers as a possible solution for present-day computing problems.
- Quantum computers as a promising tool for AI.
We discuss the emerging threat and implications of quantum computing technology on the security of cryptosystems currently deployed in applications, and why system designers should consider addressing this risk already in the near term. We then discuss an overview of the current approaches for building quantum safe cryptosystems and their security and performance aspects. We conclude with a glimpse at the state of the art and research challenges in the area of quantum-safe cryptography, including the design of more advanced quantum-safe cryptographic protocols, such as privacy-preserving cryptocurrencies.
Quantum computing uses quantum bits (qubits) that can exist in superpositions of states rather than just 1s and 0s. This allows quantum computers to perform exponentially more calculations in parallel than classical computers. Some of the main challenges to building quantum computers are preventing qubit decoherence from environmental interference, developing effective error correction methods, and observing outputs without corrupting data. Quantum computers may one day be able to break current encryption methods and solve optimization problems much faster than classical computers.
Moore's law, which states that the number of transistors on an integrated circuit doubles roughly every two years at the same cost, is running out of steam, and the question is what might replace it. There is still room for some expansion: larger smartphones and tablets and improvements in hardware efficiency are picking up some of the slack as it becomes harder and harder to fit more transistors on a dense integrated circuit. Eventually, though, Moore's law must come to an end, because it is governed by the physical limits of the universe.
To solve for the future we need to design a new type of computer, aptly named the quantum computer, which uses the laws of quantum mechanics to create exponentially greater processing power and a new unit of information, the qubit, rather than the bit. Scientists have already built basic quantum computers that can perform certain calculations, but a practical quantum computer is still years away. In this presentation you'll learn what a quantum computer is and what it will be used for in the next era of computing.
Quantum computing is a new paradigm that utilizes quantum mechanics phenomena like superposition and entanglement. It has the potential to solve certain problems exponentially faster than classical computers by using qubits that can be in superposition of states. Some key applications are factoring, simulation, and optimization problems. However, building large-scale quantum computers faces challenges like preventing decoherence of qubits and developing error correction techniques. While still in development, quantum computing could revolutionize fields like encryption, communication, and material science in the future through a hybrid model combining classical and quantum processing.
A Short Introduction to Quantum Computers and the Computation of Quantum Mechanics.
Nowadays we work on classical computers that use bits, each of which is either 0 or 1, but quantum computers work with qubits, which can be 0, 1, or both 0 and 1 at the same time.
Quantum computing and quantum communications utilize principles of quantum mechanics such as superposition and entanglement to process and transmit information in novel ways. Current research is exploring how to build reliable quantum computers and networks using technologies like ion traps, quantum dots, and optical methods. While still in early stages, quantum information science shows promise for solving computationally difficult problems in fields such as artificial intelligence, cybersecurity, and drug discovery. Pioneering work by groups such as D-Wave, IBM, and research teams in China is helping advance our understanding of how to harness quantum effects for powerful new computing and communication applications.
The document discusses quantum computing and its potential impacts. It notes that current quantum computers have around 50-70 qubits, which is small compared to classical computers, and errors still need to be addressed. Quantum computers may achieve "quantum supremacy" by solving problems that classical computers cannot. One potential impact area is cryptography - most public-key encryption relies on problems like factoring or discrete logs, which can be broken by Shor's algorithm on a large quantum computer. This is not an imminent threat but could affect secure documents stored now. Post-quantum cryptography aims to base encryption on alternative hard problems not vulnerable to quantum attacks.
This document provides an overview of key mathematical concepts relevant to machine learning, including linear algebra (vectors, matrices, tensors), linear models and hyperplanes, dot and outer products, probability and statistics (distributions, samples vs populations), and resampling methods. It also discusses solving systems of linear equations and the statistical analysis of training data distributions.
This document discusses quantum neural networks. It begins by defining artificial neural networks as interconnected processing elements that process information through dynamic responses to external inputs. The document then provides more details on the basics of neural networks, including their typical layered organization and use of weighted connections and activation functions. It also discusses how neural networks differ from conventional computing by operating in parallel rather than sequentially, and provides some examples of neural network applications and limitations.
1) Quantum computers operate using quantum bits (qubits) that can exist in superpositions of states rather than just 1s and 0s like classical bits.
2) Keeping qubits coherent and isolated from the external environment is extremely challenging as interaction causes decoherence within nanoseconds to seconds.
3) While prototypes of 5-7 qubit quantum computers exist, scaling them up to practical sizes of 50-100 qubits or more to outperform classical computers remains an outstanding challenge due to decoherence issues.
This document discusses quantum computing applications in the financial sector. It describes how quantum computers work using qubits that can be in multiple states at once, allowing for greater processing power. Examples of applications include improving traffic management through quantum machine learning, predicting crimes using social media analysis, and addressing the COVID pandemic through computational chemistry. Major companies developing quantum computing include Microsoft and Google, the latter of which has claimed quantum supremacy. The document also discusses how quantum cryptography can enhance banking security through quantum key distribution and the ability to detect any third-party interference.
An overview of quantum computing, with its features, capabilities and types of problems it can solve. Also covers some current and future implementations of quantum computing, and a view of the patent landscape.
The document provides an overview of quantum computing concepts and the IBM Quantum Experience platform. It begins with a short history of quantum computing developments from the 1930s to present. It then explains basic quantum concepts like qubits, superposition, entanglement, and quantum gates. The document outlines requirements for building a quantum computer, including well-defined qubits, initialization, gates, coherence times, and measurement. It describes the IBM Quantum Experience as a platform that provides access to an actual quantum processor via the cloud, along with simulation and tutorial capabilities. Users can design circuits using a graphical Quantum Composer interface and run algorithms on real quantum hardware or simulation.
This document provides an overview of quantum computing, including:
- The current state of quantum computing technology, which involves noisy intermediate-scale quantum computers with 10s to 100s of qubits and moderate error rates.
- The difference between quantum and classical information, noting that quantum information uses superposition and entanglement, exponentially increasing computational power.
- An example quantum algorithm, Bernstein-Vazirani, which can solve a problem in one query that classical computers require n queries to solve, demonstrating quantum computing's potential computational advantages.
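The Bernstein-Vazirani circuit can be simulated directly on a statevector. In this minimal NumPy sketch (the function name and structure are our own, for illustration): Hadamards create a uniform superposition, a phase oracle flips the sign of amplitudes where s·x is odd, and a final Hadamard layer concentrates all amplitude on the hidden string s, so a single oracle query suffices.

```python
import numpy as np

def bernstein_vazirani(s):
    """Statevector simulation of Bernstein-Vazirani for hidden bit string s."""
    n = len(s)
    N = 2 ** n
    # Uniform superposition: H on every qubit applied to |0...0>.
    psi = np.full(N, 1 / np.sqrt(N))
    # Phase oracle: flip the sign of amplitude x when the dot product s.x is odd.
    s_int = int(s, 2)
    for x in range(N):
        if bin(x & s_int).count("1") % 2:
            psi[x] = -psi[x]
    # Final layer of Hadamards = n-fold Walsh-Hadamard transform.
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    Hn = np.array([[1.0]])
    for _ in range(n):
        Hn = np.kron(Hn, H)
    psi = Hn @ psi
    # All amplitude now sits on |s|; measurement returns s with certainty.
    outcome = int(np.argmax(np.abs(psi) ** 2))
    return format(outcome, f"0{n}b")

print(bernstein_vazirani("1011"))  # prints "1011" after one oracle query
```

A classical algorithm querying the same oracle bit by bit would need n queries to recover the n-bit string, which is the advantage the summary describes.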
This document discusses quantum error correction. It explains that while quantum states and operators are theoretically perfect, in reality approximations must be made which can cause errors. Quantum error correction deals with these imperfections. It describes different types of quantum errors and discusses barriers to quantum error correction, such as the no-cloning theorem. The document introduces classical error correction techniques and explains how similar techniques can be applied to encode quantum states to correct bit flip and phase flip errors by measuring the parity of qubits without collapsing their superpositions. Specific quantum error correcting codes are presented, including Shor's code which can correct both types of errors.
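The parity-check idea behind the three-qubit bit-flip code can be sketched classically (basis states only, no superpositions; all names here are illustrative). The two parity checks compare neighboring qubits, which locates a single bit-flip error without ever reading the encoded logical value, mirroring how the quantum code measures parities without collapsing a superposition.

```python
def encode(bit):
    """Three-qubit repetition encoding of a logical bit."""
    return [bit, bit, bit]

def syndrome(q):
    """Parity checks Z1Z2 and Z2Z3: compare neighbors, never the bit itself."""
    return (q[0] ^ q[1], q[1] ^ q[2])

def correct(q):
    """Use the syndrome to locate and undo a single bit-flip error."""
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}  # syndrome -> errored qubit
    s = syndrome(q)
    if s in flip:
        q[flip[s]] ^= 1
    return q

codeword = encode(1)
codeword[0] ^= 1           # a single bit-flip error on the first qubit
print(syndrome(codeword))  # prints (1, 0): error located, logical bit unread
print(correct(codeword))   # prints [1, 1, 1]: codeword restored
```

The real quantum code adds phase-flip protection (and Shor's nine-qubit code handles both error types), but the syndrome logic is the same parity idea shown here.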
With the introduction of quantum computing on the horizon, computer security organizations are stepping up research and development to defend against a new kind of computing power. Quantum computers pose a very real threat to today's global information technology infrastructure. Many security implementations in use are based on the difficulty, for modern-day computers, of factoring large integers. Using a specialized algorithm such as mathematician Peter Shor's, a quantum computer can factor large integers in polynomial time versus classical computing's sub-exponential time. This theoretical exponential increase in computing speed has prompted computer security experts around the world to begin preparing by devising new and improved cryptography methods. If the proper measures are not in place by the time full-scale quantum computers are produced, the world's governments and major enterprises could suffer security breaches and the loss of massive amounts of encrypted data.
Pattern recognition is the branch of machine learning and computer science that deals with regularities and patterns in data, which can then be used to classify and categorize the data with the help of a pattern recognition system.
“The assignment of a physical object or event to one of several pre-specified categories”-- Duda & Hart
A pattern recognition system is responsible for finding patterns and similarities in a given problem/data space, which can then be used to generate solutions to complex problems effectively and efficiently.
Certain problems that can be solved by humans can also be solved by machines using this process.
An Introduction to Quantum Machine Learning, by Colleen Farrelly
Very basic introduction to quantum computing given at Indaba Malawi 2022. Overviews some basic hardware in classical and quantum computing, as well as a few quantum machine learning algorithms in use today. Resources for self-study provided.
Quantum computing has the potential to solve certain problems exponentially faster than classical computers by exploiting principles like superposition, entanglement, and interference. Current quantum computers with 50-100 qubits operate in the Noisy Intermediate-Scale Quantum (NISQ) era and use algorithms like the Variational Quantum Eigensolver (VQE) that are hybrid quantum-classical and incorporate techniques like quantum error mitigation. Major players in the field include IBM, Google, and Rigetti who are developing quantum hardware and software for applications in optimization, simulation, and machine learning.
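The hybrid quantum-classical loop of VQE can be sketched in miniature: a classical optimizer tunes the parameter of a small ansatz so as to minimize the measured energy of a Hamiltonian. The toy single-qubit Hamiltonian and one-parameter ansatz below are illustrative choices of ours, not from the document; a real VQE estimates the energy from repeated measurements on hardware rather than from the exact statevector.

```python
import numpy as np

# Pauli matrices and a toy single-qubit Hamiltonian H = 0.5*Z + 0.8*X.
Z = np.array([[1, 0], [0, -1]], dtype=float)
X = np.array([[0, 1], [1, 0]], dtype=float)
Ham = 0.5 * Z + 0.8 * X

def ansatz(theta):
    """One-parameter trial state Ry(theta)|0>."""
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def energy(theta):
    """Expectation value <psi|H|psi> -- the quantity the quantum device estimates."""
    psi = ansatz(theta)
    return psi @ Ham @ psi

# Classical outer loop: scan the parameter and keep the lowest energy found.
thetas = np.linspace(0, 2 * np.pi, 1000)
best = min(thetas, key=energy)

print(round(energy(best), 3))                 # prints -0.943
print(round(-np.sqrt(0.5**2 + 0.8**2), 3))    # exact ground energy, also -0.943
```

The grid scan stands in for the classical optimizer (gradient descent, SPSA, etc.); the essential pattern is the same: the quantum side evaluates the energy, the classical side updates the parameters.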
The document discusses applications of superconductor materials and devices in quantum information science. It covers 5 topics: 1) an overview of the quantum information landscape, 2) macroscopic quantum phenomena in superconductor devices and superconductor qubits, 3) the transmon qubit which is a leading qubit platform, 4) topological superconducting qubits based on Majorana fermion states, and 5) S-TI-S Josephson junctions which are a compelling qubit platform. Superconductivity is expected to play a major role in developing qubit devices and quantum circuits.
-It is a good ppt for a beginner to learn about Quantum
Computer.
-Quantum computer a solution for every present day computing
problems.
-Quantum computer a best solution for AI making
We discuss the emerging threat and implications of quantum computing technology on the security of cryptosystems currently deployed in applications, and why system designers should consider addressing this risk already in the near term. We then discuss an overview of the current approaches for building quantum safe cryptosystems and their security and performance aspects. We conclude with a glimpse at the state of the art and research challenges in the area of quantum-safe cryptography, including the design of more advanced quantum-safe cryptographic protocols, such as privacy-preserving cryptocurrencies.
Quantum computing uses quantum bits (qubits) that can exist in superpositions of states rather than just 1s and 0s. This allows quantum computers to perform exponentially more calculations in parallel than classical computers. Some of the main challenges to building quantum computers are preventing qubit decoherence from environmental interference, developing effective error correction methods, and observing outputs without corrupting data. Quantum computers may one day be able to break current encryption methods and solve optimization problems much faster than classical computers.
After Moore’s law-which states that the number of
microprocessors/transistors on an integrated circuit doubles
once every two years at the same cost—is running out of
steam. The question is what might replace it
Gordon Moore’s Law benefits for some degree of expansion.
Already larger smartphones and tablets and improvements in
hardware efficiency are picking up some of the slack as it
becomes harder and harder to fit more transistors on a dense
integrated circuit.
So the Moore’s Law must come to an end because it is a
physical phenomenon governed by the physical limits of the
universe.
To solve for the future we need to design a new type of
computer which, aptly named “Quantum computers”, utilizes
the laws of quantum mechanics to create exponentially greater
processing power and uses a new unit of information called a “
Qubit ”, rather than a bit.
Scientists have already built basic Quantum computers that can
perform certain calculations; but a practical quantum computer
is still years away. In this presentation you’ll learn what a
quantum computer is and for what it’ll be used in the next era of
computing.
Quantum computing is a new paradigm that utilizes quantum mechanics phenomena like superposition and entanglement. It has the potential to solve certain problems exponentially faster than classical computers by using qubits that can be in superposition of states. Some key applications are factoring, simulation, and optimization problems. However, building large-scale quantum computers faces challenges like preventing decoherence of qubits and developing error correction techniques. While still in development, quantum computing could revolutionize fields like encryption, communication, and material science in the future through a hybrid model combining classical and quantum processing.
A Shore Introduction to Quantum Computer and the computation of ( Quantum Mechanics),
Nowadays we work on classical computer that work with bits which is either 0s or 1s, but Quantum Computer work with qubits which is either 0s or 1s or 0 and 1 in the same time.
Quantum computing and quantum communications utilize principles of quantum mechanics such as superposition and entanglement to process and transmit information in novel ways. Current research is exploring how to build reliable quantum computers and networks using technologies like ion traps, quantum dots, and optical methods. While still in early stages, quantum information science shows promise for solving computationally difficult problems in fields such as artificial intelligence, cybersecurity, and drug discovery. Pioneering work by groups like D-Wave, IBM, and China are helping advance our understanding of how to harness quantum effects for powerful new computing and communication applications.
The document discusses quantum computing and its potential impacts. It notes that current quantum computers have around 50-70 qubits, which is small compared to classical computers, and errors still need to be addressed. Quantum computers may achieve "quantum supremacy" by solving problems that classical computers cannot. One potential impact area is cryptography - most public-key encryption relies on problems like factoring or discrete logs, which can be broken by Shor's algorithm on a large quantum computer. This is not an imminent threat but could affect secure documents stored now. Post-quantum cryptography aims to base encryption on alternative hard problems not vulnerable to quantum attacks.
This document provides an overview of key mathematical concepts relevant to machine learning, including linear algebra (vectors, matrices, tensors), linear models and hyperplanes, dot and outer products, probability and statistics (distributions, samples vs populations), and resampling methods. It also discusses solving systems of linear equations and the statistical analysis of training data distributions.
This document discusses quantum neural networks. It begins by defining artificial neural networks as interconnected processing elements that process information through dynamic responses to external inputs. The document then provides more details on the basics of neural networks, including their typical layered organization and use of weighted connections and activation functions. It also discusses how neural networks differ from conventional computing by operating in parallel rather than sequentially, and provides some examples of neural network applications and limitations.
1) Quantum computers operate using quantum bits (qubits) that can exist in superpositions of states rather than just 1s and 0s like classical bits.
2) Keeping qubits coherent and isolated from the external environment is extremely challenging as interaction causes decoherence within nanoseconds to seconds.
3) While prototypes of 5-7 qubit quantum computers exist, scaling them up to practical sizes of 50-100 qubits or more to outperform classical computers remains an outstanding challenge due to decoherence issues.
This document discusses quantum computing applications in the financial sector. It describes how quantum computers work using qubits that can be in multiple states at once, allowing for greater processing power. Examples of applications include improving traffic management through quantum machine learning, predicting crimes using social media analysis, and addressing the COVID pandemic through computational chemistry. Major companies developing quantum computing include Microsoft, Google, which has achieved quantum supremacy. The document also discusses how quantum cryptography can enhance banking security through quantum key distribution and the ability to detect any third party interference.
An overview of quantum computing, with its features, capabilities and types of problems it can solve. Also covers some current and future implementations of quantum computing, and a view of the patent landscape.
The document provides an overview of quantum computing concepts and the IBM Quantum Experience platform. It begins with a short history of quantum computing developments from the 1930s to present. It then explains basic quantum concepts like qubits, superposition, entanglement, and quantum gates. The document outlines requirements for building a quantum computer, including well-defined qubits, initialization, gates, coherence times, and measurement. It describes the IBM Quantum Experience as a platform that provides access to an actual quantum processor via the cloud, along with simulation and tutorial capabilities. Users can design circuits using a graphical Quantum Composer interface and run algorithms on real quantum hardware or simulation.
This document provides an overview of quantum computing, including:
- The current state of quantum computing technology, which involves noisy intermediate-scale quantum computers with 10s to 100s of qubits and moderate error rates.
- The difference between quantum and classical information, noting that quantum information uses superposition and entanglement, exponentially increasing computational power.
- An example quantum algorithm, Bernstein-Vazirani, which can solve a problem in one query that classical computers require n queries to solve, demonstrating quantum computing's potential computational advantages.
This document discusses quantum error correction. It explains that while quantum states and operators are theoretically perfect, in reality approximations must be made which can cause errors. Quantum error correction deals with these imperfections. It describes different types of quantum errors and discusses barriers to quantum error correction, such as the no-cloning theorem. The document introduces classical error correction techniques and explains how similar techniques can be applied to encode quantum states to correct bit flip and phase flip errors by measuring the parity of qubits without collapsing their superpositions. Specific quantum error correcting codes are presented, including Shor's code which can correct both types of errors.
With the introduction of quantum computing on the horizon, computer security organizations are stepping up research and development to defend against a new kind of computer power. Quantum computers pose a very real threat to the global information technology infrastructure of today. Many security implementations in use based on the difficulty for modern-day computers to perform large integer factorization. Utilizing a specialized algorithm such as mathematician Peter Shor’s, a quantum computer can compute large integer factoring in polynomial time versus classical computing’s sub-exponential time. This theoretical exponential increase in computing speed has prompted computer security experts around the world to begin preparing by devising new and improved cryptography methods. If the proper measures are not in place by the time full-scale quantum computers produced, the world’s governments and major enterprises could suffer from security breaches and the loss of massive amounts of encrypted data
Pattern Recognition is the branch of machine learning and computer science that deals with regularities and patterns in data, which can then be used to classify and categorize the data with the help of a Pattern Recognition System.
“The assignment of a physical object or event to one of several pre-specified categories” -- Duda & Hart
A Pattern Recognition System is responsible for finding patterns and similarities in a given problem/data space, which can then be used to generate solutions to complex problems effectively and efficiently.
Certain problems that can be solved by humans can also be solved by machines using this process.
This is a seminar on Quantum Computing given on 9th March 2017 at CIME, Bhubaneswar by me (2nd year MCA).
Video at - https://youtu.be/vguxg0RYg7M
PDF at - http://www.slideshare.net/deepankarsandhibigraha/quantum-computing-73031375
2. • The quantum world exhibits the unique characteristics of superposition of states, interference, and entanglement, which are not observed in the classical sense
• Superposition of states – if |ψ₁⟩ and |ψ₂⟩ are two states of a quantum system, then α|ψ₁⟩ + β|ψ₂⟩ is also an allowed state, with |α|² + |β|² = 1
• Quantum interference is the result of addition or subtraction of the amplitudes arising from the wave nature of the particles
• Quantum entanglement is the non-locality experienced in measuring or observing the particles
• These are counter-intuitive and strange to the humans of today
• Richard Feynman said in 1985 that if we could compute using atoms, we would compute as nature computes
Unique Nature of Quantum Mechanics
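The superposition rule above can be checked numerically. A minimal NumPy sketch (my illustration, not part of the slides), representing a qubit as a two-component complex vector over the basis {|0⟩, |1⟩}:

```python
import numpy as np

# Basis states |0> and |1> as complex column vectors
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# Superposition: alpha*|0> + beta*|1> is an allowed state
# as long as |alpha|^2 + |beta|^2 = 1
alpha, beta = 1 / np.sqrt(2), 1j / np.sqrt(2)
psi = alpha * ket0 + beta * ket1

# Verify the normalization condition |alpha|^2 + |beta|^2 = 1
norm = np.abs(psi[0]) ** 2 + np.abs(psi[1]) ** 2
print(round(norm, 10))  # 1.0
```

The same two amplitudes fix the measurement statistics: |α|² and |β|² are the probabilities of reading out 0 and 1, which is why they must sum to one.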
3. • Moore’s law projects that the number of transistors in integrated circuits should double every two years
• By the same trend, the size of the transistors will be reaching that of molecules and atoms in the near future
• And when it does, quantum effects such as tunneling and loss of coherence will take over and detrimentally affect the efficiency of computing
• Therefore, in order to keep scaling, it is compelling to look for different ways of computation
Source : Prof. Christopher Monroe , Univ. of Maryland
(https://www.youtube.com/watch?v=Y3mcgq3_yEY)
Need for Quantum Computing
4. • The bit is the fundamental unit of classical computing and it can take either of
two states: a 1 or 0, an On or Off, True or False
• The quantum bit, or qubit, is the fundamental unit of quantum information
processing
• An n-qubit register is described by 2ⁿ amplitudes. For e.g., a three-qubit state has 8
different probability amplitudes, which means 8 complex numbers of information can
be stored for a three-input process:
|ψ⟩ = α₀₀₀|000⟩ + α₀₀₁|001⟩ + α₀₁₀|010⟩ + α₀₁₁|011⟩ + α₁₀₀|100⟩ + α₁₀₁|101⟩
+ α₁₁₀|110⟩ + α₁₁₁|111⟩, which leads to quantum parallelism
• The QC derives its supremacy over classical computing based on this exponential
nature of processing/computing
• When the no. of qubits N = 300, we have more amplitudes to
store/process/compute with than there are particles in the universe
Quantum Computing
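The 2ⁿ-amplitude picture above can be sketched directly; a minimal NumPy illustration (my own construction, not from the slides) of a three-qubit register holding 8 amplitudes:

```python
import numpy as np

# Sketch: an n-qubit register is described by 2**n complex amplitudes.
n = 3
dim = 2 ** n                        # 8 amplitudes for three qubits

# Equal superposition: every basis state |000>..|111> gets amplitude 1/sqrt(8)
psi = np.full(dim, 1 / np.sqrt(dim), dtype=complex)

probs = np.abs(psi) ** 2            # measurement probabilities
print(len(psi))                     # 8
print(round(probs.sum(), 10))       # 1.0 - probabilities sum to one
```

Doubling n doubles `dim`, which is the exponential growth the slide refers to: at n = 300, `dim = 2**300` amplitudes.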
5. • The 2ⁿ number of states is the result of superposition of states
• Allowing interference to happen between the probability amplitudes of the
qubits/states (by manipulating the input states through quantum gates),
obtaining results concentrated on a few tens to a few hundreds/thousands of
states (sparse, though still dependent on the inputs), and repeating the
experiment (shots) to get a statistical estimate/distribution harnesses the
concept of quantum interference for computation
• Quantum entanglement is the correlation between two different qubits, which
means the two members of a pair exist in a single quantum state
• Observing the state of one of the qubits instantaneously changes the state of
the other one in a predictable manner, even at a long distance
• Realizations: superconducting circuits, trapped ions, silicon quantum dots and
diamond vacancies
Quantum Mechanical Resources for Computing
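The "shots" idea above can be illustrated classically; a hedged sketch (amplitudes are made up for illustration) of estimating an output distribution by repeated sampling:

```python
import numpy as np

# Sketch: repeating a circuit ("shots") yields a statistical estimate of the
# output distribution determined by the final amplitudes.
rng = np.random.default_rng(0)

amps = np.array([0.9, 0.1, 0.3, 0.3], dtype=complex)  # assumed final amplitudes
amps /= np.linalg.norm(amps)          # normalize so probabilities sum to 1
probs = np.abs(amps) ** 2             # Born rule: P(outcome) = |amplitude|^2

shots = 10_000
outcomes = rng.choice(len(probs), size=shots, p=probs)
counts = np.bincount(outcomes, minlength=len(probs))
estimate = counts / shots             # statistical estimate of probs
print(np.round(estimate, 2))
```

More shots tighten the estimate; the sparse concentration of amplitude on a few states is what makes this sampling step informative.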
6. Quantum mechanical states are extremely fragile and require near absolute isolation from
the environment.
Creating such conditions requires temperatures near absolute zero and shielding from
radiation.
The challenge only increases with increasing size (number of qubits and the length of time
they must be coherent).
Thus, building quantum computers is expensive and difficult.
Requires contributions from many different fields, such as the design of quantum
algorithms and error correcting codes, the architecture design of the computer itself, and
the development of more reliable quantum devices.
Development of quantum versions of devices, architectures, languages, compilers, and
layers of abstraction.
Challenges and Opportunities
7. Quantum Algorithms – General Principle
• Start with an equally weighted superposition of states
• A single pulse on a single qubit acts on the massive superposition of states
• The pulse leads to interference, which in turn changes the probability
amplitudes
• Coupled-qubit gates (e.g. CNOT) act on a pair of qubits; again the pulse
leads to interference, which changes the probability amplitudes
• The probability amplitudes constructively interfere on one state only,
and that state is the result
8. Computational Complexity
The complexity of a problem gives information about how long it takes to solve it.
T(n) denotes the time, or number of steps, required to solve a problem, with n being the
length (number of digits) of the input. Complexity is basically knowing how T(n) grows as
n grows. There broadly exist two classes of complexity, namely P and NP.
If the time scales as T(n) = k·nᵖ, where k and p are positive numbers, then the problem can be
solved in polynomial time.
If the time scales as T(n) = k·cⁿ, where k and c are positive numbers with c > 1, then the
problem is considered to be solvable in exponential time. Note: n is an
exponent here.
In classical computing, problems are tractable if the time grows polynomially and intractable
if it grows exponentially; these problems are called easy and hard respectively.
If a problem can be solved in polynomial time, it is considered to belong to a class known
as P, while NP (non-deterministic polynomial) is the class of problems whose solutions can
be verified in polynomial time.
Generally, it is believed that P is not equal to NP and that there are problems in NP but not in
P, some of which may be solvable by quantum computers in polynomial time.
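The divergence between the two scaling regimes above shows up even at small n; a tiny sketch (constants k, p, c chosen arbitrarily for illustration):

```python
# Sketch: step counts for polynomial T(n) = k * n**p versus
# exponential T(n) = k * c**n as the input size n grows.
k, p, c = 1, 2, 2

for n in (10, 20, 30):
    poly = k * n ** p       # tractable: grows as n^2
    expo = k * c ** n       # intractable: doubles with every extra input bit
    print(n, poly, expo)
# 10    100        1024
# 20    400     1048576
# 30    900  1073741824
```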
9. Motivation for Quantum Algorithms
Quantum algorithms are better than classical algorithms at specific tasks.
Identify what kinds of problems can be solved more efficiently by a QC.
Speedups: how a QC performs in terms of scaling with the size of the problem. This gives an idea of how big
the speedups will be as quantum hardware improves.
Also, this metric is hardware agnostic and scales similarly across different hardware platforms such as
ion trap, superconducting or photonic etc.
Exponential speedups can offer practical speedups even at smaller problem sizes and with small
quantum computers (NISQ?)
Polynomial speedups may require medium- to large-scale QCs.
Both polynomial and exponential speedups are useful.
10. Quantum Algorithms
Grover’s algorithm can provide a quadratic speedup:
Square root of 1 million (1,000,000) = 1,000
Steps(Quantum Algo) ≈ SQRT(Steps(Classical Algo))
Although this is only a polynomial speedup, it is a huge reduction in time.
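The quadratic speedup above amounts to a simple count; a minimal sketch (idealized iteration count, ignoring constant factors and noise) comparing query counts for unstructured search over N items:

```python
import math

# Sketch: a classical scan of N unsorted items needs ~N lookups;
# Grover's algorithm needs ~(pi/4)*sqrt(N) iterations (idealized).
N = 1_000_000
classical_queries = N
grover_iterations = math.floor(math.pi / 4 * math.sqrt(N))

print(classical_queries)   # 1000000
print(grover_iterations)   # 785 - roughly sqrt(N), a quadratic speedup
```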
Provable and Heuristic Quantum Algorithms
Provable: mathematically shown to outperform its classical counterparts, e.g. Shor’s
factorization, Grover’s search. All provable QAs require medium- to large-scale QCs or fault-tolerant
QCs. The challenge is to build the QC.
Most of the originally proposed quantum algorithms require millions of physical qubits to
incorporate these QEC techniques successfully.
Heuristic: no mathematical proof that quantum can outperform; driven by intuition. We do
not know in advance how it will perform, but it can be run on NISQ devices. Need to test with real-life
data.
11. NISQ
In 2017, John Preskill coined the term Noisy Intermediate-Scale Quantum (NISQ) to
denote the present era of quantum computing.
Intermediate scale refers to the no. of qubits available, which ranges from 50 to a few hundred
qubits.
Noisy refers to not-so-robust qubits, meaning the present generation of qubits is highly prone to
decoherence.
In the NISQ era of computing, an efficient program is required to mitigate errors and extract reliable
results.
Quantum algorithms such as Shor’s prime factorization and the Deutsch-Jozsa algorithm operate under the
assumption that the qubits are robust and therefore do not incorporate any error mitigation
techniques.
Fault tolerance is the property that enables a system to continue operating properly in the event of
faults within some of its components.
FTQC refers to the framework of ideas that allow qubits to be protected from quantum errors
introduced by poor control or environmental interactions (Quantum Error Correction, QEC), and the
appropriate design of quantum circuits to implement both QEC and encoded logic operations in a way
that avoids these errors cascading through quantum circuits.
12. NISQ
Algorithms and tools have been developed specifically for near-term quantum computers:
Variational Quantum Algorithms (VQAs): hybrid quantum-classical approach with potential noise
resilience. In the NISQ era, all known quantum algorithms of this kind are heuristic in nature.
Quantum Error Mitigation (QEM): techniques to reduce computational errors and then evaluate
accurate results from noisy quantum circuits.
Quantum Circuit Compilation (QCC): transforms a nonconforming quantum circuit into an executable
circuit on the target quantum platform according to its constraints.
Benchmarking Protocols: evaluate the basic performance of a quantum computer and even its
capacity to solve real-world problems.
Classical Simulation: classical simulation of quantum circuits is one of the core tools for designing
quantum algorithms and validating quantum devices.
VQAs, QEM, QCC, and quantum benchmarking
may all require the help of classical simulation
for verification or algorithm design.
The main goal of the NISQ era is to extract the maximum quantum
computational power from current devices while developing
techniques that may also be suited to the
long-term goal of FTQC.
14. • In 2017, John Preskill coined the term Noisy Intermediate Scale Quantum
Computing (NISQ) to denote the present era of quantum computing.
• Noisy – not-so-robust qubits.
• Intermediate – 50 to a few hundred qubits.
• Peruzzo, A., McClean, J., Shadbolt, P. et al. A variational eigenvalue solver on a
photonic quantum processor. Nat Commun 5, 4213 (2014).
https://doi.org/10.1038/ncomms5213
• Fedorov, D.A., Peng, B., Govind, N. et al. VQE method: a short survey and recent
developments. Mater Theory 6, 2 (2022). https://doi.org/10.1186/s41313-021-
00032-6
Variational Quantum Eigensolver (VQE)
15. • Prepare a variational quantum circuit representing the chemical problem – Qn.
Comp
• Measure the circuit and calculate the expectation value – Qn. Comp
• Update the variational parameters using an optimization algorithm – Classical Computer
• Measure the circuit again – Qn. Comp
• Repeat until the expectation value converges, then stop the process
Variational Quantum Eigensolver (VQE)
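The loop above can be mimicked entirely classically; a hedged sketch (the 2×2 Hamiltonian and one-parameter ansatz are toy choices of mine, and a brute-force parameter scan stands in for the classical optimizer):

```python
import numpy as np

# Hedged sketch of the VQE idea on a classical simulator:
# minimize <psi(theta)|H|psi(theta)> over a one-parameter ansatz.
H = np.array([[1.0, 0.5],
              [0.5, -1.0]])                  # toy 1-qubit Hamiltonian

def ansatz(theta):
    # |psi(theta)> = cos(theta/2)|0> + sin(theta/2)|1>  (Ry rotation on |0>)
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def expectation(theta):
    psi = ansatz(theta)
    return psi @ H @ psi                     # <psi|H|psi>, real for real psi

# Crude stand-in for the classical optimizer: scan theta and keep the best.
thetas = np.linspace(0, 2 * np.pi, 1000)
best = min(expectation(t) for t in thetas)

exact = np.linalg.eigvalsh(H).min()          # true ground-state energy
print(round(best, 3), round(exact, 3))       # VQE estimate matches exact value
```

In a real VQE run the expectation value comes from measuring a quantum circuit, and a gradient-based classical optimizer replaces the scan.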
16. Quantum Computing for Finance
• Finance sector encounters several computationally challenging problems such as asset
portfolio optimization, stock market prediction, arbitrage opportunities, fraud detection,
credit scoring etc.
• In a world where a huge volume of data is generated per second, QC promises a potential
reduction in time and memory space for these computational tasks.
• Broadly, there are three classes of problems in finance:
• Optimization: Problems that scale exponentially in time required can be best solved
using quantum optimization. Eg. portfolio optimization, arbitrage opportunity,
optimal feature selection for credit scoring.
• Machine Learning: Highly complex data structures hinder classification or prediction
accuracy. The multidimensional data modeling capacity of quantum
computers may allow us to find better patterns, with increasing accuracy.
E.g. Anomaly detection, Quantum NLP for virtual agents, Risk Assessment
• Simulation: Time constraints limit performing sufficient scenario tests to find the best
possible solution. Efficient sampling methods leveraging quantum computers may
require fewer samples to reach a more accurate solution faster.
E.g. Pricing of financial derivatives, risk analysis.
17. Algorithms can improve computational efficiency, accuracy, and addressability for
defined use cases
Financial services focus areas and algorithms
Ref: Quantum Computing for Finance: State-of-the-Art and Future Prospects
Quantum Algorithms for Finance
18. Fully scaled quantum
technology is still a way off,
but some banks are already
thinking ahead to the
potential value.
Major MoUs
20. Step 1: Encode the classical data into a quantum state
Step 2: Apply a parameterized model
Step 3: Measure the circuit to extract labels
Step 4: Use optimization techniques (like gradient descent) to
update model parameters
QML Steps
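The four steps above can be sketched on a single simulated qubit; a hedged toy example (the encoding scheme, data, and learning rate are all my own assumptions, not from the slides):

```python
import numpy as np

# Hedged sketch of the four QML steps on one simulated qubit:
# encode a feature as a rotation angle, apply a parameterized rotation,
# measure, and update the parameter by gradient descent.
def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def predict(x, theta):
    psi = ry(theta) @ ry(x) @ np.array([1.0, 0.0])  # Steps 1-2: encode + model
    return psi[1] ** 2                               # Step 3: P(measure |1>)

# Toy data: label 1 when the feature is large.
xs = np.array([0.1, 0.2, 2.9, 3.0])
ys = np.array([0.0, 0.0, 1.0, 1.0])

loss = lambda t: np.mean((np.array([predict(x, t) for x in xs]) - ys) ** 2)

theta, lr, eps = 0.0, 0.5, 1e-4
for _ in range(200):                                 # Step 4: gradient descent
    grad = (loss(theta + eps) - loss(theta - eps)) / (2 * eps)
    theta -= lr * grad

print(round(loss(theta), 3))  # small training loss after optimization
```

On real hardware, step 3's probability would itself be estimated from repeated measurement shots, and the gradient from parameter-shift or finite differences of those estimates.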
31. Unraveling the Effect of COVID-19 on the Selection of Optimal Portfolio Using Hybrid
Quantum Algorithms
1Shrey Upadhyay, 2Vaidehi Dhande, 1Rupayan Bhattacharjee, 1Ishan NH Mankodi, 1Aaryav Mishra, 2Anindita Banerjee, 1Raghavendra Venkatraman
1QKrishi, 2C-DAC- India
The unforeseen COVID-19 pandemic delivered a huge blow to the global economy. This
poster elaborates on the effect of COVID-19 on portfolio optimization across different
industrial sectors: retail, technology, automotive, oil & gas, and airlines & hospitality.
Portfolio optimization is the selection of the best portfolios with the objective of
maximizing return and minimizing risk. To understand the trend in portfolio optimization
pre-COVID-19 and during COVID-19, three time intervals are considered, and the results from
different quantum algorithms are compared with classical results. The quantum algorithms
used are the Variational Quantum Eigensolver (VQE) and the Quantum Approximate Optimization
Algorithm (QAOA).
Outline
Covariance Graphs
Results
Conclusions
Abstract
1. Portfolio Optimization- Maximize Returns and Minimize Risk
2. Classical Algorithms- Markowitz, Numpy EigenSolver
3. Quantum Computing-VQE, QAOA
4. Impact of Covid-19 on portfolio optimization
Pools: Non-COVID 1 (Jan ’16-Dec ’17), Non-COVID 2 (Jan ’18-Dec ’19), COVID (Jan ’20-Dec ’21)
Sectors: Retail, Technology, Automotive, Oil & Gas, Airlines & Hospitality
Impact of Covid
Pool (assets) | Non-COVID1 | Non-COVID2 | COVID | Reason
Retail (Costco, Amazon, Target, Walmart) | COST | TGT | COST | COST & TGT are the major market shareholders; as they open new stores at more locations while offering products at affordable prices, this drives the growth of COST.
Technology (Google, IBM, Intel, Microsoft) | GOOG | GOOG | MSFT | GOOG remains the most used IT service in the world in terms of apps and browsers. MSFT also controls the majority of the OS used worldwide, while launching its own hardware products.
Automotive (General Motors, Mercedes, Tesla, Ford) | GM | TSLA | TSLA | GM owned a large market cap in automotive around 2016, but as people accept EVs as a better alternative to gas-powered engines and look for greener, more technologically advanced ways of transport, TSLA soars after 2017.
Oil & Gas (Shell, ConocoPhillips, Marathon Oil, Chevron Corp.) | CVX | COP | CVX | CVX & COP control the majority of gas and oil extraction in the US and some parts of the world, as they continue to innovate and expand in the hydrocarbon fuel markets.
Airlines & Hospitality (Marriott Int, Choice Hotels, LTC Properties, Alaska Air) | MAR | CHH | MAR | MAR and CHH remain people’s first choice; as they continue to grow and build newer, more luxurious properties, interest in them increases considerably with time.
The main objectives of portfolio optimization are:
1. The investor’s goal is to maximize return for a low level of risk
2. Risk can be reduced by diversifying a portfolio through individual, unrelated securities
Initially, the problem of portfolio optimization is translated into the form of a variational
circuit called an ansatz, to enable the quantum computer to perform optimization on the
objective function.
VQE is a hybrid quantum-classical algorithm. VQE, which is built on the variational
principle, calculates the lowest energy, which corresponds to the optimal portfolio.
It aims to find an upper bound on the lowest eigenvalue of a given Hamiltonian.
Methods
VQE has five fundamental steps:
1. Prepare the quantum state |Ψ(θ)⟩
2. Measure the expectation value ⟨Ψ(θ)|H|Ψ(θ)⟩
3. Optimize the parameter θ on a classical computer and generate the updated wavefunction
4. Calculate the expectation value again for the updated wavefunction
5. Iterate until the convergence criterion is met
QAOA is a widely popular method for solving combinatorial optimization problems. Like VQE, it applies
classical optimization to minimize the energy expectation of an ansatz state to find the ground-state energy.
Methods Cont..
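The objective these algorithms minimize can be written down and brute-forced classically for a small case; a hedged sketch (returns, covariances, and penalty weight are made up for illustration) of a mean-variance objective with a budget constraint over asset-selection bitstrings:

```python
import numpy as np
from itertools import product

# Hedged sketch (data invented): the kind of objective encoded into the
# Hamiltonian for VQE/QAOA, here evaluated by brute force over bitstrings.
mu = np.array([0.10, 0.12, 0.07, 0.03])          # expected returns (assumed)
sigma = np.array([[0.005, 0.001, 0.000, 0.000],  # covariance matrix (assumed)
                  [0.001, 0.006, 0.001, 0.000],
                  [0.000, 0.001, 0.004, 0.001],
                  [0.000, 0.000, 0.001, 0.003]])
q, budget, penalty = 0.5, 2, 10.0                # risk aversion, assets to pick

best_x, best_val = None, float("inf")
for bits in product([0, 1], repeat=4):
    x = np.array(bits)
    # Objective: risk minus return, plus a penalty enforcing the budget.
    val = q * x @ sigma @ x - mu @ x + penalty * (x.sum() - budget) ** 2
    if val < best_val:
        best_x, best_val = x, val

print(best_x)  # [1 1 0 0] - the optimal asset-selection bitstring
```

The quantum algorithms search the same bitstring space but encode the objective as a Hamiltonian whose ground state is the optimal selection; brute force like this only works for a handful of assets.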
[Results table: for each sector and time pool, the optimal asset-selection bitstring
(e.g. [0 1 0 0]) and its objective value (e.g. -0.0012) as computed by the classical
solver, VQE, and QAOA; in most cells the three methods select the same bitstring.]
References
• Egger, D.J., Gambella, C., Marecek, J., McFaddin, S., Mevissen, M., Raymond, R.,
Simonetto, A., Woerner, S. and Yndurain, E. (2020). Quantum Computing for Finance:
State-of-the-Art and Future Prospects. IEEE Transactions on Quantum Engineering, 1,
pp. 1–24. doi:10.1109/tqe.2020.3030314.
• Herman, D., Googin, C., Liu, X., Galda, A., Safro, I., Sun, Y., Pistoia, M. and
Alexeev, Y. (2022). A Survey of Quantum Computing for Finance. arXiv:2201.02773.
32. Portfolio Optimization results using quantum algorithms (work
done by Qkrishi scientists)
Quantum based Portfolio Optimization
33. Qkrishi Projects
Forex optimization
Post quantum cryptography
Product recommendation
Electricity theft detection using QML
Protein folding and drug discovery
Computational chemistry and material science
36. • Prabha Narayanan – Founder Qkrishi
• Prof. Monika Agarwal - Founder Qkrishi
• Qkrishi Colleagues: Chetan, Sree, Sangram
• JR: Ragavan
• Other experts from the field
• We are also open to joint proposals/collaborations,
skilling, and internships!
Acknowledgement