2. Why Do We Need Quantum Computing?
● Moore's Law.
● Calculations (very, very large ones)
● Security
● Optimizations
● Machine Learning
● Simulation
4. RSA-2048 (617 Digits)
Prime Factorisation
● Using classical: General Number Field Sieve
● Using quantum: Peter Shor's factoring algorithm
Time to solve:
● Quantum: t ≈ 10 seconds
● Classical: T ≈ 4 × 10^17 seconds ≈ 13 × 10^9 years, i.e. the age of the Universe
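Shor's algorithm gets its speedup by reducing factoring to period finding. A minimal classical sketch of that number-theoretic core is below; the order r of a mod N is found by brute force here, which is exactly the step a quantum computer does exponentially faster. The choices N = 15 and a = 7 are illustrative.

```python
from math import gcd

def factor_via_period(N: int, a: int):
    """Factor N using the order of a modulo N (classical sketch of Shor's reduction)."""
    g = gcd(a, N)
    if g != 1:
        return g, N // g  # lucky guess: a shares a factor with N
    # Brute-force the order r of a mod N; Shor's algorithm finds r
    # with a quantum Fourier transform instead.
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    if r % 2 != 0:
        return None  # odd order: retry with another a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None  # trivial square root: retry with another a
    p, q = gcd(y - 1, N), gcd(y + 1, N)
    if p * q == N and 1 not in (p, q):
        return p, q
    return None

print(factor_via_period(15, 7))  # → (3, 5); 7 has order 4 mod 15
```

For N = 15, a = 7: the order is 4, so 7^2 = 49 ≡ 4 (mod 15), and gcd(3, 15) = 3, gcd(5, 15) = 5 recover the factors.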
6. Optimization
● Finding the best value within your constraints: capabilities, money, capital, etc.
● Classical Light Switch Problem
● Water Network Optimization
● Radiotherapy Optimization
● Protein Folding
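The "light switch problem" is commonly posed as an Ising energy minimization, the form quantum annealers accept. A toy sketch below: the weights h and J are made-up example values, not a real instance, and the brute-force search stands in for what an annealer does physically.

```python
from itertools import product

# Each switch s_i is +1 (on) or -1 (off); minimize
# E(s) = sum_i h_i*s_i + sum_{i<j} J_ij*s_i*s_j.
h = [1.0, -1.0, 0.5]               # illustrative per-switch biases
J = {(0, 1): -2.0, (1, 2): 1.0}    # illustrative couplings

def energy(s):
    return (sum(h[i] * s[i] for i in range(len(s)))
            + sum(Jij * s[i] * s[j] for (i, j), Jij in J.items()))

# Classical brute force checks all 2^n settings; an annealer instead
# relaxes the physical system toward its lowest-energy configuration.
best = min(product([-1, 1], repeat=len(h)), key=energy)
print(best, energy(best))  # → (1, 1, -1) -3.5
```

Brute force is fine for 3 switches but scales as 2^n, which is the motivation for annealing hardware.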
8. Harnessing Quantum Phenomena for Computation
● Quantum Tunnelling
● Quantization of Energy
● Entanglement
● Superposition
9. Bit vs Qubit
Bit:
● 0 or 1
● Freely readable
● Programming is done in 0's and 1's
Qubit:
● 0 or 1 or a superposition of both
● The state is destroyed when reading a qubit
● Programming is done in terms of the energy given to a qubit
● Intrinsic Randomness
● Entanglement
● Uncertainty Principle
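The "destroyed when reading" and "intrinsic randomness" points can be sketched with a single simulated qubit: measurement returns 0 or 1 at random (Born rule) and collapses the superposition. The equal-superposition amplitudes below are just an example.

```python
import random
from math import sqrt

# A qubit |ψ> = α|0> + β|1>; here an equal superposition (example values).
alpha, beta = 1 / sqrt(2), 1 / sqrt(2)

def measure(alpha: float, beta: float) -> int:
    # Born rule: outcome 0 with probability |α|^2, otherwise 1.
    # After measurement the superposition is gone — only the outcome remains.
    return 0 if random.random() < alpha ** 2 else 1

counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(alpha, beta)] += 1
print(counts)  # roughly a 50/50 split between 0 and 1
```

Each individual readout is genuinely random; only the statistics over many runs reveal the amplitudes.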
11. Classical vs Quantum Register
● If n = 300, a quantum register could hold more numbers simultaneously
than there are atoms in the 'Observable Universe' (2^300 ≈ 2 × 10^90,
versus roughly 10^80 atoms).
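The n = 300 claim is easy to check directly, since Python handles big integers natively (the ~10^80 atom count is the commonly cited estimate):

```python
# An n-qubit register spans 2^n basis states; compare with the commonly
# cited estimate of ~10^80 atoms in the observable Universe.
n = 300
print(2 ** n > 10 ** 80)   # → True
print(len(str(2 ** n)))    # → 91 (2^300 has 91 decimal digits)
```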
13. Implementing
● Different models: gate, one-way, topological, adiabatic annealing
● Different bases: superconducting metals, ion traps, photons, solid state
● Examples: SQUID, nuclear-spin-based qubits, trapped-ion QC
14. Decoherence
● Schrödinger's cat can be 'dead' and 'alive' only if it is completely
isolated from all interactions with the outside world.
● Likewise, a quantum computer must not be disturbed or observed while
computing.
● This is the major hindrance in quantum computing.
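Decoherence is often modeled as an exponential decay of a qubit's "coherence" (the off-diagonal term of its density matrix) with a characteristic time T2. A toy sketch, with an illustrative T2 value rather than any specific device's figure:

```python
from math import exp

T2 = 100e-6  # 100 microseconds: an illustrative coherence time

def coherence(t: float, c0: float = 0.5) -> float:
    # Off-diagonal density-matrix element decaying as exp(-t/T2);
    # once it reaches ~0, the qubit behaves like a classical bit.
    return c0 * exp(-t / T2)

for t in (0.0, 50e-6, 200e-6):
    print(f"t = {t * 1e6:5.0f} us  coherence = {coherence(t):.3f}")
```

The practical consequence: every quantum computation must finish (or be error-corrected) well within the hardware's coherence time.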
16. Till Now
● First execution of Shor's algorithm, at IBM's Almaden Research Center
and Stanford University: the number 15 was factored using 10^18
identical molecules, each containing seven active nuclear spins.
● Scientists transferred data by quantum teleportation over a distance
of 10 feet with a zero percent error rate, a vital step towards a
quantum internet.
● D-Wave Systems built a 512-qubit processor in collaboration with NASA
and Google.
"Moore's law" is the observation that, over the history of computing hardware, the number of transistors in a dense integrated circuit doubles approximately every two years. The observation is named after Gordon E. Moore, co-founder of the Intel Corporation, who first described the trend in a 1965 paper[1][2][3] and formulated its current statement in 1975. His prediction has proven accurate, in part because the law is now used in the semiconductor industry to guide long-term planning and to set targets for research and development.[4] The capabilities of many digital electronic devices are strongly linked to Moore's law: quality-adjusted microprocessor prices,[5] memory capacity, sensors, and even the number and size of pixels in digital cameras.[6] All of these are improving at roughly exponential rates as well.
This exponential improvement has dramatically enhanced the effect of digital electronics in nearly every segment of the world economy.[7] Moore's law describes a driving force of technological and social change, productivity, and economic growth in the late twentieth and early twenty-first centuries.[8][9][10][11]
The period is often quoted as 18 months because of Intel executive David House, who predicted that chip performance would double every 18 months (being a combination of the effect of more transistors and their being faster).[12]
Although this trend has continued for more than half a century, "Moore's law" should be considered an observation or conjecture, not a physical or natural law. Sources in 2005 expected it to continue until at least 2015 or 2020.[note 1][14] The 2010 update to the International Technology Roadmap for Semiconductors, however, predicted that growth would slow at the end of 2013, after which transistor counts and densities would double only every three years.[15]
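The doubling law itself is a one-line formula: count(t) = count(0) × 2^(t/period). A quick sketch, using the Intel 4004's 2,300 transistors (1971) as a baseline and the two-year period from the law's current statement:

```python
def transistors(years: float, start: float = 2300, period: float = 2.0) -> float:
    # Moore's law as stated: counts double roughly every `period` years.
    # start = 2300 is the Intel 4004's transistor count, a common baseline.
    return start * 2 ** (years / period)

# 40 years of doubling every two years = 20 doublings, i.e. about a
# million-fold increase — roughly billions of transistors per chip.
print(f"{transistors(40):,.0f}")
```

The 18-month variant attributed to David House in the passage above just uses period = 1.5 in the same formula.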