THE QUANTUM ADVANTAGE
Challenges for industry and training
Report no. 11
EDITORIAL
Over the past 20 years, quantum technologies have evolved from Nobel Prize-winning quantum physics
experiments (1997: Chu, Cohen-Tannoudji, Phillips; 2001: Cornell, Ketterle, Wieman; 2005: Hall, Hänsch,
Glauber; 2012: Haroche, Wineland) to transdisciplinary applied research. Driven by two great “quantum
mysteries”, wave-particle dualism and entanglement, these technologies are the result of long-term
fundamental research and long-standing conceptualizations—such as quantum computing in the early
1980s—with experiments in the laboratory and then applications outside the laboratory many years later.
Today, they are based on individual quantum states and fully exploit the properties of superposition and
entanglement: this period is being called the second quantum revolution.
While much more work is still needed on the fundamentals, the progress in these technologies allows us to
imagine the first applications and industrialization before reaching a destination that is still referred to as one
of the holy grails: a universal quantum computer. Start-ups are coming out of laboratories, funds are being
raised, and win-win partnerships are being created between industry and academia to build the key equipment
needed. With the Quantum Flagship, Europe, the birthplace of the first quantum revolution, is on the move.
The scientific community has identified four main fields of research: quantum communications—including quantum
cryptography—, quantum simulations, quantum computation, and quantum sensing and metrology. In addition to
basic research, we must add to these four pillars quantum systems control and engineering, software issues, and
training. This last point is the special focus of the 11th report
from the Fondation Mines-Télécom.
It is essential to begin training future quantum specialists right now to reinforce and expand upon France and Europe’s
advantage in quantum technology. Future Nobel Prize winners, visionaries, and investors will emerge from their ranks.
They must think outside the box and acquire skills in a multitude of fields to design hybrid quantum-classical solutions.
This report aims to convince you of this. On behalf of the Foundation, we hope you enjoy reading it.
September 2019
A reading path tailored to your
level of knowledge!
5. What makes something quantum?
Beginners: explore the basics and genesis of
the quantum at the turn of the 20th century.
17. Quantum realities
Advanced: plunge into the concrete
applications of quantum physics.
35. Quantum engineering
Experts: research, innovation and
training, explore a field at the heart of
today’s industrial issues.
If there is one question that arises as soon as we
talk about quantum technology, it’s “when will a
universal quantum computer be readily available?”
This is a machine that would be able to run all the
algorithms that use quantum formalism—there
are currently around sixty classes of these types
of algorithms—not one just specialized in a few of
them. Optimists, pessimists, and skeptics exchange
their views in publications and symposia. Start-ups
and industry are betting on a variety of physical
technologies to create the basic building blocks
of quantum computing, the famous “qubits”. The
potential applications are proliferating; some have
already been made possible by quantum simulators
and emulators, but traditional electronics technology
and algorithmic research are still advancing, which
means that several timescales overlap and are not
yet settled.
However, everyone agrees that the current period is
teeming with activity, making it an important stage
that is not to be missed. As they did for artificial
intelligence, countries such as the United Kingdom,
Canada, China, and the United States drafted their
“quantum technology plans” years ago; France’s
plan is expected after Summer 2019. And, just as for
artificial intelligence, Europe is the scale that counts,
with a sufficiently high level of skills and knowledge.
With the Quantum Flagship (see page ), Europe
has begun to provide the resources (in funding and
ecosystem organization) to achieve its ambitions.
For European players in quantum technologies, this
region that gave birth to most of the components of
the two quantum revolutions has every legitimacy
and must maintain its leading position.
The search for the quantum
advantage
Supremacy—where conventional computers are
incapable of competing with a quantum machine on
the types of problems for which the latter is well-
suited—or advantage—where a quantum machine
is efficient and affordable enough to make using it
economically viable? During a round-table discussion
organized on June 20, 2019 by Bpifrance as part of
an international conference attended by more than
200 people, participants agreed on the idea that
“defining quantum advantage is probably a moving
target”. The increase in supercomputer power and
the efficiency of new algorithms are pushing back the
prospect of quantum advantage. But the perceived
advantage is not just computational. Compared to
supercomputers, quantum computers have a much
smaller footprint, mass, and power consumption,
which are three significant advantages.
FOREWORD
55. Postscript
News from the quantum revolution between
September 2019 and January 2020 specially
for the English publication of this report.
The quest for a quantum
computer opens new doors
We should not allow quantum computing and the
quest for a quantum computer to make us think that
we are heading for a “quantum everything” that wins
out over the rest. The reality is more nuanced. Future
supercomputers will be hybrid machines, so a major
player like Atos can reasonably claim, through its
enthusiastic and pioneering chairman Thierry
Breton, that quantum accelerators will be on the
market in the next five years. These components are
small “co-processors” that will bring the advantages
of quantum computing to traditional machines.
“The landscape of the quantum advantage is being
explored step-by-step, between hardware and
software at the same time”, emphasizes CNRS
chairman Antoine Petit. Like other major research
organizations in France, the CNRS is undertaking a
transversal approach that ranges from fundamental
research to engineering. And each new type of
hardware that is made available—whether quantum
particle sources, repeaters, or sensors—each new
algorithm that uses quantum formalism, opens the
way to new applications.
With the shift from scientific research to concrete
use cases over the past few years (see page )
the industrial stakes are being taken seriously. A
study published by McKinsey & Company in June
2019 pointed out that the value chain in quantum
computing is divided into three distinct groups: a
third in hardware (mainly in the United States), half
in software (the majority in startups), and a fifth in
facilitators. Among the latter, note the crucial
importance of supplying the small quantum
hardware discussed above.
The funding is following, whether driven by public
policy or private investors (for example, Quantonation
in France). In mid-June 2019, the United Kingdom
announced a £153 million program to help bring
quantum products to the market, backed by £205
million from industry, bringing the sums raised to more
than £1 billion. For scientists, industry funds are a
model where everyone wins. Industry benefits from
scientists’ advice and ideas, and in return they obtain
the major technological equipment they need.
Developing ecosystems and
training
To take on the challenges of quantum technologies,
the ecosystem must be mobilized as well. There
are a great many initiatives in France, but they
remain rather dispersed at regional level. The Paris
Center for Quantum Computing, SIRTEQ in Ile-de-
France, and cities like Grenoble and Montpellier
are now joining forces, driven jointly by scientific
teams and companies like Atos, IBM, and Microsoft.
Bilateral cooperation agreements are being set up
between countries, too. Things look to be speeding
up: according to the participants of the Bpifrance
conference in June 2019, such an event would have
been unthinkable one year earlier. Some people are
beginning to imagine a “quantum Airbus” in Europe,
which seems all the more logical given that, not only
does the aeronautics industry need the capacities
and forms of computation that quantum technologies
bring, but also because satellites are natural entities
for a future “quantum internet”, a network that
allows for both quantum communications and links
to heavier quantum equipment.
While currently an “emerging technology”, quantum
could one day become a “general purpose
technology”. Combining science and engineering,
it requires the public and decision-makers to be
informed so they can understand the stakes, raise
ambitions, and make the right choices. This new
way of seeing the world also requires investment in
training at all levels, from secondary school through
university, from initial training to job conversion,
from the wider public to specialists, if we want to
give ourselves every chance of conserving the
quantum advantage.
Find all the contributors as well as the glossary pages -.
Follow Alice and Alan’s conversation in the green boxes.
WHAT MAKES
SOMETHING QUANTUM?
Alice is a young woman driven by an unquenchable
thirst for knowledge founded on scientific progress
but who also appreciates a bit of mystery and
marvel. She has traversed time and tales, explored
disciplines and countries. In the 19th century, she
traveled both in England and in a country of the
absurd, where those she met led her to question her
understanding of the world’s reality... In the early
20th century, she cultivated a passion for physics,
time, and space by reading about great discoveries
in atoms and things microscopically small. In the late
70s, she entered the communications era, having
fun sending encrypted messages to her friends.
Today, Alice handles data and artificial intelligence
systems and uses increasingly subtle, sometimes
futile technological objects every day, always
conscious of the major stakes of the transitions and
the underlying strategic and economic realities.
For all these years, Alice has taken an interest
in the developments in quantum physics and
mechanics and has seen their formalism gradually
seep into other sciences and fields. No doubt she
knows, even if a little confusedly, that quantum
technologies are at the heart of many of today’s
technologies. But what does that mean, exactly?
There still seems to be a touch of strangeness,
something incomprehensible when we talk about
quantum: we think we have understood... but then
something entirely different turns up.
Today, at a time when the many ongoing transitions
are calling on all disciplines at the same time,
Alice would like to be able to think ahead and
make better decisions about her future. The young
scientist would like to know more about these
quantum revolutions everyone is talking about—so
she sets out to meet the people who can help her
fully understand it all. She has an appointment with
Alan, a teacher-researcher specializing in quantum
cryptography and communication who has created
a training course in quantum engineering with his
colleagues. Alice thinks she may need to take such
a course. She has also read in the media about the
latest news in quantum computers, these machines
that they say will make all our current computers
obsolete. Alice wants to know what is real and what
is fantasy, but where should she start? She would
also like to explain to her grandmother why she is
going down this road, but how can she find the right
words so that her grandmother will understand?
Everything really started to come together when we
asked ourselves about the nature of light. For a long
time, we believed that light was made up of little grains,
corpuscles. In 1801, Thomas Young, a 28-year-old
Englishman who was well-versed in many artistic and
scientific disciplines, conducted an optics experiment
that showcased light’s wave-like character.
To explain what we see in this founding experiment,
sometimes called “Young’s slits” (see opposite), the
corpuscular nature of light as previously proposed no
longer fit. This experiment, which was reproduced later
in other conditions and with other things than beams of
light, made many people think.
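The alternating bright and dark bands Young observed follow from adding the two waves coming through the slits. As a minimal numerical sketch of our own (an idealized far-field, small-angle model, not taken from the report):

```python
import numpy as np

def double_slit_intensity(x, wavelength, slit_distance, screen_distance):
    """Far-field intensity of an idealized two-slit experiment.

    Two coherent sources separated by `slit_distance`, observed on a
    screen at `screen_distance`. Returns intensity normalized to 1 at
    the central bright fringe."""
    # Path difference between the two slits for a point x on the screen
    # (small-angle approximation: sin(theta) ~ x / screen_distance).
    delta = slit_distance * x / screen_distance
    phase = np.pi * delta / wavelength
    return np.cos(phase) ** 2

# Green light (500 nm), slits 0.1 mm apart, screen 1 m away.
x = np.linspace(-0.02, 0.02, 9)   # positions on the screen (m)
intensity = double_slit_intensity(x, 500e-9, 1e-4, 1.0)
# Bright fringes (intensity 1) occur where delta is a whole number of
# wavelengths; dark fringes (intensity 0) where it is a half-integer.
```

The cos² shape is exactly the alternation of light and dark bands described above; the corpuscular picture of light offered no way to produce it.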
How can we define a barrier
between the classical
and the quantum?
Alice has a meeting with Alan one afternoon in May. Alan starts by asking Alice what
she knows. It’s always good to know where to start.
“When you hear the word ‘quantum’ or when you hear people talking about it, what is the first
thing you think about?”
“Oh, that’s easy; often I think about that story of the cat. It makes me think that it’s been
following me from Cheshire into the internet. This cat that Schrödinger made living and dead
at the same time in that box, it fascinates and terrifies me. It’s like I have mixed, superposed
feelings about this creature.”
“It’s true, this cat is rather interesting,” Alan replies, “but it is also very big, and before we talk
about it, we need to get down to a microscopic level where there are other ‘at the same times’
we need to explain.”
Alan began explaining quantum discoveries so that Alice would not only understand, but
also be able to explain them later herself.
Screen view
Young’s double-slit experiment
On the screen, we can see a diffraction
pattern with alternating light and dark
bands.
Throughout the 19th century, many scientific
experiments resulted in questions that led to some
well-accepted theories, like classical mechanics,
being considered insufficient to explain the observed
phenomena. Science needed new tools to describe
the world, especially at the microscopic level. Material
physics on the macroscopic scale is fundamentally
different from the physics of elementary atoms and
particles on the microscopic scale. The first is called
classical physics, and the second is called quantum
theory. The switch from the first to the second really
occurred in the early 20th century when the first task of
this emerging theory was to clarify the real nature of
light: was it a wave or a particle?
In 1900, German physicist Max Planck (1858-1947)
introduced the term quanta to explain the experimental
characteristics of black-body radiation referring to the
indivisible packets (particles) in which the energy of
electromagnetic radiation is confined, each packet
having an energy E proportional to the frequency ν of
the radiation under consideration. He proposed a new
universal constant (h, now called Planck’s constant)
that connected the two scales. Planck’s quanta referred
to the energy’s discontinuous character, which can
only take certain values.
In March 1905, Albert Einstein (1879-1955) reviewed
this hypothesis in his article “On a Heuristic Point of
View Concerning the Production and Transformation
of Light” and revolutionized the world by stating that
light was both a wave and a particle. Depending on the
case, light behaves like a wave or like a particle. This
is the work that won him the Nobel Prize for Physics
in 1921, once the skepticism had been overcome.
Then, in 1923, American physicist Arthur Compton
(1892-1962) discovered the phenomenon (called the
Compton effect) whereby light’s wavelength increases
after it collides with an electron, finally concluding
that light is made of particles, which we would soon
be calling photons.
In 1924, French physicist Louis de Broglie inverted how
we saw things by postulating that one can associate
a wave and a material particle (i.e. with a mass),
a hypothesis known as wave-particle dualism. An
experiment to verify this wave of matter was performed
in 1927 by Davisson and Germer by diffracting electrons
on a crystal of nickel.
“To separate the classical and quantum worlds in the 20th century, De Broglie’s
wavelength h/mv is the right quantity. m being the mass and v the speed, this
wavelength, for example for an electron in a hydrogen atom, is comparable to the
distance between the electron and the proton (Bohr radius). However, for a moving
object of our size, this wavelength is infinitely smaller than the obstacles encountered,
and classical physics prevails.”
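Alan's comparison can be reproduced in a few lines. The following sketch is our own; the numerical values are standard textbook figures for an electron in hydrogen and for an everyday moving object:

```python
H = 6.62607015e-34   # Planck's constant, in J*s

def de_broglie_wavelength(mass_kg, speed_m_per_s):
    """De Broglie wavelength: lambda = h / (m * v)."""
    return H / (mass_kg * speed_m_per_s)

# Electron in a hydrogen atom (m ~ 9.11e-31 kg, v ~ 2.2e6 m/s):
# the wavelength, ~3.3e-10 m, is comparable to the Bohr radius
# (~0.53e-10 m), so quantum behavior dominates.
electron = de_broglie_wavelength(9.109e-31, 2.19e6)

# A walking person (70 kg at 1.4 m/s): ~6.8e-36 m, vastly smaller
# than any obstacle encountered, so classical physics prevails.
person = de_broglie_wavelength(70.0, 1.4)
```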
“These quanta for light, we have called them photons since 1926, but the fields all
inherited this word. You see, in the beginning, it was the questions on the nature of
light that opened the door to quantum physics.”
“So, in the 30s, everyone agreed that quantum mechanics was the right theory of Nature?”
“Well, there were doubts about the completeness of the quantum formalism that appeared, and
it was Einstein, among others, who gave voice to them. For example, he wrote to Schrödinger in
August 1935 to express his surprise that the wave function he proposed allowed a particle to be
in a paradoxical superpositioned state, both A and non-A. This ‘both’ can lead to confusion. Let’s
skip ahead in time to 1989, when a Japanese team repeated Young’s slit experiment, not with a
beam of light (photons) but with electrons, which are of course particles, one by one. Visibly, these
electrons behave like a wave, since we observe interferences, but if we want to know by which
slit the electron traveled, this measurement causes the interferences to disappear! It is therefore
reasonable to say that electrons ‘manifest in a corpuscular or undulatory way, depending on how
we speak to them.’ Choosing the right words is essential in quantum physics.”
Scientists are the first to admit that they have been
confronted with strange, counterintuitive phenomena,
and that they needed to have radical ideas to make
suitable proposals, build the foundations of this new
science, and imagine the experiments that could
confirm their intuitions. Quantum sciences rely on
a set of postulates (see page ). Some of these
strange phenomena have profoundly disrupted the
worldview that had emerged from the Classics. A dose
of randomness seemed to be needed, an object that
could no longer be in the space around it, an object
that could have several states at the same time. And
then also: the act of observing influences and changes
that which is observed, and the events that did not
occur, but which could have occurred, have an impact
on the experiment conducted.
Reading scientific publications and letters between
physicists, as well as their thought experiments,
demonstrates their troubles and their search for the
right word. Heisenberg (see page ) used for his
theorem the words Unsicherheit (uncertainty) and
Ungenauigkeit (indeterminacy), then, seeing the risk
of confusion, decided on the term Unbestimmtheit
(undeterminedness), but unfortunately “uncertainty”
had already been used in the translations.
Several interpretations of the phenomena were
proposed, forming as many schools of thought and
controversies. Einstein himself was a dissident
voice, and some still today are proposing unique
interpretations, so how can we be convinced that the
whole thing is solid?
The way of describing these phenomena, the
mathematical tools and the “language” underlying the
formalism have also changed over time and continue
to do so. It is even more necessary since the flavor of
these formalisms itself can guide scientists’ ideas in
unexpected directions. At the same time, the equations
all seem simple (and beautiful), which seems to
contradict the specificities they describe. It is possible
and useful to work through a few aspects to better
understand the landscape that is taking shape.
Quantum sciences are also traversed by (conditions
on the) limits, thresholds, and inequalities, which
reflect physical impossibilities. Are we faced with
insurmountable limits? Is it possible that we might not
be all-powerful? But in this case, how can we explain
certain phenomena with immediate repercussions
over very long distances exceeding those allowed
by the speed of light, a limit that we have accepted
elsewhere?
These oddities have been the delight of movies and
films that have helped give a distorted image. For
example, talking about quantum teleportation is always
delicate, and we must take care not to generalize the
macroscopic world too quickly.
Quantum sciences...
why this reputation
of being difficult?
This famous quote from physicist
Richard Feynman (The Character
of Physical Law, 1965) is often
cited to express the impossibility
for laypeople to understand
anything about this science.
“I think I can safely say that
nobody understands quantum
mechanics.”
The quantum system in all its states:
notation and the superposition principle
In classical physics, the state of a classical system
allows us to determine the result of the measurement
of a physical quantity without any possible ambiguity.
In quantum physics, the state of a quantum system
only allows us to calculate the probabilities of different
results of measurements, of which only one can be
effectively observed after the measurement.
The quantum state of a quantum system is written
∣𝛙⟩, according to a notation (called bra-ket notation)
proposed by the British physicist Paul Dirac (1902-
1984) in 1939, who sensed that many scientists would
use it on a daily basis and would need a specific
and practical way to write it. This is a vector, a well-
known mathematical object, a state vector which fully
accounts for the state of the quantum system and
which lives in a Hilbert vector space, another, less-
known mathematical object with properties perfectly
suited to the quantum world, in particular describing
situations where we are choosing between mutually
exclusive alternative terms. The module and phase
of ∣𝛙⟩ uniquely characterize this vector.
A common quantum statement like ∣𝛙⟩  =  𝑎∣↑⟩  +𝑏∣↓⟩
expresses 1) that the sum of two quantum states of
a system forms one possible quantum state of this
system, and 2) that by choosing particular states (here
∣↑⟩ and ∣↓⟩, which in this example refer to the up and
down status of an electron’s spin, a property that an
electron possesses) we can describe any quantum
state as a superposition of these “base” states.
With 𝑎∣↑⟩  +𝑏∣↓⟩ above, the quantum state is defined
by the quantum property of the electron’s spin, but
we could choose the quantum property of light’s
polarization according to a construction based on the
horizontal polarization ∣𝐻⟩ and vertical polarization
∣𝑉⟩, hence ∣𝛙⟩  =  𝑎∣𝐻⟩  +𝑏∣𝑉⟩. Later, we will discuss
qubits, noted as 𝑎∣0⟩  +𝑏∣1⟩, a quantum state that does
not refer to a specific physical incarnation, ∣0⟩ and ∣1⟩
only being a convention that refers to two particular
states.
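The notation above maps directly onto small vectors. As an illustrative sketch of our own (the helper names are hypothetical), a qubit 𝑎∣0⟩ + 𝑏∣1⟩ is a normalized 2-dimensional complex vector, and the probabilities of the two measurement outcomes are the squared moduli of its amplitudes:

```python
import numpy as np

# Basis states |0> and |1> as vectors of a 2-dimensional Hilbert space.
ket0 = np.array([1.0, 0.0], dtype=complex)
ket1 = np.array([0.0, 1.0], dtype=complex)

def superposition(a, b):
    """Return the normalized state a|0> + b|1>."""
    psi = a * ket0 + b * ket1
    return psi / np.linalg.norm(psi)

def measurement_probabilities(psi):
    """Born rule: each basis outcome has probability |amplitude|^2."""
    return np.abs(psi) ** 2

# An equal superposition: measuring yields 0 or 1, each with probability 1/2.
psi = superposition(1, 1)
probs = measurement_probabilities(psi)   # [0.5, 0.5]
```

The same two lines of algebra describe an electron's spin (∣↑⟩, ∣↓⟩) or light's polarization (∣𝐻⟩, ∣𝑉⟩): only the physical incarnation of the basis changes.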
Opposite: A Young’s slit experiment made with
electrons, one quantum passing at a time, 1989.
First, the electrons seem to be arranging
themselves randomly on the target screen
with a corpuscular behavior, but little by little
interference fringes appear, like a wave would
make. Inserting a device between the slits and
the screen to observe where the electrons pass
causes the fringes to disappear.
Image taken from Tonomura, A., Endo, J., Matsuda, T.,
Kawasaki, T., & Ezawa, H. (1989). Demonstration of
single-electron buildup of an interference pattern.
Am. J. Phys, 57(2), 117-120.
Several mathematical descriptions of quantum phenomena were proposed
in the early days of quantum mechanics: the matrix mechanics by Werner
Heisenberg, Max Born, and Pascual Jordan in 1925, and the differential
equations of wave mechanics by Erwin Schrödinger in 1926. The two proved
to be mathematically equivalent, but some scientists were more familiar with
one formalism or another. How the chosen formalisms evolved went hand
in hand with both the understanding of the phenomena observed and the
intuitions leading to future discoveries. In 1932, John Von Neumann published
Mathematical Foundations of Quantum Mechanics, whose axioms and
postulates, for which we should also credit the previous works of Paul Dirac
and Hermann Weyl, established a mathematical framework through Hilbert
vector spaces.
It is a remarkable fact that many quantum discoveries were first made on purely
mathematical bases, to such an extent that the physicist and philosopher Alexei
Grinbaum wrote in Mécanique des étreintes, a work that described the various
steps that led to these theoretical breakthroughs, that “mathematics dictates
its laws to quantum systems”. From mathematics describing the behavior of
waves, objects with the particularity of being combined by addition and retaining
their wave properties after such a combination, the current view is to study
the various degrees of composition (and correlation) between entities, and how
to take several entities and process them into a single one. When two Hilbert
spaces 𝓗  (each representing a quantum system) come together, the related
mathematical operation is the product ⊗. The product can be weak or strong
(or something in between). It translates the two systems’ capacity to be able to
separate after having been combined, like a mix of seeds could be, whereas a
mix of juices would have changed natures. A weak product contains separable
states; beyond this, mathematics opens the field of so-called entangled states.
Suggested reading: Mécanique des étreintes. Alexei Grinbaum.
Éditions Les Belles Lettres, 2014.
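The separable-versus-entangled distinction described above can be made concrete with a small numerical sketch (our own construction, not from Grinbaum's book): the product ⊗ is `np.kron`, and the number of non-zero singular values of the amplitude matrix, the Schmidt rank, tells the two cases apart.

```python
import numpy as np

ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

# Combining two quantum systems: the tensor product (numpy's kron).
# A separable two-qubit state: each qubit keeps its own identity.
separable = np.kron(ket0, ket1)                        # |0>|1>

# An entangled (Bell) state: not a product of two one-qubit states.
bell = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)

def schmidt_rank(state):
    """Non-negligible singular values of the 2x2 amplitude matrix:
    1 means separable (the 'mix of seeds'), more means entangled
    (the 'mix of juices')."""
    singular_values = np.linalg.svd(state.reshape(2, 2), compute_uv=False)
    return int(np.sum(singular_values > 1e-12))

schmidt_rank(separable)  # 1 -> separable
schmidt_rank(bell)       # 2 -> entangled
```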
Mathematics
lays down the law
“I feel like the mathematics is just as important as the physics questions
themselves,” says Alice.
“Exactly! And we could add that, in the 21st century, as Alexei Grinbaum
says, ‘the border between classical physics and quantum physics passes
through the quantitative characterization of the force of the correlations
between the subsystems of a compound system’.”
The physicists (and (future) Nobel Prize winners in physics) who began to handle quantum
fundamentals, gathered here at the Solvay Conference held in 1927, were European: (From
top to bottom and left to right) Erwin Schrödinger, Wolfgang Pauli, Werner Heisenberg,
Paul Dirac, Arthur Compton, Louis de Broglie, Max Born, Niels Bohr, Max Planck, Albert Einstein.
Born, Heisenberg, and Pauli created the term “Quantenmechanik” in the 1920s.
Starting in the 20s, physicists began creating
and using a specific vocabulary that we must
now introduce. To be able to progress in their
understanding of counter-intuitive phenomena,
scientists also offered the postulates that defined
the behavior of these quantum objects that differed
from that of classical objects.
This allowed them to predict the existence of
quantum particles or properties well before
they were observed, providing a direction for
further research. Thus, in December 1924,
Austrian physicist Wolfgang Pauli (1900-1958)
theorized the existence of an additional attribute
of electrons (in addition to its mass and charge)
needed to explain a result of an experiment that
had evaded all explanation until then. He did it by
adding a postulate to those already made for non-
relativistic quantum mechanics. This property, first
interpreted as a rotation of the electron on itself,
is called spin. In fact, it was an intrinsic angular
momentum that Uhlenbeck and Goudsmit discovered
in September 1925.
This anecdote reveals the boldness of the
physicists who postulated this quantum property,
since nothing in the world of classical physics
could have inspired it. Furthermore, interpretation
is an exercise that continually reappears in the
quantum sciences.
In quantum mechanics, physical values like mass
and position cannot be determined as the result of
a function in a deterministic way. They can only be
known in a probabilistic way (i.e. as probabilities
of the allowed values) through what is called an
observable. This concept is what comes closest
to that of measurement as we know it in the
classical world.
Systems, states, and observables
Hungarian physicist János Lajos Neumann (1903-1957) proposed a series of postulates that
provided a framework for the quantum sciences and gave them a frame of reference. This
mathematical framework still appears constantly in current scientific publications.
1. The definition of the quantum state. The knowledge of the state of a quantum system is
completely contained, at time t, in a normalizable vector of a Hilbert space 𝓗. This vector is
written ∣𝛙(t)⟩.
2. The principle of correspondence. To any observable property, for example position,
energy, spin, etc., corresponds a linear Hermitian operator acting on the vectors of a Hilbert
space 𝓗. This operator is called an observable.
3. Measurement: the possible values of an observable. The measurement of a physical
quantity represented by the observable A can only provide one of the eigenvalues of A (𝑎ₙ). Any
vector ∣𝛙(t)⟩ can uniquely be broken down on the corresponding eigenvectors:
∣𝛙⟩ = 𝑐₁∣𝛗₁⟩ + 𝑐₂∣𝛗₂⟩ + … + 𝑐ₙ∣𝛗ₙ⟩ + …
4. Born’s rule: a probabilistic interpretation of the wave function. The measurement of a
physical quantity represented by the observable A performed on the normalized quantum
state ∣𝛙(t)⟩ gives the result 𝑎ₙ, with a probability equal to |𝑐ₙ|².
5. The wave packet reduction postulate. If the measurement of a physical quantity results in 𝑎ₙ,
then the state of the system immediately after the measurement is projected onto the eigenspace
associated with 𝑎ₙ.
6. Postulate of unitary evolution. It allows us to calculate the wave function. The state ∣Φ,t⟩ of
any quantum system is a solution of the time-dependent Schrödinger’s equation as follows:
𝑖ℏ∂/∂t∣Φ,t⟩ = 𝐻̂∣Φ,t⟩
Without going into the details of the math, here is an example of the thoughts that these postulates
were to lead to. Since the evolution of the wave function is linear and unitary (conservation of
the norm and the scalar product, according to postulate 6), how can quantum superpositioning
disappear (what postulate 5 says), whereas linearity and unitarity should lead to the conservation
of superpositioned states? The concept of decoherence (see next page) will provide a solution.
Naturalized as an American citizen in 1933, the scientist now known as John von Neumann also
made a major contribution to the development of computer science (and many other disciplines).
Computer hardware architecture (control unit, memory, inputs/outputs, etc.) is sometimes called
von Neumann architecture. Note that quantum computers do not follow these design principles.
A science based
on postulates
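Postulates 3 to 5 can be sketched as a short simulation. The following is a toy model of our own (the function and operator names are hypothetical): decompose the state on the observable's eigenvectors, draw an outcome with the Born-rule probabilities, then project the state.

```python
import numpy as np

def measure(psi, observable):
    """Toy model of postulates 3-5: decompose |psi> on the eigenvectors
    of a Hermitian `observable`, draw an outcome a_n with probability
    |c_n|^2, and project the state (wave packet reduction)."""
    eigenvalues, eigenvectors = np.linalg.eigh(observable)
    amplitudes = eigenvectors.conj().T @ psi      # the coefficients c_n
    probabilities = np.abs(amplitudes) ** 2
    probabilities /= probabilities.sum()          # guard against rounding
    n = np.random.choice(len(eigenvalues), p=probabilities)
    return eigenvalues[n], eigenvectors[:, n]     # a_n and the new state

# Spin measured along x on the "up" state: the result is +1 or -1,
# each with probability 1/2, and the post-measurement state is the
# matching eigenvector.
sigma_x = np.array([[0.0, 1.0], [1.0, 0.0]])
outcome, post_state = measure(np.array([1.0, 0.0]), sigma_x)
```

Repeating the measurement on `post_state` with the same observable returns the same outcome with probability 1, exactly as postulate 5 requires.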
In the thirty years or so after Planck proposed the term quanta in 1900, the
theoretical foundations of quantum mechanics were laid. We can consider
that “quantum mechanics is complete”. But isn’t that surprising, maybe
even strange? How can we give it meaning? How can we interpret it?
Especially since the underlying mathematical formalism (Hilbert vector
spaces with infinite dimensions) is far removed from the physical space
where the described phenomena occur. Several interpretations were
proposed.
In 1927, Werner Heisenberg (1901-1976) published what would be one of
the fundamental concepts in quantum physics, known as the indeterminacy
principle (or the uncertainty principle) which established mathematically
that it is impossible to establish with total precision both a particle’s position
and the momentum (and, more generally, with total precision the value
of two separate observables of a quantum system). When measuring
these two quantities many times, we always obtain different values, and
the product of the standard deviations, 𝛔ₓ for the position and 𝛔ₚ for the
momentum, cannot be smaller than a certain threshold (see opposite).
Another view of the uncertainty principle is that, in the atomic world,
the measuring device influences the object being measured. But what
happens? What exists between the two measurements? The various
interpretations of quantum mechanics differ on the meaning to be given
to non-observed, non-measured states. This becomes a philosophical
problem on the meaning of reality, which is still under discussion today.
“Wave-particle dualism, quanta, the superposition principle, quantum states, postulates, observables,” continues Alan, “there are still two terms to be introduced: decoherence and non-locality.”
Uncertainties and interpretations
Heisenberg's Indeterminacy Principle (also known as the Uncertainty Principle):
σₓ · σₚ ≥ h / 4π
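To make the bound concrete, here is a minimal numerical sketch (not from the report). It works in natural units where ħ = 1, so the threshold h/4π becomes 1/2; it builds a Gaussian wave packet on a grid, computes σₓ from |ψ(x)|² and σₚ from the packet's Fourier transform, and checks their product. The grid size and packet width are arbitrary choices.

```python
import numpy as np

# Natural units: hbar = 1, so the bound sigma_x * sigma_p >= h/(4*pi) reads >= 0.5.
N, L = 4096, 80.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]

sigma = 1.3                                   # packet width (arbitrary choice)
psi = np.exp(-x**2 / (4 * sigma**2))          # Gaussian wave packet
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)   # normalize total probability to 1

# Position spread from the probability density |psi(x)|^2 (mean is 0 by symmetry)
prob_x = np.abs(psi)**2
sigma_x = np.sqrt(np.sum(x**2 * prob_x) * dx)

# Momentum spread from the Fourier transform of psi (p = hbar * k = k here)
p = 2 * np.pi * np.fft.fftfreq(N, d=dx)
dp = 2 * np.pi / (N * dx)
phi = np.fft.fft(psi) * dx / np.sqrt(2 * np.pi)
prob_p = np.abs(phi)**2
sigma_p = np.sqrt(np.sum(p**2 * prob_p) * dp)

product = sigma_x * sigma_p
print(f"sigma_x = {sigma_x:.4f}, sigma_p = {sigma_p:.4f}, product = {product:.4f}")
```

A Gaussian packet is the minimum-uncertainty state, so the product lands essentially on the bound of 1/2; any other packet shape yields a strictly larger product.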
One major interpretation is the so-called Copenhagen interpretation, named after the place
where the holders of this view usually met (Bohr, Born, Heisenberg, Jordan). According to this
interpretation, physical systems do not possess properties if we have not measured them, and
quantum mechanics can only predict probabilities among a set of possibilities. Measurement
immediately results in the reduction of all the possibilities into one of the possible values, which
is called “wave function collapse”. Many objections were made to this interpretation. These
objections did not call into question the predictions of quantum mechanics and mathematical
formalisms, but critics felt that the failure to explain certain macroscopic phenomena meant
that the emerging theory was incomplete. The “Schrödinger's Cat” thought experiment was conceived as precisely such an objection to the Copenhagen interpretation.
Other interpretations still exist today, such as the CSM (Context, Systems and Modalities) theory
proposed by French physicists Alexia Auffèves and Philippe Grangier in 2015, which attempts to
propose a framework that reconciles previous approaches. The central issue is still the definition
of the exact boundaries of a quantum system: to what extent does a quantum system contain
the observer?
A quantum system cannot be considered as isolated because it interacts with its environment.
Each of these numerous interactions shifts the wave functions of its subsystems relative to
each other, rapidly driving the probability of observing a superposed state to zero. This explains why we never see superposed states at a macroscopic level: they cannot survive. The different possibilities quickly lose their coherence. This decoherence phenomenon exists in the natural
state; it is a spontaneous reduction of the wave packet, the duration of which depends on the
environment. Hence many precautions must be taken to isolate the quantum systems that we
wish to manipulate as qubits in quantum computing, for example, especially since decoherence
seems to be a universal and therefore inevitable phenomenon.
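A small numerical sketch (ours, for illustration) can make this loss of coherence visible. It models the environment as a simple phase-flip noise channel with an arbitrary per-step probability, and tracks the density matrix of a qubit prepared in a superposition: each simulated interaction shrinks the off-diagonal “coherence” terms while leaving the measurement probabilities untouched.

```python
import numpy as np

# A qubit starts in the superposition |+> = (|0> + |1>) / sqrt(2).
Z = np.diag([1.0, -1.0])                 # phase-flip operator
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho = np.outer(plus, plus)               # density matrix of |+>

p = 0.05                                 # per-interaction flip probability (assumed)
coherence = []
for _ in range(100):
    # One environment interaction: phase flip with probability p.
    # This multiplies the off-diagonal terms of rho by (1 - 2p).
    rho = (1 - p) * rho + p * Z @ rho @ Z
    coherence.append(abs(rho[0, 1]))

print(f"coherence after 1 interaction:    {coherence[0]:.4f}")
print(f"coherence after 100 interactions: {coherence[-1]:.2e}")
print(f"measurement probabilities (unchanged): {rho[0, 0]:.2f}, {rho[1, 1]:.2f}")
```

After a hundred interactions the coherence term is essentially zero: the superposition has become an ordinary statistical mixture, which is the decoherence described above in miniature.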
Another problem was raised by scientists in the late 1930s: the locality principle whereby
an object can only be influenced by its immediate environment. In 1935, Albert Einstein, Boris
Podolsky, and Nathan Rosen published a paper that is famous today (known as the EPR paradox)
but which was not very well explored at the time because quantum mechanics was keeping
physicists busy enough elsewhere. In it, they described the existence of the states, allowed by
quantum formalism, of two quantum objects, two observables of which are highly correlated,
even when the objects are far apart. Since the finite speed of light makes it impossible for these two subsystems to communicate instantaneously, observing one of them, and thereby immediately knowing the corresponding value for the other, implies, they argued, that there must be variables not accounted for in quantum theory.
In other words, the question raised in 1935 was: can the quantum description of physical
reality be considered complete?
Suggested reading or viewing:
« Les bases de la physique quantique. » June 2018. 76'43. Masterclass by Roland Lehoucq, astrophysicist at CEA. http://www.cea.fr/go/masterclass-physique-quantique
« Einstein's revolutionary paper. » April 2005. Physics World. https://physicsworld.com/a/einsteins-revolutionary-paper/
« Le laser, ou l'impensable ingénierie quantique. » Eric Picholle. Noesis no. 5, 2003. https://journals.openedition.org/noesis/1507
« Elements of Quantum Computing: History, Theories and Engineering Applications. » Seiki Akama. Springer, 2014. eBook, 133 pages. http://mmrc.amss.cas.cn/tlb/201702/W020170224608149203392.pdf
« Révolutions quantiques. » Clefs CEA no. 66, June 2018. http://www.cea.fr/multimedia/Pages/editions/clefs-cea/revolutions-quantiques.aspx
The controversy between Einstein and Bohr
did not fascinate the scientific community
for long. Physicists were indeed to be kept very busy with quantum field theory, i.e. the study of particles and their interactions.
The applications resulting from this, and
from what is now called the first quantum
revolution, are limitless. These are the
basic components of the digital age.
As with artificial intelligence systems, we
must be aware that we are surrounded by
objects and services that have been made
possible by the early results of quantum
science and which continue to shape the
digital reality right now.
The first quantum revolution
in everyday life
CMOS transistors · Lasers · Hard drives · LEDs · Atomic clocks · GPS
QUANTUM
REALITIES
For years, Einstein, Podolsky and Rosen’s 1935
publication did not attract the attention of the
scientific community, which was very busy with
quantum field theory, a source of many discoveries.
In 1964, nine years after Einstein’s death, a
physicist working at CERN in Geneva revived
interest in it and opened unexpected prospects.
In a short article, Irish physicist John Bell set out
principles in the form of inequalities that would later
provide irrefutable proof of the notion of quantum
entanglement. This is a remarkable property that
causes a pair of quantum objects to behave as
a single quantum system, even when these two
subsystems (each object in the pair) are at a distance
that makes “communication” impossible. John
Bell's contribution is a mathematical expression, a measurable quantity capturing Einstein's doubt about the Copenhagen interpretation.
Let us emphasize this point. Bell's work is remarkable because it irreversibly overturned the dogmas of quantum mechanics. It
is remarkable also because it paved the way for
experimental verifications as to whether strong
correlations between quantum subsystems exist
in nature. By demonstrating Einstein’s error of
judgment, physicists obtained new and surprising
results on the mathematical structure of quantum
mechanics.
But several years passed between Bell's article and the
experimental proof. In the 1970s, Bell’s inequality
was reformulated, its mathematical expression
taking on more manageable forms while retaining
Bell’s original spirit. One of these formulations,
proposed by physicists Clauser, Horne, Shimony,
and Holt, and known as the CHSH form after
their initials, is particularly suited to digital
communications and deserves our attention. Alice
is going to help us with this.
“Alice, the second quantum revolution will come once we start to actively manipulate
single particles and to have multiple particles interact with each other. Do you still
have that friend you write to regularly? We will do a few experiments with him to
understand all this.”
“Oh yes,” says Alice, “you’re talking about Bob! It’s true we’ve sent each other a lot of messages
over the years. We are very close, yes... I’ll call him right now.”
Alice and her friend Bob are in the habit of emitting
bits, noted respectively 𝑥 and 𝑦, at the entrance of a
box that represents the level of correlation existing
between these two colleagues (who are also quantum
subsystems in private).
To translate this correlation, the box outputs one bit 𝑎 to Alice, and one bit 𝑏 to Bob, the combinations (𝑥, 𝑦, 𝑎, 𝑏) thus having 4×4 possible values. P(𝑎=𝑏|𝑥𝑦) is the probability that 𝑎 and 𝑏 are identical, and P(𝑎≠𝑏|𝑥𝑦) is the probability that they are not. The sum of these two probabilities is equal to 1, but the difference, called the correlator between the input pairs, E𝑥𝑦 = P(𝑎=𝑏|𝑥𝑦) − P(𝑎≠𝑏|𝑥𝑦), varies between −1 and 1.
There are four possible correlators, and Bell's proposition in CHSH form is to look at CHSH = |E₀₀ + E₁₀ + E₀₁ − E₁₁|, a quantity that can therefore be equal to 4 at most. John Bell demonstrated that, in the world of classical physics, the inequality CHSH ≤ 2 always holds. In quantum systems, the upper bound of Bell's inequality is greater than 2: it is called Tsirelson's bound and is equal to 2√2, approximately 2.8284. If an experimental device implementing this box with the two characters Alice and Bob, counting the output correlations over a very large number of trials, exceeded the value 2, it would mean that Einstein's point of view was wrong.
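This box lends itself to a small numerical experiment. The sketch below (ours, for illustration) first checks exhaustively that deterministic local strategies never exceed CHSH = 2, then estimates the quantum value by Monte Carlo sampling, using the textbook prediction P(a=b) = cos²(α−β), i.e. E = cos 2(α−β), for a suitably entangled photon pair measured in polarization; the measurement angles are the standard optimal choice, and the sample sizes are arbitrary.

```python
import itertools
import numpy as np

rng = np.random.default_rng(seed=7)

# 1) Classical bound: a deterministic local strategy fixes Alice's outputs
# (a0, a1) and Bob's (b0, b1) in advance, so that E_xy = a_x * b_y.
best_classical = max(
    abs(a0 * b0 + a1 * b0 + a0 * b1 - a1 * b1)
    for a0, a1, b0, b1 in itertools.product([-1, 1], repeat=4)
)

# 2) Quantum strategy: for a suitable entangled photon pair, quantum theory
# predicts P(a = b) = cos^2(alpha - beta), i.e. E = cos 2(alpha - beta).
alice_angles = [0.0, np.pi / 4]        # settings x = 0, 1
bob_angles = [np.pi / 8, -np.pi / 8]   # settings y = 0, 1

def correlator(alpha, beta, n=50_000):
    a = rng.choice([-1, 1], size=n)                    # Alice's outcome: fair coin
    same = rng.random(n) < np.cos(alpha - beta) ** 2   # quantum P(a = b)
    b = np.where(same, a, -a)
    return np.mean(a * b)                              # estimates P(a=b) - P(a≠b)

E = {(x, y): correlator(ax, by)
     for x, ax in enumerate(alice_angles)
     for y, by in enumerate(bob_angles)}
chsh = abs(E[0, 0] + E[1, 0] + E[0, 1] - E[1, 1])
print("classical maximum:", best_classical)
print(f"quantum CHSH estimate: {chsh:.3f} (Tsirelson bound ≈ 2.8284)")
```

The classical maximum comes out at exactly 2, while the sampled quantum value lands near 2√2, violating Bell's inequality just as the experiments described next did.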
Experimentation was therefore needed to determine
whether correlations of the kind predicted by quantum
formalism exist in nature and “violate” Bell’s inequalities.
In 1982, the French physicist Alain Aspect and his team
at the Institut d’Optique d’Orsay empirically measured
the first CHSH value, with the two Alice and Bob
subsystems a few meters apart. The experiments were
performed using a pair of photons on which a polarization
measurement was made (see figure below). The results
were indisputable. The engineering feat was impressive
and so was the result. CHSH proved to be greater than
2, demonstrating that the correlation of two entangled
quantum systems is a reality.
Proof of quantum entanglement
paved the way for a second revolution
An arrangement of one of Alain Aspect's team's 1982 experiments (here with variable polarizers; S is the source of photon pairs, PM the photomultipliers, and C1 and C2 are switches), which highlighted entanglement and non-locality. 𝑁(a,b), 𝑁(a,b'), 𝑁(a',b), and 𝑁(a',b') are the coincidence rates whose measurements are used to calculate the polarization correlation coefficients E.
The impact of the second quantum revolution for the future of communication networks
At Nokia, Bell Labs is conducting
research into breakthroughs from the
existing technologies and solutions
for the future of communications
networks. In the past, Bell Labs
made a significant contribution to
the invention and development of
the technologies in the first quantum
revolution (transistors, lasers, optical
amplifiers, etc.). As the emerging
technologies of the second quantum
revolution have great potential to
represent a break from existing
technologies, certain researchers
at Bell Labs are participating in their
development.
Among the breakthroughs, the one
promised by quantum computing
is probably the strongest. In 1994, Peter Shor, then a researcher at Bell Labs, invented an algorithm for quantum computers that can easily break RSA encryption, a task that is very difficult for classical computers. If we
could build quantum computers
with enough qubits, it would
call into question the security
of the many communication
systems that currently use RSA
encryption. However, quantum
decoherence currently limits the
number of correlated qubits in
quantum computers to just a few
dozen. “Topological qubits” could
overcome these limits, and they are
currently being studied at Bell Labs.
Another important breakthrough
is the realization, using quantum
cryptography systems, of
communications that are
intrinsically secure due to the
nature of quantum physics itself:
as long as quantum theory itself is not called into question, no physical system can break communications systems that are correctly secured by quantum cryptography.
Developing such systems is even
more important given the threat
quantum computers pose to systems
based on RSA encryption. Bell
Labs contributes to designing such
systems for future communications
networks, particularly in conjunction
with partners on projects such as
CiViQ, a part of the EU’s Horizon
2020 program.
The company
viewpoint
Nokia
Today, quantum optics is still used to measure CHSH,
and increasingly accurate experiments with increasing
distances between Alice and Bob are being carried out.
Tsirelson’s bound, which is found in all formulations
of Bell’s inequality, has been approached, but not yet
reached. If it is reached, we will have further confirmation that quantum mechanics is the right description of Nature.
For a more comprehensive and very approachable presentation of these fundamental experiments, read, in French: « Présentation naïve des inégalités de Bell », Alain Aspect [PDF, 35 pages]; « Des objections d'Einstein aux bits quantiques : les stupéfiantes propriétés de l'intrication », Alain Aspect & Philippe Grangier [presentation, 34 slides]. The illustrated story with Alice and her friend is expanded on at greater length in Mécanique des étreintes, op. cit.
Quantum Flagship: Europe invests
in the second quantum revolution
In May 2016, the European scientific
community issued a structuring and
ambitious 150-page roadmap considered to
be representative of research into quantum
technologies well beyond Europe’s borders.
Based on this, Brussels allocated €1 billion
over 10 years to a FET (Future and Emerging
Technologies) Flagship starting in 2018. This
program is important enough for countries
outside the EU such as Turkey, Switzerland,
and Israel to join the program. Similarly, while
the FET Flagship concept is being overhauled
and other programs were stopped in early
2019, the Quantum Flagship did not suffer the
same fate.
To these figures we should add, over the same period, national programs worth hundreds of millions of euros in countries
such as the United Kingdom (€650 million),
Germany (€500 million), and the Netherlands
(€150 million), some of which are partnerships
with North American players. These figures
should be compared to the Chinese strategy
estimated at €2 billion as well as the recent
(December 2018) National Quantum Initiative
Act in the US that includes €1 billion over the
first five years.
As you can see, Europe, the United States,
Japan, Singapore, and China are some of the
state actors who are very much established
in the race. China has made significant
investments in quantum engineering,
particularly in quantum communication
pathways by satellite. Europe did not make
these investments, but a real movement now
exists.
Pillars: Communication, Computation, Simulation, Sensing/Metrology. Cross-cutting actions: Basic science, Engineering/Control, Software/Theory, Education/Training.
The final report of the Quantum Flagship
High-Level Steering Committee structured
research into fundamental pillars and
cross-cutting actions, illustrated in the
figure opposite, which has circulated widely in the literature.
The Communication, Computation, Simulation, and Sensing and Metrology pillars are each introduced later in this report. The “Education and Training” cross-cutting domain is the subject of its own section.
Stay up to date at: https://qt.eu/
Suggested reading:
The quantum technologies roadmap: a European
community view. Antonio Acín et al 2018 New J. Phys.
20 080201
On April 9, 2019, representatives of the European Commission’s Directorate-General for
Communication Networks, Content and Technology (DG Connect) and the European Space
Agency (ESA) signed a technical agreement to collaborate on the design of a quantum
communications infrastructure, the next generation of ultra-secure communications in Europe.
Based on Quantum Key Distribution (QKD, see next page) technology, this infrastructure will
include components on Earth and in space and aim to strengthen Europe’s cybersecurity and
communications capabilities. The space component, called SAGA (Security And cryptoGrAphic
mission), will include quantum satellite communication systems. Combining and interconnecting
the activities of the two parts is necessary to provide coverage for the whole of the European
Union. This infrastructure should also stimulate the market for EU industrial players in quantum
technologies and further consolidate and expand Europe’s scientific excellence in quantum
research.
With SAGA, the next chapter is being written in space
A roadmap laid out
over twenty years
The European Quantum Manifesto sets out Europe’s ambitions in quantum research by setting
milestones at 5 and 10 years, and more, taking care to note the interplay between the four key
pillars of research.
Thus, the appearance of new quantum algorithms for simulators and communications will
coincide with the existence of small quantum hardware to run them (2020). Quantum repeaters in
the proposed quantum internet should be capable of detecting attempts to listen in on encrypted
communications, while everyday communications equipment will have quantum components on-
board (2025-2030). Finally, universal quantum computers that exceed the capacity of computers
with traditional architecture should not appear before a pan-European communication network
that combines classical and quantum technologies and whose security components will be
solutions that can resist attacks from quantum computers.
Timeline 2015–2035: atomic clocks, quantum sensors, intercity quantum links, quantum simulators, quantum internet, universal quantum computers.
The principles of superposition and
entanglement that we saw earlier are
the basis of a major field of research into
applications of quantum technologies:
quantum communications. In this field,
quantum systems and states are created
and used with a view to setting up ultra-
secure communication protocols. We often
talk about “quantum cryptography”, and
with good reason, because it is the search
for improvements in certain parts of the
security protocols that has been driving this
line of research. But this now also includes
the design of telecommunications networks,
some of which may call on quantum
technologies. The quantum objects used
are mainly photons through fibers or in open
space.
The security aspect is essential for two reasons. The first is that having secure
communications is a major objective today
for governments, companies, and citizens. As
we will see, some quantum properties provide
the means to create truly inviolable encryption
processes. But these same breakthroughs
in quantum technology are paving the way
for a new generation of computers which
are otherwise far more powerful—and here
“otherwise” should be understood both
as “differently” and “much more”—than
conventional computers and, therefore, likely
to break the security of the communications
currently provided by traditional computers.
Quantum Key Distribution (QKD) is part of
quantum cryptography, as is the search for
high quality random numbers, i.e. numbers
whose randomness is truly unpredictable, a
property that is also desirable in generating
encryption keys.
Traditional fiber-optic QKD systems are
already operational today over commercial
distances of around 100 km, with a 307-km connection demonstrated in Geneva in 2015, and up to 2,000 km in 2018 via a Chinese satellite.
Quantum communications
Using single or entangled photons to transmit data
in a manner proven to be secure.
Quantum teleportation is a communication
protocol where what is “teleported” is an
arbitrary and unknown qubit using a pair of
entangled particles that are placed remotely
from each other.
In 2016, China established the first line of
communication on this basis through the
QUESS (Quantum Experiments at Space Scale)
program’s Micius satellite.
QKT (Quantum Key Transceiver), EDT (Entanglement
Distribution Transmitter), EPS (Entangled Photon Source) and
ECP (Experimental Control Processor). Excerpt from: «QUESS
Operations at Chinese Space Science Mission Center», Su et
al., Proceedings of the 15th International Conference on Space
Operations, 2018.
In 1994, a quantum algorithm (based on
the quantum mathematical formalism)
for efficient factorization in polynomial
time was introduced. Shor’s algorithm
therefore presents a risk to RSA-type
systems, even though there is no
quantum computer available today with
the capabilities to run it. The technological
leap needed to achieve this is still huge (it
would take thousands of qubits), but we
need to prepare now.
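The heart of Shor's algorithm is a reduction from factoring to period finding: once the period r of aᵏ mod N is known, factors of N follow from a classical gcd computation. The toy sketch below (ours, for illustration) finds the period by brute force, which is precisely the exponential step that a quantum computer would replace with an efficient quantum Fourier transform.

```python
import math

def shor_classical_part(N, a):
    """Derive factors of N from the period r of a^k mod N.

    The period is found by brute force here; in Shor's algorithm, this
    exponential step is the quantum computer's job.
    """
    r = 1
    while pow(a, r, N) != 1:       # smallest r > 0 with a^r ≡ 1 (mod N)
        r += 1
    if r % 2 == 1:
        return None                # odd period: retry with another base a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None                # trivial square root: retry with another base a
    return math.gcd(y - 1, N), math.gcd(y + 1, N)

# Toy example: N = 15 with base a = 7 has period r = 4,
# and gcd(7^2 ± 1, 15) yields the factors 3 and 5.
print(shor_classical_part(15, 7))  # → (3, 5)
```

For a 2048-bit RSA modulus, the brute-force loop above is hopeless, which is exactly why Shor's polynomial-time quantum period finding is such a threat.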
As is often the case in quantum physics,
and this is a fact that should inspire current
generations, a discovery was born out of
a chance meeting between two different
fields.
The physicist Charles Bennett met
cryptologist Gilles Brassard at a
conference, and they published (at a
minor conference, because nobody
understood it at first) the first mechanism
for exchanging encryption keys by
quantum methods: BB84. This clever
algorithm came long before Shor's (which endangered the RSA encryption protocol),
but it was impossible to test it with the
technologies of the time. Several variants
have been proposed since the 1990s.
RSA encryption (after the initials of
its three inventors Rivest, Shamir
and Adleman) is an asymmetric
cryptographic algorithm described
in 1977. It is used every day online for
e-commerce transactions and, more
generally, for exchanging confidential
data (see diagram opposite). This
encryption is based on mathematical
(factoring) problems that are difficult to solve.
Diagram: Alice encrypts a clear text using Bob's public key; Bob decrypts the encrypted text back to clear text using his private key.
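To make the dependence on factoring concrete, here is “textbook” RSA with deliberately tiny primes, a classic illustrative example rather than the report's own material; real deployments use moduli of 2048 bits or more, together with padding schemes.

```python
# Textbook RSA with deliberately tiny primes (illustration only: real RSA
# uses ~2048-bit moduli and padding schemes).
p, q = 61, 53
n = p * q                   # public modulus; factoring n breaks everything
phi = (p - 1) * (q - 1)     # computable only if you know p and q
e = 17                      # public exponent, coprime with phi
d = pow(e, -1, phi)         # private exponent: e * d ≡ 1 (mod phi)

message = 65
ciphertext = pow(message, e, n)     # anyone can encrypt with the public key (e, n)
recovered = pow(ciphertext, d, n)   # only the holder of d can decrypt
print(message, "->", ciphertext, "->", recovered)
```

Everything rests on the secrecy of p and q: recover them from n, as Shor's algorithm would do efficiently on a large enough quantum computer, and the private exponent d falls out immediately.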
Diagram: over a quantum channel, Alice sends Bob photons polarized either horizontally–vertically or diagonally; Bob measures them through randomly chosen filters.
The multi-stage BB84 protocol allows Alice and Bob to establish a shared encryption key. Photons of a chosen polarization are transmitted
on the communication channel. In the first stage, Alice chooses a
random sequence of bits, transmits them as photons with a random
polarization, and Bob filters them according to a random reception
pattern. In the next step, Bob gives Alice this list of bases, which tells
her which ones he has chosen correctly. This gives a sequence of bits
from which a shared key can be deduced in a third stage, that is unless
there is a disagreement on the values of these bits, which is proof that
a hacker may have tried to eavesdrop on the channel (which will have
disturbed the state of the emitted photons).
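The stages described in this caption can be simulated in a few lines. The sketch below is ours: the 0/1 encoding of the bases and the absence of an eavesdropper are simplifying assumptions. It reproduces the sifting procedure and shows that Alice and Bob end up with identical keys of roughly half the transmitted length.

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

# Encoding assumption: basis 0 = rectilinear (H–V), basis 1 = diagonal.
n = 2000
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.randint(0, 1) for _ in range(n)]
bob_bases   = [random.randint(0, 1) for _ in range(n)]

# Measurement: a matching basis returns Alice's bit; a mismatched basis
# disturbs the state and gives a random result.
bob_bits = [bit if ab == bb else random.randint(0, 1)
            for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: Bob announces his bases, and both keep only the matching positions.
matching = [i for i in range(n) if alice_bases[i] == bob_bases[i]]
key_alice = [alice_bits[i] for i in matching]
key_bob   = [bob_bits[i] for i in matching]

print(f"sifted key: {len(key_alice)} bits out of {n} photons")
print("keys agree:", key_alice == key_bob)
```

An eavesdropper measuring in random bases would corrupt roughly a quarter of the sifted bits, which Alice and Bob would detect by publicly comparing a sample, exactly the disturbance mentioned above.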
The impossibility of cloning quantum
information is an unrivalled advantage for
secure communications. However, it is a
major drawback in fiber-optic networks
which need the signal to be repeated and
reamplified at regular intervals to overcome
losses in transmission. These repeaters interact with the quantum signal (causing decoherence) and prevent the quantum keys from being transmitted; they must be completely redesigned. Moreover,
since they are the weakest link in quantum
communications, they must be physically
incorruptible, and embedding them on
satellites and high-altitude platforms (e.g.
drones) is one solution being considered for
long-distance connections. This is a trusted
node architecture consisting of several QKD
systems chained together, providing a core
or access network.
For long-distance quantum networks,
fully quantum solutions designed around
repeaters using multi-mode quantum
memories distribute the entanglement along
the entire chain. Once the different sections
are ready, they are connected to each other
by so-called “Bell state” measurements,
until communication is entangled from
end to end, offering the desired possibility
of teleporting the qubits directly to their
destination, thus avoiding transmission
losses. There is currently significant activity
in the development of quantum memories
using a wide variety of physical platforms that
are both efficient (information is not lost) and
offer scalable solutions.
CiViQ
 Integration of CV-QKD in telecommunication networks to ensure long-term, reliable data confidentiality; development of flexible and cost-effective CV-QKD systems; study of new QKD protocols and theoretical approaches to open the way to future global quantum networks.
 21 partners
 €9,974,006.25
Projects funded as part of the Quantum
Technologies Flagship in the Communications
pillar for the period 2018-2021.
The race between countries exists, but also between
major companies, regularly bringing spectacular
breakthrough announcements. In this context, the
term quantum supremacy first appeared back in
2011. It designates the moment—the number of
qubits—beyond which no conventional computer could
compete with a quantum computer. This is thought by
some to be situated beyond 50 qubits—qubits with a
low error rate that are as stable as possible. Recent announcements have suggested that this threshold may have been reached, although often for a very short time, and many commentators remain skeptical. This is especially true since this ultimate
threshold will have to be reached for algorithms that
are genuinely useful.
The term quantum advantage might be preferable,
for three reasons. Firstly, it serves as a reminder that
Europe has a definite theoretical advantage and must
retain it. Secondly, there are concrete use cases where
deploying quantum solutions provides useful advanced
functionalities: for example, the quantum credit card (discussed later in this report), quantum computation delegated to the
cloud, or the design of complex secure communication
schemes.
The third good reason for using the term advantage
lies in the context of quantum cryptography and post-
quantum cryptography. These are two different things.
The former uses quantum properties to implement its (key distribution) protocols. Meanwhile, the algorithms in widespread use today (e.g. RSA) could be called into question by quantum computers, and the content of messages (which may be recorded today by intelligence agencies in the hope of being deciphered in the future) could be compromised.
Post-quantum cryptography thus consists in devising algorithmic encryption solutions based on problems that neither quantum nor classical computers are good at solving, and doing so right now to keep our advantage (by deploying them as soon as possible), while remaining efficient in encryption time. This is a
very exciting area of mathematical research, with a
wide variety of solutions, some of them hybrid, which
also need to be proven to be secure.
The quest for the quantum advantage may be more
fruitful than the search for quantum supremacy.
Supremacy or advantage?
Quantum Internet Alliance
 Construction of a quantum internet
that would enable communication
applications between any two points
on Earth.
 34 partners
 €10,406,113.5
QRANGE
 Improvement
of random number
generation technologies.
 8 partners
 €3,187,282.50
UNIQORN
 Development of low-
cost, robust, reliable and
small products for the
mass market.
 17 partners
 €9,979,905
Because trust and security are a bank’s main assets,
banks are studying the impact of the first applications
of quantum computing that could emerge in the next
few years. For BNP Paribas, quantum computing will
break the public key algorithm systems that are used
for key exchange, key distribution, authentication, and
electronic signature. This is why banks and insurance
companies must be proactive in protecting, and guaranteeing the authenticity of, information created today that has a long lifecycle. This
involves mapping current uses of cryptography
and anticipating the transition to post-quantum
algorithms. And they must prepare to capitalize
on the contributions of quantum infrastructure to
strengthen encryption key security, which is a major
strategic and economic challenge.
Quantum computing is particularly effective in
stochastic computing, the basis for financial models.
Four types of applications will emerge and become a
competitive advantage:
•	 filtering transactions on sanctions and embargo
lists;
•	 audits to fight fraud, money laundering, and
terrorism funding;
•	 asset-liability management by implementing
more relevant models to optimize financing;
•	 the pricing of structured products and derivatives
by valuing the cost of risk as accurately as
possible by increasing the number of paths in
Monte Carlo simulations or by evaluating more
elaborate scenarios.
For BNP Paribas, the emergence of quantum
computing poses three immediate challenges for
Human Resource management in the banking sector:
•	 anticipate the impact on changes to the
professions and skills of IT employees;
•	 reinforce expertise in quantum and post-
quantum encryption;
•	 attract new expert profiles by co-building
appropriate training courses and model skills
with schools, universities, and research
laboratories.
Quantum computing, as a complement to current
computing, can help banks serve customers better
while respecting the regulatory framework. A global
understanding of technology and its potential use
cases is needed to better grasp new cybersecurity
risks and tailor regulations to these new quantum
technologies.
Quantum’s impact on banks
The company viewpoint
BNP Paribas
Towards a quantum credit card?
What if the anti-cloning theorem of quantum systems could inspire totally
secure financial transactions? This is one of the areas of research for
quantum cryptography specialist Eleni Diamanti, a CNRS researcher at
Sorbonne University in Paris, who previously spent a few years at Télécom
Paris. She co-published in 2018 in the journal Nature “Experimental
investigation of practical unforgeable quantum money”, a description of
experiments carried out around the practical applications of the theoretical
work of Stephen Wiesner in 1983 and the anti-cloning theorem. This quantum
property prohibits a malicious third party from obtaining information about a
quantum system without disturbing it, and therefore without leaving a trace.
Avoiding transactions being disrupted or duplicated is one of the reasons for
the existence of cryptocurrencies, and quantum approaches could achieve
this without having to go through their currently energy-intensive mining
algorithms.
Another visible experiment in her laboratory is an optical bench implementing
QKD-CV key sharing (see page ), which is a protocol that does not require
the production of single photons, and allows the use of coherent detectors
and other common technologies. “This research is intended for use outside
the lab,” says Eleni Diamanti. “This is very important for imagining the future
of the quantum Internet, where we will need much more compact devices at
reasonable costs.”
Profile of a
quantum player
Eleni Diamanti
Problems and use cases
that benefit from quantum technologies
Each of the Quantum Flagship’s four pillars
is aimed at producing applications and
uses in the short or medium term. Sensors
and metrology (page ) and quantum
communications, discussed above, have
already led to commercial applications. For
quantum communications, the quantum
hardware needs to prepare and measure
only one qubit at a time, whereas quantum
computation and simulation depend on
hardware with several hundred or even
several thousand physical qubits, as we will
see later. However, this does not prevent us
from identifying right now the use cases and
problems that will be particularly suited to
quantum technology.
The first applications resulting from the
entanglement properties concern use cases
in transaction security, leading to a new
design phase for communication networks.
The quantum property of superposition of
states opens the way to other categories of
problems. As with the key sharing protocols that were redesigned at the dawn of quantum technology and that have since left the lab, a series of new algorithms has been imagined since the
1980s, promising to resolve in a reasonable
amount of time several types of problems
that are either difficult or impossible to solve
with classical computing architectures. These
are problems for which the resources needed
do not grow linearly with the input data and
which can benefit from parallel processing of
this data.
Smart management of transportation
data, modeling fuselages, autonomous
flight systems, management of
satellite constellations, etc.: a
manufacturer like Airbus sees the
capacities and forms of calculation
that quantum technology offers
as a natural extension of its High-
Performance Computing activities.
Direct physical problems
Hard-to-grasp physical problems are natural
candidates, especially those involving nuclear
physics. The aim is to take advantage of a
direct homology between these problems and
the quantum hardware equipment that would
be developed to solve them. Applications in
materials physics, fluid mechanics (airflow
on a fuselage, meteorology, etc.), molecular
mechanics, chemistry, pharmacology, and
synthetic biology are candidates, and some
have already shown results.
To understand the behavior of these
large sets of particles, we must solve the
Schrödinger equation for their wave function. However, the
classical approaches for doing so involve
approximations, which become greater and
more critical as the molecules studied and
their environment become more complex.
Using a quantum simulator (see page  and
on) offers the best approach to modeling the
problem.
Sampling problems
This is a class of problems consisting in finding
representatives among a large population of
data to recreate the population distribution.
Simulators based on quantum annealing
principles are well suited to this category.
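As a purely classical illustration of the task (the population and its distribution below are invented for the example, and no annealer is involved), sampling a modest number of representatives is enough to recreate a hidden distribution:

```python
# Classical illustration of a sampling problem: estimate a population's
# distribution from sampled representatives. Quantum annealers are studied
# for producing such samples from distributions that are hard to sample
# classically; here we simply sample with Python's standard library.
import random
from collections import Counter

random.seed(0)
population_dist = {"A": 0.5, "B": 0.3, "C": 0.2}   # true (hidden) distribution

samples = random.choices(list(population_dist),
                         weights=population_dist.values(), k=10_000)
estimate = {k: n / len(samples) for k, n in Counter(samples).items()}
print(estimate)  # close to the true distribution for each of A, B, C
```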
Optimization problems
This is without a doubt one of the main fields
of interest for quantum computing right now.
Optimization problems are challenges whose
goal is to find the best decision among a great
number of possible decisions. They are often
very concrete problems that we find in nearly
all fields of business. For example: find the
least expensive way to ship goods, identify
the most efficient way to extract resources
from a mine, find the most productive way
to allocate resources in a production chain,
find innovative ways to discover medicines,
or identify the best way to manage risk in
financial portfolios.
While the processing time needed for a
classical computer to provide acceptable
solutions quickly becomes prohibitive, since
the cost of solving optimization problems can
grow exponentially with their size, quantum
computing promises to provide an answer
much more quickly. The consequences of such
a change will be very beneficial: companies
that master these technologies, or that are
able to access them, will have an increased
capacity for optimization and improved
competitiveness.
Quantum machines can handle a
few types of problems that are rather
common in business. Let's take three:
•	 Optimizing journeys by
minimum cost (duration, length,
etc.). Use cases: scheduling,
supply chain, distribution and
energy, transportation and
traffic optimization. Industries
concerned: energy & public
services, transportation &
telecoms.
•	 The “knapsack” problem of
filling a volume with a maximum
number of objects without
exceeding a certain weight.
Use cases: filling containers,
allocating telecoms capacity,
sizing in transportation and
energy. Industries concerned:
all, especially transportation &
telecoms.
•	 The statistical evaluation of
scenarios, average values, and
trends. Use cases: risk analysis,
portfolio management, actuarial
science, oil exploration, weather
forecasting, maintenance,
fraud. Industries concerned: all,
especially banking and insurance.
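To see why such problems strain classical machines, here is a brute-force sketch of the knapsack problem from the sidebar (the weights, values, and limit are invented for illustration; real instances have far more items, and the 2**n subset count is exactly what becomes prohibitive):

```python
# Classical brute force for the knapsack problem: try all 2**n subsets of
# objects and keep the best total value that fits within the weight limit.
from itertools import combinations

weights = [4, 3, 2, 5]     # illustrative data, not from the report
values  = [10, 7, 5, 13]
limit   = 7

best = max(
    sum(values[i] for i in subset)
    for r in range(len(weights) + 1)
    for subset in combinations(range(len(weights)), r)
    if sum(weights[i] for i in subset) <= limit
)
print(best)  # -> 18 (the objects with weights 2 and 5, values 5 and 13)
```

Doubling the number of objects squares the number of subsets to examine, which is the exponential growth the text describes.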
Machine learning problems
Machine learning is a special class of
optimization problems, one that can greatly
benefit from parallel computation and that
has received a great deal of attention from
the scientific community in recent years,
making it a definite candidate for future
quantum machines. Hybrid approaches are
being studied that combine classical and
quantum computation on platforms pairing
quantum equipment with FPGAs (Field-
Programmable Gate Arrays: flexible
processors whose integrated circuit is
itself reprogrammable), GPUs (Graphics
Processing Units: processors dedicated to
image processing), and TPUs (Tensor
Processing Units: processors dedicated to
computation in artificial neural networks).
Certification challenges
There is also currently a great deal of
theoretical work being done to develop new
protocols and new approaches to system
certification, particularly in the field of
quantum security. In field deployments, it has
been shown that, even with a perfect proof of
key security, attacks exploiting the physical
devices themselves (e.g. by observing their
noise) can be carried out. This has given rise
to the term quantum hacking.
There are various approaches ranging from
work bringing together experts in quantum
and classical security to the development
of practical proofs of security and actions
at the more fundamental level of quantum
properties. Certification is also beginning to
take business considerations into account
so that devices and systems are certified
to industry standards. The standards
themselves represent an important challenge
that has begun to be addressed by research
projects involving industry and academia as
well as national metrology institutes.
“The combination of classical and quantum approaches often appears when we look
closely at quantum engineering projects,” notes Alice. “I doubt that is a coincidence.”
“Absolutely,” replies Alan, “and that’s also why quantum technologies need multidisciplinary
knowledge in mathematics, physics, and computing. But you should also keep in mind the
two following points. These hybrid approaches are needed right now to push back the frontiers of
knowledge in quantum technologies, little by little. In the future, they will always be present because
quantum computing is not suited to all algorithmic problems.”
Discovering her calling
Yuan Yao is a Chinese student who discovered
quantum physics at IMT Atlantique in a course
taught by Professor Francesco P. Andriulli. Her
passion for this mysterious field continued in
2017-2018 at the Technische Universität in
Berlin (Erasmus+ program) where she carried
out a project with Professor Eckehard Schöll on
“Non-Markovianity and Information Backflow”,
which is used to measure the information
coming from the interaction between the
quantum system and its environment.
“Like everyone else, I was quite disturbed
by what I was discovering in the quantum
field, and I had many questions. What is the
use of quantum physics? Is Schrödinger’s
cat alive or dead? How do we use quantum
entanglement in the real world? Where can I
continue my studies in this field?”
In 2019, after a meeting with Romain
Alléaume and Filippo Miatto at Télécom
Paris, she joined their quantum engineering
program (see page ) and is about to
start her PhD. Until then, she is doing an
internship as a quantum engineer at Total.
“During the six months I spent in the
quantum engineering program, I took
professional and specific courses on
trends in the quantum field. We also have
a research-oriented project. What I like in
this training course is that it brings together
students in the same field: we have courses
with students from the Optical Institute and
master’s students at Diderot University.”
At Total, her internship is tutored by Henri
Calandra and focuses on applying quantum
algorithms in chemistry to simulate molecules.
By using a quantum computer simulator
platform, she can see a quantum computer’s
results in real time. “Two algorithms are at
the heart of my research: VQE (Variational-
Quantum-Eigensolver) and Quantum
Approximate Optimization Algorithm (QAOA).
The first one is used to find the minimum
eigenvalue of an operator by combining the
quantum and the classical algorithms, and
it allows us to calculate larger molecules in
a reasonable amount of time compared to
classical methods. Several companies have
implemented it: IBM in Qiskit Aqua, Rigetti
in Grove, and Atos on its own
quantum simulator.” The QAOA algorithm will
be studied for its application in combinatorial
optimization.
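To make the hybrid quantum-classical idea behind VQE concrete, here is a deliberately tiny classical sketch (not Qiskit or Grove code; the one-parameter ansatz and the learning rate are our own illustrative choices): a classical optimizer tunes the parameter of a minimal "circuit" to minimize an operator's expectation value.

```python
# Toy version of the VQE loop: a 1-qubit ansatz |psi(t)> = cos t|0> + sin t|1>
# and the operator H = Z, whose minimum eigenvalue is -1 (eigenstate |1>).
# A real VQE evaluates the energy on quantum hardware; here we compute it
# directly, and a classical gradient-descent loop plays the optimizer.
import math

def energy(theta):
    a, b = math.cos(theta), math.sin(theta)   # amplitudes of |0> and |1>
    return a * a * (+1) + b * b * (-1)        # expectation value <psi|Z|psi>

theta, lr = 0.3, 0.4
for _ in range(100):
    # numerical gradient of the energy with respect to the circuit parameter
    grad = (energy(theta + 1e-5) - energy(theta - 1e-5)) / 2e-5
    theta -= lr * grad

print(round(energy(theta), 6))  # -> -1.0, the minimum eigenvalue of Z
```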
Full of enthusiasm, Yuan Yao is delighted
that more and more people are joining this
burgeoning field, spurring on quantum
machine learning, which will connect with
so many other fields.
Profile of a
quantum player
Yuan Yao
Quantum sensors and metrology
Since May 20, 2019, the International
System of Units has been based on seven
fundamental constants, including the Planck
constant h, which is now used to calculate
the precise value of the kilogram. Quantum
electronic devices (semiconductors and
superconductors) using the quantum Hall
effect, for example, have been used to
redefine these units.
Measurement—using sensors—is an
act at the heart of science. It can also be
found at the heart of business activities
that require metrological standards without
which exchanges and transactions would
never be accurate. Quantum sensors
are needed for navigation, underground
prospecting and mapping, materials analysis
and characterization, medical analysis as
well as the fundamental sciences, from the
subatomic scale—using localized spins—
to the planetary scale—based on photons.
Some platforms are already close to
commercial application; others need further
engineering to be fully viable.
A quantum sensor is a probe prepared in a
particular quantum state, which interacts
with the system to be measured, and whose
reaction (projection of the values to be
measured over space and disappearance
or decoherence of the probe) allows us to
estimate what we want to measure. The
number of particles in the probe and the
measurement time indicate the sensor’s
effectiveness. While the precision of
conventional sensors improves as the square
root √N of the number of particles involved,
the best quantum sensors can in principle
achieve a precision improving as N.
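In symbols (these are the standard results of quantum metrology, stated here for clarity): for a probe of N uncorrelated particles, the estimation uncertainty Δθ shrinks at best at the standard quantum limit, while entangled probes can in principle reach the Heisenberg limit:

```latex
\underbrace{\Delta\theta \;\propto\; \frac{1}{\sqrt{N}}}_{\text{standard quantum limit}}
\qquad \longrightarrow \qquad
\underbrace{\Delta\theta \;\propto\; \frac{1}{N}}_{\text{Heisenberg limit}}
```

For large N, the gap between the two scalings is what makes entangled sensors worth the engineering effort.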
Atomic clocks
Today, the precision of time measurement
has reached 1 second over 1 billion
years. Optical clocks currently under
study range from neutral atoms in
optical lattices to highly charged ions
and even nuclear transitions. Neutral
atoms offer a high signal-to-noise ratio but
are generally more sensitive to external fields
and collision offsets, requiring careful control
of their environment.
Atomic sensors
This refers to atomic interferometry,
which exploits the sensitivity of
quantum superposition to create
ultra-precise sensors to measure
gravity, rotation, magnetic fields,
and time, surpassing their best
conventional counterparts, and enabling
inertial navigation without GPS. These sensors
are ready to be translated into commercial
products, such as magnetometers using
diamond’s “NV” color center. Current atomic
gravity sensors offer absolute measurements
at the nano-g level or gravity gradient
sensitivities exceeding a change of 100 pico-g
at a distance of one meter.
Using the extreme sensitivity of coherent quantum systems
to external perturbations to drastically improve the
performance of measurements of physical quantities.
Optomechanical sensors
These are miniature mechanical systems
(MEMS, NEMS, and their derivatives) that are
currently integrated in chips and embedded.
Over the last decade, a technological and
scientific paradigm shift has taken place
around the optical and quantum control of
these devices. They can now be measured
and controlled at the quantum level by coupling
them to optical cavities or superconducting
microwave circuits. Current research in
this field explores the physical limitations of
hybrid opto- and electromechanical devices
for converting, synthesizing, processing,
detecting, and measuring electromagnetic
fields from radio and microwave frequencies
up to a THz. Fields of application include
medicine (magnetic resonance imaging),
security (radar and THz surveillance),
timekeeping, and navigation.
Quantum imaging
This involves acquiring images with an optical
resolution beyond standard
wavelength limits with low
light levels or with strong
background lighting.
Detecting details with
this level of accuracy has
immediate applications in the fields of
microscopy, pattern recognition, image
segmentation, and optical data storage.
Correlations between beams of quantum
light allow for new imaging modes such as
“ghost imaging” in which the image of an
object illuminated by one beam is acquired
by a camera that receives a different beam
that has not reached the object.
Spin qubit detection
Using spin qubits for detection is a relatively
new field. Detecting magnetic fields, which
are of crucial importance for chemistry,
biology, medicine, and material sciences, is
naturally done with spin sensors, but these
have also shown their ability to measure
other quantities, such as temperature, electric
fields, pressure, force, and near-field optics
(with technologies based on diamond
and silicon carbide defects). These
sensors derive their advantage and
robustness from the spins’ long-
term quantum coherence.
All this research has led to progress in the
creation of small quantum devices, which is
of primary importance for future engineering
concerns in the design of quantum computers.
MetaboliQs and
ASTERIQS rely on
doped-diamond
technologies
iqClock
 The development of affordable onboard
ultra-precise optical clocks.
 12 partners
 €10,092,468.75
MetaboliQs
 Molecular sensors to
diagnose cardiovascular
diseases.
 7 partners
 €6,667,801.25
macQsimal
 All types
of sensors
(autonomous
vehicles, medical
imaging, etc.).
 13 partners
 €10,209,943.75
ASTERIQS
 Sensors for electric and
magnetic fields, pressure,
temperature, etc.
 23 partners
 €9,747,888.75
Projects funded as part of
the Quantum Technologies
Flagship in the “Quantum
sensors and metrology” pillar
for the period 2018-2021.
Qubits:
the building blocks
of quantum computing
It is often said that the qubit is to quantum computing
what the bit is to classical computing, but the analogy
has its limits. For example, saying that a qubit ∣𝛙⟩ can
take any value between ∣0⟩ and ∣1⟩ might suggest that
this is the classical interval between 0 and 1, whereas it
is a combination of two quantum states (in the case of a
bipartite quantum system), a base state written ∣0⟩ and
an excited state ∣1⟩, and that this combination
∣𝛙⟩ = 𝑎∣0⟩ + 𝑏∣1⟩ follows strict rules written into
quantum postulates, such as ∣𝑎∣² + ∣𝑏∣² = 1,
𝑎 and 𝑏 being complex numbers. A
more realistic visual representation is the Bloch sphere
(opposite), where the qubit is like a vector of unit length
starting at the center of the sphere and ending at a point
on its surface.
The analogy is also misleading if we consider that bits
can be assembled into bytes and represent information
by themselves, be stored in memory, be processed by
the gigabyte, be copied, etc. For example, while it is
possible to transport the state of one qubit to a second
qubit (teleportation), the first one is reset by this action,
so it is not a copy in the classical sense. In the same way,
qubits’ lifetimes (due to decoherence) have an essential
importance that bits do not have.
On the other hand, as in classical computing, qubits
(initialized in their state ∣0⟩) can be manipulated to use
them as a computing medium according to a notion of
circuit and (a succession of) gates. These gates can
act on one, two, or more qubits, and are assembled
with great sophistication. For example, the Hadamard
gate acts on a single qubit and creates a superposed
state: a direct measurement at the output of this gate
gives an equal probability of finding a state of 0 or 1.
The expertise lies in assembling these gates (both
algorithmically and physically) in such a way that the
output measurement has a meaning appropriate to the
problem being studied.
The Bloch sphere: the basis states ∣0⟩ and ∣1⟩
lie at the poles, and the qubit ∣𝛙⟩ is a unit
vector from the center of the sphere to a
point on its surface.
The “CNOT” gate is a so-called
controlled gate. It is designed to act on
two qubits, with a NOT (state) inversion
operation being performed on the second
if and only if the first is ∣1⟩. It acts like a
controller of the second. This gate allows
us to generate entangled states. A dual
matrix-and-graphic formalism allows
the gates (above, the CNOT gate) to be
represented and made more familiar.
They are also basic building blocks of
quantum computing. Depending on the
type of the qubit’s physical support, some
of them can be easily produced, and
others cannot.
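The Hadamard-then-CNOT sequence described here can be sketched numerically without any quantum hardware. The snippet below (a toy simulation in plain Python, with helper names of our own invention) builds the entangled Bell state and checks its measurement probabilities:

```python
# Toy state-vector simulation of a 2-qubit circuit: a Hadamard gate on the
# first qubit, then the CNOT gate shown above. Starting from |00>, this
# produces the entangled Bell state (|00> + |11>) / sqrt(2).
import math

def apply(gate, state):
    """Multiply a gate matrix by a state vector (plain lists of numbers)."""
    return [sum(gate[r][c] * state[c] for c in range(len(state)))
            for r in range(len(gate))]

h = 1 / math.sqrt(2)
# Hadamard on the first qubit of a 2-qubit register (tensor product H (x) I)
H_I = [[h, 0,  h,  0],
       [0, h,  0,  h],
       [h, 0, -h,  0],
       [0, h,  0, -h]]
# The CNOT matrix: flips the second qubit iff the first (control) is |1>
CNOT = [[1, 0, 0, 0],
        [0, 1, 0, 0],
        [0, 0, 0, 1],
        [0, 0, 1, 0]]

state = [1, 0, 0, 0]                    # both qubits initialized to |0>
state = apply(CNOT, apply(H_I, state))

probs = [abs(a) ** 2 for a in state]    # Born rule: |amplitude|^2
print(probs)  # ~[0.5, 0.0, 0.0, 0.5]: only |00> or |11> is ever measured
```

Measuring one qubit of this state instantly fixes the other, which is the entanglement the caption describes.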
When talking about qubits, it is important to keep
in mind their mathematical representation, the
realities of their physical implementations, and
the particular properties of the quantum states on
which they are based.
The CNOT gate in matrix form:
[ 1 0 0 0 ]
[ 0 1 0 0 ]
[ 0 0 0 1 ]
[ 0 0 1 0 ]
QUANTUM
ENGINEERING
“What strikes me,” continues Alice, beginning to take the full
measure of the quantum revolutions, “is that scientists often
made hypotheses and designed thought experiments at one
point in time, but it took many years before we could verify them
through experiments and then transform them into practical
applications.”
“Indeed,” adds Alan, “most of the time the technology wasn’t quite ready
yet, or no technology was even imagined. It’s almost a quantum constant in
itself, and concerning quantum computation, it’s exactly what’s happening
right now. To implement qubits, some manufacturers are even betting on
quantum particles that haven’t even been discovered yet.”
“Qubits: I think that’s a topic I will need to spend some time on, and not
compare them to computing bits too easily.”
“You’re right there, too. There are many physical ways to make qubits, and
a variety of quantum states on which to build the superposition of states
for which they are so interesting. But also, it may be necessary to use
several physical qubits to obtain the equivalent of one logical qubit. But
whatever the case, the quantum algorithms are already here, and have
been for a long time in the case of Shor’s algorithm, and we mustn’t delay;
we must invest in simulators and then in quantum computers. To build
quantum engineering, we will need to take on a series of both scientific and
technological challenges.”
34 35—Quantum engineering
Quantum simulation
The idea of designing quantum simulators
dates back to 1982, when Richard Feynman
pointed out that simulating the evolution
of a quantum system is a very difficult
mathematical problem, because the size of the
corresponding Hilbert space increases
exponentially with the number of quantum
states, and that only another, controllable
quantum system could solve it.
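Feynman's observation can be made concrete with a quick back-of-the-envelope calculation (the figures are illustrative, assuming 16 bytes per complex amplitude):

```python
# Storing the full state of an n-qubit system classically takes 2**n complex
# amplitudes, so the memory needed explodes long before n reaches the sizes
# quoted for quantum simulators.
for n in (10, 30, 50):
    amplitudes = 2 ** n
    gib = amplitudes * 16 / 2 ** 30   # 16 bytes per double-precision amplitude
    print(f"{n} qubits: {amplitudes} amplitudes (~{gib:,.0f} GiB)")
```

Thirty qubits already require about 16 GiB; fifty qubits require roughly 16 million GiB, which is why a controllable quantum system is the only practical "computer" for such problems.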
Quantum simulators are physical quantum
devices that are precisely prepared or
engineered to study an interesting property
of a complex and interacting quantum or
classical system. Used to address problems in
the quantum world, they could yield previously
unknown knowledge, creating a feedback loop
so we can design increasingly sophisticated
equipment. In any case, we must choose
interesting, complex problems that can be
simulated by physical systems that are easier
to create, prepare, and control, and for which
the results can be obtained by measurement.
The term quantum simulator covers several
distinct varieties:
1.	 static simulators that study the
properties of interactive systems such as
the characteristics of base states,
2.	 adiabatic computers, which are applied
to classical optimization problems
(these machines get their name from
the so-called adiabatic processes, which
are processes in which the external
conditions of a system change slowly
enough for the system to adapt),
3.	 dynamic simulators that study the
properties of systems far from their
equilibrium.
There are two possible implementations:
1.	 analogue, which reconstitute a quantum
system’s temporal evolution under very
precise control conditions,
2.	 digital, based on quantum circuits doing
quantum computation (see following
pages).
Using well-controlled quantum systems
to mimic the behavior of other quantum
or complex systems that are less
accessible to direct study.
A 512-qubit chip from D-Wave Two
(Vesuvius), 2013.
Credit: Steve Jurvetson, Flickr. CC BY 2.0.
Qombs
 A new generation of simulators based
on ultra-cold atoms that can be used
to design quantum cascade lasers
characterized by entangled emissions
within the frequency comb.
 10 partners
 €9,335,635
PASQuanS
 The next generation of quantum
simulators beyond 1,000 atoms or
ions, fully programmable.
 15 partners, including Atos BULL
and French startup Muquans
 €9,257,515
There are many present challenges: identifying
models that are difficult for classical simulations
to compute but interesting and important from
a physical point of view; developing validation
and verification tools; and designing experimental
setups and implementations of sufficient size
while maintaining a high degree of qubit control.
One major challenge lies in determining if a
quantum simulation was executed correctly.
The goal is to solve problems that are
inaccessible to classical methods: a quantum
simulator executes tasks that we cannot track
effectively, while we want proof that it was
carried out with precision. One approach
is to assume that there are still appropriate
parameter sets for which these models
will become totally—or at least partially—
accessible to classical simulation.
Projects funded as part of the Quantum
Technologies Flagship in the Simulation
pillar for the period 2018-2021.
Some images depict quantum
equipment (here the D-Wave
2000Q™) that is impressive given the
cooling engineering required to reach
an operating temperature close to
absolute zero. However, they must not
lead us to believe that all quantum
hardware will be massive. Other
kinds of physical qubits that operate
at room temperature could allow for
miniaturization.
Suggested reading: Introduction to
the D-Wave Quantum Hardware, on
the D-Wave website.
With the race for quantum supremacy, quantum
computer announcements come in quick succession,
and it is often difficult to keep track. The DiVincenzo
criteria (see opposite), which no qubit currently fulfills,
are low-level criteria that provide an initial assessment.
In addition to the distinction between analogue
approaches (which are mainly used for simulations)
and digital approaches (by circuits and gates), we
must also distinguish between a universal quantum
computer, onto which we can port all the algorithms
we want, and a dedicated quantum computer that is
specialized in certain kinds of problems, or even just
a single task. The latter is what is appearing
now. Furthermore, their qubits are not yet pure enough
to avoid computation errors. Announcements should
be studied in light of their error rates. Given qubits’
current stability, upcoming announcements with figures
reaching a hundred qubits will mean qubits without
error correction.
A higher-order certification will take the following points
into account:
A.	 Base functions: are the DiVincenzo criteria fulfilled
for at least two qubits?
B.	 The quality of operations: has the error rate of
all relevant operations been measured? Are they
compatible with the error correction thresholds?
Have all the ingredients of a fault-tolerant
architecture been demonstrated?
C.	 Error correction: Has quantum error correction
been demonstrated, and is it effective? Are the
logical error rates less than the physical error
rates?
D.	 Fault-tolerant operations: have the operations
on the logical qubits been implemented so as to
tolerate faults? Have these objectives been reached for a
universal set of gates (Clifford + T)?
E.	 Algorithms: have complex, fault-tolerant
algorithms and operations been implemented?
Quantum computation
and computers
The clever use of quantum properties—such as superposition—to
manipulate quantum systems that are seen as elementary parts of an
algorithm, greatly accelerating it.
For more information:
« Entwicklungsstand Quantencomputer », Bundesamt für Sicherheit in der Informationstechnik (German Federal
Office for Information Security). May 2018. 238 pages.
IBM Q quantum computer 2018
Credit: Lars Plougmann, Flickr. CC BY-SA 2.0.
The criteria that David DiVincenzo,
then at IBM, proposed in 2000 as conditions
for the practical realization of a quantum
computer are as follows:
1. The number of qubits must be known
precisely.
2. The operator must be able to initialize
the qubit register in a stable state.
3. The coherence time must be
sufficiently long.
4. A universal set of quantum gates must
be available.
5. A qubit’s state must be measurable.
Rethinking algorithms
OpenSuperQ
 Creation of a quantum computer
based on superconducting circuits.
 10 partners
 €10,334,392.50
AQTION
 Creation of an ion-based quantum
computer.
 9 partners
 €9,587,252.50
Projects funded as part of the
Quantum Technologies Flagship in
the Computation pillar for the period
2018-2021.
Performing a quantum calculation requires steps which
are not common on a classical computer. First, we
must prepare all the qubits needed for the computation
by placing each of them in an initial quantum state
that is appropriate to the problem. This set of qubits
then undergoes a sequence of operations through the
quantum gates while their states change. Only at the
end do we take a measurement of the probability of
the final state which gives us the desired solution—a
probabilistic one, and not a deterministic one as in
classical computing. So, it is reasonable to perform
these steps several times to increase the reliability of
the results.
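The repetition step can be pictured with a toy classical stand-in for the device (the outcomes and probabilities below are invented for illustration; a real machine's distribution comes from the circuit itself):

```python
# A quantum computation returns one sample from a probability distribution,
# so the same circuit is run many times ("shots") and the answer is read off
# the statistics. Here a fake device returns the correct outcome 3 with
# probability 0.6 amid noisy outcomes 0, 1, and 2.
import random
from collections import Counter

random.seed(1)

def one_shot():
    return random.choices([0, 1, 2, 3], weights=[0.1, 0.1, 0.2, 0.6])[0]

counts = Counter(one_shot() for _ in range(2000))
answer, _ = counts.most_common(1)[0]
print(answer)  # the most frequent outcome, taken as the result: 3
```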
Classical computer architectures prepare the
computations and measure the results. Therefore, a
quantum computer is intrinsically hybrid.
The differences in nature between classical computing
bits and quantum qubits (see page ), including
the impossibility of copying a given qubit’s state, for
example, have led to a profound rethinking of the way
in which calculations are performed on data, which
has resulted in the modernization of some classical
algorithms, targeting classical architectures and
improving the knowledge of current computing.
As we move toward universal quantum computers, an
article published in early 2018 by physicist John Preskill
introduced the concept and term NISQ (Noisy Intermediate-
Scale Quantum), which has been widely circulated since
then. NISQ is defined as the current and future class of
computers with an intermediate number of qubits (50 to a
few hundred) that are controlled imperfectly and subject to
a remaining quantum noise that is sufficiently acceptable to
provide scientific results that are worthy of interest.
« Quantum Computing in the NISQ era and beyond », John Preskill,
January 2018.
A milestone called NISQ
Profile of a
quantum player
Adrien Facon
Towards post-quantum cryptography
While quantum technologies offer tremendous opportunities, they also pose a threat
to the security of today’s information systems. Today’s information systems are largely
based on solving problems that are said to be computationally difficult for a classical
computer as we know it today—factoring large numbers (for RSA), discrete logarithm
problems (on elliptic curves)—which will not resist attacks from quantum machines
for very long. “We need to find new problems that are difficult to solve, ones that are
fundamentally complex,” explains Adrien Facon, Director of Research and Innovation
from 2016-2019 at Secure-IC, a leader in integrated security. “That’s what post-quantum
cryptography is all about.”
After studying at the École Polytechnique and doing research at the Harvard University
Physics Department, Adrien continued his studies at the École Normale Supérieure in
Paris before joining the CQED (Cavity Quantum Electrodynamics) research team at the
Collège de France. He defended his doctoral thesis entitled “Chats de Schrödinger d’un
atome de Rydberg pour la métrologie quantique” (Schrödinger cat states of a Rydberg
atom for quantum metrology) under the supervision of Professor Serge Haroche. Adrien’s
work is a major step forward in the field and was published in the journal Nature.
Adrien Facon, who moved on to Innovative Industry after completing his thesis, is now
project leader of the French national post-quantum cryptography program (PIA RISQ)
which brings together leading French players from academia (ENS Paris, CEA, Inria,
UPMC, PCQC, University of Versailles and IRISA), industry (Secure-IC, Thales and
formerly Gemalto, Orange, CS, Airbus, and CryptoExperts), and ANSSI (the French
National Cybersecurity Agency). The project submitted more than a quarter of
the candidates selected for Phase 2 by the United States’ NIST as part of an international
effort to develop new standards for public key cryptography, digital signatures, and key
exchange. “The gradual transition to post-quantum encryption systems, particularly via
hybrid crypto-systems, will permanently safeguard the confidentiality and integrity of our
communications, starting today,” the researcher adds.
Since September 2017, Adrien Facon has also been a part-time lecturer at Télécom
Paris and an associate researcher in the IT department of École Normale Supérieure
within the Information Security group headed by David Naccache.
A sensitive electrometer based on a Rydberg atom in a Schrödinger-cat state. July 2016.
https://www.nature.com/articles/nature18327
Regroupement de l’Industrie française pour la Sécurité Post-Quantique  : http://www.risq.fr/
2D-SPIC
 Single-photon optical
technologies on 2D
substrates.
 5 partners
 €2,976,812.50
S2QUIP
 Integrated circuits to provide
quantum light sources on demand.
 8 partners
 €2,999,298.75
QMICS
 Microwave quantum local area
network and microwave single-
photon detector.
 8 partners
 €2,999,595
PhoQuS
 Developing a quantum
simulation platform based
on quantum fluids of light.
 19 partners
 €2,999,757.50
MicroQC
 Demonstrating microwave-
controlled 2- and n-qubit gates
that are fast and fault-tolerant
(trapped ions).
 5 partners
 €2,363,343.75
SQUARE
 Research into using rare earth
ions as qubits for high-density
integration.
 8 partners
 €2,990,277.50
PhoG
 Developing photon
sources from non-classical
states of light.
 5 partners
 €2,761,866.25
Fundamental research,
more necessary than ever
Projects funded as part of the Quantum
Technologies Flagship in the “Scientific
Fundamentals” cross-cutting domain for
the period 2018-2021.
The list of Quantum Flagship projects that fall under the
“Scientific Fundamentals” cross-cutting domain speaks
for itself. As ever, this research is not only abundant, but
essential to continue paving the way for industrial solutions.
This includes finding new ways of producing quantum
systems that are easy to manipulate (the space required,
production and operating temperature, etc.), sufficiently
stable, affordable, available in large quantities, etc.
Here, both the basic physics research and the need for
multidisciplinary approaches are key. Several research
teams from IMT Atlantique and its partners, both academic
and industrial, are participating in this movement.
In the Subatech laboratory in Nantes, researcher Audrey
Francisco-Bosson uses the ALICE detector at
CERN's Large Hadron Collider to probe the very
depths of matter. In particular, she is working on the J/ψ particle,
to better understand the quark gluon plasma’s behavior.
This research allows us to test the properties and laws of
quantum chromodynamics—the theory that describes the
strong interaction between quarks—and to check whether
the model we use to describe matter is correct.
In Brest, Vincent Castel and his team are working on the
magnetic properties of synthetic Yttrium Iron Garnet (YIG),
which allows photons and magnons to be strongly coupled,
creating magnon-polaritons, hybrid light-matter quasi-
particles. This could pave the way for memories and sensors
that can be used in quantum computers. Quantum is not only
photonic; it is also magnetic!
The quest
for ideal qubits
Qubit platforms pictured: Josephson-effect
superconducting junctions, neutral atoms,
trapped ions, diamond NV centers.
Understanding the differences between
conventional computing bits and qubits
(page ) is necessary but not enough. The
qubits themselves differ in nature, and to
be able to find our way in the landscape
of existing or expected quantum machines,
the promised computing capacities, the
universality or not of these calculations,
the number of qubits needed to reach any
performance threshold, or the promises
of miniaturization, it is important to
understand the different aspects by which
we characterize qubits.
Two common questions
“Which qubit is the best?” is not the right
question. There is no intrinsically ideal
qubit, but qubits that are suited for what
we expect from them, for example having
the longest possible decoherence time
(stability) or able to be “moved” over a
certain distance.
“Hundreds or thousands of qubits?” We
need to distinguish between two things:
a single logical qubit can require deploying
dozens of physical qubits to account for
physical limitations, and the qubits in
2,000-qubit simulators are far less
connected to each other than those in
machines with a few dozen qubits.
Managing errors
The results measured on quantum
observables are not always reliable, and
a vast field of research has developed
around quantum error correction.
The methods consist in ensuring
redundancy of the physical qubits and then
evaluating an average of their behavior,
as well as applying classical numerical
techniques in parallel.
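The redundancy-and-averaging idea can be caricatured with a classical three-bit repetition code and majority voting. This is only a sketch: real quantum codes measure parity syndromes without reading out the data qubits directly, and the error probability used here is an arbitrary assumption:

```python
import random

def encode(bit, copies=3):
    """Redundantly encode one logical bit across several 'physical' bits."""
    return [bit] * copies

def noisy_readout(bits, flip_prob):
    """Flip each physical bit independently with probability flip_prob."""
    return [b ^ (random.random() < flip_prob) for b in bits]

def decode(bits):
    """Majority vote: correct as long as fewer than half the bits flipped."""
    return int(sum(bits) > len(bits) / 2)

random.seed(0)
trials = 10_000
flip_prob = 0.1
errors = sum(decode(noisy_readout(encode(1), flip_prob)) != 1
             for _ in range(trials))
# Majority voting over 3 noisy copies turns a 10% raw error rate into
# roughly 3p^2 - 2p^3, about 2.8%: fewer physical errors survive decoding.
print(errors / trials)
```

The same logic explains the overhead mentioned above: the more unreliable the physical qubits, the more copies (and classical post-processing) a single logical qubit costs.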
Quantum volume
In the quest for qubit stability, what
really counts is the ratio between the
coherence time (currently a few hundred
microseconds on superconducting
circuits) and the number of operations
that can be performed (from 1,000 to
10,000, compared to only a single one
20 years ago). More than the number
of qubits in a quantum machine, what
counts is what the community now calls
quantum volume: the effective volume
of computation that can be performed
before errors hide the result.
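A back-of-the-envelope sketch of these figures of merit, using the timings quoted above; the community's actual quantum-volume benchmark also folds in gate fidelity and connectivity, so treat this as an order-of-magnitude caricature:

```python
def ops_before_decoherence(coherence_us, gate_ns):
    """How many sequential gates fit inside the coherence window."""
    return int(coherence_us * 1_000 / gate_ns)

def effective_volume(n_qubits, coherence_us, gate_ns):
    """Crude quantum-volume-style score: the largest 'square' circuit
    (width == depth) that finishes before decoherence, reported as a
    power of two. The real benchmark also accounts for gate errors and
    connectivity; this is only an order-of-magnitude sketch."""
    depth = ops_before_decoherence(coherence_us, gate_ns)
    side = min(n_qubits, depth)
    return 2 ** side

# A few hundred microseconds of coherence against ~100 ns gates allows
# a few thousand operations, the range quoted in the text.
print(ops_before_decoherence(200, 100))   # -> 2000 operations
print(effective_volume(5, 200, 100))      # -> 32
```

Note how the score saturates: with thousands of operations available, the qubit count (here an assumed 5) becomes the limiting side of the square.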
The search for
entanglement and
connectivity
An important aspect of a quantum
machine’s architecture is the degree of
connection a qubit can have to become
entangled with other qubits: can it do this
with just its neighbors, or with qubits that
are much more distant? This underlying
geometry has repercussions on these
architectures’ scalability: does adding a
qubit only allow for a quadratic increase
in performance (as with D-Wave’s 2048
qubit machines), or does it allow for an
exponential increase in performance?
In this regard, the specialized startup IonQ
highlights that trapped-ion technology
allows for total connectivity, in addition
to a rather long coherence time that
requires less error correction.
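The gap between sparse and total connectivity can be illustrated by counting the qubit pairs that can be entangled directly, without routing through intermediaries. This is a simplified model (real devices also differ in gate fidelity and routing cost), and the 4x4 grid is an illustrative layout, not any vendor's actual topology:

```python
def grid_pairs(side):
    """Directly couplable pairs on a side x side nearest-neighbour lattice:
    horizontal plus vertical edges."""
    return 2 * side * (side - 1)

def all_to_all_pairs(n):
    """Directly couplable pairs when every qubit talks to every other,
    as in fully connected trapped-ion registers."""
    return n * (n - 1) // 2

# For 16 qubits: a 4x4 grid offers 24 entangling pairs, while full
# connectivity offers 120. On the grid, remote pairs must be brought
# together with chains of SWAP gates, which deepens the circuit.
print(grid_pairs(4), all_to_all_pairs(16))
```

The quadratic-versus-exponential question raised above hinges on exactly this kind of geometric constraint: what an added qubit can reach determines how much computational structure it contributes.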
THE QUANTUM ADVANTAGE: Challenges for industry and training
It is essential to begin training future quantum specialists right now to reinforce and expand upon France and Europe’s advantage in quantum technology. Future Nobel Prize winners, visionaries, and investors will emerge from their ranks.
They must think outside the box and acquire skills in a multitude of fields to design hybrid quantum-classical solutions. This report aims to convince you of this. On behalf of the Foundation, we hope you enjoy reading it. September 2019

A reading path tailored to your level of knowledge!
5. What makes something quantum? Beginners: explore the basics and genesis of the quantum at the turn of the 20th century.
17. Quantum realities. Advanced: plunge into the concrete applications of quantum physics.
35. Quantum engineering. Experts: research, innovation and training, explore a field at the heart of today’s industrial issues.
FOREWORD

If there is one question that arises as soon as we talk about quantum technology, it’s “when will a universal quantum computer be readily available?” This is a machine that would be able to run all the algorithms that use quantum formalism—there are currently around sixty classes of these types of algorithms—not one just specialized in a few of them. Optimists, pessimists, and skeptics exchange their views in publications and symposia. Start-ups and industry are betting on a variety of physical technologies to create the basic building blocks of quantum computing, the famous “qubits”. The potential applications are proliferating; some have already been made possible by quantum simulators and emulators, but traditional electronics technology and algorithmic research are still relevant, which means that several timescales are overlapping and are not stable. However, everyone agrees that the current period is teeming with activity, making it an important stage that is not to be missed.

As they did for artificial intelligence, countries such as the United Kingdom, Canada, China, and the United States drafted their “quantum technology plans” years ago; France’s plan is expected after Summer 2019. And, just as for artificial intelligence, Europe is the scale that counts, with a sufficiently high level of skills and knowledge. With the Quantum Flagship (see page ), Europe has begun to provide the resources (in funding and ecosystem organization) to achieve its ambitions. For European players in quantum technologies, this region that gave birth to most of the components of the two quantum revolutions has every legitimacy and must maintain its leading position.

The search for the quantum advantage

Supremacy—where conventional computers are incapable of competing with a quantum machine on the types of problems for which the latter is well-suited—or advantage—where a quantum machine is efficient and affordable enough to make using it economically viable? During a round-table discussion organized on June 20, 2019 by Bpifrance as part of an international conference attended by more than 200 people, participants agreed on the idea that “defining quantum advantage is probably a moving target”. The increase in supercomputer power and the efficiency of new algorithms are pushing back the prospect of quantum advantage. But the perceived advantage is not just computational. Compared to supercomputers, quantum computers have a much smaller footprint, mass, and power consumption, which are three significant advantages.

55. Postscript. News from the quantum revolution between September 2019 and January 2020, specially for the English publication of this report.
The quest for a quantum computer opens new doors

We should not allow quantum computing and the quest for a quantum computer to make us think that we are heading for a “quantum everything” that wins out over the rest. The reality is more nuanced. Future supercomputers will be hybrid machines, so a major player like Atos can reasonably claim, through its enthusiastic and pioneering chairman Thierry Breton, that quantum accelerators will be on the market in the next five years. These components are small “co-processors” that will bring the advantages of quantum computing to traditional machines. “The landscape of the quantum advantage is being explored step-by-step, between hardware and software at the same time”, emphasizes CNRS chairman Antoine Petit. Like other major research organizations in France, the CNRS is undertaking a transversal approach that ranges from fundamental research to engineering. And each new type of hardware that is made available—whether quantum particle sources, repeaters, or sensors—and each new algorithm that uses quantum formalism opens the way to new applications.

With the shift from scientific research to concrete use cases over the past few years (see page ), the industrial stakes are being taken seriously. A study published by McKinsey & Company in June 2019 pointed out that the value chain in quantum computing is divided into three distinct groups: a third in hardware (mainly in the United States), half in software (the majority in startups), and a fifth in facilitators. Among the latter, note the crucial importance of supplying the small quantum hardware discussed above. The funding is following, whether driven by public policy or private investors (for example, Quantonation in France). In mid-June 2019, the United Kingdom announced a £153 million program to help bring quantum products to the market, backed by £205 million from industry, bringing the sums raised to more than £1 billion.

For scientists, industry funds are a model where everyone wins. Industry benefits from scientists’ advice and ideas, and in return scientists obtain the major technological equipment they need.

Developing ecosystems and training

To take on the challenges of quantum technologies, the ecosystem must be mobilized as well. There are a great many initiatives in France, but they remain rather dispersed at regional level. The Paris Center for Quantum Computing, SIRTEQ in Ile-de-France, and cities like Grenoble and Montpellier are now joining forces, driven jointly by scientific teams and companies like Atos, IBM, and Microsoft. Bilateral cooperation agreements are being set up between countries, too. Things look to be speeding up: according to the participants of the Bpifrance conference in June 2019, such an event would have been unthinkable one year earlier. Some people are beginning to imagine a “quantum Airbus” in Europe, which seems all the more logical given that, not only does the aeronautics industry need the capacities and forms of computation that quantum technologies bring, but also because satellites are natural entities for a future “quantum internet”, a network that allows for both quantum communications and links to heavier quantum equipment.

While currently an “emerging technology”, quantum could one day become a “general purpose technology”. Combining science and engineering, it requires the public and decision-makers to be informed so they can understand the stakes, raise ambitions, and make the right choices. This new way of seeing the world also requires investment in training at all levels, from secondary school through university, from initial training to job conversion, from the wider public to specialists, if we want to give ourselves every chance of conserving the quantum advantage.
Find all the contributors as well as the glossary, pages -. Follow Alice and Alan’s conversation in the green boxes.

WHAT MAKES SOMETHING QUANTUM?

Alice is a young woman driven by an unquenchable thirst for knowledge founded on scientific progress, but who also appreciates a bit of mystery and marvel. She has traversed time and tales, explored disciplines and countries. In the 19th century, she traveled both in England and in a country of the absurd, where those she met led her to question her understanding of the world’s reality... In the early 20th century, she cultivated a passion for physics, time, and space by reading about great discoveries in atoms and things microscopically small. In the late 70s, she entered the communications era, having fun sending encrypted messages to her friends. Today, Alice handles data and artificial intelligence systems and uses increasingly subtle, sometimes futile technological objects every day, always conscious of the major stakes of the transitions and the underlying strategic and economic realities.

For all these years, Alice has taken an interest in the developments in quantum physics and mechanics and has seen their formalism gradually seep into other sciences and fields. No doubt she knows, even if a little confusedly, that quantum technologies are at the heart of many of today’s technologies. But what does that mean, exactly? There still seems to be a touch of strangeness, something incomprehensible when we talk about quantum: we think we have understood... but then something entirely different turns up. Today, at a time when the many ongoing transitions are calling on all disciplines at the same time, Alice would like to be able to think ahead and make better decisions about her future. The young scientist would like to know more about these quantum revolutions everyone is talking about—so she sets out to meet the people who can help her fully understand it all.

She has an appointment with Alan, a teacher-researcher specializing in quantum cryptography and communication who has created a training course in quantum engineering with his colleagues. Alice thinks she may need to take such a course. She has also read in the media about the latest news in quantum computers, these machines that they say will make all our current computers obsolete. Alice wants to know what is real and what is fantasy, but where should she start? She would also like to explain to her grandmother why she is going down this road, but how can she find the right words so that her grandmother will understand?
Everything really started to come together when we asked ourselves about the nature of light. For a long time, we believed that light was made up of little grains, corpuscles. In 1801, Thomas Young, a 28-year-old Englishman who was well-versed in many artistic and scientific disciplines, conducted an optics experiment that showcased light’s wave-like character. To explain what we see in this founding experiment, sometimes called “Young’s slits” (see opposite), the corpuscular nature of light as previously proposed no longer fit. This experiment, which was reproduced later in other conditions and with other things than beams of light, made many people think.

How can we define a barrier between the classical and the quantum?

Alice has a meeting with Alan one afternoon in May. Alan starts by asking Alice what she knows. It’s always good to know where to start. “When you hear the word ‘quantum’ or when you hear people talking about it, what is the first thing you think about?” “Oh, that’s easy; often I think about that story of the cat. It makes me think that it’s been following me from Cheshire into the internet. This cat that Schrödinger made living and dead at the same time in that box, it fascinates and terrifies me. It’s like I have mixed, superposed feelings about this creature.” “It’s true, this cat is rather interesting,” Alan replies, “but it is also very big, and before we talk about it, we need to get down to a microscopic level where there are other ‘at the same times’ we need to explain.” Alan began explaining quantum discoveries so that Alice would not only understand, but also be able to explain them later herself.

(Figure: screen view of Young’s double-slit experiment. On the screen, we can see a diffraction pattern with alternating light and dark bands.)
Throughout the 19th century, many scientific experiments raised questions that led some well-accepted theories, like classical mechanics, to be considered insufficient to explain the observed phenomena. Science needed new tools to describe the world, especially at the microscopic level. The physics of matter on the macroscopic scale is fundamentally different from the physics of atoms and elementary particles on the microscopic scale. The first is called classical physics, and the second is called quantum theory. The switch from the first to the second really occurred in the early 20th century, when the first task of this emerging theory was to clarify the real nature of light: was it a wave or a particle? In 1900, German physicist Max Planck (1858-1947) introduced the term quanta to explain the experimental characteristics of black-body radiation, referring to the indivisible packets (particles) in which the energy of electromagnetic radiation is confined, each packet having an energy E proportional to the frequency ν of the radiation under consideration. He proposed a new universal constant (h, now called Planck’s constant) that connected the two scales. Planck’s quanta referred to the discontinuous character of energy, which can only take certain values. In March 1905, Albert Einstein (1879-1955) revisited this hypothesis in his article “On a Heuristic Point of View Concerning the Production and Transformation of Light” and revolutionized the world by stating that light was both a wave and a particle: depending on the case, light behaves like a wave or like a particle. This is the work that won him the Nobel Prize in Physics in 1921, once the skepticism had been overcome. Then, in 1923, American physicist Arthur Compton (1892-1962) discovered the phenomenon (called the Compton effect) whereby the wavelength of light increases after it collides with an electron, these light quanta soon being called photons, and finally concluded that light is made of particles.
In 1924, French physicist Louis de Broglie inverted this perspective by postulating that a wave can be associated with any material particle (i.e. one with a mass), a hypothesis known as wave-particle dualism. An experiment verifying this matter wave was performed in 1927 by Davisson and Germer, who diffracted electrons on a crystal of nickel. “To separate the classical and quantum worlds in the 20th century, de Broglie’s wavelength h/mv is the right quantity. With m the mass and v the speed, this wavelength, for example for an electron in a hydrogen atom, is comparable to the distance between the electron and the proton (the Bohr radius). For a moving object of our size, however, this wavelength is vastly smaller than the obstacles encountered, and classical physics prevails.”
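The orders of magnitude behind this quote can be checked in a few lines of Python. This is a quick sketch: the electron speed used is the Bohr-model value for the ground state of hydrogen, and the 70 kg walker is an arbitrary everyday example of our own choosing.

```python
h = 6.626e-34  # Planck's constant, in J*s

def de_broglie_wavelength(m, v):
    """De Broglie wavelength lambda = h / (m*v), in meters."""
    return h / (m * v)

# Electron in a hydrogen atom (Bohr model): m ~ 9.109e-31 kg, v ~ 2.19e6 m/s
lam_electron = de_broglie_wavelength(9.109e-31, 2.19e6)
print(lam_electron)  # ~3.3e-10 m, comparable to the Bohr radius (0.53e-10 m)

# A person walking: m = 70 kg, v = 1.4 m/s
lam_person = de_broglie_wavelength(70.0, 1.4)
print(lam_person)    # ~6.8e-36 m, immeasurably smaller than any obstacle
```

The two wavelengths differ by some 25 orders of magnitude, which is why wave effects are visible for the electron while classical physics prevails for the walker.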
“These quanta of light have been called photons since 1926, but every field inherited the word. You see, in the beginning, it was the questions on the nature of light that opened the door to quantum physics.” “So, in the 30s, everyone agreed that quantum mechanics was the right theory of Nature?” “Well, doubts appeared about the completeness of the quantum formalism, and it was Einstein, among others, who gave voice to them. For example, he wrote to Schrödinger in August 1935 to express his surprise that the wave function he proposed allowed a particle to be in a paradoxical superposed state, both A and non-A. This ‘both’ can lead to confusion. Let’s skip ahead in time to 1989, when a Japanese team repeated Young’s slit experiment, not with a beam of light (photons) but with electrons, which are of course particles, sent one by one. Clearly, these electrons behave like a wave, since we observe interference, but if we want to know through which slit an electron traveled, this measurement causes the interference to disappear! It is therefore reasonable to say that electrons ‘manifest in a corpuscular or undulatory way, depending on how we speak to them.’ Choosing the right words is essential in quantum physics.”
Scientists are the first to admit that they have been confronted with strange, counterintuitive phenomena, and that they needed radical ideas to make suitable proposals, build the foundations of this new science, and imagine the experiments that could confirm their intuitions. Quantum sciences rely on a set of postulates (see page ). Some of these strange phenomena have profoundly disrupted the worldview inherited from classical physics. A dose of randomness seemed to be needed; an object could no longer be localized in the space around it; an object could have several states at the same time. And then also: the act of observing influences and changes that which is observed, and events that did not occur, but which could have occurred, have an impact on the experiment conducted. Reading scientific publications and letters between physicists, as well as their thought experiments, reveals their troubles and their search for the right word. For his theorem, Heisenberg (see page ) used the words Unsicherheit (uncertainty) and Ungenauigkeit (indeterminacy), then, seeing the risk of confusion, decided on the term Unbestimmtheit (undeterminedness); unfortunately, “uncertainty” had already been used in the translations. Several interpretations of the phenomena were proposed, forming as many schools of thought and controversies. Einstein himself was a dissident voice, and some still propose their own interpretations today, so how can we be convinced that the whole thing is solid? The way of describing these phenomena, the mathematical tools, and the “language” underlying the formalism have also changed over time and continue to do so. This matters all the more because the flavor of these formalisms can itself guide scientists’ ideas in unexpected directions. At the same time, the equations all seem simple (and beautiful), which seems to contradict the strangeness they describe.
It is possible and useful to work through a few aspects to better understand the landscape that is taking shape. Quantum sciences are also traversed by limits (and conditions on them), thresholds, and inequalities, which reflect physical impossibilities. Are we faced with insurmountable limits? Is it possible that we might not be all-powerful? But in that case, how can we explain certain phenomena with immediate repercussions over very long distances, seemingly exceeding what the speed of light allows, a limit that we have accepted elsewhere? These oddities have been the delight of films that have helped give a distorted image. For example, talking about quantum teleportation is always delicate, and we must take care not to generalize to the macroscopic world too quickly. Quantum sciences... why this reputation of being difficult? This famous quote from physicist Richard Feynman (The Character of Physical Law, 1965) is often cited to express the impossibility for laypeople to understand anything about this science: “I think I can safely say that nobody understands quantum mechanics.”
The quantum system in all its states: notation and the superposition principle
In classical physics, the state of a classical system allows us to determine the result of the measurement of a physical quantity without any possible ambiguity. In quantum physics, the state of a quantum system only allows us to calculate the probabilities of different measurement results, of which only one can actually be observed after the measurement. The quantum state of a quantum system is written ∣𝛙⟩, according to a notation (called bra-ket notation) proposed in 1939 by the British physicist Paul Dirac (1902-1984), who sensed that many scientists would use it on a daily basis and would need a specific and practical way to write it. This is a vector, a well-known mathematical object: a state vector which fully accounts for the state of the quantum system and which lives in a Hilbert vector space, another, less well-known mathematical object with properties perfectly suited to the quantum world, in particular for describing situations where we are choosing between mutually exclusive alternatives. The modulus and phase of ∣𝛙⟩ uniquely characterize this vector. A common quantum statement like ∣𝛙⟩ = 𝑎∣↑⟩ + 𝑏∣↓⟩ expresses 1) that the sum of two quantum states of a system forms one possible quantum state of this system, and 2) that by choosing particular states (here ∣↑⟩ and ∣↓⟩, which in this example refer to the up and down states of an electron’s spin, a property that an electron possesses) we can describe any quantum state as a superposition of these “basis” states. With 𝑎∣↑⟩ + 𝑏∣↓⟩ above, the quantum state is defined by the quantum property of the electron’s spin, but we could choose the quantum property of light’s polarization according to a construction based on the horizontal polarization ∣𝐻⟩ and vertical polarization ∣𝑉⟩, hence ∣𝛙⟩ = 𝑎∣𝐻⟩ + 𝑏∣𝑉⟩.
Later, we will discuss qubits, written 𝑎∣0⟩ + 𝑏∣1⟩, a quantum state that does not refer to a specific physical incarnation, ∣0⟩ and ∣1⟩ being only a convention that refers to two particular states.
Opposite: a Young’s slit experiment made with electrons, one quantum passing at a time, 1989. At first, the electrons seem to arrange themselves randomly on the target screen with a corpuscular behavior, but little by little interference fringes appear, as a wave would produce. Inserting a device between the slits and the screen to observe where the electrons pass causes the fringes to disappear. Image taken from Tonomura, A., Endo, J., Matsuda, T., Kawasaki, T., & Ezawa, H. (1989). Demonstration of single-electron buildup of an interference pattern. Am. J. Phys., 57(2), 117-120.
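To make the notation concrete, here is a minimal numerical sketch in plain NumPy (not tied to any quantum library): a qubit 𝑎∣0⟩ + 𝑏∣1⟩ is just a normalized pair of complex amplitudes, and the probabilities of measuring 0 or 1 are |𝑎|² and |𝑏|².

```python
import numpy as np

# Basis states |0> and |1> as vectors
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# A superposed qubit a|0> + b|1>; here an equal superposition with a relative phase
a, b = 1 / np.sqrt(2), 1j / np.sqrt(2)
psi = a * ket0 + b * ket1

# A valid state vector must be normalized: |a|^2 + |b|^2 = 1
assert np.isclose(np.vdot(psi, psi).real, 1.0)

# Born rule: measuring in the {|0>, |1>} basis yields 0 or 1 with
# probabilities |a|^2 and |b|^2 -- here 50/50, despite the phase difference
p0 = abs(np.vdot(ket0, psi)) ** 2
p1 = abs(np.vdot(ket1, psi)) ** 2
print(p0, p1)  # 0.5 0.5
```

The same two-component vector could just as well stand for the spin states ∣↑⟩, ∣↓⟩ or the polarizations ∣𝐻⟩, ∣𝑉⟩: that is exactly what the text means by a qubit not referring to a specific physical incarnation.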
Several mathematical descriptions of quantum phenomena were proposed in the early days of quantum mechanics: the matrix mechanics of Werner Heisenberg, Max Born, and Pascual Jordan in 1925, and the differential equations of the wave mechanics of Erwin Schrödinger in 1926. The two proved to be mathematically equivalent, but some scientists were more familiar with one formalism or the other. How the chosen formalisms evolved went hand in hand with both the understanding of the phenomena observed and the intuitions leading to future discoveries. In 1932, John von Neumann published Mathematical Foundations of Quantum Mechanics, whose axioms and postulates, for which we should also credit the previous works of Paul Dirac and Hermann Weyl, established a mathematical framework built on Hilbert vector spaces. It is a remarkable fact that many quantum discoveries were first made on purely mathematical bases, to such an extent that the physicist and philosopher Alexei Grinbaum wrote in Mécanique des étreintes, a work that describes the various steps that led to these theoretical breakthroughs, that “mathematics dictates its laws to quantum systems”. From mathematics describing the behavior of waves, objects with the particularity of being combined by addition and retaining their wave properties after such a combination, the current view has moved to studying the various degrees of composition (and correlation) between entities, and how to take several entities and treat them as a single one. When two Hilbert spaces 𝓗 (each representing a quantum system) come together, the related mathematical operation is the product ⊗. The product can be weak or strong (or something in between). It translates the two systems’ capacity to separate again after having been combined, as a mix of seeds could, whereas a mix of juices would have changed natures.
A weak product contains separable states; beyond this, mathematics opens the field of so-called entangled states. Suggested reading: Mécanique des étreintes. Alexei Grinbaum. Éditions Les Belles Lettres, 2014.
Mathematics lays down the law
“I feel like the mathematics is just as important as the physics questions themselves,” says Alice. “Exactly! And we could add that, in the 21st century, as Alexei Grinbaum says, ‘the border between classical physics and quantum physics passes through the quantitative characterization of the strength of the correlations between the subsystems of a compound system’.”
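The distinction between a separable product and an entangled state can be illustrated numerically. In the sketch below (NumPy; the helper name `schmidt_rank` is our own), the tensor product ⊗ is NumPy's `kron`, and a two-qubit pure state is separable exactly when its 2×2 amplitude matrix has rank 1:

```python
import numpy as np

def schmidt_rank(state):
    """Rank of the 2x2 amplitude matrix of a two-qubit pure state.
    Rank 1 => separable (a 'weak' product); rank 2 => entangled."""
    m = state.reshape(2, 2)
    singular_values = np.linalg.svd(m, compute_uv=False)
    return int(np.sum(singular_values > 1e-12))

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
plus = (ket0 + ket1) / np.sqrt(2)

# Product of two independent systems: still separable after combination
product_state = np.kron(plus, ket0)
print(schmidt_rank(product_state))  # 1

# Bell state (|00> + |11>)/sqrt(2): cannot be written as any product
bell = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)
print(schmidt_rank(bell))  # 2
```

In the seeds-and-juices image, rank 1 is the mix of seeds that can still be sorted apart, and rank 2 is the mix of juices that cannot.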
The physicists (and (future) Nobel Prize winners in physics) who began to handle quantum fundamentals, gathered here at the Solvay Conference held in 1927, were European: (from top to bottom and left to right) Erwin Schrödinger, Wolfgang Pauli, Werner Heisenberg, Paul Dirac, Arthur Compton, Louis de Broglie, Max Born, Niels Bohr, Max Planck, Albert Einstein. Born, Heisenberg, and Pauli created the term “Quantenmechanik” in the 1920s. Starting in the 20s, physicists began creating and using a specific vocabulary that we must now introduce. To be able to progress in their understanding of counter-intuitive phenomena, scientists also offered postulates that defined the behavior of these quantum objects, which differed from that of classical objects. This allowed them to predict the existence of quantum particles or properties well before they were observed, providing a direction for further research. Thus, in December 1924, Austrian physicist Wolfgang Pauli (1900-1958) theorized the existence of an additional attribute of the electron (in addition to its mass and charge) needed to explain an experimental result that had evaded all explanation until then. He did so by adding a postulate to those already made for non-relativistic quantum mechanics. This property, first interpreted as a rotation of the electron on itself, is called spin. In fact, it is an intrinsic angular momentum, as Uhlenbeck and Goudsmit established in September 1925. This anecdote reveals the boldness of the physicists who postulated this quantum property, since nothing in the world of classical physics could have inspired it. Furthermore, interpretation is an exercise that continually reappears in the quantum sciences. In quantum mechanics, physical quantities like mass and position cannot be determined deterministically as the result of a function. They can only be known in a probabilistic way (i.e. as probabilities over the allowed values) through what is called an observable.
This concept is what comes closest to that of measurement as we know it in the classical world.
Systems, states, and observables
Hungarian physicist János Lajos Neumann (1903-1957) proposed a series of postulates that provided a framework for the quantum sciences and gave them a frame of reference. This mathematics is constantly present in current scientific publications.
1. The definition of the quantum state. The knowledge of the state of a quantum system is completely contained, at time t, in a normalizable vector of a Hilbert space 𝓗. This vector is written ∣𝛙(t)⟩.
2. The principle of correspondence. To any observable property, for example position, energy, spin, etc., corresponds a linear Hermitian operator acting on the vectors of a Hilbert space 𝓗. This operator is called an observable.
3. Measurement: the possible values of an observable. The measurement of a physical quantity represented by the observable A can only yield one of the eigenvalues aₙ of A. Any vector ∣𝛙(t)⟩ can be broken down uniquely on the corresponding eigenvectors: ∣𝛙⟩ = c₁∣𝛗₁⟩ + c₂∣𝛗₂⟩ + … + cₙ∣𝛗ₙ⟩ + …
4. Born’s rule: a probabilistic interpretation of the wave function. The measurement of a physical quantity represented by the observable A, performed on the normalized quantum state ∣𝛙(t)⟩, gives the result aₙ with a probability equal to |cₙ|².
5. The wave-packet reduction postulate. If the measurement of a physical quantity gives the result aₙ, then the state of the system immediately after the measurement is its projection onto the eigenspace associated with aₙ.
6. The postulate of unitary evolution. It allows us to calculate the wave function over time. The state ∣Φ,t⟩ of any quantum system is a solution of the time-dependent Schrödinger equation: iℏ ∂/∂t ∣Φ,t⟩ = Ĥ ∣Φ,t⟩
Without going into the details of the math, here is an example of the questions these postulates were to raise.
Since the evolution of the wave function is linear and unitary (conservation of the norm and the scalar product, according to postulate 6), how can quantum superposition disappear (as postulate 5 says), when linearity and unitarity should lead to the conservation of superposed states? The concept of decoherence (see next page) will provide a solution. Naturalized as an American citizen in 1937, the scientist now known as John von Neumann also made a major contribution to the development of computer science (and many other disciplines). Computer hardware architecture (control unit, memory, inputs/outputs, etc.) is sometimes called von Neumann architecture. Note that quantum computers do not follow these design principles.
A science based on postulates
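Postulates 2 to 5 can be turned into a toy simulation. The sketch below (Python/NumPy; the function name `measure` is our own, and the example observable is the spin-z Pauli operator) diagonalizes a Hermitian observable, draws an outcome according to Born's rule, and projects the state, so that an immediate second measurement reproduces the first result, exactly as postulate 5 demands:

```python
import numpy as np

rng = np.random.default_rng(0)

def measure(psi, A):
    """Projective measurement of observable A on normalized state psi.
    Returns the observed eigenvalue and the post-measurement (collapsed) state."""
    eigvals, eigvecs = np.linalg.eigh(A)   # A is Hermitian (postulate 2)
    coeffs = eigvecs.conj().T @ psi        # c_n = <phi_n|psi> (postulate 3)
    probs = np.abs(coeffs) ** 2            # Born's rule (postulate 4)
    n = rng.choice(len(eigvals), p=probs / probs.sum())
    collapsed = eigvecs[:, n]              # wave-packet reduction (postulate 5)
    return eigvals[n], collapsed

# Observable: spin along z (Pauli Z); state: equal superposition of up and down
Z = np.array([[1, 0], [0, -1]], dtype=complex)
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)

value, psi_after = measure(psi, Z)
# A second measurement of the same observable now gives the same value:
value2, _ = measure(psi_after, Z)
print(value, value2)
```

The first outcome is random (+1 or −1, each with probability 1/2), but the second is not: after the collapse, the state is an eigenvector, and re-measuring is deterministic. Note this sketch handles only non-degenerate eigenvalues; a full treatment projects onto the whole eigenspace.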
In the thirty years or so after Planck proposed the term quanta in 1900, the theoretical foundations of quantum mechanics were laid. We can consider that “quantum mechanics is complete”. But isn’t that surprising, maybe even strange? How can we give it meaning? How can we interpret it? Especially since the underlying mathematical formalism (Hilbert vector spaces with infinite dimensions) is far removed from the physical space where the described phenomena occur. Several interpretations were proposed. In 1927, Werner Heisenberg (1901-1976) published what would become one of the fundamental concepts in quantum physics, known as the indeterminacy principle (or the uncertainty principle), which established mathematically that it is impossible to determine with total precision both a particle’s position and its momentum (and, more generally, the values of two incompatible observables of a quantum system). When measuring these two quantities many times, we always obtain different values, and the product of the standard deviations, σx for the position and σp for the momentum, cannot be smaller than a certain threshold (see opposite). Another view of the uncertainty principle is that, in the atomic world, the measuring device influences the object being measured. But what happens? What exists between two measurements? The various interpretations of quantum mechanics differ on the meaning to be given to non-observed, non-measured states. This becomes a philosophical problem on the meaning of reality, which is still under discussion today. “Wave-particle dualism, quanta, the superposition principle, quantum states, postulates, observables,” continues Alan, “there are still two terms to be introduced: decoherence and non-locality.”
Uncertainties and interpretations
Heisenberg’s indeterminacy principle (also known as the uncertainty principle): σx · σp ≥ h/4π
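The inequality can be checked numerically. In units where ℏ = 1 it reads σx · σp ≥ 1/2, and a Gaussian wave packet saturates it. The sketch below (NumPy; the helper name `uncertainty_product` is our own) computes σx on a spatial grid and σp from the FFT of the wave function:

```python
import numpy as np

hbar = 1.0  # natural units

def uncertainty_product(psi, x):
    """Return sigma_x * sigma_p for a wave function sampled on the grid x."""
    dx = x[1] - x[0]
    prob_x = np.abs(psi) ** 2
    prob_x /= prob_x.sum() * dx
    mean_x = np.sum(x * prob_x) * dx
    sigma_x = np.sqrt(np.sum((x - mean_x) ** 2 * prob_x) * dx)

    # Momentum-space distribution via FFT (p = 2*pi*f with hbar = 1)
    psi_p = np.fft.fftshift(np.fft.fft(psi))
    p = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(len(x), d=dx)) * hbar
    prob_p = np.abs(psi_p) ** 2
    prob_p /= prob_p.sum()
    mean_p = np.sum(p * prob_p)
    sigma_p = np.sqrt(np.sum((p - mean_p) ** 2 * prob_p))
    return sigma_x * sigma_p

x = np.linspace(-20, 20, 4096)
sigma = 1.0
psi = np.exp(-x**2 / (4 * sigma**2))  # Gaussian wave packet (unnormalized)
prod = uncertainty_product(psi, x)
print(prod)  # ~0.5, i.e. hbar/2: the Gaussian saturates the bound
```

Squeezing the packet in position (smaller sigma) widens its momentum spread, and vice versa; the product never drops below ℏ/2.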
One major interpretation is the so-called Copenhagen interpretation, named after the place where the holders of this view usually met (Bohr, Born, Heisenberg, Jordan). According to this interpretation, physical systems do not possess properties before we have measured them, and quantum mechanics can only predict probabilities among a set of possibilities. Measurement immediately reduces all the possibilities to one of the possible values, which is called “wave function collapse”. Many objections were made to this interpretation. These objections did not call into question the predictions of quantum mechanics or its mathematical formalism, but critics felt that the failure to explain certain macroscopic phenomena meant that the emerging theory was incomplete. The “Schrödinger’s Cat” thought experiment was thus conceived as a challenge to the Copenhagen interpretation. Other interpretations still appear today, such as the CSM (Contexts, Systems and Modalities) approach proposed by French physicists Alexia Auffèves and Philippe Grangier in 2015, which attempts to propose a framework reconciling previous approaches. The central issue is still the definition of the exact boundaries of a quantum system: to what extent does a quantum system contain the observer? A quantum system cannot be considered isolated, because it interacts with its environment. Each of these numerous interactions shifts the wave functions of its subsystems relative to each other, rapidly driving the probability of observing a superposed state to zero. This explains why we never see a superposed state at a macroscopic level: such states are untenable; the different possibilities quickly become incoherent. This decoherence phenomenon exists in the natural state; it is a spontaneous reduction of the wave packet, the duration of which depends on the environment.
Hence many precautions must be taken to isolate the quantum systems that we wish to manipulate as qubits in quantum computing, for example, especially since decoherence seems to be a universal and therefore inevitable phenomenon. Another problem was raised by scientists in the 1930s: the locality principle, whereby an object can only be influenced by its immediate environment. In 1935, Albert Einstein, Boris Podolsky, and Nathan Rosen published a paper that is famous today (known as the EPR paradox) but which was not explored much at the time, because quantum mechanics was keeping physicists busy enough elsewhere. In it, they described states, allowed by the quantum formalism, of two quantum objects whose observables are highly correlated even when the objects are far apart. Since the limited speed of light makes it impossible for these two subsystems to communicate instantaneously, observing one of them, and thus knowing the precise value for the other, would imply that there are variables not considered in quantum theory. In other words, the question raised in 1935 was: can the quantum description of physical reality be considered complete?
Suggested reading or viewing:
« Les bases de la physique quantique. » June 2018. 76’43. Masterclass by Roland Lehoucq, astrophysicist at CEA. http://www.cea.fr/go/masterclass-physique-quantique
« Einstein’s revolutionary paper. » April 2005. Physics World. https://physicsworld.com/a/einsteins-revolutionary-paper/
« Le laser, ou l’impensable ingénierie quantique. » Eric Picholle. Noesis n°5. 2003. https://journals.openedition.org/noesis/1507
« Elements of Quantum Computing: History, Theories and Engineering Applications. » Seiki Akama. Springer 2014. eBook, 133 pages. http://mmrc.amss.cas.cn/tlb/201702/W020170224608149203392.pdf
« Révolutions quantiques », Clefs CEA n°66. June 2018. http://www.cea.fr/multimedia/Pages/editions/clefs-cea/revolutions-quantiques.aspx
The controversy between Einstein and Bohr did not fascinate the scientific community for long. Physicists were indeed kept very busy with quantum field theory, i.e. the study of particles and their interactions. The applications resulting from this work, and from what is now called the first quantum revolution, are limitless. They are the basic components of the digital age. As with artificial intelligence systems, we must be aware that we are surrounded by objects and services that were made possible by the early results of quantum science and which continue to shape digital reality right now. The first quantum revolution in everyday life: CMOS transistors, lasers, hard drives, LEDs, atomic clocks, GPS.
QUANTUM REALITIES
For years, Einstein, Podolsky and Rosen’s 1935 publication did not attract the attention of the scientific community, which was very busy with quantum field theory, a source of many discoveries. In 1964, nine years after Einstein’s death, a physicist working at CERN in Geneva revived interest in it and opened unexpected prospects. In a short article, Irish physicist John Bell set out principles in the form of inequalities that would later provide irrefutable proof of the notion of quantum entanglement. This is a remarkable property that causes a pair of quantum objects to behave as a single quantum system, even when the two subsystems (each object in the pair) are at a distance that makes “communication” impossible. John Bell’s contribution is a mathematical expression, a measurable formulation of Einstein’s doubt about the Copenhagen interpretation. Let’s emphasize that. Bell’s work is remarkable because the dogmas of quantum mechanics were to be irreversibly turned upside down. It is remarkable also because it paved the way for experimental verification of whether strong correlations between quantum subsystems exist in nature. By demonstrating Einstein’s error of judgment, physicists obtained new and surprising results on the mathematical structure of quantum mechanics. But several years passed between Bell and the experimental proof. In the 1970s, Bell’s inequality was reformulated, its mathematical expression taking on more manageable forms while retaining Bell’s original spirit. One of these formulations, proposed by physicists Clauser, Horne, Shimony, and Holt, and known as the CHSH form after their initials, is particularly suited to digital communications and deserves our attention. Alice is going to help us with this. “Alice, the second quantum revolution will come once we start to actively manipulate single particles and to have multiple particles interact with each other. Do you still have that friend you write to regularly?
We will do a few experiments with him to understand all this.” “Oh yes,” says Alice, “you’re talking about Bob! It’s true we’ve sent each other a lot of messages over the years. We are very close, yes... I’ll call him right now.”
Alice and her friend Bob are in the habit of feeding bits, noted respectively x and y, into a box that represents the level of correlation existing between these two colleagues (who are also quantum subsystems in private). To translate this correlation, the box outputs one bit a to Alice and one bit b to Bob, the combinations (x, y, a, b) thus having 4 × 4 possible values. P(a=b|xy) is the probability that a and b are identical, and P(a≠b|xy) is the probability that they are not. The sum of these two probabilities is equal to 1, but the difference, called the correlator between the input pairs, Exy = P(a=b|xy) − P(a≠b|xy), varies between −1 and 1. There are four possible correlators, and Bell’s proposition in CHSH form is to look at CHSH = |E00 + E10 + E01 − E11|, a quantity that can therefore be equal to 4 at most. John Bell demonstrated that, in the world of classical physics, the inequality CHSH ≤ 2 is verified. In quantum systems, the upper bound of Bell’s inequality is greater than 2. It is called Tsirelson’s bound and is equal to 2√2, approximately 2.8284. By building an experimental device that implements this box with the two characters Alice and Bob, and counting the output correlations over a very large number of runs, if the value 2 were exceeded, it would mean that Einstein’s point of view is wrong. Experimentation was therefore needed to determine whether correlations of the kind predicted by quantum formalism exist in nature and “violate” Bell’s inequalities. In 1982, the French physicist Alain Aspect and his team at the Institut d’Optique d’Orsay empirically measured the first CHSH value, with the two subsystems Alice and Bob a few meters apart. The experiments were performed using pairs of photons on which a polarization measurement was made (see figure below). The results were indisputable. The engineering feat was impressive, and so was the result.
CHSH proved to be greater than 2, demonstrating that the correlation of two entangled quantum systems is a reality. Proof of quantum entanglement paved the way for a second revolution.
Opposite: an arrangement of one of the 1982 experiments by Alain Aspect and his team (here with variable polarizers; C1 and C2 are switches, S is the photon-pair source, PM are photomultipliers), which highlighted entanglement and non-locality. N(a,b), N(a,b’), N(a’,b), N(a’,b’) are the coincidence rates whose measurements are used to calculate the polarization correlation coefficients E.
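The quantum prediction tested by these experiments can be reproduced in a few lines. In the sketch below (NumPy; the analyzer angles are the standard optimal settings, not values taken from this report), the correlators E are computed for a polarization-entangled photon pair ∣Φ+⟩, and the CHSH combination comes out at Tsirelson's bound 2√2 ≈ 2.8284 rather than the classical limit of 2:

```python
import numpy as np

# Pauli matrices
X = np.array([[0., 1.], [1., 0.]])
Z = np.array([[1., 0.], [0., -1.]])

def analyzer(theta):
    """±1-valued polarization observable for an analyzer at angle theta (radians)."""
    return np.cos(2 * theta) * Z + np.sin(2 * theta) * X

# Entangled photon pair |Phi+> = (|HH> + |VV>)/sqrt(2)
phi_plus = np.array([1., 0., 0., 1.]) / np.sqrt(2)

def E(alpha, beta):
    """Correlator <A(alpha) (x) B(beta)> in the entangled state;
    equals cos(2*(alpha - beta)) for this state."""
    return phi_plus @ np.kron(analyzer(alpha), analyzer(beta)) @ phi_plus

# Analyzer settings that maximize the CHSH combination |E00 + E10 + E01 - E11|
a0, a1 = 0.0, np.pi / 4          # Alice's settings for x = 0, 1
b0, b1 = np.pi / 8, -np.pi / 8   # Bob's settings for y = 0, 1
chsh = abs(E(a0, b0) + E(a1, b0) + E(a0, b1) - E(a1, b1))
print(chsh)  # 2.828..., i.e. 2*sqrt(2): above the classical limit of 2
```

Each correlator taken alone stays between −1 and 1; it is only the combination of the four, with entangled inputs and well-chosen angles, that exceeds 2, which is exactly what Aspect's coincidence counts showed.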
The impact of the second quantum revolution on the future of communication networks. At Nokia, Bell Labs is conducting research into breakthroughs beyond the existing technologies and solutions for the future of communications networks. In the past, Bell Labs made a significant contribution to the invention and development of the technologies of the first quantum revolution (transistors, lasers, optical amplifiers, etc.). As the emerging technologies of the second quantum revolution have great potential to represent a break from existing technologies, certain researchers at Bell Labs are participating in their development. Among the breakthroughs, the one promised by quantum computing is probably the strongest. In 1994, Peter Shor, then a researcher at Bell Labs, invented an algorithm which, run on a quantum computer, could easily break RSA encryption, something that is very difficult with classical computers. If we could build quantum computers with enough qubits, it would call into question the security of the many communication systems that currently use RSA encryption. However, quantum decoherence currently limits the number of correlated qubits in quantum computers to just a few dozen. “Topological qubits” could overcome these limits, and they are currently being studied at Bell Labs. Another important breakthrough is the realization, using quantum cryptography systems, of communications that are intrinsically secure due to the nature of quantum physics itself: as long as quantum theory is not called into question, there is no physical system that can break communications correctly secured by quantum cryptography. Developing such systems is all the more important given the threat quantum computers pose to systems based on RSA encryption. Bell Labs contributes to designing such systems for future communications networks, particularly in conjunction with partners on projects such as CiViQ, part of the EU’s Horizon 2020 program.
The company viewpoint: Nokia
Today, quantum optics is still used to measure CHSH, and increasingly accurate experiments with increasing distances between Alice and Bob are being carried out. Tsirelson’s bound, which appears in all formulations of Bell’s inequality, has been approached, but not yet reached. If it is reached, we will have yet another confirmation that quantum mechanics is the right description of Nature. For a more comprehensive and very approachable presentation of these fundamental experiments, read, in French: « Présentation naïve des inégalités de Bell ». Alain Aspect. [PDF, 35 pages] ; « Des objections d’Einstein aux bits quantiques : les stupéfiantes propriétés de l’intrication ». Alain Aspect & Philippe Grangier. [presentation, 34 slides]. The illustrated story with Alice and her friend is expanded on at greater length in Mécanique des étreintes, op. cit.
Quantum Flagship: Europe invests in the second quantum revolution
In May 2016, the European scientific community issued a structuring and ambitious 150-page roadmap considered to be representative of research into quantum technologies well beyond Europe’s borders. Based on this, Brussels allocated €1 billion over 10 years to a FET (Future and Emerging Technologies) Flagship starting in 2018. This program is important enough for countries outside the EU, such as Turkey, Switzerland, and Israel, to have joined it. Similarly, while the FET Flagship concept is being overhauled and other programs were stopped in early 2019, the Quantum Flagship did not suffer the same fate. To these figures, and over the same period, we should add programs worth hundreds of millions of euros in countries such as the United Kingdom (€650 million), Germany (€500 million), and the Netherlands (€150 million), some of them partnerships with North American players. These figures should be compared to the Chinese strategy, estimated at €2 billion, as well as the recent (December 2018) National Quantum Initiative Act in the US, which includes €1 billion over the first five years. As you can see, Europe, the United States, Japan, Singapore, and China are some of the state actors very much established in the race. China has made significant investments in quantum engineering, particularly in satellite quantum communication. Europe did not make these investments, but a real movement now exists.
Opposite: the final report of the Quantum Flagship High-Level Steering Committee structured research into four fundamental pillars (Communication, Computation, Simulation, Sensing/Metrology) and cross-cutting actions (Basic science, Engineering/Control, Software/Theory, Education/Training), illustrated in a figure that has circulated widely in the literature.
The Communications, Computations, Simulations and Sensing and metrology pillars are introduced in this report on , ,  and  respectively. The “Education and Training” cross-cutting domain is the subject of -. Stay up to date at: https://qt.eu/ Suggested reading: The quantum technologies roadmap: a European community view. Antonio Acín et al 2018 New J. Phys. 20 080201
On April 9, 2019, representatives of the European Commission’s Directorate-General for Communication Networks, Content and Technology (DG Connect) and the European Space Agency (ESA) signed a technical agreement to collaborate on the design of a quantum communications infrastructure, the next generation of ultra-secure communications in Europe. Based on Quantum Key Distribution (QKD, see next page) technology, this infrastructure will include components on Earth and in space and aims to strengthen Europe’s cybersecurity and communications capabilities. The space component, called SAGA (Security And cryptoGrAphic mission), will include quantum satellite communication systems. Combining and interconnecting the activities of the two parts is necessary to provide coverage for the whole of the European Union. This infrastructure should also stimulate the market for EU industrial players in quantum technologies and further consolidate and expand Europe’s scientific excellence in quantum research.

With SAGA, the next chapter is being written in space

A roadmap laid out over twenty years

The European Quantum Manifesto sets out Europe’s ambitions in quantum research by setting milestones at 5 and 10 years and beyond, taking care to note the interplay between the four key pillars of research. Thus, the appearance of new quantum algorithms for simulators and communications will coincide with the existence of small quantum hardware to run them (2020). Quantum repeaters in the proposed quantum internet should be capable of detecting attempts to listen in on encrypted communications, while everyday communications equipment will have quantum components on-board (2025-2030).
Finally, universal quantum computers that exceed the capacity of computers with traditional architecture should not appear before a pan-European communication network that combines classical and quantum technologies and whose security components will be solutions that can resist attacks from quantum computers (see also page ).

[Figure: roadmap timeline, 2015 to 2035: atomic clocks, quantum sensors, intercity quantum links, quantum simulators, quantum internet, universal quantum computers.]
The principles of superposition and entanglement that we saw earlier are the basis of a major field of research into applications of quantum technologies: quantum communications. In this field, quantum systems and states are created and used with a view to setting up ultra-secure communication protocols. We often talk about “quantum cryptography”, and with good reason, because it is the search for improvements in certain parts of the security protocols that has been driving this line of research. But this now also includes the design of telecommunications networks, some of which may call on quantum technologies. The quantum objects used are mainly photons, through fibers or in open space. The security aspect is essential for two reasons. The first is that having secure communications is a major objective today for governments, companies, and citizens. As we will see, some quantum properties provide the means to create truly inviolable encryption processes. But these same breakthroughs in quantum technology are paving the way for a new generation of computers which are otherwise far more powerful—and here “otherwise” should be understood both as “differently” and “much more”—than conventional computers and, therefore, likely to break the security of the communications currently provided by traditional computers. Quantum Key Distribution (QKD) is part of quantum cryptography, as is the search for high-quality random numbers, i.e. numbers whose randomness is truly unpredictable, a property that is also desirable in generating encryption keys. Traditional fiber-optic QKD systems are already operational today over commercial distances of around 100 km, with a 307-km connection demonstrated in Geneva in 2015 and up to 2,000 km in 2018 via a Chinese satellite.

Quantum communications: using single or entangled photons to transmit data in a manner proven to be secure.
Quantum teleportation is a communication protocol where what is “teleported” is an arbitrary and unknown qubit using a pair of entangled particles that are placed remotely from each other. In 2016, China established the first line of communication on this basis through the QUESS (Quantum Experiments at Space Scale) program’s Micius satellite. QKT (Quantum Key Transceiver), EDT (Entanglement Distribution Transmitter), EPS (Entangled Photon Source) and ECP (Experimental Control Processor). Excerpt from: «QUESS Operations at Chinese Space Science Mission Center», SU et al., Proceedings of the 15th International Conference on Space Operations, 2018.
In 1994, a quantum algorithm (based on the quantum mathematical formalism) for efficient factorization in polynomial time was introduced. Shor’s algorithm therefore presents a risk to RSA-type systems, even though there is no quantum computer available today with the capabilities to run it. The technological leap needed to achieve this is still huge (it would take thousands of qubits), but we need to prepare now. As is often the case in quantum physics, and this is a fact that should inspire current generations, a discovery was born out of a chance meeting between two different fields. The physicist Charles Bennett met the cryptologist Gilles Brassard at a conference, and they published (at a minor conference, because nobody understood it at first) the first mechanism for exchanging encryption keys by quantum methods: BB84. This clever algorithm came long before Shor’s (which endangered the RSA encryption protocol), but it was impossible to test it with the technologies of the time. Several variants have been proposed since the 1990s.

RSA encryption (after the initials of its three inventors Rivest, Shamir and Adleman) is an asymmetric cryptographic algorithm described in 1977. It is used every day online for e-commerce transactions and, more generally, for exchanging confidential data (see diagram opposite). This encryption is based on mathematical (factoring) problems that are difficult to solve.

[Diagrams: RSA (Alice encrypts clear text with Bob’s public key; Bob decrypts the encrypted text with his private key) and BB84 (Alice sends photons with H–V or diagonal polarization over a quantum channel; Bob applies filters).]

The multi-stage BB84 protocol allows Alice and Bob to establish a shared encryption key. Photons of a chosen polarization are transmitted on the communication channel. In the first stage, Alice chooses a random sequence of bits, transmits them as photons with a random polarization, and Bob filters them according to a random reception pattern.
In the next step, Bob sends Alice the list of measurement bases he used, and she tells him which ones he chose correctly. This gives a sequence of bits from which a shared key can be deduced in a third stage, unless there is a disagreement on the values of these bits, which is proof that a hacker may have tried to eavesdrop on the channel (which will have disturbed the state of the emitted photons).
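The sifting logic described above can be sketched with a short classical simulation (a toy model only: an ideal noiseless channel, no eavesdropper, and function and variable names of our own choosing):

```python
import random

# Toy classical simulation of BB84 sifting (illustration only, not secure).
# Basis 0 = rectilinear (H/V), basis 1 = diagonal.

def bb84_sift(n, seed=0):
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n)]   # stage 1: random bits
    alice_bases = [rng.randint(0, 1) for _ in range(n)]   # random polarizations
    bob_bases   = [rng.randint(0, 1) for _ in range(n)]   # random filters
    # When the bases agree Bob reads the bit correctly; otherwise his result is random.
    bob_bits = [b if ab == bb else rng.randint(0, 1)
                for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    # Stage 2: bases are compared publicly; only matching positions are kept.
    keep = [i for i in range(n) if alice_bases[i] == bob_bases[i]]
    return [alice_bits[i] for i in keep], [bob_bits[i] for i in keep]

key_alice, key_bob = bb84_sift(32)
```

In this noiseless model the two sifted keys always agree; in the third stage, Alice and Bob publicly compare a sample of these bits, and any mismatch signals a possible eavesdropper, whose measurements would have disturbed the photons.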
The impossibility of cloning quantum information is an unrivalled advantage for secure communications. However, it is a major drawback in fiber-optic networks, which need the signal to be repeated and reamplified at regular intervals to overcome losses in transmission. The repeaters interact with the quantum signal (causing decoherence) and prevent the quantum keys from being transmitted. They must be completely redesigned. Moreover, since they are the weakest link in quantum communications, they must be physically incorruptible, and embedding them on satellites and high-altitude platforms (e.g. drones) is one solution being considered for long-distance connections. This is a trusted node architecture consisting of several QKD systems chained together, providing a core or access network. For long-distance quantum networks, fully quantum solutions designed around repeaters using multi-mode quantum memories distribute the entanglement along the entire chain. Once the different sections are ready, they are connected to each other by so-called “Bell state” measurements, until communication is entangled from end to end, offering the desired possibility of teleporting the qubits directly to their destination, thus avoiding transmission losses. There is currently significant activity in the development of quantum memories using a wide variety of physical platforms that are both efficient (information is not lost) and offer scalable solutions.

CiViQ  Integration of CV-QKD in telecommunication networks to ensure long-term, reliable data confidentiality. Development of flexible and cost-effective CV-QKD systems. Study of new QKD protocols and theoretical approaches to open the way to future global quantum networks.  21 partners  €9,974,006.25

Projects funded as part of the Quantum Technologies Flagship in the Communications pillar for the period 2018-2021.
The race between countries exists, but also between major companies, regularly bringing spectacular breakthrough announcements. In this context, the term quantum supremacy first appeared back in 2011. It designates the moment—the number of qubits—beyond which no conventional computer could compete with a quantum computer. This is thought by some to be situated beyond 50 qubits—qubits with a low error rate that are as stable as possible. Recent announcements have suggested that this threshold may have been reached, although often only for a very short period of time, and many commentators remain skeptical. This is especially true since this ultimate threshold will have to be reached for algorithms that are genuinely useful. The term quantum advantage might be preferable, for three reasons. Firstly, it serves as a reminder that Europe has a definite theoretical advantage and must retain it. Secondly, there are concrete use cases where deploying quantum solutions provides useful advanced functionalities: for example, the quantum credit card (see page ), quantum computation delegated to the cloud, or the design of complex secure communication schemes. The third good reason for using the term advantage lies in the context of quantum cryptography and post-quantum cryptography. These are two different things. The former uses quantum properties to implement its (key distribution) protocols; the classical algorithms deployed today (e.g. RSA), on the other hand, could be brought into question by quantum computers, and the content of messages (which may be recorded today by intelligence agencies in the hope of being deciphered in the future) could be compromised. Post-quantum cryptography thus consists in devising algorithmic encryption solutions that are completely different from what quantum (and classical) computers are good at, and doing so right now to keep our advantage (by deploying them as soon as possible), while remaining efficient in encryption time.
This is a very exciting area of mathematical research, with a wide variety of solutions, some of them hybrid, which also need to be proven secure. The quest for the quantum advantage may be more fruitful than the search for quantum supremacy.

Supremacy or advantage?

Quantum Internet Alliance  Construction of a quantum internet that would enable communication applications between any two points on Earth.  34 partners  €10,406,113.50
QRANGE  Improvement of random number generation technologies.  8 partners  €3,187,282.50
UNIQORN  Development of low-cost, robust, reliable and small products for the mass market.  17 partners  €9,979,905
Because trust and security are a bank’s main assets, banks are studying the impact of the first applications of quantum computing that could emerge in the next few years. For BNP Paribas, quantum computing will break the public key algorithm systems that are used for key exchange, key distribution, authentication, and electronic signature. This is why banks and insurance companies must be prepared to be proactive in protecting and guaranteeing the authenticity of information with a long lifecycle created today. This involves mapping current uses of cryptography and anticipating the transition to post-quantum algorithms. And they must prepare to capitalize on the contributions of quantum infrastructure to strengthen encryption key security, which is a major strategic and economic challenge. Quantum computing is particularly effective in stochastic computing, the basis for financial models. Four types of applications will emerge and become a competitive advantage:
• filtering transactions on sanctions and embargo lists;
• audits to fight fraud, money laundering, and terrorism funding;
• asset-liability management by implementing more relevant models to optimize financing;
• the pricing of structured products and derivatives by valuing the cost of risk as accurately as possible, by increasing the number of paths in Monte Carlo simulations or by evaluating more elaborate scenarios.
For BNP Paribas, the emergence of quantum computing poses three immediate challenges for Human Resource management in the banking sector:
• anticipate the impact on changes to the professions and skills of IT employees;
• reinforce expertise in quantum and post-quantum encryption;
• attract new expert profiles by co-building appropriate training courses and model skills with schools, universities, and research laboratories.
Quantum computing, as a complement to current computing, can help banks serve customers better while respecting the regulatory framework.
A global understanding of technology and its potential use cases is needed to better grasp new cybersecurity risks and tailor regulations to these new quantum technologies. Quantum’s impact on banks The company viewpoint BNP Paribas
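The Monte Carlo pricing use case mentioned above can be illustrated with a purely classical baseline (a minimal sketch assuming Black-Scholes dynamics; all parameter values are made up). Quantum amplitude estimation promises, in principle, a quadratic reduction in the number of samples needed for a given accuracy:

```python
import math
import random

# Classical Monte Carlo pricing of a European call option under
# Black-Scholes dynamics (illustrative baseline, made-up parameters).

def mc_call_price(s0, strike, rate, vol, maturity, n_paths, seed=0):
    rng = random.Random(seed)
    payoff_sum = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)  # one standard normal draw per path
        st = s0 * math.exp((rate - 0.5 * vol ** 2) * maturity
                           + vol * math.sqrt(maturity) * z)
        payoff_sum += max(st - strike, 0.0)
    # Discounted average payoff over all simulated paths
    return math.exp(-rate * maturity) * payoff_sum / n_paths

price = mc_call_price(100.0, 100.0, 0.01, 0.2, 1.0, 50_000)
```

The statistical error of this classical estimator shrinks as 1/√M in the number of paths M; the promised quantum speed-up is a 1/M scaling, which is why increasing the number of Monte Carlo paths is one of the banking use cases cited above.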
Towards a quantum credit card?

What if the anti-cloning theorem of quantum systems could inspire totally secure financial transactions? This is one of the areas of research for quantum cryptography specialist Eleni Diamanti, a CNRS researcher at Sorbonne University in Paris, who previously spent a few years at Télécom Paris. In 2018 she co-published in the journal Nature “Experimental investigation of practical unforgeable quantum money”, a description of experiments carried out around the practical applications of the theoretical work of Stephen Wiesner in 1983 and the anti-cloning theorem. This quantum property prohibits a malicious third party from obtaining information about a quantum system without disturbing it, and therefore without leaving a trace. Avoiding transactions being disrupted or duplicated is one of the reasons for the existence of cryptocurrencies, and quantum approaches could achieve this without having to go through their currently energy-intensive mining algorithms. Another visible experiment in her laboratory is an optical bench implementing CV-QKD key sharing (see page ), a protocol that does not require the production of single photons and allows the use of coherent detectors and other common technologies. “This research is intended for use outside the lab,” says Eleni Diamanti. “This is very important for imagining the future of the quantum Internet, where we will need much more compact devices at reasonable costs.”

Profile of a quantum player Eleni Diamanti
Problems and use cases that benefit from quantum technologies

Each of the Quantum Flagship’s four pillars is aimed at producing applications and uses in the short or medium term. Sensors and metrology (page ) and quantum communications, discussed above, have already led to commercial applications. For quantum communications, the quantum hardware needs to prepare and measure only one qubit at a time, whereas quantum computation and simulation depend on hardware with several hundred or even several thousand physical qubits, as we will see later. However, this does not prevent us from identifying right now the use cases and problems that will be particularly suited to quantum technology. The first applications resulting from the entanglement properties concern use cases in transaction security, leading to a new design phase for communication networks. The quantum property of superposition of states opens the way to other categories of problems. Like the key sharing protocols that were redesigned at the dawn of quantum technology and that have since left the lab, a series of new algorithms has been imagined since the 1980s, promising to resolve in a reasonable amount of time several types of problems that are either difficult or impossible to solve with classical computing architectures. These are problems for which the resources needed do not grow linearly with the input data and which can benefit from parallel processing of this data.

Smart management of transportation data, modeling fuselages, autonomous flight systems, management of satellite constellations, etc.: a manufacturer like Airbus sees the capacities and forms of calculation that quantum technology offers as a natural extension of its High-Performance Computing activities.
Direct physical problems

Hard-to-grasp physical problems are natural candidates, especially those involving nuclear physics. The aim is to take advantage of a direct homology between these problems and the quantum hardware equipment that would be developed to solve them. Applications in materials physics, fluid mechanics (airflow on a fuselage, meteorology, etc.), molecular mechanics, chemistry, pharmacology, and synthetic biology are candidates, and some have already shown results. To understand the behavior of these large sets of particles, we must solve the Schrödinger equation for the system’s wave function. However, the classical approaches for doing so involve approximations, which become greater and more critical as the molecules studied and their environment become more complex. Using a quantum simulator (see page  and on) offers the best approach to modeling the problem.

Sampling problems

This is a class of problems consisting in finding representatives among a large population of data to recreate the population distribution. Simulators based on quantum annealing principles are well suited to this category.

Optimization problems

This is without a doubt one of the main fields of interest for quantum computing right now. Optimization problems are challenges whose goal is to find the best decision among a great number of possible decisions. They are often very concrete problems that we find in nearly all fields of business. For example: find the least expensive way to ship goods, identify the most efficient way to extract resources from a mine, find the most productive way to allocate resources in a production chain, find innovative ways to discover medicines, or identify the best way to manage risk in financial portfolios. While the processing time needed for a classical computer to provide acceptable

Quantum machines can handle a few types of problems that are rather common in business. Let’s take three:
• Optimizing journeys by minimum cost (duration, length, etc.).
Use cases: scheduling, supply chain, distribution and energy, transportation and traffic optimization. Industries concerned: energy & public services, transportation & telecoms.
• The “backpack” (knapsack) problem of filling a volume with a maximum number of objects without exceeding a certain weight. Use cases: filling containers, allocating telecoms capacity, sizing in transportation and energy. Industries concerned: all, especially transportation & telecoms.
• The statistical evaluation of scenarios, average values, and trends. Use cases: risk analysis, portfolio management, actuarial science, oil exploration, weather forecasting, maintenance, fraud. Industries concerned: all, especially banking and insurance.
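The “backpack” problem in the list above is the classic knapsack problem. A minimal classical dynamic-programming solution (with made-up illustrative data) shows the kind of optimization being targeted:

```python
# Classical exact solution of the 0/1 knapsack ("backpack") problem by
# dynamic programming. Values and weights below are made-up illustrative data.

def knapsack(values, weights, capacity):
    # best[w] = best total value achievable with total weight <= w
    best = [0] * (capacity + 1)
    for value, weight in zip(values, weights):
        # Iterate weights downwards so each item is used at most once
        for w in range(capacity, weight - 1, -1):
            best[w] = max(best[w], best[w - weight] + value)
    return best[capacity]

# Example: filling a container of capacity 50 with 5 candidate items.
best_value = knapsack([60, 100, 120, 30, 40], [10, 20, 30, 5, 15], 50)
```

Exact methods like this scale poorly as the instance grows, which is precisely why such combinatorial problems are candidates for quantum annealers and QAOA-style approaches.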
solutions quickly becomes prohibitive, since optimization problems can increase exponentially with the size of the problem, quantum computing promises to provide an answer much more quickly. The consequences of such a change will be very beneficial: companies that master these technologies, or that are able to access them, will have an increased capacity for optimization and will improve their competitiveness.

Machine learning problems

Machine learning is a special class of optimization problems, a class of problems that can greatly benefit from parallel computations, and a set of techniques that has received a great deal of attention from the scientific community in recent years, so it is definitely a candidate for future quantum machines. Hybrid approaches involving both classical and quantum methods are being studied on platforms combining FPGAs (Field-Programmable Gate Arrays—flexible processors whose integrated circuit is itself reprogrammable), GPUs (Graphics Processing Units—processors dedicated to image processing), TPUs (Tensor Processing Units—processors dedicated to computation in artificial neural networks), and quantum equipment.

Certification challenges

There is also currently a great deal of theoretical work being done to develop new protocols and new approaches to system certification, particularly in the field of quantum security. In field deployments, it has been shown that, even with perfect proof of key security, attacks using physical devices (e.g. noise observation) can be carried out. The term quantum hacking has appeared. There are various approaches, ranging from work bringing together experts in quantum and classical security to the development of practical proofs of security and actions at the more fundamental level of quantum properties. Certification is also beginning to take business considerations into account so that devices and systems are certified to industry standards.
The standards themselves represent an important challenge that has begun to be addressed by research projects involving industry and academia as well as national metrology institutes.

“The combination of classical and quantum approaches often appears when we look closely at quantum engineering projects,” notes Alice. “I doubt that is a coincidence.” “Absolutely,” replies Alan, “and that’s also why quantum technologies need multidisciplinary knowledge in mathematics, physics, and computing. But you should also keep in mind the two following points. These hybrid approaches are needed right now to push back the frontiers of knowledge in quantum technologies, little by little. In the future, they will always be present, because quantum computing is not suited to all algorithmic problems.”
Discovering her calling

Yuan Yao is a Chinese student who discovered quantum physics at IMT Atlantique on a course with Professor Francesco P. Andriulli. Her passion for this mysterious field continued in 2017-2018 at the Technische Universität Berlin (Erasmus+ program), where she carried out a project with Professor Eckehard Schöll on “Non-Markovianity and Information Backflow”, which measures the information coming from the interaction between the quantum system and its environment. “Like everyone else, I was quite disturbed by what I was discovering in the quantum field, and I had many questions. What is the use of quantum physics? Is Schrödinger’s cat alive or dead? How do we use quantum entanglement in the real world? Where can I continue my studies in this field?” In 2019, after a meeting with Romain Alléaume and Filippo Miatto at Télécom Paris, she joined their quantum engineering program (see page ) and is about to start her PhD. Until then, she is doing an internship as a quantum engineer at Total. “During the six months I spent in the quantum engineering program, I took professional and specific courses on trends in the quantum field. We also have a research-oriented project. What I like in this training course is that it brings together students in the same field: we have courses with students from the Optical Institute and master’s students at Diderot University.” At Total, her internship is tutored by Henri Calandra and focuses on applying quantum algorithms in chemistry to simulate molecules. By using a quantum computer simulator platform, she can see a quantum computer’s results in real time. “Two algorithms are at the heart of my research: VQE (Variational Quantum Eigensolver) and the Quantum Approximate Optimization Algorithm (QAOA).
The first one is used to find the minimum eigenvalue of an operator by combining quantum and classical algorithms, and it allows us to calculate larger molecules in a reasonable amount of time compared to classical methods. Many companies have implemented it: Qiskit Aqua at IBM, Grove at Rigetti, as well as Atos, which has developed it on its quantum simulator.” The QAOA algorithm will be studied for its application in combinatorial optimization. Full of enthusiasm, Yuan Yao is delighted that more and more people are joining a burgeoning field, spurring on Quantum Machine Learning, a field which will integrate with so many others.

Profile of a quantum player Yuan Yao
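The idea behind VQE, minimizing the expected energy of a parameterized trial state, can be sketched in a few lines (a toy sketch of our own: a made-up 2x2 Hermitian “Hamiltonian”, a single-qubit Ry ansatz, and a brute-force parameter scan standing in for both the quantum circuit and the classical optimizer):

```python
import numpy as np

# Toy VQE-flavoured sketch: minimize <psi(theta)|H|psi(theta)> over a
# one-parameter ansatz psi(theta) = Ry(theta)|0>. The "Hamiltonian" H
# below is an arbitrary Hermitian 2x2 matrix chosen for illustration.

H = np.array([[1.0, 0.5],
              [0.5, -1.0]])

def energy(theta):
    # State prepared by a single Ry rotation on |0>: real amplitudes
    psi = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    return psi @ H @ psi          # expectation value of H

# Brute-force scan over the parameter (a real optimizer would be used here)
thetas = np.linspace(0.0, 2 * np.pi, 1001)
best = min(energy(t) for t in thetas)

# Exact ground-state energy for comparison (classical diagonalization)
exact = np.linalg.eigvalsh(H)[0]
```

On real hardware the energy is estimated by repeated measurements of a quantum circuit, and a classical optimizer updates theta; this hybrid loop is exactly the classical/quantum combination Yuan Yao describes.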
Quantum sensors and metrology

Since May 20, 2019, the International System of Units has been based on seven fundamental constants, including the Planck constant h, which is now used to calculate the precise value of the kilogram. Quantum electronic devices (semiconductors and superconductors) using the quantum Hall effect, for example, have been used to redefine these units. Measurement—using sensors—is an act at the heart of science. It can also be found at the heart of business activities that require metrological standards without which exchanges and transactions would never be accurate. Quantum sensors are needed for navigation, underground prospecting and mapping, materials analysis and characterization, medical analysis as well as the fundamental sciences, from the subatomic scale—using localized spins—to the planetary scale—based on photons. Some platforms are already close to commercial application; others need further engineering to be fully viable. A quantum sensor is a probe prepared in a particular quantum state, which interacts with the system to be measured, and whose reaction (projection of the values to be measured over space and disappearance or decoherence of the probe) allows us to estimate what we want to measure. The number of particles in the probe and the measurement time indicate the sensor’s effectiveness. While the precision of conventional sensors scales as √N with the number N of particles involved, the best quantum sensors can in principle achieve a precision scaling as N.

Atomic clocks

Today, the precision of time measurement has reached 1 second over 1 billion years. Optical clocks currently under study range from neutral atoms in optical arrays to highly charged ions and even nuclear transitions. Neutral atoms offer a high signal-to-noise ratio but are generally more sensitive to external fields and collision offsets, requiring careful control of their environment.
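The scaling advantage quantum sensors aim for can be illustrated numerically: with N uncorrelated probes the measurement uncertainty shrinks as 1/√N (the standard quantum limit), while entangled probes can in principle reach 1/N (the Heisenberg limit). A minimal sketch:

```python
import math

# Measurement uncertainty vs. number of probe particles N (sketch).

def standard_quantum_limit(n):
    return 1 / math.sqrt(n)      # uncorrelated (classical-like) probes

def heisenberg_limit(n):
    return 1 / n                 # entangled probes, in principle

# The advantage factor of the Heisenberg limit over the SQL grows as sqrt(N)
for n in (100, 10_000, 1_000_000):
    advantage = standard_quantum_limit(n) / heisenberg_limit(n)
    print(n, advantage)
```

With a million probe particles, the entangled strategy is in principle a thousand times more precise than the uncorrelated one, which is why the probe's particle number and coherence time determine a quantum sensor's effectiveness.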
Atomic sensors

This refers to atomic interferometry, which exploits the sensitivity of quantum superposition to create ultra-precise sensors to measure gravity, rotation, magnetic fields, and time, surpassing their best conventional counterparts and enabling inertial navigation without GPS. These sensors are ready to be translated into commercial products, such as magnetometers using diamond’s “NV” color center. Current atomic gravity sensors offer absolute measurements at the nano-g level or gravity gradient sensitivities exceeding a change of 100 pico-g at a distance of one meter.

Using the extreme sensitivity of coherent quantum systems to external perturbations to drastically improve the performance of measurements of physical quantities.
Optomechanical sensors

These are miniature mechanical systems (MEMS, NEMS, and their derivatives) that are currently integrated in chips and embedded. Over the last decade, a technological and scientific paradigm shift has taken place around the optical and quantum control of these devices. They can now be measured and controlled at the quantum level by coupling them to optical cavities or superconducting microwave circuits. Current research in this field explores the physical limitations of hybrid opto- and electromechanical devices for converting, synthesizing, processing, detecting, and measuring electromagnetic fields from radio and microwave frequencies up to a THz. Fields of application include medicine (magnetic resonance imaging), security (radar and THz surveillance), timekeeping, and navigation.

Quantum imaging

This involves acquiring images with an optical resolution beyond standard wavelength limits, with low light levels or with strong background lighting. Detecting details with this level of accuracy has immediate applications in the fields of microscopy, pattern recognition, image segmentation, and optical data storage. Correlations between beams of quantum light allow for new imaging modes such as “ghost imaging”, in which the image of an object illuminated by one beam is acquired by a camera that receives a different beam that has not reached the object.

Spin qubit detection

Using spin qubits for detection is a relatively new field. Detecting magnetic fields, which are of crucial importance for chemistry, biology, medicine, and material sciences, is naturally done with spin sensors, but these have also shown their ability to measure other quantities, such as temperature, electric fields, pressure, force, and near-field optics (with technologies based on diamond and silicon carbide defects). These sensors derive their advantage and robustness from the spins’ long-term quantum coherence.
All this research has led to progress in the creation of small quantum devices, which is of primary importance for future engineering concerns in the design of quantum computers.

MetaboliQs and ASTERIQS rely on doped-diamond technologies.
iqClock  The development of affordable onboard ultra-precise optical clocks.  12 partners  €10,092,468.75
MetaboliQs  Molecular sensors to diagnose cardiovascular diseases.  7 partners  €6,667,801.25
macQsimal  All types of sensors (autonomous vehicles, medical imaging, etc.).  13 partners  €10,209,943.75
ASTERIQS  Sensors for electric and magnetic fields, pressure, temperature, etc.  23 partners  €9,747,888.75
Projects funded as part of the Quantum Technologies Flagship in the “Quantum sensors and metrology” pillar for the period 2018-2021.
Qubits: the building blocks of quantum computing

It is often said that the qubit is to quantum computing what the bit is to classical computing, but the analogy has its limits. For example, saying that a qubit ∣𝛙⟩ can take any value between ∣0⟩ and ∣1⟩ might suggest that this is the classical interval between 0 and 1, whereas it is a combination of two quantum states (in the case of a two-level quantum system), a base state written ∣0⟩ and an excited state ∣1⟩, and this combination ∣𝛙⟩ = 𝑎∣0⟩ + 𝑏∣1⟩ follows strict rules written into quantum postulates, such as ∣𝑎∣² + ∣𝑏∣² = 1, 𝑎 and 𝑏 being complex numbers. A more realistic visual representation is the Bloch sphere (opposite), where the qubit is like a vector of unit length starting at the center of the sphere and ending at a point on its surface. The analogy is also misleading if we consider that bits can be assembled into bytes and represent information by themselves, be stored in memory, be processed by the gigabyte, be copied, etc. For example, while it is possible to transport the state of one qubit to a second qubit (teleportation), the first one is reset by this action, so it is not a copy in the classical sense. In the same way, qubits’ lifetimes (limited by decoherence) have an essential importance that bits do not have. On the other hand, as in classical computing, qubits (initialized in their state ∣0⟩) can be manipulated to use them as a computing medium according to a notion of circuits and (successions of) gates. These can process one, two, or more qubits, and are built with sophistication. For example, the Hadamard gate is a mechanism that acts on a single qubit and creates a superposed state, the direct measurement at the output of this gate giving an equal probability of finding a state of 0 or 1. The expertise lies in assembling these gates (both algorithmically and physically) in such a way that the output measurement has a meaning appropriate to the problem being studied.
(Figure: the Bloch sphere representation of a qubit ∣𝛙⟩, with basis states ∣0⟩ and ∣1⟩ and axes x, y.)

The “CNOT” gate is a so-called controlled gate. It is designed to act on two qubits, with a NOT (state inversion) operation being performed on the second if and only if the first is ∣1⟩. It acts like a controller of the second. This gate allows us to generate entangled states.

A dual matrix-and-graphic formalism allows the gates to be represented and made more familiar; the CNOT gate, for example, corresponds to the matrix

1 0 0 0
0 1 0 0
0 0 0 1
0 0 1 0

These gates are also basic building blocks of quantum computing. Depending on the type of the qubit’s physical support, some of them can be easily produced, and others cannot. When talking about qubits, it is important to keep in mind their mathematical representation, the realities of their physical implementations, and the particular properties of the quantum states on which they are based.
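As an illustrative sketch of our own (assuming NumPy, with the usual basis ordering ∣00⟩, ∣01⟩, ∣10⟩, ∣11⟩), the CNOT matrix can be applied directly to state vectors; combined with a Hadamard gate on the first qubit, it produces an entangled Bell state:

```python
import numpy as np

# CNOT acts on two qubits: it flips the second (target) iff the first (control) is |1>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
I = np.eye(2, dtype=complex)

# Start from |00>, put the first qubit in superposition, then apply CNOT.
ket00 = np.zeros(4, dtype=complex)
ket00[0] = 1
bell = CNOT @ np.kron(H, I) @ ket00

# The result is the entangled state (|00> + |11>)/sqrt(2):
print(np.round(bell, 3))  # amplitudes ≈ [0.707, 0, 0, 0.707]
```

Measuring either qubit of this state instantly determines the other, which is precisely the entanglement-generating behavior described in the text.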
QUANTUM ENGINEERING

“What strikes me,” continues Alice, beginning to take the full measure of the quantum revolutions, “is that scientists have often made hypotheses and designed thought experiments at one point in time, but it took many years before we could verify them through experiments and then transform them into practical applications.”

“Indeed,” adds Alan, “most of the time the technology wasn’t quite ready yet, or no technology was even imagined. It’s almost a quantum constant in itself, and concerning quantum computation, it’s exactly what’s happening right now. To implement qubits, some manufacturers are even betting on quantum particles that haven’t even been discovered yet.”

“Qubits: I think that’s a topic I will need to spend some time on, and not compare them to computing bits too easily.”

“You’re right there, too. There are many physical ways to make qubits, and a variety of quantum states on which to build the superposition of states that makes them so interesting. It may also be necessary to use several physical qubits to obtain the equivalent of one logical qubit. But whatever the case, the quantum algorithms are already here, and have been for a long time in the case of Shor’s algorithm, and we mustn’t delay; we must invest in simulators and then in quantum computers. To build quantum engineering, we will need to take on a series of both scientific and technological challenges.”
Quantum simulation

Using well-controlled quantum systems to mimic the behavior of other quantum or complex systems that are less accessible to direct study.

The idea of designing quantum simulators dates back to 1982, when Richard Feynman pointed out that, since the evolution of a quantum system is a very difficult mathematical problem to solve, with the size of the corresponding Hilbert space increasing exponentially with the number of quantum states, only another, controllable quantum system could solve this problem. Quantum simulators are physical quantum devices that are precisely prepared or engineered to study an interesting property of a complex, interacting quantum or classical system. Used to address problems in the quantum world, they could yield previously unknown knowledge, creating a feedback loop so we can design increasingly sophisticated equipment. In any case, we must choose interesting, complex problems that can be simulated by physical systems that are easier to create, prepare, and control, and for which the results can be obtained by measurement.

The term quantum simulator covers several distinct varieties:
1. static simulators that study the properties of interacting systems, such as the characteristics of base states,
2. adiabatic computers, which are applied to classical optimization problems (these machines get their name from so-called adiabatic processes, in which the external conditions of a system change slowly enough for the system to adapt),
3. dynamic simulators that study the properties of systems far from equilibrium.

There are two possible implementations:
1. analogue, which reconstitutes a quantum system’s temporal evolution under very precise control conditions,
2. digital, based on quantum circuits performing quantum computation (see following pages).

A 512-qubit chip from D-Wave Two (Vesuvius), 2013. Credit: Steve Jurvetson, Flickr. CC BY 2.0.
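Feynman’s point about exponential growth can be made concrete with a back-of-the-envelope sketch of our own (not from the report): an n-qubit state vector holds 2ⁿ complex amplitudes, so a brute-force classical simulation doubles in memory with every added qubit.

```python
# Each added qubit doubles the size of the state vector: 2^n complex
# amplitudes, at 16 bytes each for double-precision complex numbers.
for n in (10, 20, 30, 40):
    amplitudes = 2 ** n
    gib = amplitudes * 16 / 2**30
    print(f"{n} qubits: {amplitudes:,} amplitudes, ~{gib:,.2f} GiB")
```

Around 30 qubits the state vector already needs about 16 GiB, and around 50 qubits it exceeds the memory of any classical machine, which is why a controllable quantum system is the only practical stand-in.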
Projects funded as part of the Quantum Technologies Flagship in the Simulation pillar for the period 2018-2021:
 Qombs: a new generation of simulators based on ultra-cold atoms that can be used to design quantum cascade lasers characterized by entangled emissions within the frequency comb. 10 partners. €9,335,635
 PASQuanS: the next generation of quantum simulators beyond 1,000 atoms or ions, fully programmable. 15 partners, including Atos BULL and French startup Muquans. €9,257,515

There are many present challenges: identifying models that are difficult for classical simulations to compute but interesting and important from a physical point of view; developing validation and verification tools; and designing experimental setups and implementations of sufficient size while maintaining a high degree of qubit control. One major challenge lies in determining whether a quantum simulation was executed correctly. The goal is to solve problems that are inaccessible to classical methods: a quantum simulator executes tasks that we cannot track effectively, while we want proof that the computation was carried out with precision. One approach is to assume that there are still appropriate parameter sets for which these models will become totally, or at least partially, accessible to classical simulation.

Some images depict quantum equipment (here the D-Wave 2000Q™) that is impressive given the cooling engineering required to reach an operating temperature close to absolute zero. However, they must not lead us to believe that all quantum hardware will be massive. Other kinds of physical qubits that operate at room temperature could allow for miniaturization.

Suggested reading: Introduction to the D-Wave Quantum Hardware on the D-Wave website.
Quantum computation and computers

The clever use of quantum properties, such as superposition, to manipulate quantum systems that are seen as elementary parts of an algorithm, greatly accelerating it.

With the race for quantum supremacy, quantum computer announcements come in quick succession, and it is often difficult to keep track. The DiVincenzo criteria (see opposite), which no qubit currently fulfills, are low-level criteria that provide an initial assessment. In addition to the distinction between analogue approaches (which are mainly used for simulations) and digital approaches (by circuits and gates), we must also distinguish between a universal quantum computer, onto which we can port all the algorithms we want, and a dedicated quantum computer that is specialized in certain kinds of problems, or even just a single task. The latter is what is currently emerging. Furthermore, today’s qubits are not yet pure enough to avoid computation errors. Announcements should be studied in light of their error rates. Given qubits’ current stability, upcoming announcements with figures reaching a hundred qubits will mean qubits without error correction.

A higher-order certification will take the following points into account:
A. Base functions: are the DiVincenzo criteria fulfilled for at least two qubits?
B. The quality of operations: has the error rate of all relevant operations been measured? Is it compatible with the error correction thresholds? Have all the ingredients of a fault-tolerant architecture been demonstrated?
C. Error correction: has quantum error correction been demonstrated, and is it effective? Are the logical error rates lower than the physical error rates?
D. Fault-tolerant operations: have the operations on the logical qubits been implemented so as to tolerate faults? Have these objectives been reached for a universal set of gates (Clifford + T)?
E. Algorithms: have complex, fault-tolerant algorithms and operations been implemented?
For more information: « Entwicklungsstand Quantencomputer » (Status of quantum computer development), Bundesamt für Sicherheit in der Informationstechnik (German Federal Office for Information Security). May 2018. 238 pages.

IBM Q quantum computer, 2018. Credit: Lars Plougmann, Flickr. CC BY-SA 2.0.
The criteria that David DiVincenzo proposed at IBM in 2000 as conditions for the practical realization of a quantum computer are as follows:
1. The number of qubits must be known precisely.
2. The operator must be able to initialize the qubit register in a stable state.
3. The coherence time must be sufficiently long.
4. A universal set of quantum gates must be available.
5. A qubit’s state must be measurable.

Projects funded as part of the Quantum Technologies Flagship in the Computation pillar for the period 2018-2021:
 OpenSuperQ: creation of a quantum computer based on superconducting circuits. 10 partners. €10,334,392.50
 AQTION: creation of an ion-based quantum computer. 9 partners. €9,587,252.50

Rethinking algorithms

Performing a quantum calculation requires steps which are not common on a classical computer. First, we must prepare all the qubits needed for the computation by placing each of them in an initial quantum state that is appropriate to the problem. This set of qubits then undergoes a sequence of operations through the quantum gates while their states change. Only at the end do we take a measurement of the final state, which gives us the desired solution: a probabilistic one, and not a deterministic one as in classical computing. So, it is reasonable to perform these steps several times to increase the reliability of the results. Classical computer architectures prepare the computations and measure the results. Therefore, a quantum computer is intrinsically hybrid.

The differences in nature between classical computing bits and quantum qubits (see page ), including the impossibility of copying a given qubit’s state, for example, have led to a profound rethinking of the way in which calculations are performed on data. This has resulted in the modernization of some classical algorithms, targeting classical architectures and improving the knowledge of current computing.
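The probabilistic readout described above, and the reason for repeating a computation several times, can be sketched classically: each run (“shot”) of a circuit yields one outcome, drawn with probability equal to the squared amplitude, and repeated runs estimate the distribution. A minimal sketch of our own in Python (the final-state probabilities 0.8/0.2 are hypothetical, chosen for illustration):

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Hypothetical final state sqrt(0.8)|0> + sqrt(0.2)|1>: a single measurement
# yields 0 with probability 0.8 and 1 with probability 0.2.
probs = np.array([0.8, 0.2])

# One run gives a single noisy sample; many repeated runs ("shots")
# converge on the underlying probabilities, which is why quantum
# computations are executed several times.
estimates = {}
for shots in (10, 100, 10_000):
    outcomes = rng.choice([0, 1], size=shots, p=probs)
    estimates[shots] = outcomes.mean()   # empirical frequency of outcome 1
    print(shots, estimates[shots])
```

With 10 shots the estimate is crude; with 10,000 it settles close to the true 0.2, mirroring the reliability gained by repetition.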
A milestone called NISQ

As we move toward universal quantum computers, an article published in early 2018 by physicist John Preskill introduced the concept and term NISQ (Noisy Intermediate-Scale Quantum), which has been widely circulated since then. NISQ is defined as the current and future class of computers with an intermediate number of qubits (50 to a few hundred) that are controlled imperfectly and subject to a residual quantum noise that remains acceptable enough to provide scientific results worthy of interest.

« Quantum Computing in the NISQ era and beyond », John Preskill, January 2018.
Profile of a quantum player: Adrien Facon

Towards post-quantum cryptography

While quantum technologies offer tremendous opportunities, they also pose a threat to the security of today’s information systems. These systems are largely based on problems that are said to be computationally difficult for a classical computer as we know it today, such as factoring large numbers (for RSA) and discrete logarithm problems (on elliptic curves), and they will not resist attacks from quantum machines for very long. “We need to find new problems that are difficult to solve, ones that are fundamentally complex,” explains Adrien Facon, Director of Research and Innovation from 2016-2019 at Secure-IC, a leader in integrated security. “That’s what post-quantum cryptography is all about.”

After studying at the École Polytechnique and doing research at the Harvard University Physics Department, Adrien continued his studies at the École Normale Supérieure in Paris before joining the CQED (Cavity Quantum Electrodynamics) research team at the Collège de France. He defended his doctoral thesis entitled “Chats de Schrödinger d’un atome de Rydberg pour la métrologie quantique” (Schrödinger cat states of a Rydberg atom for quantum metrology) under the supervision of Professor Serge Haroche. Adrien’s work is a major step forward in the field and was published in the journal Nature.

Adrien Facon, who moved on to innovative industry after completing his thesis, is now project leader of the French national post-quantum cryptography program (PIA RISQ), which brings together leading French players from academia (ENS Paris, CEA, Inria, UPMC, PCQC, University of Versailles and IRISA), industry (Secure-IC, Thales and formerly Gemalto, Orange, CS, Airbus, and CryptoExperts), and the ANSSI (French National Cybersecurity Agency).
The project submitted more than a quarter of the candidates selected for Phase 2 by the United States’ NIST as part of an international effort to develop new standards for public-key cryptography, digital signatures, and key exchange. “The gradual transition to post-quantum encryption systems, particularly via hybrid crypto-systems, will permanently safeguard the confidentiality and integrity of our communications, starting today,” the researcher adds. Since September 2017, Adrien Facon has also been a part-time lecturer at Télécom Paris and an associate researcher in the IT department of the École Normale Supérieure within the Information Security group headed by David Naccache.

A sensitive electrometer based on a Rydberg atom in a Schrödinger-cat state. July 2016. https://www.nature.com/articles/nature18327

Regroupement de l’Industrie française pour la Sécurité Post-Quantique: http://www.risq.fr/
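The reliance of RSA on factoring hardness, mentioned above, can be shown with a toy sketch of our own (in no way real cryptography): for a tiny modulus, classical trial division recovers the secret factors instantly, while for a 2048-bit modulus it is infeasible classically; Shor’s algorithm on a large quantum computer would change that, which is what motivates post-quantum cryptography.

```python
def trial_factor(n: int) -> int:
    """Return the smallest prime factor of n by classical trial division."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n  # n itself is prime

# Textbook toy RSA modulus: n = 61 * 53. Recovering p and q from n
# is exactly the problem whose hardness protects the private key.
n = 3233
p = trial_factor(n)
q = n // p
print(p, q)  # 53 61
```

Trial division takes time roughly proportional to the square root of n, which is astronomically large for real RSA moduli; Shor’s algorithm would instead run in polynomial time on a sufficiently large quantum machine.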
Fundamental research, more necessary than ever

Projects funded as part of the Quantum Technologies Flagship in the “Scientific Fundamentals” cross-cutting domain for the period 2018-2021:
 2D-SPIC: single-photon optical technologies on 2D substrates. 5 partners. €2,976,812.50
 S2QUIP: integrated circuits to provide quantum light sources on demand. 8 partners. €2,999,298.75
 QMICS: microwave quantum local area network and microwave single-photon detector. 8 partners. €2,999,595
 PhoQuS: developing a quantum simulation platform based on quantum fluids of light. 19 partners. €2,999,757.50
 MicroQC: demonstrating microwave-controlled 2- and n-qubit gates that are fast and fault-tolerant (trapped ions). 5 partners. €2,363,343.75
 SQUARE: research into using rare-earth ions as qubits for high-density integration. 8 partners. €2,990,277.50
 PhoG: developing photon sources from non-classical states of light. 5 partners. €2,761,866.25

The list of Quantum Flagship projects that fall under the “Scientific Fundamentals” cross-cutting domain speaks for itself. As ever, this research is not only abundant, but essential to continue paving the way for industrial solutions. This includes finding new ways of producing quantum systems that are easy to manipulate (the space required, production and operating temperature, etc.), sufficiently stable, affordable, available in large quantities, etc. Here, both the basic physics research and the need for multidisciplinary approaches are key. Several research teams from IMT Atlantique and its partners, both academic and industrial, are participating in this movement. In the Subatech laboratory in Nantes, researcher Audrey Francisco-Bosson uses the ALICE detector at CERN’s Large Hadron Collider to probe the very depths of matter. In particular, she is working on the J/ψ particle to better understand the behavior of the quark-gluon plasma.
This research allows us to test the properties and laws of quantum chromodynamics, the theory that describes the strong interaction between quarks, and to check whether the model we use to describe matter is correct. In Brest, Vincent Castel and his team are working on the magnetic properties of synthetic Yttrium Iron Garnet (YIG), which allows photons and magnons to be strongly coupled, creating magnon-polaritons, hybrid light-matter quasi-particles. This could pave the way for memories and sensors that can be used in quantum computers. Quantum is not only photonic; it is also magnetic!
The quest for ideal qubits

(Figure: qubit platforms include Josephson-effect superconductor junctions, neutral atoms, trapped ions, and diamond NV centers.)

Understanding the differences between conventional computing bits and qubits (page ) is necessary but not enough. The qubits themselves differ in nature, and to be able to find our way in the landscape of existing or expected quantum machines, the promised computing capacities, the universality or not of these calculations, the number of qubits needed to reach any performance threshold, or the promises of miniaturization, it is important to understand the different aspects by which we characterize qubits.

Two common questions

“Which qubit is the best?” is not the right question. There is no intrinsically ideal qubit, but rather qubits that are suited to what we expect from them, for example having the longest possible decoherence time (stability) or being able to be “moved” over a certain distance.

“Hundreds or thousands of qubits?” We need to distinguish between two things: a logical qubit can require deploying dozens of physical qubits to account for the physical limitations: the qubits in 2,000-qubit simulators are not as connected to each other as those in machines with a few dozen.

Managing errors

The results measured on quantum observables are not always reliable, and a vast field of research has developed around quantum error correction. The methods consist in ensuring redundancy of the physical qubits and then evaluating an average of their behavior, as well as applying classical numerical techniques in parallel.

Quantum volume

In the quest for qubit stability, what really counts is the ratio between the coherence time (currently a few hundred microseconds on superconducting circuits) and the number of operations that can be performed (from 1,000 to 10,000, compared to only a single one 20 years ago).
More than the number of qubits in a quantum machine, what counts is what the community now calls quantum volume: the effective volume of computation that can be performed before errors hide the result.

The search for entanglement and connectivity

An important aspect of a quantum machine’s architecture is the degree of connection a qubit can have in order to become entangled with other qubits: can it do this with just its neighbors, or with qubits that are much more distant? This underlying geometry has repercussions on these architectures’ scalability: does adding a qubit only allow for a quadratic increase in performance (as with D-Wave’s 2048-qubit machines), or does it allow for an exponential increase? In this regard, trapped-ion technology allows for total connectivity in addition to a rather long coherence time that requires less error correction, as the specialized startup IonQ highlights.
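The redundancy-and-averaging idea behind error management can be sketched with a purely classical analogy of our own: a repetition code decoded by majority vote. This is a simplification (real quantum codes cannot simply copy a state, because of the no-cloning theorem, and instead spread it across entangled qubits), but the intuition that many noisy physical carriers yield one more reliable logical carrier is the same:

```python
import random

random.seed(1)

def logical_error_rate(p: float, copies: int, trials: int = 100_000) -> float:
    """Encode one bit into `copies` noisy physical copies; decode by majority vote.

    Each copy flips independently with probability p; decoding fails when a
    majority of copies is corrupted. Returns the empirical failure rate.
    """
    errors = 0
    for _ in range(trials):
        flips = sum(random.random() < p for _ in range(copies))
        if flips > copies // 2:
            errors += 1
    return errors / trials

p = 0.05                       # per-copy error probability (arbitrary, for illustration)
r1 = logical_error_rate(p, 1)  # no redundancy: logical rate ≈ p
r3 = logical_error_rate(p, 3)  # 3-copy majority vote: ≈ 3p² + p³, much lower
r5 = logical_error_rate(p, 5)
print(r1, r3, r5)
```

Each added layer of redundancy suppresses the logical error rate further, which is why a single logical qubit can consume dozens of physical qubits, as noted above.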