MEDICI’s 'Quantum Computing in Financial Services' report, a deep dive into the impact of Quantum Computing on the financial services sector, highlights key players in the ecosystem across hardware, software, and services, discusses the adoption of Quantum Computing by the financial services industry, and analyzes collaborative efforts exploring its early use cases in financial services.
Quantum Computing in Financial Services Executive Summary (MEDICI Inner Circle)
The ‘Quantum Computing in Financial Services’ report is an in-depth analysis of Quantum Computing and its applicability and impact on financial services. The report highlights key players in the ecosystem across hardware, software, and services, discusses the adoption of Quantum Computing by the financial services industry, and analyzes collaborative efforts exploring its early use cases in financial services.
Keynote presentation from Thursday, October 20, 2016 - Mr. Paul Ramsey (ACSG Section Montréal)
Keynote presentation from Thursday, October 20, 2016 - Mr. Paul Ramsey, given as part of Géomatique 2016
Very active in the Open Source community, he founded a company specializing in geomatics software development. He shared his experience of the symbiosis between technology and culture: Location Omniscience, Free Data, Free Software, Free Machines, and Utility Computing, a talk on technology trends accompanied by a bit of philosophy. He showed how the convergence of accessible technology, abundant data, and low-cost processing services opens the door to an infinite universe of possibilities. He also led the audience to reflect on the fact that we are watched at all times: whether by cameras, drones, algorithms, or other means, we are under constant surveillance.
An inspirational talk on AI (artificial intelligence) and machine learning, i.e., how to give birth to an AI. Introductory and intentionally kept simple for non-experts and non-technical executives. Care should be taken not to over-interpret some of the intentionally simplified statements in the presentation.
Introduction to Artificial Intelligence. It is not complex and should be relatively easy to follow. Be aware that, because of its high-level nature (and the absence of a voice-over), some care should be taken with the simplified examples used.
This document is a briefing on the Exponential Manufacturing conference organized by Singularity University in May 2016. We enriched it with examples and articles of our own.
Virtual/Augmented reality, digital tools and superpowers for health applicati... (Boo Aguilar)
Keynote presented at the mini-course held during the first Minas Gerais symposium on biomedical engineering at INATEL on August 14, 2015.
Regarding the order of things, I tried to compress the structure more or less as follows:
1 - A bit about FLAGCX and how we see science as a driving force that stretches the limits of technology.
2 - Some cool case studies, so we don't stay purely theoretical (Get shit done!)
3 - A bit about my work (and how I can only do what I do thanks to exponential growth, super tools, and other key factors). We covered VR, AR, the technologies I use, technical demos, and my personal short/medium-term outlook for this market and for healthcare technology.
4 - Finally, the part I most wanted to share with you: references to super tools for healthcare, software, services, and case studies of disruptive companies, from virtual/augmented reality to connected databases, artificial intelligence, the Uberization of health services, bioengineering, and decentralized platforms for education, training, and collaboration, etc.
5 - We closed with a call for you to stay inspired, take action, and help the human species transcend our limitations through our tools and our intellect.
As promised, here are all the references to revisit at your leisure. I hope you enjoyed the experience as much as I did, and know that you were the first class to whom I presented 170 slides in over 3 hours of talk without losing a single person's interest. You rock!
We're together in this ocean, let's do it!
#RadicalOpenness #Transcend ; )
Technical drivers of cloud centralization and megacorporate domination (Andrew Oram)
Much hand-wringing appears in the press about the seemingly unstoppable ascendance of a few large corporations in computing. Everything seems to be increasingly centralized in such corporations. This presentation will explain why such centralization and the triumph of first movers is facilitated by three technological factors: the end of Moore's Law, compiling complex algorithms into hardware (which may reach its climax in quantum computing), and the value of aggregating large amounts of data. Remedies are also discussed.
My keynote at the Open Exchange Summit in Nashville on April 18, 2018. I talk about the implications for many different kinds of companies of the fact that increasingly large segments of our economy are being dominated by algorithmically managed network marketplaces.
The following document was prepared by InPeople Consulting & UpsideRisks as a result of their participation in the Exponential Finance conference and their own research.
In connection with the EU Water Project awarded to CNR Catania, I gave this inspirational talk to physics students at Catania University and old research colleagues on how to transition from academia to the often very non-scientific corporate world: how to keep your sanity and curiosity (i.e., “But why?”) and continue to have fun throughout your career.
Futurist Speaker Gerd Leonhard: Bottom Line Future Trends (summary) (Gerd Leonhard)
These are some of my favourite memes and bottom lines from 10+ recent slideshows and presentations; see http://www.futuristgerd.com/category/gerd/gerds-presentations/ and www.gerdtube.com for videos.
If you enjoy my slideshares please take a look at my new book “Technology vs Humanity” http://www.techvshuman.com or buy it via Amazon http://gerd.fm/globalTVHamazon
More at http://www.futuristgerd.com or www.gerdleonhard.de
Download all of my videos and PDFs at http://www.gerdcloud.net
About my new book: are you ready for the greatest changes in recent human history? Futurism meets humanism in Gerd Leonhard’s ground-breaking new work of critical observation, discussing the multiple Megashifts that will radically alter not just our society and economy but our values and our biology. Wherever you stand on the scale between technomania and nostalgia for a lost world, this is a book to challenge, provoke, warn and inspire.
SoLoMo: The Future of Business in a Networked Society (Gerd Leonhard)
An edited version of my presentation at BPost in Brussels, June 26, 2012, on the future of business, publishing, ecommerce, marketing, social media... and print:) Enjoy and spread the word
We hear specific technology terms more and more frequently; however, some people may not know what they mean.
My goal is to help you understand the topics that are changing our world and will most likely continue to play an integral part in how we interact with technology.
What is Big Data? With Victoria Galano, Data Scientist at Air France (Jedha Bootcamp)
Over the last five years, we have created more data than in all of previous human history. We now produce so much data that it is becoming difficult to manage; this is what we call Big Data. In this workshop, we discuss the challenges of Big Data and its concrete applications in our society.
What's Next? Megatrends Shaping Tomorrow's Society and Rebooting Democracy (Nino Lo Cascio)
Megatrends Shaping Tomorrow's Society & Rebooting Democracy;
- IT Industrialisation
- Information Explosion
- "Everyware" - The Mobile Internet
- Natural UI
- Aging Population
- Digital Natives
- New emerging democracy model
- Scenarios 2020
A Chinese team of researchers has recently unveiled the world’s most powerful quantum computer – capable of manipulating 66 qubits of data. At the same time, a team at Cambridge University in the UK has created a quantum computing desktop operating system – which could be as significant a step in bringing quantum capabilities into the mainstream as Microsoft’s development of MS-DOS and Windows was for classical desktop computing.
Research Paper: Quantum Computing
(Student’s Name)
(Professor’s Name)
(Course Title)
(Date of Submission)
Abstract
Quantum computers mark a new era of invention, and much of their innovation is still to come. The quantum computing revolution has created challenges for ethical decision-making and prediction at many levels of life, raising new concerns such as invasion of privacy and national security. Quantum computers could easily be used to access and steal private information and data; on the other hand, they can also help eliminate such unethical intrusions and secure that information.
Quantum computers will be the most powerful computers in the world and will open the door to encrypting information in much less time. Today's supercomputers sometimes take many hours to encrypt data, whereas a quantum computer could do the same work in a much shorter time, making the resulting data far harder to decrypt.
Many years from now, quantum computers will become mainstays throughout the world of computing. They will serve the individual and the community, but there is significant concern that quantum computers could be used to invade people’s privacy (Hirvensalo, 2012).
Literature Review
Quantum computing is the area of study aimed at implementing the principles of quantum theory in computer technology. The field of quantum mechanics arose from German physicist Max Planck’s attempts to describe the spectrum emitted by hot bodies; specifically, he wondered why the color of a flame shifts from red to yellow to blue as its temperature increases.
https://www.stratfor.com/analysis/approaching-quantum-leap-computing
There has been tremendous development in quantum computing since then, and more research is being done to realize its full potential. Quantum computing depends on the quantum laws of physics. Rather than storing information as 0s or 1s as conventional computers do, a quantum computer uses qubits, which can be 1 or 0 or both at the same time. Quantum superposition, together with the quantum effects of entanglement and quantum tunneling, enables a quantum computer to consider and manipulate all combinations of bits simultaneously. These effects make quantum computation powerful and fast (Williams, 2014).
http://www.dwavesys.com/quantum-computing
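The superposition idea above can be sketched in a few lines of plain Python: a qubit is just a pair of amplitudes, and a Hadamard gate turns a definite |0⟩ into an equal superposition. This is a toy classical simulation for illustration only, not quantum hardware or a real quantum SDK.

```python
import math

# Toy classical simulation of one qubit, for illustration only:
# a qubit is a pair of amplitudes [amp_0, amp_1].
def hadamard(state):
    # The Hadamard gate maps |0> to an equal superposition of |0> and |1>.
    a, b = state
    s = 1 / math.sqrt(2)
    return [s * (a + b), s * (a - b)]

def probabilities(state):
    # Born rule: each outcome's probability is the squared magnitude.
    return [abs(amp) ** 2 for amp in state]

zero = [1.0, 0.0]            # the definite state |0>
plus = hadamard(zero)        # equal superposition: both amplitudes 1/sqrt(2)
probs = probabilities(plus)  # both outcomes equally likely
```

Measuring the superposed state yields 0 or 1 with equal probability, which is what "a 1 or a 0 or both at the same time" refers to informally.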
Researchers in quantum computing have enjoyed a growing level of success. The first small 2-qubit quantum computer was developed in 1997, and in 2001 a 5-qubit quantum computer was used to successfully factor the number 15 [85]. Since then, experimental progress on a number of different technologies has been steady but slow, although the practical problems facing physical realizations of quantum computers can be addressed.
In his 2011 book “Physics of the Future,” author Michio Kaku predicted that Moore’s Law will end, and that this would turn Silicon Valley into a rust belt if a suitable replacement for silicon is not found. For the last four decades, Moore’s Law has come to represent unstoppable technological progress. At its heart is the observation that the number of transistors fabricated onto a chip doubles roughly every two years, with cost falling at a similar rate. It is very important to note that this law is an observation, not an actual physical or natural law. However, the 2010 update to the International Technology Roadmap for Semiconductors projected growth slowing after 2013, with densities doubling only every three years. We are hitting the limits of the number of electrons that can fit in a given area.
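The doubling arithmetic behind these two cadences is easy to sketch. The numbers below are illustrative only (the 2,300 starting value is the Intel 4004's 1971 transistor count), not figures taken from the roadmap itself.

```python
# Illustrative doubling arithmetic only; the 2,300 starting value is the
# Intel 4004's 1971 transistor count, not a figure from the roadmap.
def transistors(start, years, doubling_period_years):
    # Exponential growth: one doubling per doubling_period_years.
    return start * 2 ** (years / doubling_period_years)

base_1971 = 2_300
# Two-year doubling from 1971 to 2013 is 21 doublings:
by_2013 = transistors(base_1971, 2013 - 1971, 2)
# The slower three-year cadence after 2013 gives only 4 doublings by 2025:
by_2025 = transistors(by_2013, 2025 - 2013, 3)
```

The same 12-year span yields 4 doublings (16x) at the three-year cadence versus 6 doublings (64x) at the two-year cadence, which is what "growth slowing" means concretely.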
One option to overcome this limitation is to create quantum computers that take advantage of the quantum character of molecules to perform the processing tasks of a conventional computer. Quantum computers could very possibly one day replace silicon chips, just as the transistor replaced the vacuum tube.
Revolution of Quantum Computing in the AI Era (Prince Barpaga)
What is a Quantum Computer?
How does a Quantum Computer Work?
How will Quantum Computing revolutionize AI?
Can a Computer think like a human?
These are all the questions that I seek to answer in my presentation which was delivered at York University on 22nd March at Lassonde School of Engineering.
Presentation done and delivered by Prince Barpaga
What is a quantum computer? (troutmanboris)
What is a quantum computer? A quantum computer harnesses some of the almost-mystical phenomena of quantum mechanics to deliver huge leaps forward in processing power. Quantum machines promise to outstrip even the most capable of today’s—and tomorrow’s—supercomputers.
They won’t wipe out conventional computers, though. Using a classical machine will still be the easiest and most economical solution for tackling most problems. But quantum computers promise to power exciting advances in various fields, from materials science to pharmaceuticals research. Companies are already experimenting with them to develop things like lighter and more powerful batteries for electric cars, and to help create novel drugs.
The secret to a quantum computer’s power lies in its ability to generate and manipulate quantum bits, or qubits.
What is entanglement? Researchers can generate pairs of qubits that are “entangled,” which means the two members of a pair exist in a single quantum state. Changing the state of one of the qubits will instantaneously change the state of the other one in a predictable way. This happens even if they are separated by very long distances.
Nobody really knows quite how or why entanglement works. It even baffled Einstein, who famously described it as “spooky action at a distance.” But it’s key to the power of quantum computers. In a conventional computer, doubling the number of bits doubles its processing power. But thanks to entanglement, adding extra qubits to a quantum machine produces an exponential increase in its number-crunching ability.
Quantum computers harness entangled qubits in a kind of quantum daisy chain to work their magic. The machines’ ability to speed up calculations using specially designed quantum algorithms is why there’s so much buzz about their potential.
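The correlated-measurement and exponential-state-space claims above can be illustrated with a toy statevector simulation in plain Python (illustrative only, not a quantum SDK): sampling a Bell pair only ever yields matching outcomes, and tracking n qubits classically requires 2**n amplitudes.

```python
import math
import random

# Toy statevector simulation, for illustration only. A two-qubit state
# is 4 amplitudes over |00>, |01>, |10>, |11>; the Bell state
# (|00> + |11>)/sqrt(2) is entangled.
bell = [1 / math.sqrt(2), 0.0, 0.0, 1 / math.sqrt(2)]

def sample(state, rng):
    # Draw one measurement outcome according to the Born rule.
    r, acc = rng.random(), 0.0
    for i, amp in enumerate(state):
        acc += abs(amp) ** 2
        if r < acc:
            return i
    return len(state) - 1

rng = random.Random(0)
outcomes = [sample(bell, rng) for _ in range(1000)]
# Only |00> (index 0) and |11> (index 3) ever occur: measuring one
# qubit fixes the other, the "predictable" correlation described above.

# Classically tracking n qubits needs 2**n amplitudes -- each added
# qubit doubles the bookkeeping, which is the exponential growth above.
amplitudes_needed = [2 ** n for n in (1, 2, 10, 50)]
```

This is why classical simulation of quantum machines breaks down beyond a few dozen qubits: at 50 qubits, the amplitude list already has about 10**15 entries.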
That’s the good news. The bad news is that quantum machines are way more error-prone than classical computers because of decoherence.
What is a qubit? Today's computers use bits—a stream of electrical or optical pulses representing 1s or 0s. Everything from your tweets and e-mails to your iTunes songs and YouTube videos is essentially a long string of these binary digits.
Quantum computers, on the other hand, use qubits, which are typically subatomic particles such as electrons or photons. Generating and managing qubits is a scientific and engineering challenge. Some companies, such as IBM, Google, and Rigetti Computing, use superconducting circuits cooled to temperatures colder than deep space. Others, like IonQ, trap individual atoms in electromagnetic fields on a silicon chip in ultra-high-vacuum chambers. In both cases, the goal is to isolate the qubits in a controlled quantum state.
Qubits have some quirky quantum properties that mean a connected group of them can provide far more processing power than the same number of binary bits. One of those properties is known as superposition and another is entanglement.
THE EVOLUTION OF HUMANITY'S GREATEST INVENTION, THE COMPUTER, AND ITS FUTURE (Faga1939)
This article presents how the computer, humanity's greatest invention, evolved and what its most likely future looks like. The computer is humanity's greatest invention because the worldwide computer network made possible the Internet, the technology that most changed the world with the advent of the information society. IBM developed the mainframe computer starting in 1952. In the 1970s, the dominance of mainframes began to be challenged by the emergence of microprocessors, whose innovations greatly facilitated the development and manufacture of smaller computers, then called minicomputers. In 1976, the first microcomputers appeared, at costs that were only a fraction of those charged by manufacturers of mainframes and minicomputers. The existence of the computer provided the conditions for the advent of the Internet, undoubtedly one of the greatest inventions of the 20th century, whose development began in 1965. At the beginning of the 21st century, cloud computing emerged, symbolizing the tendency to place all infrastructure and information digitally on the Internet. Current computers are electronic because they are built from transistors in electronic chips, and these have limits: there will come a time when it is no longer possible to shrink one of the processor's components, the transistor. Quantum computers have emerged as the newest answer from physics and computing to the limited capacity of electronic computers; the Canadian company D-Wave claims to have produced the first commercial quantum computer. In addition to the quantum computer, Artificial Intelligence (AI) may reinvent computers.
Global Expert Mission Report “Quantum Technologies in the USA 2019” (KTN)
Innovate UK’s Global Missions Programme is one of its most important tools to support the UK Industrial Strategy’s ambition for the UK to be the international partner of choice for science and innovation. Global collaborations are crucial in meeting the Industrial Strategy’s Grand Challenges and will be further supported by the launch of a new International Research and Innovation Strategy.
The Global Expert Missions, led by the Knowledge Transfer Network (KTN), play an important role in building strategic partnerships, providing deep insight into the opportunities for UK innovation and shaping future programmes.
Find out more here: https://ktn-uk.co.uk/news/new-report-published-for-ktn-quantum-technologies-global-expert-mission-to-usa
The ‘A Deep Dive into the BNPL Ecosystem’ report analyzes the rapidly evolving BNPL sector across the world and the factors fueling BNPL payments. Read this report to learn about the drivers, partnerships, business models, regulatory landscape, and future trends impacting the BNPL ecosystem.
The Australia FinTech Report 2021 report is an in-depth analysis of the rapidly evolving FinTech sector in Australia. The report takes a close look at the dynamic FinTech startups in the continent to understand the factors driving innovation. Read Australia FinTech Report 2021 to discover what makes Australia’s FinTech landscape unique—CDR and Open Banking, M&A, the FinTech segments powered by a flourishing ecosystem, growth in the FinTech ecosystem, and much more!
MEDICI’s new ‘Open Banking’ report is a detailed analysis of the Open Banking landscape. Read about the evolution of Open Banking, the regulatory landscape, critical factors affecting the implementation of Open Banking, partnerships, market dynamics, and more!
MEDICI’s new ‘Indonesia FinTech Report 2021’ analyzes the country’s FinTech sector and trends in the last three years—a deep-dive by segments & subsegments, funding patterns, M&As, ecosystem partnerships, industry drivers, and perspectives drawn out of regulatory, geopolitical, economic, and market dynamics.
ESG Meets FinTech – A Strategic Analysis Executive Summary (MEDICI Inner Circle)
MEDICI’s new ‘ESG Meets FinTech – A Strategic Analysis’ covers the impact of financial technology on Environmental, Social, and Governance (ESG) criteria. It analyzes the various dimensions of ESG and sustainability in the context of FinTech.
MEDICI’s new ‘Africa FinTech Report 2020’ is a deep-dive into the sector; it analyzes segments, funding patterns, M&As, partnerships, and countries, and offers perspectives that have been drawn out of regulatory, economic, and market dynamics.
The Banking-as-a-Service 2.0 report is an in-depth analysis of the fast-evolving BaaS segment. In this report, we analyze the global landscape of specialized FinTech companies and banks that have BaaS as core to their business, funding and investment patterns since 2018, regulatory & market drivers, and a host of industry expert opinions.
Top 21 Decentralized Finance (DeFi) Projects - Executive Summary (MEDICI Inner Circle)
In this research report, MEDICI analyzes the DeFi landscape and brings to you 21 of the best projects. The report highlights the rapid growth of DeFi globally, how COVID-19 has catalyzed its growth, and some of the most prominent projects working to disintermediate different facets of finance, viz. lending, decentralized exchange, payments and wallets, derivatives, and insurance.
MEDICI's new India InsurTech Report 2020 explores the InsurTech sector in India. The report delves into what drives transformation in the sector, regulatory initiatives, funding & investment activity, prominent players, and business models.
Dr. Sean Tan, Head of Data Science, Changi Airport Group
Discover how Changi Airport Group (CAG) leverages graph technologies and generative AI to revolutionize their search capabilities. This session delves into the unique search needs of CAG’s diverse passengers and customers, showcasing how graph data structures enhance the accuracy and relevance of AI-generated search results, mitigating the risk of “hallucinations” and improving the overall customer journey.
Maruthi Prithivirajan, Head of ASEAN & IN Solution Architecture, Neo4j
Get an inside look at the latest Neo4j innovations that enable relationship-driven intelligence at scale. Learn more about the newest cloud integrations and product enhancements that make Neo4j an essential choice for developers building apps with interconnected data and generative AI.
Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ... (James Anderson)
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. A constant focus on speed to release software to market, combined with traditionally slow and manual security checks, has created gaps in continuous security, an important piece of the software supply chain. Today, organizations feel more susceptible to external and internal cyber threats due to the vast attack surface of their application supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
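As a rough illustration of what capturing a deployment bill of materials could look like, here is a minimal Python sketch. The field names and layout are assumptions for illustration, not a standard DBOM schema or the speakers' actual format.

```python
import hashlib
import json

# Minimal sketch of a deployment bill of materials (DBOM) record: what
# was deployed, to which environment, with which artifact digests.
# Field names here are illustrative assumptions, not a standard schema.
def make_dbom(service, environment, artifacts):
    entries = [
        {"artifact": name, "sha256": hashlib.sha256(content).hexdigest()}
        for name, content in sorted(artifacts.items())
    ]
    return {"service": service, "environment": environment,
            "artifacts": entries}

dbom = make_dbom(
    "payments-api", "prod",
    {"app.jar": b"<binary contents>", "config.yaml": b"replicas: 3"},
)
record = json.dumps(dbom, indent=2)  # the auditable record to store
```

Recording content digests at deploy time is what lets a later audit verify that what is running in production matches what was approved.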
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with a passion for making things work and a knack for helping others understand how things work. He has around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations on CI/CD and application security integrated into the software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
zkStudyClub - Reef: Fast Succinct Non-Interactive Zero-Knowledge Regex ProofsAlex Pruden
This paper presents Reef, a system for generating publicly verifiable succinct non-interactive zero-knowledge proofs that a committed document matches or does not match a regular expression. We describe applications such as proving the strength of passwords, the provenance of email despite redactions, the validity of oblivious DNS queries, and the existence of mutations in DNA. Reef supports the Perl Compatible Regular Expression syntax, including wildcards, alternation, ranges, capture groups, Kleene star, negations, and lookarounds. Reef introduces a new type of automata, Skipping Alternating Finite Automata (SAFA), that skips irrelevant parts of a document when producing proofs without undermining soundness, and instantiates SAFA with a lookup argument. Our experimental evaluation confirms that Reef can generate proofs for documents with 32M characters; the proofs are small and cheap to verify (under a second).
Paper: https://eprint.iacr.org/2023/1886
Securing your Kubernetes cluster_ a step-by-step guide to success !KatiaHIMEUR1
Today, after several years of existence, an extremely active community and an ultra-dynamic ecosystem, Kubernetes has established itself as the de facto standard in container orchestration. Thanks to a wide range of managed services, it has never been so easy to set up a ready-to-use Kubernetes cluster.
However, this ease of use means that the subject of security in Kubernetes is often left for later, or even neglected. This exposes companies to significant risks.
In this talk, I'll show you step-by-step how to secure your Kubernetes cluster for greater peace of mind and reliability.
GraphSummit Singapore | The Art of the Possible with Graph - Q2 2024Neo4j
Neha Bajwa, Vice President of Product Marketing, Neo4j
Join us as we explore breakthrough innovations enabled by interconnected data and AI. Discover firsthand how organizations use relationships in data to uncover contextual insights and solve our most pressing challenges – from optimizing supply chains, detecting fraud, and improving customer experiences to accelerating drug discoveries.
DevOps and Testing slides at DASA ConnectKari Kakkonen
My and Rik Marselis slides at 30.5.2024 DASA Connect conference. We discuss about what is testing, then what is agile testing and finally what is Testing in DevOps. Finally we had lovely workshop with the participants trying to find out different ways to think about quality and testing in different parts of the DevOps infinity loop.
UiPath Test Automation using UiPath Test Suite series, part 5DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 5. In this session, we will cover CI/CD with devops.
Topics covered:
CI/CD with in UiPath
End-to-end overview of CI/CD pipeline with Azure devops
Speaker:
Lyndsey Byblow, Test Suite Sales Engineer @ UiPath, Inc.
A tale of scale & speed: How the US Navy is enabling software delivery from l...sonjaschweigert1
Rapid and secure feature delivery is a goal across every application team and every branch of the DoD. The Navy’s DevSecOps platform, Party Barge, has achieved:
- Reduction in onboarding time from 5 weeks to 1 day
- Improved developer experience and productivity through actionable findings and reduction of false positives
- Maintenance of superior security standards and inherent policy enforcement with Authorization to Operate (ATO)
Development teams can ship efficiently and ensure applications are cyber ready for Navy Authorizing Officials (AOs). In this webinar, Sigma Defense and Anchore will give attendees a look behind the scenes and demo secure pipeline automation and security artifacts that speed up application ATO and time to production.
We will cover:
- How to remove silos in DevSecOps
- How to build efficient development pipeline roles and component templates
- How to deliver security artifacts that matter for ATO’s (SBOMs, vulnerability reports, and policy evidence)
- How to streamline operations with automated policy checks on container images
Enchancing adoption of Open Source Libraries. A case study on Albumentations.AIVladimir Iglovikov, Ph.D.
Presented by Vladimir Iglovikov:
- https://www.linkedin.com/in/iglovikov/
- https://x.com/viglovikov
- https://www.instagram.com/ternaus/
This presentation delves into the journey of Albumentations.ai, a highly successful open-source library for data augmentation.
Created out of a necessity for superior performance in Kaggle competitions, Albumentations has grown to become a widely used tool among data scientists and machine learning practitioners.
This case study covers various aspects, including:
People: The contributors and community that have supported Albumentations.
Metrics: The success indicators such as downloads, daily active users, GitHub stars, and financial contributions.
Challenges: The hurdles in monetizing open-source projects and measuring user engagement.
Development Practices: Best practices for creating, maintaining, and scaling open-source libraries, including code hygiene, CI/CD, and fast iteration.
Community Building: Strategies for making adoption easy, iterating quickly, and fostering a vibrant, engaged community.
Marketing: Both online and offline marketing tactics, focusing on real, impactful interactions and collaborations.
Mental Health: Maintaining balance and not feeling pressured by user demands.
Key insights include the importance of automation, making the adoption process seamless, and leveraging offline interactions for marketing. The presentation also emphasizes the need for continuous small improvements and building a friendly, inclusive community that contributes to the project's growth.
Vladimir Iglovikov brings his extensive experience as a Kaggle Grandmaster, ex-Staff ML Engineer at Lyft, sharing valuable lessons and practical advice for anyone looking to enhance the adoption of their open-source projects.
Explore more about Albumentations and join the community at:
GitHub: https://github.com/albumentations-team/albumentations
Website: https://albumentations.ai/
LinkedIn: https://www.linkedin.com/company/100504475
Twitter: https://x.com/albumentations
Generative AI Deep Dive: Advancing from Proof of Concept to ProductionAggregage
Join Maher Hanafi, VP of Engineering at Betterworks, in this new session where he'll share a practical framework to transform Gen AI prototypes into impactful products! He'll delve into the complexities of data collection and management, model selection and optimization, and ensuring security, scalability, and responsible use.
Observability Concepts EVERY Developer Should Know -- DeveloperWeek Europe.pdfPaige Cruz
Monitoring and observability aren’t traditionally found in software curriculums and many of us cobble this knowledge together from whatever vendor or ecosystem we were first introduced to and whatever is a part of your current company’s observability stack.
While the dev and ops silo continues to crumble….many organizations still relegate monitoring & observability as the purview of ops, infra and SRE teams. This is a mistake - achieving a highly observable system requires collaboration up and down the stack.
I, a former op, would like to extend an invitation to all application developers to join the observability party will share these foundational concepts to build on:
2. Introduction
Drawbacks of Traditional Computers and How Quantum Computing Can Fill the Gaps
Machines were created to ease the life of humans. From the invention of the wheel to the development of computers, humans have found ingenious
ways to make life easier. Charles Babbage, considered the ‘father of the computer,’ conceptualized and invented the first mechanical computer in the
early 19th century to facilitate navigation calculations. The principle of the modern computer was proposed by Alan Turing in his 1936 paper, On
Computable Numbers. Since then, computers have permeated all walks of life; a world without computers is now unimaginable. Supercomputers are
now used in computing-intensive tasks in numerous fields, including weather forecasting, climate research, and oil exploration. Artificial Intelligence
(AI), Machine Learning (ML), and cloud computing are some technologies that have helped us leverage the existing computing prowess of these
machines.
In classical computing, uncertainty is unacceptable. With quantum computers, however, it is an asset: they have the unique ability to learn about the world using probability, exploring multiple answers at once to arrive at complex decisions. Traditional computer science involves the flow and manipulation of bits, the basic units of information in a computer, which can hold the value 1 or 0 but not both simultaneously. With quantum computers, 1s and 0s give way to qubits, or quantum bits, the fundamental building blocks of quantum information, realized as two-state quantum mechanical systems. The power of qubits lies in their inherent ability to scale exponentially: a two-qubit machine allows for four calculations simultaneously, a three-qubit machine allows for eight, and a four-qubit machine performs 16 simultaneous calculations. As the technology develops, Quantum Computing could lead to significant advances in numerous fields, from chemistry and materials science to nuclear physics and financial services.
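The doubling described above is simply the size of the state space: n qubits are described by 2^n complex amplitudes. A toy NumPy sketch (illustrative only, not tied to any quantum SDK) makes the scaling concrete:

```python
import numpy as np

def uniform_superposition(n_qubits):
    """Return the statevector of n qubits placed in an equal superposition."""
    dim = 2 ** n_qubits  # each added qubit doubles the state space
    return np.full(dim, 1 / np.sqrt(dim))

for n in (2, 3, 4):
    state = uniform_superposition(n)
    print(n, "qubits ->", len(state), "simultaneous basis states")
```

The amplitudes are normalized so the squared magnitudes sum to 1, as probabilities must.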
All this leads to the question: what is the future of computing? Although Quantum Computing can no longer be called the future of computing (we are already building these machines), its applications and full potential will only be realized after several years. The often-discussed Moore’s Law, which predicted ever more powerful computer systems built from ever more transistors, is coming to an end because engineers can no longer make transistors meaningfully smaller or pack more of them onto a chip. Raw processing power is often regarded as the most crucial aspect of a computer system, but energy efficiency, device lifetime, and economic viability are just as important, especially for the large cloud data centers that power much of the web. These are among the many reasons that have pushed computer science engineers and corporations to look beyond traditional systems and toward areas such as Quantum Computing.
3. We have barely scratched the surface of Quantum Computing. New areas in the field such as
cloud-based Quantum Computing, allowing users to use these quantum-powered computers
through the internet, are focusing on making Quantum Computing more accessible. With the
limitations of traditional computing getting amplified and vast amounts of data being generated
every second, the rise of Quantum Computing systems to fill the gap is inevitable. Both
corporations and governments are focusing on developing Quantum Computing. India announced
a $1 billion fund for the National Mission on Quantum Technologies and Applications. In 2020, the
White House Office of Science and Technology Policy, the National Science Foundation, and the
Department of Energy in the US announced a $1 billion fund to establish 12 AI and quantum
information science research centers nationwide. Companies and governments are working
together to lead the next wave of innovations in computing systems; sustained innovation and
higher processing power will be needed before these systems can solve real-world problems at scale.
Outlook
4. Over the past fifty years, computers have become faster, smaller, and more powerful, transforming and impacting our society in innumerable ways. But like any exponential explosion of resources, this growth, as described by Moore’s Law, must eventually come to an end. Gordon Moore, the co-founder of Intel Corporation, observed that the number of transistors on a silicon chip doubled every year. In his paper in Electronics, he proposed that this rate of growth would continue, later revising it in 1975 to a more conservative doubling every two years. While not a law in the mathematical sense, Moore’s Law bore out: roughly every 18 months, transistors halved in size, so more of them could be packed onto a chip, driving the exponential growth of computing power for the subsequent 40 years. In the middle of the past decade, however, the laws of physics finally had their say, and transistors could not be shrunk much further; it is expected to be very difficult to push feature sizes beyond the 7 nm or 5 nm process nodes.
What Led to the Exploration of Quantum Computing?
Quantum Computing began in 1980 when physicist Paul Benioff proposed a quantum mechanical model of the
Turing machine. Richard Feynman and Yuri Manin later suggested that a quantum computer could simulate
things a classical computer cannot.
5. This trend is evident in the field of supercomputers as well. For the past 25 years, the growth rate in supercomputer performance was consistent at about 80% annually. Though there were changes from year to year, the growth rate stayed firm when measured over three-year increments; from 2002 to 2013, performance multiplied 1,000 times. Growth has remained exponential since 2013, but the rate has declined significantly, to approximately 40% each year.
Source: Explainthatstuff.com
50 Years of Moore’s Law
Year | Number of Transistors | Chip
1970 | 2,300 | Intel 4004
1972 | 3,500 | Intel 8008
1974 | 6,000 | Intel 8080
1979 | 40,000 | Motorola 68000
1982 | 134,000 | Intel 80286
1985 | 275,000 | Intel 80386
1989 | 1,000,000 | Intel 80486
1993 | 3,100,000 | Intel Pentium P5
1995 | 7,500,000 | Intel Pentium P6
2000 | 42,000,000 | Intel Pentium 4
2007 | 463,000,000 | AMD K10
2007 | 789,000,000 | IBM Power 6
2013 | 1,000,000,000 | Apple A7
2015 | 1,900,000,000 | Intel Broadwell
2016 | 3,000,000,000 | Qualcomm Snapdragon
2017 | 20,000,000,000 | AMD Epyc
2018 | 39,540,000,000 | AMD Epyc Rome
2019 | 30,000,000,000 | AWS Graviton 2
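As a quick sanity check on the chart above, the implied doubling time can be computed from its first and last data points (a rough back-of-the-envelope figure; actual node-by-node growth is lumpier):

```python
import math

# First and last points from the chart above
t0, n0 = 1970, 2_300            # Intel 4004
t1, n1 = 2019, 30_000_000_000   # AWS Graviton 2

doublings = math.log2(n1 / n0)               # how many times the count doubled
years_per_doubling = (t1 - t0) / doublings
print(f"{doublings:.1f} doublings -> one doubling every {years_per_doubling:.1f} years")
```

The result, roughly one doubling every two years, matches Moore’s revised 1975 prediction.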
Until now, we’ve relied on supercomputers (massive classical computers, often with thousands of classical CPU and GPU cores) to solve most hard problems. However, supercomputers aren’t adept at solving certain types of problems that seem easy at first glance, a key reason why we need quantum computers. Supercomputers don’t have the working memory to hold the myriad combinations of real-world problems; instead, they analyze each combination one after the other, making the process time-consuming. Unlike classical supercomputers, quantum computers can create vast multidimensional spaces in which to represent these large problems.
7. Prominent Hardware Players
Canadian Quantum Computing company D-Wave Systems is considered the world’s first company to sell computers that exploit quantum effects in their operation. On May 11, 2011, it announced D-Wave One, described as “the world’s first commercially available quantum computer,” operating on a 128-qubit chipset using quantum annealing. D-Wave does not implement a generic quantum computer; its machines perform specialized quantum annealing, a general method for finding the global minimum of a function through a process that uses quantum fluctuations.
In 2015, D-Wave’s 2X quantum computer with more than 1,000 qubits was installed at the Quantum Artificial Intelligence Lab at NASA Ames Research Center. Since then, the company has shipped 2,048-qubit systems. In 2019, D-Wave announced its next-generation Pegasus quantum processor chip with 15 connections per qubit instead of 6; the company said that the chip would have more than 5,000 qubits and produce less noise.
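Quantum annealing requires specialized hardware, but its classical cousin, simulated annealing, conveys the core idea of escaping local minima through controlled randomness. The sketch below is a plain-Python illustration; the cost function and cooling schedule are invented for the example:

```python
import math
import random

random.seed(0)  # reproducible demo

def simulated_annealing(cost, x0, steps=20_000, t0=2.0):
    """Minimize `cost` by sometimes accepting uphill moves, with falling odds."""
    x = best = x0
    for k in range(steps):
        temp = t0 * (1 - k / steps) + 1e-9      # linear cooling schedule
        cand = x + random.uniform(-0.5, 0.5)    # propose a nearby state
        delta = cost(cand) - cost(x)
        # Always accept improvements; occasionally accept worse states
        if delta < 0 or random.random() < math.exp(-delta / temp):
            x = cand
            if cost(x) < cost(best):
                best = x
    return best

# A bumpy 1-D landscape with many local minima; global minimum near x = -0.3
bumpy = lambda x: x * x + 2 * math.sin(5 * x) + 2
result = simulated_annealing(bumpy, x0=4.0)
print("found minimum near x =", round(result, 2))
```

A quantum annealer replaces the random thermal jumps with quantum fluctuations, but the shape of the search, starting hot and gradually settling into a low-cost state, is analogous.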
Honeywell is another major tech company making significant strides in the quantum computing sphere. In June 2020, it announced its first quantum computer, the System Model H0, with a record quantum volume of 64. In September 2020, Honeywell announced the new System Model H1 with a quantum volume of 128. IBM developed quantum volume in 2017 as a hardware-agnostic method to measure the performance of gate-based quantum computers and assist in their ongoing development. Honeywell claims that the H1’s quantum volume of 128 is the highest in the industry; the machine’s architecture consists of 10 fully connected qubits.
Enterprises can directly access the H1 via a cloud API, Microsoft Azure Quantum, and channel partners such as Zapata Computing and Cambridge Quantum Computing. DHL and Merck are among the companies that have partnered with Honeywell to use its quantum systems.
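The quantum volume figures quoted here are powers of two by design: a score of 2^n loosely means the machine can reliably run “square” circuits that are n qubits wide and n gate layers deep. A quick conversion of the quoted scores (this is the common log2 reading of the metric, not the official benchmark procedure):

```python
import math

# Quantum volumes quoted in the text
for name, qv in [("Honeywell H0", 64), ("Honeywell H1", 128), ("IBM 27-qubit system", 64)]:
    n = int(math.log2(qv))
    print(f"{name}: QV {qv} -> square circuits of size {n} x {n}")
```

So a quantum volume of 128 corresponds to 7-qubit, 7-layer test circuits, which is why a 10-qubit, fully connected, low-error machine can post the industry-leading score.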
8. Google rose to prominence in the quantum
computing world in 2019 when it announced that
Sycamore, its state-of-the-art quantum computer,
achieved "quantum supremacy.” The quantum
computer carried out a specific calculation beyond
the practical capabilities of regular, ‘classical’
machines. As per Google estimates, even the best
classical supercomputer would have taken 10,000
years to complete the calculation.
IBM has been exploring superconducting qubits since the mid-2000s, increasing coherence
times and reducing errors to enable multi-qubit devices from 2010. It claims to have built
the first quantum computer on the cloud in 2016. In 2017, IBM announced that it would
add two 20-qubit machines to its quantum cloud. The same year, IBM declared that it had
constructed a 50-qubit quantum processor. As of 2020, the company made 28 quantum
computers available.
In 2017, IBM was the first company to offer universal quantum computing systems via the
IBM Q Network. The network now includes more than 125 organizations, including
Fortune 500s, startups, research labs, and educational institutions. Partners include
Daimler AG, JPMorgan Chase, and ExxonMobil. Some use cases of IBM’s quantum computers include simulating new materials for batteries, modeling portfolios and financial risks, and simulating chemistry for new energy technologies.
IBM has launched IBM Quantum—an initiative that uses IBM’s full-stack approach,
including Quantum computing systems, together with software tools and cloud services to
build quantum systems for business and science applications. In 2020, the company
achieved a new milestone on its quantum computing roadmap, achieving its highest
quantum volume to date. Combining a series of new software and hardware techniques to
improve overall performance, IBM has upgraded one of its newest 27-qubit client-deployed
systems to achieve a Quantum Volume 64. Currently, it’s working on IBM Quantum Condor,
a 1,000+ qubit device, likely to be launched by the end of 2023.
Eyeing the future, Google has announced that it would
build a “useful, error-corrected quantum computer” by
the end of the decade. While current quantum computers are made up of fewer than 100 qubits, Google is targeting a machine with 1,000,000 qubits.
Like many other companies investing in quantum
computing, Google plans to offer its commercial-grade
quantum computing services over the cloud. Google
Cloud has announced its collaboration with Quantum
Computing startup IonQ to make its quantum
hardware accessible through its cloud computing
platform.
9. Monte Carlo Simulations
A Monte Carlo Simulation (MCS) is a model used to predict the probability of different outcomes when random variables are present. The technique is used to understand the impact of risk and uncertainty in quantitative analysis and decision-making. MCS provides a decision-maker with a range of possible outcomes and the probabilities resulting from an action: it shows the extreme possibilities (the outcomes of going for broke and of the most conservative decision) along with all possible consequences of middle-of-the-road decisions. The technique, first used by scientists who worked on the atom bomb during the Second World War, was named after Monte Carlo, the Monaco resort town renowned for its casinos.
In finance, a fair amount of uncertainty and risk is involved in estimating the future value of figures or amounts because of the wide variety of potential outcomes. MCS helps reduce this uncertainty and has multiple applications in finance. For example, in the development of trading systems, MCS refers to the process of using randomized simulated trade sequences to evaluate the statistical properties of a trading system.
Monte Carlo methods are used to evaluate risk and simulate prices for financial instruments that involve complex calculations and consume significant time and computational resources. Typically, these calculations are executed once overnight, so traders operating in volatile markets are forced to use outdated results. Providing traders with a high-speed Quantum Computing approach to these risk assessments would mean that simulations could be executed throughout the day, transforming the way financial markets worldwide operate.
Goldman Sachs tried to overcome this challenge through its partnership with QC Ware, a quantum software provider. Taking a significant step toward quantum advantage for financial applications, Goldman Sachs and QC Ware researchers designed quantum algorithms that outperform classical algorithms for Monte Carlo simulations and can be used on near-term quantum hardware. The research community is aware of quantum algorithms that can perform Monte Carlo simulations 1,000x faster than classical methods; however, these algorithms require error-corrected quantum hardware projected to be available in 10–20 years. Current quantum devices have very high error rates and can perform only a few calculation steps accurately before returning incorrect results. The newly developed algorithms offer ways to speed up Monte Carlo Simulations with Quantum Computing on near-term hardware expected to be available in the next 5–10 years.
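To make the classical baseline concrete, the sketch below prices a European call option by averaging the discounted payoff over many simulated price paths under geometric Brownian motion. All parameter values are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Illustrative market parameters (not real data)
s0, strike = 100.0, 105.0       # spot price, option strike
r, sigma, t = 0.02, 0.25, 1.0   # risk-free rate, volatility, years to expiry
n_paths = 200_000

# Simulate terminal prices under geometric Brownian motion
z = rng.standard_normal(n_paths)
s_t = s0 * np.exp((r - 0.5 * sigma**2) * t + sigma * np.sqrt(t) * z)

# The discounted average payoff estimates the option's fair value
payoff = np.maximum(s_t - strike, 0.0)
price = np.exp(-r * t) * payoff.mean()
print(f"Monte Carlo estimate of the call price: {price:.2f}")
```

With 200,000 paths the estimate lands close to the Black–Scholes closed-form value for the same parameters (about 8.7); the quantum algorithms discussed above aim to reach a given accuracy with far fewer effective samples.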
10. Adoption: Current and Future States
Quantum Computing is currently confined to laboratory experiments by a select few technology companies and niche startups. Commercial adoption of the technology is at least a decade away.
However, this does not necessarily imply a long wait before we put Quantum Computing to use. For the next five years, the most likely approach is to combine the power of classical and quantum computing into a hybrid computing framework. Hybrid computing uses the exponential computing power of quantum hardware to solve parts of a complex problem and combines the results with classical computing methods that solve the other parts. In early 2020, CaixaBank developed a machine learning algorithm leveraging hybrid computing to do better credit risk profiling based on a limited set of 1,000 user profiles. Google’s TensorFlow Quantum combines state-of-the-art machine learning and Quantum Computing algorithms in a hybrid model.
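The hybrid pattern can be sketched without any quantum hardware at all: a classical optimizer repeatedly adjusts the parameters of a quantum circuit based on a measured cost. Below, the "circuit" is a simulated single qubit, so the whole loop runs in plain NumPy; nothing here reflects any vendor's actual API:

```python
import numpy as np

def expectation_z(theta):
    """Simulate a one-qubit circuit: RY(theta)|0>, then measure <Z>."""
    state = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    return state[0] ** 2 - state[1] ** 2   # <Z> = P(0) - P(1)

# Classical outer loop: gradient descent on the "quantum" cost <Z>
theta, lr = 0.1, 0.4
for _ in range(100):
    # Parameter-shift rule: exact gradient from two circuit evaluations
    grad = 0.5 * (expectation_z(theta + np.pi / 2) - expectation_z(theta - np.pi / 2))
    theta -= lr * grad

print(f"optimal theta = {theta:.3f} (pi = {np.pi:.3f}), cost = {expectation_z(theta):.3f}")
```

The gradient uses the parameter-shift rule, which evaluates the same circuit at shifted parameters; this is the same trick variational hybrid algorithms use on real quantum hardware, where the quantum device only ever answers "what is the cost at these parameters?"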
Quantum computers need to amp up their processing power before their game-changing effects can reach the mainstream. The good news is that while processing power in classical computing grows at an exponential scale (as described by Moore’s Law), the power of quantum computers is projected to grow even faster: Rose’s Law, the quantum analogue of Moore’s Law, observes the number of qubits in quantum processors doubling roughly every year, and each added qubit doubles the size of the state space a machine can explore. A McKinsey study estimates that globally, only 2,000–5,000 quantum computers will be in operation by the end of this decade.
Although the availability of a competent quantum computer is many years away, Amazon Web Services, Google, and Microsoft provide
cloud-based quantum simulation capabilities for researchers and developers to try out potential applications. Amazon Braket is a
managed Quantum Computing service run in partnership with D-Wave, IonQ, and Rigetti. The availability of cloud-based computing
infrastructure and classical computing is expected to fuel adoption and accelerate learning in this decade. The industry expects that
during this period, quantum computational power will grow fast enough to start implementing specific algorithms and simulations to
solve some of the most complex problems in financial services.
11. Impact Areas & Early Use Cases
Quantum Computing is expected to impact multiple areas in the financial services industry. D-Wave and Accenture have together identified over 150 use cases across industries, most of which belong to financial services. The top sectors in financial services where Quantum Computing can make a critical contribution are Investments, Insurance, and Lending.
Lending
Quantum Computing enables better machine learning models for credit risk profiling with a smaller set of variables, without the need for large amounts of training data. Classical algorithms can slow down with too many variables in a dataset, adversely impacting performance. Quantum-based models, however, can process millions of risk scenarios in a fraction of the time and still deliver highly accurate assessments.
In early 2020, CaixaBank developed the first quantum-based machine learning algorithm to classify risks in Spanish banking. Supported by the Monetary Authority of Singapore (MAS), Tradeteq and Singapore Management University are exploring the application of quantum-based neural networks for better credit scoring of businesses, helping small businesses gain better access to trade finance.
Investments & Capital Markets
Multi-period portfolio optimization that accounts for transaction costs and changing market conditions is computationally complex and involves processing numerous variables. Quantum-based algorithms can process these variables faster and help make more accurate decisions on the optimal asset mix for efficient portfolios.
The sheer processing power of quantum computers can transform High-Frequency Trading (HFT) by making it more information-driven. Combinatorial optimization can improve trading algorithms by reducing the number of possible solutions. Designing the optimal trading trajectory, speeding up risk scenario analysis during stress testing of bank balance sheets, and better asset-liability management are other use cases that Quantum Computing is expected to transform.
Multiverse Computing, a Spanish startup, has a fairly mature quantum-inspired portfolio optimization tool that has been shown to generate twice the average ROI of classical computational methods, with risk and volatility remaining constant. Multiverse has worked with BBVA and Credit Agricole Corporate & Investment Bank in this area to achieve quantifiable results. As adoption improves and costs become manageable, one can imagine digital-centric fund platforms and robo-advisors becoming active users of quantum in the realm of investments.
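For contrast with the quantum-inspired tools mentioned above, the classical single-period version of portfolio optimization has a closed form. The expected returns and covariances below are invented for illustration; the multi-period problems described in the text, with transaction costs and changing market conditions, are much harder:

```python
import numpy as np

# Illustrative data for three assets: expected annual returns and covariances
mu = np.array([0.08, 0.12, 0.05])
cov = np.array([
    [0.040, 0.006, 0.002],
    [0.006, 0.090, 0.004],
    [0.002, 0.004, 0.010],
])

# Closed-form minimum-variance portfolio with weights summing to 1:
#   w = C^-1 1 / (1^T C^-1 1)
ones = np.ones(len(mu))
inv = np.linalg.inv(cov)
w = inv @ ones / (ones @ inv @ ones)

print("weights:", np.round(w, 3))
print("expected return:", round(float(w @ mu), 4))
print("volatility:", round(float(np.sqrt(w @ cov @ w)), 4))
```

Once the horizon is split into many periods and transaction costs make the objective combinatorial, closed forms disappear, which is exactly where annealing-style and quantum-inspired optimizers are being tried.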
12. Banks Experimenting with Quantum Computing
Hardware Players
Note: This is not an exhaustive list.
Software Players
13. Table of Contents
Introduction
What Led to the Exploration of Quantum Computing?
Complementary Fields of Computing Being Explored
Data Representation
Quantum Supremacy and Key Players
Impact on Financial Services
Adoption & Early Use Cases in Financial Services
Limitations
Conclusion
Research Methodology
14. Research Methodology
The ‘Quantum Computing in Financial Services’ report is a comprehensive study based on MEDICI’s proprietary FinTech data on startups, deep market intelligence derived from years of tracking this segment, and secondary research refined through brainstorming sessions. We reinforced our inferences for this report through in-depth interviews with segment experts to extract valuable market signals from the noise, identify market trends, and develop viewpoints.
MEDICI has ample information in quantitative and qualitative forms, further curated by our industry
analysts. In the secondary research process, we conducted an in-depth study of the global Quantum
Computing landscape by identifying and understanding the key stakeholders, drivers, trends,
challenges, and opportunities. Key sources referred to for secondary research include company and
industry reports, press releases, government and other official sources, and our partners.
Primary research is the foundation of this study. It complements secondary research with insights from veterans in the computing industry, founders of FinTech companies, venture capitalists, and other industry influencers. Over a two-month period, multiple interviews were conducted with industry experts to draw valuable insights for the report.
Research-based qualitative and quantitative findings and insights were curated by MEDICI analysts
to present a comprehensive view of the Quantum Computing landscape. These were further refined
through MEDICI’s years of experience in deeply tracking industry developments and bringing
together the ecosystem.
15. About
MEDICI is the world’s leading FinTech Research and Innovation Platform. MEDICI is a partner to banks, tech companies, and FIs globally, with over 13,000 FinTechs on the platform, enabling FinTechs to scale and create a global economic impact. MEDICI is committed to supporting the complex financial services ecosystem and enabling stakeholders to benefit from the industry’s accelerated growth and global impact.
Website: www.goMEDICI.com | Twitter: @gomedici
Global Contacts
Aditya Khurjekar
CEO & Founder
ak@goMEDICI.com
Amit Goel
Founder & CSO
amit@goMEDICI.com
Authors
Ravi Rathi
Principal, Research
ravi@goMEDICI.com
Salil Ravindran
Head of Digital Banking & Research
salil@gomedici.com
Sulesh Kumar
FinTech Strategy and Research
sulesh@goMEDICI.com
DISCLAIMER
All third-party trademarks (including logos and icons) referenced by MEDICI remain the property of their respective owners. Unless specifically identified as such, MEDICI’s use of third-party trademarks does not indicate any relationship, sponsorship, or endorsement between MEDICI and the owners of these trademarks.