The biggest danger to Blockchain networks from quantum computing is its ability to break traditional encryption. Google sent shock waves around the internet when it claimed to have built a quantum computer able to solve formerly impossible mathematical calculations, with some fearing the crypto industry could be at risk. Google states that its experiment is the first experimental challenge against the extended Church-Turing thesis, also known as the computability thesis, which claims that traditional computers can effectively carry out any "reasonable" model of computation.
2. Quantum Computing and Blockchain:
Facts and Myths
Prof. Ahmed Banafa
IoT-Blockchain-AI Expert | Faculty | Author | Keynote Speaker
Continuing Studies – Stanford
College of Engineering, San Jose State University, CA, USA
3. Prof. Ahmed Banafa has extensive research work with a
focus on IoT, Blockchain, cybersecurity and AI. He has
served as a faculty member at well-known universities and
colleges.
He is the recipient of several awards, including the
Distinguished Tenured Staff Award, Instructor of the
Year, and a Certificate of Honor from the City and
County of San Francisco.
He was named the No. 1 tech voice to follow by LinkedIn
(with 38k+ followers), featured in Forbes, IEEE-IoT and
MIT Technology Review, with frequent appearances on
ABC, CBS, NBC, BBC and Fox TV and radio stations.
He studied Electrical Engineering at Lehigh University,
Cybersecurity at Harvard University, and Digital
Transformation at MIT.
6. • The biggest danger to Blockchain networks from quantum computing
is its ability to break traditional encryption [3].
7. • Google sent shock waves around the internet when it claimed to
have built a quantum computer able to solve formerly impossible
mathematical calculations, with some fearing the crypto industry
could be at risk [7].
8. • Google states that its experiment is the first experimental challenge
against the extended Church-Turing thesis, also known as the
computability thesis, which claims that traditional computers can
effectively carry out any “reasonable” model of computation.
9. • What is Quantum Computing?
• Quantum computing is the area of study focused on developing
computer technology based on the principles of quantum theory.
• The quantum computer, following the laws of quantum physics,
would gain enormous processing power through the ability to be in
multiple states, and to perform tasks using all possible permutations
simultaneously [5].
10. • A Comparison of Classical and Quantum Computing
• Classical computing relies, at its ultimate level, on principles
expressed by Boolean algebra. Data must be processed in an
exclusive binary state (bits) at any point in time.
11. • While the time that each transistor or capacitor needs to be in
either the 0 or 1 state before switching is now measurable in
billionths of a second, there is still a limit as to how quickly these
devices can be made to switch state.
12. • As we progress to smaller and faster circuits, we begin to reach the
physical limits of materials and the threshold for classical laws of
physics to apply.
13. • Beyond this, the quantum world takes over.
• In a quantum computer, elementary particles such as electrons or
photons can be used, with either their charge or polarization acting
as a representation of 0 and/or 1.
14. • Each of these particles is known as a quantum bit, or qubit; the
nature and behavior of these particles form the basis of quantum
computing [5].
15. • Quantum Superposition and Entanglement
• The two most relevant aspects of quantum physics are the principles
of superposition and entanglement.
16. • Superposition: Think of a qubit as an electron in a magnetic field.
• The electron's spin may be either in alignment with the field, which is
known as a spin-up state, or opposite to the field, which is known as a
spin-down state.
17. • According to quantum law, the particle enters a superposition of
states, in which it behaves as if it were in both states simultaneously.
Each qubit utilized could take a superposition of both 0 and 1.
18. • Entanglement: Particles that have interacted at some point retain a
type of connection and can be entangled with each other in pairs, in a
process known as correlation.
19. • Knowing the spin state of one entangled particle - up or down - allows
one to know that the spin of its mate is in the opposite direction.
• Quantum entanglement allows qubits that are separated by
incredible distances to interact with each other instantaneously (not
limited to the speed of light).
20. • No matter how great the distance between the correlated particles,
they will remain entangled as long as they are isolated. Taken
together, quantum superposition and entanglement create an
enormously enhanced computing power.
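The correlation described above can be sketched in a few lines. This is a toy classical simulation (assuming NumPy is available), not real quantum hardware: it samples measurement outcomes of a Bell pair, the simplest entangled two-qubit state, and shows that the two qubits always agree.

```python
# Toy simulation of an entangled Bell pair (|00> + |11>)/sqrt(2).
# Sampling measurements repeatedly always yields perfectly correlated
# outcomes: both qubits 0 or both qubits 1, never a mix.
import numpy as np

rng = np.random.default_rng(0)

bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
probabilities = np.abs(bell) ** 2  # over outcomes 00, 01, 10, 11

outcomes = rng.choice(["00", "01", "10", "11"], size=1000, p=probabilities)
assert all(o in ("00", "11") for o in outcomes)  # outcomes always match
```

Knowing one qubit's result immediately tells you the other's, exactly as the slide describes.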
21. • Where a 2-bit register in an ordinary computer can store only one of
four binary configurations (00, 01, 10, or 11) at any given time, a 2-
qubit register in a quantum computer can store all four numbers
simultaneously, because each qubit represents two values.
• If more qubits are added, the increased capacity is expanded
exponentially [5].
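The 2-qubit register above can be illustrated with a small state-vector sketch (assuming NumPy; the gate names are standard, but this is an illustrative simulation, not how a real quantum computer operates at scale): applying a Hadamard gate to each of two qubits yields four simultaneous amplitudes, one per binary configuration.

```python
# A 2-qubit register as a state vector of 4 complex amplitudes.
# An n-qubit register needs 2**n amplitudes, hence the exponential growth.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)  # the |0> state

# Hadamard gate: puts a qubit into an equal superposition of 0 and 1
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# Apply H to each of two qubits, then combine with a tensor product:
# the register now carries amplitudes for 00, 01, 10 and 11 at once.
q0 = H @ ket0
q1 = H @ ket0
register = np.kron(q0, q1)

probabilities = np.abs(register) ** 2
for basis, p in zip(["00", "01", "10", "11"], probabilities):
    print(f"|{basis}>: {p:.2f}")  # each outcome has probability 0.25
```

Adding a third qubit would double the vector to 8 amplitudes, matching the exponential capacity claim in the slide.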
22. • Difficulties with Quantum Computers
• Interference - During the computation phase of a quantum
calculation, the slightest disturbance in a quantum system (say a stray
photon or wave of EM radiation) causes the quantum computation to
collapse, a process known as de-coherence.
• A quantum computer must be totally isolated from all external
interference during the computation phase.
23. • Error correction - Given the nature of quantum computing, error
correction is ultra-critical - even a single error in a calculation can
cause the validity of the entire computation to collapse.
• Output observance - Closely related to the above two, retrieving
output data after a quantum calculation is complete risks corrupting
the data.
24. • What is Quantum Supremacy?
• According to the Financial Times, Google claims to have successfully
built the world’s most powerful quantum computer [7].
• What that means, according to Google’s researchers, is that
calculations that would normally take more than 10,000 years to
perform, its computer was able to do in about 200 seconds,
potentially meaning that Blockchain, and the encryption that
underpins it, could be broken.
25. • Asymmetric cryptography used in crypto relies on key pairs, namely a
private and public key.
• Public keys can be calculated from their private counterpart, but not
the other way around.
• This is due to the computational infeasibility of certain mathematical
problems.
• Quantum computers are orders of magnitude more efficient at such
problems, and if the calculation can be done the other way then the
whole scheme breaks [3].
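The one-way property can be sketched with a toy example. Real wallets use elliptic-curve math, but modular exponentiation shows the same asymmetry; the prime, generator, and key below are illustrative stand-ins chosen small enough for brute force to finish.

```python
# Toy one-way key pair via modular exponentiation (a stand-in for the
# elliptic-curve math real wallets use; parameters are deliberately tiny).
p = 2_147_483_647   # prime modulus (2**31 - 1)
g = 7               # a primitive root modulo p
private_key = 123_457

# Deriving the public key from the private key is one fast operation...
public_key = pow(g, private_key, p)

# ...but going back means solving a discrete logarithm: classically,
# nothing much better than trying exponents one by one.
def brute_force_discrete_log(public, limit):
    value = 1
    for exponent in range(1, limit + 1):
        value = (value * g) % p
        if value == public:
            return exponent
    return None

assert brute_force_discrete_log(public_key, 200_000) == private_key
```

At realistic key sizes the search space is astronomically large for classical machines, whereas Shor's algorithm solves discrete logarithms in polynomial time on a sufficiently large quantum computer.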
26. • It would appear Google is still some way away from building a
quantum computer that could be a threat to Blockchain cryptography
or other encryption.
• "Google's supercomputer currently has 53 qubits," said Dragos Ilie, a
quantum computing and encryption researcher at Imperial College
London.
27. • "In order to have any effect on bitcoin or most other financial systems
it would take at least about 1500 qubits and the system must allow
for the entanglement of all of them," Ilie said.
• Meanwhile, scaling quantum computers is "a huge challenge,"
according to Ilie [1].
28. • Blockchain networks, including Bitcoin’s architecture, rely on two
algorithms: the Elliptic Curve Digital Signature Algorithm (ECDSA)
for digital signatures and SHA-256 as a hash function.
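The SHA-256 half of that pair is directly available in Python's standard library. This sketch shows Bitcoin's "double SHA-256" pattern and the avalanche effect: a one-character change produces an unrelated digest.

```python
import hashlib

def double_sha256(data: bytes) -> str:
    # Bitcoin applies SHA-256 twice to block headers and transactions
    return hashlib.sha256(hashlib.sha256(data).digest()).hexdigest()

h1 = double_sha256(b"example block header")
h2 = double_sha256(b"example block headeR")  # one character changed

print(h1)
print(h2)
# The two digests share no obvious relationship (the avalanche effect),
# which is what makes finding preimages or collisions infeasible.
assert h1 != h2 and len(h1) == 64
```

Grover's algorithm would only square-root the effort of attacking a hash like this, which is why SHA-256 is considered far less quantum-vulnerable than ECDSA.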
29. • A quantum computer could use Shor’s algorithm [8] to derive your
private key from your public key, but the most optimistic scientific
estimates say that even if this were possible, it won’t happen during
this decade.
30. • “A 160 bit elliptic curve cryptographic key could be broken on a
quantum computer using around 1000 qubits while factoring the
security-wise equivalent 1024 bit RSA modulus would require about
2000 qubits”.
31. • By comparison, Google’s measly 53 qubits are still no match for this
kind of cryptography, according to a research paper on the matter
published by Cornell University.
32. • But that isn’t to say that there’s no cause for alarm.
• While the native encryption algorithms used by Blockchain’s
applications are safe for now, the fact is that the rate of
advancements in quantum technology is increasing, and that could, in
time, pose a threat.
• "We expect their computational power will continue to grow at a
double exponential rate," said Google researchers.
33. • Quantum Cryptography?
• Quantum cryptography uses physics to develop a cryptosystem
completely secure against being compromised without knowledge of
the sender or the receiver of the messages.
• The word quantum itself refers to the most fundamental behavior of
the smallest particles of matter and energy.
34. • Quantum cryptography is different from traditional cryptographic
systems in that it relies more on physics, rather than mathematics, as
a key aspect of its security model.
35. • Essentially, quantum cryptography is based on the usage of individual
particles/waves of light (photon) and their intrinsic quantum
properties to develop an unbreakable cryptosystem (because it is
impossible to measure the quantum state of any system without
disturbing that system.)
36. • Quantum cryptography uses photons to transmit a key. Once the key
is transmitted, coding and encoding using the normal secret-key
method can take place.
• But how does a photon become a key? How do you attach
information to a photon's spin?
37. • This is where binary code comes into play.
• Each type of a photon's spin represents one piece of information --
usually a 1 or a 0, for binary code.
• This code uses strings of 1s and 0s to create a coherent message.
• For example, 11100100110 could correspond with h-e-l-l-o. So a
binary code can be assigned to each photon -- for example, a photon
that has a vertical spin ( | ) can be assigned a 1.
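The photon-to-bit mapping above can be sketched as a toy key exchange in the style of BB84. This is a purely classical simulation under simplified assumptions (the polarization symbols and helper names are illustrative, not real QKD hardware behavior): when sender and receiver happen to pick the same basis, the bit comes through; otherwise the result is random and that position is discarded.

```python
# Toy BB84-style sketch: encoding bits on photon polarizations.
import random

random.seed(42)

BASES = {
    "rectilinear": {"0": "-", "1": "|"},   # horizontal / vertical
    "diagonal":    {"0": "/", "1": "\\"},  # 45 / 135 degrees
}

def encode(bit, basis):
    return BASES[basis][bit]

def measure(photon, basis):
    # Measuring in the matching basis recovers the bit; a mismatched
    # basis yields a random result instead (as an eavesdropper would get).
    for bit, polarization in BASES[basis].items():
        if polarization == photon:
            return bit
    return random.choice(["0", "1"])

n = 8
alice_bits  = [random.choice("01") for _ in range(n)]
alice_bases = [random.choice(list(BASES)) for _ in range(n)]
photons = [encode(b, base) for b, base in zip(alice_bits, alice_bases)]

bob_bases = [random.choice(list(BASES)) for _ in range(n)]
bob_bits  = [measure(p, base) for p, base in zip(photons, bob_bases)]

# Keep only positions where the bases matched; those bits form the key.
key = [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
```

An eavesdropper measuring the photons in flight would disturb them, which the legitimate parties can detect by comparing a sample of their key bits.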
38. • “If you build it correctly, no hacker can hack the system.
• The question is what it means to build it correctly,” said physicist
Renato Renner from the Institute of Theoretical Physics in Zurich.
39. • Regular, non-quantum encryption can work in a variety of ways but
generally a message is scrambled and can only be unscrambled using
a secret key.
• The trick is to make sure that whoever you’re trying to hide your
communication from doesn’t get their hands on your secret key.
40. • Cracking the private key in a modern crypto system would generally
require figuring out the factors of a number that is the product of
two insanely huge prime numbers.
• The numbers are chosen to be so large that, with the given processing
power of computers, it would take longer than the lifetime of the
universe for an algorithm to factor their product.
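The multiply-easy, factor-hard asymmetry can be demonstrated at toy scale. The primes below are deliberately small so a naive trial-division search still finishes; real RSA moduli are around 2048 bits, at which size this search would indeed outlast the universe.

```python
# Toy demonstration: multiplying two primes is one fast operation,
# while recovering them from the product takes ~sqrt(n) divisions.
def trial_factor(n):
    # assumes n is odd, so only odd candidate factors are tried
    f = 3
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 2
    return None

p, q = 104_729, 1_299_709          # two (small) primes
n = p * q                          # fast: one multiplication
assert trial_factor(n) == (p, q)   # slow: tens of thousands of divisions
```

Shor's algorithm would factor such a product in polynomial time, which is exactly why RSA-style schemes are the headline casualty of large quantum computers.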
41. • Encryption techniques have their vulnerabilities.
• Certain products – called weak keys – happen to be easier to factor
than others.
• Also, Moore’s Law continually ups the processing power of our
computers.
• Even more importantly, mathematicians are constantly developing
new algorithms that allow for easier factorization.
42. • Quantum cryptography avoids all these issues.
• Here, the key is encrypted into a series of photons that get passed
between two parties trying to share secret information.
• The Heisenberg Uncertainty Principle dictates that an adversary can’t
look at these photons without changing or destroying them.
43. • “In this case, it doesn’t matter what technology the adversary has,
they’ll never be able to break the laws of physics,” said physicist
Richard Hughes of Los Alamos National Laboratory in New Mexico,
who works on quantum cryptography [6].