The document discusses quantum computing and some of its key concepts and challenges. It notes that quantum computing may help address problems in fields like the life sciences, physics, and chemistry that classical computing cannot. However, quantum computing is still in its early stages and faces challenges such as noise and instability. The document also cautions against wild claims about what quantum computing can achieve today, while noting its long-term potential if the engineering challenges can be overcome.
In a world that appears riven by social media, ill-informed opinion, rumour, and conspiracy theories in preference to facts and established truths, it can be alarming to see scientists, doctors, and engineers challenged by vacuous statements that often hold sway over the hard-won truths of science. Moreover, large numbers of people do not understand the ‘scientific method’ and what makes it so powerful.
Paradoxically, those challenging science and scientists based on their belief systems do so using technologies that can only be furnished by scientific methodologies. For sure, no religion, belief system, great political mind, anarchist, professional protester, or social commentator will produce a TV set, mobile phone, laptop, tablet, supercomputer, MRI scanner, AI system, or vaccine! But they will criticise, challenge, and be abusive based on their ignorance and inability.
So, this is the world that now influences the minds of young aspiring students, and this presentation is designed to go beyond the simple exposition and statement of the scientific principles and method, to provide an ancient, modern, and forward-looking perspective. It also includes a complex ‘worked example’ to highlight the rigour that must be applied to establish any truth!
We are engaged in an exponentially growing cyber war that we are visibly losing. It has been estimated that within the next 3 years the global cost will equal, or overtake, the UK's GDP, and it is clear that our defences are inadequate and often ineffective. Malware and ransomware continue to extort more money, and cause damage and inconvenience to individuals, organisations and society, whilst hacker groups, criminals and rogue states continue to innovate and maintain their advantage. At the same time, our defences are subverted and rendered ineffective as we operate in a reactive and prescriptive, after-the-fact mode with no foresight or anticipation.
In any war it is essential to know and understand as much about the enemy as possible; it is also necessary to establish the truth and validity of any situation or development. Doing this in the cyber domain is orders of magnitude more difficult than in the real world, but some of the relevant tools are now available or at an advanced stage of development. For example, fully automated fact checkers and truth engines have been demonstrated, whilst situational awareness technologies are commercially available. However, what is missing is some level of context assessment on a continual basis. Without this we will continue to be 'blind-sided' by the actions and developments of the attackers as they maintain their element of surprise along every line of innovation.
What do we need? In short: a Context Engine that continually monitors networks, servers, routers, machines, devices and people for anomalous behaviours that flag pending attacks as behavioural deviations, which are generally easy to detect. In the case of attacker groups we have observed precursor events and trends in network activity days ahead of some big offensive. However, this requires a shift in the defenders' thinking and operations away from the reactive and short term, toward long-term continual monitoring, data collection and analysis in order to establish threat assessments in real time.
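To make the idea of flagging behavioural deviations concrete, here is a minimal illustration (not the lecture's actual system): a rolling z-score over a stream of traffic counts, flagging any sample that departs sharply from its trailing baseline. All names, window sizes and thresholds are illustrative assumptions.

```python
import statistics

def flag_anomalies(samples, window=20, threshold=3.0):
    """Flag samples deviating more than `threshold` standard deviations
    from the mean of the trailing `window` samples."""
    flagged = []
    for i in range(window, len(samples)):
        baseline = samples[i - window:i]
        mean = statistics.fmean(baseline)
        stdev = statistics.pstdev(baseline) or 1e-9  # avoid divide-by-zero
        z = (samples[i] - mean) / stdev
        if abs(z) > threshold:
            flagged.append((i, samples[i], round(z, 1)))
    return flagged

# Steady traffic with one burst: only the burst should be flagged.
traffic = [100, 102, 98, 101, 99, 103, 97, 100, 102, 99,
           101, 98, 100, 103, 97, 99, 101, 100, 102, 98,
           100, 5000, 101]
print(flag_anomalies(traffic))
```

Real precursor detection would of course span many correlated signals over days rather than a single counter, but the principle of a continuously updated behavioural baseline is the same.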
The behavioural analysis of people, networks and ICT is at the core of our 'Context Engine' solution, which completes the triangle of Truth, Situation, and Context Awareness to provide defenders with a fuller and transformative picture. Most of the known precursor elements of this undertaking have been studied in some depth, with some behavioural elements identified on real networks and in some physical situations. Uncovering the unknowns can only add further accuracy!
It has been estimated that the global earnings of cyber criminals will equal or exceed the GDP of the UK sometime in the 2022/23 window. If this were the earning capability of a single country, it would be joining the G8! Clearly, we are losing the Cyber War hands down, and the time has long passed when we might ignore the threat scenarios surrounding us.
In this lecture we examine global networks from home and office through the ‘last mile,’ and on to national and international networks to identify the key vulnerabilities and points of potential ingress. We identify the cyber risks as escalating as we approach the periphery of all forms of network. For the most part, the core/carrier networks are virtually unassailable physically as they are dominated by terrestrial and undersea optical fibre cables.
Throughout the 'carrier' network levels, the difficulty of physical interception, together with the encryption, routing, and path diversity employed, renders them secure in the extreme. Attackers, therefore, tend to focus on the exploitation of people, devices, services, home and office appliances, and latterly, a poorly engineered IoT.
In reality, we are expanding the attack surface of the planet exponentially without due caution or care in the most exposed sectors and locations. And so, we explore potential tech and operational solutions for the future.
NOTE: This lecture is one of a series that has examined technology design and deployment, devices and the IoT, people fallibility, deviousness, internal and external threats.
In class, RED and BLUE Team exercises have also been conducted in support of the complete Cyber Security Package to date.
We are engaged in a war the like of which we have never seen or experienced before. Our enemies are invisible and relentless; with globally dispersed forces working at all levels and in all sectors of our societies. They are better organised, resourced, motivated, and adaptive than any of our organisations or institutions, and they are winning. This war is also one of paradox!
“The cost to many nations is now on a par with their GDP”
“No previous war has seen so many suffer so much to (almost) never retaliate”
“We are up against attackers who operate as a virtual (ghost-like) guerrilla army”
“No state can defend its population and organisations, and they stand alone - isolated and exposed”
“A real army/defence force would rehearse and play all day and very occasionally engage in warfare. We, on the other hand, are at war every day but never play, war-game, or anticipate new forms of attack”
To turn this situation around we need to understand our enemies and adopt their tactics and tools as a part of our defence strategy. We also have to be united and organised so that no one, and no organisation, stands alone. And we have to engage in sharing attack data, experiences and solutions.
All this has to be supported by wargaming, and anticipatory solutions creation.
The good news is; we have better, and more, people, machines, networks, facilities, and expertise than our enemies. All it requires is the embracing of advanced R&D, leadership, sharing, and orchestration on a global scale.
The past 25 years have seen a move toward the convergence of telephone and computer onto a single network. Whilst the telephone network enjoyed a unique and isolated development (and growth) of dedicated circuit switching for nearly 150 years, computing more naturally ventured into Ethernet (packet switching) and on to the internet in just 55 years.
So different are these networking concepts that it was originally thought they could never converge. But as the internet grew to outgun the old fixed telephone network and new mobile working, it became economically and technologically clear that convergence (VoIP) was possible and would most likely be transformational.
Having 'fixed' the conundrum of real-time communication using uncontrolled packets that introduce variable latency, a new 'monster' reared its head: cybersecurity! Telephone and mobile nets never suffered 'hacker attacks' to the same degree as the PC-dominated world, and so new provisions had to be made. These came in the form of end-to-end packet encryption and layered link encryption, with constraints on the number of end-to-end and node-to-node hops.
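The 'fix' for variable packet latency alluded to above is, in essence, the playout (jitter) buffer. A minimal sketch of the idea follows; it is an illustration of the general technique, not any particular VoIP stack, and omits the playout-delay timing that real implementations add.

```python
import heapq

class JitterBuffer:
    """Minimal playout buffer: re-orders packets by sequence number and
    releases them strictly in order, masking network-induced reordering."""
    def __init__(self):
        self._heap = []      # min-heap keyed on sequence number
        self._next_seq = 0   # next sequence number due for playout

    def push(self, seq, payload):
        heapq.heappush(self._heap, (seq, payload))

    def pop_ready(self):
        """Return all payloads that are now playable, in order."""
        out = []
        while self._heap and self._heap[0][0] == self._next_seq:
            _, payload = heapq.heappop(self._heap)
            out.append(payload)
            self._next_seq += 1
        return out

buf = JitterBuffer()
for seq, payload in [(0, "a"), (2, "c"), (1, "b"), (3, "d")]:
    buf.push(seq, payload)
print(buf.pop_ready())  # → ['a', 'b', 'c', 'd']
```

Real systems trade the depth of this buffer against added latency, which is exactly the tension between 'uncontrolled packets' and real-time speech that the convergence story turns on.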
Today, telephone calls mostly pass through a portion of the internet, private networks (PNs), and virtual private networks (VPNs), with a shrinking number still originating and terminating at old analogue and digital local loops with circuit switches. By and large, the core network is 'super secure', and it is in the new digital and old analogue periphery where the major risks reside. Within the next decade the full transformation to all-digital packet switching should be complete.
As per the internet; people, insider, malware, Denial of Service (DoS and DDoS) and other forms of attack persist, but the defences developed to combat these are formidable. In this lecture we address the attack scenarios and the defences to date and highlight some of the lesser-known/advertised approaches of both the defenders and the attackers.
The migration of the fundamentally analogue telephone from a circuit-switched network to one essentially designed for machine communications based on packet switching has not been entirely comfortable. It was not at all obvious that it might work, or indeed, that it might even be possible given the sensitivity of the human ear and mind to artificiality, noise and latency.
After serving humanity for well over 100 years, the analogue telephone network and devices have been overtaken by mobile computing devices offering far more facilities and power. So, despite the detailed testing and characterising of human speech, and the design and modelling of device and network abilities, we are saying goodbye to this past.
During the past 40 years a new world has emerged with intelligence and computing power at the edge of networks and not at the core. Layering speech and video on this new 'internet' has been a challenge, but now the performance and economics are more than viable. So, in this lecture we trace this history of development and illustrate the tech challenges with a series of audio demonstrations.
In short, we highlight the nature and impact of bandwidth, signal-to-noise ratio, latency, and packet loss through the old analogue to the new digital eras. We also present some 'off piste' examples of military and aircraft communications. Throughout, we also highlight the key design directions, failures and flaws.
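The interplay of bandwidth and signal-to-noise ratio highlighted above is captured by the Shannon-Hartley theorem, C = B log2(1 + S/N). A quick sketch (the worked figures are illustrative, not taken from the lecture):

```python
import math

def shannon_capacity(bandwidth_hz, snr_db):
    """Shannon-Hartley channel capacity in bit/s: C = B * log2(1 + S/N)."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# A 3.1 kHz analogue telephone channel at ~30 dB SNR:
c = shannon_capacity(3100, 30)
print(f"{c / 1000:.1f} kbit/s")  # roughly the classic ~30 kbit/s voiceband limit
```

The same relation explains why widening the band or cleaning up the noise floor were the two levers available across both the analogue and digital eras.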
From the beginning of the industrial revolution, we have built systems and machines on the basis that people will just have to learn the interface and adjust accordingly. And so the skill of the individual craftsman was overtaken and subverted by the expertise of the 'operator', the production line, and mass production, enabling us all to do more-and-more with less-and-less, raising living standards and the health and wealth of individuals and nations.
In effect, we bent humanity into technology to meet the specific needs (and will) of the machines, but to the greater benefit of humanity! Now we stand at the cusp of a new era in which AI and Robotics are able to adapt to our individual and most specific needs. That is: machines bend to meet our needs, empowering us as individuals and organisations to do and achieve ever more.
But there remains one last bastion of inconvenience, centred on ID and security, often referred to as 'Password Hell'. We are all awash with multiple Cards, Licences, Visas, Passports, Badges, Codes, PINs, Passwords, User Names, IDs, Log-On, Log-In, Entry, and Exit Protocols! And so it is time to get all of this out of the domain of the human and into the realm of our machines! Today we are in the process of migrating from a nightmare past of our own design into a biometric world where machines will recognise us and grant us access automatically. And at the fringe, some young populations are already being chipped in exactly the same way our pets have been chipped for the past decades.
Apart from the obvious advantage of not having to carry any money or ID of any kind, there is the assurance of extra safety, security and health support wherever we happen to be. It is not available right now, but beyond an ID Chip, we can easily embed, or provide links to, our medical record into the same technology. We, and not just our devices and possessions, also become a part of the IoT!
Of course, many see the threat of a looming dystopian future, aka Hollywood! But this will be a choice between convenience and greater security versus what we have today, and that choice has to remain ours! In this presentation we look at the widening spectrum of technologies available and the need to concatenate widely different techniques to exceed the accuracy of DNA and other human/biological parameters.
This lecture is the final session of an extensive wireless course delivered over several weeks at the University of Suffolk. So, by way of 'rounding-off' the series, we chart the progression of wireless/radio communication from the first spark transmitters through Carrier-Wave Morse, AM, FM, DSB-SC, and SSB to digital systems, along with the use of LW, MW, SW, VHF, UHF and microwaves. Whilst we focus on electromagnetic waves from 30kHz through 300GHz, we also mention optical, ultrasonic, and chemical communication as additional modes.
Our examinations detail the distinct genetic trails of 1G through 5G and the approximate development cycles/timeline, along with distinctive changes in design thinking. We then postulate that 6 and 7G are likely to form a new line of development, with 6G probably realised without any towers or any conventional cellular structure. In this context we also point out that there are no digital radios today, only traditional analogue designs with 'strap-on modems' at the transmitter and receiver. Perhaps more radically, we suggest that it is time to adopt fully digital designs that allow for the eradication of the established bands-and-channels mode of operation.
We also chart the energy-hungry progression of systems from 1 through 5G, where tower installations are now consuming in excess of 10kW due to the extensive signal processing employed. This immediately debunks any notion of another step in the direction of more bandwidth, lower latency, and greater coverage with >20x more towers (than 4G) and >250Bn power-hungry smart devices. In short: we propose that 5G is the last of the line, and the realisation of 6G demands new thinking and new modes that lead us away from W and mW toward µW and nW wireless designs.
Whilst most of the technology required for 6G is available up to 300GHz, there remains one big challenge in respect of the growing number of antennas per device and platform. Even for 3-5G + WiFi + BlueTooth, space is at a premium in mobile devices, and fractal antennas have not lived up to their promise to integrate all of these into one wideband structure. However, at 100GHz and above, antennas/dipoles become smaller than chip size, and tens of them can be included as phased arrays. But this all needs further work!
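The claim that antennas shrink below chip size at 100GHz follows directly from the half-wave dipole relation l = c / (2f). A quick sketch of the numbers (frequencies chosen for illustration):

```python
# Half-wave dipole length l = c / (2 f): above ~100 GHz the radiating element
# shrinks to millimetre scale, making on-chip phased arrays feasible.
C_LIGHT = 299_792_458.0  # speed of light in m/s

def half_wave_dipole_mm(freq_hz):
    """Length of an ideal free-space half-wave dipole, in millimetres."""
    return C_LIGHT / (2 * freq_hz) * 1000

for f_ghz in (2.4, 28, 100, 300):
    print(f"{f_ghz:>5} GHz -> {half_wave_dipole_mm(f_ghz * 1e9):6.2f} mm")
```

At 2.4GHz the element is of the order of 6cm, whereas at 100GHz it is around 1.5mm, comfortably within typical die dimensions, which is why arrays of tens of elements become practical at these frequencies.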
Throughout this lecture, we provide examples, demonstrations, and mind-experiments to support our assertions.
For millennia we have crafted artifacts from bulk materials that we have progressively refined to produce ever more precision tools and products. Latterly, we have crossed a critical threshold where our abilities now eclipse Mother Nature. For example, the smallest transistors in production today have feature sizes down to 2nm, smaller than a biological virus (~20-200nm). The implications for ICT, AI, Robotics, and Production are ever more profound as we approach, and most likely undercut, the scale of the atom (~0.1-0.4nm). Not only does this open the door to new technologies, it sees new and remarkable capabilities. So, in this presentation we look at this new Tech Horizon spanning robotics to quantum computing and sensory technologies, and how they will help us realise sustainable futures germane to Industry 4.0, 5.0, and beyond.
Part 1 of this two-part series was about rethinking and re-education: 'Attack Scenarios' approached the transformation process by getting students to think as if they were attackers, giving them the opportunity to design their own criminal empire on screen in RED Team mode. In Part 2, 'Defence Scenarios', they are challenged to get ahead of the game and to anticipate and respond ahead of an attack by recalling what they did as attackers.
In both Part 1 and Part 2 the detailed discussions occurred in camera and are not for publication or open public access.
Every industrial revolution has seen the progression from people-dominated design, build and production to higher degrees of automation, hand-in-hand with shortening timescales enabled by ever-more powerful technologies. However, at a fundamental level the process has remained the same, but it is now edging toward a continuum of evolution as opposed to a series of discrete jumps that often trigger company reorganisations. In concert, there is a realisation abroad that it is no longer about the biggest, the strongest, the best, or the fittest; it is now all about the survival of the most adaptable.
By and large it is relatively easy to predict when and where tech change will occur and the likely outcomes, in terms of existing and future products and services, but how people, customers, companies and societies will react is an unsolved puzzle. On another plane, competition and threats may well occur outside the sector, from a direction managers are not looking, by entirely new mechanisms, and at a most critical time. These are all challenges indeed!
How to adapt to, and cope with these collective challenges is the focus of this presentation which is illustrated and supported by past and present industrial cases along with the experiences and methodologies of those who have driven/weathered this storm as well as those who failed. Many of the illustrations are automated and there are exemplar movies and segue inserts throughout.
Our communications history is dominated by fixed networks of bounded linear predictability. These were based on precise engineering design giving assured information security, and measured operation. However, mobile devices, internet, social networks, IP, and Apps changed all that! Internets are inherently non-linear, unbounded, and essentially designoid — that is, mostly shaped by evolution, steered by demand/rapid innovation - highly adaptive and ‘learning’ in real time.
So, those who suppose we can control such networks to fully guard and protect the information of institutions and individuals are sadly mistaken. And further confounded by Industry 4.0 and the Internet of Things (IoT). Here, a mix of the information of individuals and things, is distributed across the planet on a scale far larger than ever conceived in the past, to become essential components in the survival of our species in realising sustainable societies.
Not surprisingly then, Privacy and Data Protection are big issues for regulators, governments and civil liberties organisations. But so far, nothing has worked, and we see the UK Data Protection Act, EU-GDPR, EU-USA Privacy Shield, and Copyright Laws often ignored or worked around. These are largely derivatives of a paper-based, pre-computing world and are now largely unfit for purpose.
This presentation was created in support of a short keynote for ICGS3-21 (14-15 Jan21) UK to purposely highlight the reasons why we are losing the cyber war and what we have to do to win. The approach adopted quantifies the key weaknesses and shortcomings of our current defence strategies to give pointers to a more secure future.
In postulating remedies, we purposely fall back on the wisdoms of Sun Tzu and The Art of War to highlight and explain the meaning and implications of quoted insights (below) and their pertinence to modern cyber wars/security.
“To know your Enemy, you must become your Enemy”
In this way, we go beyond opinion and suspicion by quantifying the scale of the individual elements of the cyber security equation using a variant of Drake’s Equation. This gives us a good estimate of the scale of the problems we face. Beyond this we highlight some cultural and political issues that need urgent attention.
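A Drake-style estimate of this kind can be sketched as a simple product of factors. All factor names and values below are illustrative placeholders, not the figures used in the presentation:

```python
# Illustrative Drake-style estimate of successful cyber attacks per year.
# Every value here is a hypothetical assumption for demonstration only.

factors = {
    "n_devices": 20e9,       # networked devices on the planet (assumed)
    "f_vulnerable": 0.30,    # fraction with an exploitable weakness (assumed)
    "f_targeted": 0.05,      # fraction actually probed per year (assumed)
    "f_breached": 0.10,      # probed devices that are compromised (assumed)
    "f_monetised": 0.20,     # breaches converted into loss/extortion (assumed)
}

estimate = 1.0
for name, value in factors.items():
    estimate *= value

print(f"Estimated monetised breaches per year: {estimate:,.0f}")
```

The value of the form, as with Drake's original, is less the final number than the discipline of naming and bounding each factor separately.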
Finally, we link to comprehensive presentations going back to 2016 that detail specific Red and Blue team exercises thinking and preparation. These themes were invoked to widen the awareness and thinking in the student body @ The UoS.
For the vast majority of history the progress of our species and civilisation was limited by a very few artisans - the workers of metal, wood, leather and cloth, along with farmers and distribution networks. Specifically, the number of skilled blacksmiths determined the rate of sword, knife, lance and armour production, and ultimately the size of empires.
The turning point came in the early 1700s when the Royal Navy was expanding to explore and colonise the planet. Nails were the problem, with more than 20k required per ship! So this was the first item to be made automatically, followed by wooden blocks for the rigging. The water mills constructed to power the production therefore mark the start of Industry 1.0 and the growth of the British Empire.
The spread of automation through Industry 2, 3 and 4 accelerated and empowered us to do more and more using less and less people, power and materials. Without it we could not support the population of the planet or the lifestyle we enjoy. Remarkably, at no time during this process have we seen mass unemployment, and consistently, more and more jobs have been created. In brief, better production capabilities have seen the creation of better tools, which in turn has led to better productivity and better quality.
The process has been evident in everything hardware, and in much of entertainment, design, and software, with services perhaps the last bastion of human-based delivery and support. However, the on-line world and rise of AI are now changing the balance across retail, banking, insurance, accountancy, and services in general.
CyberCrime represents one of the biggest threats to society and human progress to be encountered in the past 70 years. As a business, it is by far the biggest on the planet with a balance sheet that would see it joining the G8 within the next 3 years given its continued exponential growth. With these criminal activities only attracting sensational reporting in the context of stolen passwords and account details, society soldiers on not understanding the detail and not understanding the growing threat. Attacks are tolerated in much the same way as a snowstorm!
Military, national defense, and security organizations, along with police and government can no longer cope and are in large part unable to defend and protect their citizens. The IT industry and those engaged in Cyber Defence are struggling too and remain in a reactive defense mode - mostly responding after the fact/act! The Dark Side not only enjoy the first-mover advantage, they are unbounded by the Law, Ethics, or indeed any constraints!
There are also rogue states and terrorists plus many other groups leveraging the openness of societies to attack, often straying into/exploiting criminal resources! At the same time the defenders tend to be few and far between on the ground, generally underfunded and under-resourced, and often unappreciated and poorly paid/rewarded. For sure, it is time to rethink this arena and change our thinking on how we approach defense.
This lecture is Part 1 of a rethink/reeducation process: ‘Attack Scenarios’ approaches the transformation process by getting students to think as if they are attackers so that in Part 2, ‘Defence Scenarios’, they can get ahead of the game to anticipate and respond ahead of an attack. This they do in RED Team mode with an opportunity to design their own criminal empire on screen!
No doubt Aldous Huxley and George Orwell would be pleased to see cameras and surveillance devices everywhere, just as they predicted, but they would then be amazed to find that we buy and install them and become upset if no one is watching! So the Dystopian futures they both predicted and feared are not here yet, but they might just be in the pipeline, and being built a device at a time by us!
Only 70 years ago close observation and surveillance were difficult and very expensive. Today, they are so very cheap, efficient, and everywhere: in our pockets; on our wrists; in our homes, offices, cars, trains, planes; in the streets and on the highways and major roads.
To some degree every country has embraced all the possibilities presented by the technology to make their societies safer and more progressive as organisms, but now here comes AI. Automatic voice, face, finger, eye, action, movement and habit recognition writ large along with all our messages, entertainment, work and recreation patterns monitored 24x7, so inference engines can check if we are good, bad, dangerous, safe, under threat and so on!
Some countries are now employing such technology to judge, sentence, and commit people for criminal acts and anti-social behaviours etc. At this point we have to proceed with care in the recognition that data errors ‘happen’ and human biases can be built in at the birth of such AI systems. Nothing is ever perfect - not people, and certainly not our machines, and we have to progressively drive out bias and error…
Industries 1.0, 2.0 (and most of) 3.0, saw manufacturing and construction using natural materials readily extracted, refined, amalgamated, machined, and molded. In general, these exhibited fixed mechanical, electrical, and chemical properties. However, the latter stages of Industry 3.0 embraced synthetics exhibiting superior properties to afford new degrees of freedom in the design of structures and products.
Today Industry 4.0 sees further advances with metamaterials, dynamic coatings, controllable properties, and additive manufacturing. Embedded smarts have also made communication between components, products and structures possible under the guise of the IoT. Adaptable materials with a degree of self-repair are also opening the door to further freedoms and less material use. In combination, these represent a big step toward sustainable societies with highly efficient ReUse, RePurposing, and Recycling (3R).
At the leading edge, we are now realising active surfaces that can reflect, absorb, or amplify wireless signals, offer programmable colour, and integral energy storage. But amongst a growing list of possibilities, it is integral sensing & communication that may define this new era. In this presentation, we look at these advances in the context of smart design, cities & societies.
Throughout our education and life we are mostly given a ‘soda-straw’ view of Maths, Physics, Chemistry, Biology, HealthCare, Business and Commerce that conditions us to ‘one concept at a time’ thinking. This is rife in Government and Politics, Industry and Health, and it has been extremely powerful in a now past slow paced and disconnected world. In fact, the speciation of disciplines, topics and problems has largely been responsible for the acceleration and prominence of human progress.
However, in a connected/networked, highly mobile, and tech-driven world this simple and narrow-minded view is insufficient and dangerous. In common parlance we refer to ‘unintended consequences’, whilst in complex-system theory we would use the term ‘emergent behaviours’. In brief: education, health, crime, productivity, GDP creation, social cohesion and stability cannot be considered independent variables/properties. They are all related and interdependent. For example, when politicians decide to starve the education system of funds for very young children, the impact shows up in health, crime and the economy some 10 - 30 years later!
By analogy; all of this is true of our technologies, industries, lives, and the prospect of sustainable societies. Robots, AI, AL, and Quantum Computing do not stand alone in isolation, they have complementary roles. In this Public Lecture we devote an hour to thinking more holistically what these technologies bring to the party in the context of industry, health, society, sustainable societies and global warming. We then devote a further hour to discussion and debate.
In the context of Global Warming we make the following overriding observations:
“Panic is a poor substitute for thinking”
“Tech is the only exponential capability we enjoy”
“Technology is never a threat, but humans always are”
“Uncertainty always prescribes the precautionary principle”
The majority of cyber attacks against organisations and people start with general data about their targets, or very specific data about one individual who can be used as an access portal to everyone and everything! Sadly, the majority of attacks appear to be founded on known and published, or simple/very weak, passwords that were easy to guess or crack with modest tools.
“I think we can safely assume; ‘Joe Public’ has little knowledge of cyber-security and even less inclination to engage in good security practices. And so, we have a ubiquitous security risk at every level of society with no hope of curing the problem through education and training”
This is compounded by vast libraries of professional papers, web sites, and industry studies that proffer a somewhat confusing range of guidelines and advice largely invisible to, and unhelpful for, the lay population. Probably the ultimate long-term solution, in the face of an enemy that is becoming more sophisticated, powerful, and determined by the day, is full automation through built-in biometrics based on face, hand, finger, voice, typing patterns et al., plus a PIN and simple password/‘n’-factor authentication.
For sure we need an industry-based fix, probably in the form of ‘security as a service’. In the meantime, this presentation addresses what it takes to create ‘fit-for-purpose’ passwords at a device level and on up through Cloud working. The techniques and guidelines give assured security spanning trivial documentation through to financial services and state secrets, applicable for 2019/20/21. For 2021/22/23 it would be prudent to reassess the advances in attack technologies and techniques, and the change in the success statistics of the Dark Side. It is quite likely that passwords will need strengthening by adding characters in some cases.
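As a rough illustration of why adding characters strengthens a password, the sketch below estimates entropy and brute-force time. The attacker guess rate is an assumption for demonstration, not a figure from the presentation:

```python
import math

def password_entropy_bits(length: int, alphabet_size: int) -> float:
    """Entropy in bits of a password drawn uniformly at random."""
    return length * math.log2(alphabet_size)

def years_to_crack(bits: float, guesses_per_second: float = 1e10) -> float:
    """Expected brute-force time (half the keyspace) in years.
    The default guess rate is an illustrative assumption."""
    return (2 ** bits / 2) / guesses_per_second / (365.25 * 24 * 3600)

# 8 lowercase letters vs 12 characters from the full printable set
weak = password_entropy_bits(8, 26)      # ~37.6 bits
strong = password_entropy_bits(12, 94)   # ~78.7 bits
print(f"8 lowercase:  {weak:.1f} bits, ~{years_to_crack(weak):.2g} years")
print(f"12 printable: {strong:.1f} bits, ~{years_to_crack(strong):.2g} years")
```

Each added character multiplies the attacker's work by the alphabet size, which is why lengthening a password beats merely complicating it.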
Links to associated/related/earlier slide sets are also provided.
Seventy years on from AI appearing on the public scene, the optimistic projections have been largely overtaken, with systems outgunning humans at all board, card and computer games including Chess, Poker and GO. Of course, general knowledge, medical diagnosis, genetics and proteomics, image and pattern recognition are now all firmly in the grasp of AI.
Interestingly, AI is treading a similar path to computing in that it began with single purpose/task machines that could only deal with a company payroll calculations or banking transactions and nothing more! General purpose computing emerged over further decades to give us the PCs and devices we now enjoy. So, AI currently runs as task specific applications on these general purpose platforms, and no doubt, general purpose AI will also become tractable in a few decades too!
Recent progress has promoted a deal of debate and discussion along with hundreds of published papers and definitions that attempt to characterise biological and artificial intelligence. But they all suffer the same futility and fail! Without reference to any formal characterisation, all discussion and debate remains relatively meaningless.
Somewhat ironically, it was the defence industry that triggered the analysis work here. Two of the key steps to success were: the abandonment of all performance comparisons between biological and machine entities; and the avoidance of using the human brain as some ‘golden’ intelligence reference.
This presentation is suitable for professionals and public alike, and comes fully illustrated by high quality graphics, animations and movies. Inevitably, it contains (engineering) mathematics that non-practitioners will have to take on trust, whilst professionals may wish to challenge it on the basis that the focus is on getting a solution rather than the purity of the process!
Every profession, along with education courses, has now been parsed into specialisms - a series of ‘soda straws’ or pipes giving a narrow view and focus with little chance of ‘cross-pollination’. Even IT and Systems Security is now sliced into many different facets spanning coding and encryption through to malware; electronic and physical attacks; technology and people.
Covering all of these specialisms in a single course can be difficult, let alone a single lecture. But this lecture attempts to do just that (or at least a large slice of it) in a 3-hour slot of two 90-minute sessions. It is done against the backdrop of an established set of Security Laws.
The primary objective is to give the student a broad view of the wider threats and how they are perpetrated and linked together. Some technical aspects are not explicitly included, but they are reserved for other detailed sessions.
We are living through an extraordinary pandemic (CV-19) that has changed all the network norms including the way we work and communicate. An invisible consequence has been the transformation of internet and telecoms traffic promoted by people working from home, restrictions on all travel and a paralysis of almost all social norms. Living and working in isolation for 3 - 5 months has become the new mode for many, and even the most technophobic have had to turn to video conferencing and on-line purchases to ‘survive’.
From a network point of view the transition has seen the concentrations of traffic in major cities and towns mutate to the dispersed and disparate working, social and entertainment activities that have found the last mile wanting. Insufficient bandwidth connectivity and resilience have quickly become a prime concern with the overloading of core networks a lesser concern.
Installing new optical links and making the core (undersea and overland long-lines) networks more robust is relatively easy as they are by far the most resilient and secure of our infrastructures. It is the local loop, our last mile, that poses the hard to fix problem. In this session we present tested model solutions based on direct ‘dark-fibre’ to home and office with no electronics, splitters or access points in the field. This is augmented by Mesh-Nets and 4/5G providing temporary bridges for random fibre breaks and cable damage.
Education systems across the West have degenerated into a series of memory tests and the quest to hit abstract performance targets and measures. So students that appear well qualified are often unable to apply the most basic of mathematical, scientific, engineering or logical principles, and nor do they have a good appreciation of history or design. This does not bode well for a future of faster change and greater complexity.
“At the most basic level our society it is about the survival of the most adaptable”
For sure; today’s education and learning methodologies have to move toward more experimental and experiential working in order to reinforce the basics whilst engendering far greater understanding. Early specialism has also to be reversed with all students studying a broader range of topics through school and on into college and/or university.
“Education isn’t something you have to get done and dusted - it is a lifelong pursuit”
There is a further need to recognize that the (so-called) academic and practical streams are afforded equal importance! To get the best out of teams/groups all members have to share a common base of understanding and appreciation. In turn, this can be enabled and supported by Just-in-Time education and training-on-line. But there is much more….
Throughout my career in science, engineering and management I attended numerous meetings where many misconceptions and misinterpretations were evident. Perhaps the most expansive and expensive were the probabilities assumed and calculated for system reliability and/or product manufacturing quality. Eventually, I began to refer to this as the ‘five nines’ problem!
Not fully understanding the origins of the reliability measures, it is so easy to demand 99.999% instead of 99.99% up-time for an electronic system. What could be easier? At face value it appears trivial and straightforward! Likewise, taking a 5σ manufacturing plant up to a 6σ defect level turns out to be a monumental engineering challenge! And at the time of writing, 6σ has never been achieved!
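The gap between these targets is easy to quantify. A minimal sketch, using the conventional 1.5-sigma mean-shift for process sigma levels:

```python
import math

SECONDS_PER_YEAR = 365.25 * 24 * 3600

def downtime_per_year(availability: float) -> float:
    """Permitted downtime in seconds per year at a given availability."""
    return (1.0 - availability) * SECONDS_PER_YEAR

def dpmo(sigma_level: float, shift: float = 1.5) -> float:
    """Defects per million opportunities, with the usual 1.5-sigma shift.
    0.5*erfc(z/sqrt(2)) is the standard-normal upper-tail probability."""
    z = sigma_level - shift
    return 0.5 * math.erfc(z / math.sqrt(2)) * 1e6

print(f"99.99%  up-time: {downtime_per_year(0.9999) / 60:.1f} min/year")
print(f"99.999% up-time: {downtime_per_year(0.99999) / 60:.1f} min/year")
print(f"5-sigma: {dpmo(5):.0f} defects per million")
print(f"6-sigma: {dpmo(6):.1f} defects per million")
```

The extra ‘nine’ cuts the allowed downtime tenfold (from roughly 53 to 5 minutes a year), and the step from 5σ to 6σ demands roughly a 70-fold reduction in defects.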
It appears that too few engineering and management courses address this topic, and if they do, it is as a scant reference of insufficient depth. So, far too few students understand it in any depth, if at all! And when they become managers they just ‘don’t get it’!
This presentation and the associated lecture have been specifically created to address this problem with relevance to BSc, BA, MSc and MBA students along with anyone needing a refresher or explicit introduction to the topic. In addition to the graphics, animations and movies, the lecture is also littered with practical examples and the outcomes of case studies.
The precise definition and understanding of Industry 4.0, and how the vital elements are chosen varies widely by industry and country along with a deal of vagueness on the operational detail. This is particularly true of sustainability, new materials, security, IoT, recycling, logistics, integration, and interdependencies. In this short presentation, we highlight how many of the components are critically interdependent.
A supporting and far broader I4.0 presentation/treatment can be found here:
https://www.slideshare.net/PeterCochrane/why-industry-40
With more on the IoT here:
https://www.slideshare.net/PeterCochrane/the-iot-for-real
A supporting book is also available:
https://www.springer.com/gp/book/9783030129521
This presentation was for my talk at Null on Steganography using Python. It only serves as an on-screen PPT for the talk. To understand this in detail, please follow my page to find the code.
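The actual code lives on the page referenced above; as an independent illustration of the LSB technique such talks cover (not the author's code), here is a minimal sketch that hides a message in the least-significant bits of a byte buffer. Raw image pixel data would work the same way:

```python
def hide(carrier: bytearray, message: bytes) -> bytearray:
    """Hide message bits in the least-significant bit of each carrier byte."""
    bits = []
    for byte in message:
        bits.extend((byte >> i) & 1 for i in range(7, -1, -1))  # MSB first
    if len(bits) > len(carrier):
        raise ValueError("carrier too small for message")
    out = bytearray(carrier)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit   # clear LSB, then set it to the bit
    return out

def reveal(carrier: bytearray, n_bytes: int) -> bytes:
    """Recover n_bytes hidden with hide()."""
    out = bytearray()
    for b in range(n_bytes):
        byte = 0
        for i in range(8):
            byte = (byte << 1) | (carrier[b * 8 + i] & 1)
        out.append(byte)
    return bytes(out)

pixels = bytearray(range(200, 248))   # stand-in for raw pixel data
stego = hide(pixels, b"hi")
print(reveal(stego, 2))               # b'hi'
```

Because only the least-significant bit of each byte changes, the carrier is visually indistinguishable from the original when the bytes are image pixels.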
Informing Innovation: Contextual Investigation for Effective Academic Technol… — char booth
Keynote presentation at the 2013 AMICAL Conference at John Cabot University in Rome, Italy.
Description: In this era of relentless change in higher education and information technology, it is essential to investigate local learning contexts to inform strategic programming and facilitate productive partnerships between libraries and academic institutions. From direct research into user needs and characteristics using environmental scanning, ethnography, and survey methodology to innovative tech-supported collaborations that inform library service models and pedagogy, this talk will explore established and emerging methods for developing an informed orientation to local communities of academic technology practice.
The aspirational visions of Society 5.0 coined by many nations around 2015/16 have now been eclipsed by technological progress and world events including another European war, global warming, climate change and resource shortages. In this new context, the published 5.0 documents now seem naive and simplistic, high on aspiration, and very short on ‘the how’. The stark reality is that the present situation has been induced by our species and our inability to understand and cope with complexity.
“There are no simple solutions to complex problems”
What is now clear is that our route to survival and Society 5.0 will be born of Industry 4.0/5.0 and a symbiosis between Mother Nature, Machines, and Mankind. Today we consume and destroy near 50% more resources than the planet might reasonably support, and merely improving the efficiency of all our processes and what we do will only delay the end point. And so I4.0 is founded on new materials and new processes that are far less damaging, inherently sustainable, and most importantly, readily deployable across the planet.
“Reversing global warming will not see a climatic reversal to some previously stable state”
In this presentation, we start with the nature of climate change, move on to the technology changes that might save the day, the impact of Industry 4.0/5.0, and then postulate what Society 5.0 might actually look like.
Only 40 years ago, the rate of technologically driven change was such that companies could re-organize efficiently and economically over considerable periods of time, but about 30 years ago this changed as the arrival of new technologies accelerated. We effectively moved from a world of slow periodic changes to one where change became a continuum. The leading-edge sectors were fast to recognize and adopt this new mode of continual adaptation driven by new technologies, and these ever more efficient and expansive companies came to dominate some sectors. For the majority, however, it seems that this transition was not recognized until relatively recently, and so a new movement was born under the banner of digitalization. This not only impacts the way people work, it affects company operations and changes markets, and it does so suddenly!
Perhaps the most impactful and recent driver of change in this regard has been COVID, which saw the adoption of video conferencing and remote working as a survival imperative in much less than a month. This now stands as a beacon of proof that companies, organizations, and society can indeed change and adapt to the new at a rate previously considered impossible. The big danger for digitalization programmes now is the simple-minded view that there are singular (magic) solutions that fit every company and organization, but this is not the case. The reality is that the needs and cultures of organizations are not all the same and may not be uniform from top to bottom.
Manufacturing necessitates very steep hierarchical management structures and tight control to ensure consistency in the quality of products. On the other hand, a research laboratory or design company requires a low, flat management hierarchy and an apparently relaxed level of control. This is absolutely necessary to foster creativity, innovation, and invention. This presentation gives practical examples of both management and organizational extremes. We then go on to highlight the need to embrace AI and Quantum Computing over the coming decade to deal with future technologies, operating and market complexity.
For millennia we have crafted artifacts from bulk materials that we have progressively refined to produce ever more precision tools and products. Latterly, we have crossed a critical threshold where our abilities now eclipse Mother Nature. For example; the smallest transistors in production today have feature sizes down to 2nm which is smaller than a biological virus ~20 - 200nm. The implications for ITC, AI, Robotics, and Production are ever more profound as we approach, and most likely undercut, the scale of the atom ~ 0.1-0.4nm. Not only does this open the door to new technologies, it sees new and remarkable capabilities. So, in this presentation we look at this new Tech Horizon spanning robotics to quantum computing and sensory technologies, and how they will help us realise sustainable futures germane to Industry 4.0, 5.0, and beyond.
Part 1 of this two-part serious was about rethinking and reeducation: ‘Attack Scenarios’ approached the transformation process by getting students to think as if they are attacker so that in Part 2; ‘Defence Scenarios’ they are challenged to get ahead of the game; to anticipate and respond ahead of an attack, by recalling what they did in RED Team mode which gave them the opportunity to design their own criminal empire on screen!
In both Part 1 and Part 2 the detailed discussions occurred in camera and are not for publication or open public access.
Every Industrial revolution has seen the progression from people dominated design, build and production to a higher degrees of automation that has gone hand-in-hand with shortening timescales enabled by ever-more powerful technologies. However, at a fundamental level the process has remained the same, but it is now edging toward a continuum of evolution as opposed to a series of discrete jumps that often trigger company reorganizations. In concert, there is a realization abroad that it is no longer about the biggest, the strongest, the best, or the fittest, it is now all about the survival of the most adaptable.
By and large it is relatively easy to predict when and where tech change will occur and the likely outcomes, in terms of existing and future products and services, but how people, customers, companies and societies will react is an unsolved puzzle. On another plane, competition and threats may well occur outside the sector, from a direction managers are not looking, by entirely new mechanisms, and at a most critical time. These are all challenges indeed!
How to adapt to, and cope with these collective challenges is the focus of this presentation which is illustrated and supported by past and present industrial cases along with the experiences and methodologies of those who have driven/weathered this storm as well as those who failed. Many of the illustrations are automated and there are exemplar movies and segue inserts throughout.
Our communications history is dominated by fixed networks of bounded linear predictability. These were based on precise engineering design giving assured information security, and measured operation. However, mobile devices, internet, social networks, IP, and Apps changed all that! Internets are inherently non-linear, unbounded, and essentially designoid — that is, mostly shaped by evolution, steered by demand/rapid innovation - highly adaptive and ‘learning’ in real time.
So, those who suppose we can control such networks to fully guard and protect the information of institutions and individuals are sadly mistaken. And further confounded by Industry 4.0 and the Internet of Things (IoT). Here, a mix of the information of individuals and things, is distributed across the planet on a scale far larger than ever conceived in the past, to become essential components in the survival of our species in realising sustainable societies.
Not surprising then, Privacy and Data protection are big issues for regulators, governments and civil liberties organisations. But so far, nothing has worked, and we see the UK Data Protection Act, EU-GDPR, EU-USA Shield, and Copyright Laws often ignored or worked around. These are largely derivatives of a paper based world and a pre-computing world are now largely unfit for purpose.
This presentation was created in support of a short keynote for ICGS3-21 (14-15 Jan21) UK to purposely highlight the reasons why we are losing the cyber war and what we have to do to win. The approach adopted quantifies the key weakness and shortcomings of our current defence strategies to give pointers to a more secure future.
In postulating remedies, we purposely fall back on the wisdoms of Sun Tzu and The Art of War to highlight and explain the meaning and implications of quoted insights (below) and their pertinence to modern cyber wars/security.
“To know your Enemy, you must become your Enemy”
In this way, we go beyond opinion and suspicion by quantifying the scale of the individual elements of the cyber security equation using a variant of Drake’s Equation. This gives us a good estimate of the scale of the problems we face. Beyond this we highlight some cultural and political issues that need urgent attention.
Finally, we link to comprehensive presentations going back to 2016 that detail specific Red and Blue team exercises thinking and preparation. These themes were invoked to widen the awareness and thinking in the student body @ The UoS.
For the vast majority of history the progress of our species and civilisation was limited by a very few artisans - the workers of metal, wood, leather and cloth along with famers and distribution networks. Specifically, the number of skilled blacksmiths determined the rate of sword, knife, lance and armour production, and ultimately the size of empires.
The turning point came in the eaten 1700s when the Royal Navy was expanding to explore and colonies the planer. Nails were the problem with more than 20k required per ship! So this was the first item to be mad automatically, followed by wooden blocks for the rigging. The water mills constructed to power the production therefore mark the start of Industry 1.0 and the growth of the British Empire.
The spread of automation through Industry 2, 3 and 4 accelerated and empowered us to do more and more using less and less people, power and materials. Without it we could not support the population of the planet or the lifestyle we enjoy. Remarkably, at no time during this process have we seen mass unemployment, and consistently, more and more jobs have been created. In brief, better production capabilities have seen the creation of better tools, which in turn has led to better productivity and better quality.
The process has been evident in all things hardware, and much of entertainment, design, and software, with services perhaps the last bastion of human-based delivery and support. However, the on-line world and rise of AI are now changing the balance across retail, banking, insurance, accountancy, and services in general.
CyberCrime represents one of the biggest threats to society and human progress encountered in the past 70 years. As a business, it is by far the biggest on the planet, with a balance sheet that would see it joining the G8 within the next 3 years given its continued exponential growth. With these criminal activities only attracting sensational reporting in the context of stolen passwords and account details, society soldiers on, understanding neither the detail nor the growing threat. Attacks are tolerated in much the same way as a snowstorm!
Military, national defence, and security organisations, along with police and government, can no longer cope and are in large part unable to defend and protect their citizens. The IT industry and those engaged in Cyber Defence are struggling too and remain in a reactive defence mode - mostly responding after the fact/act! The Dark Side not only enjoy the first-mover advantage, they are unbounded by the Law, Ethics, or indeed any constraints!
There are also rogue states and terrorists, plus many other groups, leveraging the openness of societies to attack, often straying into and exploiting criminal resources! At the same time the defenders tend to be few and far between, generally underfunded and under-resourced, and often unappreciated and poorly paid/rewarded. For sure, it is time to rethink this arena and change our thinking on how we approach defence.
This lecture is Part 1 of a rethink/re-education process: ‘Attack Scenarios’ approaches the transformation process by getting students to think as if they are attackers, so that in Part 2, ‘Defence Scenarios’, they can get ahead of the game to anticipate and respond ahead of an attack. This they do in RED Team mode, with an opportunity to design their own criminal empire on screen!
No doubt Aldous Huxley and George Orwell would be pleased to see cameras and surveillance devices everywhere, just as they predicted, but they would then be amazed to find that we buy and install them and become upset if no one is watching! So the Dystopian futures they both predicted and feared are not here yet, but they might just be in the pipeline, and being built a device at a time by us!
Only 70 years ago close observation and surveillance were difficult and very expensive. Today, they are very cheap, efficient, and everywhere: in our pockets; on our wrists; in our homes, offices, cars, trains, planes; in the streets and on the highways and major roads.
To some degree every country has embraced all the possibilities presented by the technology to make their societies safer and more progressive as organisms, but now here comes AI. Automatic voice, face, finger, eye, action, movement and habit recognition writ large along with all our messages, entertainment, work and recreation patterns monitored 24x7, so inference engines can check if we are good, bad, dangerous, safe, under threat and so on!
Some countries are now employing such technology to judge, sentence, and commit people for criminal acts and anti-social behaviours etc. At this point we have to proceed with care in the recognition that data errors ‘happen’ and human biases can be built in at the birth of such AI systems. Nothing is ever perfect - not people, and certainly not our machines - and we have to progressively drive out bias and error…
Industries 1.0, 2.0 (and most of) 3.0, saw manufacturing and construction using natural materials readily extracted, refined, amalgamated, machined, and molded. In general, these exhibited fixed mechanical, electrical, and chemical properties. However, the latter stages of Industry 3.0 embraced synthetics exhibiting superior properties to afford new degrees of freedom in the design of structures and products.
Today Industry 4.0 sees further advances with metamaterials, dynamic coatings, controllable properties, and additive manufacturing. Embedded smarts have also made communication between components, products and structures possible under the guise of the IoT. Adaptable materials with a degree of self-repair are also opening the door to further freedoms and less material use. In combination, these represent a big step toward sustainable societies with highly efficient ReUse, RePurposing, and Recycling (3R).
At the leading edge, we are now realising active surfaces that can reflect, absorb, or amplify wireless signals, offer programmable colour, and integral energy storage. But amongst a growing list of possibilities, it is integral sensing & communication that may define this new era. In this presentation, we look at these advances in the context of smart design, cities & societies.
Throughout our education and life we are mostly given a ‘soda-straw’ view of Maths, Physics, Chemistry, Biology, HealthCare, Business and Commerce that conditions us to ‘one concept at a time’ thinking. This is rife in Government and Politics, Industry and Health, and it has been extremely powerful in a now past slow paced and disconnected world. In fact, the speciation of disciplines, topics and problems has largely been responsible for the acceleration and prominence of human progress.
However, in a connected/networked, highly mobile, and tech-driven world this simple and narrow-minded view is insufficient and dangerous. In common parlance we refer to ‘unintended consequences’, whilst complex system theory would use the term ‘emergent behaviours’. In brief: education, health, crime, productivity, GDP creation, social cohesion and stability cannot be considered independent variables/properties. They are all related and interdependent. For example, when politicians decide to starve the education system of funds for very young children, the impact shows up in health, crime and the economy some 10 - 30 years later!
By analogy, all of this is true of our technologies, industries, lives, and the prospect of sustainable societies. Robots, AI, AL, and Quantum Computing do not stand alone in isolation; they have complementary roles. In this Public Lecture we devote an hour to thinking more holistically about what these technologies bring to the party in the context of industry, health, society, sustainable societies and global warming. We then devote a further hour to discussion and debate.
In the context of Global Warming we make the following overriding observations:
“Panic is a poor substitute for thinking”
“Tech is the only exponential capability we enjoy”
“Technology is never a threat, but humans always are”
“Uncertainty always prescribes the precautionary principle”
The majority of cyber attacks against organisations and people start with general data about their targets, or very specific data about one individual who can be used as an access portal to everyone and everything! Sadly, the majority of attacks appear to be founded on known and published, or simple/very weak, passwords that are easy to guess or crack with modest tools.
“I think we can safely assume; ‘Joe Public’ has little knowledge of cyber-security and even less inclination to engage in good security practices. And so, we have a ubiquitous security risk at every level of society with no hope of curing the problem through education and training”
This is compounded by vast libraries of professional papers, web sites, and industry studies that proffer a somewhat confusing range of guidelines and advice, largely invisible to, and unhelpful for, the lay population. Probably the ultimate long-term solution, in the face of an enemy that is becoming more sophisticated, powerful, and determined by the day, is full automation through built-in biometrics based on face, hand, finger, voice, typing patterns et al., plus a PIN and simple password/‘n’-factor authentication.
For sure we need an industry-based fix, probably in the form of ‘security as a service’. In the meantime, this presentation addresses what it takes to create ‘fit-for-purpose’ passwords at a device level and on up through Cloud Working. The techniques and guidelines give an assured security spanning trivial documentation through to financial services and state secrets, applicable for 2019/20/21. For 2021/22/23 it would be prudent to reassess the advance in attack technologies and techniques, and the change in the success statistics of the Dark Side. It is quite likely that passwords will need strengthening by adding further characters in some cases.
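One way to see why adding characters strengthens a password is to estimate its entropy as length × log2(alphabet size). The figures below are generic illustrations of that arithmetic, not the guideline tables from the presentation itself:

```python
import math

# Rough password entropy estimate for a randomly chosen password.
# The example lengths/alphabets are illustrative assumptions.

def entropy_bits(length: int, alphabet: int) -> float:
    """Entropy in bits = length * log2(alphabet size)."""
    return length * math.log2(alphabet)

# 8 lowercase letters vs 16 characters drawn from ~95 printable ASCII:
weak = entropy_bits(8, 26)      # roughly 38 bits - within reach of modest tools
strong = entropy_bits(16, 95)   # roughly 105 bits - far beyond brute force
print(f"weak: {weak:.1f} bits, strong: {strong:.1f} bits")
```

Note the estimate only holds for randomly chosen passwords; dictionary words and known/published passwords fall to guessing long before brute force matters, which is exactly the failure mode described above.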
Links to associated/related/earlier slide sets are also provided.
Seventy years on from AI appearing on the public scene, the optimistic projections have largely been overtaken, with systems outgunning humans at all board, card and computer games including Chess, Poker and GO. Of course, general knowledge, medical diagnosis, genetics and proteomics, image and pattern recognition are now all firmly in the grasp of AI.
Interestingly, AI is treading a similar path to computing in that it began with single purpose/task machines that could only deal with a company’s payroll calculations or banking transactions and nothing more! General purpose computing emerged over further decades to give us the PCs and devices we now enjoy. So, AI currently runs as task-specific applications on these general purpose platforms, and no doubt, general purpose AI will also become tractable in a few decades too!
Recent progress has prompted a great deal of debate and discussion, along with hundreds of published papers and definitions that attempt to characterise biological and artificial intelligence. But they all suffer the same futility and fail! Without reference to any formal characterisation, all discussion and debate remains relatively meaningless.
Somewhat ironically, it was the defence industry that triggered the analysis work here. Two of the key steps to success were: the abandonment of all performance comparisons between biological and machine entities; and the avoidance of using the human brain as some ‘golden’ intelligence reference.
This presentation is suitable for professionals and public alike, and comes fully illustrated by high quality graphics, animations and movies. Inevitably, it contains (engineering) mathematics that non-practitioners will have to take on trust, whilst professionals may wish to challenge it on the basis that the focus is on getting a solution rather than the purity of the process!
Every profession, along with education courses, has now been parsed into specialisms - a series of ‘soda straws’ or pipes giving a narrow view and focus with little chance of ‘cross-pollination’. Even IT and Systems Security is now sliced into many different facets, spanning coding and encryption through to malware; electronic and physical attacks; technology and people.
Covering all of these specialisms in a single course is difficult, let alone in a single lecture. But this lecture attempts to do just that (or at least a large slice of it) in a 3-hour slot of two 90-minute sessions. It does so against the backdrop of an established set of Security Laws.
The primary objective is to give the student a broad view of the wider threats and how they are perpetrated and linked together. Some technical aspects are not explicitly included, but they are reserved for other detailed sessions.
We are living through an extraordinary pandemic (CV-19) that has changed all the norms, including the way we work and communicate. An invisible consequence has been the transformation of internet and telecoms traffic prompted by people working from home, restrictions on all travel, and a paralysis of almost all social norms. Living and working in isolation for 3 - 5 months has become the new mode for many, and even the most technophobic have had to turn to video conferencing and on-line purchases to ‘survive’.
From a network point of view the transition has seen the concentrations of traffic in major cities and towns mutate to dispersed and disparate working, social and entertainment activities that have found the last mile wanting. Insufficient bandwidth, connectivity and resilience have quickly become prime concerns, with the overloading of core networks a lesser concern.
Installing new optical links and making the core (undersea and overland long-lines) networks more robust is relatively easy as they are by far the most resilient and secure of our infrastructures. It is the local loop, our last mile, that poses the hard to fix problem. In this session we present tested model solutions based on direct ‘dark-fibre’ to home and office with no electronics, splitters or access points in the field. This is augmented by Mesh-Nets and 4/5G providing temporary bridges for random fibre breaks and cable damage.
Education systems across the West have degenerated into a series of memory tests and the quest to hit abstract performance targets and measures. So students that appear well qualified are often unable to apply the most basic of mathematical, scientific, engineering or logical principles, and nor do they have a good appreciation of history or design. This does not bode well for a future of faster change and greater complexity.
“At the most basic level our society is about the survival of the most adaptable”
For sure; today’s education and learning methodologies have to move toward more experimental and experiential working in order to reinforce the basics whilst engendering far greater understanding. Early specialism has also to be reversed with all students studying a broader range of topics through school and on into college and/or university.
“Education isn’t something you have to get done and dusted - it is a lifelong pursuit”
There is a further need to recognize that the (so-called) academic and practical streams are afforded equal importance! To get the best out of teams/groups all members have to share a common base of understanding and appreciation. In turn, this can be enabled and supported by Just-in-Time education and training-on-line. But there is much more….
Throughout my career in science, engineering and management I attended numerous meetings where many misconceptions and misinterpretations were evident. Perhaps the most expansive and expensive were the probabilities assumed and calculated for system reliability and/or product manufacturing quality. Eventually, I began to refer to this as the ‘five nines’ problem!
Not fully understanding the origins of the reliability measures, it is so easy to demand 99.999% instead of 99.99% up-time for an electronic system. What could be easier? At face value it appears trivial and straightforward! Likewise, taking a 5σ manufacturing plant up to a 6σ defect level turns out to be a monumental engineering challenge! And at the time of writing 6σ has never been achieved!
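The gulf hidden in those figures is easy to quantify. A minimal sketch, using the standard availability arithmetic and the conventional 1.5-sigma-shifted defect model from the Six Sigma literature (the function names are mine, not the lecture's):

```python
from statistics import NormalDist

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def downtime_minutes(availability: float) -> float:
    """Permitted downtime per year for a given availability."""
    return (1 - availability) * MINUTES_PER_YEAR

# Each extra 'nine' cuts permitted downtime tenfold:
print(f"99.99%  -> {downtime_minutes(0.9999):.1f} min/yr")   # ~52.6
print(f"99.999% -> {downtime_minutes(0.99999):.1f} min/yr")  # ~5.3

def dpmo(sigma_level: float, shift: float = 1.5) -> float:
    """Defects per million opportunities at a sigma level, using the
    Six Sigma convention of a 1.5-sigma long-term mean shift."""
    return NormalDist().cdf(shift - sigma_level) * 1e6

print(f"5 sigma -> {dpmo(5):.0f} DPMO")   # ~233
print(f"6 sigma -> {dpmo(6):.1f} DPMO")   # ~3.4
```

The extra 'nine' leaves barely five minutes a year for every fault, repair, and upgrade combined, and the step from 5σ to 6σ demands nearly a 70-fold reduction in defects: hence the monumental engineering challenge.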
It appears that too few engineering and management courses address this topic, and if they do, it is as a scant reference of insufficient depth. So we see far too many students who do not understand it in any depth, if at all! And when they become managers they just ‘don’t get it’!
This presentation and the associated lecture have been specifically created to address this problem with relevance to BSc, BA, MSc and MBA students along with anyone needing a refresher or explicit introduction to the topic. In addition to the graphics, animations and movies, the lecture is also littered with practical examples and the outcomes of case studies.
The precise definition and understanding of Industry 4.0, and how the vital elements are chosen varies widely by industry and country along with a deal of vagueness on the operational detail. This is particularly true of sustainability, new materials, security, IoT, recycling, logistics, integration, and interdependencies. In this short presentation, we highlight how many of the components are critically interdependent.
A supporting and far broader I4.0 presentation/treatment can be found here:
https://www.slideshare.net/PeterCochrane/why-industry-40
With more on the IoT here:
https://www.slideshare.net/PeterCochrane/the-iot-for-real
A supporting book is also available:
https://www.springer.com/gp/book/9783030129521
The aspirational visions of Society 5.0 coined by many nations around 2015/16 have now been eclipsed by technological progress and world events including another European war, global warming, climate change and resource shortages. In this new context, the published 5.0 documents now seem naive and simplistic, high on aspiration, and very short on ‘the how’. The stark reality is that the present situation has been induced by our species and our inability to understand and cope with complexity.
“There are no simple solutions to complex problems”
What is now clear is that our route to survival and Society 5.0 will be born of Industry 4.0/5.0 and a symbiosis between Mother Nature, Machines, and Mankind. Today we consume and destroy nearly 50% more resources than the planet might reasonably support, and merely improving the efficiency of all our processes and what we do will only delay the end point. And so I4.0 is founded on new materials and new processes that are far less damaging, inherently sustainable, and most importantly, readily deployable across the planet.
“Reversing global warming will not see a climatic reversal to some previously stable state”
In this presentation, we start with the nature of climate change, move on to the technology changes that might save the day, the impact of Industry 4.0/5.0, and then postulate what Society 5.0 might actually look like.
Only 40 years ago, the rate of technologically driven change was such that companies could re-organize efficiently and economically over considerable periods of time, but about 30 years ago this changed as the arrival of new technologies accelerated. We effectively moved from a world of slow periodic changes to one where change became a continuum. The leading-edge sectors were fast to recognize and adopt this new mode of continual adaptation driven by new technologies. This saw these ever more efficient and expansive companies dominating some sectors. For the majority, however, it seems that this transition was not recognized until relatively recently, and so a new movement was born under the banner of digitalization. This not only impacts the way people work, it affects company operations and changes markets, and it does so suddenly!
Perhaps the most impactful and recent driver of change in this regard has been COVID, which saw the adoption of video conferencing and remote working as a survival imperative in much less than a month. This now stands as a beacon of proof that companies, organizations, and society can indeed change and adapt to the new at a rate previously considered impossible. The big danger for digitalization programmes now is the simple-minded view that there are singular (magic) solutions that fit every company and organization, but this is not the case. The reality is that the needs and culture of organizations are not the same and may not be uniform from top to bottom.
Manufacturing necessitates very steep hierarchical management structures and tight control to ensure consistency in the quality of products. On the other hand, a research laboratory or design company requires a low, flat management hierarchy and an apparently relaxed level of control. This is absolutely necessary to foster creativity, innovation, and invention. This presentation gives practical examples of management and organizational extremes. We then go on to highlight the need to embrace AI and Quantum Computing over the coming decade to deal with future technologies, and operating and market complexity.
Telecom customer services appear to be stuck in the early 20th Century, with the telephone call the primary channel for service provision that can take days to effect. Compare that to Google, Amazon, IBM, Apple and other modern companies where customers control service provision by the minute or second.
Modern business is driven by the accumulation of customer data, but the Telecom Industry sees vast amounts of customer-related data dormant and untapped. As a result, many new opportunities are lost. For example, the behavior of people, devices, systems, and networks gives the earliest indicators of potential security problems.
OTT operators exploit networks and make far greater profits than any other sector and this might be further amplified by the roll-out of 5G. But without a fundamental rethink of FTTP, 5G will fail to deliver sufficient coverage and the advertised data rates. This pending failure is already seeing alternative solutions from outside the industry along with the realization that most ‘things’ on the IoT will never connect to the internet!
Predicting digital futures a sector at a time is relatively easy, but in a networked world driven by accelerating technologies this is insufficient. Sectors do not operate in isolation, they are connected, and as technology advances the boundaries morph, with whole industries overtaken and pushed aside. At the same time old jobs lose relevance and new skills are required, but in aggregate ever more people are employed. Today there is no country, no matter how big or rich, that has all the raw materials and people required to power its industries, healthcare systems, farming and food production, or indeed educational institutions. Insourcing, outsourcing, and globalisation are the result, and they are about to be augmented by global networking of facilities, skills and abilities.
We have never known or understood so much about our world, and nor have we enjoyed the capabilities bestowed by modern technology. But keeping up to date, acquiring the right knowledge and skills is a growing challenge as ‘the world of the simple’ evaporates and complexity takes over.
“There are plenty of simple solutions to complex problems, but they are all wrong”
Preparing for change whilst coping with the status quo now presents many new challenges way beyond human ability and we have to partner with machines to aid our decisions. For organisations it is essential to find and employ the right people, and for people it is necessary to become ever more flexible and adaptable whilst continually acquiring pertinent capabilities.
“AI and robots are not going to push us aside, but they will change everything”
No man is an island, and neither is any country, company or institution. A digital and connected global interdependency now governs the fortunes of our species as technology empowers us at every level. In this presentation we highlight a small sample of the technologies on the horizon, the jobs they will destroy, enhance and create.
It is hard to overstate the importance of ‘Thermodynamics’ in providing an almost complete (Grand Unified Theory) picture of the inner physics of energy transfer, spanning machines and chemistry through to information.
Apparently, Einstein had two favourite theories: General Relativity and Thermodynamics! He championed both because of their ‘beauty’, completeness, and emergent properties purely derived from the fundamental consideration of how the universe works.
The origins of this topic mainly reside in the Industrial Revolution and the realisation that the early machinery was grossly inefficient, e.g. engines were converting only ~2% of the energy consumed into useful work output. This drew the attention of Savery (1698), Newcomen (1712), and Carnot (1824), and for the next 200 years the conundrum of lost energy occupied many of the greatest scientific minds. This culminated in Rudolf Clausius (~1850) publishing his theory of Thermodynamics, with further refinement by Boltzmann (1872).
Why was all this so important? In the 1700s a ‘beam engine’ weighing in at >20 tons consumed vast amounts of coal to deliver an output of ~10hp. Today a turbofan jet engine can deliver >30k hp at a weight of ~6 tons. This is the difference between working with little understanding and working today with far more complete knowledge. Our latest challenges centre around non-linear loss mechanisms associated with turbulent air and fuel flow. And like many other fields, we have to step beyond our generalised mathematical models and turn to the power of our computers for deeper insights.
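The scale of that improvement, and the theoretical ceiling the pioneers were unknowingly working against, can be checked with back-of-envelope arithmetic. The engine figures are the rough order-of-magnitude values quoted above; the boiler and ambient temperatures are my own illustrative assumptions:

```python
# Power-to-weight comparison for the engines quoted above, plus the
# standard Carnot limit on heat-engine efficiency.

def power_to_weight(hp: float, tonnes: float) -> float:
    """Horsepower per tonne."""
    return hp / tonnes

beam_engine = power_to_weight(10, 20)       # ~0.5 hp/tonne (1700s)
turbofan = power_to_weight(30_000, 6)       # ~5000 hp/tonne (today)
print(f"Improvement factor: {turbofan / beam_engine:,.0f}x")  # ~10,000x

def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Upper bound on heat-engine efficiency (temperatures in kelvin)."""
    return 1 - t_cold_k / t_hot_k

# Illustrative early steam case: ~100 C boiler vs ~20 C ambient gives a
# ~21% theoretical ceiling, against the ~2% actually achieved.
print(f"Carnot limit: {carnot_efficiency(373, 293):.1%}")
```

The gap between the Carnot ceiling and the ~2% actually delivered is precisely the "lost energy" conundrum that occupied Clausius, Boltzmann and their contemporaries.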
Ultimately all machines, mechanisms, computing processes and information itself involve the transformation of matter and/or bits, and thus they are entropic and subject to the theory of Thermodynamics. This lecture therefore presents a foundation spanning the history and progress to date, in preparation for embracing other science and engineering disciplines.
When people are exposed to the new for the first time their reaction, quite rightly, is generally one of caution and perhaps a degree of suspicion. And, when that ‘new born’ is a novel technology, reactions can quickly become amplified and biased toward the dystopian by the sensationalism of media and mis-information of social networks. In this modern era I think we can also safely assume that Hollywood has more than a ‘bit part’ in nurturing extreme reactions with movies such as Terminator, AI and Ex-Machina.
Our purpose here is to dispel the modern myth that technology is, or can be, inherently evil and a direct threat to humanity. We do so by positing three basic axioms:
“Without technology we would know and understand almost nothing”
“The greatest threat to humanity is humanity”
“If technology progress and societal advance stall, then civilisations collapse”
Having briefly established these in the context of our wider history, we focus on the Industrial Revolutions, their beneficial upside and consequential negatives. We then move on to examine Robotics, Artificial Intelligence, Artificial Life, and Quantum Computing in the context of our current needs, realising sustainable futures, and the survival of our civilisation.
"Part of the research community thinks that it is still early to tackle the development of quantum software engineering techniques. The reason is that what the quantum computers of the future will look like is still unknown. However, there are some facts that we can affirm today: 1) quantum and classical computers will coexist, each dedicated to the tasks at which they are most efficient. 2) quantum computers will be part of the cloud infrastructure and will be accessible through the Internet. 3) complex software systems will be made up of smaller pieces that will collaborate with each other. 4) some of those pieces will be quantum, therefore the systems of the future will be hybrid. 5) the coexistence and interaction between the components of said hybrid systems will be supported by service composition: quantum services.
This talk analyzes the challenges that the integration of quantum services poses to Service Oriented Computing."
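Point 5 above can be sketched as a plain service-composition pipeline in which classical and quantum components share one interface. Everything below is hypothetical: in particular, QuantumFactoringStub stands in for a remote quantum backend (e.g. one running Shor's algorithm) and simply trial-divides so the sketch runs anywhere.

```python
# Hypothetical sketch of hybrid classical/quantum service composition.

from abc import ABC, abstractmethod

class Service(ABC):
    """Uniform interface shared by classical and quantum components."""
    @abstractmethod
    def invoke(self, payload: dict) -> dict: ...

class ClassicalPreprocessor(Service):
    def invoke(self, payload: dict) -> dict:
        # Classical step: cheap input validation/normalisation.
        return {"n": int(payload["n"])}

class QuantumFactoringStub(Service):
    def invoke(self, payload: dict) -> dict:
        # Stand-in for a remote quantum service; trial division keeps
        # the sketch self-contained and runnable.
        n = payload["n"]
        factor = next(d for d in range(2, n + 1) if n % d == 0)
        return {"factor": factor}

def compose(services: list, payload: dict) -> dict:
    """Service composition: pipe each service's output to the next."""
    for service in services:
        payload = service.invoke(payload)
    return payload

result = compose([ClassicalPreprocessor(), QuantumFactoringStub()], {"n": "15"})
print(result)  # {'factor': 3}
```

The point of the shared interface is that the orchestrating code neither knows nor cares which components are quantum, which is exactly the property that makes hybrid systems composable.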
The biggest force for social change since the first industrial revolution has been adjusting to, and taking advantage of, the new and accelerating capabilities of our advancing technologies. And in our entire history, the dominant technology driver has been silicon-based electronics. It has prompted revolutions in Computing, Telecoms, Automation, AI, and Robotics that radically changed the human condition. Today, that same exponential revolution is accelerating us into Industry 4.0 and onto Industry 5.0.
The consequential transformation of medicine, industrial design and production, farming, food, processing, supply and demand has seen living standards improve and life expectancy extend. Many of our institutions have also seen tech-driven transformations in line with industry. If there has been a downside to this progression, it has been our inability to transform the workforce ahead of new demands. Unemployment has persisted whilst re-education and retraining have been on the back foot; yet the net creation of new jobs has always exceeded the demise of the old. As a result, leading countries in the first world now have labour shortages at all levels, right across the spectrum.
Recently, COVID-19 has demonstrated that we have the technology and we can rapidly reorganise and change society if we have to. So in this presentation, we examine ‘the force functions’ and changes engineered to date, and then peer over the horizon to sample what is to come in terms of technologies and working practices…
Data mining and analysis has been dominated by the big looking at the small. Businesses, institutions and governments examine our habits with an eye to commercial opportunities, welfare, and security. However, big data is migrating analysis into the arena of networking and association to enhance services: advertising, ‘pre-selling,’ healthcare, security and tax avoidance reduction. But this leaves the critical arena of Small Data unaddressed - the small looking at the small - individuals and things examining and exploiting their own data.
Here we consider a future of ubiquitous tagging, sensors, measuring and networked monitoring powered by the IoT. Key conclusions see many devices talking to each other at close range with little (or no) need of internet connection, and more network connections generated between things than those on the net.
Artificial Intelligence – Case-based reasoning for recommender systems – Invi...Thomas Roth-Berghofer
Artificial Intelligence is mimicking cognitive abilities. Experience guides us in our learning efforts and is one of the most important assets for problem solving. Experience is everywhere. For example, a recording technician needs experience in the studio to produce a recording worth listening to. Does the recording sound full and rich or still too tinny? Does the bass section sound overwhelming? Experience — my own or someone else’s — can help me solve a current problem, for example, in the recording studio. Case-based reasoning, a methodology in which experience is expressed in the form cases, allows transferring and applying expert knowledge where needed.
Canary Deployments on Amazon EKS with Istio - SRV305 - Chicago AWS SummitAmazon Web Services
Within complex systems, even well-written code can behave in unexpected ways and lead to outages and critical issues. Amazon Elastic Container Service for Kubernetes (Amazon EKS) enables you to easily run Kubernetes, quickly deploy new code, and revert to safe, stable releases when issues are identified. But the damage done in the short period between deployment and rollback can be significant. In this session, we show you how to limit the effect of unforeseen issues using canary deployments with Istio and how to better monitor your applications in Amazon EKS and spot potential problems before they affect your customer base. This session is brought to you by AWS partner, Datadog.
Instrumentation as a Living Documentation: Teaching Humans About Complex Systems – Brian Troutwine
Instrumentation of complex systems is necessary and addresses the shortcomings of static documentation of those systems. Instrumentation has its own flaws, but they are resolvable with an intentional kind of documentation.
Given at Write the Docs, Portland OR 2014.
Recently, it has become increasingly evident that we have engineers and scientists reaching a professional level of practice without a clear understanding of the scientific method, its origins, and its fundamental workings. There also appears to be a lack of appreciation of our total dependence on the truths that science continually reveals. How this situation ensued appears to vary from country to country and with the flavour of education system encountered by students. But a common complaint is the progressive dumbing down of the science curriculum along with a dire shortage of qualified teachers. This also seems to be compounded by the increasing speciation of science and engineering into narrower and narrower disciplines. So this situation (crisis?) prompted a request for a corrective series of foundation lectures focussed on healing these educational flaws across relevant disciplines, at graduating and practising levels. This, then, is the first in this foundation series.
In most developed nations the proportion of old people is increasing along with their demands on healthcare services as they transit toward their eventual exit from this life. People no longer live, work, retire and die in short order! Far more likely, they experience a series of complex, and often protracted, episodes in a concatenation of individual organ failures.
We therefore see a growing healthcare crisis across the First World, with politicians resorting to very simple/similar ‘spend more, train more, and support more’ solutions. But this lacks any deep analysis. The reality is that no amount of money or people will cure this - it is a self-sustaining loop of medical advance, improving survival rates, longer life spans, falling birth rates, fewer young people of sufficient talent, and reducing tax returns!
“This is a complex (non-linear) problem & there are no simple solutions”
Doing more with less, but far better, at a lower cost, by continually exploiting the latest technology is something already pioneered and experienced by industry. It is the basic mechanism that now powers our progress - including many supporting healthcare technologies. This general principle is now a long-overdue essential for healthcare professionals and patients, and absolutely necessary if we are to see any significant improvement in services.
Here we present examples of technologies that are available today and most likely to be available in the next decade, along with some necessary and key behavioural and responsibility changes.
What the Flash Crash & Black Boxes can teach us about the Search #searchlove ... – Kelvin Newman
On May 6th, 2010 the Dow Jones Industrial Average plunged about 1000 points only to recover those losses within minutes – this was the Flash Crash. No catastrophe or physical event caused this swing; it was the black boxes of stock market algorithms. Black boxes a lot like Google’s. How do we prepare for the future when even Google doesn’t know how its algorithm works?
10 Billion a Day, 100 Milliseconds Per: Monitoring Real-Time Bidding at AdRoll – Brian Troutwine
This is the talk I gave at Erlang Factory SF Bay Area 2014. In it I discuss the instrumentation-by-default approach taken by the AdRoll real-time bidding team, the technical details of the libraries we use, and lessons learned in adapting your organization to deal with the onslaught of data from instrumentation.
It should be no surprise that AI is treading a similar path to computing which began with single-purpose machines tasked for payroll calculations, banking transactions, or weapons targeting et al, but nothing more! It took decades for General Purpose Computing to emerge in the form of the now ubiquitous PC. Today, AI is still in a single-purpose/task-specific phase, and we have no general-purpose platforms, but their emergence is only a matter of time!
Recent AI progress has seen a repeat of the media debate and alarmist warnings from our computing past, compounded by consequential advances in robotics. In turn, this has prompted numerous attempts to draw biological equivalences defining the time when machines will overtake humans. But without any workable definitions or framework, these tend to be little more than un/educated guesses. Recourse to IQ measures and the Turing test has proved irrelevant, and without a reference framework or formal characterisation, continued discussion and debate remain futile.
We therefore approach this AI problem from the bottom up by defining the simplest of machines and lifeforms to derive clues, pointers and basic boundary conditions. This sees a fundamental entropic description emerge that is applicable to both machines and lifeforms.
This presentation is suitable for professionals and the public alike, and is fully illustrated by high-quality graphics, animations and movies. Inevitably, it contains some mathematics that non-practitioners will have to take on trust, but the focus is on defining the key characteristics, parameters, and important features of AI, our total dependence, and the future!
Note: A 40 min session for a predominantly lay audience; not all the slides presented here were used on the day. Their inclusion here is in response to those audience members requesting more detail during, and at the end of, the event.
Past civilisations have nurtured small populations of those trying to understand and manipulate nature to some advantage in materials, tools, weapons, food, and wealth. However, they never formed communities and lacked the means of recording, communicating, and sharing successes and failures. They also lacked a common framework/philosophy to qualify them as scientists, but that all began to change in the 16th Century. In this lecture we consider the progression to a philosophy of science, and the underlying principles and assumptions that now guide scientific inquiry. We also examine the nature of scientific knowledge, the methods of its acquisition, its evolution and significance over past centuries, and reflect on its value to society.
In the struggle to solve problems, deliver understanding, and reveal the truth about our universe, science had to suffer and survive: ignorance, bigotry, established superstitions, and the ‘diktats’ of religions and politics, and latterly, falling education standards mired by social media. We chart that ‘scientific’ journey emphasising the importance of observation, experimentation, and the search for universal laws. Ultimately, this essentially Aristotelian perspective was challenged and overtaken by the rise of empiricism, which emphasised the importance of sensory experience and the limitations of human knowledge.
Science continues to evolve and provide us with the best truths attainable with our leading-edge technologies of observation and experimentation. Today, it stands as the greatest and richest contributor to human knowledge, understanding, progress, and wellbeing. In turn, debates and controversies are ongoing, shaping the field and its philosophy, which remains essential for understanding the nature of scientific knowledge and the models it creates. But unlike any belief system, the answers and models furnished by science are not certain and invariant; they tend to be stochastic and incomplete - ‘the best we can do’ at a given time.
In this workshop session we identify aging technology design concepts, old business and operating models, plus energy supply limits as the prime constraints of 6G and beyond. We also identify the notion of an erroneous spectrum shortage born of the bands-and-channels mode of operation, which is fundamentally unsuited to 6G and IoT demands in the near and far future.
We strongly link optical fibre in the local loop with future wireless systems and the need for very low-energy ‘tower-less’ systems. We also postulate a future demanding UWB and HWB (Hyper Wide Band) with transmission energies ~μW and signals below the ambient noise level. This will be necessary to power an IoT of >2.4Tn Things, which we estimate to be required for Industry 4/5 and sustainable societies.
Engineering might be defined as the judicious application of science and scientific knowledge, but with the rider that, unlike science and scientific studies, engineering always has to deliver a solution and a result. There are therefore aspects of engineering that stretch and challenge the accepted wisdom and knowledge of science. To purists, this might appear outrageous, but it is no more so than the works of Erwin Schrödinger or Leonhard Euler et al.
In this lecture we examine many of the established engineering basics whilst being mindful that most of our education, techniques, and working solutions are founded on the assumption of well-behaved linear environments. As our entire universe, and everything in it, is inherently complex and non-linear, we have to salute the powers of approximation and iteration for our many engineering successes to date. However, we are increasingly being challenged by the fundamentally non-linear nature of the problems confronting us (e.g. politics, conflict, global warming, sustainability, medicine, fusion power, logistics, networks, depletion of resources, accelerating tech-driven change, and more).
We start by tracing history from the foundations up to the present day, including modern analytical nomenclature and techniques, and system reliability, resilience and costs, and we highlight the basic human limitations that necessitate multi-disciplinary teams that include AI and vast computing power.
The overall treatment includes our analogue past, digital today, and analogue/digital hybrid future of computing, robots, networks and systems of all kinds. It also includes animations, movies and sound files to demonstrate the realities of modern system design including the inherent complexities. To further highlight, and exemplify this projected future, we examine a real engineering project concerned with acoustic sniper spotting under battlefield conditions and extreme noise. Here a combination of digital modelling sees the use of analogue acoustic filter arrays, analogue signal amplification, and digital signal processing doubling the range of sniper detection and location.
IoT growth forecasts currently tend to span 30 – 60 Bn ‘Things’ by 2030. However, this ignores the central IoT role in realising sustainable societies where raw material and component use have to see very high levels of reuse, repurposing, and recycling. In such a world almost everything we possess and use will have to be tagged and be electronically addressable as a part of the IoT. Such a need immediately sees growth estimates of 2Tn or more over the span of Industry 4 and 5. On the basis of energy demands alone, it is inconceivable that the technologies of Bluetooth, WiFi, 4, 5, and 6G could support such demand, nor are the signalling and security protocols viable on such a scale.
The evolution of the IoT will therefore most likely see a new form of dynamic network requiring new lightweight protocols employing very little signal processing, together with very low energy wireless technologies (in the micro-Watt range) operating over extremely short distances (~10m). This need might be best satisfied by a new form of ‘Zero Infrastructure Mesh Networks’ that engage in active resource sharing, lossy probabilistic routing, and cyber security realised through an integrated ‘auto-immunity’ system. Ultimately, we might also envisage data amalgamation at key nodes that have a direct connection into the internet along with an additional layer of cyber checks and protection.
We justify the above assertions by illustrating the energy and network limitations of today’s 5G networks and those already obvious in current 6G proposals. We then go on to detail how a suitable IoT MeshNet might be configured and realised, along with a few solutions and emergent outcomes on the way.
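As a toy illustration of how the ‘lossy probabilistic routing’ idea behaves in such a mesh, the sketch below gossips a message through a small network; the topology, forwarding probability, and gossip scheme are assumptions for illustration, not the proposed protocol.

```python
# Toy lossy probabilistic (gossip) routing: each node that receives a
# message forwards it to each neighbour with probability p, so delivery
# is probabilistic rather than guaranteed. Topology and p are illustrative.
import random

def gossip(neighbours, source, p, rng):
    """Return the set of nodes a message reaches from `source`."""
    reached, frontier = {source}, [source]
    while frontier:
        node = frontier.pop()
        for nxt in neighbours[node]:
            if nxt not in reached and rng.random() < p:
                reached.add(nxt)
                frontier.append(nxt)
    return reached

# A small ring-with-chords mesh of 10 nodes.
n = 10
neighbours = {i: {(i - 1) % n, (i + 1) % n, (i + 3) % n, (i - 3) % n}
              for i in range(n)}

rng = random.Random(42)
coverage = [len(gossip(neighbours, 0, p=0.8, rng=rng)) for _ in range(100)]
print(sum(coverage) / len(coverage))  # average number of nodes reached
```

Because no node keeps routing tables and forwarding is probabilistic, the scheme trades guaranteed delivery for minimal signal processing, which is the energy trade sketched above.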
The Uncanny Valley addresses our reactions to humanoid objects, such as robots, video game characters, or dolls, that look and act ‘almost’ like a real human. Feelings of uneasiness or disgust in the observer are addressed directly, rather than familiarity or attraction. The theory was proposed by Japanese roboticist Masahiro Mori in 1970 and has been explored by many researchers and artists since. It has applications in AI, robotics, MMI, and human-computer interaction, and helps designers to create more appealing devices that can interact with people in various domains, such as industry, education, entertainment, defence, health care, et al.
In this lecture we explain and demonstrate the fundamentals before extending the principle to sound, motion, actions, and eyes as an output mechanism. We also note that all this poses some challenges and risks in the potential to reduce the emotional connections, empathy, acceptance, and trust between humans and machines. On a further dimension, the potential to create threat and terror can be a useful opportunity in the military domain. It is thus important to understand the causes and effects of the uncanny valley in the wider sense in order to meet the needs of each application space.
In a world of accelerating innovation and increasingly complex digital services, applications, appliances, and devices, it seems unreasonable to expect customers to understand and maintain their own cyber security. We are way past the point where even the well-educated can cope with the compounded complexity of an ‘on-line life’. The reality is that today's products and services are incomplete and sport wholly inadequate cyber-defence applications.
Perhaps the single biggest problem is that defenders have never been professional attackers - they don’t share the same level of thinking and deviousness, or indeed the inventiveness, of their enemies. Apart from an education embracing attack techniques, and in some cases engaging in war games, the defenders remain on the back foot. However, there are a number of new, and potentially significant, approaches yet to be addressed, and here we look at the problem from a new direction.
In the maintenance of high-tech equipment and systems across many industries, identifiable precursors are employed to flag impending outages and failures. This realisation prompted a series of experiments to see if it was possible to presage pending cyber attacks. And indeed it was found to be the case!
In this presentation we give an overview of our early experimental and observational results, along with our current thinking spanning networks through to individual hackers and inside actors.
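The general idea of precursor detection can be illustrated with a simple rolling-statistics monitor; the metric, window size, and threshold below are hypothetical assumptions for the sketch, not the authors' experimental method.

```python
# Sketch: flag a precursor when a monitored metric (e.g. failed logins
# per minute) drifts several standard deviations above its recent mean.
# Window size and threshold are illustrative assumptions.
from statistics import mean, stdev

def precursors(series, window=10, threshold=3.0):
    """Return indices where a value exceeds mean + threshold * stdev
    of the preceding `window` samples."""
    flags = []
    for i in range(window, len(series)):
        hist = series[i - window:i]
        m, s = mean(hist), stdev(hist)
        if s > 0 and series[i] > m + threshold * s:
            flags.append(i)
    return flags

baseline = [5, 6, 5, 7, 6, 5, 6, 7, 5, 6]
print(precursors(baseline + [40]))  # → [10]
```

Real precursor detection would combine many such signals and model their joint behaviour, but the principle is the same: deviations precede the event.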
Connecting Everything Vital to Sustainability
Mobile network evolution has followed a reasonably predictable path almost entirely focused on the needs of human communication. The transition from 1 to 2G was dictated by the economics of reliability, performance, and scale, whilst 3, 4, and 5G saw the transition to mobile computing with full internet access, AI and an ever-expanding plethora of applications. But 5G could be the end of the line as cell-site energy demands have become excessive at ~10kW.
Midway through the migration from 4G to 5G, M2M and IoT machines overtook the human population of 8Bn people with an estimated 20Bn devices. Current IoT growth rates suggest a 40 - 60Bn population by 2030 to 2050. However, we present evidence that it could be far more: ~1,000Bn ‘Things’. This is based on observation of the number of IoT components populating modern vehicles, homes, offices, factories and plants, along with smart ‘human implants’ and ‘smart bolts’, plus the instrumentation of civil structures.
The bold assumption that 5G would be a dominant player in the IoT is now patently one of naivety, and the world has become far more complex with over 10 wireless standards currently in use. So this poses the question: will 6G rise to the challenge? We see this as highly unlikely as the diversity of need is extremely broad, and we propose that it could be the end of tower-based networks for a lot of applications. A migration to mesh-nets, UWB and HWB (Hyper Wide Band) for the IoT at frequencies above 100GHz seems the most obvious engineering choice as it allows for far simpler designs with extremely low power at sub-$0.01/device cost. 5G is already on the margins of being sustainable, and a ‘more-of-the-same’ 6G can only be far worse!
In 2015/16 a number of bodies/nations set about defining societies they would aspire to in the near future. Each vision document similarly described some idealistic, egalitarian, super-smart, human centred, state providing a near uniformity of living conditions, and opportunity. At the same time, each society would be free of adversity, with economic development guided by ecological and human need. Of course, economic growth was defined to continue in line with the past. Very nice, but a product of old linear thinking and modelling!
It is now approaching 2022, and in the past 5-7 years our base silicon technology has advanced to enjoy a >30-fold increase in computing power. Our top-end mobile devices would now challenge a supercomputer of the 1996/7 era, whilst AI systems now pervade our homes, offices, vehicles, professions and all our on-line services. At the same time, information overload has started to rival some medical conditions!
All of this has also been compounded by two years of COVID-19 lockdowns and restrictions that have seen the normalisation of social isolation, limited travel, working and education from home, virtualised medicine and care, support services, shopping and meetings. In turn, this has resulted in empty offices, towns and cities. Concurrently, climate change, global warming, pollution, finite resources, a stressed planetary system, and social unrest have suddenly become urgent issues. Against this backdrop it really seems to be time to revisit those Society 5.0 Visions and the limited linear thinking that contrived them!
In this presentation we examine many of the core parameters and assumptions to highlight existing, or soon to be realised, solutions and remedies. In doing so, a different picture of Society 5.0 emerges.
It was scientifically established in the 1970s that we are stressing the planet beyond the point where it can naturally recover. Today we are using about 50% more natural resources than can be extracted sustainably. The long history of industrialisation and population growth is now seeing climate change and extreme weather, and perhaps human overpopulation and terraforming are now giving rise to pandemics as we increasingly challenge and stress ecosystems.
Stressed systems react and fail in a variety of ways, and there is increasing evidence that CV-19 might just be the surprising product of human abuse of nature. What we can be certain of is that without action we will see more unpleasant and unwelcome surprises.
The Green Agenda is our biggest hope, but much of it is driven by emotion rather than deep thought, evidence, and scientific analysis. For example, recycling is mostly a fallacy and we need to think again! In reality, Industry 4.0 is the first major program vested in the basics of long-term sustainability.
In this presentation we give a brief overview of what I4.0 brings to the party by a focus on one major sector that is ripe for transformation. A much broader and wider treatment has been presented at previous events and numerous additional, associative, and supportive slide sets in this series are available on the web site.
Accelerate your Kubernetes clusters with Varnish Caching – Thijs Feryn
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
Search and Society: Reimagining Information Access for Radical Futures – Bhaskar Mitra
The field of Information retrieval (IR) is currently undergoing a transformative shift, at least partly due to the emerging applications of generative AI to information access. In this talk, we will deliberate on the sociotechnical implications of generative AI for information access. We will argue that there is both a critical necessity and an exciting opportunity for the IR community to re-center our research agendas on societal needs while dismantling the artificial separation between the work on fairness, accountability, transparency, and ethics in IR and the rest of IR research. Instead of adopting a reactionary strategy of trying to mitigate potential social harms from emerging technologies, the community should aim to proactively set the research agenda for the kinds of systems we should build inspired by diverse explicitly stated sociotechnical imaginaries. The sociotechnical imaginaries that underpin the design and development of information access technologies need to be explicitly articulated, and we need to develop theories of change in the context of these diverse perspectives. Our guiding future imaginaries must be informed by other academic fields, such as democratic theory and critical theory, and should be co-developed with social science scholars, legal scholars, civil rights and social justice activists, and artists, among others.
"Impact of front-end architecture on development cost", Viktor Turskyi – Fwdays
I have heard many times that architecture is not important for the front-end. Also, many times I have seen how developers implement features on the front-end just following the standard rules for a framework and think that this is enough to successfully launch the project, and then the project fails. How to prevent this and what approach to choose? I have launched dozens of complex projects and during the talk we will analyze which approaches have worked for me and which have not.
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo... – James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. A constant focus on speed to release software to market, along with traditionally slow and manual security checks, has caused gaps in continuous security, an important piece of the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerabilities and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with a PASSION for technology and making things work, along with a knack for helping others understand how things work. He has around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations on CI/CD and application security integrated into the software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
DevOps and Testing slides at DASA Connect – Kari Kakkonen
My slides with Rik Marselis at the 30.5.2024 DASA Connect conference. We discuss what testing is, then what agile testing is, and finally what testing in DevOps looks like. We also ran a lovely workshop with the participants, exploring different ways to think about quality and testing in different parts of the DevOps infinity loop.
State of ICS and IoT Cyber Threat Landscape Report 2024 preview – Prayukth K V
The IoT and OT threat landscape report has been prepared by the Threat Research Team at Sectrio using data from Sectrio's cyber threat intelligence farming facilities spread across over 85 cities around the world. In addition, Sectrio also runs AI-based advanced threat and payload engagement facilities that serve as sinks to attract and engage sophisticated threat actors and newer malware, including new variants and latent threats that are at an earlier stage of development.
The latest edition of the OT/ICS and IoT security Threat Landscape Report 2024 also covers:
State of global ICS asset and network exposure
Sectoral targets and attacks as well as the cost of ransom
Global APT activity, AI usage, actor and tactic profiles, and implications
Rise in volumes of AI-powered cyberattacks
Major cyber events in 2024
Malware and malicious payload trends
Cyberattack types and targets
Vulnerability exploit attempts on CVEs
Attacks on countries – USA
Expansion of bot farms – how, where, and why
In-depth analysis of the cyber threat landscape across North America, South America, Europe, APAC, and the Middle East
Why are attacks on smart factories rising?
Cyber risk predictions
Axis of attacks – Europe
Systemic attacks in the Middle East
Download the full report from here:
https://sectrio.com/resources/ot-threat-landscape-reports/sectrio-releases-ot-ics-and-iot-security-threat-landscape-report-2024/
JMeter webinar - integration with InfluxDB and Grafana – RTTS
Watch this recorded webinar about real-time monitoring of application performance. See how to integrate Apache JMeter, the open-source leader in performance testing, with InfluxDB, the open-source time-series database, and Grafana, the open-source analytics and visualization application.
In this webinar, we will review the benefits of leveraging InfluxDB and Grafana when executing load tests and demonstrate how these tools are used to visualize performance metrics.
Length: 30 minutes
Session Overview
-------------------------------------------
During this webinar, we will cover the following topics while demonstrating the integrations of JMeter, InfluxDB and Grafana:
- What out-of-the-box solutions are available for real-time monitoring JMeter tests?
- What are the benefits of integrating InfluxDB and Grafana into the load testing stack?
- Which features are provided by Grafana?
- Demonstration of InfluxDB and Grafana using a practice web application
To view the webinar recording, go to:
https://www.rttsweb.com/jmeter-integration-webinar
Designing Great Products: The Power of Design and Leadership by Chief Designe...
QUANTUM COMPUTING REALITY CHECK
1. QUANTUM COMPUTING REALITY CHECK
Demystifying a world of the weird and unexpected
Prof Peter Cochrane
www.petercochrane.com
2. Why QC ?
The big deal
Without QC we will never understand:
-Life
-Physics
-Biology
-Climate
-Ecologies
-Chemistry
-Cosmology
-Complexity
-Non-Linearity
-Quantum Mechanics
-Many-Body Problems
-++++++++++++++++++
3. Why QC ?
The big deal
Achieving long-term sustainability — the survival of species — is at much higher risk sans Quantum Computing.
As far as we can see, our digital path will no longer scale to address many of our important mission-critical problems.
The fundamental limits are defined by the entire universe, and all in it, being a part of the same quantum machine.
4. WILD CLAIMS
Quantum Computers will:
- Access all bank accounts
- Instantly decrypt credit cards
- Replace all our digital machines
- Make quantum encryption mandatory
- Give high precision and certain answers
- Shrink down to become desktop machines
- Solve all our problems today and in the future
- Simulate the known universe with less than 300 Qbits
- xxx
- Improve your sex life
5. WILD CLAIMS
(The list above repeated, now struck through with a large X!)
6. KNOWN UNIVERSE
An incomplete definition
Best Estimates ~ 10^78 to 10^82 Atoms/Protons
Ball Park: Say 10^80 ‘Particles’ ≈ 2^266
So 300 Qbits => 2^300 instantaneously defined ‘states’
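The slide's ball-park arithmetic is easy to check with arbitrary-precision integers:

```python
# Check the slide's arithmetic: 300 qubits span 2^300 basis states,
# more than the ~10^80 particles estimated in the known universe.
import math

particles = 10 ** 80
states = 2 ** 300

print(states > particles)       # True
print(math.log2(particles))     # ~265.75, i.e. 10^80 ≈ 2^266
```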
7. KNOWN UNIVERSE
(The estimate above repeated, now struck through with a large X!)
8. QUANTUM SUPREMACY
No one complete definition - many shades
Key numbers; IFF we can engineer them!
50 Qbits => 2^50 ≈ 10^15 ≈ 1 PBit …. 54 Qbits > 1 PByte
70 Qbits => 2^70 ≈ 10^21 ≈ 1 ZBit …. 74 Qbits > 1 ZByte
A 70 Qbit machine can fundamentally perform computations impossible on any other machine
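The qubit state-count arithmetic above can be reproduced directly:

```python
# State counts for n-qubit registers, compared with storage prefixes:
# peta = 10^15, zetta = 10^21. A 50-qubit state vector already has
# ~10^15 amplitudes; 70 qubits push that to ~10^21.
for n, scale, prefix in [(50, 10 ** 15, "peta"), (70, 10 ** 21, "zetta")]:
    states = 2 ** n
    print(n, "qubits:", states, "states ~", round(states / scale, 2), prefix)
```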
9. THE GOTCHA!
Quantum Supremacy??
10. HMM REALLY!
Quantum Supremacy??
Google: 53 Qbits, a selected problem, and a best estimate of supercomputer processing time
Problem-specific solution & computation vs general-purpose solutions & computation
The largest chemical simulation on a quantum computer to date… but not all that impressive… energy state min!
11. KEY CONCEPTS
A duality of states
Schrödinger’s Cat
A cat is locked in a box with a device that will kill it at some random time….
Before we open the box, all we can say is that it may be dead or alive with equal probability: 50% : 50%!
Famously dead and alive at the same time!
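The 50% : 50% outcome can be mimicked with a toy classical simulation of measuring an equal superposition; this is an illustrative sketch, not real quantum code.

```python
# Toy classical simulation: "measure" an equal superposition
# (|alive> + |dead>) / sqrt(2). Outcome probabilities are the squared
# amplitudes, 0.5 each, so repeated measurements split roughly 50:50.
import random

amplitudes = {"alive": 2 ** -0.5, "dead": 2 ** -0.5}

def measure(state, rng):
    """Sample one outcome with probability |amplitude|^2."""
    outcomes = list(state)
    weights = [abs(a) ** 2 for a in state.values()]
    return rng.choices(outcomes, weights=weights)[0]

rng = random.Random(1)
counts = {"alive": 0, "dead": 0}
for _ in range(10_000):
    counts[measure(amplitudes, rng)] += 1
print(counts)  # roughly 5000 apiece
```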
12. KEY CONCEPTS
A duality of states
Schrödinger’s Cat - famously dead and alive at the same time, a duality of 1 and 0!
HOLD THAT THOUGHT
13. KEY CONCEPTS
Quantum entanglement
Communication and/or action at a distance with no visible/known/explainable means of connection…
…demonstrated at over 100km!
16. KEY CONCEPTS
Quantum entanglement
Einstein: ‘spooky action at a distance’ - still A MYSTERY!
17. CONCEPTUALISATION
By analogy with a situation we understand
Unstable State vs Stable State
If the magnets are free floating they will move to the lowest energy/stress/pressure state of alignment
18. CONCEPTUALISATION
By analogy with a situation we understand
Unstable State
Stable State
If the magnets are free floating they will move to the lowest energy/stress/pressure state of alignment
All the fields and forces can be measured accurately
We can define and describe all of them well mathematically
But we still do not know what they are exactly
19. WHAT WE KNOW FOR SURE (?)
What our experiments/measurements reveal
Gravity (g) = the weakest force in the universe but occurs in massive concentrations - only attracts - acts over vast distances
Weak Nuclear Force (10^25 × g) = only acts at a sub-atomic scale and is responsible for radioactive decay
ElectroMagnetic Force (10^36 × g) = acts over vast distances but occurs in very low concentrations - attracts and repels
Strong Nuclear Force (10^38 × g) = only acts at the sub-atomic nucleus scale - attracts & repels - very low concentrations
20. WHAT WE KNOW FOR SURE (?)
What our experiments/measurements reveal
Gravity (g) = the weakest force in the universe but occurs in massive concentrations - only attracts - acts over vast distances
Weak Nuclear Force (10^25 × g) = only acts at a sub-atomic scale and is responsible for radioactive decay
ElectroMagnetic Force (10^36 × g) = acts over vast distances but occurs in very low concentrations - attracts and repels
Strong Nuclear Force (10^38 × g) = only acts at the sub-atomic nucleus scale - attracts & repels - very low concentrations
Are we still missing something?
WE NEED A GUT (Grand Unified Theory) that tells us how they are all related
21. SOME COMPUTING BASICS
Digital Computers:
Binary Bit = 1 or 0 with 100% probability
Binary States = 'n bits' gives 2^n possible states
Digital Process = Serial/Exact/Defined and precise
Quantum Computers:
QBit = 1 and 0 simultaneously with uncertainty
QStates = 'n Qbits' gives all 2^n 'instantly'
Process = Parallel/Superposition and imprecise
Just like Schrödinger's Cat!
22. SOME COMPUTING BASICS
Digital Computers:
Binary Bit = 1 or 0 with 100% probability
Binary States = 'n bits' gives 2^n possible states
Digital Process = Serial/Exact/Defined and precise
Quantum Computers:
QBit = 1 and 0 simultaneously with uncertainty
QStates = 'n Qbits' gives all 2^n 'instantly'
Process = Parallel/Superposition and imprecise
Just like Schrödinger's Cat!
The sequential propagation of deterministic states
A PROBABILISTIC propagation of energy waves
Coherent solution states are short lived and noisy
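The n bits versus n Qbits contrast above can be made concrete with a small sketch in plain Python (numpy for the statevector; all names here are illustrative): a classical 3-bit register holds exactly one of 2^3 = 8 states with certainty, while a 3-qubit statevector carries an amplitude for all 8 at once - until measurement collapses it to a single classical value.

```python
import numpy as np

n = 3                                # register width
N = 2 ** n                           # number of distinct states

# Digital register: exactly one of the 2^n states, with certainty.
digital_state = 0b101                # the register IS this value

# Quantum register: a statevector of 2^n complex amplitudes.
# A uniform superposition holds all 2^n states "at once".
amplitudes = np.full(N, 1 / np.sqrt(N), dtype=complex)

# Probabilities must sum to 1 (Born rule).
probs = np.abs(amplitudes) ** 2
assert np.isclose(probs.sum(), 1.0)

# Measurement collapses the superposition to a single classical state.
rng = np.random.default_rng(0)
outcome = rng.choice(N, p=probs)
print(f"digital register: {digital_state:03b} (certain)")
print(f"measured qubit register: {outcome:03b} (1 of {N}, p = {probs[outcome]:.3f})")
```

Note the asymmetry the slides stress: the 2^n amplitudes evolve in parallel, but a single readout still yields only one n-bit answer.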
23. BOOLEAN ALGEBRA RULES
All is well behaved, stable, deterministic, repeatable, accurate, reliable, logical
One mathematical framework able to power automation, mechanical/logic/digital computing, telephone switches, microelectronics and supercomputers
DIGITAL COMPUTING
Concatenated gates of certainty
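As a minimal illustration of "concatenated gates of certainty", here is a half-adder composed purely from NAND gates (a standard textbook construction, not taken from the slides): the same inputs always yield the same outputs - no noise, no probability.

```python
# "Concatenated gates of certainty": every composite digital function is
# built by wiring together a few primitive Boolean gates. A half-adder
# composed entirely from NAND is deterministic and repeatable - the same
# inputs always yield the same outputs.

def nand(a: int, b: int) -> int:
    return 1 - (a & b)

def xor(a: int, b: int) -> int:
    # Classic 4-NAND construction of XOR
    c = nand(a, b)
    return nand(nand(a, c), nand(b, c))

def half_adder(a: int, b: int) -> tuple[int, int]:
    """Return (sum, carry) for two input bits."""
    return xor(a, b), a & b

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", half_adder(a, b))
```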
24. QUANTUM COMPUTING?
Concatenated gates of noisy uncertainty
Math algorithms and logic solutions for complex problem sets growing ahead of the QC hardware capability
Complex Logic Forms
Suited to the analysis of non-linear systems such as database searches, biology, chemistry, life, fluid flow, weather systems, predictions…
25. BACK TO ANALOGUE
QC is a new paradigm
QCs are NOT:
- Digital
- Deterministic
- Conditionally stable
- Self reliant stand alone
QCs ARE:
- Analogue
- Subject to errors
- Conditionally unstable
- Subject to all forms of noise
- Reliant on digital computers
26. QC OPERATIONS
Needs digital control
Digital computers control programming, algorithms, error correction, answer selection/testing and verification: and this is very unlikely to change unless we make new and fundamental discoveries in physics and/or materials
Digital computers also have a big 'caretaker' role, maintaining a stable temperature-controlled environment with cryogenics plant down below 10 mK.
27. POSITIONING
What do we really know?
'Physics is to sex, as mathematics is to masturbation'
"I think I can safely say that nobody understands quantum mechanics."
Richard Feynman (1985)
STILL TRUE Today
28. SEGUE
Modelling
“Probability and Statistics are a consequence of incomplete/sparse data”
“Quantum Theory is a consequence of measurement and modelling inadequacies”
“Both are victims of a lack of dimensionality”
29. DEMO: DIMENSIONALITY
An 'N' Dimensional world
A 'one D' world looks like this…
…and all we can do is describe the behaviour probabilistically!
Adding just one more dimension tells the whole story!
30. This does not look like a deterministic world, as there appear to be no visible patterns. Therefore, all we can do is assume a probabilistic model and apply statistical analysis to the distribution, and duration, of the flashes - and possibly discern their colour and brightness.
Just because we cannot see or imagine determinism here does not mean that it does not exist. Let's add just one more dimension and see what gives…
ONE DIMENSIONAL WORLD
31. Now the full determinism of Newtonian Mechanics is revealed and the
mechanism at work is obvious and well known!
TWO DIMENSIONAL WORLD
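The one-dimension demo above can be reproduced numerically (a sketch in plain Python; the circular-motion mechanism is assumed here for illustration): sampling only the x-coordinate of a point in uniform circular motion yields what looks like an irreducible probability distribution, yet restoring the hidden second dimension reveals exact determinism.

```python
import math
import random

# A point moves deterministically on a circle (Newtonian: uniform rotation).
# A 1-D observer sees only its x-coordinate at random instants - and can do
# no better than fit a probability distribution to the "flashes".

random.seed(1)
samples = [math.cos(2 * math.pi * random.random()) for _ in range(100_000)]

# The 1-D view: positions pile up near +/-1 (the arcsine distribution) -
# it *looks* intrinsically probabilistic.
edge = sum(1 for x in samples if abs(x) > 0.9) / len(samples)
mid = sum(1 for x in samples if abs(x) < 0.1) / len(samples)
print(f"fraction near edges: {edge:.3f}, near centre: {mid:.3f}")

# The 2-D view: add the hidden dimension back and determinism is obvious -
# every sample lies exactly on the unit circle x^2 + y^2 = 1.
t = random.random()
x, y = math.cos(2 * math.pi * t), math.sin(2 * math.pi * t)
assert abs(x * x + y * y - 1.0) < 1e-12
```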
32. STOP PRESS
This came as something of a 'big' surprise - and explains why some studies have hit a dead end… they really do need a Quantum Computer!
33. IGNORANCE
An incomplete/imperfect model does not mean we cannot exploit something
34. MY ADVICE
A bit of a brain bender!
“Relax your grip on the solid, semi-solid, particle and wave models of the atomic, sub-atomic and photonic - and start thinking in terms of: 'clouds of energy'”
“And be prepared to flip between time and frequency… and to extend that understanding to spatial dimensions and forms”
“IFF you have studied Fourier and Laplace you will find a lot of axioms here - but if not, I will also try and explain graphically and with animations”
35. Cold does not exist - but an absence of heat does
Darkness does not exist - but an absence of photons does
SCIENTIFIC METHOD
A tried and tested framework
Observation
Hypothesis
Theory
Experiment
Agree / Disagree
Corroboration
Teams all over the planet try to repeat the results, with some dedicated to disproving the theory on a continual basis
37. Cold does not exist - but an absence of heat does
Darkness does not exist - but an absence of photons does
SCIENTIFIC METHOD
A tried and tested framework
Observation
Hypothesis
Theory
Experiment
Agree / Disagree
Corroboration
Teams all over the planet try to repeat the results, with some dedicated to disproving the theory on a continual basis
This methodology has been tried and tested over >400 years and is responsible for the greatest advances our species has ever enjoyed, but it can never deliver 100% certainty!
38. SCIENTIFIC DISCIPLINE
You can never be certain you are right
“The first principle is that you must not fool yourself — and you are the easiest person to fool.”
“You have to be very careful. After you’ve not fooled yourself, it’s easy not to fool other scientists. You just have to be honest in a conventional way after that.”
Richard P. Feynman
39. HISTORY
Atomic Models
“These concepts all served us well and have been good enough for their time - but they are incomplete”
Heisenberg uncertainty principle: “As we learn more about the electron's position, we know less about its momentum, and vice versa” - and as time and experiments have progressed, the cloud analogy looks closer to the truth!
40. HISTORY
Atomic Models
BUT WE MUST START HERE
The increasing sophistication of these models went hand-in-hand with our ability to make detailed observations based on laboratory experiments - and in turn gave us a knowledge of chemistry and materials that has powered our modern world
“These concepts all served us well and have been good enough for their time - but they are incomplete”
Heisenberg uncertainty principle: “As we learn more about the electron's position, we know less about its momentum, and vice versa” - and as time and experiments have progressed, the cloud analogy looks closer to the truth!
41. ACTUALITY 1
As far as we can observe today
Erwin Schrödinger Wave Function Model 1926
“His conceptualisation and mathematical model turned out to be a pretty good approximation to the truth given that he had no way of making any realistic measurements or observations”
We have to think in terms of probabilistic clouds of energy that are constantly on the move and of near unbounded form
Schrödinger did not derive this equation: it 'sort of' popped into his head!
42. ACTUALITY 2
As far as we can observe today
Many different groups globally have independently observed the structure (or lack of it) of different types of atoms using a variety of techniques, and all produced results that show vibrating and fuzzy clouds of energy!
43. ACTUALITY 3
Atomic crystal solid in motion
Notice that the (coherent) matrix structure gives way to (decoherence) fault lines from time-to-time due to dynamic stress, strain, fields & external energy - a feature exploited in some sensor systems but a real (noisy) problem in QC!
44. ACTUALITY 4
'More Nothing' than the universe
The Sun & a Gold Atom normalised to a 30cm radius: 2.6km / 5.3km / 56m / 30cm
Atoms have ~10x the 'emptiness' of our solar system
45. FUNDAMENTAL STATES
Resolving the seemingly impossible
Digital Computers: two invariant stable states:
Binary Bit = 1 or 0 with 100% probability
Binary States = 'n bits' gives 2^n possible states
Quantum Computers: probabilistic semi-stable states:
QBit = 1 and 0 simultaneously, uncertain
QStates = 'n Qbits' gives all 2^n 'instantly'
1 : 0 superposition?
46. BITS & QBITS
Some big differences
Digital Computer - Binary: 1 = thousands of electrons; 0 = tens of electrons
Quantum Computer - an atom, or electron, or photon: 1 = spin up; 0 = spin down
47. BLOCH SPHERE
Atoms, Electrons, Photons, in many states
Cone of probability for an electron, atom, or photon electronically forced to assume a given polarisation as a Qbit… uncertainty is mostly due to external influence such as noise, EM radiation, temperature, vibration+++
To overcome these effects operating temperatures are generally <10 mK, with extensive shielding and insulation from the natural working environment… including people!
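The Bloch sphere picture can be made concrete with a short numpy sketch (illustrative only): the standard mapping takes a pure qubit state |ψ⟩ = cos(θ/2)|0⟩ + e^{iφ} sin(θ/2)|1⟩ to a point on the unit sphere, and the noise on the slide can be pictured as the vector shrinking toward the centre (a more mixed, less certain state).

```python
import numpy as np

# A single-qubit pure state |psi> = cos(theta/2)|0> + e^{i phi} sin(theta/2)|1>
# maps to a point on the surface of the Bloch sphere:
def bloch_vector(theta: float, phi: float) -> np.ndarray:
    return np.array([
        np.sin(theta) * np.cos(phi),   # x
        np.sin(theta) * np.sin(phi),   # y
        np.cos(theta),                 # z
    ])

up = bloch_vector(0.0, 0.0)            # |0> : "spin up", north pole [0, 0, 1]
plus = bloch_vector(np.pi / 2, 0.0)    # (|0>+|1>)/sqrt(2): equator, ~[1, 0, 0]

print("|0> ->", up)
print("|+> ->", plus)

# External noise (the uncertainty cone on the slide) can be pictured as the
# vector shrinking toward the sphere's centre: |r| < 1 means a mixed state.
noisy = 0.8 * plus                     # illustrative depolarising shrink
print("purity proxy |r| =", np.linalg.norm(noisy))
```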
48. POINCARÉ SPHERE
Atoms, Electrons, Photons & many states
Atoms, Electrons, and Photons offer the potential for 'n'-level quantum systems with state diagrams similar to digital radio… however, control and stability are demanding issues
Also known as the Bloch or Riemann Sphere, & the Hilbert State Space
Signal State
Noise
49. SUPERPOSITION
An outcome of Schrödinger's equation
Particles are a concentration of energy in the form of waves that see all possible states assumed at the same time…
…not until we inspect or observe the particle does it assume a singular stable state
On an atomic scale an electron spins ~10^11 faster than this simple demo
50. SUPERPOSITION
An outcome of Schrödinger's equation
Here we slow down to observe individual states, but in doing so we destroy all superpositions
52. DUALITY
Particle & Wave Analogy
Particles - Atoms, Electrons, Photons - can influence each other through their wave nature
53. SUPERPOSITION
Pulse, particle or wave interference
“We assign clouds of energy the descriptors of singular atoms with constituents defined by combinations of singular orthogonal waves collectively interacting in the near and far field - but we do so with gaps in our understanding of what fields, waves and energy actually are!”
The good news is: from the observed and mathematically predicted behaviours, we grasp sufficient to build everything from MRI Scanners to Nuclear Power Stations+++
54. SO HERE WE ARE
A single Qbit in superposition
Spin 'up' = 1
Spin 'down' = 0
55. QUANTUM COMPUTING?
Concatenated gates of noisy uncertainty
'Stable' Input Tensors
'Noisy, Unstable, Probabilistic' Output Tensors
56. QUANTUM COMPUTING?
Concatenated gates of noisy uncertainty
'Stable' Input Tensors
'Noisy, Unstable, Probabilistic' Output Tensors
We have to create an algorithm or problem facsimile model and feed in the input data states, then inspect the output data.
The output has a short coherence state that is noisy, and we have to take many samples to assess and rework in reverse digitally.
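The many-samples workflow described above can be mimicked classically (a toy model: the 40% success rate and the answer value are invented for illustration): each noisy 'shot' only sometimes returns the correct register value, so the digital co-processor takes thousands of shots and keeps the most frequent, verifiable candidate.

```python
import random
from collections import Counter

# A noisy QC run does not hand back "the answer": each shot draws from a
# distribution in which the correct result is merely the most likely one.
# Repeated sampling plus classical post-processing recovers it.

random.seed(42)
TRUE_ANSWER = 0b1011     # hypothetical correct output register value
STATES = list(range(16))

def one_shot() -> int:
    """One noisy run: returns the right answer ~40% of the time, noise otherwise."""
    if random.random() < 0.40:
        return TRUE_ANSWER
    return random.choice(STATES)

shots = Counter(one_shot() for _ in range(5_000))
best, count = shots.most_common(1)[0]

# The digital co-processor then verifies the candidate answer cheaply.
print(f"most frequent outcome: {best:04b} ({count} / 5000 shots)")
assert best == TRUE_ANSWER
```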
57. COMPUTING
Gates/Functional Blocks
All the matrices used are in the form of 1:0 patterns
All the 1:0 patterns are made up of Qbits that are short-term stable…
…they occasionally hit a coherent condition that represents one possible solution state
58. IMAGINE
Many many Blocks
Each element is a matrix of different size & configurations
In a quantum machine the qbit states ripple through in the form of energy waves exhibiting emergent patterns of coherence and decoherence that periodically reveal the likely solutions to the problem at hand
60. QBIT STATE WAVES
Matrices of Qbits see energy flows
In a digital machine we might envisage bits on the march: a bit and a gate at a time, moving in synchrony a clock tick at a time from input to output…
In a quantum machine the qbit states ripple through in the form of energy waves with emergent patterns of coherence and decoherence that periodically reveal the likely solutions to the problem at hand
NOTE: This illustration is a model and not actuality
61. EXAMPLE
Prime factors
Biggest QC prime factorisation to date: 1,099,551,473,989 = 1,048,589 × 1,048,601 =>> O(10^12)
RSA 1024 is =>> O(10^309) : RSA 2048 is =>> O(10^617)
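The quoted record is easy to verify classically, which also shows why O(10^12) is unimpressive next to RSA: a few lines of Python (trial division, fine at this scale) confirm the factorisation in milliseconds, while the same naive approach on an RSA-1024 modulus would face ~10^154 candidate divisors.

```python
from math import isqrt

def is_prime(n: int) -> bool:
    """Deterministic trial division - fine for factors of ~10^6."""
    if n < 2:
        return False
    for d in range(2, isqrt(n) + 1):
        if n % d == 0:
            return False
    return True

# The record factorisation quoted on the slide, checked classically:
p, q, n = 1_048_589, 1_048_601, 1_099_551_473_989
assert p * q == n and is_prime(p) and is_prime(q)

# Why RSA still stands: trial division needs ~sqrt(n) probes. That is ~10^6
# candidates here, but ~10^154 for an RSA-1024 modulus of ~10^309.
print("candidate divisors up to sqrt(n):", isqrt(n))
```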
62. ARCHITECTURES
There are many - all experimental
Nowhere near a production line!
63. ANALOGUE OUTPUT
Output format!
Error rates of QBits and coherence uncertainty lead to a solution spread, demanding several 'runs or trials' to help identify the correct answer(s)
Region of Likely Answers
Answers Checked by Digital Computer
64. PARADIGM
Not Unique
Factorise: 23789..07 - complexity beyond a digital computer
QC: Complex Quantum Analysis
Multiple possible answers tested by digital computer
Possibles: 7004…017, 3123…419, ++++, 1364…003
Digital Computer: multiplies large primes in combination pairs
Select number pairs for testing
Check validity
Verified Result
65. QBIT PROGRESS
Noise/Stability Challenge
1998: 2 Qbits - IBM, MIT, Oxford, Berkeley, Stanford
2000: 7 Qbits - Los Alamos; 5 Qbits - TU Munich
2006: 12 Qbits - MIT
2017: 50 Qbits - IBM
2018: 72 Qbits - Google
2020: 128 Qbits - Rigetti; Honeywell; 68 Qbits - IBM
67. SANITY CHECK
We have been here before
1950s task-specific computing & the 24x7 maintenance crew
Electric motor for the 2 Mbyte hard drive
68. THE SPECTRUM?
Beyond the BS we have…
(Axes: Processing Power & Apps vs Time to Commercialisation)
Quantum Annealing (Optimisation) - Commercialised: 'D' Wave
Quantum Simulation (Combinatorial Complexity) - “The current industry focus” - QCAAS available 2025/30
Universal Quantum Computing Engines (100k-1M QBits) - “Probably the ultimate brain in the making” - 2035/40
BUT we have NO:
- QBit tech small/reliable enough
- Machine architectures
- Software frameworks
Shor's algorithm: factoring numbers for code breaking - >2k Qbits for RSA1024
Grover's algorithm: searching massive unstructured data sets
>50 other unique algorithms have been developed - BUT there are no machines to test them on yet!
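Grover's algorithm, mentioned above, can be sketched without quantum hardware as a tiny statevector simulation in numpy (a standard textbook toy, not a claim about real machines): the oracle flips the marked amplitude and the diffusion step inverts about the mean, boosting the marked item's probability in about π/4·√N iterations versus ~N/2 classical probes.

```python
import numpy as np

# Toy statevector simulation of Grover search over N = 2^3 = 8 items.
n_qubits, marked = 3, 5
N = 2 ** n_qubits

state = np.full(N, 1 / np.sqrt(N))                   # uniform superposition

iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))   # ~pi/4 * sqrt(N) = 2 here
for _ in range(iterations):
    state[marked] *= -1                  # oracle: phase-flip the marked amplitude
    mean = state.mean()
    state = 2 * mean - state             # diffusion: inversion about the mean

probs = state ** 2
print(f"after {iterations} Grover iterations:")
print(f"P(marked item {marked}) = {probs[marked]:.3f}")  # vs 1/8 = 0.125 by chance
```

For N = 8 this lands the marked item with probability ~0.945 after just two iterations, which is the whole point: the speed-up is quadratic, and the output is still probabilistic, so real runs would repeat and verify digitally.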
69. FIN - Q&A?
www.petercochrane.com
“A digital computer is to a piano as a quantum computer is to an orchestra”