The document discusses the origins and definitions of the digital twin concept. Some key points:
- The concept of a digital twin dates back to 2002 when Dr. Grieves presented the idea of real and virtual spaces that are linked and mirror each other throughout a physical system's lifecycle.
- A digital twin prototype contains information to describe and produce a physical version, while a digital twin instance is linked to a specific physical product.
- Digital twins can be used predictively to simulate future behavior and interrogatively to examine current and past states.
- The digital twin concept envisions the physical and virtual systems remaining linked and updated throughout a system's creation, production, usage, and disposal lifecycle phases.
Digital Twin refers to a physical and functional description of a component, product, or system, together with all available operational data. This includes all information that could be useful in current and subsequent lifecycle phases. Its benefit for mechatronic and cyber-physical systems is that information created during design and engineering remains available during operation of the system. The comprehensive networking of all this information, shared between partners and connecting design, production, and usage, forms the presented paradigm of the next-generation Digital Twin.
Next IIoT wave: embedded digital twin for manufacturing - IRS srl
The next IIoT wave will be a population of digital twins. A digital twin is a real-time digital replica of a physical device. Developing an embedded digital twin allows superior device diagnostics and failure anticipation. Discover how to implement an embedded digital twin using real-time monitoring, physical models, and machine learning.
Digital twins: the power of a virtual visual copy - Unite Copenhagen 2019 - Unity Technologies
From buildings and infrastructure to industrial machinery and factories, digital twins are becoming integral visualization tools across the industrial sector. Learn how Unit040, a company specializing in visualization and simulation, creates digital twins that combine real-time 3D technology with BIM, CAD, and CAE systems to add value at all stages of the building and product lifecycle, from the early design phase to predictive maintenance using Internet of Things (IoT) data.
Speakers:
Pieter Weterings - Unit040
Guido van Gageldonk - Unit040
Watch the session on YouTube: https://youtu.be/j4i14p89h_s
What is the Digital Twin?
A digital twin is the ability to make a virtual representation of the physical elements and the dynamics of how an Internet of Things device operates and works. It is more than a blueprint, more than a schematic, and not just a picture; it is far more than a pair of 'virtual reality' glasses. It is a virtual representation of both the elements and the dynamics of how an Internet of Things device responds throughout its lifecycle. It can be a jet engine, a building, a process on a factory floor, and much, much more.
Watch the video introduction of this keynote presentation from Genius of Things Summit in Munich https://youtu.be/RaOejcczPas
Digital Twin - What is it and how can it help us? - Shaun West
RQ: What services can be provided (by whom and to whom) through (or by adopting, or developing) digital twin concepts?
Our focus is long-life capital equipment
Consider the whole life cycle
Apply Service Dominant Logic in the assessment
Consider technical and business hierarchies
A digital twin is a real-time digital replica of a physical device. Developing an embedded digital twin allows superior device diagnostics and failure anticipation. Discover how to use the NI platform to implement an environmental control device (HVAC) twin using real-time monitoring, physical models, and machine learning.
Presentation developed for IoT DevCon in April 2017, Santa Clara, California, USA.
In this deck, I go through the value of data and derivative analytics from an economic point of view, and then connect to how we can package that value for monetization in the context of IoT. The key technology enabler is the Digital Twin, which is described along with the platforms that support this paradigm. It ends with recommendations on how to get ready and how to start using the GE Digital Predix Platform.
A technology that is going to create a revolution in every industry, including healthcare. What is it, what are the tools, and what is the outcome?
NASA started the research on twins because of space travel and the need for real-time feedback on components. The concept is now extending even to healthcare, toward a human twin.
I gave this talk at the 'Digital Twin Conference' hosted by LH Corp at COEX, Seoul on August 8th, 2019.
Abstract: A 'Digital Twin' is a digital replication of real-world objects, processes, and phenomena that can be used for various purposes. The digital twin concept goes back to the manufacturing industry in the early 2000s, for Product Lifecycle Management (PLM) purposes. It is based on the idea that a digital informational construct about a physical system could be created as an entity on its own. Definitions of the digital twin emphasize three important levels or characteristics. First, there should be a connection between the real physical world and the corresponding virtual world; to provide this, a Level 1 digital twin offers virtual 3D models. Second, this connection between the real world and the virtual world is established by generating (near) real-time data using sensors or IoT; this is called a Level 2 digital twin. Third, a Level 3 digital twin carries out analyses, predictions, and simulations using the virtual 3D models and the (near) real-time data. 'Smart Spaces' are interactive environments where humans and technology can openly communicate with each other in a physical or digital setting; examples of smart spaces include smart cities, smart factories, and smart homes. 'Smart Spaces' is one of Gartner's Top 10 Tech Trends for 2019. As spaces go through digital transformation with the 4th industrial revolution, there are many attempts around the world to apply digital twin technology to manage urban, spatial, and industrial issues. Those attempts look set to play an increasingly important role in the creation of smart cities, smart factories, and smart homes. Bringing the virtual and real worlds together in this way can bring better analysis, visualization, and simulation to the decision-making process. This will be a multi-way process with iterative feedback among stakeholders.
In this talk, I'll share my real experiences in carrying out digital twin and smart space projects. Also I’ll talk about what I’ve learnt from these projects.
4 Applications of Digital Twin in Healthcare - Tyrone Systems
In this space, we've been discussing the possibilities of using IoT applications to address cost and revenue leakage for healthcare providers, and to streamline clerical tasks that can often reduce the face time healthcare practitioners have with their patients.
The power of IoT is truly realized when data from the real world is securely transformed into the digital realm; this is commonly referred to as creating "digital twins." By "instrumenting" the real world to provide near real-time data, and by applying the powerful tools and methods of the analytical and artificial intelligence fields, digital twins can speed the creation of new and revolutionary healthcare IoT applications.
You often come across the term "digital thread" but don't know exactly what it is? Curious how companies use it? Read our presentation, which we gave at this year's Simonyi Conference!
A digital twin is a digital replica of a living or non-living physical entity. By bridging the physical and the virtual world, data is transmitted seamlessly allowing the virtual entity to exist simultaneously with the physical entity.
See: El Saddik, A. (2018). Digital Twins: The Convergence of Multimedia Technologies. IEEE MultiMedia, 25(2), 87-92.
Nurturing Digital Twins: How to Build Virtual Instances of Physical Assets to... - Cognizant
To embark on the digital twin journey, assess your readiness, define and communicate a vision, set common data management rules, and build in flexibility for intelligence.
Digital twin
Origins of the Digital Twin Concept
Working Paper · August 2016
DOI: 10.13140/RG.2.2.26367.61609
Michael Grieves
Florida Institute of Technology
Excerpted based on: Trans-Disciplinary Perspectives on System Complexity – All rights reserved
Digital Twin: Mitigating Unpredictable, Undesirable Emergent Behavior in Complex Systems (Excerpt)
Dr. Michael Grieves and John Vickers
III. The Digital Twin Concept
While the terminology has changed over time, the basic concept of the Digital Twin model has remained fairly stable from its inception in 2002. It is based on the idea that a digital informational construct about a physical system could be created as an entity on its own. This digital information would be a "twin" of the information that was embedded within the physical system itself and be linked with that physical system through the entire lifecycle of the system.
Origins of the Digital Twin Concept
The concept of the Digital Twin dates back to a University of Michigan presentation to industry in 2002 for the formation of a Product Lifecycle Management (PLM) center. The presentation slide, as shown in Figure 3 and originated by Dr. Grieves, was simply called "Conceptual Ideal for PLM." However, it did have all the elements of the Digital Twin: real space, virtual space, the link for data flow from real space to virtual space, the link for information flow from virtual space to real space, and virtual sub-spaces.
The premise driving the model was that each system consisted of two systems: the physical system that has always existed and a new virtual system that contained all of the information about the physical system. This meant that there was a mirroring or twinning of systems between what existed in real space and what existed in virtual space, and vice versa.
The PLM, or Product Lifecycle Management, in the title meant that this was not a static representation, but that the two systems would be linked throughout the entire lifecycle of the system. The virtual and real systems would be connected as the system went through the four phases of creation, production (manufacture), operation (sustainment/support), and disposal.
This conceptual model was used in the first executive PLM courses at the University of Michigan in early 2002, where it was referred to as the Mirrored Spaces Model. It was referenced that way in a 2005 journal article (Grieves 2005). In the seminal PLM book, Product Lifecycle Management: Driving the Next Generation of Lean Thinking, the conceptual model was referred to as the Information Mirroring Model (Grieves 2006).
The concept was greatly expanded in Virtually Perfect: Driving Innovative and Lean Products through Product Lifecycle Management (Grieves 2011), where the concept was still referred to as the Information Mirroring Model. However, it is here that the term Digital Twin was attached to this concept, by reference to the co-author's way of describing this model. Given the descriptiveness of the phrase Digital Twin, we have used this term for the conceptual model from that point on.
The Digital Twin has been adopted as a conceptual basis in the astronautics and aerospace area in recent years. NASA has used it in its technology roadmaps (Piascik, Vickers et al. 2010) and proposals for sustainable space exploration (Caruso, Dumbacher et al. 2010). The concept has been proposed for next-generation fighter aircraft and NASA vehicles (Tuegel, Ingraffea et al. 2011, Glaessgen and Stargel 2012)[1], along with a description of the challenges (Tuegel, Ingraffea et al. 2011) and implementation of as-builts (Cerrone, Hochhalter et al. 2014).

[1] In a comment, the Glaessgen paper attributes the origin of "Digital Twin" to DARPA without any citation. We cannot find any actual support for this claim.
Defining the Digital Twin

What would be helpful are some definitions to rely on when referring to the Digital Twin and its different manifestations. We would propose the following, as visualized in Figure 4:

Digital Twin (DT) - the Digital Twin is a set of virtual information constructs that fully describes a potential or actual physical manufactured product, from the micro atomic level to the macro geometrical level. At its optimum, any information that could be obtained from inspecting a physical manufactured product can be obtained from its Digital Twin. Digital Twins are of two types: the Digital Twin Prototype (DTP) and the Digital Twin Instance (DTI). DTs are operated on in a Digital Twin Environment (DTE).
Digital Twin Prototype (DTP) - this type of Digital Twin describes the prototypical physical artifact. It contains the informational sets necessary to describe and produce a physical version that duplicates or twins the virtual version. These informational sets include, but are not limited to: requirements, a fully annotated 3D model, a Bill of Materials (with material specifications), a Bill of Processes, a Bill of Services, and a Bill of Disposal.
Digital Twin Instance (DTI) - this type of Digital Twin describes a specific corresponding physical product, to which an individual Digital Twin remains linked throughout the life of that physical product. Depending on the use cases required for it, this type of Digital Twin may contain, but again is not limited to, the following information sets: a fully annotated 3D model with General Dimensioning and Tolerances (GD&T) that describes the geometry of the physical instance and its components; a Bill of Materials that lists current components and all past components; a Bill of Processes that lists the operations that were performed in creating this physical instance, along with the results of any measurements and tests on the instance; a Service Record that describes past services performed and components replaced; and Operational States captured from actual sensor data (current, past actual, and future predicted).
Digital Twin Aggregate (DTA) - this type of Digital Twin is the aggregation of all the DTIs. Unlike the DTI, the DTA may not be an independent data structure. It may be a computing construct that has access to all DTIs and queries them either ad hoc or proactively. On an ad hoc basis, the computing construct might ask, "What is the Mean Time Between Failures (MTBF) of component X?" Proactively, the DTA might continually examine sensor readings and correlate those sensor readings with failures to enable prognostics.
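
To make the taxonomy concrete, here is a minimal sketch in Python of how a DTP, a DTI, and a DTA might be represented, with the DTA implemented as a computing construct over the DTIs rather than an independent data structure. All class and field names here are hypothetical illustrations, not definitions from the paper.

from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class DigitalTwinPrototype:
    # DTP: informational sets needed to describe and produce the
    # physical version (hypothetical schema).
    requirements: List[str]
    annotated_3d_model: str              # reference to the model file
    bill_of_materials: Dict[str, str]    # component slot -> material spec
    bill_of_processes: List[str]
    bill_of_services: List[str]
    bill_of_disposal: List[str]

@dataclass
class DigitalTwinInstance:
    # DTI: linked to one specific physical product for its whole life.
    serial_number: str
    as_built_bom: Dict[str, str]         # component slot -> fitted part
    service_record: List[str] = field(default_factory=list)
    # Operating hours at which each component actually failed.
    component_failures: Dict[str, List[float]] = field(default_factory=dict)

class DigitalTwinAggregate:
    # DTA: not an independent data structure, just a computing
    # construct with access to all DTIs, queried ad hoc here.
    def __init__(self, instances: List[DigitalTwinInstance]):
        self.instances = instances

    def mtbf(self, component: str) -> float:
        # The ad hoc query from the text: Mean Time Between Failures
        # of one component, pooled across the whole fleet of DTIs.
        intervals: List[float] = []
        for dti in self.instances:
            prev = 0.0
            for hour in sorted(dti.component_failures.get(component, [])):
                intervals.append(hour - prev)
                prev = hour
        if not intervals:
            raise ValueError(f"no recorded failures for {component!r}")
        return sum(intervals) / len(intervals)

# Usage: pool failure data across two instances of the same product.
fleet = DigitalTwinAggregate([
    DigitalTwinInstance("SN-001", {"pump": "P-100"},
                        component_failures={"pump": [400.0, 900.0]}),
    DigitalTwinInstance("SN-002", {"pump": "P-100"},
                        component_failures={"pump": [650.0]}),
])
print(f"MTBF(pump) = {fleet.mtbf('pump'):.0f} hours")  # -> 517 hours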
Digital Twin Environment (DTE) - this is an integrated, multi-domain physics application space for operating on Digital Twins for a variety of purposes. These purposes would include:
Predictive - the Digital Twin would be used for predicting the future behavior and performance of the physical product. At the Prototype stage, the prediction would be of the behavior of the designed product with components that vary between their high and low tolerances, in order to ascertain that the as-designed product meets the proposed requirements. At the Instance stage, the prediction would be for a specific instance of a specific physical product, incorporating actual components and component history. The predictive performance would start from the current point in the product's lifecycle, at its current state, and move forward. Multiple instances of the product could be aggregated to provide a range of possible future states.
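
One plausible realization of the Prototype-stage prediction is a Monte Carlo sweep over component tolerances. The sketch below is purely illustrative: the spring-mass resonance model, the tolerance bounds, and the requirement band are all invented for the example, not taken from the paper.

import math
import random

def resonant_frequency_hz(stiffness_n_per_m: float, mass_kg: float) -> float:
    # Toy stand-in for full multi-domain physics: natural frequency
    # of a spring-mass system, f = sqrt(k/m) / (2*pi).
    return math.sqrt(stiffness_n_per_m / mass_kg) / (2 * math.pi)

# As-designed nominals with low/high manufacturing tolerances (assumed).
TOLERANCES = {
    "stiffness_n_per_m": (980.0, 1020.0),  # roughly +/- 2% around 1000
    "mass_kg": (0.245, 0.255),             # roughly +/- 2% around 0.25
}
REQUIREMENT_HZ = (9.9, 10.2)  # resonance must land inside this band

def tolerance_sweep(n_trials: int = 100_000, seed: int = 42) -> float:
    # Vary every component between its high and low tolerance and
    # estimate the fraction of as-designed units meeting the requirement.
    rng = random.Random(seed)
    passed = 0
    for _ in range(n_trials):
        sample = {k: rng.uniform(lo, hi) for k, (lo, hi) in TOLERANCES.items()}
        f = resonant_frequency_hz(**sample)
        if REQUIREMENT_HZ[0] <= f <= REQUIREMENT_HZ[1]:
            passed += 1
    return passed / n_trials

print(f"predicted requirement compliance: {tolerance_sweep():.1%}")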
Interrogative - this would apply to DTIs, as the realization of the DTA. Digital Twin Instances could be interrogated for their current and past histories. Irrespective of where their physical counterpart resided in the world, individual instances could be interrogated for their current system state: fuel amount, throttle settings, geographical location, structural stress, or any other characteristic that was instrumented. Multiple instances of products would provide data that would be correlated for predicting future states. For example, correlating component sensor readings with subsequent failures of that component would result in an alert of possible component failure being generated when that sensor pattern was reported. The aggregate of actual failures could provide Bayesian probabilities for predictive uses.
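
The Bayesian step can be made concrete with a small worked example. The numbers below are invented; the assumption is that the fleet of DTIs has let us count how often a given sensor pattern preceded a component failure and how often it appeared on healthy units.

# Fleet-wide rates harvested from DTIs (invented numbers):
p_failure = 0.02                 # P(failure soon): base rate across the fleet
p_pattern_given_failure = 0.90   # P(pattern | failure): pattern precedes most failures
p_pattern_given_healthy = 0.05   # P(pattern | healthy): occasional false alarm

# Total probability of the pattern being reported at all.
p_pattern = (p_pattern_given_failure * p_failure
             + p_pattern_given_healthy * (1 - p_failure))

# Bayes' rule: probability of impending failure once the pattern is
# reported, which is what decides whether to raise the alert.
p_failure_given_pattern = p_pattern_given_failure * p_failure / p_pattern
print(f"P(failure | pattern) = {p_failure_given_pattern:.1%}")  # ~26.9%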
The Digital Twin Model throughout the Lifecycle
As indicated by the 2002 slide in Figure 3, the reference to PLM indicated that this conceptual model was and is intended to be a dynamic model that changes over the lifecycle of the system. The system emerges virtually at the beginning of its lifecycle, takes physical form in the production phase, continues through its operational life, and is eventually retired and disposed of.
In the create phase, the physical system does not yet exist. The system starts to take shape in virtual space as a Digital Twin Prototype (DTP). This is not a new phenomenon. For most of human history, the virtual space where this system was created existed only in people's minds. It is only in the last quarter of the 20th century that this virtual space could exist within the digital space of computers.
This opened up an entirely new way of system creation. Prior to this leap in technology, the system would have to have been implemented in physical form, initially in sketches and blueprints but shortly thereafter made into costly prototypes, because simply existing in people's minds meant very limited group sharing and understanding of both form and behavior.
In addition, while human minds are a marvel, they have severe limitations for tasks like these. The fidelity and permanence of our human memory leaves a great deal to be desired. Our ability to create and maintain detailed information in our memories over a long period of time is not very good. Even for simple objects, asking us to accurately visualize their shape is a task that most of us would be hard-pressed to do with any precision. Ask most of us to spatially manipulate complex shapes, and the results would be hopelessly inadequate.
However, the exponential advances in digital technologies mean that the form of the system can be fully and richly modeled in three dimensions. In the past, emergent form in complex and even complicated systems was a problem, because it was very difficult to ensure that all the 2D diagrams fit together when translated into 3D objects.
In addition, where parts of the system move, understanding conflicts and clashes ranged from difficult to impossible. There was substantial wasted time and cost in translating 2D blueprints into 3D physical models, uncovering form problems, and going back to the 2D blueprints to resolve the problems and begin the cycle anew.
With 3D models, the entire system can be brought together in virtual space, and the conflicts and clashes discovered cheaply and quickly. It is only once these issues of form have been resolved that the translation to physical models needs to occur.
While uncovering emergent form issues is a tremendous improvement over the iterative and costly path from two-dimensional blueprints to physical models, the ability to simulate the behavior of the system in digital form is a quantum leap in discovering and understanding emergent behavior. System creators can now test and understand how their systems will behave under a wide variety of environments, using virtual space and simulation.
Also as shown in Figure 3, the ability to have multiple virtual spaces, as indicated by the blocks labeled VS1…VSn, meant that the system could be put through destructive tests inexpensively. When physical prototypes were the only means of testing, a destructive test meant the end of that costly prototype and potentially its environment. A physical rocket that blows up on the launch pad destroys both the rocket and the launch pad, the cost of which is enormous. The virtual rocket only blows up the virtual rocket and the virtual launch pad, which can be recreated in a new virtual space at close to zero cost.
The create phase is the phase in which we do the bulk of the work in filling in the system's four emergent areas: PD, PU, UD, and UU (predicted desirable, predicted undesirable, unpredicted desirable, and unpredicted undesirable). While the traditional emphasis has been on verifying and validating the requirements, or the predicted desirable (PD), and eliminating the problems and failures, or the predicted undesirable (PU), the DTP model is also an opportunity to identify and eliminate the unpredicted undesirable (UU). By varying simulation parameters across the possible range they can take, we can investigate the non-linear behavior in complex systems that may have combinations or discontinuities that lead to catastrophic problems.
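
A sketch of that kind of parameter exploration follows. Everything in it is invented for illustration, including the toy response function whose hidden interaction produces the discontinuous, catastrophic jump we are hunting for.

import random

def simulated_response(load: float, temperature: float) -> float:
    # Toy stand-in for a full system simulation: mildly non-linear,
    # with a deliberate hidden interaction past a combined threshold.
    r = 0.8 * load + 0.002 * temperature ** 1.5
    if load > 0.9 and temperature > 80.0:  # the hidden combination
        r *= 10.0                           # catastrophic jump
    return r

SAFE_LIMIT = 5.0
RANGES = {"load": (0.0, 1.0), "temperature": (20.0, 100.0)}

def hunt_for_uus(n_trials: int = 50_000, seed: int = 7) -> list:
    # Vary the simulation parameters across their full ranges and
    # collect the combinations producing unpredicted undesirable behavior.
    rng = random.Random(seed)
    failures = []
    for _ in range(n_trials):
        params = {k: rng.uniform(lo, hi) for k, (lo, hi) in RANGES.items()}
        if simulated_response(**params) > SAFE_LIMIT:
            failures.append(params)
    return failures

uus = hunt_for_uus()
print(f"{len(uus)} catastrophic combinations found in 50,000 virtual tests")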
Once the virtual system is completed and validated, the information is used in real space to create a physical twin. If we have done our modeling and simulation correctly, meaning we have accurately modeled and simulated the real world in virtual space over a range of possibilities, we should have dramatically reduced the number of UUs.
This is not to say that we can model and simulate all possibilities. Because of all the possible permutations and combinations in a complex system, exploring all possibilities may not be feasible in the time allowed. However, the exponential advances in computing capability mean that we can keep expanding the possibilities that we can examine.
It is in this create phase that we can attempt to mitigate or eradicate the major source of UUs: those caused by human interaction. We can test the virtual system under a wide variety of conditions with a wide variety of human actors. System designers often do not allow for conditions that they cannot conceive of occurring. No one would think of interacting with a system in such a way, until people actually do just that in moments of panic in a crisis.
Before this ability to simulate our systems, we often tested systems using the most competent and experienced personnel, because we could not afford expensive failures of physical prototypes. But most systems are operated by a relatively wide range of personnel. There is an old joke that goes, "What do they call the medical student who graduates at the bottom of his or her class?" Answer: "Doctor." We can now afford to virtually test systems with a diversity of personnel, including the least qualified personnel, because virtual failures are not only inexpensive, but they point out UUs that we have not considered.
We next move into the production phase of the lifecycle. Here we start to build physical systems with specific and potentially unique configurations. We need to reflect these configurations, the as-builts, as a DTI in virtual space, so that we can have knowledge of the exact specifications and makeup of these systems without having to be in possession of the physical systems.
So in terms of the Digital Twin, the flow goes in the opposite direction from the create phase. The physical system is built. The data about that physical build is sent to virtual space, and a virtual representation of that exact physical system is created in digital space.
In the support/sustain phase, we find out whether our predictions about the system's behavior were accurate. The real and virtual systems maintain their linkage. Changes to the real system occur in both form, i.e., replacement parts, and behavior, i.e., state changes. It is during this phase that we find out whether our predicted desirable performance actually occurs and whether we eliminated the predicted undesirable behaviors.
This is also the phase when we see those nasty unpredicted undesirable behaviors. If we have done a good job of ferreting out UUs in the create phase with modeling and simulation, then these UUs will be annoyances and will cause only minor problems. However, as has often been the case in complex systems in the past, these UUs can be major and costly problems to resolve. In extreme cases, these UUs can be catastrophic failures with loss of life and property.
In this phase the linkage between the real system and the virtual system goes both ways. As the physical system undergoes changes, we capture those changes in the virtual system, so that we know the exact configuration of each system in use. On the other side, we can use the information from our virtual systems to predict performance and failures of the physical systems. We can aggregate information over a range of systems to correlate specific state changes with a high probability of future failures.
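
A minimal sketch of that two-way linkage, reusing the hypothetical DigitalTwinInstance and DigitalTwinAggregate shapes from the earlier taxonomy sketch (the event format and the 80%-of-MTBF threshold are likewise invented):

def record_part_replacement(dti, slot: str, new_part: str, hour: float) -> None:
    # Physical -> virtual: capture a form change so the virtual
    # configuration always matches the fielded system.
    old_part = dti.as_built_bom.get(slot)
    dti.as_built_bom[slot] = new_part
    dti.service_record.append(f"h{hour:.0f}: replaced {slot} {old_part} -> {new_part}")

def maintenance_advice(dti, component: str, fleet_mtbf_hours: float,
                       hours_in_service: float) -> str:
    # Virtual -> physical: use fleet-level knowledge (the DTA's MTBF)
    # to advise action on this one specific physical system.
    if hours_in_service > 0.8 * fleet_mtbf_hours:
        return f"schedule inspection of {component} on {dti.serial_number}"
    return "no action required"

# Usage, continuing the earlier fleet example:
sn_001 = fleet.instances[0]
record_part_replacement(sn_001, "pump", "P-100-rev2", hour=905.0)
print(maintenance_advice(sn_001, "pump", fleet.mtbf("pump"), hours_in_service=450.0))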
As mentioned before, the final phase, disposal/decommissioning, is often ignored as an actual phase. There are two reasons, in the context of this topic, why the disposal phase should receive closer attention. The first is that knowledge about a system's behavior is often lost when the system is retired. The next generation of the system often has similar problems that could have been avoided by using knowledge about the predecessor system. While the physical system may need to be retired, the information about it can be retained at little cost.
Second, while the topic at hand is the emergent behavior of the system as it is in use, there is also the issue of the emergent impact of the system on the environment upon disposal. Without maintaining the design information about what material is in the system and how it is to be disposed of properly, the system may be disposed of in a haphazard and improper way.
References:
Caruso, P., D. Dumbacher and M. Grieves (2010). Product Lifecycle Management and the Quest for Sustainable Space Explorations. AIAA SPACE 2010 Conference & Exposition. Anaheim, CA.
Cerrone, A., J. Hochhalter, G. Heber and A. Ingraffea (2014). "On the Effects of Modeling As-Manufactured Geometry: Toward Digital Twin." International Journal of Aerospace Engineering 2014.
Glaessgen, E. H. and D. Stargel (2012). The Digital Twin Paradigm for Future NASA and US Air Force Vehicles. AIAA 53rd Structures, Structural Dynamics, and Materials Conference, Honolulu, Hawaii.
Grieves, M. (2005). "Product Lifecycle Management: The New Paradigm for Enterprises." Int. J. Product Development 2(Nos. 1/2): 71-84.
Grieves, M. (2006). Product Lifecycle Management: Driving the Next Generation of Lean Thinking. New York, McGraw-Hill.
Grieves, M. (2011). Virtually Perfect: Driving Innovative and Lean Products through Product Lifecycle Management. Cocoa Beach, FL, Space Coast Press.
Piascik, R., J. Vickers, D. Lowry, S. Scotti, J. Stewart and A. Calomino (2010). Technology Area 12: Materials, Structures, Mechanical Systems, and Manufacturing Road Map. NASA Office of Chief Technologist.
Tuegel, E. J., A. R. Ingraffea, T. G. Eason and S. M. Spottswood (2011). "Reengineering Aircraft Structural Life Prediction Using a Digital Twin." International Journal of Aerospace Engineering 2011.