Digital twins constitute virtual representations of physically existing systems. However, their inherent complexity makes them difficult to develop and prove correct. In this paper, we explore the use of UML and OCL, complemented with an executable language, SOIL, to build and test digital twins at a high level of abstraction. We also show how to realize the bidirectional connection between the UML models of the digital twin in the USE tool with the physical twin, using an architectural framework centered on a data lake. We have built a prototype of the framework to demonstrate our ideas, and validated it by developing a digital twin of a Lego Mindstorms car. The results allow us to show some interesting advantages of using high-level UML models to specify virtual twins, such as simulation, property checking, and some other types of tests.
1. Using UML and OCL Models to realize High-Level Digital Twins
Paula Muñoz
Universidad de Málaga, Spain
Javier Troya
Universidad de Málaga, Spain
Antonio Vallecillo
Universidad de Málaga, Spain
2. What is a Digital Twin?
[Diagram: the Digital Twin (DT) and the Physical Twin (PT), connected by Data and Control flows]
A Digital Twin is a comprehensive digital representation of an actual system,
service or product (the Physical Twin), synchronized at a specified frequency and
fidelity [1]
[1] Digital Twin Consortium, “Glossary of digital twins,” 2021. [Online]. Available: https://www.digitaltwinconsortium.org/glossary/index.htm
3. Our approach
Digital Twins are complex software systems, since they need to emulate the actual physical system faithfully.
Our approach: raise the level of abstraction by using software models during their development.
5. A framework for defining and deploying DTs
[Diagram: example interactions — control commands (moveForward(2), rotate(45), stop()), sensor readings ({ distance: 3, lightValue: 20, isPressed: 0 }), and pose data ({ X: 10, Y: 20, angle: 20 })]
6. A framework for defining and deploying DTs
❑ It connects the PT and the DT
❑ It implements the Blackboard architectural pattern
❑ Drivers transform the data into the formats that each component understands
❑ Service components implement additional functionality for the system (dashboards, AI algorithms)
❑ Analysis components implement different types of tests on the physical entity, the twin, or even on service components
12. Summary and conclusions
In our contribution, we show how UML and OCL models can be used to specify DTs and to verify their expected behavior in the early stages of development.
Advantages
❑ The framework allows replacing the high-level models with lower-level implementations
❑ The DT model can be specified at the needed fidelity level depending on the type of analysis that we want
to perform
❑ It also makes it possible to analyze and validate any part of the software independently
Future work
❑ Validate the proposal with further physical systems
❑ Create more analysis and services modules
❑ Evaluate the framework's performance under stress conditions to determine its scalability and applicability to larger systems
Editor's Notes
Currently, there are different concepts of Digital Twin in the literature.
The definition in which we base our work is the one given by the Digital Twin Consortium.
In general, a Digital Twin is a digital representation of an existing system, service, or product.
In this definition, it is assumed synchronization between this replica and the actual system.
Engineering DT systems, and in particular testing them, is challenging:
The Digital twin is usually quite complex since it needs to emulate the actual system faithfully.
We tackle this challenge by raising the level of abstraction using software models.
These models can reflect the structure, physics, and behavior of the system in a simplified yet faithful way.
Simulations performed with these models have lower computational costs, since the models only capture the dimensions required for the analysis.
In our work, we connect high-level UML models of a DT with the existing system, enabling the exchange of information between the twins.
Additionally, the proposed architecture is modular so that once the lightweight models are validated, and the system is developed, the models can be substituted by the actual implementation.
To demonstrate our proposal and to initially validate it, we used a Lego Mindstorms car.
The car is equipped with three different kinds of sensors:
Touch sensor to detect collisions
Ultrasonic sensor to measure distances
Light sensor able to differentiate colors
Additionally, it comes with a pose provider able to tell its position in planar coordinates
The car downloads from the computer the software that drives it, which defines different behaviors.
For example, the behavior we defined for our tests is a Line Follower: the car follows a black line on the floor.
If the line disappears, it keeps spinning until it finds it again.
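The line-follower logic described above can be sketched as a simple decision step. This is a hypothetical Python sketch (the threshold value and function names are illustrative; the real behavior runs on the Lego car and is specified in SOIL):

```python
# Hypothetical sketch of the line-follower behavior: readings below the
# threshold mean the light sensor is over the black line.
BLACK_THRESHOLD = 30  # illustrative value, not taken from the paper

def line_follower_step(light_value):
    """Decide the next action from one light-sensor reading."""
    if light_value < BLACK_THRESHOLD:
        return "moveForward"   # still on the black line
    else:
        return "rotate"        # line lost: keep spinning until found again
```

A dark reading keeps the car moving forward; a bright one makes it rotate in search of the line.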
In general, physical entities interact with their environment through a set of inputs and outputs.
The values of the inputs are the ones generated by the sensors, which detect changes in their environment.
The values of the outputs are the reactions that the physical entity has to input events.
Additionally, they may accept external commands to control their behavior.
In our architecture, these are the interactions sent and received by the physical twin, and since the digital twin is its replica, they are the same for it.
Data Lake
The communication between both twins is achieved through a data lake.
It is a flexible, scalable data storage and management system.
It implements a Blackboard architectural pattern, meaning that any element may write or read information from it in an asynchronous manner.
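The Blackboard idea can be illustrated with a minimal in-memory stand-in (the class and method names are illustrative; the actual data lake is a Redis store):

```python
# Minimal in-memory stand-in for the data lake's Blackboard pattern:
# any component may write or read entries by key, independently of the others.
class Blackboard:
    def __init__(self):
        self._store = {}

    def write(self, key, value):
        self._store[key] = value

    def read(self, key):
        return self._store.get(key)

# Example: the physical twin posts a sensor reading; the digital twin reads it.
dl = Blackboard()
dl.write("PTCarSnapshot", {"distance": 3, "lightValue": 20, "isPressed": 0})
assert dl.read("PTCarSnapshot")["lightValue"] == 20
```

Neither writer nor reader needs to know about the other; they only share the repository, which is the essence of the pattern.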
Drivers
The different components access the DL using drivers, which transform the data into the formats that each component understands (shown as the white rectangles in the diagram).
Service components
Service components implement additional functionality that a digital twin system can provide.
Dashboards that dynamically visualize the data
Algorithms that learn from the past movements of the car and its collisions, and draw floor maps to avoid the obstacles
Analysis
Analysis components implement different types of tests on the physical entity, the digital twin, or even on service components.
Compare traces.
This is the model of the Lego Car.
It includes the three sensors with their respective values.
The motor, which moves the car and rotates it at a certain angle.
We also have the planar coordinates of the car and its heading angle.
Additionally, we include the car identifier and the execution identifier (to distinguish between simulations).
We have a clock object to simulate the passage of time
When the clock ticks, every active object executes its action function.
For example, the car would execute the executeBehavior of the behavior it implements.
The behavior is defined using the imperative language SOIL
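The clock-driven scheme above can be sketched as follows. This is an illustrative Python analogue of the UML/SOIL model (class and method names such as `Clock`, `Car`, and `action` follow the description, but the sketch is not the model itself):

```python
# Sketch of the clock-driven simulation: on every tick, each active
# object runs its action(); the car's action delegates to its behavior
# (the analogue of executeBehavior in the model).
class Clock:
    def __init__(self):
        self.now = 0
        self.observers = []

    def tick(self):
        self.now += 1
        for obj in self.observers:
            obj.action()

class Car:
    def __init__(self, behavior):
        self.behavior = behavior
        self.log = []

    def action(self):
        self.log.append(self.behavior())

clock = Clock()
car = Car(behavior=lambda: "moveForward")
clock.observers.append(car)
clock.tick()
clock.tick()
# car.log is now ["moveForward", "moveForward"]
```

Each tick advances simulated time by one step and gives every active object the chance to react.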
In our running example, the model of the information used by the digital twin is the following:
The input snapshots represent the information that comes from the physical twin.
For every input snapshot, the digital twin will simulate how the physical twin should react to a given input reading of the environment.
Every reaction will be an output car snapshot.
The Data Lake repository is implemented using Redis, which is a lightweight open-source key-value database.
We use maps to store the input and output snapshots.
The key for the maps will be composed of different elements separated by semicolons to allow more straightforward queries.
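The key schema can be sketched like this (the key parts chosen below, such as twin id, execution id, and timestamp, are illustrative; the paper defines its own composition, and a plain dict stands in for the Redis hash):

```python
# Hypothetical sketch of the snapshot key schema: parts are joined with a
# separator so that prefix-based queries over the key space stay simple.
SEP = ";"

def snapshot_key(twin_id, execution_id, timestamp):
    return SEP.join([twin_id, execution_id, str(timestamp)])

# Each snapshot is stored as a map under that composed key
# (in Redis this would be a hash, e.g. via HSET).
store = {}
store[snapshot_key("car1", "exec7", 42)] = {"X": 10, "Y": 20, "angle": 20}
assert "car1;exec7;42" in store
```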
The Physical Twin
We connect the Physical Twin and the Data Lake using a publish-subscribe service and a socket since the car does not allow external Java libraries.
Connecting UML Models to Data Lake
We have created a USE plugin using its API to connect the UML Models in USE to the Data Lake.
This plugin allows creating the input snapshots using the information stored in the data lake.
Whenever the driver detects new relevant information in the data lake, it creates a new input snapshot.
These snapshots are stored later in the DL.
The same driver in the plugin detects new output snapshots. It takes the information, stores it in the data lake, and removes the snapshot.
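The driver's "detect, store, remove" cycle can be sketched with two in-memory dicts standing in for the data lake (the function name and key format are illustrative; the real driver is a USE plugin written against the USE API):

```python
# Sketch of the driver loop described above: it scans for new output
# snapshots, copies each one into the data lake archive, and removes
# the original so it is not processed twice.
def drain_output_snapshots(pending, archive):
    """Move every pending output snapshot into the archive."""
    for key in list(pending):          # list() allows removal while iterating
        archive[key] = pending.pop(key)

pending = {"out;car1;1": {"X": 10, "Y": 20, "angle": 20}}
archive = {}
drain_output_snapshots(pending, archive)
# pending is now empty; archive holds the snapshot
```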
In the paper, we also proposed an example of an Analysis component:
The class Monitor compares the snapshots produced by the digital and physical car.
Each time a mismatch is detected between two snapshots, it is recorded in a list of elements of type Mismatch.
We compare these snapshots with a certain precision. Given that we are dealing with physical entities, it is not realistic that the values are identical.
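The tolerance-based comparison can be sketched as follows (the attribute names and the tolerance value are illustrative; the paper's Monitor class defines its own precision):

```python
# Sketch of the Monitor's comparison: two readings count as equal when
# they differ by at most eps, since physical measurements are never exact.
def find_mismatches(pt_snapshot, dt_snapshot, eps=0.5):
    """Return the attributes whose values differ by more than eps."""
    mismatches = []
    for attr in pt_snapshot:
        if abs(pt_snapshot[attr] - dt_snapshot[attr]) > eps:
            mismatches.append(attr)
    return mismatches

pt = {"X": 10.0, "Y": 20.0, "angle": 20.0}
dt = {"X": 10.2, "Y": 21.5, "angle": 20.1}
# only "Y" exceeds the tolerance
assert find_mismatches(pt, dt) == ["Y"]
```

Without the tolerance, nearly every pair of physical/digital snapshots would be flagged, drowning real deviations in noise.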
The framework discussed is suitable not only for high-level models but also for lower-level implementations since the models could be replaced at any given time without effort.
With our approach, the DT model can be specified at the needed fidelity level depending on the type of analysis we want to perform, reducing development efforts.
It also allows analyzing and validating any part of the software independently, such as the behavioral logic.
We are aware that full-fledged DTs require much more complex implementations than those described here.
However, as software artifacts, we claim that DTs require appropriate software engineering methods and practices, including rigorous specification, validation, and verification processes.
In our contribution, we show how it is possible to use UML and OCL models for the specification of DTs to verify their expected behavior in the early stages of development, with very low computational costs and development efforts.
Validate the proposal with a more detailed analysis of other physical systems
Create other analysis and services modules for validation
Evaluate the framework's performance under stress conditions to determine its scalability and applicability to larger systems