37. High Throughput Design Pattern
• Do samples arrive continuously or at a
high periodic rate?
• Is the transport saturated?
38. High Throughput Periodic Data
• RTI Connext…
– Sends synchronously by default
– Supports batching for high periodic rates
– Supports multiple reliability paradigms
– Supports receive processing in receive thread
or application thread
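The batching idea on this slide can be sketched in plain Python. This is a hypothetical illustration of the write-side pattern only; in RTI Connext, batching is actually enabled declaratively through the DataWriter's Batch QoS policy, not with code like this:

```python
class BatchingWriter:
    """Sketch of write-side batching for high periodic rates: samples are
    buffered and handed to the transport as one payload once the batch
    fills (illustrative only, not the RTI Connext API)."""

    def __init__(self, transport_send, max_samples=8):
        self.transport_send = transport_send  # callable taking a list of samples
        self.max_samples = max_samples
        self.buffer = []

    def write(self, sample):
        self.buffer.append(sample)
        if len(self.buffer) >= self.max_samples:
            self.flush()

    def flush(self):
        if self.buffer:
            self.transport_send(list(self.buffer))  # one send, many samples
            self.buffer.clear()
```

A real implementation would also flush on a timeout so the last partial batch is not held indefinitely; RTI's Batch QoS exposes knobs of that kind.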
39. High Throughput over
Constrained Network
• RTI Connext…
– Supports configurable MTU sizes
– Supports batching in a manner that reduces
protocol header overhead
– Supports a Low-Bandwidth network plugin
with header and data compression
– Supports a “multi-channel” feature to send
data over different NICs as a function of data
content
40. Reliable High Throughput
• Lots to consider
– Writer must keep data for potential retransmission
– Latency unpredictability
– Readers must behave
– Design for desired behavior if data lost…
• Declare failure and stop
• Report error and keep going
• Delay writing for readers to catch up
• Do nothing
• …
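The first two bullets above can be made concrete with a small sketch: a reliable writer must hold every sample it has sent until readers acknowledge it, so it can retransmit on a negative acknowledgment, and a bounded history forces a policy choice when slow readers fall behind. This is a hypothetical illustration of the mechanism, not the RTI Connext API:

```python
class ReliableWriter:
    """Sketch of reliable-writer bookkeeping: sent samples stay in a
    bounded history until acknowledged, available for retransmission
    on a reader NACK (illustrative only, not the RTI Connext API)."""

    def __init__(self, history_depth=16):
        self.history_depth = history_depth
        self.history = {}   # sequence number -> sample awaiting ack
        self.next_seq = 0

    def write(self, sample):
        if len(self.history) >= self.history_depth:
            # Policy point from the slide: block, drop, or report an error.
            raise BufferError("history full: readers have not caught up")
        self.history[self.next_seq] = sample
        self.next_seq += 1
        return self.next_seq - 1   # sequence number put on the wire

    def on_ack(self, seq):
        self.history.pop(seq, None)    # acknowledged: safe to reclaim

    def on_nack(self, seq):
        return self.history.get(seq)   # sample to retransmit, if still held
```

The `BufferError` branch is where the slide's design choices land: declaring failure, reporting and continuing, or delaying writes until readers catch up are all policies applied at that point.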
49. Start using DDS Today!
Download the FREE complete RTI Connext
DDS Pro package for Windows and Linux:
• Leading implementation of DDS
• C, C++, C#/.NET and Java APIs
• Tools to monitor, debug, test, visualize and
prototype distributed applications and systems
• Adapters to integrate with existing applications and
IT systems
Editor's Notes
Interoperability and Open Architecture
Current practice but…
What is it really?
Why is it hard?
This illustration was originally created by the U.S. Navy, who chose NDDS for its Open Architecture Computing Environment. Of all the software, only NDDS covers the full range of performance requirements from non-real-time to extreme real-time.
Non-real time: business layer apps
Soft real time: C2, display & decision support
Hard real time: e.g. sensor & weapon control
Extreme real-time: e.g. signal processing
A data-centric integration solution to achieve semantic interoperability is important and achievable.
It is important because… one of the few things I can guarantee in a SOS is that it will change. At some point, it will. And when that change happens, rather than have your system be broken by it, why not survive it? Because data is what the systems operate on, if I architect my data in a rigorous and formal manner, then any changes in the system are easily accommodated: they manifest as changes to the information present in the SOS. If the changes are made in a rigorous and repeatable way, then by knowing the rules of formation and the abstract data model from which all things in the SOS derive, I can simply transform the data and understand it if need be. The data will have meaning. It will have context. It will be usable and understood.
Letting your system be broken by something that is inevitable seems a bit silly, especially since we can anticipate that change and accommodate it by making some intelligent architecture and design decisions upfront.
Here we can see legacy, future and current systems – which is a reality – they can technically interoperate via a protocol using a common infrastructure. We know how to do that. They can syntactically interoperate by using a common data structure. But how do we accommodate the systems where we can’t change the interfaces? When they are incompatible? We need a mediation component. Achieving semantic interoperability relies on components such as this, especially since one of our requirements was that we must accommodate change without being broken by it (i.e., without being forced to make changes to existing interfaces).
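A mediation component of the kind described here can be pictured as a translator between two data representations. The schemas and field names below are invented for illustration; the point is only that mediation is a mapping, not a change to either endpoint's interface:

```python
# Sketch of a mediation component: it bridges a legacy system and a
# current system whose track messages disagree on field names and
# units (both schemas are hypothetical).

def mediate_track(legacy_msg):
    """Translate a legacy track record into the current system's schema.
    Field names and the feet->meters conversion are assumptions for
    illustration only."""
    return {
        "track_id": legacy_msg["id"],
        "altitude_m": legacy_msg["alt_ft"] * 0.3048,  # feet -> meters
        "source": "legacy",
    }
```

Because the translation rules live in one component, a change to the legacy interface means updating the mediator, not every consumer downstream.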
The TSS includes a mapping of the TS API to DDS, per the FACE Technical Standard.
Standard and open interfaces:
TS API
RTI Transport API (called NETIO)
DDS-RTPS wire protocol
FACE OS security profile
Internally: DDS API
A model is anything used in any way to represent something else. We use models to observe the effect on manipulating the original, without actually having to manipulate it. A really good model will capture all of the details we need to manipulate the original, and no more.
On the left we have a picture of an actual 1967 Ford Mustang GT, and on the right a model of that same car. Let’s say you have a child that is going through a phase where they’re really into cars. And this child wants nothing more than what his dad has – a 1967 Ford Mustang GT. Now, I love my kids and I want to give them everything just so I could see what marvelous things they’d do with it. However, I am not about to hand over the key to a car to my toddler. I would give them a scaled, fit-for-purpose version, such as the model toy on the right. It has very little in the way of extras, but it is entirely sufficient AND safe to entertain my toddler.
A data model is a representation that describes the data about the things that exist in your domain.
If you have a system – since systems operate on data – well, then you have a data model. If you’re a system integrator, you deal with data models during your integration activities. Data models come in many different representations, they express many different things in varying degrees of explicitness. Some data models capture information very unambiguously and others don’t. But no matter where your data model falls on the spectrum, you can work with it to make it better.
Data models come in many flavors, and they’re not all equal. Which is best for you is going to depend on your system’s requirements and the function of the system or component that will use that data. Here we have three examples of models; many people have some familiarity with at least two of them.
The dictionary is a list of terms for a particular domain of knowledge. It contains a list of terms, as well as the definitions and pronunciations for those terms. Using a representation such as this, words alongside their meaning, we can communicate about the things that exist in our domain and the meaning of those expressions, the words, is understood to those who use the same dictionary.
The Linnaean taxonomy is an example of a hierarchical data model - it shows us the conception, naming, and classification of organism groups. It represents information in a hierarchical format, such as a classification or categorization schema. Using a representation structure such as this, I can express that “this” is one of “those”.
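The "this is one of those" relationship a hierarchical model expresses can be sketched as a parent map that "is-a" questions walk upward. The toy taxonomy below uses a handful of real Linnaean ranks purely as sample data:

```python
# Sketch of a hierarchical data model: each name maps to its parent
# classification, so "is-a" questions walk up the tree (toy taxonomy).

PARENT = {
    "Animalia": None,
    "Chordata": "Animalia",
    "Mammalia": "Chordata",
    "Homo sapiens": "Mammalia",
}

def is_a(name, category):
    """True if `name` falls under `category` in the hierarchy."""
    while name is not None:
        if name == category:
            return True
        name = PARENT.get(name)
    return False
```

Note what the structure can and cannot say: `is_a("Homo sapiens", "Chordata")` holds, but nothing in a pure hierarchy expresses relationships between siblings, which is exactly the limitation the periodic-table example illustrates next.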
The last example is the periodic table of elements. From this we can tell that gold has a weight and a certain number of protons… but I can’t tell whether two elements will bond, or what they would form if they did, simply by looking at this table.
Per our requirements, we define a good data model to be one that captures, among other things, the semantics, or meaning, of the things it represents in an unambiguous way.
The process by which you generate a data model is something you need to consider, especially if you need a data model that helps you meet your key non-functional requirements.