"IoT, solution architecture and how to enrich it with Confluent"
Presentation given by Nelo Puchades Gascón at the Confluent Streaming Series Madrid, 05.11.2019.
13. Confluent Streaming Event
Copyright: mimacom ag, 2019. All rights reserved.
SOLUTION
Data Connector
Enterprise Applications
Advanced Analytics
Edge Fog
16. Confluent Streaming Event
SOLUTION
Data Connector
Enterprise Applications
Advanced Analytics
Governance
DevOps
Edge Fog
17. Confluent Streaming Event
SOLUTION – TECHNOLOGIES
Data Connector
Enterprise Applications
Advanced Analytics
Governance
DevOps
Edge Fog
20. Confluent Streaming Event
CONFLUENT ENRICHMENT
Edge Fog
Data Connector
Enterprise Applications
Advanced Analytics
Governance
DevOps
Smart Data
Shadowing / Digital Twin
Quick introduction to myself:
Working at mimacom as an IT Solutions Architect
Technology lover, always looking for new challenges
Device connectivity has existed for a long time; it was previously known as M2M (machine-to-machine).
IoT was coined as a term in 1999, but it did not gain traction until around 2010, and it was consolidated in 2014 after Google's acquisition of Nest.
Many companies already have connected devices and process their data.
So what changes require us to evolve into IoT?
First of all, the number of devices we need to connect and the amount of data to process
There are ever more devices we can connect to our systems in order to improve our business
For example:
With artificial vision, we can identify the profile of the customers in our physical shop, in order to create offers tailored to the customers present
Thanks to data ingestion, we can predict maintenance and reduce downtime in our industry
The second change is the urgency with which people need information
At this point, it is not enough to collect data, process it overnight and hand it to the business the next day. Our customers require us to process data immediately, from a variety of sources.
Last but not least, we now have the capability to manage thousands, even millions, of devices and to store all the information they generate.
So, to satisfy business needs and stay agile when our customers bring us new requirements, we need an IoT platform to grow our business, covering:
All kinds of protocols (TCP, UDP, MQTT, HTTP, polling for data, …)
Existing solutions and Enterprise integrations
Existing backend applications
Innovation
Scalability
IoT Platforms need to adapt to the number of connected devices
Cost effectiveness
Governance
Data normalization
Monitoring
Monetization
Security concerns
Avoid unauthorized access
e.g., the case of a city whose streetlights were all switched off by a hacker
Avoid data extraction from devices
Consider device restrictions (certificates, firmware, …) and the cybersecurity policy
Edge and fog
Near the devices, connecting one device (edge) or multiple devices (fog)
For offline requirements
To help preprocess data
To actuate on devices automatically
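As an illustration of the preprocessing role of an edge/fog node, here is a minimal Python sketch (all names, such as EdgeBuffer, are made up for the example): instead of forwarding every raw reading upstream, the node aggregates a window of samples and only sends a compact summary.

```python
# Hypothetical edge-side preprocessing sketch: aggregate a window of raw
# readings and forward only a summary upstream. Names are illustrative.

from statistics import mean

class EdgeBuffer:
    """Buffers raw sensor readings and emits windowed summaries."""

    def __init__(self, window_size: int):
        self.window_size = window_size
        self.samples: list[float] = []

    def add(self, value: float):
        """Add a raw reading; return a summary dict once the window fills."""
        self.samples.append(value)
        if len(self.samples) >= self.window_size:
            summary = {
                "count": len(self.samples),
                "min": min(self.samples),
                "max": max(self.samples),
                "avg": mean(self.samples),
            }
            self.samples.clear()
            return summary          # this is what gets sent upstream
        return None                 # keep buffering, nothing sent


# Example: ten raw temperature readings collapse into two upstream messages.
buffer = EdgeBuffer(window_size=5)
sent = [s for v in [20, 21, 19, 22, 20, 30, 31, 29, 32, 30] if (s := buffer.add(v))]
```

This is also where offline operation helps: the buffer keeps accumulating locally even if the uplink is temporarily unavailable.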
Data Connector
Multiprotocol capability
Scalability
Flexibility
Processes individual messages
Bidirectional connection to devices
Connects the enterprise to the devices
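To make the "multiprotocol, bidirectional" idea concrete, here is an illustrative Python sketch (DataConnector and every other name are hypothetical): inbound messages from different protocols are normalized into one common envelope, and commands can be pushed back to registered devices.

```python
# Illustrative data-connector sketch: normalize inbound messages from several
# protocols into one envelope, plus a downlink back to devices. All names are
# made up for the example.

import json

class DataConnector:
    def __init__(self):
        self._decoders = {}     # protocol name -> decoder function
        self._devices = {}      # device id -> send-callback (downlink)

    def register_protocol(self, name, decoder):
        self._decoders[name] = decoder

    def register_device(self, device_id, send_callback):
        self._devices[device_id] = send_callback

    def ingest(self, protocol, raw):
        """Normalize an inbound message into a common envelope."""
        payload = self._decoders[protocol](raw)
        return {"protocol": protocol, "device": payload["id"], "data": payload}

    def send_command(self, device_id, command):
        """Bidirectional: push a command back to a registered device."""
        self._devices[device_id](command)


connector = DataConnector()
# An MQTT-style JSON payload and a bare-bones CSV payload, decoded differently:
connector.register_protocol("mqtt", json.loads)
connector.register_protocol("csv", lambda raw: dict(zip(["id", "temp"], raw.split(","))))

env1 = connector.ingest("mqtt", '{"id": "pump-1", "temp": 71}')
env2 = connector.ingest("csv", "valve-7,22")

received = []
connector.register_device("pump-1", received.append)
connector.send_command("pump-1", {"action": "stop"})
```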
Enterprise Applications
On-prem applications
Custom applications & packages
Cloud applications & SaaS
EIP systems
Advanced Analytics
Data lake
Reporting
Processing
External data
Security
Device registration
Digital certificate
OPN
VPN
SSL
Two-factor authentication
Protocols (ISO)
Governance:
Monetizing
Monitoring
Versioning
DevOps:
Philosophy
IaC
Technologies
Architecture based on microservices, using a broker (Kafka) to provide resilience. For example, if the data lake is not available, the broker retains the messages until it comes back (mention maintenance windows of 24 hours)
Using EIP to allow all kinds of integrations
Using Kafka allows us to create an event integration platform
This solution allows us to process data in real time from thousands of things and monitor it across different sectors, in the cloud and on-prem
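The resilience argument above can be sketched in a few lines of plain Python (a toy model, not Kafka itself; Broker and all names are illustrative): messages live in a retained log, and a consumer that was offline resumes from its last offset when it comes back.

```python
# Toy model of why a broker adds resilience: messages are retained in an
# append-only log, and a consumer (e.g. a data lake in a 24-hour maintenance
# window) catches up from its last offset. A real deployment uses Kafka.

class Broker:
    def __init__(self):
        self.log = []           # append-only message log (retained)
        self.offsets = {}       # consumer group -> next offset to read

    def produce(self, message):
        self.log.append(message)

    def consume(self, group):
        """Return all messages the group has not seen yet, then advance."""
        start = self.offsets.get(group, 0)
        batch = self.log[start:]
        self.offsets[group] = len(self.log)
        return batch


broker = Broker()
broker.produce({"sensor": "s1", "value": 10})
broker.produce({"sensor": "s1", "value": 11})

# Data lake is down: nothing consumes, but the broker keeps the messages.
broker.produce({"sensor": "s1", "value": 12})

# Data lake comes back and catches up on everything it missed.
backlog = broker.consume("data-lake")
```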
Shadowing / Digital Twin
Thanks to Confluent streams, we can expose the last known state of each thing (for example, the real-time location of trucks)
Virtual representation of the state / devices
Registers the last known state of the devices
Offers the devices' information to the organization
Manages state changes of devices
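A device shadow / digital twin can be sketched as a small state store (ShadowStore and all names are hypothetical; in the talk's architecture this state would be materialized from a Kafka topic): it keeps the last known state per device and reports what changed on each update.

```python
# Minimal device-shadow sketch: keep the last known state per device and
# report the delta on each update. Names are illustrative.

class ShadowStore:
    def __init__(self):
        self.shadows = {}       # device id -> last known state dict

    def update(self, device_id, reported):
        """Merge a reported state; return only the fields that changed."""
        current = self.shadows.setdefault(device_id, {})
        changed = {k: v for k, v in reported.items() if current.get(k) != v}
        current.update(reported)
        return changed

    def state(self, device_id):
        return self.shadows.get(device_id, {})


store = ShadowStore()
store.update("truck-42", {"lat": 40.4, "lon": -3.7, "speed": 80})
delta = store.update("truck-42", {"lat": 40.5, "lon": -3.7, "speed": 80})
```

Returning only the changed fields is what lets the organization react to state *changes* (the truck moved) rather than to every repeated report.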
Smart Data
Thanks to Confluent streams we can create models that detect abnormal sensor behavior (mention the street-cleaner vehicle example)
The KSQL language helps us know the last state of each thing, as well as the historical information of the device
Creating specific rules to connect with enterprise applications
Exposing processed data in real time (for example stock dashboards, or how full a place or container is)
AI: automatically actuate on devices (for example, detect that it is raining and close the window)
ML: detect errors in sensors
Alerting systems
Connecting governance with data
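One simple form of "detect abnormal sensor behavior" is a z-score check, sketched below in plain Python (the function name and threshold are illustrative; in this architecture the check would run over the stream rather than over a list):

```python
# Illustrative anomaly check: flag a reading as abnormal when it deviates more
# than k standard deviations from the sensor's recent history.

from statistics import mean, stdev

def is_anomalous(history, value, k=3.0):
    """True if `value` is more than k standard deviations from the mean."""
    if len(history) < 2:
        return False            # not enough history to judge
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return value != mu      # flat history: any change is suspicious
    return abs(value - mu) > k * sigma


readings = [20.1, 20.3, 19.9, 20.0, 20.2, 20.1, 19.8, 20.0]
normal = is_anomalous(readings, 20.4)
broken = is_anomalous(readings, 35.0)   # a stuck or failing sensor
```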
Governance:
The ability to integrate Confluent with Elasticsearch allows us to use the same tool for everything
We collect data through Confluent, and use Kibana dashboards to show monitoring, business and "monetizing" information
We can use a Schema Registry to control the version of the data in each microservice
We can integrate Confluent with DevOps tools to track the current version of the pipelines, through Elasticsearch
We can identify performance deviations in real time, thanks to streams and the KSQL language
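The schema-versioning idea can be sketched as follows (a toy model, not Confluent Schema Registry itself; the compatibility rule here is deliberately simplistic: a new version may add fields but not drop existing ones):

```python
# Toy schema-registry sketch: each subject keeps a versioned history of
# schemas, and a new version is only accepted if it keeps all existing fields.
# All names are illustrative; Confluent Schema Registry is the real thing.

class SchemaRegistry:
    def __init__(self):
        self.subjects = {}      # subject -> list of schemas (version = index+1)

    def register(self, subject, fields):
        versions = self.subjects.setdefault(subject, [])
        if versions and not set(versions[-1]) <= set(fields):
            raise ValueError("incompatible: new schema drops existing fields")
        versions.append(list(fields))
        return len(versions)    # version number of the accepted schema


registry = SchemaRegistry()
v1 = registry.register("sensor-readings", ["id", "temp"])
v2 = registry.register("sensor-readings", ["id", "temp", "humidity"])
try:
    registry.register("sensor-readings", ["id"])   # drops fields -> rejected
    rejected = False
except ValueError:
    rejected = True
```

This is the guarantee that lets each microservice evolve its data independently: consumers reading with an older schema keep working.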