Data integration and processing are a huge challenge in Industrial IoT (IIoT, also known as Industry 4.0 or the automation industry) due to monolithic systems and proprietary protocols. Apache Kafka, its ecosystem (Kafka Connect, KSQL) and Apache PLC4X are a great open source choice for implementing this integration end to end in a scalable, reliable and flexible way.
This blog post gives a high-level overview of the challenges and a good, flexible architecture. At the end, I share a video recording and the corresponding slide deck, which provide many more details and insights.
Apache Kafka is the de facto standard for real-time event streaming. It provides:
- Open source (Apache 2.0 license)
- Global scale
- Real time
- Persistent storage
- Stream processing
Apache PLC4X enables vertical integration and lets you write software independently of specific PLCs, using JDBC-like adapters for protocols such as Siemens S7, Modbus, Allen-Bradley, Beckhoff ADS, OPC UA, Emerson, Profinet, BACnet and Ethernet.
Github example: https://github.com/kaiwaehner/iiot-integration-apache-plc4x-kafka-connect-ksql-opc-ua-modbus-siemens-s7
More details: http://www.kai-waehner.de/blog/2019/09/02/iiot-data-integr…and-apache-plc4x/
Video Recording: https://youtu.be/RWKggid25ds
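The PLC4X-to-Kafka integration described above is typically deployed as a Kafka Connect source connector. As a rough sketch, the snippet below builds the JSON payload you would POST to the Kafka Connect REST API to register such a connector. The connector class name comes from the Apache PLC4X Kafka integration; the remaining property names and values (PLC address, topic name) are illustrative assumptions — check the PLC4X documentation for the exact configuration keys.

```python
import json

# Illustrative Kafka Connect payload for a PLC4X source connector.
# Property names below are assumptions, not the verified PLC4X config schema.
connector_config = {
    "name": "plc4x-s7-source",
    "config": {
        "connector.class": "org.apache.plc4x.kafka.Plc4xSourceConnector",
        # Hypothetical values: a Siemens S7 PLC address and the Kafka
        # topic where the raw machine data should land.
        "plc.connection-string": "s7://192.168.1.50/0/0",
        "kafka.topic": "machine-data",
        "tasks.max": "1",
    },
}

# POSTing this JSON to http://localhost:8083/connectors on a running
# Kafka Connect cluster would create the connector (not done here).
payload = json.dumps(connector_config)
print(payload)
```
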
Processing IoT Data from End to End with MQTT and Apache Kafka
(Kai Waehner, Confluent) Kafka Summit SF 2018
This session discusses end-to-end use cases such as connected cars, smart home or healthcare sensors, where you integrate Internet of Things (IoT) devices with enterprise IT using open source technologies and standards. MQTT is a lightweight messaging protocol for IoT. However, MQTT is not built for high scalability, long-term storage or easy integration with legacy systems. Apache Kafka is a highly scalable distributed streaming platform, which ingests, stores, processes and forwards high volumes of data from thousands of IoT devices.
This session discusses the Apache Kafka open source ecosystem as a streaming platform to process IoT data. See a live demo of how MQTT brokers like Mosquitto or RabbitMQ integrate with Kafka, and how you can even integrate MQTT clients with Kafka without an MQTT broker. Learn how to analyze the IoT data either natively on Kafka with Kafka Streams/KSQL or on an external big data cluster like Spark, Flink or Elasticsearch, leveraging Kafka Connect.
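One small but concrete piece of any MQTT-to-Kafka bridge is topic-name translation: MQTT topics use `/` separators and wildcards (`+`, `#`), which are not valid in Kafka topic names. The sketch below shows one plausible mapping; the function name and rules are my own illustration, not part of any specific bridge mentioned in the session.

```python
# Minimal sketch of the topic-name normalization an MQTT-to-Kafka
# bridge typically performs (illustrative, not a specific product's rule).
def mqtt_to_kafka_topic(mqtt_topic: str) -> str:
    """Map an MQTT topic like 'car/engine/temperature' to a legal
    Kafka topic name like 'car.engine.temperature'."""
    if "+" in mqtt_topic or "#" in mqtt_topic:
        # Wildcards describe subscriptions, not concrete topics.
        raise ValueError("wildcard subscriptions must be expanded first")
    return mqtt_topic.strip("/").replace("/", ".")

print(mqtt_to_kafka_topic("car/engine/temperature"))  # car.engine.temperature
```
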
Apache Kafka 0.8 basic training - Verisign (Michael Noll)
Apache Kafka 0.8 basic training (120 slides) covering:
1. Introducing Kafka: history, Kafka at LinkedIn, Kafka adoption in the industry, why Kafka
2. Kafka core concepts: topics, partitions, replicas, producers, consumers, brokers
3. Operating Kafka: architecture, hardware specs, deploying, monitoring, P&S tuning
4. Developing Kafka apps: writing to Kafka, reading from Kafka, testing, serialization, compression, example apps
5. Playing with Kafka using Wirbelsturm
Audience: developers, operations, architects
Created by Michael G. Noll, Data Architect, Verisign, https://www.verisigninc.com/
Verisign is a global leader in domain names and internet security.
Tools mentioned:
- Wirbelsturm (https://github.com/miguno/wirbelsturm)
- kafka-storm-starter (https://github.com/miguno/kafka-storm-starter)
Blog post at:
http://www.michael-noll.com/blog/2014/08/18/apache-kafka-training-deck-and-tutorial/
Many thanks to the LinkedIn Engineering team (the creators of Kafka) and the Apache Kafka open source community!
Ingesting and Processing IoT Data Using MQTT, Kafka Connect and Kafka Streams…
(Guido Schmutz, Trivadis) Kafka Summit SF 2018
Internet of Things use cases are a perfect match for processing with a streaming platform such as Kafka and the Confluent Platform. Some of the questions to be answered are: How do we feed the data from our devices into Kafka? Do we directly send data to Kafka? Is Kafka accessible from outside the organization over the internet? What if we want to use a more specific IoT protocol such as MQTT or CoAP in between? How would we integrate it with Kafka? How can we enrich IoT streaming data with static data sitting in a traditional system?
This session will provide answers to these and other questions using a fictitious use case of a trucking company. Trucks are constantly sending data about position and driving habits, which can be used to derive real-time information and actions. A large part of the presentation will be a live demo. The demo will show the implementation of the pipeline incrementally: starting with sending the truck movement events directly to Kafka, then adding MQTT to the sensor data ingestion, followed by using Kafka Streams and KSQL to apply stream processing on the information received. The final pipeline will demonstrate the application of Kafka Connect with MQTT and JDBC source connectors for data ingestion and event stream enrichment, and Kafka Streams and KSQL for stream processing. The key takeaway is the live demonstration of a working end-to-end IoT streaming data ingestion pipeline using Kafka technologies.
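The enrichment step in the demo pipeline — joining a stream of truck position events with static data from a traditional system — can be sketched in plain Python as a stream-table join. In KSQL this would be a `JOIN` between a stream and a table fed by the JDBC source connector; the field names below are made up for illustration.

```python
# Pure-Python sketch of a stream-table join: each truck position event
# (the stream) is enriched with static driver data (the table, e.g.
# loaded via a JDBC source connector). Field names are illustrative.
drivers = {  # "table": driver id -> static attributes
    10: {"name": "A. Driver", "licensed_for": "hazmat"},
}

def enrich(event: dict, table: dict) -> dict:
    """Join one stream event with the lookup table on driver_id."""
    enriched = dict(event)
    enriched.update(table.get(event["driver_id"], {}))
    return enriched

event = {"truck_id": 7, "driver_id": 10, "lat": 37.77, "lon": -122.42}
print(enrich(event, drivers)["name"])  # A. Driver
```

Unknown keys simply pass the event through unenriched, mirroring a left join in KSQL.
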
Flexible and Scalable Integration in the Automation Industry/Industrial IoT
Speaker: Kai Waehner, Technology Evangelist, Confluent
Kafka-Native, End-to-End IIoT Data Integration and Processing with Kafka Connect, KSQL, and PLC4X
Best Practices for Streaming IoT Data with MQTT and Apache Kafka® (Confluent)
Watch this talk here: https://www.confluent.io/online-talks/best-practices-for-streaming-iot-data-with-MQTT-and-apache-kafka-on-demand
Organizations today are looking to stream IoT data to Apache Kafka. However, connecting tens of thousands or even millions of devices over unreliable networks can create some architecture challenges.
In this session, we will identify and demo some best practices for implementing a large scale IoT system that can stream MQTT messages to Apache Kafka.
A talk about how to use open source, especially Apache PLC4X, to access industrial machinery data and build the next generation of industrial software.
The Rise of Event Streaming – Why Apache Kafka Changes Everything (Kai Wähner)
Business digitalization trends like microservices, the Internet of Things and machine learning are driving the need to process events at a whole new scale, speed and efficiency. Traditional solutions like ETL/data integration or messaging are not built to serve these needs.
Today, the open source project Apache Kafka® is being used by thousands of companies, including over 60% of the Fortune 100, to power and innovate their businesses by focusing their data strategies around event-driven architectures leveraging event streaming. We will discuss the market and technology changes that have given rise to Kafka and to event streaming, and we will introduce the audience to the key aspects of building an event streaming platform with Kafka. Examples of productive use cases from the automotive, manufacturing and transportation sectors will showcase the power of event streaming.
You will also learn how to:
• Build products and features faster with a complete suite of connectors and stream management tools, and connect your environments to data pipelines
• Protect your most critical data and workloads with built-in security, governance and resilience guarantees
• Deploy Kafka at scale in minutes while reducing the associated costs and operational burden
IoT Architectures for Apache Kafka and Event Streaming - Industry 4.0, Digita… (Kai Wähner)
The Internet of Things (IoT) is getting more and more traction as valuable use cases come to light. Whether you are in Healthcare, Telecommunications, Manufacturing, Banking or Retail to name a few industries, there is one key challenge and that's the integration of backend IoT data logs and applications, business services and cloud services to process the data in real time and at scale.
In this talk, we will be sharing how Kafka has become the leading technology used throughout the business to provide Real Time Event Streaming. Explore real life use cases of Kafka Connect, Kafka Streams and KSQL independent of the data deployment be it on a private or public Cloud, On Premise or at the Edge.
Audi - Connected car infrastructure
Robert Bosch Power Tools - Track and Trace of devices and people at construction areas
Deutsche Bahn - Customer 360 for train timetable updates
E.ON - IoT Streaming Platform to integrate and build smart home, smart building and smart grid infrastructures
An Introduction to Confluent Cloud: Apache Kafka as a Service (Confluent)
Business breakout during Confluent’s streaming event in Munich, presented by Hans Jespersen, VP WW Systems Engineering at Confluent. This three-day hands-on course focused on how to build, manage, and monitor clusters using industry best-practices developed by the world’s foremost Apache Kafka™ experts. The sessions focused on how Kafka and the Confluent Platform work, how their main subsystems interact, and how to set up, manage, monitor, and tune your cluster.
Apache Kafka vs. Integration Middleware (MQ, ETL, ESB) - Friends, Enemies or … (Confluent)
MQ, ETL and ESB middleware are often used as the integration backbone between legacy applications, modern microservices and cloud services. This introduces several challenges and complexities like point-to-point integration or non-scalable architectures. This session discusses how to build a completely event-driven streaming platform leveraging Apache Kafka’s open source messaging, integration and streaming components to leverage distributed processing, fault tolerance, rolling upgrades and the ability to reprocess events. Learn the differences between an event-driven streaming platform leveraging Apache Kafka and middleware like MQ, ETL and ESBs – including best practices and anti-patterns, but also how these concepts and tools complement each other in an enterprise architecture.
Apache Kafka in the Automotive Industry (Connected Vehicles, Manufacturing 4.…) (Kai Wähner)
Connect all the things: An intro to event streaming for the automotive industry including connected cars, mobility services, and manufacturing / industrial IoT.
Video recording of this talk: https://www.youtube.com/watch?v=rBfBFrcO-WU
The Fourth Industrial Revolution (also known as Industry 4.0) is the ongoing automation of traditional manufacturing and industrial practices using modern smart technology. Event streaming with Apache Kafka plays a key role in processing massive volumes of data in real time in a reliable, scalable, and flexible way, integrating with various legacy and modern data sources and sinks.
Other industries—retail, healthcare, government, financial services, energy, and more—also lean into Industry 4.0 technology to take advantage of IoT devices, sensors, smart machines, robotics, and connected data. The variety of these deployments goes from disconnected edge use cases across hybrid architectures to global multi-cloud deployments.
In this presentation, I want to give you an overview of existing use cases for event streaming technology in a connected world across supply chains, industries and customer experiences that come along with these interdisciplinary data intersections:
- The Automotive Industry (and it’s not only Connected Cars)
- Mobility Services across verticals (transportation, logistics, travel industry, retailing, …)
- Smart Cities (including citizen health services, communication infrastructure, …)
Real-world examples include use cases from car makers such as Audi, BMW, Porsche, Tesla, plus many examples from mobility services such as Uber, Lyft, Here Technologies, and more.
The Top 5 Apache Kafka Use Cases and Architectures in 2022 (Kai Wähner)
I see the following topics coming up more regularly in conversations with customers, prospects, and the broader Kafka community across the globe:
Kappa Architecture: Kappa goes mainstream to replace Lambda and Batch pipelines (that does not mean that there is no batch processing anymore). Examples: Kafka-powered Kappa architectures from Uber, Disney, Shopify, and Twitter.
Hyper-personalized Omnichannel: Retail and customer communication across online and offline channels becomes the new black, including context-specific upselling, recommendations, and location-based services. Examples: Omnichannel Retail and Customer 360 in Real-Time with Apache Kafka.
Multi-Cloud Deployments: Business units and IT infrastructures span across regions, continents, and cloud providers. Linking clusters for bi-directional replication of data in real-time becomes crucial for many business models. Examples: Global Kafka deployments.
Edge Analytics: Low latency requirements, cost efficiency, or security requirements enforce the deployment of (some) event streaming use cases at the far edge (i.e., outside a data center), for instance, for predictive maintenance and quality assurance on the shop floor level in smart factories. Examples: Edge analytics with Kafka.
Real-time Cybersecurity: Situational awareness and threat intelligence need to process massive data in real-time to defend against cyberattacks successfully. The many successful ransomware attacks across the globe in 2021 were a warning for most CIOs. Examples: Cybersecurity for situational awareness and threat intelligence in real-time.
The Rise of Data in Motion in the Healthcare Industry - Use Cases, Architectures and Examples powered by Apache Kafka.
Use Cases for Data in Motion in the Healthcare Industry:
- Know Your Patient (= “Customer 360”)
- Operations (Healthcare 4.0 including Drug R&D, Patient Care, etc.)
- IT Perspective (Cybersecurity, Mainframe Offload, Hybrid Cloud, Streaming ETL, etc)
Real-world examples include Covid-19 Electronic Lab Reporting, Cerner, Optum, Centene, Humana, Invitae, Bayer, Celmatix, Care.com.
IoT Architectures for a Digital Twin with Apache Kafka, IoT Platforms and Mac… (Kai Wähner)
A digital twin is a digital replica of a living or non-living physical entity. This session discusses the benefits and IoT architectures of a Digital Twin in Industrial IoT (IIoT) and its relation to Apache Kafka, IoT frameworks and Machine Learning. Kafka is often used as central event streaming platform to build a scalable and reliable digital twin for real time streaming sensor data. A live demo shows a scalable digital twin infrastructure for condition monitoring and predictive maintenance in real time for a connected car infrastructure leveraging Kafka, MQTT and TensorFlow.
Key Take-Aways:
• Learn about use cases and characteristics of a digital twin in various industries
• Understand how to build a digital twin for every single (of tens of thousands) IoT device or machine
• See different IoT architectures with Kafka and other IoT technologies and products, including edge, hybrid and global deployments
• Understand the relation to Machine Learning and bring added value to your IoT infrastructure by enabling use cases like predictive maintenance
• Understand how Apache Kafka enables scalable and flexible end-to-end integration and processing from IIoT data to various backend applications
• Watch a live demo of an end-to-end integration, real time processing and analytics of thousands of IoT devices
More details:
https://www.kai-waehner.de/blog/2019/11/28/apache-kafka-industrial-iot-iiot-build-an-open-scalable-reliable-digital-twin/
https://www.kai-waehner.de/blog/2020/03/25/architectures-digital-twin-digital-thread-apache-kafka-iot-platforms-machine-learning/
https://youtu.be/Q3eKPEVwNVY
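At its core, a Kafka-backed digital twin is the latest known state per device, built by folding the stream of sensor events into a keyed store — in Kafka this would be a compacted topic or a Kafka Streams KTable. The sketch below uses a plain dict as a stand-in for that store; event shape and field names are assumptions for illustration.

```python
# Sketch of digital-twin state maintenance: fold each sensor event into
# a per-device state map (a dict stands in for a compacted Kafka topic
# or a Kafka Streams KTable). Event shape is illustrative.
def update_twin(store: dict, event: dict) -> dict:
    device = event["device_id"]
    state = store.setdefault(device, {})
    state.update(event["readings"])  # newest reading per field wins
    return store

store = {}
update_twin(store, {"device_id": "car-1", "readings": {"temp": 71}})
update_twin(store, {"device_id": "car-1", "readings": {"temp": 73, "rpm": 2100}})
print(store["car-1"])  # {'temp': 73, 'rpm': 2100}
```

The same fold works for tens of thousands of devices because the store is keyed by device id, which is exactly how Kafka partitions and compacts the underlying topic.
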
GCP for Apache Kafka® Users: Stream Ingestion and Processing (Confluent)
Watch this talk here: https://www.confluent.io/online-talks/gcp-for-apache-kafka-users-stream-ingestion-processing
In private and public clouds, stream analytics commonly means stateless processing systems organized around Apache Kafka® or a similar distributed log service. GCP took a somewhat different tack, with Cloud Pub/Sub, Dataflow, and BigQuery, distributing the responsibility for processing among ingestion, processing and database technologies.
We compare the two approaches to data integration and show how Dataflow allows you to join and transform and deliver data streams among on-prem and cloud Apache Kafka clusters, Cloud Pub/Sub topics and a variety of databases. The session will have a mix of architectural discussions and practical code reviews of Dataflow-based pipelines.
Service Mesh with Apache Kafka, Kubernetes, Envoy, Istio and Linkerd (Kai Wähner)
Microservice architectures are not a free lunch! Microservices need to be decoupled, flexible, operationally transparent, data aware and elastic. Most material from recent years only discusses point-to-point architectures with inflexible and non-scalable technologies like REST/HTTP. This video takes a look at cutting-edge technologies like Apache Kafka, Kubernetes, Envoy, Linkerd and Istio to implement a cloud-native service mesh that solves these challenges and brings microservices to the next level of scale, speed and efficiency.
Key takeaways:
- Apache Kafka decouples services, including event streams and request-response
- Kubernetes provides a cloud-native infrastructure for the Kafka ecosystem
- Service Mesh helps with security and observability at ecosystem / organization scale
- Envoy and Istio sit in the layer above Kafka and are orthogonal to the goals Kafka addresses
Blog post: http://www.kai-waehner.de/blog/2019/09/24/cloud-native-apache-kafka-kubernetes-envoy-istio-linkerd-service-mesh
Video recording of this slide deck: https://youtu.be/Us_C4RFOUrA
Apache Kafka in the Airline, Aviation and Travel Industry (Kai Wähner)
Aviation and travel are notoriously vulnerable to social, economic, and political events, as well as the ever-changing expectations of consumers. Coronavirus is just a piece of the challenge.
This presentation explores use cases, architectures, and references for Apache Kafka as event streaming technology in the aviation industry, including airline, airports, global distribution systems (GDS), aircraft manufacturers, and more.
Examples include Lufthansa, Singapore Airlines, Air France Hop, Amadeus, and more. Technologies include Kafka, Kafka Connect, Kafka Streams, ksqlDB, Machine Learning, Cloud, and more.
Microservices Integration Patterns with Kafka (Kasun Indrasiri)
Microservice composition, or integration, is probably the hardest thing in a microservices architecture. Unlike conventional centralized ESB-based integration, we need to follow the smart endpoints and dumb pipes approach when integrating microservices.
There are two main microservices integration patterns: service orchestration (active integration) and service choreography (reactive integration). In this talk, we will explore microservice orchestration, microservice choreography, event sourcing, CQRS and how Kafka can be leveraged to implement microservices composition.
While traditional on-prem systems have always been a target for internal and external attackers, recent times have seen increased attacks on Hadoop cloud deployments. Hadoop systems will be increasingly targeted due to the large volumes of data they store. Many Hadoop installations in the cloud are publicly accessible without any security measures, which poses a threat of exfiltration of large datasets and possibly crypto-mining on this infrastructure with its huge distributed compute capability.
Apache Knox provides multiple layers of security related to authentication, service-level authorization and web application security controls out of the box for multiple Hadoop components.
Apache Knox provides configuration to prevent common OWASP Top 10 security risks e.g. Cross-site Request Forgery (CSRF), Cross Site Scripting (XSS), MIME Content Type sniffing, Clickjacking, etc. We will also discuss controls like HTTP Strict Transport Security which prevents SSL Downgrade attacks and CORS filter for allowing applications to make cross domain requests only to specifically allowed hosts through XHR. Support to include/exclude Cipher suites and exclude SSL protocols enables compliance with hardening guidelines provided by CIS for application servers.
Knox has several supported authentication mechanisms with Kerberos underneath e.g. LDAP over SSL, AD, PAM based auth for Unix users, integration with Identity Providers like Okta, etc. Also, capabilities like Trusted Proxy, Single Sign-On auth, Hostmap Provider, Identity Assertion Provider, Client Authentication enhances the overall security posture.
We will also cover the typical kill-chain methodology tailored to Hadoop ecosystem which will help formulate the preventive measures against future compromises.
Spark (Structured) Streaming vs. Kafka Streams - two stream processing platfo… (Guido Schmutz)
Independent of the source of data, the integration and analysis of event streams gets more important in the world of sensors, social media streams and Internet of Things. Events have to be accepted quickly and reliably, they have to be distributed and analyzed, often with many consumers or systems interested in all or part of the events. In this session we compare two popular Streaming Analytics solutions: Spark Streaming and Kafka Streams.
Spark is a fast and general engine for large-scale data processing and has been designed to provide a more efficient alternative to Hadoop MapReduce. Spark Streaming brings Spark's language-integrated API to stream processing, letting you write streaming applications the same way you write batch jobs. It supports both Java and Scala.
Kafka Streams is the stream processing solution which is part of Kafka. It is provided as a Java library and by that can be easily integrated with any Java application.
Machine Learning with Apache Kafka in Pharma and Life Sciences (Kai Wähner)
Blog Post:
https://www.kai-waehner.de/apache-kafka-event-streaming-pharmaceuticals-pharma-life-sciences-use-cases-architecture
Video Recording:
https://youtu.be/t2IH0brwGTg
AI/machine learning and the Apache Kafka ecosystem are a great combination for training, deploying and monitoring analytic models at scale in real time. They are showing up in more and more projects, but often still feel like buzzwords and hype from science projects.
See how to connect the dots!
--How are Kafka and Machine Learning related?
--How can they be combined to productionize analytic models in mission-critical and scalable real-time applications?
--We will discuss a step-by-step approach to build a scalable and reliable real-time infrastructure for drug discovery doing data integration, feature engineering, image processing, model scoring and processing orchestration.
Use Cases:
R&D Engineering
Sales & Marketing
Manufacturing & Quality Assurance
Supply Chain
Product Monitoring & After Sales Support
VoC (Voice of Customer)
Single View Customer
Yield/Quality Optimization
Improved Drug Yield
Proactive Service Scheduling
Testing & Simulation
Drug Diversion
Process/Quality Monitoring
Inventory & Supply Chain Optimization
Proactive Service Offers
Patent Research and Analytics
Personalized Offers / Ads
EDW Offload
Supply Chain Network Design/Risk Management
Product Predictive Maintenance
Clinical Trials
Customer Segmentation
Smart Products
Serialization & e-Pedigree
Product Usage Tracking
GTM
Global Facilities
Inventory and Logistics Visibility
Warranty & Recall Management
Real-Life Use Cases & Architectures for Event Streaming with Apache Kafka (Kai Wähner)
Streaming all over the World: Real-Life Use Cases & Architectures for Event Streaming with Apache Kafka.
Learn about various case studies for event streaming with Apache Kafka across industries. The talk explores architectures for real-world deployments from Audi, BMW, Disney, Generali, Paypal, Tesla, Unity, Walmart, William Hill, and more. Use cases include fraud detection, mainframe offloading, predictive maintenance, cybersecurity, edge computing, track&trace, live betting, and much more.
Kappa vs Lambda Architectures and Technology Comparison (Kai Wähner)
Real-time data beats slow data. That’s true for almost every use case. Nevertheless, enterprise architects build new infrastructures with the Lambda architecture that includes separate batch and real-time layers.
This video explores why a single real-time pipeline, called Kappa architecture, is the better fit for many enterprise architectures. Real-world examples from companies such as Disney, Shopify, Uber, and Twitter explore the benefits of Kappa but also show how batch processing fits into this discussion positively without the need for a Lambda architecture.
The main focus of the discussion is on Apache Kafka (and its ecosystem) as the de facto standard for event streaming to process data in motion (the key concept of Kappa), but the video also compares various technologies and vendors such as Confluent, Cloudera, IBM, Red Hat, Apache Flink, Apache Pulsar, AWS Kinesis, Amazon MSK, Azure Event Hubs, Google Pub/Sub, and more.
Video recording of this presentation:
https://youtu.be/j7D29eyysDw
Further reading:
https://www.kai-waehner.de/blog/2021/09/23/real-time-kappa-architecture-mainstream-replacing-batch-lambda/
https://www.kai-waehner.de/blog/2021/04/20/comparison-open-source-apache-kafka-vs-confluent-cloudera-red-hat-amazon-msk-cloud/
https://www.kai-waehner.de/blog/2021/05/09/kafka-api-de-facto-standard-event-streaming-like-amazon-s3-object-storage/
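The key Kappa idea discussed above can be sketched in a few lines: there is one pipeline, and "batch" reprocessing is just replaying the retained event log from the beginning through the same code. The list below stands in for a Kafka topic and `consume` for a streaming job; the event shape is an assumption for illustration.

```python
# Sketch of Kappa-style reprocessing: one code path serves both the
# live tail and full replay. A list stands in for a retained Kafka
# topic; from_offset=0 corresponds to seeking to the beginning.
def consume(log, from_offset=0):
    total = 0
    for event in log[from_offset:]:  # seek-to-beginning == reprocess
        total += event["amount"]     # same logic for live and replay
    return total

log = [{"amount": 5}, {"amount": 7}, {"amount": 3}]
print(consume(log))                 # full reprocessing: 15
print(consume(log, from_offset=2))  # live tail only: 3
```

There is no separate batch layer to keep in sync with the streaming layer, which is exactly the maintenance burden Lambda imposes.
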
Kafka error handling patterns and best practices | Hemant Desale and Aruna Ka… (Hosted by Confluent)
Transaction Banking from Goldman Sachs is a high-volume, latency-sensitive digital banking platform offering. We have chosen an event-driven architecture to build highly decoupled and independent microservices in a cloud-native manner, designed to meet the objectives of security, availability, latency and scalability. Kafka was a natural choice – to decouple producers and consumers and to scale easily for high-volume processing. However, there are certain aspects that require careful consideration – handling errors and partial failures, managing downtime of consumers, and secure communication between brokers and producers/consumers. In this session, we will present the patterns and best practices that helped us build robust event-driven applications. We will also present our solution approach, which has been reused across multiple application domains. We hope that by sharing our experience, we can establish a reference implementation that application developers can benefit from.
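One of the error-handling patterns such talks typically cover is bounded retries with a dead-letter queue: retry a failing record a few times, then divert it so it cannot block the partition. The sketch below is my own illustration of the pattern, not the speakers' actual implementation; names and shapes are assumptions.

```python
# Sketch of the retry + dead-letter-queue pattern for Kafka consumers:
# retry a failed record a bounded number of times, then park it in a
# DLQ (here a list) instead of blocking the partition.
def process_with_dlq(record, handler, dlq: list, max_retries: int = 3):
    """Return True if handled, False if the record went to the DLQ."""
    for _ in range(max_retries):
        try:
            handler(record)
            return True
        except Exception:
            continue  # transient failure: retry
    dlq.append(record)  # poison message: park it for later inspection
    return False

def always_fails(record):
    raise ValueError("downstream unavailable")

dlq = []
ok = process_with_dlq({"id": 1}, always_fails, dlq)
print(ok, len(dlq))  # False 1
```

In a real deployment the DLQ would itself be a Kafka topic, so parked records keep the same durability and replay semantics as the main stream.
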
Activeeon technology for Big Compute and cloud migration | Activeeon
Activeeon is a key technology provider and actor in cloud migration. Activeeon offers software and middleware solutions for Big Compute, workload automation and HPC. The company also provides workflow solutions for Machine Learning & AI.
Building a reliable and scalable IoT platform with MongoDB and HiveMQ | Dominik Obermaier
Today’s Internet of Things (IoT) is enabling companies to blend the physical and digital worlds, creating new business models and generating insights that increase productivity to once-unimaginable levels. However, managing the ever-growing volume of heterogeneous IoT data from disparate devices, systems and applications, both on premise and in the cloud, can be a challenging endeavour without a scalable and reliable IoT platform.
In this webinar, we will explore why and how companies are leveraging HiveMQ and MongoDB to build exactly that: a scalable and reliable IoT platform. Based upon a sample fleet management scenario, we will explain how telematics data can be routed via MQTT and efficiently stored to provide analytics and insights into the data.
Key Learnings
- Common challenges and pitfalls of IoT projects
- Required components for effectively handling data with an IoT platform
- HiveMQ for MQTT to enable bi-directional device communication over unstable networks
- MongoDB as the flexible and scalable modern data platform combining data from different sources and powering your applications
- Why MongoDB and HiveMQ is such a great combination
IoT Architectures for Apache Kafka and Event Streaming - Industry 4.0, Digita... | Kai Wähner
The Internet of Things (IoT) is getting more and more traction as valuable use cases come to light. Whether you are in healthcare, telecommunications, manufacturing, banking or retail, to name a few industries, there is one key challenge: integrating IoT data logs and applications with backend business services and cloud services to process the data in real time and at scale.
In this talk, we will share how Kafka has become the leading technology used throughout the business to provide real-time event streaming. Explore real-life use cases of Kafka Connect, Kafka Streams and KSQL, independent of the deployment model, be it private or public cloud, on premise, or at the edge.
Audi - Connected car infrastructure
Robert Bosch Power Tools - Track and Trace of devices and people at construction areas
Deutsche Bahn - Customer 360 for train timetable updates
E.ON - IoT Streaming Platform to integrate and build smart home, smart building and smart grid infrastructures
An Introduction to Confluent Cloud: Apache Kafka as a Service | Confluent
Business breakout during Confluent’s streaming event in Munich, presented by Hans Jespersen, VP WW Systems Engineering at Confluent. This three-day hands-on course focused on how to build, manage, and monitor clusters using industry best-practices developed by the world’s foremost Apache Kafka™ experts. The sessions focused on how Kafka and the Confluent Platform work, how their main subsystems interact, and how to set up, manage, monitor, and tune your cluster.
Apache Kafka vs. Integration Middleware (MQ, ETL, ESB) - Friends, Enemies or ... | Confluent
MQ, ETL and ESB middleware are often used as the integration backbone between legacy applications, modern microservices and cloud services. This introduces several challenges and complexities, like point-to-point integration and non-scalable architectures. This session discusses how to build a completely event-driven streaming platform leveraging Apache Kafka’s open source messaging, integration and streaming components for distributed processing, fault tolerance, rolling upgrades and the ability to reprocess events. Learn the differences between an event-driven streaming platform based on Apache Kafka and middleware like MQ, ETL and ESBs – including best practices and anti-patterns, but also how these concepts and tools complement each other in an enterprise architecture.
Apache Kafka in the Automotive Industry (Connected Vehicles, Manufacturing 4.... | Kai Wähner
Connect all the things: An intro to event streaming for the automotive industry including connected cars, mobility services, and manufacturing / industrial IoT.
Video recording of this talk: https://www.youtube.com/watch?v=rBfBFrcO-WU
The Fourth Industrial Revolution (also known as Industry 4.0) is the ongoing automation of traditional manufacturing and industrial practices using modern smart technology. Event streaming with Apache Kafka plays a major role in processing massive volumes of data in real time in a reliable, scalable, and flexible way, integrating with various legacy and modern data sources and sinks.
Other industries—retail, healthcare, government, financial services, energy, and more—also lean into Industry 4.0 technology to take advantage of IoT devices, sensors, smart machines, robotics, and connected data. The variety of these deployments goes from disconnected edge use cases across hybrid architectures to global multi-cloud deployments.
In this presentation, I want to give you an overview of existing use cases for event streaming technology in a connected world across supply chains, industries and customer experiences that come along with these interdisciplinary data intersections:
- The Automotive Industry (and it’s not only Connected Cars)
- Mobility Services across verticals (transportation, logistics, travel industry, retailing, …)
- Smart Cities (including citizen health services, communication infrastructure, …)
Real-world examples include use cases from car makers such as Audi, BMW, Porsche, Tesla, plus many examples from mobility services such as Uber, Lyft, Here Technologies, and more.
The Top 5 Apache Kafka Use Cases and Architectures in 2022 | Kai Wähner
I see the following topics coming up more regularly in conversations with customers, prospects, and the broader Kafka community across the globe:
Kappa Architecture: Kappa goes mainstream to replace Lambda and Batch pipelines (that does not mean that there is no batch processing anymore). Examples: Kafka-powered Kappa architectures from Uber, Disney, Shopify, and Twitter.
Hyper-personalized Omnichannel: Retail and customer communication across online and offline channels becomes the new black, including context-specific upselling, recommendations, and location-based services. Examples: Omnichannel Retail and Customer 360 in Real-Time with Apache Kafka.
Multi-Cloud Deployments: Business units and IT infrastructures span across regions, continents, and cloud providers. Linking clusters for bi-directional replication of data in real-time becomes crucial for many business models. Examples: Global Kafka deployments.
Edge Analytics: Low latency requirements, cost efficiency, or security requirements enforce the deployment of (some) event streaming use cases at the far edge (i.e., outside a data center), for instance, for predictive maintenance and quality assurance on the shop floor level in smart factories. Examples: Edge analytics with Kafka.
Real-time Cybersecurity: Situational awareness and threat intelligence need to process massive data in real-time to defend against cyberattacks successfully. The many successful ransomware attacks across the globe in 2021 were a warning for most CIOs. Examples: Cybersecurity for situational awareness and threat intelligence in real-time.
The Rise of Data in Motion in the Healthcare Industry - Use Cases, Architectures and Examples powered by Apache Kafka.
Use Cases for Data in Motion in the Healthcare Industry:
- Know Your Patient (= “Customer 360”)
- Operations (Healthcare 4.0 including Drug R&D, Patient Care, etc.)
- IT Perspective (Cybersecurity, Mainframe Offload, Hybrid Cloud, Streaming ETL, etc.)
Real-world examples include Covid-19 Electronic Lab Reporting, Cerner, Optum, Centene, Humana, Invitae, Bayer, Celmatix, Care.com.
IoT Architectures for a Digital Twin with Apache Kafka, IoT Platforms and Mac... | Kai Wähner
A digital twin is a digital replica of a living or non-living physical entity. This session discusses the benefits and IoT architectures of a digital twin in Industrial IoT (IIoT) and its relation to Apache Kafka, IoT frameworks and machine learning. Kafka is often used as the central event streaming platform to build a scalable and reliable digital twin for real-time streaming sensor data. A live demo shows a scalable digital twin infrastructure for condition monitoring and predictive maintenance in real time for a connected car infrastructure leveraging Kafka, MQTT and TensorFlow.
Key Take-Aways:
• Learn about use cases and characteristics of a digital twin in various industries
• Understand how to build a digital twin for every single (of tens of thousands) IoT device or machine
• See different IoT architectures with Kafka and other IoT technologies and products, including edge, hybrid and global deployments
• Understand the relation to Machine Learning and bring added value to your IoT infrastructure by enabling use cases like predictive maintenance
• Understand how Apache Kafka enables scalable and flexible end-to-end integration and processing from IIoT data to various backend applications
• Watch a live demo of an end-to-end integration, real time processing and analytics of thousands of IoT devices
More details:
https://www.kai-waehner.de/blog/2019/11/28/apache-kafka-industrial-iot-iiot-build-an-open-scalable-reliable-digital-twin/
https://www.kai-waehner.de/blog/2020/03/25/architectures-digital-twin-digital-thread-apache-kafka-iot-platforms-machine-learning/
https://youtu.be/Q3eKPEVwNVY
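At its core, a digital twin per device means maintaining the latest known state for each device key, which is exactly what a Kafka Streams KTable provides at scale. As a minimal, broker-free sketch of that idea (the device ID and field names are invented for illustration):

```python
# Minimal digital-twin state store: merge incoming sensor events into
# the latest known state per device key (last-write-wins per field),
# the same idea a Kafka Streams KTable provides at scale.
import json

class DigitalTwinStore:
    def __init__(self):
        self._state = {}  # device_id -> latest known state

    def apply_event(self, key, value_json):
        """Merge one incoming sensor event into the device's twin."""
        event = json.loads(value_json)
        twin = self._state.setdefault(key, {})
        twin.update(event)
        return twin

    def get(self, key):
        return self._state.get(key)

store = DigitalTwinStore()
store.apply_event("car-42", '{"speed_kmh": 80, "battery_pct": 91}')
twin = store.apply_event("car-42", '{"speed_kmh": 95}')
print(twin)  # battery_pct is retained, speed_kmh updated
```

In the Kafka-based architecture described above, the same merge logic would run as a stateful stream processor over a topic keyed by device ID, giving a queryable twin for tens of thousands of devices.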
GCP for Apache Kafka® Users: Stream Ingestion and Processing | Confluent
Watch this talk here: https://www.confluent.io/online-talks/gcp-for-apache-kafka-users-stream-ingestion-processing
In private and public clouds, stream analytics commonly means stateless processing systems organized around Apache Kafka® or a similar distributed log service. GCP took a somewhat different tack, with Cloud Pub/Sub, Dataflow, and BigQuery, distributing the responsibility for processing among ingestion, processing and database technologies.
We compare the two approaches to data integration and show how Dataflow allows you to join, transform, and deliver data streams among on-prem and cloud Apache Kafka clusters, Cloud Pub/Sub topics and a variety of databases. The session will have a mix of architectural discussions and practical code reviews of Dataflow-based pipelines.
Service Mesh with Apache Kafka, Kubernetes, Envoy, Istio and Linkerd | Kai Wähner
Microservice architectures are not a free lunch! Microservices need to be decoupled, flexible, operationally transparent, data-aware and elastic. Most material from recent years only discusses point-to-point architectures with inflexible and non-scalable technologies like REST / HTTP. This video takes a look at cutting-edge technologies like Apache Kafka, Kubernetes, Envoy, Linkerd and Istio to implement a cloud-native service mesh that solves these challenges and brings microservices to the next level of scale, speed and efficiency.
Key takeaways:
- Apache Kafka decouples services, including event streams and request-response
- Kubernetes provides a cloud-native infrastructure for the Kafka ecosystem
- Service Mesh helps with security and observability at ecosystem / organization scale
- Envoy and Istio sit in the layer above Kafka and are orthogonal to the goals Kafka addresses
Blog post: http://www.kai-waehner.de/blog/2019/09/24/cloud-native-apache-kafka-kubernetes-envoy-istio-linkerd-service-mesh
Video recording of this slide deck: https://youtu.be/Us_C4RFOUrA
Apache Kafka in the Airline, Aviation and Travel Industry | Kai Wähner
Aviation and travel are notoriously vulnerable to social, economic, and political events, as well as the ever-changing expectations of consumers. Coronavirus is just a piece of the challenge.
This presentation explores use cases, architectures, and references for Apache Kafka as event streaming technology in the aviation industry, including airline, airports, global distribution systems (GDS), aircraft manufacturers, and more.
Examples include Lufthansa, Singapore Airlines, Air France Hop, Amadeus, and more. Technologies include Kafka, Kafka Connect, Kafka Streams, ksqlDB, Machine Learning, Cloud, and more.
Microservices Integration Patterns with Kafka | Kasun Indrasiri
Microservice composition, or integration, is probably the hardest part of a microservices architecture. Unlike conventional centralized ESB-based integration, we need to apply the smart endpoints and dumb pipes principle when integrating microservices.
There are two main microservice integration patterns: service orchestration (active integration) and service choreography (reactive integration). In this talk, we will explore microservice orchestration, microservice choreography, event sourcing, CQRS, and how Kafka can be leveraged to implement microservice composition.
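The choreography style mentioned here can be sketched without a broker: each service subscribes to the events it cares about and reacts independently, instead of being called by a central orchestrator. In the sketch below, a tiny in-memory event bus stands in for Kafka topics; the service behavior, topic names, and event shape are all invented for illustration:

```python
from collections import defaultdict

# In-memory stand-in for Kafka topics: handlers subscribe to a topic
# and react to published events independently (choreography).
class EventBus:
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subs[topic].append(handler)

    def publish(self, topic, event):
        for handler in self._subs[topic]:
            handler(event)

bus = EventBus()
log = []

# Payment reacts to order events; shipping reacts to payment events.
bus.subscribe("orders", lambda e: (log.append(f"charge {e['id']}"),
                                   bus.publish("payments", e)))
bus.subscribe("payments", lambda e: log.append(f"ship {e['id']}"))

bus.publish("orders", {"id": "o-1"})
print(log)  # each downstream step was triggered by an event, not a call
```

Note that no service calls another directly; the flow emerges from event subscriptions, which is what makes choreography "reactive" integration.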
While traditional on-prem systems have always been a target for internal and external attackers, recent times have seen increased attacks on Hadoop cloud deployments. Hadoop systems will be increasingly targeted due to the large volumes of data they store. Many Hadoop installations in the cloud are publicly accessible without any security measures, which poses the threat of exfiltration of large datasets and possibly crypto-mining on this infrastructure with its huge distributed compute capability.
Apache Knox provides multiple layers of security related to authentication, service-level authorization and web application security controls out of the box for multiple Hadoop components.
Apache Knox provides configuration to prevent common OWASP Top 10 security risks e.g. Cross-site Request Forgery (CSRF), Cross Site Scripting (XSS), MIME Content Type sniffing, Clickjacking, etc. We will also discuss controls like HTTP Strict Transport Security which prevents SSL Downgrade attacks and CORS filter for allowing applications to make cross domain requests only to specifically allowed hosts through XHR. Support to include/exclude Cipher suites and exclude SSL protocols enables compliance with hardening guidelines provided by CIS for application servers.
Knox has several supported authentication mechanisms with Kerberos underneath e.g. LDAP over SSL, AD, PAM based auth for Unix users, integration with Identity Providers like Okta, etc. Also, capabilities like Trusted Proxy, Single Sign-On auth, Hostmap Provider, Identity Assertion Provider, Client Authentication enhances the overall security posture.
We will also cover the typical kill-chain methodology tailored to Hadoop ecosystem which will help formulate the preventive measures against future compromises.
Spark (Structured) Streaming vs. Kafka Streams - two stream processing platfo... | Guido Schmutz
Independent of the source of data, the integration and analysis of event streams become more important in the world of sensors, social media streams and the Internet of Things. Events have to be accepted quickly and reliably, and they have to be distributed and analyzed, often with many consumers or systems interested in all or part of the events. In this session we compare two popular streaming analytics solutions: Spark Streaming and Kafka Streams.
Spark is a fast and general engine for large-scale data processing and has been designed to provide a more efficient alternative to Hadoop MapReduce. Spark Streaming brings Spark's language-integrated API to stream processing, letting you write streaming applications the same way you write batch jobs. It supports both Java and Scala.
Kafka Streams is the stream processing solution which is part of Kafka. It is provided as a Java library and by that can be easily integrated with any Java application.
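Kafka Streams itself is a Java library, but its central abstraction, a stream of (key, value) records folded into a continuously updated per-key aggregate, can be shown language-neutrally. This sketch in plain Python uses invented record shapes for illustration:

```python
# Core Kafka Streams idea: fold a stream of (key, value) records into a
# continuously updated per-key aggregate (here: running count and sum),
# analogous to groupByKey().aggregate() in the Streams DSL.
def aggregate(records):
    state = {}  # key -> (count, total); in Kafka Streams this is a state store
    for key, value in records:
        count, total = state.get(key, (0, 0.0))
        state[key] = (count + 1, total + value)
    return state

stream = [("sensor-a", 21.0), ("sensor-b", 19.5), ("sensor-a", 23.0)]
print(aggregate(stream))  # {'sensor-a': (2, 44.0), 'sensor-b': (1, 19.5)}
```

In the real library, the state store is fault-tolerant and backed by a Kafka changelog topic, which is what lets the aggregation survive restarts and rebalances.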
Machine Learning with Apache Kafka in Pharma and Life Sciences | Kai Wähner
Blog Post:
https://www.kai-waehner.de/apache-kafka-event-streaming-pharmaceuticals-pharma-life-sciences-use-cases-architecture
Video Recording:
https://youtu.be/t2IH0brwGTg
AI/machine learning and the Apache Kafka ecosystem are a great combination for training, deploying and monitoring analytic models at scale in real time. They are showing up in more and more projects, but often still feel like buzzwords and hype for science projects.
See how to connect the dots!
--How are Kafka and Machine Learning related?
--How can they be combined to productionize analytic models in mission-critical and scalable real-time applications?
--We will discuss a step-by-step approach to build a scalable and reliable real-time infrastructure for drug discovery doing data integration, feature engineering, image processing, model scoring and processing orchestration.
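The step-by-step pipeline outlined above (data integration, feature engineering, model scoring) can be sketched as a simple per-event loop. The event shape, feature names, and the stand-in model (a plain threshold) are invented for illustration; in the setup described here, the scoring step would call a trained TensorFlow model:

```python
# Sketch of the streaming scoring loop: ingest one raw event from the
# pipeline, engineer features, then score a model on them.
def engineer_features(event):
    pixels = event["pixels"]
    return {
        "mean_intensity": sum(pixels) / len(pixels),
        "max_intensity": max(pixels),
    }

def score(features):
    # Stand-in for real model inference (e.g., a TensorFlow model).
    return 1.0 if features["mean_intensity"] > 0.5 else 0.0

event = {"sample_id": "s-001", "pixels": [0.2, 0.9, 0.8]}
features = engineer_features(event)
print(event["sample_id"], score(features))
```

In a Kafka deployment, this function pair would run inside a stream processor consuming raw events and producing scored results to a downstream topic.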
Use Cases:
R&D Engineering
Sales & Marketing
Manufacturing & Quality Assurance
Supply Chain
Product Monitoring & After Sales Support
VoC (Voice of Customer)
Single View Customer
Yield/Quality Optimization
Improved Drug Yield
Proactive Service Scheduling
Testing & Simulation
Drug Diversion
Process/Quality Monitoring
Inventory & Supply Chain Optimization
Proactive Service Offers
Patent Research and Analytics
Personalized Offers / Ads
EDW Offload
Supply Chain Network Design/Risk Management
Product Predictive Maintenance
Clinical Trials
Customer Segmentation
Smart Products
Serialization & e-Pedigree
Product Usage Tracking
GTM
Global Facilities
Inventory and Logistics Visibility
Warranty & Recall Management
Real-Life Use Cases & Architectures for Event Streaming with Apache Kafka | Kai Wähner
Streaming all over the World: Real-Life Use Cases & Architectures for Event Streaming with Apache Kafka.
Learn about various case studies for event streaming with Apache Kafka across industries. The talk explores architectures for real-world deployments from Audi, BMW, Disney, Generali, Paypal, Tesla, Unity, Walmart, William Hill, and more. Use cases include fraud detection, mainframe offloading, predictive maintenance, cybersecurity, edge computing, track&trace, live betting, and much more.
Kappa vs Lambda Architectures and Technology Comparison | Kai Wähner
Real-time data beats slow data. That’s true for almost every use case. Nevertheless, enterprise architects build new infrastructures with the Lambda architecture that includes separate batch and real-time layers.
Industrial production is becoming increasingly interlinked with modern information and communication technology. Building on intelligent, digitally networked systems, largely self-organized production becomes possible. In Industrie 4.0, people, machinery, plants, logistics and products communicate and cooperate directly. To connect these different strands, a unified, flexible, high-performance system is needed to provide a company-wide, real-time information flow.
To target these issues, we developed enterprise:inmation.
It securely and efficiently gathers data from manufacturing, process control and IT systems all around the globe, contextualizes it and transforms it into actionable information, which is presented to every decision-maker on any device, anytime, at any location.
Software made by industrial system integration pros, in close cooperation with industry leaders. Business performance in real time, anytime, anywhere, for all decision-makers – that is enterprise:inmation.
Processing Real-Time Data at Scale: A streaming platform as a central nervous...confluent
(Marcus Urbatschek, Confluent)
Presentation during Confluent’s streaming event in Munich.
Industry 4.0: Building the Unified Namespace with Confluent, HiveMQ and Spark... | Confluent
Our talk will explore the transformative impact of integrating Confluent, HiveMQ, and SparkPlug in Industry 4.0, emphasizing the creation of a Unified Namespace.
In addition to the creation of a Unified Namespace, our webinar will also delve into Stream Governance and Scaling, highlighting how these aspects are crucial for managing complex data flows and ensuring robust, scalable IIoT-Platforms.
You will learn how to ensure data accuracy and reliability, expand your data processing capabilities, and optimize your data management processes.
Don't miss out on this opportunity to learn from industry experts and take your business to the next level.
GARE du MIDIH – MIDIH, towards a flexible, modular and open source reference ... | MIDIH_EU
The MIDIH approach defines and implements a data-driven, open source and standards-based I4.0 reference architecture for pan-European DIHs, allowing manufacturing companies to ride the wave of industry digitization and providing flexibility and agility for developers and systems integrators.
Hybrid Cloud
Multi-Cloud
Serverless Computing
Data Containers
Artificial Intelligence Platforms
Service mesh
Immutable Infrastructure Focused On Containers
The Internet of Things (IoT)
Cloudlet
Cloud Security
Backup and Disaster Recovery (DR)
This webinar will focus on IoTView, InduSoft’s IoT and IIoT platform agnostic solution for creating HMIs for IoT devices and intelligent systems. In this webinar we’ll learn more about the capabilities of InduSoft IoTView, and how it can be embedded in end point devices such as pumps, motion control, valves, power monitors, and other controllers to create robust IIoT solutions.
Cindy Xing is a Principal Software Dev Lead at Microsoft, with over 15 years of experience building and delivering large-scale distributed software systems.
Her talk during the Data Science Conference will focus on edge computing. Edge computing is the practice of processing data near the edge of your network, where the data is being generated, instead of in a centralized data-processing warehouse.
Best Practices for Streaming IoT Data with MQTT and Apache Kafka | Kai Wähner
Organizations today are looking to stream IoT data to Apache Kafka. However, connecting tens of thousands or even millions of devices over unreliable networks can create some architecture challenges. In this session, we will identify and demo some best practices for implementing a large scale IoT system that can stream MQTT messages to Apache Kafka.
We use HiveMQ as an open source MQTT broker to ingest data from IoT devices, stream the data in real time into an Apache Kafka cluster for preprocessing (using Kafka Streams / KSQL), and perform model training + inference (using TensorFlow 2.0 and its TensorFlow I/O Kafka plugin).
We leverage additional enterprise components from HiveMQ and Confluent to allow easy operations, scalability and monitoring.
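The preprocessing step between MQTT ingestion and model inference often amounts to parsing the device payload and smoothing readings over a small window before forwarding them. A minimal sketch of that stage follows; the payload field, window size, and rounding are invented for illustration:

```python
# Sketch of MQTT-to-Kafka preprocessing: parse each JSON payload and
# emit a moving average over a fixed-size window of recent readings.
import json
from collections import deque

WINDOW = 3
window = deque(maxlen=WINDOW)

def on_mqtt_message(payload):
    """Handle one MQTT payload; return a windowed average once full."""
    reading = json.loads(payload)["temperature"]
    window.append(reading)
    if len(window) == WINDOW:
        # This value would be produced to a Kafka topic downstream.
        return round(sum(window) / WINDOW, 2)
    return None

msgs = ['{"temperature": 20.0}', '{"temperature": 22.0}',
        '{"temperature": 21.0}']
results = [on_mqtt_message(m) for m in msgs]
print(results)  # [None, None, 21.0]
```

In the architecture described above, this logic would live in a Kafka Streams / KSQL windowed aggregation rather than in the MQTT callback itself, keeping the broker bridge thin.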
Modernizing the Manufacturing Industry with Kafka and MQTT | Dominik Obermaier
Industry 4.0 and smart manufacturing are driving the manufacturing industry to modernize its software infrastructure. The current infrastructure is costly to maintain, creates barriers to sharing data, is difficult to integrate with other systems, and keeps corporations from the new opportunities promised by Industry 4.0. A key challenge in modernizing a manufacturing infrastructure is how to integrate old existing systems with new modern systems. Apache Kafka and MQTT are uniquely positioned to provide the core technology to enable modernization of the manufacturing industry.
This presentation will look at the unique business drivers for modernizing the manufacturing industry and how MQTT and Kafka can help make it a reality.
Similar to IIoT / Industry 4.0 with Apache Kafka, Connect, KSQL, Apache PLC4X
Apache Kafka as Data Hub for Crypto, NFT, Metaverse (Beyond the Buzz!) | Kai Wähner
Decentralized finance with crypto and NFTs is a huge topic these days. It becomes a powerful combination with the coming metaverse platforms across industries. This session explores the relationship between crypto technologies and modern enterprise architecture.
I discuss how data streaming and Apache Kafka help build innovation and scalable real-time applications of a future metaverse. Let's skip the buzz (and NFT bubble) and instead review existing real-world deployments in the crypto and blockchain world powered by Kafka and its ecosystem.
Apache Kafka is the de facto standard for data streaming to process data in motion. With its significant adoption growth across all industries, I get a very valid question every week: When NOT to use Apache Kafka? What limitations does the event streaming platform have? When does Kafka simply not provide the needed capabilities? How to qualify Kafka out as it is not the right tool for the job?
This session explores the DOs and DON'Ts. Separate sections explain when to use Kafka, when NOT to use Kafka, and when to MAYBE use Kafka.
No matter if you think about open source Apache Kafka, a cloud service like Confluent Cloud, or another technology using the Kafka protocol like Redpanda or Pulsar, check out this slide deck.
A detailed article about this topic:
https://www.kai-waehner.de/blog/2022/01/04/when-not-to-use-apache-kafka/
Kafka for Live Commerce to Transform the Retail and Shopping Metaverse | Kai Wähner
Live commerce combines instant purchasing of a featured product and audience participation.
This talk explores the need for real-time data streaming with Apache Kafka between applications to enable live commerce across online stores and brick & mortar stores across regions, countries, and continents in any retail business.
The discussion covers several building blocks of a live commerce enterprise architecture, including transactional data processing, omnichannel, natural language processing, augmented reality, edge computing, and more.
The Heart of the Data Mesh Beats in Real-Time with Apache Kafka | Kai Wähner
If there were a buzzword of the hour, it would certainly be "data mesh"! This new architectural paradigm unlocks analytic data at scale and enables rapid access to an ever-growing number of distributed domain datasets for various usage scenarios.
As such, the data mesh addresses the most common weaknesses of the traditional centralized data lake or data platform architecture. And the heart of a data mesh infrastructure must be real-time, decoupled, reliable, and scalable.
This presentation explores how Apache Kafka, as an open and scalable decentralized real-time platform, can be the basis of a data mesh infrastructure and - complemented by many other data platforms like a data warehouse, data lake, and lakehouse - solve real business problems.
There is no silver bullet or single technology/product/cloud service for implementing a data mesh. The key outcome of a data mesh architecture is the ability to build data products with the right tool for the job.
A good data mesh combines data streaming technology like Apache Kafka or Confluent Cloud with cloud-native data warehouse and data lake architectures from Snowflake, Databricks, Google BigQuery, et al.
Apache Kafka vs. Cloud-native iPaaS Integration Platform Middleware | Kai Wähner
Enterprise integration is more challenging than ever before. The IT evolution requires the integration of more and more technologies. Applications are deployed across the edge, hybrid, and multi-cloud architectures. Traditional middleware such as MQ, ETL, ESB does not scale well enough or only processes data in batch instead of real-time.
This presentation explores why Apache Kafka is the new black for integration projects, how Kafka fits into the discussion around cloud-native iPaaS (Integration Platform as a Service) solutions, and why event streaming is a new software category.
A concrete real-world example shows the difference between event streaming and traditional integration platforms or cloud-native iPaaS.
Video Recording of this presentation:
https://www.youtube.com/watch?v=I8yZwKg_IJc&t=2842s
Blog post about this topic:
https://www.kai-waehner.de/blog/2021/11/03/apache-kafka-cloud-native-ipaas-versus-mq-etl-esb-middleware/
Data Warehouse vs. Data Lake vs. Data Streaming – Friends, Enemies, Frenemies? | Kai Wähner
The concepts and architectures of a data warehouse, a data lake, and data streaming are complementary to solving business problems.
Unfortunately, the underlying technologies are often misunderstood, overused for monolithic and inflexible architectures, and pitched for wrong use cases by vendors. Let’s explore this dilemma in a presentation.
The slides cover technologies such as Apache Kafka, Apache Spark, Confluent, Databricks, Snowflake, Elasticsearch, AWS Redshift, GCP with Google BigQuery, and Azure Synapse.
Serverless Kafka and Spark in a Multi-Cloud Lakehouse Architecture – Kai Wähner
Apache Kafka in conjunction with Apache Spark became the de facto standard for processing and analyzing data. Both frameworks are open, flexible, and scalable.
Unfortunately, the latter makes operations a challenge for many teams. Ideally, teams can use serverless SaaS offerings to focus on business logic. However, hybrid and multi-cloud scenarios require a cloud-native platform that provides automated and elastic tooling to reduce the operations burden.
This session explores different architectures to build serverless Apache Kafka and Apache Spark multi-cloud architectures across regions and continents.
We start from the analytics perspective of a data lake and explore its relation to a fully integrated data streaming layer with Kafka to build a modern Data Lakehouse.
Real-world use cases show the joint value and explore the benefit of the "delta lake" integration.
Resilient Real-time Data Streaming across the Edge and Hybrid Cloud with Apache Kafka – Kai Wähner
Hybrid cloud architectures are the new black for most companies. A cloud-first strategy is evident for many new enterprise architectures, but some use cases require resiliency across edge sites and multiple cloud regions. Data streaming with the Apache Kafka ecosystem is a perfect technology for building resilient and hybrid real-time applications at any scale. This talk explores different architectures and their trade-offs for transactional and analytical workloads. Real-world examples include financial services, retail, and the automotive industry.
Video recording:
https://qconlondon.com/london2022/presentation/resilient-real-time-data-streaming-across-the-edge-and-hybrid-cloud
Data Streaming with Apache Kafka in the Defence and Cybersecurity Industry – Kai Wähner
Agenda:
1) Defence, Modern Warfare, and Cybersecurity in 202X
2) Data in Motion with Apache Kafka as Defence Backbone
3) Situational Awareness
4) Threat Intelligence
5) Forensics and AI / Machine Learning
6) Air-Gapped and Zero Trust Environments
7) SIEM / SOAR Modernization
Technologies discussed in the presentation include Apache Kafka, Kafka Streams, ksqlDB, Kafka Connect, Elasticsearch, Splunk, IBM QRadar, Zeek, Netflow, PCAP, TensorFlow, AWS, Azure, GCP, Sigma, and Confluent Cloud.
Real-World Deployments of Data Streaming with Apache Kafka across the Healthcare Value Chain using open source and cloud-native technologies and serverless SaaS:
1) Legacy Modernization and Hybrid Cloud: Optum (UnitedHealth Group), Centene, Bayer
2) Streaming ETL (Bayer, Babylon Health)
3) Real-time Analytics (Cerner, Celmatix, CDC/Centers for Disease Control and Prevention)
4) Machine Learning and Data Science (Recursion, Humana)
5) Open API and Omnichannel (Care.com, Invitae)
Apache Kafka for Real-time Supply Chain in the Food and Retail Industry – Kai Wähner
Use Cases, Architectures, and Real-World Examples for data in motion and real-time event streaming powered by Apache Kafka across the supply chain and logistics. Case studies and deployments include Baader, Walmart, Migros, Albertsons, Domino's Pizza, Instacart, Grab, Royal Caribbean, and more.
Kafka for Real-Time Replication between Edge and Hybrid Cloud – Kai Wähner
Not all workloads allow cloud computing. Low latency, cybersecurity, and cost-efficiency require a suitable combination of edge computing and cloud integration.
This session explores architectures and design patterns for software and hardware considerations to deploy hybrid data streaming with Apache Kafka anywhere. A live demo shows data synchronization from the edge to the public cloud across continents with Kafka on Hivecell and Confluent Cloud.
Apache Kafka for Predictive Maintenance in Industrial IoT / Industry 4.0 – Kai Wähner
The manufacturing industry is moving away from just selling machinery, devices, and other hardware. Software and services increase revenue and margins. Equipment-as-a-Service (EaaS) even outsources the maintenance to the vendor.
This paradigm shift is only possible with reliable and scalable real-time data processing leveraging an event streaming platform such as Apache Kafka. This talk explores how Kafka-native Condition Monitoring and Predictive Maintenance help with this innovation.
More details:
https://www.kai-waehner.de/blog/2021/10/25/apache-kafka-condition-monitoring-predictive-maintenance-industrial-iot-digital-twin/
Video recording:
https://youtu.be/tfOuN5KeI9w
Apache Kafka Landscape for Automotive and Manufacturing – Kai Wähner
Today, in 2022, Apache Kafka is the central nervous system of many applications in various areas related to the automotive and manufacturing industry for processing analytical and transactional data in motion across edge, hybrid, and multi-cloud deployments.
This presentation explores the automotive event streaming landscape, including connected vehicles, smart manufacturing, supply chain optimization, aftersales, mobility services, and innovative new business models.
Afterwards, many real-world examples are shown from companies such as Audi, BMW, Porsche, Tesla, Uber, Grab, and FREENOW.
More detail in the blog post:
https://www.kai-waehner.de/blog/2022/01/12/apache-kafka-landscape-for-automotive-and-manufacturing/
Event Streaming CTO Roundtable for Cloud-native Kafka Architectures – Kai Wähner
Technical thought leadership presentation to discuss how leading organizations move to real-time architecture to support business growth and enhance customer experience. This is a forum to discuss use cases with your peers to understand how other digital-native companies are utilizing data in motion to drive competitive advantage.
Agenda:
- Data in Motion with Event Streaming and Apache Kafka
- Streaming ETL Pipelines
- IT Modernisation and Hybrid Multi-Cloud
- Customer Experience and Customer 360
- IoT and Big Data Processing
- Machine Learning and Analytics
Apache Kafka in the Public Sector (Government, National Security, Citizen Services) – Kai Wähner
The Rise of Data in Motion in the Public Sector powered by event streaming with Apache Kafka.
Citizen Services:
- Health services, e.g. hospital modernization, track & trace - Covid distance control
- Public administration - reduce bureaucracy, data democratization across government departments
- eGovernment - Efficient and digital citizen engagement, e.g. personal ID application process
Smart City:
- Smart driving, parking, buildings, environment
- Waste management
- Open exchange – e.g. mobility services (1st and 3rd party)
Energy:
- Smart grid and utilities infrastructure (energy distribution, smart home, smart meters, smart water, etc.)
National Security:
- Law enforcement, surveillance, police/interior security data exchange
- Defense and military (border control, intelligent soldier)
- Cybersecurity for situational awareness and threat intelligence
Telco 4.0 - Payment and FinServ Integration for Data in Motion with 5G and Apache Kafka – Kai Wähner
The Era of Telco 4.0: Embracing Digital Transformation with Data in Motion. Learn about Payment and FinServ Integration for Data in Motion with 5G and Apache Kafka.
1) The rise of Telco 4.0 and the future forward
2) Data in Motion in the Telco industry
3) Real-world Fintech and Payment examples powered by Data in Motion
Apache Kafka in Transportation and Logistics – Kai Wähner
Event Streaming with Apache Kafka in Transportation and Logistics.
Track & Trace, Real-time Locating System, Customer 360, Open API, and more…
Examples include Swiss Post, SBB, Deutsche Bahn, Hermes, Migros, Here Technologies, Otonomo, Lyft, Uber, Free Now, Lufthansa, Air France, Singapore Airlines, Amadeus Group, and more.
Apache Kafka for Cybersecurity and SIEM / SOAR Modernization – Kai Wähner
Data in Motion powered by the Apache Kafka ecosystem for Situational Awareness, Threat Detection, Forensics, Zero Trust Zones and Air-Gapped Environments.
Agenda:
1) Cybersecurity in 202X
2) Data in Motion as Cybersecurity Backbone
3) Situational Awareness
4) Threat Intelligence
5) Forensics
6) Air-Gapped and Zero Trust Environments
7) SIEM / SOAR Modernization
More details in the "Kafka for Cybersecurity" blog series:
https://www.kai-waehner.de/blog/2021/07/02/kafka-cybersecurity-siem-soar-part-1-of-6-data-in-motion-as-backbone/
Serverless Kafka on AWS as Part of a Cloud-native Data Lake Architecture – Kai Wähner
AWS Data Lake / Lake House + Confluent Cloud for Serverless Apache Kafka. Learn about use cases, architectures, and features.
Data must be continuously collected, processed, and reactively used in applications across the entire enterprise - some in real time, some in batch mode. In other words: As an enterprise becomes increasingly software-defined, it needs a data platform designed primarily for "data in motion" rather than "data at rest."
Apache Kafka is now mainstream when it comes to data in motion! The Kafka API has become the de facto standard for event-driven architectures and event streaming. Unfortunately, the cost of running it yourself is very often too expensive when you add factors like scaling, administration, support, security, creating connectors...and everything else that goes with it. Resources in enterprises are scarce: this applies to both the best team members and the budget.
The cloud - as we all know - offers the perfect solution to such challenges.
Most likely, fully-managed cloud services such as AWS S3, DynamoDB or Redshift are already in use. Now it is time to implement "fully-managed" for Kafka as well - with Confluent Cloud on AWS.
- Building a central integration layer that doesn't care where or how much data is coming from
- Implementing scalable data stream processing to gain real-time insights
- Leveraging fully managed connectors (like S3, Redshift, Kinesis, MongoDB Atlas & more) to quickly access data
Confluent Cloud in action? Let's show how ao.com made it happen!
IIoT / Industry 4.0 with Apache Kafka, Connect, KSQL, Apache PLC4X
1. Confidential
Flexible and Scalable Integration in
Automation Industry / Industrial IoT
Kai Waehner
Technology Evangelist
contact@kai-waehner.de
LinkedIn
@KaiWaehner
www.confluent.io
www.kai-waehner.de
Kafka-Native End-to-End IIoT Data Integration and Processing
with Kafka Connect, KSQL and Apache PLC4X
2. Agenda
1) Modern IIoT Use Cases around Cloud, Big Data, Machine Learning
2) Automation Industry and its Challenges
3) Architecture for End-to-End Integration from Edge to Data Center / Cloud
4) Apache Kafka as Event Streaming Platform
5) Apache PLC4X for Edge Integration
6) Example: Supply Chain Optimization at Scale in Real Time
3. Agenda
1) Modern IIoT Use Cases around Cloud, Big Data, Machine Learning
2) Automation Industry and its Challenges
3) Architecture for End-to-End Integration from Edge to Data Center / Cloud
4) Apache Kafka as Event Streaming Platform
5) Apache PLC4X for Edge Integration
6) Example: Supply Chain Optimization at Scale in Real Time
4. Business Digitalization Trends are Driving the Need to Process Events at a whole new Scale, Speed and Efficiency
Mobile Cloud Microservices Internet of Things Machine Learning
The world has changed!
6. Some IIoT use cases
Analytics
• Ingest data into cloud for analytics
• Reduce cost: Leverage open frameworks instead of paying very expensive licenses per machine
• Flexible integration (select data to ingest, flexible changes over time)
• Machine Learning / Data Science
Manufacturing
• Collect data from machines → Preprocess + monitoring to optimize assembly line and reduce cost
• Aggregate data from different machines / companies → Leverage (and sell?) insights
• Sell services on top of machines → Predictive maintenance (remote)
• Scale up (add more sites, add more data)
Production Robots
• Ingest, process and monitor large volumes of data (where the proprietary monolith does not scale)
Smart Factories
• Monitor and manage the whole factory (at scale, in real time, flexible)
• Integration with legacy proprietary protocols and modern cloud-native technologies
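The preprocessing and monitoring use cases above boil down to continuously filtering sensor events at the edge. Below is a minimal pure-Python sketch of that idea; the field names and thresholds are invented for illustration, and a real deployment would express such logic in Kafka Streams or KSQL rather than a plain loop.

```python
# Minimal edge-preprocessing sketch: continuously filter a stream of
# sensor readings and flag values outside an allowed temperature band.
# Field names and thresholds are invented for illustration only.

def flag_anomalies(readings, low=10.0, high=80.0):
    """Yield (machine_id, temperature) for out-of-band readings."""
    for reading in readings:
        temp = reading["temperature"]
        if temp < low or temp > high:
            yield (reading["machine_id"], temp)

readings = [
    {"machine_id": "m1", "temperature": 72.5},
    {"machine_id": "m2", "temperature": 95.1},  # too hot
    {"machine_id": "m1", "temperature": 4.2},   # too cold
]
alerts = list(flag_anomalies(readings))  # [("m2", 95.1), ("m1", 4.2)]
```

Only the out-of-band readings survive the filter, which is exactly the "select data to ingest" flexibility mentioned above.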
7. Agenda
1) Modern IIoT Use Cases around Cloud, Big Data, Machine Learning
2) Automation Industry and its Challenges
3) Architecture for End-to-End Integration from Edge to Data Center / Cloud
4) Apache Kafka as Event Streaming Platform
5) Apache PLC4X for Edge Integration
6) Example: Supply Chain Optimization at Scale in Real Time
8. History of Automation Industry vs. Big Data and Cloud
Christofer Dutz (codecentric)
https://foss-backstage.de/sites/foss-backstage.de/files/2018-07/Revolutionizing%20Industrial%20IoT%20with%20Apache%20PLC4X.pdf
9. Challenges in Automation Industry
IoT != IIoT
• IoT = Connected cars, smart home, … → Large scale, secure, scalable, open, modern technologies
• IIoT = Slow, insecure, not scalable, proprietary
Legacy / Proprietary IIoT Technologies
• Usually incompatible protocols, typically proprietary
• Usually serial connections (very low latency, nanoseconds) - with TCP / UDP wrapper around it to integrate with the “external world”
• Siemens S7, Modbus, Beckhoff, Profinet, Allen Bradley, etc.
• OPC-UA (requires machine update + license cost)
Product Lifecycles
• Long lifecycle (tens of years)
• Factories cost millions, no simple changes / upgrades
• Still using Windows 7 without Service Packs => Usability and security issues
• Mantra: “Stay with your well known vendor forever”
10. Challenges in Automation Industry
Monoliths
• No scalability
• No extendibility
• No real failover (start your backup machine)
Missing Security Capabilities
• Security in software development == Authentication, Authorization, Antivirus, SSL, SASL, Kerberos
• Security in automation industry == Safety: “if you press the red button, the machine stops immediately”
• Insecure by nature => No Authentication / Authorization / Encryption
• Mantra: “Our factory building and network is secure, no access from outside”
• Contradicts “move to cloud and big data analytics”
11. PLC (Programmable Logic Controller)
• Started early 70’s
• Control of manufacturing processes
• Small grey box
• ~100 messages per second, stored to CSV file, Windows Share
• Limited operations: Read (90+%), Write, Subscribe, Call Functions, List Resources
• High reliability control, ease of programming and process fault diagnosis
• Hardwire → softwire
• Has Input / Sensors, Output / Actors
• Firmware (= operating system)
• Mechanism to load user programs
• Highly fragmented market
• S7 (Siemens), Beckhoff ADS, Modbus (Asia), Ethernet/IP, KNX, Emerson DeltaV, Profinet, Allen Bradley, etc.
• State of the art in automation industry
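PLC4X talks to these fragmented PLC protocols through JDBC-like connection strings (e.g. "s7://192.168.0.1") plus field addresses such as "%DB1.DBD0:REAL" or "%Q0.4:BOOL". The hypothetical helper below parses the two most common S7 address shapes purely to illustrate the syntax; the real PLC4X address grammar covers many more cases.

```python
import re

# Hypothetical parser for PLC4X-style S7 field addresses, e.g.
# "%DB1.DBD0:REAL" (data-block field) or "%Q0.4:BOOL" (output bit).
# Illustration only; the real PLC4X grammar is far more complete.

S7_ADDRESS = re.compile(
    r"^%(?:DB(?P<db>\d+)\.DB(?P<width>[XBWD])(?P<byte>\d+)"  # data-block field
    r"|(?P<area>[IQM])(?P<abyte>\d+)\.(?P<bit>\d+))"         # input/output/memory bit
    r":(?P<type>[A-Z]+)$"
)

def parse_s7_address(addr):
    """Return the named parts of an S7-style field address."""
    match = S7_ADDRESS.match(addr)
    if match is None:
        raise ValueError(f"unrecognized S7 address: {addr}")
    return {k: v for k, v in match.groupdict().items() if v is not None}
```

For example, parse_s7_address("%DB1.DBD0:REAL") splits the address into data block 1, a double-word (D) at byte 0, typed as REAL.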
12. Example: Siemens S7 Communication
When communicating with S7 devices, there is a whole family of protocols that can be used. In general, you can divide them into Profinet protocols and S7 Comm protocols. The latter are far simpler in structure, but also far less documented. The S7 Comm protocols are generally split up into two flavors: the classic S7 Comm and a newer version unofficially called S7 Comm Plus.
https://plc4x.apache.org/protocols/s7/index.html
13. Trends: ~50% of industrial assets in factories will be connected by 2020
https://iot-analytics.com/5-industrial-connectivity-trends-driving-the-it-ot-convergence
14. Trends: Evolution of Convergence between IT and Industrial Automation
https://iot-analytics.com/5-industrial-connectivity-trends-driving-the-it-ot-convergence
15. How to get from legacy, proprietary technologies to cloud, big data, and machine learning?
16. Costly and inflexible legacy Integration between IIoT and other Systems
(Diagram: each PLC vendor – e.g. Siemens S7, Schneider Electric Modbus – is wired to the rest of the IT landscape through its own proprietary integration middleware monolith.)
18. Agenda
1) Modern IIoT Use Cases around Cloud, Big Data, Machine Learning
2) Automation Industry and its Challenges
3) Architecture for End-to-End Integration from Edge to Data Center / Cloud
4) Apache Kafka as Event Streaming Platform
5) Apache PLC4X for Edge Integration
6) Example: Supply Chain Optimization at Scale in Real Time
19. IIoT Architecture (High Level)
(Diagram: machine sensors feed a gateway at the edge; Kafka Connect with an MQTT connector ingests the device data into the Kafka brokers / streaming platform in the data center or cloud, which serves sensor analytics (real time), predictive maintenance (near real time), and machine learning (batch).)
How to integrate and process data at scale and reliably?
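Once the data is in Kafka, the streaming platform typically pre-processes it with continuous windowed aggregations. In KSQL that is roughly `SELECT machine_id, AVG(temperature) FROM readings WINDOW TUMBLING (SIZE 60 SECONDS) GROUP BY machine_id;`. The pure-Python sketch below computes the same tumbling-window average to make the semantics concrete; the event shape and window size are illustrative assumptions.

```python
from collections import defaultdict

# Pure-Python sketch of a KSQL-style tumbling-window average:
# group events by (window, machine) and average the values.
# Event shape (timestamp_s, machine_id, value) is an assumption.

def tumbling_avg(events, window_s=60):
    """events: iterable of (timestamp_s, machine_id, value).
    Returns {(window_start, machine_id): average}."""
    acc = defaultdict(lambda: [0.0, 0])  # (sum, count) per window/key
    for ts, machine, value in events:
        window_start = (ts // window_s) * window_s
        bucket = acc[(window_start, machine)]
        bucket[0] += value
        bucket[1] += 1
    return {key: total / count for key, (total, count) in acc.items()}

events = [(0, "m1", 70.0), (30, "m1", 74.0), (65, "m1", 80.0)]
averages = tumbling_avg(events)  # {(0, "m1"): 72.0, (60, "m1"): 80.0}
```

The first two events fall into the window starting at 0 and average to 72.0; the third opens a new window at 60.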
20. Vendor-Neutral IoT Architectures across Edge, On Premise and Multi-Cloud
On-Premise / Edge: Deploy on bare-metal, VMs, containers or Kubernetes in your datacenter with Confluent Platform and Confluent Operator (self-managed).
Public Cloud: Implement self-managed in the public cloud or adopt a fully managed service with Confluent Cloud.
Hybrid Cloud: Build a persistent bridge between datacenter and cloud with Confluent Replicator.
21. Example architecture (diagram): machine sensors connect via a PLC (proprietary interface, proprietary-based integration) or a standards-based interface to real-time edge computing running a lightweight model (“Model Lite”), and to the event streaming platform in the data center / cloud; a data lake with batch integration and batch analytics sits alongside for model training, and real-time processing can call a model server via RPC or embed the model directly.
(1) Ingest data
(2) Preprocess data at the edge
(3) Read data for optimization / analytics
(5) Deploy the optimization model
(6a) Consume machine data for real-time processing
(6b) Send all data to the data lake
(7) Detect a potential defect
(8a) Stop the machine
(8b) Alert a person (e.g. mobile app)
(9) Manual user-based analytics and reporting to find insights and improve the real-time process
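Steps (6a) through (8b) of this flow amount to scoring each incoming event with a lightweight model and acting on the score. A toy sketch of that decision step follows; the scoring rule, field name, and thresholds are invented purely for illustration and stand in for a real trained model.

```python
# Sketch of the real-time scoring step: a lightweight "Model Lite"
# scores every event; a high score stops the machine, a medium score
# alerts a person. Rule, field name, and thresholds are invented.

def score(event):
    """Toy defect score in [0, 1] derived from a vibration level."""
    return min(event["vibration"] / 100.0, 1.0)

def decide(event, stop_at=0.9, alert_at=0.7):
    """Map an event to an action, mirroring steps (7), (8a), (8b)."""
    s = score(event)
    if s >= stop_at:
        return "stop_machine"   # (8a)
    if s >= alert_at:
        return "alert_person"   # (8b)
    return "ok"
```

In practice this logic would run as a Kafka Streams / KSQL application, with the stop and alert actions published to dedicated Kafka topics.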
22. Agenda
1) Modern IIoT Use Cases around Cloud, Big Data, Machine Learning
2) Automation Industry and its Challenges
3) Architecture for End-to-End Integration from Edge to Data Center / Cloud
4) Apache Kafka as Event Streaming Platform
5) Apache PLC4X for Edge Integration
6) Example: Supply Chain Optimization at Scale in Real Time
23. 23
The beginning of a new Era
https://engineering.linkedin.com/distributed-systems/log-what-every-software-engineer-should-know-about-real-time-datas-unifying
The first use case. This is why Kafka was created!
25. 25
● Global-scale
● Real-time
● Persistent Storage
● Stream Processing
[Diagram] Apache Kafka spans edge, datacenter and cloud, connecting databases, data lakes, IoT devices, mobile, SaaS apps, microservices and machine learning.
Apache Kafka: The De-facto Standard for Real-Time Event Streaming
26. 26
Apache Kafka at Scale at Tech Giants
> 4.5 trillion messages / day | > 6 petabytes / day
“You name it”
* Kafka is not just used by tech giants
** Kafka is not just used for big data
27. 27
Confluent - Business Value per Use Case
Key drivers of business value: Increase Revenue (make money), Decrease Costs (save money), Mitigate Risk (protect money), plus Improve Customer Experience (CX), Core Business Platform, Increase Operational Efficiency and Migrate to Cloud.
Strategic objectives (sample): fraud detection, IoT sensor ingestion, digital replatforming / mainframe offload, customer 360, faster transactional processing / analysis incl. machine learning / AI, microservices architecture, online fraud detection, online security (syslog, log aggregation, Splunk replacement), middleware replacement, regulatory, digital transformation, website / core operations (central nervous system), real-time app updates.
Example case studies (of many): connected car, navigation and improved in-car experience (Audi); simplifying omni-channel retail at scale (Target); mainframe offload (RBC); application modernization (multiple examples); the [Silicon Valley] digital natives (LinkedIn, Netflix, Uber, Yelp...); predictive maintenance (Audi); streaming platform in a regulated environment, e.g. electronic medical records (Celmatix); real-time streaming platform for communications and beyond (Capital One); developer velocity, building stateful financial applications with Kafka Streams (Funding Circle); detect and prevent fraud in real time (PayPal); Kafka as a Service, a tale of security and multi-tenancy (Apple).
34. 34
Kafka Streams
● No separate processing cluster required
● Develop on Mac, Linux, Windows
● Deploy to containers, VMs, bare metal, cloud
● Powered by Kafka: elastic, scalable, distributed,
battle-tested
● Perfect for small, medium, large use cases
● Fully integrated with Kafka security
● Exactly-once processing semantics
● Part of Apache Kafka
KStream<User, PageViewEvent> pageViews = builder.stream("pageviews-topic");
KTable<Windowed<User>, Long> viewsPerUserSession = pageViews
.groupByKey()
.count(SessionWindows.with(TimeUnit.MINUTES.toMillis(5)), "session-views");
https://docs.confluent.io/current/streams/
Write standard Java apps and microservices
to process your data in real-time
35. 35
KSQL: Enable Stream Processing using SQL-like Semantics
Leverage Kafka Streams API
using simple SQL commands
[Diagram] The KSQL server contains the engine (runs queries) and a REST API, and talks to the Kafka cluster; clients include the CLI and the Confluent Control Center GUI.
Use any programming language.
Connect via the Control Center UI, CLI or REST, or deploy in headless mode.
36. 36
Confluent KSQL: the streaming SQL engine for Apache Kafka (confluent.io/product/ksql). You write only SQL. No Java, Python, or other boilerplate to wrap around it! (Kafka Streams, in contrast, is the Apache Kafka library to write real-time applications and microservices in Java and Scala.)

CREATE STREAM fraudulent_payments AS
SELECT * FROM payments
WHERE fraudProbability > 0.8;
Event Transformation with Stream Processing
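The same SQL-only approach applies directly to streaming IoT data. A minimal sketch, assuming sensor readings arrive in a machine_sensors stream (the stream and field names are hypothetical):

```sql
-- Continuously flag readings above a temperature threshold
CREATE STREAM overheating_machines AS
  SELECT machine_id, temperature, event_time
  FROM machine_sensors
  WHERE temperature > 90.0;
```

The resulting stream is just another Kafka topic, so downstream consumers (alerting apps, sink connectors) can subscribe to it like any other.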
37. 37
Kafka Connect
● Centralized management and configuration
● Support for hundreds of technologies including
RDBMS, Elasticsearch, HDFS, S3
● Supports CDC ingest of events from RDBMS
● Preserves data schema
● Fault tolerant and automatically load balanced
● Extensible API
● Single Message Transforms
● Part of Apache Kafka
{
  "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
  "connection.url": "jdbc:mysql://localhost:3306/demo?user=rmoff&password=foo",
  "table.whitelist": "sales,orders,customers"
}
https://docs.confluent.io/current/connect/
Reliable and scalable integration of Kafka with other systems
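The same configuration-only approach covers IoT ingestion. As a sketch, assuming Confluent's MQTT source connector is installed (the class name, broker URI and topic names are placeholders to verify against the connector's documentation):

```json
{
  "name": "mqtt-machine-source",
  "connector.class": "io.confluent.connect.mqtt.MqttSourceConnector",
  "mqtt.server.uri": "tcp://mqtt-broker:1883",
  "mqtt.topics": "machines/+/telemetry",
  "kafka.topic": "machine-telemetry"
}
```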
38. 38
Connect External Data Sources and Sinks with Connectors
[Diagram] Source and sink connector logos, including CDC connectors for relational databases.
Connectors developed and supported by Confluent, partners and the open source community available on
confluent.io/hub
39. 39
IoT Integration with Kafka Connect, MQTT and REST Proxy
Video and Slides:
https://www.confluent.io/kafka-summit-sf18/processing-iot-data-from-end-to-end
40. 40
Native, decoupled Integration between IIoT and other Systems
[Diagram] Many machines, each speaking Siemens S7 or Modbus, connected through separate per-protocol Kafka Connect instances. How can all of these proprietary protocols be integrated natively and in a decoupled way?
41. 41
Agenda
1) Modern IIoT Use Cases around Cloud, Big Data, Machine Learning
2) Automation Industry and its Challenges
3) Architecture for End-to-End Integration from Edge to Data Center / Cloud
4) Apache Kafka as Event Streaming Platform
5) Apache PLC4X for Edge Integration
6) Example: Supply Chain Optimization at Scale in Real Time
42. 42
Apache PLC4X
• Top Level Apache project
• PLC 4 (for) X (anything)
• Goal: Open up PLC interfaces to outside world
• Vertical integration
• Write software independent of PLC
• JDBC-like Adapters for various protocols
https://plc4x.apache.org/
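PLC4X also provides a source connector for Kafka Connect, so PLC data can be ingested with configuration alone. A rough sketch (the connector class, property names and the S7 address syntax vary by PLC4X version and should be checked against the PLC4X documentation):

```json
{
  "name": "plc4x-s7-source",
  "connector.class": "org.apache.plc4x.kafka.Plc4xSourceConnector",
  "url": "s7://192.168.1.100/1/1",
  "queries": "%Q0.4:BOOL,%I0.2:BOOL",
  "topic": "machine-data"
}
```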
44. 44
Native, decoupled Integration between IIoT and other Systems
[Diagram] A single Kafka Connect layer now integrates all the Siemens S7 and Modbus machines natively with Kafka.
45. 45
One more thing → PLC4X vs. OPC-UA
OPC-UA:
• Open standard
• All the pros and cons of an open standard (works with different vendors; slow adoption; inflexible, etc.)
• Often poorly implemented
• Requires an app server on top of the PLC
• Every device has to be retrofitted with the ability to speak a new protocol and use a common client to speak with these devices
• Often overengineered for just reading the data
• Activating OPC-UA support on existing PLCs greatly increases the load on the PLCs
• Licensing costs for every machine

PLC4X:
• Open source framework (Apache 2.0 license)
• Provides a unified API by implementing drivers for communicating with most industrial controllers in the protocols they natively understand
• No need to modify existing hardware
• No increased load on the PLCs
• No need to pay for licenses to activate OPC-UA support
• Drivers implemented from the specs or by reverse engineering protocols in order to be fully Apache 2.0 licensed
• PLC4X adapter for OPC-UA available -> both can be used together!
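The "unified API" argument becomes tangible in PLC4X connection strings: only the URL changes between protocols, while the read/write API stays identical. Illustrative examples (hosts are placeholders; the exact per-driver syntax is documented on plc4x.apache.org):

```
s7://192.168.1.100/1/1          Siemens S7
modbus://192.168.1.101          Modbus
opcua:tcp://192.168.1.102:4840  OPC-UA (via the PLC4X OPC-UA adapter)
```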
46. 46
Agenda
1) Modern IIoT Use Cases around Cloud, Big Data, Machine Learning
2) Automation Industry and its Challenges
3) Architecture for End-to-End Integration from Edge to Data Center / Cloud
4) Apache Kafka as Event Streaming Platform
5) Apache PLC4X for Edge Integration
6) Example: Supply Chain Optimization at Scale in Real Time
47. [Diagram: The same architecture implemented with the Kafka ecosystem and TensorFlow]
PLCs (Beckhoff, S7, Modbus, Allen Bradley, OPC-UA) are integrated via the PLC4X connector for Kafka Connect (proprietary interfaces); machine sensors via standard interfaces (File, HTTP, MQTT, ROS). The flow:
(1) Ingest data into real-time edge computing (C / librdkafka, TensorFlow Lite)
(2) Preprocess data at the edge
(3) Read data into the Kafka cluster
(4) Train the model (TensorFlow I/O, TensorFlow, Spark and Jupyter notebooks)
(5) Deploy the model
(6a) A real-time Kafka Streams application (Java / Scala) consumes machine data; (6b) all data goes to other components via Kafka Connect and KSQL
(7) A potential defect is detected (TensorFlow Serving via HTTP / gRPC)
(8a) Stop the machine; (8b) alert a person (e.g. mobile app)
(9) Manual user-based analytics and reporting to find insights and improve the real-time process
49. [Diagram: Supply chain optimization flow]
Planners forecast the long-term schedule → production begins → IoT data arrives from production (inventories, manufacturing machines, yield metrics) → production forecast → forecasted-production-vs-plan diffs → re-optimize the plan based on actuals → change orders to the supply chain (inventory, manufacturing schedules) → change operational characteristics (e.g. plant 223 needs a new Al extruder) → customer delivery SLAs: actuals vs. plan.
Streaming analytics using Confluent; batch analytics using other frameworks; physical operations, each step with its own UI.
(Reference use case implemented with our partner Expero)
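The "forecasted production vs. plan diffs" step, for example, maps naturally onto a windowed streaming aggregation. A hypothetical KSQL sketch (topic, field and window parameters invented for illustration):

```sql
-- Hourly difference between actual and planned output per plant
CREATE TABLE plan_diffs AS
  SELECT plant_id,
         SUM(actual_units) - SUM(planned_units) AS diff
  FROM production_events
  WINDOW TUMBLING (SIZE 1 HOUR)
  GROUP BY plant_id;
```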
50. [Diagram: The same supply chain flow, mapped to technologies]
Machine sensors are ingested via Kafka Connect with the PLC4X connector into the Kafka cluster. KSQL, TensorFlow (with TensorFlow Serving) and a real-time Kafka app implement the streaming analytics steps; Kafka Connect feeds Spark and Jupyter notebooks for the batch analytics steps.
Streaming analytics using Confluent; batch analytics using other frameworks; physical operations.
(Reference use case implemented with our partner Expero)
51. 51
Supply Chain Optimization in Real Time at Scale
Slides and Video Recording:
http://www.kai-waehner.de/blog/2019/08/23/apache-kafka-machine-learning-for-real-time-supply-chain-iiot-opcua-modbus/
53. 53
Confluent Platform
The Event Streaming Platform Built by the Original Creators of Apache Kafka®
[Diagram] Confluent Platform builds on Apache Kafka, adding operations and security plus development and stream processing capabilities, backed by support, services, training and partners. It targets mission-critical reliability, a complete event streaming platform and freedom of choice: self-managed software in the datacenter or public cloud, or Confluent Cloud as a fully managed service.
54. 56
Confluent Platform – Benefits for IoT Projects
• Based on open source and de facto standards for IoT projects
• Low license / subscription costs for Confluent support / services / training (compared to traditional IoT vendors + their products)
• Spend budget for consulting to realize the project successfully
• Mission critical deployments at large scale in various industries
• Automotive, Manufacturing, Logistics, Oil&Gas, Retail, Telco, …
• Flexible architecture
• Lightweight infrastructure footprint on commodity hardware
• Pick what you need
• Deploy where you want
• Complementary to other frameworks, technologies (e.g. Siemens MindSphere, Cisco Kinetic) and cloud services (e.g. Google Cloud IoT)
• Customize and build for the specific customer use case
• Battle-tested at large scale
• Event Streaming Platform for real time integration and processing (plus integration to batch, file and other communication protocols)
• Security and reliability as core concepts
• Elastic scalability, start small and grow to extreme scale easily
• Partner (open source) technologies for specific integrations (like HiveMQ or PLC4X)
• Integration with any legacy and modern technology
• IoT standards like MQTT or OPC-UA
• Legacy and proprietary IIoT protocols like Modbus, Siemens S7, Beckhoff, Allen Bradley, etc.
• Modern technologies like S3, HDFS, MongoDB, etc.
• Modern applications (business services like Salesforce and IoT solutions like Siemens MindSphere)
55. 57
Confluent and IoT Platform Solutions
[Diagram] Kafka Connect with the PLC4X connector ingests PLC data (Beckhoff, S7, Modbus, OPC-UA, "you-name-it") and machine sensor data (File, HTTP, MQTT, ROS) into the Kafka cluster; KSQL processes it in real time, and the data is forwarded to IoT platform solutions like Siemens MindSphere or Azure IoT Hub.
Framework or solution? Or both as complementary technologies?