Supply chain optimization leveraging event streaming with Apache Kafka. See real-world use cases and architectures from Walmart, BMW, Porsche, and other enterprises that improve Supply Chain Management (SCM) processes: automation, robustness, flexibility, real-time data, decoupling, data integration, and hybrid deployments.
Video recording: https://youtu.be/dUkgungBmPs
Blog post: https://www.kai-waehner.de/apache-kafka-supply-chain-management-scm-optimization-scor-six-sigma-real-time
Supplier Management 101: Drive Spend Toward Preferred Suppliers and Reduce Risk – SAP Ariba
Attend this session for an introduction to the new SAP Ariba Supplier Management portfolio, which includes the SAP Ariba Supplier Lifecycle and Performance and SAP Ariba Supplier Risk solutions. Fully integrated with your ERP system and procurement processes, these comprehensive tools can help you onboard, qualify, segment, and manage supplier performance while reducing supplier risk.
Interface Fact Sheets in LeanIX Enterprise Architecture Management – LeanIX GmbH
Learn more about the new Interface Fact Sheet feature. LeanIX offers an innovative software-as-a-service solution for Enterprise Architecture Management (EAM), hosted either in a public cloud or in the client’s data center.
Companies like Adidas, Axel Springer, Helvetia, RWE, Trusted Shops, and Zalando use the LeanIX Enterprise Architecture Management tool.
Free Trial: http://bit.ly/LeanIXFreeTrial
The Blueprint for Change: How the Best Are Succeeding in Transformation – MuleSoft
Learn about the most impactful patterns of success for companies that are succeeding in transformation and the common pitfalls for companies that struggle. MuleSoft's Vice President of Customer Success, Brent Grimes, will explore the best-practice operating models to support transformation and will give an overview of MuleSoft's Catalyst engagement model for getting your Application Network off the ground.
Apache Kafka for Real-time Supply Chain in the Food and Retail Industry – Kai Wähner
Use Cases, Architectures, and Real-World Examples for data in motion and real-time event streaming powered by Apache Kafka across the supply chain and logistics. Case studies and deployments include Baader, Walmart, Migros, Albertsons, Domino's Pizza, Instacart, Grab, Royal Caribbean, and more.
Financial Event Sourcing at Enterprise Scale – Confluent
For years, Rabobank has been actively investing in becoming a real-time, event-driven bank. If you are familiar with banking processes, you will understand that this is not simple. Many banking processes are implemented as batch jobs on not-so-commodity hardware, meaning that any migration effort is immense.
- Find out how Rabobank redesigned Rabo Alerts while continuing to provide a robust and stable alert system for its existing user base
- Learn how the project team balanced the need to decentralise activity with the need to keep control
- Understand how Rabobank re-invented a reliable service to meet modern customer expectations
The Top 5 Apache Kafka Use Cases and Architectures in 2022 – Kai Wähner
I see the following topics coming up more regularly in conversations with customers, prospects, and the broader Kafka community across the globe:
Kappa Architecture: Kappa goes mainstream to replace Lambda and Batch pipelines (that does not mean that there is no batch processing anymore). Examples: Kafka-powered Kappa architectures from Uber, Disney, Shopify, and Twitter.
Hyper-personalized Omnichannel: Retail and customer communication across online and offline channels becomes the new black, including context-specific upselling, recommendations, and location-based services. Examples: Omnichannel Retail and Customer 360 in Real-Time with Apache Kafka.
Multi-Cloud Deployments: Business units and IT infrastructures span across regions, continents, and cloud providers. Linking clusters for bi-directional replication of data in real-time becomes crucial for many business models. Examples: Global Kafka deployments.
Edge Analytics: Low latency requirements, cost efficiency, or security requirements enforce the deployment of (some) event streaming use cases at the far edge (i.e., outside a data center), for instance, for predictive maintenance and quality assurance on the shop floor level in smart factories. Examples: Edge analytics with Kafka.
Real-time Cybersecurity: Situational awareness and threat intelligence need to process massive data in real-time to defend against cyberattacks successfully. The many successful ransomware attacks across the globe in 2021 were a warning for most CIOs. Examples: Cybersecurity for situational awareness and threat intelligence in real-time.
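The Kappa idea in the first item above can be illustrated with a minimal sketch (plain Python as a stand-in; the in-memory list plays the role of a durable, replayable Kafka topic, and all names are illustrative): one append-only log serves both live processing and full reprocessing when the business logic changes, with no separate batch layer.

```python
# Minimal Kappa-architecture sketch: one append-only event log serves both
# real-time processing and full reprocessing (no separate batch layer).
# The in-memory list stands in for a durable, replayable Kafka topic.

from collections import defaultdict

event_log = []  # append-only log, analogous to a Kafka topic with retention


def produce(event):
    """Append an event to the log; offsets are implicit list indices."""
    event_log.append(event)


def process(events, value_fn):
    """Replay events through the current processing logic from offset 0."""
    totals = defaultdict(float)
    for e in events:
        totals[e["user"]] += value_fn(e)
    return dict(totals)


# Live traffic arrives as a stream of events.
produce({"user": "alice", "amount": 10.0})
produce({"user": "bob", "amount": 5.0})
produce({"user": "alice", "amount": 2.5})

# v1 logic: sum raw amounts.
v1 = process(event_log, lambda e: e["amount"])

# The business logic changes (v2 applies a 10% fee). In a Kappa architecture
# there is no separate batch pipeline: the same log is replayed through the
# new code, and the output is swapped once the replay catches up.
v2 = process(event_log, lambda e: e["amount"] * 0.9)

print(v1)  # {'alice': 12.5, 'bob': 5.0}
print(v2)  # {'alice': 11.25, 'bob': 4.5}
```

In a real deployment, the replay would be a new consumer group reading the retained Kafka topic from offset 0; the sketch only shows the shape of the pattern.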
Platform Strategy to Deliver Digital Experiences on Azure – WSO2
This slide deck introduces Choreo, a cloud-native internal developer platform from WSO2, a Microsoft independent software vendor (ISV) partner. It enables your developers to create, deploy, and run new digital components like APIs, microservices, and integrations in serverless mode on any Kubernetes cluster with built-in DevSecOps.
Recording: https://wso2.com/choreo/resources/webinar/platform-strategy-to-deliver-digital-experiences-on-azure/
The Fourth Industrial Revolution (also known as Industry 4.0) is the ongoing automation of traditional manufacturing and industrial practices, using modern smart technology.
Event streaming with Apache Kafka plays a major role in processing massive volumes of data in real-time in a reliable, scalable, and flexible way, integrating with various legacy and modern data sources and sinks.
In this presentation, I want to give you an overview of existing use cases for event streaming technology in a connected world, across supply chains, industries, and customer experiences that come with these interdisciplinary data intersections:
• The Automotive Industry (and it’s not only Connected Cars)
• Mobility Services across verticals (transportation, logistics, travel industry, retailing, …)
• Smart Cities (including citizen health services, communication infrastructure, …)
The characteristics and requirements of these industries and sectors are not new. They require data integration, data correlation, and real decoupling, to name just a few, but they now face massively increased volumes of data.
Real-time messaging solutions have existed for many years. Hundreds of platforms exist for data integration (including ETL and ESB tooling and specific IIoT platforms). Proprietary monoliths have monitored plants, telco networks, and other infrastructures in real-time for decades. But Kafka now combines all of the above characteristics in an open, scalable, and flexible infrastructure to operate mission-critical workloads at scale in real-time, and it is taking over the world of connecting data.
The Rise Of Event Streaming – Why Apache Kafka Changes Everything – Kai Wähner
Business digitalization trends like microservices, the Internet of Things, and machine learning are driving the need to process events at a whole new scale, speed, and efficiency. Traditional solutions like ETL/data integration or messaging were not built to serve these needs.
Today, the open source project Apache Kafka® is used by thousands of companies, including over 60% of the Fortune 100, to power and innovate their businesses by focusing their data strategies on event-driven architectures leveraging event streaming. We will discuss the market and technology changes that have given rise to Kafka and to event streaming, and we will introduce the audience to the key aspects of building an event streaming platform with Kafka. Examples of productive use cases from the automotive, manufacturing, and transportation sectors will showcase the power of event streaming.
Building Value - Understanding the TCO and ROI of Apache Kafka & Confluent – Confluent
For a product or service to be cost-effective, it must be considered good value, where the benefits are worth at least what is paid for them. But how do we measure this to prove the case? Given that value can be intangible, it can be hard to quantify and may have little relationship to cost. Added to this, the open source nature of Apache Kafka means that many companies skip building a business case for it until it has become mission-critical and demands financial and human resources.
In this presentation, Lyndon Hedderly, Team Lead of Business Value Consulting at Confluent, will cover how Confluent works with customers to measure the business value of data streaming.
Simplify Supplier Risk Management Across Your Procurement Processes - SID 51538 – SAP Ariba
Suffering from sporadic supplier due diligence and fragmented risk information? Getting burned from engaging with at-risk suppliers? You are not alone. Come learn how to simplify supplier risk management across your procurement processes. Industry experts will share their experience using the SAP Ariba Supplier Risk solution to help ensure focused risk due diligence during supplier selection, detect early warning signals, and proactively monitor and address risks for each supplier engagement.
Apache Kafka vs. Integration Middleware (MQ, ETL, ESB) – Kai Wähner
Learn the differences between an event-driven streaming platform and middleware like MQ, ETL and ESBs – including best practices and anti-patterns, but also how these concepts and tools complement each other in an enterprise architecture.
Extract-Transform-Load (ETL) is still a widely used pattern for moving data between systems via batch processing. Because of its challenges in today’s world, where real time is the new standard, many enterprises use an Enterprise Service Bus (ESB) as the integration backbone between any kind of microservice, legacy application, or cloud service, moving data via SOAP/REST web services or other technologies. Stream processing is often added as its own component in the enterprise architecture to correlate different events and implement contextual rules and stateful analytics. Using all these components introduces challenges and complexity in development and operations.
This session discusses how teams in different industries solve these challenges by building a native streaming platform from the ground up instead of using ETL and ESB tools in their architecture. This makes it possible to build and deploy independent, mission-critical real-time streaming applications and microservices. The architecture leverages distributed processing and fault tolerance with fast failover, no-downtime rolling deployments, and the ability to reprocess events, so you can recalculate output when your code changes. Integration and stream processing are still key functionality but can be realized natively in real time instead of using additional ETL, ESB, or stream processing tools.
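The decoupling argument above can be made concrete with a toy sketch (plain Python, not a Kafka client; the class and topic names are invented for illustration): producers publish to a named topic without knowing their consumers, so a new subscriber can be added later, and can even replay retained history, without touching the producer or any central ESB routing.

```python
# Toy illustration of publish/subscribe decoupling: producers never know
# their consumers, so adding a new subscriber (e.g. a fraud check) requires
# no change to the producer or to any ESB route. Plain Python stands in for
# Kafka's topic-based model; event retention mimics Kafka's log.

from collections import defaultdict


class ToyEventBus:
    def __init__(self):
        self.topics = defaultdict(list)       # topic -> retained events
        self.subscribers = defaultdict(list)  # topic -> callbacks

    def subscribe(self, topic, callback, replay=False):
        """New consumers can optionally replay retained events first."""
        if replay:
            for event in self.topics[topic]:
                callback(event)
        self.subscribers[topic].append(callback)

    def publish(self, topic, event):
        self.topics[topic].append(event)      # retain, like Kafka's log
        for callback in self.subscribers[topic]:
            callback(event)


bus = ToyEventBus()
audit, fraud = [], []

bus.subscribe("orders", audit.append)
bus.publish("orders", {"id": 1, "amount": 99.0})

# A second, independent consumer is added later and replays history first:
bus.subscribe("orders", fraud.append, replay=True)
bus.publish("orders", {"id": 2, "amount": 1500.0})

print(len(audit), len(fraud))  # 2 2
```

The design point is that the producer's code is identical before and after the fraud consumer exists; in a point-to-point ESB route, adding that consumer would mean changing the route itself.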
Augury: Real-Time Insights for the Industrial IoT – ScyllaDB
Augury stores and serves time-series features from massive streams of IoT data, both for real-time insights, and offline learning and analytics. Learn about Augury’s needs and constraints, their solution evaluation and architecture, and fundamental practices for efficient data modeling, plus get a glimpse into the next-gen architecture at Augury, with a view on time-series feature storage and serving.
Digital Transformation: How leaders meet modern customer expectations – Apigee | Google Cloud
Chet Kapoor, CEO of Apigee, presents to the Pacific Crest global technology leadership forum in Vail, Colorado, on Aug. 11, 2015: how leading companies lead digital at the C-suite level, deliver digital by making developers productive, and build digital enterprises with APIs.
Real-Life Use Cases & Architectures for Event Streaming with Apache Kafka – Kai Wähner
Streaming all over the World: Real-Life Use Cases & Architectures for Event Streaming with Apache Kafka.
Learn about various case studies for event streaming with Apache Kafka across industries. The talk explores architectures for real-world deployments from Audi, BMW, Disney, Generali, Paypal, Tesla, Unity, Walmart, William Hill, and more. Use cases include fraud detection, mainframe offloading, predictive maintenance, cybersecurity, edge computing, track & trace, live betting, and much more.
Mainframe Integration, Offloading and Replacement with Apache Kafka – Kai Wähner
Video recording of this presentation:
https://youtu.be/upWzamacOVQ
Blog post with more details:
https://www.kai-waehner.de/blog/2020/04/24/mainframe-offloading-replacement-apache-kafka-connect-ibm-db2-mq-cdc-cobol/
Mainframes are still hard at work, processing over 70 percent of the world’s most essential computing transactions every day. Very high cost, monolithic architectures, and a shortage of experts are the key challenges for mainframe applications. Time to get more innovative, even with the mainframe!
Mainframe offloading with Apache Kafka and its ecosystem can be used to keep a more modern data store in real-time sync with the mainframe. At the same time, it persists the event data on the bus to enable microservices and delivers the data to other systems such as data warehouses and search indexes.
But the final goal and ultimate vision is to replace the mainframe with new applications using modern and less costly technologies. Stand up to the dinosaur, but keep in mind that legacy migration is a journey! Kai will guide you to the next step of your company’s evolution!
You will learn:
- how to not only reduce operational expenses but also provide a path for architecture modernization, agility, and eventually mainframe replacement
- what steps some of Confluent’s customers already took, leveraging technologies like Change Data Capture (CDC) or MQ for mainframe offloading
- how an event streaming platform enables cost reduction, architecture modernization, and a combination of a mainframe with new technologies
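As a rough illustration of the offloading pattern discussed in this talk (a hypothetical toy, not a real CDC connector; the keys and record shapes are invented): change events captured from the legacy system flow onto a log, and independent consumers keep a modern store and a search index in sync.

```python
# Toy sketch of the mainframe-offloading pattern: CDC-style change events
# flow onto an event log; downstream consumers keep a modern key-value
# store and a search index in sync. All names and shapes are illustrative.

change_log = []      # stands in for a Kafka topic fed by a CDC tool
modern_store = {}    # modern data store kept in sync with the mainframe
search_index = set() # a second, independent downstream consumer


def capture_change(op, key, record=None):
    """Emit a change event, as a CDC tool would from legacy commits."""
    change_log.append({"op": op, "key": key, "record": record})


def apply_changes(events):
    """Consume change events and update both downstream systems."""
    for e in events:
        if e["op"] == "upsert":
            modern_store[e["key"]] = e["record"]
            search_index.add(e["record"]["name"])
        elif e["op"] == "delete":
            removed = modern_store.pop(e["key"], None)
            if removed:
                search_index.discard(removed["name"])


# Changes committed on the "mainframe" side are captured as events...
capture_change("upsert", "acct-1", {"name": "ACME Corp", "balance": 1200})
capture_change("upsert", "acct-2", {"name": "Globex", "balance": 300})
capture_change("delete", "acct-2")

# ...and applied to the modern systems by replaying the log.
apply_changes(change_log)
print(modern_store)  # {'acct-1': {'name': 'ACME Corp', 'balance': 1200}}
print(search_index)  # {'ACME Corp'}
```

Because the log retains every change event, further consumers (a data warehouse loader, a new microservice) can be attached later without touching the mainframe again, which is the core of the offloading argument.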
Simplify Supplier Risk Management Across Your Procurement Processes – SAP Ariba
Suffering from sporadic supplier due diligence and fragmented risk information? Getting burned from engaging with at-risk suppliers? You are not alone. Come learn how to simplify supplier risk management across your procurement processes. Industry experts will share their experience using the SAP Ariba Supplier Risk solution to help ensure focused risk due diligence during supplier selection, detect early warning signals, and proactively monitor and address risks for each supplier engagement.
Data Warehouse vs. Data Lake vs. Data Streaming – Friends, Enemies, Frenemies? – Kai Wähner
The concepts and architectures of a data warehouse, a data lake, and data streaming are complementary in solving business problems.
Unfortunately, the underlying technologies are often misunderstood, overused in monolithic and inflexible architectures, and pitched for the wrong use cases by vendors. Let’s explore this dilemma in a presentation.
The slides cover technologies such as Apache Kafka, Apache Spark, Confluent, Databricks, Snowflake, Elasticsearch, AWS Redshift, Google BigQuery on GCP, and Azure Synapse.
Latest Innovations from Workday Analytics and Planning – Workday, Inc.
Learn from our product leaders by viewing the highlights of 2020R1, including the latest from Adaptive Insights, Workday Prism Analytics, and core reporting in Workday HCM and Workday Financial Management.
Apache Kafka in the Airline, Aviation and Travel Industry – Kai Wähner
Aviation and travel are notoriously vulnerable to social, economic, and political events, as well as the ever-changing expectations of consumers. Coronavirus is just a piece of the challenge.
This presentation explores use cases, architectures, and references for Apache Kafka as event streaming technology in the aviation industry, including airlines, airports, global distribution systems (GDS), aircraft manufacturers, and more.
Examples include Lufthansa, Singapore Airlines, Air France Hop, Amadeus, and more. Technologies include Kafka, Kafka Connect, Kafka Streams, ksqlDB, Machine Learning, Cloud, and more.
Resilient Real-time Data Streaming across the Edge and Hybrid Cloud with Apache Kafka – Kai Wähner
Hybrid cloud architectures are the new black for most companies. A cloud-first strategy is evident for many new enterprise architectures, but some use cases require resiliency across edge sites and multiple cloud regions. Data streaming with the Apache Kafka ecosystem is a perfect technology for building resilient and hybrid real-time applications at any scale. This talk explores different architectures and their trade-offs for transactional and analytical workloads. Real-world examples include financial services, retail, and the automotive industry.
Video recording:
https://qconlondon.com/london2022/presentation/resilient-real-time-data-streaming-across-the-edge-and-hybrid-cloud
Financial Event Sourcing at Enterprise Scaleconfluent
For years, Rabobank has been actively investing in becoming a real-time, event-driven bank. If you are familiar with banking processes, you will understand that this is not simple. Many banking processes are implemented as batch jobs on not-so-commodity hardware, meaning that any migration effort is immense.
*Find out how Rabobank redesigned Rabo Alerts while continuing to provide a robust and stable alert system for its existing user base
*Learn how the project team managed to achieve a balance between the need to decentralise activity while not losing control
*Understand how Rabobank re-invented a reliable service to meet modern customer expectations
The Top 5 Apache Kafka Use Cases and Architectures in 2022Kai Wähner
I see the following topics coming up more regularly in conversations with customers, prospects, and the broader Kafka community across the globe:
Kappa Architecture: Kappa goes mainstream to replace Lambda and Batch pipelines (that does not mean that there is no batch processing anymore). Examples: Kafka-powered Kappa architectures from Uber, Disney, Shopify, and Twitter.
Hyper-personalized Omnichannel: Retail and customer communication across online and offline channels becomes the new black, including context-specific upselling, recommendations, and location-based services. Examples: Omnichannel Retail and Customer 360 in Real-Time with Apache Kafka.
Multi-Cloud Deployments: Business units and IT infrastructures span across regions, continents, and cloud providers. Linking clusters for bi-directional replication of data in real-time becomes crucial for many business models. Examples: Global Kafka deployments.
Edge Analytics: Low latency requirements, cost efficiency, or security requirements enforce the deployment of (some) event streaming use cases at the far edge (i.e., outside a data center), for instance, for predictive maintenance and quality assurance on the shop floor level in smart factories. Examples: Edge analytics with Kafka.
Real-time Cybersecurity: Situational awareness and threat intelligence need to process massive data in real-time to defend against cyberattacks successfully. The many successful ransomware attacks across the globe in 2021 were a warning for most CIOs. Examples: Cybersecurity for situational awareness and threat intelligence in real-time.
Platform Strategy to Deliver Digital Experiences on AzureWSO2
This slide deck introduces Choreo, a cloud native internal developer platform by Microsoft independent software vendor (ISV) Partner, WSO2. It enables your developers to create, deploy, and run new digital components like APIs, microservices, and integrations in serverless mode on any Kubernetes cluster with built-in DevSecOps.
Recording: https://wso2.com/choreo/resources/webinar/platform-strategy-to-deliver-digital-experiences-on-azure/
The Fourth Industrial Revolution (also known as Industry 4.0) is the ongoing automation of traditional manufacturing and industrial practices, using modern smart technology.
Event Streaming with Apache Kafka plays a massive role in processing massive volumes of data in real-time in a reliable, scalable, and flexible way integrating with various legacy and modern data sources and sinks.
In this presentation, I want to give you an overview of existing use cases for event streaming technology in a connected world across supply chains, industries and customer experiences that come along with these interdisciplinary data intersections:
• The Automotive Industry (and it’s not only Connected Cars)
• Mobility Services across verticals (transportation, logistics, travel industry, retailing, …)
• Smart Cities (including citizen health services, communication infrastructure, …)
All these industries and sectors do not have new characteristics and requirements. They require data integration, data correlation or real decoupling, just to name a few, but are now facing massively increased volumes of data.
Real-time messaging solutions have existed for many years. Hundreds of platforms exist for data integration (including ETL and ESB tooling or specific IIoT platforms). Proprietary monoliths monitor plants, telco networks, and other infrastructures for decades in real-time. But now, Kafka combines all the above characteristics in an open, scalable, and flexible infrastructure to operate mission-critical workloads at scale in real-time. And is taking over the world of connecting data.
The Rise Of Event Streaming – Why Apache Kafka Changes EverythingKai Wähner
Business digitalization trends like microservices, the Internet of Things or Machine Learning are driving the need to process events at a whole new scale, speed and efficiency. Traditional solutions like ETL/data integration or messaging are not build to serve these needs.
Today, the open source project Apache Kafka® is being used by thousands of companies including over 60% of the Fortune 100 to power and innovate their businesses by focusing their data strategies around event-driven architectures leveraging event streaming.We will discuss the market and technology changes that have given rise to Kafka and to Event Streaming, and we will introduce the audience to the key aspects of building an Event streaming platform with Kafka. Examples of productive use cases from the automotive, manufacturing and transportation sector will showcase the power of event streaming.
Building Value - Understanding the TCO and ROI of Apache Kafka & Confluentconfluent
For a product or service to be cost effective, it must be considered to be good value, where the benefits are worth at least what is paid for them. But how do we measure this, to prove the case? Given that value can be intangible, it can be hard to quantify and may have little relationship to cost. Added to this, the open source nature of Apache Kafka means that many companies skip the requirement to build a business case for it, until it has become mission critical and demands financial and human resources.
In this presentation, Lyndon Hedderly, Team Lead of Business Value Consulting at Confluent, will cover how Confluent works with customers to measure the business value of data streaming.
Simplify Supplier Risk Management Across Your Procurement Processes - SID 51538SAP Ariba
Suffering from sporadic supplier due diligence and fragmented risk information? Getting burned from engaging with at-risk suppliers? You are not alone. Come learn how to simplify supplier risk management across your procurement processes. Industry experts will share their experience using the SAP Ariba Supplier Risk solution to help ensure focused risk due diligence during supplier selection, detect early warning signals, and proactively monitor and address risks for each supplier engagement.
Apache Kafka vs. Integration Middleware (MQ, ETL, ESB)Kai Wähner
Learn the differences between an event-driven streaming platform and middleware like MQ, ETL and ESBs – including best practices and anti-patterns, but also how these concepts and tools complement each other in an enterprise architecture.
Extract-Transform-Load (ETL) is still a widely-used pattern to move data between different systems via batch processing. Due to its challenges in today’s world where real time is the new standard, an Enterprise Service Bus (ESB) is used in many enterprises as integration backbone between any kind of microservice, legacy application or cloud service to move data via SOAP / REST Web Services or other technologies. Stream Processing is often added as its own component in the enterprise architecture for correlation of different events to implement contextual rules and stateful analytics. Using all these components introduces challenges and complexities in development and operations.
This session discusses how teams in different industries solve these challenges by building a native streaming platform from the ground up instead of using ETL and ESB tools in their architecture. This allows to build and deploy independent, mission-critical streaming real time application and microservices. The architecture leverages distributed processing and fault-tolerance with fast failover, no-downtime rolling deployments and the ability to reprocess events, so you can recalculate output when your code changes. Integration and Stream Processing are still key functionality but can be realized in real time natively instead of using additional ETL, ESB or Stream Processing tools.
Augury: Real-Time Insights for the Industrial IoTScyllaDB
Augury stores and serves time-series features from massive streams of IoT data, both for real-time insights, and offline learning and analytics. Learn about Augury’s needs and constraints, their solution evaluation and architecture, and fundamental practices for efficient data modeling, plus get a glimpse into the next-gen architecture at Augury, with a view on time-series feature storage and serving.
Digital Transformation: How leaders meet modern customer expectationsApigee | Google Cloud
Chet Kapoor, CEO Apigee presents to the Pacific Crest global technology leadership forum in Vail, Colorado, Aug. 11 2015. How leading companies lead digital at the C-Suite level, deliver digital by making developers productive, and build digital enterprises with APIs.
Real-Life Use Cases & Architectures for Event Streaming with Apache KafkaKai Wähner
Streaming all over the World: Real-Life Use Cases & Architectures for Event Streaming with Apache Kafka.
Learn about various case studies for event streaming with Apache Kafka across industries. The talk explores architectures for real-world deployments from Audi, BMW, Disney, Generali, Paypal, Tesla, Unity, Walmart, William Hill, and more. Use cases include fraud detection, mainframe offloading, predictive maintenance, cybersecurity, edge computing, track&trace, live betting, and much more.
Mainframe Integration, Offloading and Replacement with Apache KafkaKai Wähner
Video recording of this presentation:
https://youtu.be/upWzamacOVQ
Blog post with more details:
https://www.kai-waehner.de/blog/2020/04/24/mainframe-offloading-replacement-apache-kafka-connect-ibm-db2-mq-cdc-cobol/
Mainframes are still hard at work, processing over 70 percent of the world’s most essential computing transactions every day. Very high cost, monolithic architectures, and missing experts are the key challenges for mainframe applications. Time to get more innovative, even with the mainframe!
Mainframe offloading with Apache Kafka and its ecosystem can be used to keep a more modern data store in real-time sync with the mainframe. At the same time, it is persisting the event data on the bus to enable microservices, and deliver the data to other systems such as data warehouses and search indexes.
But the final goal and ultimate vision are to replace the mainframe by new applications using modern and less costly technologies. Stand up to the dinosaur, but keep in mind that legacy migration is a journey! Kai will guide you to the next step of your company’s evolution!
You will learn:
- how to not only reduce operational expenses but provide a path for architecture modernization, agility and eventually mainframe replacement
- what steps some of Confluent’s customers already took, leveraging technologies like Change Data Capture (CDC) or MQ for mainframe offloading
- how an event streaming platform enables cost reduction, architecture modernization, and a combination of a mainframe with new technologies
Simplify Supplier Risk Management Across Your Procurement ProcessesSAP Ariba
Suffering from sporadic supplier due diligence and fragmented risk information? Getting burned from engaging with at-risk suppliers? You are not alone. Come learn how to simplify supplier risk management across your procurement processes. Industry experts will share their experience using the SAP Ariba Supplier Risk solution to help ensure focused risk due diligence during supplier selection, detect early warning signals, and proactively monitor and address risks for each supplier engagement.
Data Warehouse vs. Data Lake vs. Data Streaming – Friends, Enemies, Frenemies?Kai Wähner
The concepts and architectures of a data warehouse, a data lake, and data streaming are complementary to solving business problems.
Unfortunately, the underlying technologies are often misunderstood, overused for monolithic and inflexible architectures, and pitched for wrong use cases by vendors. Let’s explore this dilemma in a presentation.
The slides cover technologies such as Apache Kafka, Apache Spark, Confluent, Databricks, Snowflake, Elasticsearch, AWS Redshift, GCP with Google Bigquery, and Azure Synapse.
Latest Innovations from Workday Analytics and PlanningWorkday, Inc.
Learn from our product leaders through viewing the highlights of 2020R1, including the latest from Adaptive Insights, Workday Prism Analytics, and core reporting in Workday HCM and Workday Financial Management.
Apache Kafka in the Airline, Aviation and Travel IndustryKai Wähner
Aviation and travel are notoriously vulnerable to social, economic, and political events, as well as the ever-changing expectations of consumers. Coronavirus is just a piece of the challenge.
This presentation explores use cases, architectures, and references for Apache Kafka as event streaming technology in the aviation industry, including airline, airports, global distribution systems (GDS), aircraft manufacturers, and more.
Examples include Lufthansa, Singapore Airlines, Air France Hop, Amadeus, and more. Technologies include Kafka, Kafka Connect, Kafka Streams, ksqlDB, Machine Learning, Cloud, and more.
Resilient Real-time Data Streaming across the Edge and Hybrid Cloud with Apac... – Kai Wähner
Hybrid cloud architectures are the new black for most companies. A cloud-first strategy is evident for many new enterprise architectures, but some use cases require resiliency across edge sites and multiple cloud regions. Data streaming with the Apache Kafka ecosystem is a perfect technology for building resilient and hybrid real-time applications at any scale. This talk explores different architectures and their trade-offs for transactional and analytical workloads. Real-world examples include financial services, retail, and the automotive industry.
Video recording:
https://qconlondon.com/london2022/presentation/resilient-real-time-data-streaming-across-the-edge-and-hybrid-cloud
Apache Kafka and API Management / API Gateway – Friends, Enemies or Frenemies... – HostedbyConfluent
Microservices became the new black in enterprise architectures. APIs provide functions to other applications or end users. Even if your architecture uses a pattern other than microservices, such as SOA (Service-Oriented Architecture) or client-server communication, APIs are used between the different applications and end users.
Apache Kafka plays a key role in modern microservice architectures to build open, scalable, flexible, and decoupled real-time applications. API Management complements Kafka by providing a way to implement and govern the full life cycle of the APIs.
This session explores how event streaming with Apache Kafka and API Management (including API Gateway and Service Mesh technologies) complement and compete with each other depending on the use case and point of view of the project team. The session concludes exploring the vision of event streaming APIs instead of RPC calls.
Apache Kafka and API Management / API Gateway – Friends, Enemies or Frenemies? – Kai Wähner
Microservices became the new black in enterprise architectures. APIs provide functions to other applications or end users. Even if your architecture uses a pattern other than microservices, such as SOA (Service-Oriented Architecture) or client-server communication, APIs are used between the different applications and end users.
Apache Kafka plays a key role in modern microservice architectures to build open, scalable, flexible, and decoupled real-time applications. API Management complements Kafka by providing a way to implement and govern the full life cycle of the APIs.
This session explores how event streaming with Apache Kafka and API Management (including API Gateway and Service Mesh technologies) complement and compete with each other depending on the use case and point of view of the project team. The session concludes exploring the vision of event streaming APIs instead of RPC calls.
Understand how event streaming with Kafka and Confluent complements tools and frameworks such as Kong, MuleSoft, Apigee, Envoy, Istio, Linkerd, Software AG, TIBCO Mashery, IBM, Axway, etc.
A streaming API data exchange provides streaming replication between business units and companies; API Management with REST/HTTP is not appropriate for streaming data.
Top 5 Event Streaming Use Cases for 2021 with Apache Kafka – Kai Wähner
Apache Kafka and Event Streaming are two of the most relevant buzzwords in tech these days. Ever wonder what the predicted TOP 5 Event Streaming Architectures and Use Cases for 2021 are? Check out the following presentation. Learn about edge deployments, hybrid and multi-cloud architectures, service mesh-based microservices, streaming machine learning, and cybersecurity.
On-demand video recording: https://videos.confluent.io/watch/XAjxV3j8hzwCcEKoZVErUJ
The Top 5 Event Streaming Use Cases & Architectures in 2021 – confluent
Learn how companies will leverage event streaming, Apache Kafka, and Confluent in 2021 to meet the demands of a real-time market, rising regulations, customer expectations, and much more.
Apache Kafka® and Analytics in a Connected IoT World – confluent
Apache Kafka® and Analytics in a Connected IoT World, Kai Waehner, Sr. Solutions Engineer Advanced Technology Group, Confluent
https://www.meetup.com/Berlin-Apache-Kafka-Meetup-by-Confluent/events/273166575/
IoT Architectures for Apache Kafka and Event Streaming - Industry 4.0, Digita... – Kai Wähner
The Internet of Things (IoT) is getting more and more traction as valuable use cases come to light. Whether you are in healthcare, telecommunications, manufacturing, banking, or retail, to name a few industries, there is one key challenge: the integration of backend IoT data logs and applications, business services, and cloud services to process the data in real time and at scale.
In this talk, we share how Kafka has become the leading technology used throughout the business to provide real-time event streaming. Explore real-life use cases of Kafka Connect, Kafka Streams, and KSQL, independent of the deployment, be it in a private or public cloud, on premises, or at the edge.
- Audi: Connected car infrastructure
- Robert Bosch Power Tools: Track and trace of devices and people at construction areas
- Deutsche Bahn: Customer 360 for train timetable updates
- E.ON: IoT streaming platform to integrate and build smart home, smart building, and smart grid infrastructures
Apache Kafka in Financial Services - Use Cases and Architectures – Kai Wähner
The Rise of Event Streaming in Financial Services - Use Cases, Architectures and Examples powered by Apache Kafka.
The New FinServ Enterprise Reality: Every company is a software company. Innovate OR be Disrupted. Learn how Event Streaming with Apache Kafka and its ecosystem help...
More details:
https://www.kai-waehner.de/apache-kafka-financial-services-industry-banking-finserv-payment-fraud-middleware-messaging-transactions
https://www.kai-waehner.de/blog/2020/04/15/apache-kafka-machine-learning-banking-finance-industry/
https://www.kai-waehner.de/blog/2020/04/24/mainframe-offloading-replacement-apache-kafka-connect-ibm-db2-mq-cdc-cobol/
Can Apache Kafka Replace a Database? – The 2021 Update | Kai Waehner, Confluent – HostedbyConfluent
Can and should Apache Kafka replace a database? How long can and should I store data in Kafka? How can I query and process data in Kafka? These are common questions that come up more and more. This session explains the idea behind databases and different features like storage, queries, transactions, and processing to evaluate when Kafka is a good fit, and when it is not. The discussion includes different Kafka-native add-ons like Tiered Storage for long-term, cost-efficient storage, and ksqlDB as an event streaming database. The relation and trade-offs between Kafka and other databases are explored to complement each other instead of thinking about a replacement. This includes different options for pull and push-based bi-directional integration.
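The "table" idea at the center of this discussion, a compacted topic or a ksqlDB TABLE materializing the latest value per key, can be illustrated without any Kafka dependency. Below is a minimal plain-Python sketch of last-value-wins materialization with tombstone deletes; the function name and sample data are invented for illustration, not part of any Kafka API.

```python
def materialize(changelog):
    """Fold a stream of (key, value) events into a last-value-wins table.
    A value of None acts like a Kafka tombstone and deletes the key."""
    table = {}
    for key, value in changelog:
        if value is None:
            table.pop(key, None)  # tombstone: remove the key
        else:
            table[key] = value    # upsert: the latest event wins
    return table

changelog = [
    ("order-1", "CREATED"),
    ("order-2", "CREATED"),
    ("order-1", "SHIPPED"),
    ("order-2", None),  # order-2 is deleted
]

print(materialize(changelog))  # {'order-1': 'SHIPPED'}
```

This is the same duality the session describes: the full changelog is the stream, and the dictionary is the queryable state a database would hold.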
Apache Kafka for Smart Grid, Utilities and Energy Production – Kai Wähner
The energy industry is changing from system-centric to smaller-scale and distributed smart grids and microgrids. A smart grid requires a flexible, scalable, elastic, and reliable cloud-native infrastructure for real-time data integration and processing. This post explores use cases, architectures, and real-world deployments of event streaming with Apache Kafka in the energy industry to implement smart grids and real-time end-to-end integration.
Blog Post with more details:
https://www.kai-waehner.de/apache-kafka-smart-grid-energy-production-edge-iot-oil-gas-green-renewable-sensor-analytics
Connected Vehicles and V2X with Apache Kafka – Kai Wähner
This session discusses use cases leveraging the Apache Kafka open source ecosystem as a streaming platform to process IoT data.
See use cases, architectural alternatives, and a live demo of how devices connect to Kafka via MQTT. Learn how to analyze the IoT data either natively on Kafka with Kafka Streams/KSQL or on an external big data cluster such as Spark, Flink, or Elastic leveraging Kafka Connect, and how to leverage TensorFlow for machine learning.
The focus is on connected cars / connected vehicles, V2X use cases, and mobility services.
A live demo shows how to build a cloud-native IoT infrastructure on Kubernetes to connect and process streaming data from 100,000 cars in real time to do predictive maintenance at scale.
Code for the live demo on Github:
https://github.com/kaiwaehner/hivemq-mqtt-tensorflow-kafka-realtime-iot-machine-learning-training-inference
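The kind of continuous analytics the demo runs in KSQL can be sketched in plain Python without a Kafka cluster. The following is a hedged illustration of a tumbling-window average per car with a simple alert threshold; the field names, window size, and threshold are assumptions for illustration, not taken from the demo code.

```python
from collections import defaultdict

WINDOW_SECONDS = 10   # tumbling window size (assumed)
THRESHOLD = 100.0     # alert threshold (assumed)

def tumbling_window_avg(events):
    """events: iterable of (timestamp_seconds, car_id, temperature).
    Returns {(car_id, window_start): average} per tumbling window."""
    sums = defaultdict(lambda: [0.0, 0])  # (car, window) -> [sum, count]
    for ts, car, temp in events:
        window_start = int(ts // WINDOW_SECONDS) * WINDOW_SECONDS
        bucket = sums[(car, window_start)]
        bucket[0] += temp
        bucket[1] += 1
    return {key: s / n for key, (s, n) in sums.items()}

events = [
    (1, "car-1", 90.0), (4, "car-1", 110.0),    # window starting at 0
    (12, "car-1", 120.0), (13, "car-1", 130.0)  # window starting at 10
]
averages = tumbling_window_avg(events)
alerts = {k: v for k, v in averages.items() if v > THRESHOLD}
print(alerts)  # {('car-1', 10): 125.0}
```

In the real demo this logic runs continuously inside KSQL/Kafka Streams; the sketch only shows the shape of the windowed aggregation.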
Building a Secure, Tamper-Proof & Scalable Blockchain on Top of Apache Kafka ... – confluent
Apache Kafka is an open source event streaming platform. It is often used to complement or even replace existing middleware to integrate applications and build microservice architectures. Apache Kafka is already used in various projects at almost every larger company today: understood, battle-tested, highly scalable, reliable, real-time.
Blockchain is a different story. This technology gets a lot of attention in the news, especially in relation to cryptocurrencies like Bitcoin. But what is the added value for software architectures? Is blockchain just hype that adds complexity? Or will it be used by everybody in the future, like a web browser or mobile app today? And how is it related to an integration architecture and an event streaming platform?
This session explores use cases for blockchains and discusses different alternatives such as Hyperledger, Ethereum and a Kafka-native tamper-proof blockchain implementation. Different architectures are discussed to understand when blockchain really adds value and how it can be combined with the Apache Kafka ecosystem to integrate blockchain with the rest of the enterprise architecture to build a highly scalable and reliable event streaming infrastructure.
Speakers:
Kai Waehner, Technology Evangelist, Confluent
Stephen Reed, CTO, Co-Founder, AiB
Kappa vs Lambda Architectures and Technology Comparison – Kai Wähner
Real-time data beats slow data. That’s true for almost every use case. Nevertheless, enterprise architects build new infrastructures with the Lambda architecture that includes separate batch and real-time layers.
This video explores why a single real-time pipeline, called Kappa architecture, is the better fit for many enterprise architectures. Real-world examples from companies such as Disney, Shopify, Uber, and Twitter explore the benefits of Kappa but also show how batch processing fits into this discussion positively without the need for a Lambda architecture.
The main focus of the discussion is on Apache Kafka (and its ecosystem) as the de facto standard for event streaming to process data in motion (the key concept of Kappa), but the video also compares various technologies and vendors such as Confluent, Cloudera, IBM, Red Hat, Apache Flink, Apache Pulsar, AWS Kinesis, Amazon MSK, Azure Event Hubs, Google Pub/Sub, and more.
Video recording of this presentation:
https://youtu.be/j7D29eyysDw
Further reading:
https://www.kai-waehner.de/blog/2021/09/23/real-time-kappa-architecture-mainstream-replacing-batch-lambda/
https://www.kai-waehner.de/blog/2021/04/20/comparison-open-source-apache-kafka-vs-confluent-cloudera-red-hat-amazon-msk-cloud/
https://www.kai-waehner.de/blog/2021/05/09/kafka-api-de-facto-standard-event-streaming-like-amazon-s3-object-storage/
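The core Kappa argument above, one log, one processing path, and "batch" as a replay of the same log, can be sketched in a few lines of plain Python. The Log class and data below are stand-ins for a Kafka topic partition, invented purely for illustration:

```python
class Log:
    """A toy append-only log standing in for a Kafka topic partition."""
    def __init__(self):
        self.records = []

    def append(self, record):
        self.records.append(record)

    def read(self, from_offset=0):
        # Replaying from offset 0 reprocesses all history ("batch");
        # reading from the current end offset is the live stream.
        return self.records[from_offset:]

def process(records):
    """The single processing logic shared by the live and replay paths."""
    return sum(records)

log = Log()
for value in [1, 2, 3, 4]:
    log.append(value)

full_replay = process(log.read(0))  # historical reprocessing
live_tail = process(log.read(2))    # only new records
print(full_replay, live_tail)  # 10 7
```

The point of Kappa is exactly this: there is no second, Lambda-style batch codebase; batch semantics fall out of replaying the same log through the same `process` function.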
Apache Kafka vs. Cloud-native iPaaS Integration Platform Middleware – Kai Wähner
Enterprise integration is more challenging than ever before. The IT evolution requires the integration of more and more technologies. Applications are deployed across edge, hybrid, and multi-cloud architectures. Traditional middleware such as MQ, ETL, and ESB tools does not scale well enough or only processes data in batch instead of in real time.
This presentation explores why Apache Kafka is the new black for integration projects, how Kafka fits into the discussion around cloud-native iPaaS (Integration Platform as a Service) solutions, and why event streaming is a new software category.
A concrete real-world example shows the difference between event streaming and traditional integration platforms or cloud-native iPaaS.
Video Recording of this presentation:
https://www.youtube.com/watch?v=I8yZwKg_IJc&t=2842s
Blog post about this topic:
https://www.kai-waehner.de/blog/2021/11/03/apache-kafka-cloud-native-ipaas-versus-mq-etl-esb-middleware/
Apache Kafka in the Transportation and Logistics – Kai Wähner
Event streaming with Apache Kafka in the transportation and logistics industry.
Track & Trace, Real-time Locating System, Customer 360, Open API, and more…
Examples include Swiss Post, SBB, Deutsche Bahn, Hermes, Migros, Here Technologies, Otonomo, Lyft, Uber, Free Now, Lufthansa, Air France, Singapore Airlines, Amadeus Group, and more.
Deep Learning at Extreme Scale (in the Cloud) with the Apache Kafka Open Sou... – Kai Wähner
How to Build a Machine Learning Infrastructure with Kafka, Connect, Streams, KSQL, etc…
This talk shows how to build machine learning models at extreme scale and how to productionize the built models in mission-critical real-time applications by leveraging open source components in the public cloud. The session discusses the relation between TensorFlow and the Apache Kafka ecosystem - and why this is a great fit for machine learning at extreme scale.
The Machine Learning architecture includes: Kafka Connect for continuous high volume data ingestion into the public cloud, TensorFlow leveraging Deep Learning algorithms to build an analytic model on powerful GPUs, Kafka Streams for model deployment and inference in real time, and KSQL for real time analytics of predictions, alerts and model accuracy.
Sensor analytics for predictive alerting in real time is used as a real-world example from Internet of Things scenarios. A live demo shows the out-of-the-box integration and dynamic scalability of these components on Google Cloud.
Key takeaways for the audience:
• Learn how to build a Machine Learning infrastructure at extreme scale and how to productionize the built models in mission-critical real time applications
• Understand the benefits of a machine learning platform on the public cloud
• Learn about an extreme scale Machine Learning architecture around the Apache Kafka open source ecosystem including Kafka Connect, Kafka Streams and KSQL
• See a live demo for an Internet of Things use case: Sensor analytics for predictive alerting in real time
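The "model deployment and inference in real time" step above can be illustrated with a hedged sketch of the consume-predict-produce pattern. The "model" here is a trivial stand-in function, not TensorFlow, and the topics are plain lists; only the shape of the pattern (a map-then-filter over a stream of events, like a Kafka Streams topology) is shown.

```python
def model_predict(sensor_value):
    # Stand-in for a loaded TensorFlow model: returns a score in [0, 1].
    # The scaling constant is made up for illustration.
    return min(sensor_value / 200.0, 1.0)

def inference_stream(input_topic, threshold=0.8):
    """Score each event with the embedded model and emit an alert event
    when the prediction crosses the threshold, like
    stream.map(score).filter(is_alert) in a streams topology."""
    alerts = []
    for event in input_topic:
        score = model_predict(event["value"])
        if score >= threshold:
            alerts.append({"sensor": event["sensor"], "score": score})
    return alerts

events = [{"sensor": "s1", "value": 50}, {"sensor": "s2", "value": 180}]
print(inference_stream(events))
```

In the architecture the talk describes, the input and output would be Kafka topics and the model would be loaded once at startup, so inference happens inside the stream processor with no remote model-server call.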
Similar to Supply Chain Optimization with Apache Kafka
Apache Kafka as Data Hub for Crypto, NFT, Metaverse (Beyond the Buzz!) – Kai Wähner
Decentralized finance with crypto and NFTs is a huge topic these days. It becomes a powerful combination with the coming metaverse platforms across industries. This session explores the relationship between crypto technologies and modern enterprise architecture.
I discuss how data streaming and Apache Kafka help build innovation and scalable real-time applications of a future metaverse. Let's skip the buzz (and NFT bubble) and instead review existing real-world deployments in the crypto and blockchain world powered by Kafka and its ecosystem.
Apache Kafka is the de facto standard for data streaming to process data in motion. With its significant adoption growth across all industries, I get a very valid question every week: When NOT to use Apache Kafka? What limitations does the event streaming platform have? When does Kafka simply not provide the needed capabilities? How do you qualify Kafka out when it is not the right tool for the job?
This session explores the DOs and DON'Ts. Separate sections explain when to use Kafka, when NOT to use Kafka, and when to MAYBE use Kafka.
No matter if you think about open source Apache Kafka, a cloud service like Confluent Cloud, or another technology using the Kafka protocol like Redpanda or Pulsar, check out this slide deck.
A detailed article about this topic:
https://www.kai-waehner.de/blog/2022/01/04/when-not-to-use-apache-kafka/
Kafka for Live Commerce to Transform the Retail and Shopping Metaverse – Kai Wähner
Live commerce combines instant purchasing of a featured product and audience participation.
This talk explores the need for real-time data streaming with Apache Kafka between applications to enable live commerce in online stores and brick & mortar stores across regions, countries, and continents in any retail business.
The discussion covers several building blocks of a live commerce enterprise architecture, including transactional data processing, omnichannel, natural language processing, augmented reality, edge computing, and more.
The Heart of the Data Mesh Beats in Real-Time with Apache Kafka – Kai Wähner
If there were a buzzword of the hour, it would certainly be "data mesh"! This new architectural paradigm unlocks analytic data at scale and enables rapid access to an ever-growing number of distributed domain datasets for various usage scenarios.
As such, the data mesh addresses the most common weaknesses of the traditional centralized data lake or data platform architecture. And the heart of a data mesh infrastructure must be real-time, decoupled, reliable, and scalable.
This presentation explores how Apache Kafka, as an open and scalable decentralized real-time platform, can be the basis of a data mesh infrastructure and - complemented by many other data platforms like a data warehouse, data lake, and lakehouse - solve real business problems.
There is no silver bullet or single technology/product/cloud service for implementing a data mesh. The key outcome of a data mesh architecture is the ability to build data products with the right tool for the job.
A good data mesh combines data streaming technology like Apache Kafka or Confluent Cloud with cloud-native data warehouse and data lake architectures from Snowflake, Databricks, Google BigQuery, et al.
Serverless Kafka and Spark in a Multi-Cloud Lakehouse Architecture – Kai Wähner
Apache Kafka in conjunction with Apache Spark became the de facto standard for processing and analyzing data. Both frameworks are open, flexible, and scalable.
Unfortunately, the latter makes operations a challenge for many teams. Ideally, teams can use serverless SaaS offerings to focus on business logic. However, hybrid and multi-cloud scenarios require a cloud-native platform that provides automated and elastic tooling to reduce the operations burden.
This session explores different architectures to build serverless Apache Kafka and Apache Spark multi-cloud architectures across regions and continents.
We start from the analytics perspective of a data lake and explore its relation to a fully integrated data streaming layer with Kafka to build a modern Data Lakehouse.
Real-world use cases show the joint value and explore the benefit of the "delta lake" integration.
Data Streaming with Apache Kafka in the Defence and Cybersecurity Industry – Kai Wähner
Agenda:
1) Defence, Modern Warfare, and Cybersecurity in 202X
2) Data in Motion with Apache Kafka as Defence Backbone
3) Situational Awareness
4) Threat Intelligence
5) Forensics and AI / Machine Learning
6) Air-Gapped and Zero Trust Environments
7) SIEM / SOAR Modernization
Technologies discussed in the presentation include Apache Kafka, Kafka Streams, ksqlDB, Kafka Connect, Elasticsearch, Splunk, IBM QRadar, Zeek, NetFlow, PCAP, TensorFlow, AWS, Azure, GCP, Sigma, and Confluent Cloud.
Real-World Deployments of Data Streaming with Apache Kafka across the Healthcare Value Chain using open source and cloud-native technologies and serverless SaaS:
1) Legacy Modernization and Hybrid Cloud: Optum (UnitedHealth Group), Centene, Bayer
2) Streaming ETL (Bayer, Babylon Health)
3) Real-time Analytics (Cerner, Celmatix, CDC/Centers for Disease Control and Prevention)
4) Machine Learning and Data Science (Recursion, Humana)
5) Open API and Omnichannel (Care.com, Invitae)
The Rise of Data in Motion in the Healthcare Industry - Use Cases, Architectures and Examples powered by Apache Kafka.
Use Cases for Data in Motion in the Healthcare Industry:
- Know Your Patient (= “Customer 360”)
- Operations (Healthcare 4.0 including Drug R&D, Patient Care, etc.)
- IT Perspective (Cybersecurity, Mainframe Offload, Hybrid Cloud, Streaming ETL, etc.)
Real-world examples include Covid-19 Electronic Lab Reporting, Cerner, Optum, Centene, Humana, Invitae, Bayer, Celmatix, Care.com.
Kafka for Real-Time Replication between Edge and Hybrid Cloud – Kai Wähner
Not all workloads allow cloud computing. Low latency, cybersecurity, and cost-efficiency require a suitable combination of edge computing and cloud integration.
This session explores architectures and design patterns for software and hardware considerations to deploy hybrid data streaming with Apache Kafka anywhere. A live demo shows data synchronization from the edge to the public cloud across continents with Kafka on Hivecell and Confluent Cloud.
Apache Kafka for Predictive Maintenance in Industrial IoT / Industry 4.0 – Kai Wähner
The manufacturing industry is moving away from just selling machinery, devices, and other hardware. Software and services increase revenue and margins. Equipment-as-a-Service (EaaS) even outsources the maintenance to the vendor.
This paradigm shift is only possible with reliable and scalable real-time data processing leveraging an event streaming platform such as Apache Kafka. This talk explores how Kafka-native Condition Monitoring and Predictive Maintenance help with this innovation.
More details:
https://www.kai-waehner.de/blog/2021/10/25/apache-kafka-condition-monitoring-predictive-maintenance-industrial-iot-digital-twin/
Video recording:
https://youtu.be/tfOuN5KeI9w
Apache Kafka Landscape for Automotive and Manufacturing – Kai Wähner
Today, in 2022, Apache Kafka is the central nervous system of many applications in various areas related to the automotive and manufacturing industry for processing analytical and transactional data in motion across edge, hybrid, and multi-cloud deployments.
This presentation explores the automotive event streaming landscape, including connected vehicles, smart manufacturing, supply chain optimization, aftersales, mobility services, and innovative new business models.
Afterwards, many real-world examples are shown from companies such as Audi, BMW, Porsche, Tesla, Uber, Grab, and FREENOW.
More detail in the blog post:
https://www.kai-waehner.de/blog/2022/01/12/apache-kafka-landscape-for-automotive-and-manufacturing/
Event Streaming CTO Roundtable for Cloud-native Kafka Architectures – Kai Wähner
Technical thought leadership presentation to discuss how leading organizations move to real-time architecture to support business growth and enhance customer experience. This is a forum to discuss use cases with your peers to understand how other digital-native companies are utilizing data in motion to drive competitive advantage.
Agenda:
- Data in Motion with Event Streaming and Apache Kafka
- Streaming ETL Pipelines
- IT Modernisation and Hybrid Multi-Cloud
- Customer Experience and Customer 360
- IoT and Big Data Processing
- Machine Learning and Analytics
Apache Kafka in the Public Sector (Government, National Security, Citizen Ser... – Kai Wähner
The Rise of Data in Motion in the Public Sector powered by event streaming with Apache Kafka.
Citizen Services:
- Health services, e.g. hospital modernization, track & trace, Covid distance control
- Public administration: reduce bureaucracy, data democratization across government departments
- eGovernment: efficient and digital citizen engagement, e.g. personal ID application process
Smart City:
- Smart driving, parking, buildings, environment
- Waste management
- Open exchange, e.g. mobility services (1st and 3rd party)
Energy:
- Smart grid and utilities infrastructure (energy distribution, smart home, smart meters, smart water, etc.)
National Security:
- Law enforcement, surveillance, police/interior security data exchange
- Defense and military (border control, intelligent soldier)
- Cybersecurity for situational awareness and threat intelligence
Telco 4.0 - Payment and FinServ Integration for Data in Motion with 5G and Ap... – Kai Wähner
The Era of Telco 4.0: Embracing Digital Transformation with Data in Motion. Learn about Payment and FinServ Integration for Data in Motion with 5G and Apache Kafka.
1) The rise of Telco 4.0 and the future forward
2) Data in Motion in the Telco industry
3) Real-world Fintech and Payment examples powered by Data in Motion
Apache Kafka for Cybersecurity and SIEM / SOAR Modernization – Kai Wähner
Data in Motion powered by the Apache Kafka ecosystem for Situational Awareness, Threat Detection, Forensics, Zero Trust Zones and Air-Gapped Environments.
Agenda:
1) Cybersecurity in 202X
2) Data in Motion as Cybersecurity Backbone
3) Situational Awareness
4) Threat Intelligence
5) Forensics
6) Air-Gapped and Zero Trust Environments
7) SIEM / SOAR Modernization
More details in the "Kafka for Cybersecurity" blog series:
https://www.kai-waehner.de/blog/2021/07/02/kafka-cybersecurity-siem-soar-part-1-of-6-data-in-motion-as-backbone/
Apache Kafka in the Automotive Industry (Connected Vehicles, Manufacturing 4.... – Kai Wähner
Connect all the things: An intro to event streaming for the automotive industry including connected cars, mobility services, and manufacturing / industrial IoT.
Video recording of this talk: https://www.youtube.com/watch?v=rBfBFrcO-WU
The Fourth Industrial Revolution (also known as Industry 4.0) is the ongoing automation of traditional manufacturing and industrial practices, using modern smart technology. Event streaming with Apache Kafka plays a major role in processing massive volumes of data in real time in a reliable, scalable, and flexible way, integrating with various legacy and modern data sources and sinks.
Other industries—retail, healthcare, government, financial services, energy, and more—also lean into Industry 4.0 technology to take advantage of IoT devices, sensors, smart machines, robotics, and connected data. The variety of these deployments goes from disconnected edge use cases across hybrid architectures to global multi-cloud deployments.
In this presentation, I want to give you an overview of existing use cases for event streaming technology in a connected world across supply chains, industries and customer experiences that come along with these interdisciplinary data intersections:
- The Automotive Industry (and it’s not only Connected Cars)
- Mobility Services across verticals (transportation, logistics, travel industry, retailing, …)
- Smart Cities (including citizen health services, communication infrastructure, …)
Real-world examples include use cases from car makers such as Audi, BMW, Porsche, Tesla, plus many examples from mobility services such as Uber, Lyft, Here Technologies, and more.
Serverless Kafka on AWS as Part of a Cloud-native Data Lake Architecture – Kai Wähner
AWS Data Lake / Lake House + Confluent Cloud for Serverless Apache Kafka. Learn about use cases, architectures, and features.
Data must be continuously collected, processed, and reactively used in applications across the entire enterprise - some in real time, some in batch mode. In other words: As an enterprise becomes increasingly software-defined, it needs a data platform designed primarily for "data in motion" rather than "data at rest."
Apache Kafka is now mainstream when it comes to data in motion! The Kafka API has become the de facto standard for event-driven architectures and event streaming. Unfortunately, running it yourself very often becomes too expensive when you add factors like scaling, administration, support, security, creating connectors... and everything else that goes with it. Resources in enterprises are scarce: this applies to both the best team members and the budget.
The cloud - as we all know - offers the perfect solution to such challenges.
Most likely, fully-managed cloud services such as AWS S3, DynamoDB or Redshift are already in use. Now it is time to implement "fully-managed" for Kafka as well - with Confluent Cloud on AWS.
- Building a central integration layer that doesn't care where or how much data is coming from
- Implementing scalable data stream processing to gain real-time insights
- Leveraging fully managed connectors (like S3, Redshift, Kinesis, MongoDB Atlas & more) to quickly access data
Confluent Cloud in action? Let's show how ao.com made it happen!
IBM Cloud Pak for Integration with Confluent Platform powered by Apache Kafka – Kai Wähner
The Rise of Data in Motion powered by Event Streaming - Use Cases and Architecture for IBM Cloud Pak with Confluent Platform. Including screenshots of the live demo (integration between IBM and Kafka via Confluent Platform and Kafka Connect connectors).
Learn about the integration capabilities of IBM Cloud Pak for Integration, now with the industry’s leading event streaming platform from Confluent Platform powered by Apache Kafka.
The rise of data in motion in the insurance industry is visible across all lines of business including life, healthcare, travel, vehicle, and others. Apache Kafka changes how enterprises rethink data. This session explores use cases and architectures for event streaming. Real-world examples from Generali, Centene, Humana, and Tesla show innovative insurance-related data integration and stream processing in real time.
Apache Kafka and MQTT - Overview, Comparison, Use Cases, Architectures – Kai Wähner
Apache Kafka and MQTT are a perfect combination for many IoT use cases. This presentation covers the pros and cons of both technologies. Various use cases across industries, including connected vehicles, manufacturing, mobility services, and smart city are explored. The examples use different architectures, including lightweight edge scenarios, hybrid integrations, and serverless cloud solutions.
Blog series with more details here:
https://www.kai-waehner.de/blog/2021/03/15/apache-kafka-mqtt-sparkplug-iot-blog-series-part-1-of-5-overview-comparison/
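One recurring design question in MQTT-to-Kafka integration, covered in the blog series above, is how to map millions of hierarchical MQTT topics onto a handful of Kafka topics. A common approach, sketched here in plain Python with made-up topic names and an assumed naming scheme, keeps the device id as the Kafka record key so per-device ordering is preserved within a partition.

```python
def mqtt_to_kafka(mqtt_topic):
    """Map e.g. 'car/vin-123/engine/temperature' to a Kafka (topic, key, signal).
    The 'type/id/signal...' scheme is an assumption for illustration,
    not a standard."""
    parts = mqtt_topic.split("/")
    if len(parts) < 3:
        raise ValueError(f"unexpected MQTT topic: {mqtt_topic}")
    device_type, device_id = parts[0], parts[1]
    signal = "-".join(parts[2:])
    kafka_topic = f"{device_type}-telemetry"  # one Kafka topic per device type
    key = device_id                           # keeps per-device ordering
    return kafka_topic, key, signal

print(mqtt_to_kafka("car/vin-123/engine/temperature"))
# ('car-telemetry', 'vin-123', 'engine-temperature')
```

Collapsing many fine-grained MQTT topics into a few partitioned Kafka topics is what keeps the Kafka side scalable while the MQTT side stays lightweight at the edge.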
In software engineering, the right architecture is essential for robust, scalable platforms. Wix has undergone a pivotal shift from event sourcing to a CRUD-based model for its microservices. This talk will chart the course of this pivotal journey.
Event sourcing, which records state changes as immutable events, provided robust auditing and "time travel" debugging for Wix Stores' microservices. Despite its benefits, the complexity it introduced in state management slowed development. Wix responded by adopting a simpler, unified CRUD model. This talk will explore the challenges of event sourcing and the advantages of Wix's new "CRUD on steroids" approach, which streamlines API integration and domain event management while preserving data integrity and system resilience.
Participants will gain valuable insights into Wix's strategies for ensuring atomicity in database updates and event production, as well as caching, materialization, and performance optimization techniques within a distributed system.
Join us to discover how Wix has mastered the art of balancing simplicity and extensibility, and learn how the re-adoption of the modest CRUD has turbocharged their development velocity, resilience, and scalability in a high-growth environment.
Innovating Inference - Remote Triggering of Large Language Models on HPC Clus... – Globus
Large Language Models (LLMs) are currently the center of attention in the tech world, particularly for their potential to advance research. In this presentation, we'll explore a straightforward and effective method for quickly initiating inference runs on supercomputers using the vLLM tool with Globus Compute, specifically on the Polaris system at ALCF. We'll begin by briefly discussing the popularity and applications of LLMs in various fields. Following this, we will introduce the vLLM tool, and explain how it integrates with Globus Compute to efficiently manage LLM operations on Polaris. Attendees will learn the practical aspects of setting up and remotely triggering LLMs from local machines, focusing on ease of use and efficiency. This talk is ideal for researchers and practitioners looking to leverage the power of LLMs in their work, offering a clear guide to harnessing supercomputing resources for quick and effective LLM inference.
Accelerate Enterprise Software Engineering with PlatformlessWSO2
Key takeaways:
Challenges of building platforms and the benefits of platformless.
Key principles of platformless, including API-first, cloud-native middleware, platform engineering, and developer experience.
How Choreo enables the platformless experience.
How key concepts like application architecture, domain-driven design, zero trust, and cell-based architecture are inherently a part of Choreo.
Demo of an end-to-end app built and deployed on Choreo.
Experience our free, in-depth three-part Tendenci Platform Corporate Membership Management workshop series! In Session 1 on May 14th, 2024, we began with an Introduction and Setup, mastering the configuration of your Corporate Membership Module settings to establish membership types, applications, and more. Then, on May 16th, 2024, in Session 2, we focused on binding individual members to a Corporate Membership and Corporate Reps, teaching you how to add individual members and assign Corporate Representatives to manage dues, renewals, and associated members. Finally, on May 28th, 2024, in Session 3, we covered questions and concerns, addressing any queries or issues you may have.
For more Tendenci AMS events, check out www.tendenci.com/events
Top Features to Include in Your Winzo Clone App for Business Growth (4).pptxrickgrimesss22
Discover the essential features to incorporate in your Winzo clone app to boost business growth, enhance user engagement, and drive revenue. Learn how to create a compelling gaming experience that stands out in the competitive market.
We describe the deployment and use of Globus Compute for remote computation. This content is aimed at researchers who wish to compute on remote resources using a unified programming interface, as well as system administrators who will deploy and operate Globus Compute services on their research computing infrastructure.
How to Position Your Globus Data Portal for Success Ten Good PracticesGlobus
Science gateways allow science and engineering communities to access shared data, software, computing services, and instruments. Science gateways have gained a lot of traction in the last twenty years, as evidenced by projects such as the Science Gateways Community Institute (SGCI) and the Center of Excellence on Science Gateways (SGX3) in the US, The Australian Research Data Commons (ARDC) and its platforms in Australia, and the projects around Virtual Research Environments in Europe. A few mature frameworks have evolved with their different strengths and foci and have been taken up by a larger community such as the Globus Data Portal, Hubzero, Tapis, and Galaxy. However, even when gateways are built on successful frameworks, they continue to face the challenges of ongoing maintenance costs and how to meet the ever-expanding needs of the community they serve with enhanced features. It is not uncommon that gateways with compelling use cases are nonetheless unable to get past the prototype phase and become a full production service, or if they do, they don't survive more than a couple of years. While there is no guaranteed pathway to success, it seems likely that for any gateway there is a need for a strong community and/or solid funding streams to create and sustain its success. With over twenty years of examples to draw from, this presentation goes into detail for ten factors common to successful and enduring gateways that effectively serve as best practices for any new or developing gateway.
Custom Healthcare Software for Managing Chronic Conditions and Remote Patient...Mind IT Systems
Healthcare providers often struggle with the complexities of chronic conditions and remote patient monitoring, as each patient requires personalized care and ongoing observation. Off-the-shelf solutions may not meet these diverse needs, leading to inefficiencies and gaps in care. This is where custom healthcare software offers a tailored solution, ensuring improved care and effectiveness.
AI Pilot Review: The World’s First Virtual Assistant Marketing SuiteGoogle
AI Pilot Review: The World’s First Virtual Assistant Marketing Suite
👉👉 Click Here To Get More Info 👇👇
https://sumonreview.com/ai-pilot-review/
AI Pilot Review: Key Features
✅ Deploy AI expert bots in any niche with just a click
✅ With one keyword, generate complete funnels, websites, landing pages, and more
✅ More than 85 AI features are included in AI Pilot
✅ No setup or configuration; use your voice (like Siri) to do whatever you want
✅ Use AI Pilot to create your own version of AI Pilot and charge people for it…
✅ Zero manual work with AI Pilot: never write, design, or code again
✅ Zero limits on features or usage
✅ Use our AI-powered traffic to get hundreds of customers
✅ No complicated setup: get up and running in 2 minutes
✅ 99.99% up-time guaranteed
✅ 30-day money-back guarantee
✅ Zero upfront cost
See My Other Reviews Article:
(1) TubeTrivia AI Review: https://sumonreview.com/tubetrivia-ai-review
(2) SocioWave Review: https://sumonreview.com/sociowave-review
(3) AI Partner & Profit Review: https://sumonreview.com/ai-partner-profit-review
(4) AI Ebook Suite Review: https://sumonreview.com/ai-ebook-suite-review
An Enterprise Resource Planning (ERP) system includes various modules that reduce any business's workload. Additionally, it organizes workflows, which drives productivity. Here is a detailed explanation of the ERP modules; going through the points will help you understand how the software is changing work dynamics.
To know more details here: https://blogs.nyggs.com/nyggs/enterprise-resource-planning-erp-system-modules/
How Recreation Management Software Can Streamline Your Operations.pptxwottaspaceseo
Recreation management software streamlines operations by automating key tasks such as scheduling, registration, and payment processing, reducing manual workload and errors. It provides centralized management of facilities, classes, and events, ensuring efficient resource allocation and facility usage. The software offers user-friendly online portals for easy access to bookings and program information, enhancing customer experience. Real-time reporting and data analytics deliver insights into attendance and preferences, aiding in strategic decision-making. Additionally, effective communication tools keep participants and staff informed with timely updates. Overall, recreation management software enhances efficiency, improves service delivery, and boosts customer satisfaction.
Developing Distributed High-performance Computing Capabilities of an Open Sci...Globus
COVID-19 had an unprecedented impact on scientific collaboration. The pandemic and its broad response from the scientific community has forged new relationships among public health practitioners, mathematical modelers, and scientific computing specialists, while revealing critical gaps in exploiting advanced computing systems to support urgent decision making. Informed by our team’s work in applying high-performance computing in support of public health decision makers during the COVID-19 pandemic, we present how Globus technologies are enabling the development of an open science platform for robust epidemic analysis, with the goal of collaborative, secure, distributed, on-demand, and fast time-to-solution analyses to support public health.
Globus Compute wth IRI Workflows - GlobusWorld 2024Globus
As part of the DOE Integrated Research Infrastructure (IRI) program, NERSC at Lawrence Berkeley National Lab and ALCF at Argonne National Lab are working closely with General Atomics on accelerating the computing requirements of the DIII-D experiment. As part of the work the team is investigating ways to speedup the time to solution for many different parts of the DIII-D workflow including how they run jobs on HPC systems. One of these routes is looking at Globus Compute as a way to replace the current method for managing tasks and we describe a brief proof of concept showing how Globus Compute could help to schedule jobs and be a tool to connect compute at different facilities.
Unleash Unlimited Potential with One-Time Purchase
BoxLang is more than just a language; it's a community. By choosing a Visionary License, you're not just investing in your success, you're actively contributing to the ongoing development and support of BoxLang.
Exploring Innovations in Data Repository Solutions - Insights from the U.S. G...Globus
The U.S. Geological Survey (USGS) has made substantial investments in meeting evolving scientific, technical, and policy driven demands on storing, managing, and delivering data. As these demands continue to grow in complexity and scale, the USGS must continue to explore innovative solutions to improve its management, curation, sharing, delivering, and preservation approaches for large-scale research data. Supporting these needs, the USGS has partnered with the University of Chicago-Globus to research and develop advanced repository components and workflows leveraging its current investment in Globus. The primary outcome of this partnership includes the development of a prototype enterprise repository, driven by USGS Data Release requirements, through exploration and implementation of the entire suite of the Globus platform offerings, including Globus Flow, Globus Auth, Globus Transfer, and Globus Search. This presentation will provide insights into this research partnership, introduce the unique requirements and challenges being addressed and provide relevant project progress.
A Sighting of filterA in Typelevel Rite of Passage
Supply Chain Optimization with Apache Kafka
1. Apache Kafka in Manufacturing and Industry 4.0 - @KaiWaehner - www.kai-waehner.de
Apache Kafka for Optimization of
Supply Chain Management (SCM)
Decoupled Microservices, Data Integration, Real-Time Analytics, and More…
Kai Waehner
Field CTO
contact@kai-waehner.de
LinkedIn
@KaiWaehner
www.confluent.io
www.kai-waehner.de
3. Apache Kafka for Supply Chain Management (SCM) - @KaiWaehner - www.kai-waehner.de
Supply Chain Management (SCM)
• Planning and coordination of all the people, processes, and technology involved in creating value for a company
• Cross-cutting processes, including purchasing / procurement, logistics, operations / manufacturing, and others
• Automation, robustness, flexibility, hybrid deployment (edge + cloud)
Six Sigma (6σ)
5.
The Future of SCM – Outlook to 2040
https://www.ipa.fraunhofer.de/en/press-media/press_releases/how-supply-chain-management-will-change-by-2040.html
• Autonomy: Vehicles, machines, and sensors will be handled largely autonomously.
• Connected: SCM will no longer form a chain, but rather a network.
• Communication: The number of people involved will rise because of the increasing number of new goods and services being exchanged.
• Fast: Companies will develop into significant players within a shorter space of time, although by the same token they can also disappear from the market all the faster.
• Flexible: SCM processes will be able to adjust more quickly to unforeseen events such as pandemics, wars, and natural disasters.
13.
Event Streaming
Stream processing:
• Create and store materialized views
• Filter
• Analyze in-flight
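The three stream-processing operations on this slide can be sketched without a broker. The event fields below are invented for illustration; in a real deployment this logic would run in Kafka Streams or ksqlDB.

```python
# A minimal, broker-free sketch: filter events in-flight and maintain a
# materialized view (current quantity per item). Event names and fields
# are made up for illustration.
events = [
    {"item": "pallet-a", "qty": 10},
    {"item": "pallet-b", "qty": 0},    # filtered out below
    {"item": "pallet-a", "qty": 5},
]

inventory = {}  # the materialized view: current quantity per item

for event in (e for e in events if e["qty"] > 0):  # filter in-flight
    inventory[event["item"]] = inventory.get(event["item"], 0) + event["qty"]

print(inventory)  # {'pallet-a': 15}
```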
14.
An Event Streaming Platform is the Underpinning of an Event-driven Architecture
Producers: MES, ERP, Sensors, Mobile
Consumers: Customer 360, Real-time Alerting System, Data Warehouse
Connected via streams of real-time events, stream processing apps, and connectors
Example topics: Supplier, Alert, Forecast, Inventory, Customer, Order
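The decoupling between producers and consumers shown above can be illustrated with a toy in-memory topic. All names here are invented; a real deployment uses Kafka topics and client libraries.

```python
# Illustration only: a toy "topic" showing how producers (MES, ERP, sensors)
# stay decoupled from consumers (alerting, data warehouse). The producer
# appends to the log without knowing who consumes it.
class Topic:
    def __init__(self):
        self.log = []            # append-only record log
        self.subscribers = []    # consumer callbacks

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def produce(self, record):
        self.log.append(record)  # the producer knows nothing about consumers
        for cb in self.subscribers:
            cb(record)

orders = Topic()
alerts, warehouse = [], []
orders.subscribe(lambda r: alerts.append(r) if r["qty"] > 100 else None)
orders.subscribe(warehouse.append)  # the data warehouse consumes everything

orders.produce({"order": "o-1", "qty": 150})
orders.produce({"order": "o-2", "qty": 3})
print(len(alerts), len(warehouse))  # 1 2
```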
15.
Hybrid and Multi-Cloud Architectures
• Self-managed (VMs) and fully managed deployments
• Edge vs. regional vs. global deployments
• Cloud-first (greenfield) vs. hybrid architecture vs. a strategic move from on-premises to cloud
16.
Why Apache Kafka for Supply Chain Management (SCM)?
• Real-time messaging (at scale, mission-critical)
• Global Kafka (edge, data center, multi-cloud)
• Cloud-native (open, flexible, elastic)
• Data integration (legacy + modern protocols, applications, communication paradigms)
• Data correlation (real-time + historical data, omni-channel)
• Real decoupling (not just messaging, but also infinite storage + replayability of events)
• Real-time monitoring
• Transactional data and analytics data (MES, ERP, CRM, SCM, …)
• Applied machine learning (model training and scoring)
• Cybersecurity
• Complementary to legacy and cutting-edge technology (mainframes, PLCs, 3D printing, augmented reality, …)
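The "real decoupling" bullet is what separates Kafka from classic messaging: the log retains events, so a consumer added later can replay history. A toy log with per-group offsets (not the actual Kafka protocol) shows the idea.

```python
# Sketch of log retention + replayability: each consumer group tracks its own
# committed offset against a shared, retained log. This is illustrative, not
# the Kafka wire protocol.
log = [{"offset": i, "event": f"order-{i}"} for i in range(5)]

offsets = {}  # committed offset per consumer group

def poll(group: str, max_records: int = 10):
    """Return records after the group's committed offset, then commit."""
    start = offsets.get(group, 0)
    batch = log[start:start + max_records]
    offsets[group] = start + len(batch)
    return batch

live = poll("billing")            # existing consumer reads everything once
replayed = poll("new-analytics")  # a consumer added later still sees all history
print(len(live), len(replayed), poll("billing"))  # 5 5 []
```

Because the log is retained rather than drained, adding the `new-analytics` group later costs nothing for existing consumers.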
17.
Use Cases for Event Streaming in the Supply Chain
Key drivers: Customer Experience (CX), Digital Transformation, Risk & Cost Avoidance
Strategic objectives (business value):
• Increase revenue (make money) $↑
• Decrease costs (save money) $↓
• Mitigate risk (protect money) $↔
• Core business platform: increase operational efficiency, migrate to cloud
Example use cases:
• Optimize inventory
• IoT device management
• Optimize labor and assets
• Risk, what-if & analytics
• Capacity planning
• State/federal & regulatory compliance | governance & data provenance
• Bottleneck elimination
• Audit & regulatory
• Sales demand forecasting
• Customer & supply chain visibility
18.
Food Value Chain – IoT-Based and Data-Driven
• Single source of truth across the food value chain (in the factories, and across regions)
• Business-critical operations (tracking, calculations, alerts, …)
https://www.confluent.io/blog/creating-iot-based-data-driven-food-value-chain-with-confluent-cloud/
21.
Domain-Driven Design (DDD) for your Event Streaming Platform
Domains: IoT Domain | OT Domain | Analytics Domain
Building blocks: Kafka cluster, Kafka Connect, Schema Registry, MQTT integration, SAP integration, IoT platform connector, real-time predictions, Java / Python / ”you-name-it” clients
Independent and loosely coupled, but scalable, highly available, and reliable!
22.
Decoupled Microservices @ Porsche
https://medium.com/porschedev/apache-kafka-at-porsche-literary-figure-meets-car-manufacturer-ead9d99c3bc
https://medium.com/next-level-german-engineering/data-streaming-porsche-bc49c6aa17a8
“The recent rise of data streaming has opened new possibilities for real-time analytics. At Porsche, data streaming technologies are increasingly applied across a range of contexts, including warranty and sales, manufacturing and supply chain, connected vehicles, and charging stations.”
Sridhar Mamella (Platform Manager Data Streaming at Porsche)
24.
SCM = a zoo of technologies and products
Options: no software, buy, rent, make, mix
• TMS – Transport Management System
• WMS – Warehouse Management System
• WES – Warehouse Execution System
• DPS – Demand Planning System
• DRP – Distribution Requirements Planning
• LMS – Labor Management System
• CRM – Customer Relationship Management
• SRM – Supplier Relationship Management
• ERP – Enterprise Resource Planning
• BI – Business Intelligence
• MES – Manufacturing Execution System
• PLM – Product Lifecycle Management
25.
Postmodern SCM / ERP (a term coined by Gartner) built with Kafka + other Apps
Replace legacy, monolithic, and highly customized ERP suites with a mixture of loosely coupled, exchangeable cloud-based and on-premises applications.
Applications connected via the Kafka interface:
• Core ERP
• TMS – legacy, proprietary SOAP web services
• CRM – SaaS
• MES – proprietary HTTP web services
• LMS – legacy, homegrown database + CDC
• SRM – Kafka-native
Example topics: Supplier, Alert, Forecast, Inventory, Customer, Order
26.
Postmodern SCM / ERP built with Kafka
Zero downtime with rolling upgrades and backwards compatibility
Streaming replication between SCM applications via MirrorMaker 2 / Confluent Replicator / Confluent Cluster Linking
Version compatibility between different clients and servers:
• Tier 1 Supplier – server: latest version; clients: 0.11, 2.0
• Core ERP – server: AK 2.3; clients: 2.0, 2.5
• MES – server: CP 5.4 / AK 2.4; clients: 0.11, 2.4
27.
Real-Time Inventory System
https://www.confluent.io/blog/walmart-real-time-inventory-management-using-kafka/
https://www.confluent.io/kafka-summit-san-francisco-2019/when-kafka-meets-the-scaling-and-reliability-needs-of-worlds-largest-retailer-a-walmart-story/
“Retail shopping experiences have evolved to include multiple channels, both online and offline, and have added to a unique set of challenges in this digital era. Having an up-to-date snapshot of inventory position on every item is a very important aspect to deal with these challenges. We at Walmart have solved this at scale by designing an event-streaming-based, real-time inventory system leveraging Apache Kafka… Like any supply chain network, our infrastructure involved a plethora of event sources with all different types of data.”
Suman Pattnaik, Big Data Architect @ Walmart
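The inventory position Walmart describes can be sketched as a fold over heterogeneous event sources. The event types and quantity deltas below are invented for illustration, not Walmart's actual schema.

```python
# Hedged sketch: fold events from many sources (receipts, sales, returns)
# into an up-to-date inventory position per item.
DELTAS = {"receipt": +1, "sale": -1, "return": +1}

def apply(position: dict, event: dict) -> dict:
    """Update the per-item position with one event's signed quantity."""
    item = event["item"]
    position[item] = position.get(item, 0) + DELTAS[event["type"]] * event["qty"]
    return position

stream = [
    {"type": "receipt", "item": "tv-55", "qty": 20},
    {"type": "sale", "item": "tv-55", "qty": 7},
    {"type": "return", "item": "tv-55", "qty": 1},
]

position = {}
for event in stream:
    apply(position, event)
print(position)  # {'tv-55': 14}
```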
29.
Optimization of the Supply Chain Process
30.
Supply Chain Planning – Streaming and Batch Analytics Use Case
• Prepare for the unknown and the unknowable
• Act at the right time in the right context
• Yossi Sheffi: “You need to have sensors in the ground”; SCM then becomes a process of sensing and responding
What if…?
31.
From a batch-oriented systems architecture to a streaming microservices platform
Thoughtworks: https://www.confluent.io/kafka-summit-sf17/Fast-Data-in-Supply-Chain-Planning
32.
Strangler Design Pattern
The Big Bang does not always work…
“The most important reason to consider a strangler fig application over a cut-over rewrite is reduced risk.”
Martin Fowler
Thoughtworks: https://www.confluent.io/kafka-summit-sf17/Fast-Data-in-Supply-Chain-Planning
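The strangler fig pattern reduces to per-capability routing: the new streaming service absorbs the monolith one endpoint at a time instead of a risky big-bang cut-over. The capability names below are hypothetical.

```python
# Strangler fig in miniature: route each capability either to the legacy
# monolith or to the new streaming service, migrating one capability at a time.
MIGRATED = {"forecast"}  # capabilities already strangled out of the monolith

def legacy_monolith(capability, payload):
    return f"monolith handled {capability}"

def streaming_service(capability, payload):
    return f"new service handled {capability}"

def route(capability, payload=None):
    handler = streaming_service if capability in MIGRATED else legacy_monolith
    return handler(capability, payload)

print(route("forecast"))   # new service handled forecast
print(route("invoicing"))  # monolith handled invoicing
```

Growing the `MIGRATED` set is the whole migration; the router is the only piece that ever needs to know both worlds exist.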
33.
Supply Chain Purchasing
Real-time Natural Language Processing (NLP) for Digital Contract Intelligence
34.
Apache Kafka as Infrastructure for ML
35.
Apache Kafka’s Open Ecosystem as Infrastructure for ML
• Kafka Streams / ksqlDB
• Kafka Connect
• Confluent REST Proxy
• Confluent Schema Registry
• Go / .NET / Python Kafka producers
• ksqlDB
• Python client
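Model scoring inside the stream is conceptually simple: every in-flight event is passed through the model. The threshold "model" below is a stand-in for a real trained model embedded in a Kafka Streams app or Python consumer; all names and values are illustrative.

```python
# Sketch of in-stream model scoring: a trivial stand-in model flags anomalous
# sensor events in-flight. Real deployments embed a trained model (e.g. via
# ONNX or TensorFlow) in the stream processor instead of this threshold rule.
def score(event: dict) -> float:
    """Toy anomaly score: relative deviation from a nominal temperature of 20."""
    return abs(event["temperature"] - 20.0) / 20.0

stream = [{"machine": "m1", "temperature": t} for t in (21.0, 19.5, 48.0)]
alerts = [e for e in stream if score(e) > 0.5]  # score every event in-flight
print(alerts)  # [{'machine': 'm1', 'temperature': 48.0}]
```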
36.
BMW Group
Industry-ready NLP Service Framework Based on Kafka
https://www.confluent.io/kafka-summit-lon19/industry-ready-nlp-service-framework-kafka/
38.
Digital Twin – Merging the Physical and the Digital World
“Virtual representation of something else (physical thing, process, service)”
“A living model that drives a business outcome”
Use cases:
• Downtime reduction
• Inventory management
• Fleet management
• What-if simulations
• Operational planning
• Servitization
• Product development
• Healthcare
• Customer experience
https://www.youtube.com/watch?v=Ri0TD7kYsIQ
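A digital twin in the "living model" sense above reduces to folding the device's event stream into its latest known state while keeping the digital trace. The machine and field names below are illustrative.

```python
# A digital twin reduced to its essence: materialize the latest known state of
# a physical machine from its event stream, and keep every event as a trace.
class MachineTwin:
    def __init__(self, machine_id: str):
        self.machine_id = machine_id
        self.state = {}        # latest known physical state
        self.history = []      # digital trace of every observed event

    def observe(self, event: dict) -> None:
        self.history.append(event)
        self.state.update(event)  # last-write-wins per field

twin = MachineTwin("press-7")
twin.observe({"rpm": 1200, "temp_c": 60})
twin.observe({"temp_c": 75})

print(twin.state, len(twin.history))  # {'rpm': 1200, 'temp_c': 75} 2
```

Keeping the full history is what enables the what-if simulations and operational planning listed above: any past state can be reconstructed by replaying a prefix of the trace.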
39.
Software and Digital Services become the Key Differentiator
https://www.mckinsey.com/industries/advanced-electronics/our-insights/iiot-platforms-the-technology-stack-as-value-driver-in-industrial-equipment-and-machinery
https://www.rolls-royce.com/media/press-releases-archive/yr-2012/121030-the-hour.aspx
40.
Digital Thread
Digital Twin vs. Digital Thread?
I only use the term Digital Twin in the following slides. Both terms overlap, often meaning the same.
A digital thread spans the entire lifecycle.
41.
Kafka as Integration Platform for the Digital Twin
Digital twin characteristics: connectivity, homogenization, reprogrammable and smart, digital traces, modularity
Apache Kafka provides storage and processing; Kafka Connect provides connectivity
Example: real-time inventory management
42.
Connected Car Infrastructure
https://www.youtube.com/watch?v=yGLKi3TMJv8
• Real-time data analysis
• Swarm intelligence
• Collaboration with partners
• Predictive AI
• …
43.
Kafka as Digital Twin
Digital twin characteristics: connectivity, homogenization, reprogrammable and smart, digital traces, modularity
Apache Kafka itself acts as the digital twin: storage and processing via Kafka, connectivity via Kafka Connect
Example: real-time inventory management
44.
• Construction site management
• Collaborative planning
• Inventory and asset management
• Track, manage, and locate tools and equipment anytime and anywhere
https://www.confluent.io/customers/bosch/
https://events.confluent.io/online-talks/bosch-power-toolse-nables-real-time-analytics-on-iot-event-streams
47.
Kafka is not a Blockchain!
• Real-time
• High throughput
• Decentralized database
• Distributed log of records
• Immutable log
• Replication
• High availability
• Decoupling of applications / clients
• Role-based access control to data
• Tamper-proof
• Encrypted payloads
• Cross-company
48.
Kafka AND Blockchain
Kafka connects real-time apps (e.g., an instant payment app in Java, C++, Python, etc.), batch analytics platforms (Spark, Splunk, etc.), and blockchains (Bitcoin, Ethereum).
Kafka AS Blockchain
A Kafka-native blockchain serves the same real-time apps and batch analytics platforms directly.
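What a "Kafka-native blockchain" adds on top of an immutable log is tamper evidence: each record carries the hash of its predecessor, so any later modification of history is detectable. This toy chain is illustrative, not a production ledger.

```python
import hashlib
import json

# Hash-chained append-only log: each record stores the previous record's hash,
# so rewriting any past payload breaks verification of the whole chain.
def append(chain: list, payload: dict) -> None:
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = json.dumps({"prev": prev_hash, "payload": payload}, sort_keys=True)
    chain.append({"prev": prev_hash, "payload": payload,
                  "hash": hashlib.sha256(body.encode()).hexdigest()})

def verify(chain: list) -> bool:
    """Recompute every hash and check the links; False if anything was altered."""
    for i, rec in enumerate(chain):
        prev_hash = chain[i - 1]["hash"] if i else "0" * 64
        body = json.dumps({"prev": prev_hash, "payload": rec["payload"]}, sort_keys=True)
        if rec["prev"] != prev_hash or rec["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
    return True

chain = []
append(chain, {"tx": "payment-1", "amount": 42})
append(chain, {"tx": "payment-2", "amount": 7})
print(verify(chain))  # True
chain[0]["payload"]["amount"] = 9999  # tamper with history
print(verify(chain))  # False
```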
49.
Kafka vs. Blockchain for Supply Chain Management
Use tamper-proof Apache Kafka for:
• Enterprise infrastructure
• Open, scalable, real-time requirements
• Flexible architectures for many use cases
Use Hyperledger, Ethereum, et al. for:
• Deployment over various independent organizations
• Participants verifying the distributed ledger contents themselves
• Specific use cases
• Server-side deployments managed and controlled by multiple organizations
• Scenarios where the business value outweighs the added complexity and project risk
Use Kafka and blockchain together to combine the benefits of both (where this makes sense!)
https://events.confluent.io/blockchain-kafka
https://www.kai-waehner.de/blog/2020/07/17/apache-kafka-blockchain-dlt-comparison-kafka-native-vs-hyperledger-ethereum-ripple-iota-libra/
51.
The Rise of Event Streaming
• 2010 – Apache Kafka created at LinkedIn by Confluent’s founders
• 2014 – Confluent founded
• 2020 – 80% of Fortune 100 companies trust and use Apache Kafka
52.
Event Streaming Maturity Model (value grows with investment & time)
1. Initial awareness / pilot (1 Kafka cluster)
2. Start to build a pipeline / deliver 1 new outcome (1 Kafka cluster)
3. Mission-critical deployment (stretched, hybrid, multi-region)
4. Build contextual event-driven apps (stretched, hybrid, multi-region)
5. Central nervous system (global Kafka)
Supported by product, support, training, partners, technical account management…
53.
Apache Kafka: open source | community licensed
Freedom of choice: self-managed software or fully managed cloud service
Committer-driven expertise: professional services, enterprise support, training, partners
Unrestricted developer productivity:
• SQL-based stream processing – KSQL (ksqlDB)
• Rich pre-built ecosystem – connectors | hub | Schema Registry
• Multi-language development – non-Java clients | REST Proxy
Efficient operations at scale:
• GUI-driven management & monitoring – Control Center
• Flexible DevOps automation – Operator | Ansible
• Dynamic performance & elasticity – Auto Data Balancer | Tiered Storage
Production-stage prerequisites:
• Enterprise-grade security – RBAC | secrets | audit logs
• Data compatibility – Schema Registry | Schema Validation
• Global resilience – Multi-Region Clusters | Replicator
Partnership for business success: complete engagement model; revenue / cost / risk impact; TCO / ROI
Audiences: developer, operator, architect, executive buyer
54.
Kai Waehner
Field CTO
contact@kai-waehner.de
@KaiWaehner
www.kai-waehner.de
www.confluent.io
LinkedIn
Questions? Feedback?
Let’s connect!