The Streaming Assessment – An Introduction (confluent)
Business breakout during Confluent’s streaming event in Munich, presented by Lyndon Hedderly, Director of Customer Solutions at Confluent. This three-day hands-on course focused on how to build, manage, and monitor clusters using industry best practices developed by the world’s foremost Apache Kafka™ experts. The sessions covered how Kafka and the Confluent Platform work, how their main subsystems interact, and how to set up, manage, monitor, and tune your cluster.
Understanding the TCO and ROI of Apache Kafka & Confluent (confluent)
For a product or service to be cost effective, it must be considered to be good value, where the benefits are worth at least what is paid for them. But how do we measure this, to prove the case? Given that value can be intangible, it can be hard to quantify and may have little relationship to cost. Added to this, the open source nature of Apache Kafka means that many companies skip the requirement to build a business case for it, until it has become mission critical and demands financial and human resources.
In this presentation, Lyndon Hedderly, Team Lead of Business Value Consulting at Confluent, will cover how Confluent works with customers to measure the business value of data streaming.
Data Reply Sneak Peek: Real-Time Decision Engines (confluent)
Events happen constantly in every business: a purchase in an online shop, a credit limit is hit, a mobile internet plan is exhausted, users interact with a website. Events rule the business world. So why react to them hours or days later? Real-time decision engines enable a variety of use cases, driving new products, improving user experience, and reducing costs and risks by reacting instantly to business events.
From personalized, instantaneous marketing campaigns to reacting to user interactions, real time is the key to unlocking a world of use cases that batch and scheduled processing cannot efficiently satisfy. In this talk, we show some example use cases that Data Reply developed for its customers, and how real-time decision engines had an impact on their businesses.
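The abstract does not include any implementation detail, but the core idea of a decision engine — mapping each event to an action the moment it arrives, rather than in a nightly batch — can be sketched in a few lines. The event shape and rule below are invented for illustration:

```python
from dataclasses import dataclass
from typing import Callable, Dict, Optional

# Hypothetical event shape, invented for illustration.
@dataclass
class Event:
    kind: str
    payload: dict

# A decision engine maps each incoming event to an action as it arrives,
# instead of collecting events for a scheduled batch job.
class DecisionEngine:
    def __init__(self) -> None:
        self._rules: Dict[str, Callable[[Event], Optional[str]]] = {}

    def on(self, kind: str, rule: Callable[[Event], Optional[str]]) -> None:
        self._rules[kind] = rule

    def handle(self, event: Event) -> Optional[str]:
        rule = self._rules.get(event.kind)
        return rule(event) if rule else None

engine = DecisionEngine()
# React instantly when a credit limit is hit (one of the events mentioned above).
engine.on("credit_limit_hit", lambda e: "notify:" + e.payload["customer"])

action = engine.handle(Event("credit_limit_hit", {"customer": "c42"}))
```

In a streaming deployment the `handle` call would be driven by a Kafka consumer loop rather than a direct function call; the rule-dispatch logic stays the same.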
Digital Transformation Mindset - More Than Just Technology (confluent)
Many enterprises faced with siloed, batch-oriented legacy systems struggle to compete in this new digital-first world. Adhering to an ‘if it’s not broken, don’t fix it’ mentality leaves the door wide open for digital-native challengers to grow and succeed. To stay competitive, your organization must respond in real time to every customer experience, transaction, sale, and market movement. But how do you get there? First, you must change your mindset.
As streaming platforms become central to data strategies, companies both small and large are re-thinking their enterprise architecture with real-time context at the forefront. Monoliths are evolving into microservices. Datacenters are moving to the cloud. What was once a ‘batch’ mindset is quickly being replaced with stream processing as the demands of the business impose real-time requirements on technology leaders.
Join Argyle, in partnership with Confluent, in our 2018 CIO Virtual Event: The Digital Transformation Mindset – More Than Just Technology. During the webinar we’ll learn how leading companies across industries rely on a streaming platform to make event-driven architectures central to:
• Improving digital customer experiences through data strategies and IT initiatives
• Reducing risk with real-time monitoring and anomaly detection
• Increasing operational agility with microservices and IoT architectures
Event-Based Business Architecture: Orchestrating Enterprise Communications (confluent)
(Gary Samuelson, GarySamuelson) Kafka Summit SF 2018
A business-oriented view, illustrating both process models and in-flight task progress, is critical to understanding organizational health, efficiency and alignment to strategic goals. The intent of this talk is to illustrate the real-time relationship between Kafka-managed events (event driven) and business architecture via actionable models (real-time analytics).
Takeaways:
-Understand how business views technology in terms of capabilities aligned to strategy.
-Introduce process model and performance views into an event-oriented dashboard. This view illustrates the organization in terms of collaborating human and automated services.
-Illustrate how system architecture dovetails with business goals through aligned business/IT architectures.
Understanding Event Streaming in Under 10 Minutes (confluent)
To increase business velocity, boost competitiveness through new products and services, and react quickly to sudden market shifts, data and event streams must be shared, processed, and analyzed in real time. Apache Kafka has established itself as the industry standard for event streaming. Whether Connected Car, Industrie 4.0, or Customer 360, all of these forward-looking topics require fast communication, efficient networking, and real-time processing of enormous volumes of data.
Stream me to the Cloud (and back) with Confluent & MongoDB (confluent)
In this online talk, we’ll explore how and why companies are leveraging Confluent and MongoDB to modernize their architecture and leverage the scalability of the cloud and the velocity of streaming. Based upon a sample retail business scenario, we will explain how changes in an on-premise database are streamed via the Confluent Cloud to MongoDB Atlas and back.
Financial Event Sourcing at Enterprise Scale (confluent)
For years, Rabobank has been actively investing in becoming a real-time, event-driven bank. If you are familiar with banking processes, you will understand that this is not simple. Many banking processes are implemented as batch jobs on not-so-commodity hardware, meaning that any migration effort is immense.
*Find out how Rabobank redesigned Rabo Alerts while continuing to provide a robust and stable alert system for its existing user base
*Learn how the project team managed to achieve a balance between the need to decentralise activity while not losing control
*Understand how Rabobank re-invented a reliable service to meet modern customer expectations
Apache Kafka vs. Integration Middleware (MQ, ETL, ESB) - Friends, Enemies or ... (confluent)
MQ, ETL, and ESB middleware are often used as the integration backbone between legacy applications, modern microservices, and cloud services. This introduces several challenges and complexities, such as point-to-point integration and non-scalable architectures. This session discusses how to build a completely event-driven streaming platform using Apache Kafka’s open source messaging, integration, and streaming components to gain distributed processing, fault tolerance, rolling upgrades, and the ability to reprocess events. Learn the differences between an event-driven streaming platform built on Apache Kafka and middleware like MQ, ETL, and ESBs, including best practices and anti-patterns, but also how these concepts and tools complement each other in an enterprise architecture.
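The decoupling that distinguishes an event-streaming platform from point-to-point middleware can be illustrated with a toy in-memory "topic" (this is a simplified model, not Kafka's actual API): producers publish once to an append-only log, and any number of consumers subscribe independently, so no direct wiring between applications is needed.

```python
from typing import Any, Callable, List

# Toy in-memory topic modeling the pub/sub decoupling described above.
class Topic:
    def __init__(self) -> None:
        self._log: List[Any] = []          # append-only log, like a Kafka partition
        self._subscribers: List[Callable[[Any], None]] = []

    def subscribe(self, handler: Callable[[Any], None]) -> None:
        self._subscribers.append(handler)

    def publish(self, event: Any) -> None:
        self._log.append(event)            # events are retained and could be replayed
        for handler in self._subscribers:
            handler(event)

orders = Topic()
billing, shipping = [], []
# Two independent downstream applications subscribe to the same stream;
# the producer knows nothing about either of them.
orders.subscribe(billing.append)
orders.subscribe(shipping.append)
orders.publish({"order_id": 1, "amount": 99.0})
```

Because the log is retained, a new consumer could also replay past events — the reprocessing capability the session contrasts with traditional MQ delivery.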
Driving Business Transformation with Real-Time Analytics Using Apache Kafka a... (confluent)
Watch this talk here: https://www.confluent.io/online-talks/driving-business-transformation-real-time-analytics-using-apache-kafka-and-ksql
Digital transformation is more than just a buzzword; it has become a necessity in order to compete in the modern era. At the heart of digital transformation is real-time data. Your organization must respond in real time to every customer experience, transaction, sale, and market movement in order to stay competitive.
Streaming data technologies like Apache Kafka® and Confluent KSQL, the streaming SQL engine for Apache Kafka, are being used to detect and react to events as they occur. Combining this technology with analytics insights from RCG and visualizations from Arcadia Data delivers a powerful foundation for driving real-time business decisions. Use cases span industries and include retail transaction cost analysis, automotive maintenance and loyalty program management, and credit card fraud detection.
Join experts from Confluent, RCG and Arcadia Data for a discussion and demo on how companies are integrating streaming data technologies to transform their business.
You will learn:
-Why Apache Kafka is widely used for real-time event monitoring and decisioning
-How to integrate real-time analytics and visualizations to drive business processes
-How KSQL, streaming SQL for Kafka, can easily transform and filter streams of data in real time
IBM Cloud Pak for Integration with Confluent Platform powered by Apache Kafka (Kai Wähner)
The Rise of Data in Motion powered by Event Streaming - Use Cases and Architecture for IBM Cloud Pak with Confluent Platform. Includes screenshots of the live demo (integration between IBM and Kafka via Confluent Platform and Kafka Connect connectors).
Learn about the integration capabilities of IBM Cloud Pak for Integration, now with Confluent Platform, the industry’s leading event streaming platform powered by Apache Kafka.
The Business Case for Cloud Management - RightScale Compute 2013 (RightScale)
Speakers:
Nick Kephart - Product Marketing Manager, RightScale
Bailey Caldwell - VP Business Development, RightScale
We’ll discuss the cloud technology landscape and how RightScale fits in to manage Infrastructure as a Service (IaaS) and Platform as a Service (PaaS). This session will clarify cloud benefits, cloud challenges, and how cloud management can drive agility, cost, and time savings. You’ll leave with a checklist of how to quantify the business case for cloud management.
We introduce LEAD, a formal CEP (complex event processing) approach presented at DEBS 2019. Using a running example, we explain how the pattern-matching job is generated using an Aged Colored Petri Net as the logical execution plan. The implementation currently runs on Flink Streaming.
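The abstract does not show LEAD's syntax or semantics; as a generic illustration of what a CEP pattern-matching job does, the sketch below detects the sequence "an event of type A followed by one of type B" within a time window. The event encoding and window rule are invented for the example:

```python
# Generic CEP idea (not LEAD's actual semantics): match "A then B"
# where the two events arrive within `window` time units of each other.
def match_a_then_b(events, window):
    """events: iterable of (timestamp, type) pairs, in timestamp order."""
    matches = []
    pending_a = []                       # timestamps of not-yet-matched A events
    for ts, kind in events:
        if kind == "A":
            pending_a.append(ts)
        elif kind == "B":
            # keep only the A events that are still inside the window
            pending_a = [t for t in pending_a if ts - t <= window]
            matches.extend((t, ts) for t in pending_a)
            pending_a = []
    return matches

result = match_a_then_b([(1, "A"), (3, "B"), (20, "B")], window=5)
```

An engine like Flink generalizes this idea: patterns compile to an execution plan (in LEAD's case, an Aged Colored Petri Net) that tracks partial matches and expires them as the window closes.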
Development of dynamic pricing for tours using real-time data feeds | Mourad ... (HostedbyConfluent)
FREE NOW’s business is growing rapidly, as is the ride-hailing industry in general, which creates a fair number of technical challenges related to real-time data aggregation and processing. FREE NOW was a long-time user of Kafka and recently adopted Confluent Cloud as its main streaming data platform. We managed to scale it to several hundred topics containing various information about trips, locations, and overall business performance. This information is heavily utilized to build streaming applications such as dynamic pricing computation and fraud detection, as well as real-time analytics for marketing campaigns, and much more. We would like to share the details of the implementation of the real-time computation of dynamic tour pricing, which is based on more than 200 million events daily. We would also like to reflect on how Confluent helped us address the development complexity while providing scalability options at the same time.
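The talk does not disclose FREE NOW's actual pricing algorithm; as a rough sketch of the general shape of such a computation, a price multiplier can be derived from demand and supply counts aggregated over a short window. All names, the formula, and the clamping bounds below are invented for illustration:

```python
# Toy dynamic-pricing computation (invented formula, not FREE NOW's):
# derive a multiplier from the demand/supply ratio over a window,
# clamped between a floor (no discount) and a cap (surge limit).
def surge_multiplier(requests: int, drivers: int,
                     floor: float = 1.0, cap: float = 3.0) -> float:
    if drivers == 0:
        return cap
    ratio = requests / drivers
    return max(floor, min(cap, ratio))

def price(base_fare: float, requests: int, drivers: int) -> float:
    return round(base_fare * surge_multiplier(requests, drivers), 2)

# Balanced supply: no surge. High demand: multiplier applies, capped at 3x.
normal = price(10.0, requests=8, drivers=10)   # ratio 0.8 -> floored at 1.0
busy = price(10.0, requests=25, drivers=5)     # ratio 5.0 -> capped at 3.0
```

In production the `requests` and `drivers` counts would come from windowed aggregations over Kafka topics (per city zone, per time window) rather than function arguments.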
Event Streaming CTO Roundtable for Cloud-native Kafka Architectures (Kai Wähner)
Technical thought leadership presentation to discuss how leading organizations move to real-time architecture to support business growth and enhance customer experience. This is a forum to discuss use cases with your peers to understand how other digital-native companies are utilizing data in motion to drive competitive advantage.
Agenda:
- Data in Motion with Event Streaming and Apache Kafka
- Streaming ETL Pipelines
- IT Modernisation and Hybrid Multi-Cloud
- Customer Experience and Customer 360
- IoT and Big Data Processing
- Machine Learning and Analytics
Camunda Day Amsterdam 2019: Workflow Automation in Microservices Architecture... (camunda services GmbH)
Many Camunda users are starting to adopt microservices. In this presentation Niall Deehan, IT consultant at Camunda, shares experiences and best practices on how to apply BPMN and the Camunda platform in your microservices architecture. He’ll tackle questions around how you can slice end-to-end business processes into appropriate pieces, how many engines you should operate, and how to stay in control.
Cisco’s E-Commerce Transformation Using Kafka (confluent)
(Gaurav Goyal + Dharmesh Panchmatia, Cisco Systems) Kafka Summit SF 2018
Cisco’s e-commerce platform is a custom-built, mission-critical platform that accounts for $40+ billion of Cisco’s revenue annually. It’s a suite of 35 different applications and 300+ services that powers product configuration, pricing, quoting, and order booking across all Cisco product lines, including hardware, software, services, and subscriptions. It’s a B2B platform used by the Cisco sales team, partners, and direct customers, serving 140,000 unique users across the globe. In order to improve customer experience and business agility, Cisco decided to transition the platform to cloud-native technologies: MongoDB, Elasticsearch, and Kafka.
In this session, we will share details around:
-Kafka architecture
-How we are experiencing significant resiliency advantages, zero-downtime deployment and improved performance
-How we’ve implemented Kafka to pass data to 20+ downstream applications, removing point-to-point integrations, batch jobs and standardizing the handshake
-How we are using Kafka to push data for machine learning and analytics use cases
-Best practices and lessons learned
The rise of data in motion in the insurance industry is visible across all lines of business, including life, healthcare, travel, vehicle, and others. Apache Kafka changes how enterprises rethink data. This blog post explores use cases and architectures for event streaming. Real-world examples from Generali, Centene, Humana, and Tesla show innovative insurance-related data integration and stream processing in real time.
Confluent Cloud for Apache Kafka® | Google Cloud Next ’19 (confluent)
Google Cloud Next ’19
Speakers:
Gaetan Castelein, Confluent Product Marketing
Kir Titievsky, Google Product Management
Confluent Cloud for Apache Kafka® was a session conducted at Google Cloud Next ’19 on how Confluent and Google are partnering to give you a complete event-streaming platform that extends Kafka with essential capabilities for developers and enterprises. Confluent is available as a fully managed, first-class service on GCP, or can be deployed on-premises on Google Cloud Services Platform. Developers can deploy Confluent Cloud™ in minutes right from the Google Cloud Console to start building event-driven applications. Enterprises can build hybrid cloud streaming solutions with a common platform that spans from on-premises to GCP, streaming data to GCP to leverage best-of-breed services such as BigQuery and TensorFlow. Review this presentation to learn about Confluent and GCP services, and see how you can get started in just minutes with no upfront commitment.
Fast Data – Fast Cars: How Apache Kafka Is Revolutionizing the Data World (confluent)
For the automotive industry, as for every other sector, digital transformation is also a digital revolution: new market players, new technologies, and ever-growing volumes of data create new opportunities but also new challenges, and require not only new IT architectures but entirely new ways of thinking.
60% of Fortune 500 companies rely on Apache Kafka®, the comprehensive distributed streaming platform, for their data streaming projects, including AUDI AG.
In this webinar, you will learn:
How Kafka serves as the foundation both for data pipelines and for applications that consume and process real-time data streams
How Kafka Connect and Kafka Streams support business-critical applications
How Audi used Kafka and Confluent to build a fast-data IoT platform that is revolutionizing the Connected Car space
Speakers:
David Schmitz, Principal Architect, Audi Electronics Venture GmbH
Kai Waehner, Technology Evangelist, Confluent
Apache Kafka in Financial Services - Use Cases and Architectures (Kai Wähner)
The Rise of Event Streaming in Financial Services - Use Cases, Architectures and Examples powered by Apache Kafka.
The New FinServ Enterprise Reality: Every company is a software company. Innovate OR be Disrupted. Learn how Event Streaming with Apache Kafka and its ecosystem help...
More details:
https://www.kai-waehner.de/apache-kafka-financial-services-industry-banking-finserv-payment-fraud-middleware-messaging-transactions
https://www.kai-waehner.de/blog/2020/04/15/apache-kafka-machine-learning-banking-finance-industry/
https://www.kai-waehner.de/blog/2020/04/24/mainframe-offloading-replacement-apache-kafka-connect-ibm-db2-mq-cdc-cobol/
Defining a Cloud Adoption Journey to Deliver Cloud Native Services (Amazon Web Services)
Hear how the NSW Department of Industry defined a cloud adoption journey that enabled the migration of over 500 applications and 800 databases in 18 months, developing an iterative agile methodology that drove momentum to migrate 25 applications per week to the cloud.
Speaker: Michael Cracroft, Chief Security and Technology Officer, Service NSW & Ben Thurgood, Solutions Architect, AWS
What are the best digital transformation tools for your business? (learntransformation0)
These tools, whether cloud computing services, data analytics platforms, or collaboration tools, serve as the cornerstone on which a digitally transformed organisation is built.
Event-Streaming verstehen in unter 10 Minconfluent
Um die unternehmerische Geschwindigkeit zu erhöhen, die Wettbewerbsfähigkeit durch neue Produkte und Services zu steigern und schnell auf plötzlich ändernde Markteinflüsse reagieren zu können, müssen Daten und Ereignisströme in Echtzeit geteilt, verarbeitet und ausgewertet werden können. Apache Kafka hat sich hier als Industrie-Standard für Event-Streaming etabliert. Ob Connected Car, Industrie 4.0 oder Customer 360 – alle diese zukunftsorientierten Themen benötigen schnelle Kommunikation, effiziente Vernetzung und eine Verarbeitung von enormen Datenmengen in Echtzeit.
Stream me to the Cloud (and back) with Confluent & MongoDBconfluent
In this online talk, we’ll explore how and why companies are leveraging Confluent and MongoDB to modernize their architecture and leverage the scalability of the cloud and the velocity of streaming. Based upon a sample retail business scenario, we will explain how changes in an on-premise database are streamed via the Confluent Cloud to MongoDB Atlas and back.
Financial Event Sourcing at Enterprise Scaleconfluent
For years, Rabobank has been actively investing in becoming a real-time, event-driven bank. If you are familiar with banking processes, you will understand that this is not simple. Many banking processes are implemented as batch jobs on not-so-commodity hardware, meaning that any migration effort is immense.
*Find out how Rabobank redesigned Rabo Alerts while continuing to provide a robust and stable alert system for its existing user base
*Learn how the project team managed to achieve a balance between the need to decentralise activity while not losing control
*Understand how Rabobank re-invented a reliable service to meet modern customer expectations
Apache Kafka vs. Integration Middleware (MQ, ETL, ESB) - Friends, Enemies or ...confluent
MQ, ETL and ESB middleware are often used as integration backbone between legacy applications, modern microservices and cloud services. This introduces several challenges and complexities like point-to-point integration or non-scalable architectures. This session discusses how to build a completely event-driven streaming platform leveraging Apache Kafka’s open source messaging, integration and streaming components to leverage distributed processing, fault-tolerance, rolling upgrades and the ability to reprocess events. Learn the differences between a event-driven streaming platform leveraging Apache Kafka and middleware like MQ, ETL and ESBs – including best practices and anti-patterns, but also how these concepts and tools complement each other in an enterprise architecture.
Driving Business Transformation with Real-Time Analytics Using Apache Kafka a...confluent
Watch this talk here: https://www.confluent.io/online-talks/driving-business-transformation-real-time-analytics-using-apache-kafka-and-ksql
Digital transformation is more than just a buzzword, it’s become a necessity in order to compete in the modern era. At the heart of digital transformation is real-time data. Your organization must respond in real time to every customer experience transaction, sale, and market movement in order to stay competitive.
Streaming data technologies like Apache Kafka® and Confluent KSQL, the streaming SQL engine for Apache Kafka, are being used to detect and react to events as they occur. Combining this technology with the analytics insights from RCG and visualizations from Arcadia Data delivers a powerful foundation for driving real time business decisions. Use cases span across industries and include retail transaction cost analysis, automotive maintenance and loyalty program management, and credit card fraud detection.
Join experts from Confluent, RCG and Arcadia Data for a discussion and demo on how companies are integrating streaming data technologies to transform their business.
You will learn:
-Why Apache Kafka is widely used for real-time event monitoring and decisioning
-How to integrate real-time analytics and visualizations to drive business processes
-How KSQL, streaming SQL for Kafka, can easily transform and filter streams of data in real time
IBM Cloud Pak for Integration with Confluent Platform powered by Apache KafkaKai Wähner
The Rise of Data in Motion powered by Event Streaming - Use Cases and Architecture for IBM Cloud Pak with Confluent Platform. Including screenshots of the live demo (integration between IBM and Kafka via Confluent Platform and Kafka Connect connectors).
Learn about the integration capabilities of IBM Cloud Pak for Integration, now with the industry’s leading event streaming platform from Confluent Platform powered by Apache Kafka.
The Business Case for Cloud Management - RightScale Compute 2013RightScale
Speakers:
Nick Kephart - Product Marketing Manager, RightScale
Baliey Caldwell - VP Business Development, RightScale
We’ll discuss the cloud technology landscape and how RightScale fits in to manage Infrastructure as a Service (IaaS) and Platform as a Service (PaaS). This session will clarify cloud benefits, cloud challenges, and how cloud management can drive agility, cost, and time savings. You’ll leave with a checklist of how to quantify the business case for cloud management.
We introduce LEAD a formal CEP that was presented at DEBS 2019. Using a running example, we explain how the pattern matching job is generated using Aged Colored Petri Net as logical execution plan. The implementation is currently under Flink Streaming.
Development of dynamic pricing for tours using real-time data feeds | Mourad ...HostedbyConfluent
FREE NOW business is growing rapidly as a ride-hailing industry in general which creates a fair amount of technical challenges related to real-time data aggregation and processing. FREE NOW was a long-time user of Kafka and lately adopted Confluent Cloud as a mainstreaming data platform. We managed to scale it towards several hundreds of topics containing various information about the trip, location and business performance overall. This information is heavily utilized to create streaming applications like dynamic pricing computation, fraud detection as well as real-time analytics for marketing campaigns, and much more. We would like to share the details of the implementation for the real-time computation of the dynamic tour pricing which is based on more than 200 million events daily. Also, we would like to reflect on how Confluent helped us to address the development complexity and provide scalability options at the same time.
Event Streaming CTO Roundtable for Cloud-native Kafka ArchitecturesKai Wähner
Technical thought leadership presentation to discuss how leading organizations move to real-time architecture to support business growth and enhance customer experience. This is a forum to discuss use cases with your peers to understand how other digital-native companies are utilizing data in motion to drive competitive advantage.
Agenda:
- Data in Motion with Event Streaming and Apache Kafka
- Streaming ETL Pipelines
- IT Modernisation and Hybrid Multi-Cloud
- Customer Experience and Customer 360
- IoT and Big Data Processing
- Machine Learning and Analytics
Camunda Day Amsterdam 2019: Workflow Automation in Microservices Architecture...camunda services GmbH
Many Camunda users start to adopt microservices. In this presentation Niall Deehan, IT consultant at Camunda, shares experiences and best practices on how to apply BPMN and the Camunda platform in your microservices architecture. `He’ll tackle questions around how you can slice end-to-end business processes into appropriate pieces, how many engines you should operate and how to keep in control.
Cisco’s E-Commerce Transformation Using Kafka confluent
(Gaurav Goyal + Dharmesh Panchmatia, Cisco Systems) Kafka Summit SF 2018
Cisco e-commerce platform is a custom-built mission-critical platform which accounts for $40+ billion of Cisco’s revenue annually. It’s a suite of 35 different applications and 300+ services that powers product configuration, pricing, quoting and order booking across all Cisco product lines including hardware, software, services and subscriptions. It’s a B2B platform used by the Cisco sales team, partners and direct customers, serving 140,000 unique users across the globe. In order to improve customer experience and business agility, Cisco decided to transition the platform to cloud-native technologies, MongoDB, Elasticsearch and Kafka.
In this session, we will share details around:
-Kafka architecture
-How we are experiencing significant resiliency advantages, zero-downtime deployment and improved performance
-How we’ve implemented Kafka to pass data to 20+ downstream applications, removing point-to-point integrations, batch jobs and standardizing the handshake
-How are we using Kafka for pushing data for machine learning and analytics use cases
-Best practices and lessons learned
The rise of data in motion in the insurance industry is visible across all lines of business including life, healthcare, travel, vehicle, and others. Apache Kafka changes how enterprises rethink data. This blog post explores use cases and architectures for event streaming. Real-world examples from Generali, Centene, Humana, and Telsa show innovative insurance-related data integration and stream processing in real-time.
Confluent Cloud for Apache Kafka® | Google Cloud Next ’19confluent
Google Cloud Next ’19
Speakers:
Gaetan Castelein, Confluent Product Marketing
Kir Titievsky, Google Product Management
Confluent Cloud for Apache Kafka® was a session conducted at Google Cloud Next ’19 on the topic of how Confluent and Google are partnering to give you a complete event-streaming platform that extends Kafka with essential capabilities for developers and enterprises. Confluent is available as a fully managed, first class service on GCP, or can be deployed on-premises on Google Cloud Services Platform. Developers can deploy Confluent Cloud™ in minutes right from the Google Cloud Console to start building event-driven applications. Enterprises can build hybrid cloud streaming solutions with a common platform that spans from on-premises to GCP, streaming data to GCP to leverage best-of-breed services such as BigQuery and TensorFlow. Review this presentation to learn about Confluent and GCP services, and see how you can get started in just minutes with no upfront commitment.
Fast Data – Fast Cars: How Apache Kafka Is Revolutionizing the World of Dataconfluent
For the automotive industry, as for every other sector, digital transformation is also a digital revolution: new market players, new technologies and ever-growing volumes of data create new opportunities but also new challenges, and they call for entirely new ways of thinking as well as new IT architectures.
60% of Fortune 500 companies rely on the comprehensive distributed streaming platform Apache Kafka® for their data streaming projects, among them AUDI AG.
In this webinar you will learn:
How Kafka serves as the foundation both for data pipelines and for applications that consume and process real-time data streams.
How Kafka Connect and Kafka Streams support business-critical applications
How Audi used Kafka and Confluent to build a Fast Data IoT platform that is revolutionizing the connected-car space
Speakers:
David Schmitz, Principal Architect, Audi Electronics Venture GmbH
Kai Waehner, Technology Evangelist, Confluent
Apache Kafka in Financial Services - Use Cases and ArchitecturesKai Wähner
The Rise of Event Streaming in Financial Services - Use Cases, Architectures and Examples powered by Apache Kafka.
The New FinServ Enterprise Reality: Every company is a software company. Innovate OR be Disrupted. Learn how Event Streaming with Apache Kafka and its ecosystem help...
More details:
https://www.kai-waehner.de/apache-kafka-financial-services-industry-banking-finserv-payment-fraud-middleware-messaging-transactions
https://www.kai-waehner.de/blog/2020/04/15/apache-kafka-machine-learning-banking-finance-industry/
https://www.kai-waehner.de/blog/2020/04/24/mainframe-offloading-replacement-apache-kafka-connect-ibm-db2-mq-cdc-cobol/
Defining a Cloud Adoption Journey to Deliver Cloud Native ServicesAmazon Web Services
Hear how the NSW Department of Industry defined a cloud adoption journey that enabled the migration of over 500 applications and 800 databases in 18 months, developing an interactive agile methodology that drove momentum to migrate 25 applications per week to the cloud.
Speaker: Michael Cracroft, Chief Security and Technology Officer, Service NSW & Ben Thurgood, Solutions Architect, AWS
What are the best digital transformation tools for your business?learntransformation0
These tools, whether cloud computing services, data analytics platforms, or collaboration tools, serve as the cornerstone on which a digitally transformed organisation is built.
Cloud technology is no longer a new player in the market; it is a mature and integral part of the IT landscape and a key parameter in driving business growth. It is an indispensable topic among CXOs. Research by Fraedon has found that almost half of banks consider their legacy systems the biggest hindrance to their growth.
Show Me the Money: Connecting Performance Engineering to Real Business ResultsCorrelsense
Performance testing and optimization are often neglected parts of enterprise application roll out and upgrade initiatives.
The challenge for many IT managers is communicating the value of IT performance projects to business stakeholders who would benefit the most.
An interactive discussion with Walter Kuketz, CTO of Collaborative Consulting where he shares:
- How to align key business drivers with your performance engineering projects
- Ways to bridge the IT-business stakeholder communication gap
- A new approach to model business transactions and their IT dependencies
Host: Frank Days
Title: VP of Marketing, Correlsense
Real Time Customer Experience for today's Right-Now EconomyDataStax
Milliseconds of interactions define the moments your customers experience with you and your brand. This is now the granularity at which customer engagement is defined. We will share and demonstrate how we have helped our customers to successfully deliver a highly personalized, responsive and consistent experience -- both in the moment and at scale. The results? Customer satisfaction and advocacy in today's right-now digital economy which translate to increased brand loyalty and revenue growth.
Confluent & GSI Webinars series - Session 3confluent
An in-depth look at how Confluent is being used in the financial services industry. Gain an understanding of how organisations are utilising data in motion to solve common problems and benefit from their real-time data capabilities.
It will look more deeply into some specific use cases and show how Confluent technology is used to manage costs and mitigate risks.
This session is aimed at Solutions Architects, Sales Engineers and Pre-Sales, as well as more technically minded, business-aligned people. Whilst this is not a deeply technical session, some knowledge of Kafka would be helpful.
Devoteam itsmf 2021 - from business automation to continuous value-driven i...itSMF Belgium
The race for enterprise business process digitalization is raging. IT is often left behind as enterprise innovation budgets shift towards business teams.
During this session, we will present the challenges and our field-tested approaches to catching up, and show how to seize this opportunity to create new app factories, all while using low-code and RPA platforms.
You will discover how to capture business demands and create an operating model that keeps your IT department in control of the applications being deployed while delivering value at speed.
Catch the Wave: SAP Event-Driven and Data Streaming for the Intelligence Ente...confluent
In our exclusive webinar, you'll learn why event-driven architecture is the key to unlocking cost efficiency, operational effectiveness, and profitability. Gain insights on how this approach differs from API-driven methods and why it's essential for your organization's success.
Unlocking the Power of IoT: A comprehensive approach to real-time insightsconfluent
In today's data-driven world, the Internet of Things (IoT) is revolutionizing industries and unlocking new possibilities. Join Data Reply, Confluent, and Imply as we unveil a comprehensive solution for IoT that harnesses the power of real-time insights.
Hybrid workshop: Stream Processing with Flinkconfluent
Stream processing is a prerequisite of the data streaming stack, powering real-time applications and pipelines.
It enables greater data portability, optimized resource utilization, and a better customer experience by processing data streams in real time.
In our hands-on hybrid workshop, you will learn how to easily filter, join, and enrich real-time data within Confluent Cloud using our serverless Flink service.
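The filter/join/enrich operations the workshop covers can be previewed with a plain-Python sketch. The event and user data below are made-up examples; in the workshop itself the same logic would be expressed in Flink SQL on Confluent Cloud:

```python
# Hypothetical clickstream and reference data -- illustrative only,
# not the workshop's actual dataset.
clicks = [
    {"user_id": 1, "url": "/pricing"},
    {"user_id": 2, "url": "/health"},   # monitoring noise, filtered out below
    {"user_id": 1, "url": "/checkout"},
]
users = {1: {"name": "Ana", "tier": "gold"}, 2: {"name": "Luis", "tier": "free"}}

def process(stream):
    """Filter out health-check noise, then enrich each click with user info
    (the streaming analogue of a WHERE clause plus a stream-table join)."""
    for event in stream:
        if event["url"] == "/health":
            continue                               # filter
        profile = users.get(event["user_id"], {})
        yield {**event, **profile}                 # enrich / join

enriched = list(process(clicks))
# enriched[0] -> {'user_id': 1, 'url': '/pricing', 'name': 'Ana', 'tier': 'gold'}
```

A generator keeps the pipeline lazy, mirroring how a stream processor handles events one at a time rather than materializing the whole input.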
Industry 4.0: Building the Unified Namespace with Confluent, HiveMQ and Spark...confluent
Our talk will explore the transformative impact of integrating Confluent, HiveMQ, and SparkPlug in Industry 4.0, emphasizing the creation of a Unified Namespace.
In addition to the creation of a Unified Namespace, our webinar will also delve into Stream Governance and Scaling, highlighting how these aspects are crucial for managing complex data flows and ensuring robust, scalable IIoT-Platforms.
You will learn how to ensure data accuracy and reliability, expand your data processing capabilities, and optimize your data management processes.
Don't miss out on this opportunity to learn from industry experts and take your business to the next level.
Event-driven architecture (EDA) will be the heart of MAPFRE's ecosystem. To remain competitive, today's companies depend increasingly on real-time data analysis, which gives them faster insights and response times. Running a business on real-time data means being situationally aware, detecting and responding to what is happening in the world right now.
Events and Microservices - Santander TechTalkconfluent
During this session we will examine how the worlds of events and microservices complement and improve each other, exploring how event-driven patterns allow us to decompose monoliths in a scalable, resilient and decoupled way.
The purpose of the session is to dive into Apache Kafka, data streaming, and Kafka in the cloud:
- Dive into Apache Kafka
- Data Streaming
- Kafka in the cloud
Build real-time streaming data pipelines to AWS with Confluentconfluent
Traditional data pipelines often face scalability issues and challenges related to cost, their monolithic design, and reliance on batch data processing. They also typically operate under the premise that all data needs to be stored in a single centralized data source before it's put to practical use. Confluent Cloud on Amazon Web Services (AWS) provides a fully managed cloud-native platform that helps you simplify the way you build real-time data flows using streaming data pipelines and Apache Kafka.
Q&A with Confluent Professional Services: Confluent Service Meshconfluent
No matter whether you are migrating your Kafka cluster to Confluent Cloud, running a cloud-hybrid environment, or are in a different situation where data protection and encryption of sensitive information is required, Confluent Service Mesh allows you to transparently encrypt your data without the need to make code changes to your existing applications.
Citi Tech Talk: Event Driven Kafka Microservicesconfluent
Microservices have become a dominant architectural paradigm for building systems in the enterprise, but they are not without their tradeoffs. Learn how to build event-driven microservices with Apache Kafka.
Transforming applications built with traditional messaging solutions such as TIBCO, MQ and Solace to be scalable, reliable and ready for the move to cloud
How can applications built with traditional messaging technologies like TIBCO, Solace and IBM MQ be modernised and made cloud-ready? What are the advantages of event streaming approaches to pub/sub versus traditional message queues? What are the strengths and weaknesses of both approaches, and which use cases and requirements are actually a better fit for messaging than Kafka?
This session will show why the old paradigm does not work and why a new approach to data strategy is needed. It aims to show how a Data Streaming Platform is integral to the evolution of a company's data strategy and how Confluent is not just an integration layer but the central nervous system of an organisation.
You will also learn how to:
• Build products and features faster with a complete suite of connectors and stream-management tools, and connect your environments to data pipelines
• Protect your most critical data and workloads with built-in security, governance and resilience guarantees
• Deploy Kafka at scale in minutes while reducing the associated costs and operational burden
Confluent Partner Tech Talk with Synthesisconfluent
A discussion of the arduous planning process, and a deep dive into the design and architectural decisions.
Learn more about the networking, RBAC strategies, the automation, and the deployment plan.
Elevating Tactical DDD Patterns Through Object CalisthenicsDorra BARTAGUIZ
After immersing yourself in the blue book and its red counterpart, attending DDD-focused conferences, and applying tactical patterns, you're left with a crucial question: How do I ensure my design is effective? Tactical patterns within Domain-Driven Design (DDD) serve as guiding principles for creating clear and manageable domain models. However, achieving success with these patterns requires additional guidance. Interestingly, we've observed that a set of constraints initially designed for training purposes remarkably aligns with effective pattern implementation, offering a more ‘mechanical’ approach. Let's explore together how Object Calisthenics can elevate the design of your tactical DDD patterns, offering concrete help for those venturing into DDD for the first time!
Securing your Kubernetes cluster_ a step-by-step guide to success !KatiaHIMEUR1
Today, after several years of existence, an extremely active community and an ultra-dynamic ecosystem, Kubernetes has established itself as the de facto standard in container orchestration. Thanks to a wide range of managed services, it has never been so easy to set up a ready-to-use Kubernetes cluster.
However, this ease of use means that the subject of security in Kubernetes is often left for later, or even neglected. This exposes companies to significant risks.
In this talk, I'll show you step-by-step how to secure your Kubernetes cluster for greater peace of mind and reliability.
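As a hypothetical example of the kind of step such a hardening guide typically starts with (our illustration, not necessarily the speaker's material), a default-deny NetworkPolicy blocks all pod traffic in a namespace until specific flows are explicitly allowed:

```yaml
# Deny all ingress and egress for every pod in the namespace
# ("my-namespace" is a placeholder); allow-rules are then added per workload.
apiVersion: networking.k8s.io/v1
kind: NetworkPolicy
metadata:
  name: default-deny-all
  namespace: my-namespace
spec:
  podSelector: {}        # empty selector = all pods in the namespace
  policyTypes:
    - Ingress
    - Egress
```

Starting from deny-all and whitelisting known traffic is generally considered safer than the cluster default, which allows all pod-to-pod communication.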
GraphRAG is All You need? LLM & Knowledge GraphGuy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...DanBrown980551
Do you want to learn how to model and simulate an electrical network from scratch in under an hour?
Then welcome to this PowSyBl workshop, hosted by Rte, the French Transmission System Operator (TSO)!
During the webinar, you will discover the PowSyBl ecosystem as well as handle and study an electrical network through an interactive Python notebook.
PowSyBl is an open source project hosted by LF Energy, which offers a comprehensive set of features for electrical grid modelling and simulation. Among other advanced features, PowSyBl provides:
- A fully editable and extendable library for grid component modelling;
- Visualization tools to display your network;
- Grid simulation tools, such as power flows, security analyses (with or without remedial actions) and sensitivity analyses;
The framework is mostly written in Java, with a Python binding so that Python developers can access PowSyBl functionalities as well.
What you will learn during the webinar:
- For beginners: discover PowSyBl's functionalities through a quick general presentation and the notebook, without needing any expert coding skills;
- For advanced developers: master the skills to efficiently apply PowSyBl functionalities to your real-world scenarios.
Accelerate your Kubernetes clusters with Varnish CachingThijs Feryn
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
Essentials of Automations: Optimizing FME Workflows with ParametersSafe Software
Are you looking to streamline your workflows and boost your projects’ efficiency? Do you find yourself searching for ways to add flexibility and control over your FME workflows? If so, you’re in the right place.
Join us for an insightful dive into the world of FME parameters, a critical element in optimizing workflow efficiency. This webinar marks the beginning of our three-part “Essentials of Automation” series. This first webinar is designed to equip you with the knowledge and skills to utilize parameters effectively: enhancing the flexibility, maintainability, and user control of your FME projects.
Here’s what you’ll gain:
- Essentials of FME Parameters: Understand the pivotal role of parameters, including Reader/Writer, Transformer, User, and FME Flow categories. Discover how they are the key to unlocking automation and optimization within your workflows.
- Practical Applications in FME Form: Delve into key user parameter types including choice, connections, and file URLs. Allow users to control how a workflow runs, making your workflows more reusable. Learn to import values and deliver the best user experience for your workflows while enhancing accuracy.
- Optimization Strategies in FME Flow: Explore the creation and strategic deployment of parameters in FME Flow, including the use of deployment and geometry parameters, to maximize workflow efficiency.
- Pro Tips for Success: Gain insights on parameterizing connections and leveraging new features like Conditional Visibility for clarity and simplicity.
We’ll wrap up with a glimpse into future webinars, followed by a Q&A session to address your specific questions surrounding this topic.
Don’t miss this opportunity to elevate your FME expertise and drive your projects to new heights of efficiency.
Transcript: Selling digital books in 2024: Insights from industry leaders - T...BookNet Canada
The publishing industry has been selling digital audiobooks and ebooks for over a decade and has found its groove. What’s changed? What has stayed the same? Where do we go from here? Join a group of leading sales peers from across the industry for a conversation about the lessons learned since the popularization of digital books, best practices, digital book supply chain management, and more.
Link to video recording: https://bnctechforum.ca/sessions/selling-digital-books-in-2024-insights-from-industry-leaders/
Presented by BookNet Canada on May 28, 2024, with support from the Department of Canadian Heritage.
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...UiPathCommunity
💥 Speed, accuracy, and scaling – discover the superpowers of GenAI in action with UiPath Document Understanding and Communications Mining™:
See how to accelerate model training and optimize model performance with active learning
Learn about the latest enhancements to out-of-the-box document processing – with little to no training required
Get an exclusive demo of the new family of UiPath LLMs – GenAI models specialized for processing different types of documents and messages
This is a hands-on session specifically designed for automation developers and AI enthusiasts seeking to enhance their knowledge in leveraging the latest intelligent document processing capabilities offered by UiPath.
Speakers:
👨🏫 Andras Palfi, Senior Product Manager, UiPath
👩🏫 Lenka Dulovicova, Product Program Manager, UiPath
Key Trends Shaping the Future of Infrastructure.pdfCheryl Hung
Keynote at DIGIT West Expo, Glasgow on 29 May 2024.
Cheryl Hung, ochery.com
Sr Director, Infrastructure Ecosystem, Arm.
This keynote covers the key trends across hardware, cloud and open source, exploring how these areas are likely to mature and develop over the short and long term, and considering how organisations can position themselves to adapt and thrive.
Builder.ai Founder Sachin Dev Duggal's Strategic Approach to Create an Innova...Ramesh Iyer
In today's fast-changing business world, companies that fail to adapt and embrace new ideas often struggle to keep up with the competition. Fostering a culture of innovation, however, is no small task: it takes vision, leadership and a willingness to take risks in the right proportion. Sachin Dev Duggal, co-founder of Builder.ai, has perfected the art of this balance, creating a company culture where creativity and growth are nurtured at each stage.
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered QualityInflectra
In this insightful webinar, Inflectra explores how artificial intelligence (AI) is transforming software development and testing. Discover how AI-powered tools are revolutionizing every stage of the software development lifecycle (SDLC), from design and prototyping to testing, deployment, and monitoring.
Learn about:
• The Future of Testing: How AI is shifting testing towards verification, analysis, and higher-level skills, while reducing repetitive tasks.
• Test Automation: How AI-powered test case generation, optimization, and self-healing tests are making testing more efficient and effective.
• Visual Testing: Explore the emerging capabilities of AI in visual testing and how it's set to revolutionize UI verification.
• Inflectra's AI Solutions: See demonstrations of Inflectra's cutting-edge AI tools like the ChatGPT plugin and Azure Open AI platform, designed to streamline your testing process.
Whether you're a developer, tester, or QA professional, this webinar will give you valuable insights into how AI is shaping the future of software delivery.
The Art of the Pitch: WordPress Relationships and SalesLaura Byrne
Clients don’t know what they don’t know. What web solutions are right for them? How does WordPress come into the picture? How do you make sure you understand scope and timeline? What do you do if sometime changes?
All these questions and more will be explored as we talk about matching clients’ needs with what your agency offers without pulling teeth or pulling your hair out. Practical tips, and strategies for successful relationship building that leads to closing the deal.
State of ICS and IoT Cyber Threat Landscape Report 2024 previewPrayukth K V
The IoT and OT threat landscape report has been prepared by the Threat Research Team at Sectrio using data from Sectrio's cyber threat intelligence gathering facilities spread across over 85 cities around the world. In addition, Sectrio also runs AI-based advanced threat and payload engagement facilities that serve as sinks to attract and engage sophisticated threat actors and newer malware, including new variants and latent threats that are at an earlier stage of development.
The latest edition of the OT/ICS and IoT security Threat Landscape Report 2024 also covers:
State of global ICS asset and network exposure
Sectoral targets and attacks as well as the cost of ransom
Global APT activity, AI usage, actor and tactic profiles, and implications
Rise in volumes of AI-powered cyberattacks
Major cyber events in 2024
Malware and malicious payload trends
Cyberattack types and targets
Vulnerability exploit attempts on CVEs
Attacks on counties – USA
Expansion of bot farms – how, where, and why
In-depth analysis of the cyber threat landscape across North America, South America, Europe, APAC, and the Middle East
Why are attacks on smart factories rising?
Cyber risk predictions
Axis of attacks – Europe
Systemic attacks in the Middle East
Download the full report from here:
https://sectrio.com/resources/ot-threat-landscape-reports/sectrio-releases-ot-ics-and-iot-security-threat-landscape-report-2024/
2. Agenda
01 Introduction: challenges in 2020, key trends, the opportunity
02 Your current state… and why we believe there's a future state architecture that will help you win; Introduction to Event Streaming
03 Adopting Event Streaming: typical customer journey / practical advice, including adopting fully-managed services; closing comments on Confluent
3. Headwinds, H1 2020
Being a Digital leader has never been more important.
"Businesses that once mapped digital strategy in one- to three-year phases must now scale their initiatives in a matter of days or weeks."
4. Listening to our customers, three strategic themes remain consistent...
6. Examples... Spoiler alert: event streaming underpins these capabilities
- Obsessing over the customer & customer journey: personalizing and optimizing the next step in the customer's journey, improving CX; contextual interactivity / omnichannel experiences / back office integration; movement to online (perhaps entirely)
- Transform architecture, build for flexibility: lower fixed operating costs, e.g. using Cloud services / migrating to microservices; deploy applications & infrastructure to support the accelerated shift to online channels
- Use data (events) for better, faster real-time decisions & automation: automate business processes (workflow automation, AI & ML), incl. chatbots for customer service, AI & ML for Fraud Prevention, Cyber Risk, Regulatory
- New Entrants / Challengers: new digital business models (adjacent to existing business) / pivoting, e.g. IoT (connected devices), logistics (sensors & big data)...
7. In 2020, these trends have only accelerated: digitizing all channels; increased pressure on cost reduction, incl. Cloud. Amidst the COVID-19 crisis, digital has become central to every interaction, forcing organizations - and individuals - further up the adoption curve almost overnight...
8. Questions / Discussion:
● In this Post-COVID world, do you agree that your technology spend aligns with what other companies have shared with us (~60% cost optimization, 30% improving CX, 10% new business models and innovation)?
● What do you feel is the right mix of spend for your organization and industry?
● Many of our customers are saying they want to become real-time digital enterprises to respond more effectively to disruption and meet the increasing now-now-now demands of their customers in the Post-COVID digital era. Is that also something that your organization is aspiring towards?
9. Part 2: Your current state… and why we believe there's a future state architecture that will help you win - Introduction to Event Streaming
10. Typical Enterprise Data Architecture - a Giant Mess
(Diagram: tangled point-to-point connections spanning Line of Business 01, Line of Business 02 and the public cloud)
11. And the biggest challenge to infrastructure modernization? Executives point to: people, skills, and management challenges.
12. Overall, your current state may restrict your ability to deliver on your strategic objectives:
- Customer Experience: slow batch processing, periodic queries on passive data, a lack of real-time experiences
- Operational Efficiencies: limited scalability, expensive to maintain
- New Business Models: delivering the agility and new features and functions, with real-time requirements
13. Introducing Event Streaming: rethinking data as a continually updating stream of events
14. “Walmart is able to take data from your past buying patterns, their internal stock information, your mobile phone location data, social media as well as external weather information and analyse all of this in seconds so it can send you a voucher for a BBQ cleaner to your phone – but only if you own a barbeque, the weather is nice and you currently are within a 3 miles radius of a Walmart store that has the BBQ cleaner in stock.”
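Stripped of the infrastructure, the decision in the Walmart quote above is a conjunction of real-time conditions. A minimal sketch, with made-up attribute names (not Walmart's actual data model):

```python
def should_send_voucher(customer, store, weather):
    """Send the BBQ-cleaner voucher only when every real-time condition holds.
    All field names here are illustrative assumptions."""
    return (customer["owns_bbq"]
            and weather["is_nice"]
            and customer["distance_to_store_miles"] <= 3
            and store["bbq_cleaner_in_stock"])

customer = {"owns_bbq": True, "distance_to_store_miles": 2.5}
store = {"bbq_cleaner_in_stock": True}
weather = {"is_nice": True}

should_send_voucher(customer, store, weather)  # all conditions hold -> True
```

The hard part in practice is not the rule itself but evaluating it "in seconds": each input (purchase history, stock, location, weather) arrives on its own stream and must be joined in real time.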
15. Transforming Retail through Event Streaming
“Walmart is a $500 billion in revenue company, so every second is worth millions of dollars. Kafka and Confluent are the backbone of our digital omnichannel transformation and success.” —Chris Kasten, VP of Walmart Cloud
Impact:
● Revenue — Kafka and Confluent helped top-line company growth
● Scale — Improved processing of 8,500 nodes for 11 billion events per day
● Brand — Delivered exceptional omnichannel customer experience through new offerings, resulting in brand loyalty
16. “CapitalOne is able to use data, in real-time, to identify an anomalous transaction on your account, based on your contextual history - what you’ve spent in the past, the location of the transaction versus the location of your cell phone. It is able to do this, to prevent fraudulent transactions, whilst offering a superior customer experience and supporting new ways of interacting with the bank, saving both operational costs and fraud losses.”
17. Preventing Fraud with Real-Time Event Streaming
Impact:
● ROI — Replaced a legacy batch-processing system, saving time and preventing potential losses
● Scale — Implemented an event streaming platform able to process more than 27B real-time events a month to pinpoint suspicious behavior and respond
● Customer Experiences — Improved identification of questionable activity, triggering real-time, actionable alerts for customers
“We look at events as running our business. Business people within our organization want to be able to react to events—and oftentimes it’s a combination of events.” —Chris D’Agostino, VP of Streaming Data Engineering
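The "transaction location versus phone location" check mentioned in the Capital One quote above can be sketched as a simple rule. The haversine distance is standard, but the 100 km threshold and the field choices are illustrative assumptions, not Capital One's actual model:

```python
import math

def distance_km(a, b):
    """Great-circle distance between two (lat, lon) points in km (haversine)."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(h))

def is_suspicious(txn_location, phone_location, max_km=100):
    """Flag a transaction whose location is far from the cardholder's phone.
    The 100 km threshold is an arbitrary illustration."""
    return distance_km(txn_location, phone_location) > max_km

london = (51.5074, -0.1278)
new_york = (40.7128, -74.0060)

is_suspicious(london, london)    # card and phone co-located -> not flagged
is_suspicious(new_york, london)  # thousands of km apart -> flagged
```

In a streaming deployment this rule would run per event, joining the card-transaction stream against the latest known device location; contextual history (past spend) would add further conditions.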
18. Event Streaming is Transforming Every Industry (without event streaming → with event streaming)
- Auto / Transport: batch-driven scheduling → real-time ETA
- Banking: nightly credit-card fraud checks → real-time credit card fraud prevention
- Retail: batch inventory updates → real-time inventory management
- Healthcare: batch claims processing → real-time claims processing
- Media: batch data pipelines (production supply chain) → real-time data pipeline
- Manufacturing: scheduled equipment maintenance → automated, predictive maintenance
- Defense (U.S. Defense Agencies): reactive cyber-security forensics → automated SIEM and anomaly detection
19. Questions / Discussion:
● Many companies we speak to have raised the frustration around how they would like to accelerate their digital transformation initiatives, but their legacy data architecture is holding them back. Do you feel that your efforts to move towards becoming a real-time digital enterprise are being hampered by your existing data architectures and the technical debt that's built up over the years?
● What are some of the technology frustrations that you've encountered? Or are the challenges primarily around finding the right talent with the right skill sets?
21. Event Streaming - a typical customer adoption journey (value grows with investment & time):
1. Early Interest
2. Identify a project
3. Projects
4. Platform - mission critical, but disparate LOBs
5. Central Nervous System - mission-critical, connected LOBs
The Level 4 jump
23. Central Nervous System for Data
(Diagram: data stores, logs, 3rd-party apps and custom apps/microservices - Hadoop, device logs, mainframes, data warehouse, Splunk, etc. - feed a central event streaming platform powering same-day transactioning (account open), fees, charges & billing, real-time customer 360, machine learning models, real-time data transformation and real-time fraud detection)
24. Questions / Discussion:
● We’ve heard examples of Walmart and Capital One: Walmart, who are challenging digital disruptors like Amazon.com, and CapitalOne, who are widely viewed as a digital leader in Financial Services. They’ve both embraced real-time event streaming at the heart of their digital nervous system.
● Do you agree that having a central nervous system for your organisation that allows you to respond to real-time events is critical to your own digital transformation and innovation initiatives?
26. Confluent Enables Your Event Streaming Success
● Confluent founders are the original creators of Kafka
● The Confluent team wrote 80% of Kafka commits and has over 1M hours of technical experience with Kafka
● Confluent Platform extends Apache Kafka to be a secure, enterprise-ready platform
● Confluent helps enterprises succeed with reduced effort, less risk and faster time to market
(Awards: Hall of Innovation CTO Innovation Award Winner 2019; Enterprise Technology Innovation Awards)
27. Confluent Products: Software and SaaS
- Self-managed software: Confluent Platform, the enterprise distribution of Apache Kafka, deployed in the datacenter or on VMs
- Fully-managed service: Confluent Cloud, Apache Kafka re-engineered for the cloud
Both are subscription products where price scales with usage.
28. The Cost-Efficient Way to Deploy Kafka
1. Confluent Platform (case study: top-20 bank): Confluent IP (Tiered Storage) reduced broker nodes by ~70%, an estimated saving of $5M per year vs. open-source Apache Kafka
2. Confluent Cloud (common scenario): ~50% reduction vs. total self-managed spend, across cloud infrastructure, operational (FTE), and support and other spend
Ask about our TCO Calculator...
29. Damien Wong <dwong@confluent.io>
Richard Koh <rkoh@confluent.io>
Lyndon Hedderly <lyndon@confluent.io>
Jo Balfour <jo@confluent.io>
cnfl.io/slack
cnfl.io/blog
Thank you!
Link to Project Metamorphosis
https://www.confluent.io/blog/cost-effective-kafka-for-lower-tco/
Get a free TCO assessment
https://www.confluent.io/project-metamorphosis/cost-effective/