This document discusses serverless stream processing for financial services applications. It introduces Confluent and Apache Kafka for building event streaming applications, describes serverless stream processing with ksqlDB and AWS Lambda, and provides best practices for using AWS Lambda as a stateless stream processor. Examples of financial use cases such as fraud detection, payments, and risk analytics are given, and the benefits of automatic scaling and paying only for consumption are highlighted.
Building Serverless EDA w_ AWS Lambda (1).pptx
1. Serverless Stream Processing for Financial Services
Ahmed Zamzam, AWS Partner Solutions Architect - Confluent
Veda Raman, Specialist Solutions Architect - AWS
Jason Demby, Senior Business Development Leader - AWS
2. Agenda
The rise of data streaming: the shift towards data streaming and Apache Kafka and the value it provides
Introducing Confluent: rearchitected Kafka, together with the features you need to rapidly deploy production use cases
Serverless stream processing: building event streaming applications using ksqlDB and AWS Lambda
Best practices: best practices when using AWS Lambda as a stateless stream processor
3. Confluent for Financial Services: Use Cases
Deliver differentiated customer experiences: increase digital engagement & improve the omni-channel experience
Secure your enterprise: detect and respond to fraud, threats and attacks in real time
Modernize your infrastructure: drive massive operational efficiency and developer velocity, and reduce costs
Automate business resiliency: mitigate risks in service offerings and market risk exposure
Drive regulatory compliance: stay compliant across global banking regulations such as open banking, FINRA, trade reporting, payments and more
Enable a sharing economy: decentralize asset ownership and increase market opportunity to deliver profitable services in the future
4. What is streaming data?
Typical characteristics: low latency, continuous, ordered and incremental, high volume
5. Why streaming data?
Data loses value quickly over time.
(Chart: the value of data to decision-making falls from real time through seconds, minutes, hours, days and months, moving from preventive/predictive to actionable, reactive and finally historical. Time-critical decisions sit at the real-time end; traditional "batch" business intelligence sits at the historical end, past the information half-life in decision-making.)
Source: Perishable insights, Mike Gualtieri, Forrester
6. Event Streaming is the Central Nervous System for today's enterprises. Apache Kafka® is the technology.
9. A new paradigm for financial services: continuously process data in real time
"We need to shift our thinking from everything at rest, to everything in motion."
(Diagram: real-time data such as trades, market data, positions and collateral feeds real-time stream processing for orders, execution, algorithms and pricing models, as well as firm-wide risk management)
10. “We look at events as running our business. Business people within our organization want to be able to react to events—and oftentimes it's a combination of events.”
VP of Streaming Data Engineering
11. Confluent is the Only Company Focused on Data in Motion
(Awards: Hall of Innovation CTO Innovation Award Winner 2019; Enterprise Technology Innovation Awards)
Vision: original creators of Kafka; data in motion pioneers
Category leadership: 80% of Kafka commits; 1M+ hours of Kafka technical experience; operate 5K+ clusters
Value: remove risk; deploy at scale; accelerate time-to-market
Product: extends Kafka to be secure and enterprise-ready; software or cloud-native service
12. Confluent: Central Nervous System for the Enterprise
(Diagram: a universal event pipeline collects data from device logs, data stores, logs, 3rd-party apps, SaaS apps, Amazon S3 and custom apps/microservices, and feeds real-time applications such as real-time customer 360, financial fraud detection, real-time risk analytics, real-time payments and machine learning models)
13. Confluent Enables Endless Financial Services Use Cases
Patterns: hybrid & multi-cloud; messaging & mainframe modernization; streaming analytics; event-driven microservices; CDC patterns from systems of record
Corporate & investment banking, capital markets: trade processing (equities, FICC, derivatives...); real-time payments and payments tracking; risk analytics; market, reference & security master data distribution; trading system integrations & automation
CTO - technology modernization (finance, risk, compliance, IT, cyber): credit & market risk (CCAR, BCBS 239, FRTB); OATS / CAT reporting; operational log hub; IT observability; cyber security | SIEM modernization
Retail banking, wealth & asset management: fraud detection; open banking; customer 360 (omni-channel banking, alerts & notifications); client advisor workstations; data and analytics for asset managers
14. The Confluent Product Advantage
Everywhere: be everywhere our customers want to be
Cloud-native: re-imagined Kafka experience for the cloud
Complete: enable developers to reliably & securely build next-gen apps faster
15. Confluent runs everywhere
Self-managed software: Confluent Platform, the enterprise distribution of Apache Kafka, in the datacenter or on VMs
Fully-managed software: Confluent Cloud, Apache Kafka re-engineered for the cloud, in the cloud
16. Everywhere: Cluster Linking, a global central nervous system
Federated streaming, hybrid and multi-cloud: data syndication and replication across and between clouds and on-premises, with self-service APIs, data governance, and visual tooling.
Reliable & real-time data streams between all customer sites, so you can run always-on streaming analytics on the data of the entire enterprise, despite regional or cloud provider outages.
17. Augment Messaging and Mainframe Systems and Migrate Over Time with our Support
1. Current middleware communication
2. Decouple from the consumer app
3. Make your events available for downstream systems
Customer | Payment
Jay      | $10
Sue      | $15
Grace    | $5
...      | ...
(Diagram: an application producing data publishes to traditional messaging and mainframe systems (1), is decoupled from the consumer application (2), and its events are made available to a downstream data store (3))
Copyright 2021, Confluent, Inc. All rights reserved. This document may not be reproduced in any manner without the express written permission of Confluent, Inc.
18. Accelerate modernization from on-prem to AWS
Connect: leverage 100+ Confluent pre-built connectors to continuously bring valuable data from existing on-prem services, including enterprise data warehouses, databases and mainframes
Bridge: hybrid cloud streaming with a consistent, event-driven architecture for modern apps
Modernize: increase agility in getting applications to market and reduce TCO by freeing up resources to focus on value-generating activities rather than managing servers
(Diagram: on-prem legacy EDW, mainframe and legacy databases feed Confluent via JDBC/CDC connectors and Replicator over AWS Direct Connect; in the AWS cloud, apps, data streams and ksqlDB fan out through S3, Redshift and Lambda sink connectors to Amazon S3, Amazon Athena, AWS Glue, SageMaker, Lake Formation, Amazon DynamoDB, Amazon Aurora and Amazon Redshift)
23. Three modalities of stream processing with Confluent, from maximum flexibility (Kafka clients) to maximum simplicity (ksqlDB)
Kafka clients:
ConsumerRecords<String, Integer> records = consumer.poll(Duration.ofMillis(100));
Map<String, Integer> counts = new HashMap<>();
for (ConsumerRecord<String, Integer> record : records) {
    String key = record.key();
    int c = counts.getOrDefault(key, 0);
    c += record.value();
    counts.put(key, c);
}
for (Map.Entry<String, Integer> entry : counts.entrySet()) {
    int stateCount;
    int attempts = 0;
    while (attempts++ < MAX_RETRIES) {
        try {
            stateCount = stateStore.getValue(entry.getKey());
            stateStore.setValue(entry.getKey(), entry.getValue() + stateCount);
            break;
        } catch (StateStoreException e) {
            RetryUtils.backoff(attempts);
        }
    }
}
Kafka Streams:
builder
    .stream("input-stream",
            Consumed.with(Serdes.String(), Serdes.String()))
    .groupBy((key, value) -> value)
    .count()
    .toStream()
    .to("counts", Produced.with(Serdes.String(), Serdes.Long()));
ksqlDB:
SELECT x, count(*) FROM stream GROUP BY x EMIT CHANGES;
24. ksqlDB at a Glance
What is it?
ksqlDB is an event streaming database for working with streams and tables of data.
It has all the key features of a modern streaming solution: aggregations, joins, windowing, event-time processing, dual query support, exactly-once semantics, out-of-order handling, and user-defined functions.
CREATE TABLE activePromotions AS
  SELECT rideId,
         qualifyPromotion(distanceToDst) AS promotion
  FROM locations
  GROUP BY rideId
  EMIT CHANGES;
How does it work?
It separates compute from storage, and scales elastically in a fault-tolerant manner.
It remains highly available during disruption, even in the face of failure to a quorum of its servers.
(Diagram: ksqlDB as the compute layer, Kafka as the storage layer)
26. Serverless app integration
Increase developer agility & speed of innovation
Serverless integration: connect existing apps & data stores in a repeatable way without having to manage Apache Kafka, Schema Registry (to maintain app compatibility), ksqlDB (to develop real-time apps with SQL syntax) or Connect (for effortless integrations with Lambda & data stores) yourself
AWS serverless platform: stop provisioning, maintaining or administering servers for backend components such as compute, databases and storage, so that you can focus on increasing agility and innovation for your developer teams
(Diagram: apps and microservices reach Confluent through REST Proxy & clients and source connectors; within Confluent, ksqlDB and Schema Registry process and govern the streams; the Lambda sink feeds AWS Lambda compute and the S3 sink feeds Amazon S3 storage, with data flowing on to data stores such as Amazon DynamoDB and Amazon Aurora and analytics services such as Amazon Athena and Amazon Redshift)
29. Benefits of serverless stream processing
No servers to manage
Only pay for stream consumption when processing messages
Automatically scales consumers
Write less code
30. Serverless processing vs. server-based processing
Serverless processing:
• Stream polling logic is separate from application logic
• Event-driven processing
• Scaling is handled automatically
• Poller: Lambda ESM or the Confluent Lambda Sink connector
Server-based processing:
• Stream polling logic is baked into your application code
• A consumer must be running to poll the Kafka cluster
• Scaling is done using consumer groups
• Poller: open-source APIs/libraries (Kafka Streams Java library, kafka-python)
35. Confluent Lambda Sink connector
• The sink connector polls Kafka partitions and calls your function
• Lambda can be invoked synchronously or asynchronously
• At-least-once semantics
• Provides a dead letter queue (DLQ) for any failed invocations
36. Confluent Lambda Sink connector – Scaling and Error Handling
• The sink connector scales up to a soft maximum of 10 connectors
• Error-handling semantics mirror sync and async Lambda invocations:
• Async: the Lambda service retries twice (three total attempts)
• Sync: by default, the connector fails and stops processing for that partition, with an option to log the failure to another Kafka topic and continue processing
• Option to batch records, configured through aws.lambda.batch.size
37. Lambda ESM consumer for Kafka
• Starts with one concurrent poller and customer function
• The Lambda service polls the Kafka partitions and invokes your Lambda function synchronously
(Diagram: poller → Lambda function instance)
38. Lambda ESM consumer for Kafka – Scaling and Batching
• Scaling: the Lambda service checks every 3 minutes whether scaling is needed; it starts with 1 poller and scales up to at most the number of partitions
• Batching: batch records based on a batch size or batch window
(Diagram: one poller fanning out to multiple Lambda function instances)
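To make this concrete, here is a minimal sketch of a handler for the batches the ESM poller delivers, assuming the aws-lambda-java-events library's KafkaEvent type; the class name and logging are illustrative, not taken from the deck.
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.KafkaEvent;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Invoked synchronously by the Lambda ESM poller with a batch of Kafka records.
public class TransactionBatchHandler implements RequestHandler<KafkaEvent, Void> {

    @Override
    public Void handleRequest(KafkaEvent event, Context context) {
        // Records arrive grouped by topic-partition; values are base64-encoded.
        event.getRecords().forEach((topicPartition, records) -> {
            for (KafkaEvent.KafkaEventRecord record : records) {
                String value = new String(
                        Base64.getDecoder().decode(record.getValue()),
                        StandardCharsets.UTF_8);
                context.getLogger().log("offset=" + record.getOffset() + " value=" + value);
                // ... stateless per-record business logic goes here ...
            }
        });
        return null;
    }
}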
40. Capture and log exceptions
• Ensure processing moves forward by catching exceptions and returning successfully
• Catch exceptions and log them to CloudWatch Logs, then return successfully from the Lambda function
(Diagram: a data producer sends 300 records; the Lambda service invokes function A with a batch size of 200; a failing record's exception is caught and logged to CloudWatch Logs, and the function returns successfully so the next batch can be processed)
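A sketch of that pattern, under the same assumptions as the handler above: each record is processed inside a try/catch, failures are logged (anything written through the Lambda logger lands in CloudWatch Logs), and the handler still returns normally so the poller can move on to the next batch. The process method is a placeholder for your business logic.
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.KafkaEvent;
import java.util.Base64;
import java.util.List;

public class TolerantBatchHandler implements RequestHandler<KafkaEvent, Void> {

    @Override
    public Void handleRequest(KafkaEvent event, Context context) {
        event.getRecords().values().stream()
                .flatMap(List::stream)
                .forEach(record -> {
                    try {
                        process(new String(Base64.getDecoder().decode(record.getValue())));
                    } catch (Exception e) {
                        // Log and continue; the log line ends up in CloudWatch Logs.
                        context.getLogger().log("failed offset=" + record.getOffset()
                                + " error=" + e.getMessage());
                    }
                });
        // Returning normally keeps processing moving forward.
        return null;
    }

    private void process(String payload) {
        // ... business logic; may throw on bad records ...
    }
}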
41. Optimize batch size / batch window to lower cost
• Lambda's maximum execution time is 15 minutes
• Adjust the batch size (max 10,000) to keep execution time optimal
• For sparse topics, consider a batch window to aggregate records over a time period
42. Kafka producer in Lambda (create once, use many)
• Create the producer in the constructor
• The producer is re-used across executions for the life of the Lambda instance
• Reduce strain on brokers by minimizing connections and producer clients
(Diagram: one shared producer per function instance, not a new producer per invocation)
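A sketch of the create-once pattern with the standard Apache Kafka Java client; the bootstrap-servers environment variable, topic name and handler signature are placeholders, not values from the deck.
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;
import java.util.Properties;

public class EnrichedEventPublisher implements RequestHandler<String, String> {

    // Created once per execution environment and re-used across invocations,
    // so brokers see one long-lived producer instead of one per invocation.
    private final KafkaProducer<String, String> producer;

    public EnrichedEventPublisher() {
        Properties props = new Properties();
        props.put("bootstrap.servers", System.getenv("BOOTSTRAP_SERVERS")); // placeholder
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());
        producer = new KafkaProducer<>(props);
    }

    @Override
    public String handleRequest(String payload, Context context) {
        producer.send(new ProducerRecord<>("enriched-events", payload)); // topic is illustrative
        producer.flush();
        return "ok";
    }
}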
43. Consider using ksqlDB for state
• A powerful combination of ksqlDB and Lambda provides a stateful -> stateless -> stateful pattern
45. Enrich transaction events for fraud scoring
Customer | Transaction | Avg 7 days | Num trans 10m
Jay      | $10         | $8.5       | 1
46. Enrich transaction events for fraud scoring
Customer | Transaction | Avg 7 days | Num trans 10m
Jay      | $10         | $8.5       | 1
(Diagram: ksqlDB, AWS Lambda and Amazon SageMaker in the enrichment and fraud-scoring flow)
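To illustrate the stateful step of the pattern from slide 43, here is a sketch of ksqlDB statements that could maintain the two aggregates in the table above; the stream and column names (transactions, customer, amount) are assumed, not taken from the deck.
CREATE TABLE txn_avg_7d AS
  SELECT customer, AVG(amount) AS avg_7_days
  FROM transactions
  WINDOW HOPPING (SIZE 7 DAYS, ADVANCE BY 1 HOUR)
  GROUP BY customer
  EMIT CHANGES;

CREATE TABLE txn_count_10m AS
  SELECT customer, COUNT(*) AS num_trans_10m
  FROM transactions
  WINDOW TUMBLING (SIZE 10 MINUTES)
  GROUP BY customer
  EMIT CHANGES;
A stateless Lambda function (invoked via the sink connector or ESM) can then take the enriched event and call the fraud-scoring model, as in the flow shown above.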
47. Next Steps
How did we do? Enter your feedback <CSAT link & QR Code>
Schedule an executive briefing: schedule a briefing for your business and technology leadership team
Join or schedule a workshop: join our FSI workshop on July 27th <link/QR code to registration>, or schedule a workshop for your team
Learn more about serverless: visit Serverlessland.com for self-guided workshops, videos, and resources