Jonathan Schabowsky, Sr. Architect at Solace, and Marc DiPasquale, Developer Advocate at Solace, present at MuleSoft Connect19 about making your enterprise event-driven.
Event Mesh Presentation at Gartner AADI Mumbai (Solace)
Sumeet Puri, Global Head of Systems Engineering at Solace, talks about the architecture layer that will make your business event-driven. Find out more about event mesh in this presentation from Gartner AADI Mumbai on March 11th, 2019.
Event Mesh: The Architecture Layer That Will Power Your Digital Transformation (Solace)
Event mesh is an architectural layer that routes events from producers to consumers in a flexible, reliable and governed manner, no matter where your apps are deployed. Crispin Clark, SVP Europe at Solace, and Harsh Jegadeesan, Head of Product Management Integration Platform at SAP, discuss in depth the evolution of the event mesh.
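The routing idea described above can be pictured in a few lines of code. This is a minimal, hand-rolled sketch of topic-based routing, not the Solace API; all names here are illustrative.

```python
# Minimal sketch of the event-mesh idea: a broker node routes events from
# producers to consumers based on topic subscriptions, so neither side
# needs to know where the other is deployed.
from collections import defaultdict

class EventMeshNode:
    def __init__(self):
        self._subscribers = defaultdict(list)  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, event):
        # Route the event to every consumer subscribed to this topic.
        for callback in self._subscribers[topic]:
            callback(event)

# Usage: an order service publishes; fulfilment and analytics both consume.
mesh = EventMeshNode()
received = []
mesh.subscribe("orders/created", lambda e: received.append(("fulfilment", e)))
mesh.subscribe("orders/created", lambda e: received.append(("analytics", e)))
mesh.publish("orders/created", {"order_id": 42})
```

A real event mesh adds the properties the abstract names: reliable delivery, governance, and routing across multiple broker nodes and deployment environments.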
Originally presented by Jonathan Schabowsky at Kafka Summit 2020.
** About PubSub+ Event Portal for Apache Kafka **
You know and love Apache Kafka, but have you ever tried to visualize Kafka topology, or figure out who owns what event stream in a Kafka cluster? Your event-driven architecture has evolved, and your system has grown to the point where you’re feeling a bit… out of control.
You need a tool to discover your Kafka event streams, represent it in a graphical view, and make it easy to share and reuse events. Basically, you need an API portal, but for asynchronous, event-driven applications.
That is why we have developed PubSub+ Event Portal. This event management toolset makes it easy for you to discover, visualize, catalog and share your Apache Kafka event streams, including those from Confluent and Amazon MSK.
To learn more, visit: https://solace.com/products/portal/kafka/
AsyncAPI Conference: From Design to Code with Marc DiPasquale (Solace)
TALK ABSTRACT: A long-standing gap has been filled! The AsyncAPI specification has risen as the industry standard for defining asynchronous APIs and is being incorporated into several products to enable a better tomorrow for Event-Driven Architecture (EDA). In this talk you will learn how AsyncAPI has been adopted by the Solace PubSub+ Event Portal to provide full lifecycle management of your EDA. I will focus on how you can use the tool to design your architecture and demonstrate how to use the AsyncAPI Code Generators to quickly develop a Spring Cloud Stream microservice.
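The design-to-code flow the abstract describes can be sketched as follows. The real tooling consumes a full AsyncAPI YAML document and generates a Spring Cloud Stream project; this stdlib-only analogue uses a hand-written dict as a stand-in for a parsed spec, and the channel and message names are invented for illustration.

```python
# Hedged sketch of the idea behind AsyncAPI code generators: turn each
# channel definition in a (simplified) spec into a handler-function stub.
spec = {
    "channels": {
        "ride/requested": {"operation": "subscribe", "message": "RideRequest"},
        "ride/accepted": {"operation": "publish", "message": "RideAccepted"},
    }
}

def generate_stubs(spec):
    """Map each channel to the source of a stub handler function."""
    stubs = {}
    for channel, details in spec["channels"].items():
        func_name = "on_" + channel.replace("/", "_")
        stubs[func_name] = f"def {func_name}(msg: '{details['message']}'): ..."
    return stubs

stubs = generate_stubs(spec)
```

The point of the real generators is the same: the spec is the single source of truth, and the messaging boilerplate is derived from it rather than written by hand.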
SPEAKER BIO: Marc DiPasquale is a Developer Advocate with extensive engineering experience in the public and private sector across multiple domains including healthcare, aviation and weather imagery processing. He has been using event-driven techniques and methodologies throughout his career and is excited by its elevation to the mainstream. Marc works with prospective and existing clients to enable the development of modern and reactive applications.
***
Follow Marc on Twitter: https://twitter.com/Mrc0113
And on LinkedIn: https://www.linkedin.com/in/marc-dipasquale/
Apache Kafka for Cybersecurity and SIEM / SOAR Modernization (Kai Wähner)
Data in Motion powered by the Apache Kafka ecosystem for Situational Awareness, Threat Detection, Forensics, Zero Trust Zones and Air-Gapped Environments.
Agenda:
1) Cybersecurity in 202X
2) Data in Motion as Cybersecurity Backbone
3) Situational Awareness
4) Threat Intelligence
5) Forensics
6) Air-Gapped and Zero Trust Environments
7) SIEM / SOAR Modernization
More details in the "Kafka for Cybersecurity" blog series:
https://www.kai-waehner.de/blog/2021/07/02/kafka-cybersecurity-siem-soar-part-1-of-6-data-in-motion-as-backbone/
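The "threat detection" item in the agenda above typically means evaluating rules over an event stream in real time. A minimal sketch, assuming an invented event shape of `(timestamp, source_ip)` and an illustrative threshold: flag any source exceeding a login-failure count within a sliding time window.

```python
# Toy stream-based threat detection: flag an IP with >= THRESHOLD failed
# logins inside a sliding WINDOW_SECONDS window. Numbers are illustrative.
from collections import defaultdict, deque

WINDOW_SECONDS = 60
THRESHOLD = 3

def detect_bruteforce(events):
    """events: iterable of (timestamp, source_ip); returns flagged IPs."""
    recent = defaultdict(deque)   # per-IP timestamps inside the window
    flagged = []
    for ts, ip in events:
        window = recent[ip]
        window.append(ts)
        # Evict timestamps that have aged out of the window.
        while window and ts - window[0] > WINDOW_SECONDS:
            window.popleft()
        if len(window) >= THRESHOLD and ip not in flagged:
            flagged.append(ip)
    return flagged

alerts = detect_bruteforce([(0, "10.0.0.5"), (10, "10.0.0.5"),
                            (20, "10.0.0.9"), (30, "10.0.0.5")])
```

In a Kafka-based SIEM, the same evaluation would run continuously over a topic of security events (e.g. with Kafka Streams) instead of over a finite list.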
Sharing Digital Transformation Experiences using the Event Mesh - Real Time, ... (Phil Scanlon)
According to Gartner, by 2020 70% of new business ecosystems will require support for event processing. This talk covers the evolution from a Service-Oriented Architecture to the event-driven paradigm, with real-world examples using an Advanced Event Broker:
Event Driven data flows and APIs
Responsive customer experience leveraging the ‘Waiter Pattern’
Leveraging events for AI and Machine Learning
Event sourcing, Event Governance and other useful concepts and tools
Case studies - digital insurance, digital QR code payments, IoT
We introduce LEAD, a formal CEP approach that was presented at DEBS 2019. Using a running example, we explain how the pattern-matching job is generated using an Aged Colored Petri Net as the logical execution plan. The implementation currently runs on Flink Streaming.
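To make "pattern matching over an event stream" concrete, here is a toy complex-event-processing matcher for the sequence pattern "A followed by B". The Petri-net execution plan described above is far richer; this sketch (with an invented event shape) only illustrates the concept.

```python
# Toy CEP sequence matcher: emit a match for every `first` event that is
# later followed by a `then` event in the stream.
def match_sequence(stream, first, then):
    """Return (a, b) pairs where an event of type `then` follows `first`."""
    pending = []   # open partial matches waiting for the `then` event
    matches = []
    for event in stream:
        if event["type"] == then:
            matches.extend((a, event) for a in pending)
            pending = []
        if event["type"] == first:
            pending.append(event)
    return matches

stream = [{"type": "A", "id": 1}, {"type": "C", "id": 2},
          {"type": "A", "id": 3}, {"type": "B", "id": 4}]
pairs = match_sequence(stream, first="A", then="B")
```

Real CEP engines add time windows, negation, and aggregation on top of this kind of partial-match bookkeeping.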
Stream events across your enterprise with the Solace Platform (Solace)
In this presentation, Crispin Clarke gives an overview of the technology that enables enterprises to become event-driven, explains why this is imperative in today’s business environment, and presents the Solace PubSub+ Platform with some customer examples. He also highlights a recent addition to the Solace Platform, Event Portal, which accelerates enterprises on their Digital Transformation journey.
Financial Event Sourcing at Enterprise Scale (Confluent)
For years, Rabobank has been actively investing in becoming a real-time, event-driven bank. If you are familiar with banking processes, you will understand that this is not simple. Many banking processes are implemented as batch jobs on not-so-commodity hardware, meaning that any migration effort is immense.
- Find out how Rabobank redesigned Rabo Alerts while continuing to provide a robust and stable alert system for its existing user base
- Learn how the project team managed to achieve a balance between the need to decentralise activity while not losing control
- Understand how Rabobank re-invented a reliable service to meet modern customer expectations
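Event sourcing, the pattern named in the title above, can be shown in miniature: current state is never stored directly but derived by replaying an immutable event log. The event names below are invented for illustration, not Rabobank's actual schema.

```python
# Event sourcing in miniature: an account balance is rebuilt by replaying
# the append-only event history rather than read from a mutable record.
def apply(balance, event):
    kind, amount = event
    if kind == "deposited":
        return balance + amount
    if kind == "withdrawn":
        return balance - amount
    return balance   # unknown events are ignored

def replay(events, initial=0):
    """Derive current state purely from the event history."""
    balance = initial
    for event in events:
        balance = apply(balance, event)
    return balance

log = [("deposited", 100), ("withdrawn", 30), ("deposited", 5)]
balance = replay(log)
```

At enterprise scale the log lives in a durable, partitioned store such as Kafka, and replay is what lets new services (like alerts) be bootstrapped from history.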
DataOps on Streaming Data: From Kafka to InfluxDB via Kubernetes Native Flows... (InfluxData)
In this session, we are going to create a Lenses DataOps hub for IoT data with Apache Kafka and InfluxDB flows over Kubernetes. We will demonstrate how to create streaming flows and securely explore and monitor real-time data. We will use Kubernetes to spin up scalable flows and show how to provision such flows with secret management and end-to-end monitoring capabilities.
Apache Kafka in the Public Sector (Government, National Security, Citizen Ser... (Kai Wähner)
The Rise of Data in Motion in the Public Sector powered by event streaming with Apache Kafka.
Citizen Services:
- Health services, e.g. hospital modernization, track & trace, Covid distance control
- Public administration: reduce bureaucracy, data democratization across government departments
- eGovernment: efficient and digital citizen engagement, e.g. personal ID application process
Smart City:
- Smart driving, parking, buildings, environment
- Waste management
- Open exchange, e.g. mobility services (1st and 3rd party)
Energy:
- Smart grid and utilities infrastructure (energy distribution, smart home, smart meters, smart water, etc.)
National Security:
- Law enforcement, surveillance, police/interior security data exchange
- Defense and military (border control, intelligent soldier)
- Cybersecurity for situational awareness and threat intelligence
Understanding the TCO and ROI of Apache Kafka & Confluent (Confluent)
For a product or service to be cost effective, it must be considered to be good value, where the benefits are worth at least what is paid for them. But how do we measure this, to prove the case? Given that value can be intangible, it can be hard to quantify and may have little relationship to cost. Added to this, the open source nature of Apache Kafka means that many companies skip the requirement to build a business case for it, until it has become mission critical and demands financial and human resources.
In this presentation, Lyndon Hedderly, Team Lead of Business Value Consulting at Confluent, will cover how Confluent works with customers to measure the business value of data streaming.
Using Kafka in Your Organization with Real-Time User Insights for a Customer ... (Confluent)
(Chris Maier + Steven Royster, West Monroe Partners) Kafka Summit SF 2018
The value of real-time data is growing as an increasing number of companies look to provide a comprehensive experience for their customers. Utilizing Kafka in key facets of your organization will yield greater customer satisfaction and promote a better understanding of user interactions. As data streaming is becoming more prevalent in a wide variety of industries, companies are seeking to modernize their tech stacks by employing the extensible, scalable infrastructure afforded by Kafka.
Over the past few months, we have successfully developed a containerized Kafka implementation at a major healthcare provider. In addition, we created producers to publish messages to the Kafka cluster and consumers to receive them on the other end. By capturing a plethora of data around customer activity, we created opportunities for the business to act upon real-time metrics in order to provide an improved customer experience.
In this talk, we will cover the user-related data sources we connected to Kafka, the reasons we chose them, and how the insights gained from each source can be leveraged in your business. You will walk out understanding how capturing a wide variety of customer activity data can create opportunities for the business to act on real-time metrics in order to provide an improved customer experience.
Understanding Event Streaming in Under 10 Minutes (Confluent)
To increase business velocity, improve competitiveness through new products and services, and react quickly to sudden market changes, data and event streams must be shared, processed, and analyzed in real time. Apache Kafka has established itself as the industry standard for event streaming. Whether Connected Car, Industry 4.0, or Customer 360, all of these forward-looking topics require fast communication, efficient networking, and real-time processing of enormous volumes of data.
Event-Driven Microservices with Apache Kafka, Kafka Streams and KSQL (Kai Wähner)
Building Event-Driven Microservices with Stateful Streams - Apache Kafka, Kafka Streams, KSQL, and more…
Event-driven services come in many shapes and sizes, from tiny event-driven functions that dip into an event stream, right through to heavy, stateful services that can facilitate request-response. This talk makes the case for building this style of system using stream processing tools, defining a microservices architecture, and leveraging the Apache Kafka ecosystem, including Kafka Streams and KSQL. We also walk through a number of patterns for how we actually put these things together to enable independent teams and autonomous development of microservices.
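The "heavy, stateful service" end of that spectrum can be pictured as a keyed aggregation over an event stream, the kind of computation Kafka Streams or KSQL expresses declaratively. This is a stdlib-only sketch of the concept, not the Kafka Streams API; the page names are made up.

```python
# A stateful stream processor in miniature: maintain a running count per
# key, analogous to a KTable that backs an event-driven service.
from collections import defaultdict

class StatefulCounter:
    def __init__(self):
        self.state = defaultdict(int)   # key -> running aggregate

    def process(self, key):
        """Consume one event, update state, emit the new aggregate."""
        self.state[key] += 1
        return self.state[key]

counter = StatefulCounter()
for page in ["home", "pricing", "home", "home"]:
    counter.process(page)
```

The design point the talk argues for: because the state is derived from the stream, the service can be rebuilt or scaled by replaying the stream, without a shared external database.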
Kudos to my colleagues Ben and Jay who created many of the slides.
Mainframe Integration, Offloading and Replacement with Apache Kafka (Kai Wähner)
Video recording of this presentation:
https://youtu.be/upWzamacOVQ
Blog post with more details:
https://www.kai-waehner.de/blog/2020/04/24/mainframe-offloading-replacement-apache-kafka-connect-ibm-db2-mq-cdc-cobol/
Mainframes are still hard at work, processing over 70 percent of the world’s most essential computing transactions every day. Very high costs, monolithic architectures, and a shortage of experts are the key challenges for mainframe applications. Time to get more innovative, even with the mainframe!
Mainframe offloading with Apache Kafka and its ecosystem can be used to keep a more modern data store in real-time sync with the mainframe, while persisting the event data on the bus to enable microservices and deliver the data to other systems such as data warehouses and search indexes.
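The sync mechanism described above is change data capture: a feed of record-level changes from the mainframe is applied in order to a modern replica store, so reads can move off the mainframe. A minimal sketch, with an invented change-event shape:

```python
# Apply a CDC (change data capture) feed to a replica key-value store.
# The {"op", "key", "value"} event shape here is illustrative only.
def apply_change(replica, change):
    op, key = change["op"], change["key"]
    if op in ("insert", "update"):
        replica[key] = change["value"]
    elif op == "delete":
        replica.pop(key, None)
    return replica

replica = {}
cdc_feed = [
    {"op": "insert", "key": "acct:1", "value": {"balance": 100}},
    {"op": "update", "key": "acct:1", "value": {"balance": 70}},
    {"op": "insert", "key": "acct:2", "value": {"balance": 10}},
    {"op": "delete", "key": "acct:2"},
]
for change in cdc_feed:
    apply_change(replica, change)
```

In practice the feed comes from a CDC tool or MQ via Kafka Connect, and ordering per key (Kafka partitioning) is what makes replaying the feed safe.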
But the final goal and ultimate vision is to replace the mainframe with new applications using modern and less costly technologies. Stand up to the dinosaur, but keep in mind that legacy migration is a journey! Kai will guide you to the next step of your company’s evolution!
You will learn:
- how to not only reduce operational expenses but provide a path for architecture modernization, agility and eventually mainframe replacement
- what steps some of Confluent’s customers already took, leveraging technologies like Change Data Capture (CDC) or MQ for mainframe offloading
- how an event streaming platform enables cost reduction, architecture modernization, and a combination of a mainframe with new technologies
Cloud Migration for Financial Services - Toronto - October 2016 (Amazon Web Services)
Presented by Cloud Technology Partners. Robert Christiansen presents best practices for cloud adoption, taking us on the journey from a single application on the cloud, through hybrid cloud, culminating in a cloud-first approach.
Apache Kafka and Blockchain - Comparison and a Kafka-native Implementation (Kai Wähner)
Apache Kafka is an open-source event streaming platform used to complement or replace existing middleware, integrate applications, and build microservice architectures. Used at almost every large company today, it's understood, battle-tested, highly scalable, and reliable.
Blockchain is a different story. Being related to cryptocurrencies like Bitcoin, it's often in the news. But what is its value for software architecture? And how does it relate to an integration architecture and an event streaming platform?
This session explores blockchain use cases and different alternatives such as Hyperledger, Ethereum, and Kafka-native blockchain implementation. We discuss the value blockchain brings for different architectures, and how it can be integrated with the Kafka ecosystem to build a highly scalable and reliable event streaming infrastructure.
This talk discusses the concepts, use cases, and architectures behind Event Streaming, Apache Kafka, Distributed Ledger (DLT), and Blockchain. A comparison of different technologies such as Confluent, AIBlockchain, Hyperledger, Ethereum, Ripple, IOTA, and Libra explores when to use Kafka, a Kafka-native blockchain, a dedicated blockchain, or Kafka in conjunction with another blockchain.
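The "Kafka-native blockchain" option compared above can be pictured as an append-only log whose entries are hash-chained for tamper evidence. This is a conceptual sketch only, not any vendor's implementation.

```python
# A hash-chained append-only log: each block commits to the previous
# block's hash, so altering any entry invalidates everything after it.
import hashlib
import json

def append_block(chain, payload):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = json.dumps({"payload": payload, "prev": prev_hash}, sort_keys=True)
    block = {"payload": payload, "prev": prev_hash,
             "hash": hashlib.sha256(body.encode()).hexdigest()}
    chain.append(block)
    return chain

def verify(chain):
    """Recompute every hash; any tampering breaks the chain."""
    prev_hash = "0" * 64
    for block in chain:
        body = json.dumps({"payload": block["payload"], "prev": prev_hash},
                          sort_keys=True)
        if block["prev"] != prev_hash or \
           block["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
        prev_hash = block["hash"]
    return True

chain = []
append_block(chain, {"event": "transfer", "amount": 5})
append_block(chain, {"event": "transfer", "amount": 7})
```

What a dedicated blockchain adds beyond this, and what Kafka's ordered, immutable log already provides, is exactly the trade-off the talk compares.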
The Fourth Industrial Revolution (also known as Industry 4.0) is the ongoing automation of traditional manufacturing and industrial practices, using modern smart technology.
Event streaming with Apache Kafka plays a key role in processing massive volumes of data in real time in a reliable, scalable, and flexible way, integrating with various legacy and modern data sources and sinks.
In this presentation, I want to give you an overview of existing use cases for event streaming technology in a connected world across supply chains, industries and customer experiences that come along with these interdisciplinary data intersections:
• The Automotive Industry (and it’s not only Connected Cars)
• Mobility Services across verticals (transportation, logistics, travel industry, retailing, …)
• Smart Cities (including citizen health services, communication infrastructure, …)
The characteristics and requirements of these industries and sectors are not new. They still require data integration, data correlation, and real decoupling, to name a few, but they are now facing massively increased volumes of data.
Real-time messaging solutions have existed for many years. Hundreds of platforms exist for data integration (including ETL and ESB tooling and specific IIoT platforms). Proprietary monoliths have monitored plants, telco networks, and other infrastructure in real time for decades. But now Kafka combines all of the above characteristics in an open, scalable, and flexible infrastructure to operate mission-critical workloads at scale in real time, and it is taking over the world of connecting data.
Financial technology, or FinTech, has been using technology to disrupt the financial services, payments, and banking industries. Fintech investments are on the rise: Q4 2014 was the busiest quarter of the last five years in fintech, with $3.1 billion invested into the sector.
Chicago is home to financial technology powerhouses and startups alike. On April 8th, we'll hear from Chicago companies using cloud technology to disrupt, react and innovate.
CloudCamp features short lightning talks, an "unpanel" with audience participation and questions, and small breakout clusters around beers and pizza.
Theme: "FinTech"
These full slides from the April 8th event include:
"Selling to the Sell-Side" - Sam Perl, Founding Partner at Fundology
"Cloud Culture Shock in Financial Services" - Susan Emery, Director of Product Management at Viewpointe @semery_vp
"Put away the credit card, a look at alternative payment methods" - John Downey, Security Lead at Braintree @jtdowney
"Micro-services and how they apply to FinTech" - Eero Pikat, President at BarChart @eeropikat
"What Financial Cloud Should Be" - Patrick Kerpan, CEO at Cohesive Networks @pjktech
Apache Kafka for Cybersecurity and SIEM / SOAR ModernizationKai Wähner
Data in Motion powered by the Apache Kafka ecosystem for Situational Awareness, Threat Detection, Forensics, Zero Trust Zones and Air-Gapped Environments.
Agenda:
1) Cybersecurity in 202X
2) Data in Motion as Cybersecurity Backbone
3) Situational Awareness
4) Threat Intelligence
5) Forensics
6) Air-Gapped and Zero Trust Environments
7) SIEM / SOAR Modernization
More details in the "Kafka for Cybersecurity" blog series:
https://www.kai-waehner.de/blog/2021/07/02/kafka-cybersecurity-siem-soar-part-1-of-6-data-in-motion-as-backbone/
Sharing Digital Transformation Experiences using the Event Mesh - Real Time, ...Phil Scanlon
By 2020, 70% of the new business ecosystems will require support for event processing, as per Gartner. This talk will cover the evolution to the Event Driven Paradigm, from a Service Oriented Architecture with real world examples, using an Advanced Event Broker
Event Driven data flows and APIs
Responsive customer experience leveraging the ‘Waiter Pattern’
Leveraging events for AI and Machine Learning
Event sourcing, Event Governance and other useful concepts and tools
Case studies - digital insurance, digital QR code payments, IOT
We introduce LEAD a formal CEP that was presented at DEBS 2019. Using a running example, we explain how the pattern matching job is generated using Aged Colored Petri Net as logical execution plan. The implementation is currently under Flink Streaming.
Stream events across your enterprise with the Solace PlatformSolace
In this presentation, Crispin Clarke will overview the technology to enable enterprises to become event-driven, why this is imperative in today’s business environment, and present an overview of the Solace PubSub+ Platform with some customer examples. I will also highlight the recent addition to the Solace Platform, Event Portal, which accelerates enterprises on their Digital Transformation journey.
Financial Event Sourcing at Enterprise Scaleconfluent
For years, Rabobank has been actively investing in becoming a real-time, event-driven bank. If you are familiar with banking processes, you will understand that this is not simple. Many banking processes are implemented as batch jobs on not-so-commodity hardware, meaning that any migration effort is immense.
*Find out how Rabobank redesigned Rabo Alerts while continuing to provide a robust and stable alert system for its existing user base
*Learn how the project team managed to achieve a balance between the need to decentralise activity while not losing control
*Understand how Rabobank re-invented a reliable service to meet modern customer expectations
DataOps on Streaming Data: From Kafka to InfluxDB via Kubernetes Native Flows...InfluxData
In this session, we are going to create a Lenses DataOps hub for IoT data with Apache Kafka and InfluxDB flows over Kubernetes. We will demonstrate how to create streaming flows and securely explore and monitor real-time data. We will use Kubernetes to spin up scalable flows and go through how we can simply provision such flows with secret management and monitoring end to end out capabilities.
Apache Kafka in the Public Sector (Government, National Security, Citizen Ser...Kai Wähner
The Rise of Data in Motion in the Public Sector powered by event streaming with Apache Kafka.
Citizen Services:
- Health services, e.g. hospital modernization, track & trace - Covid distance control
- Public administration - reduce bureaucracy, data democratization across government departments
- eGovernment - Efficient and digital citizen engagement, e.g. personal ID application process
Smart City
- Smart driving, parking, buildings, environment
Waste management
- Open exchange – e.g. mobility services (1st and 3rd party)
Energy
- Smart grid and utilities infrastructure (energy distribution, smart home, smart meters, smart water, etc.)
- National Security
Law enforcement, surveillance, police/interior security data exchange
- Defense and military (border control, intelligent solider)
Cybersecurity for situational awareness and threat intelligence
Understanding the TCO and ROI of Apache Kafka & Confluentconfluent
For a product or service to be cost effective, it must be considered to be good value, where the benefits are worth at least what is paid for them. But how do we measure this, to prove the case? Given that value can be intangible, it can be hard to quantify and may have little relationship to cost. Added to this, the open source nature of Apache Kafka means that many companies skip the requirement to build a business case for it, until it has become mission critical and demands financial and human resources.
In this presentation, Lyndon Hedderly, Team Lead of Business Value Consulting at Confluent, will cover how Confluent works with customers to measure the business value of data streaming
Using Kafka in Your Organization with Real-Time User Insights for a Customer ...confluent
(Chris Maier + Steven Royster, West Monroe Partners) Kafka Summit SF 2018
The value of real-time data is growing as an increasing number of companies look to provide a comprehensive experience for their customers. Utilizing Kafka in key facets of your organization will yield greater customer satisfaction and promote a better understanding of user interactions. As data streaming is becoming more prevalent in a wide variety of industries, companies are seeking to modernize their tech stacks by employing the extensible, scalable infrastructure afforded by Kafka.
Over the past few months, we have successfully developed a containerized Kafka implementation at a major healthcare provider. In addition, we created producers to publish messages to the Kafka cluster and consumers to receive them on the other end. By capturing a plethora of data around customer activity, we created opportunities for the business to act upon real-time metrics in order to provide an improved customer experience.
In this talk, we will cover the user-related data sources we connected to Kafka, the reasons we chose them, and how the insights gained from each source can be leveraged in your business. You will walk out understanding how capturing a wide variety of customer activity data can create opportunities for the business to act on real-time metrics in order to provide an improved customer experience.
Event-Streaming verstehen in unter 10 Minconfluent
Um die unternehmerische Geschwindigkeit zu erhöhen, die Wettbewerbsfähigkeit durch neue Produkte und Services zu steigern und schnell auf plötzlich ändernde Markteinflüsse reagieren zu können, müssen Daten und Ereignisströme in Echtzeit geteilt, verarbeitet und ausgewertet werden können. Apache Kafka hat sich hier als Industrie-Standard für Event-Streaming etabliert. Ob Connected Car, Industrie 4.0 oder Customer 360 – alle diese zukunftsorientierten Themen benötigen schnelle Kommunikation, effiziente Vernetzung und eine Verarbeitung von enormen Datenmengen in Echtzeit.
Event-Driven Microservices with Apache Kafka, Kafka Streams and KSQLKai Wähner
Building Event-Driven Microservices with Stateful Streams - Apache Kafka, Kafka Streams, KSQL, and more…
Event Driven Services come in many shapes and sizes from tiny event-driven functions that dip into an event stream, right through to heavy, stateful services which can facilitate request response. This talk makes the case for building this style of system using Stream Processing tools, defining a microservices architecture and leveraging Apache Kafka ecosystem including Kafka Streams and KSQL. We also walk through a number of patterns for how we actually put these things together to enable independent teams and autonomous development of microservices.
Kudos to my colleagues Ben and Jay who created many of the slides.
Mainframe Integration, Offloading and Replacement with Apache KafkaKai Wähner
Video recording of this presentation:
https://youtu.be/upWzamacOVQ
Blog post with more details:
https://www.kai-waehner.de/blog/2020/04/24/mainframe-offloading-replacement-apache-kafka-connect-ibm-db2-mq-cdc-cobol/
Mainframes are still hard at work, processing over 70 percent of the world’s most essential computing transactions every day. Very high cost, monolithic architectures, and missing experts are the key challenges for mainframe applications. Time to get more innovative, even with the mainframe!
Mainframe offloading with Apache Kafka and its ecosystem can be used to keep a more modern data store in real-time sync with the mainframe. At the same time, it is persisting the event data on the bus to enable microservices, and deliver the data to other systems such as data warehouses and search indexes.
But the final goal and ultimate vision are to replace the mainframe by new applications using modern and less costly technologies. Stand up to the dinosaur, but keep in mind that legacy migration is a journey! Kai will guide you to the next step of your company’s evolution!
You will learn:
- how to not only reduce operational expenses but provide a path for architecture modernization, agility and eventually mainframe replacement
- what steps some of Confluent’s customers already took, leveraging technologies like Change Data Capture (CDC) or MQ for mainframe offloading
- how an event streaming platform enables cost reduction, architecture modernization, and a combination of a mainframe with new technologies
Cloud Migration for Financial Services - Toronto - October 2016Amazon Web Services
Presented by Cloud Technology Partners. Robert Christiansen presents us best practices for cloud adoption, taking us on the journey from a single application on the cloud, through hybrid cloud, culminating with a Cloud First Approach.
Apache Kafka and Blockchain - Comparison and a Kafka-native ImplementationKai Wähner
Apache Kafka is an open-source event streaming platform used to complement or replace existing middleware, integrate applications, and build microservice architectures. Used at almost every large company today, it's understood, battled-tested, highly scalable, and reliable.
Blockchain is a different story. Being related to cryptocurrencies like Bitcoin, it's often in the news. But what is the value of software architecture? And how is it related to an integration architecture and event streaming platform?
This session explores blockchain use cases and different alternatives such as Hyperledger, Ethereum, and Kafka-native blockchain implementation. We discuss the value blockchain brings for different architectures, and how it can be integrated with the Kafka ecosystem to build a highly scalable and reliable event streaming infrastructure.
This talk discusses the concepts, use cases, and architectures behind Event Streaming, Apache Kafka, Distributed Ledger (DLT), and Blockchain. A comparison of different technologies such as Confluent, AIBlockchain, Hyperledger, Ethereum, Ripple, IOTA, and Libra explores when to use Kafka, a Kafka-native blockchain, a dedicated blockchain, or Kafka in conjunction with another blockchain.
The Fourth Industrial Revolution (also known as Industry 4.0) is the ongoing automation of traditional manufacturing and industrial practices, using modern smart technology.
Event Streaming with Apache Kafka plays a massive role in processing massive volumes of data in real-time in a reliable, scalable, and flexible way integrating with various legacy and modern data sources and sinks.
In this presentation, I want to give you an overview of existing use cases for event streaming technology in a connected world across supply chains, industries and customer experiences that come along with these interdisciplinary data intersections:
• The Automotive Industry (and it’s not only Connected Cars)
• Mobility Services across verticals (transportation, logistics, travel industry, retailing, …)
• Smart Cities (including citizen health services, communication infrastructure, …)
All these industries and sectors do not have new characteristics and requirements. They require data integration, data correlation or real decoupling, just to name a few, but are now facing massively increased volumes of data.
Real-time messaging solutions have existed for many years. Hundreds of platforms exist for data integration (including ETL and ESB tooling or specific IIoT platforms). Proprietary monoliths monitor plants, telco networks, and other infrastructures for decades in real-time. But now, Kafka combines all the above characteristics in an open, scalable, and flexible infrastructure to operate mission-critical workloads at scale in real-time. And is taking over the world of connecting data.
Financial technology, or FinTech, has been using technology to disrupt the financial services, payments, and banking industries. Fintech investment is on the rise: Q4 2014 was the busiest quarter of the last five years, with $3.1 billion invested in the sector.
Chicago is home to financial technology powerhouses and startups alike. On April 8th, we'll hear from Chicago companies using cloud technology to disrupt, react and innovate.
CloudCamp features short lightning talks, an "unpanel" with audience participation and questions, and small breakout clusters around beers and pizza.
Theme: "FinTech"
These full slides from the April 8th event include:
"Selling to the Sell-Side" - Sam Perl, Founding Partner at Fundology
"Cloud Culture Shock in Financial Services" - Susan Emery, Director of Product Management at Viewpointe @semery_vp
"Put away the credit card, a look at alternative payment methods" - John Downey, Security Lead at Braintree @jtdowney
"Micro-services and how they apply to FinTech" - Eero Pikat, President at BarChart @eeropikat
"What Financial Cloud Should Be" - Patrick Kerpan, CEO at Cohesive Networks @pjktech
Webinar #5: Mobile insights and trends ft. Google Become A/S
Compell's fifth webinar in the series is titled "Mobile insights and trends", for which we received help from Tobias Jensen, Agency Development Manager at Google.
Flexing Sugar Platform: Session 7: Turning CRM Inside Out: Using Sugar as a H...SugarCRM
CRM used to be just for employees, but revolutionary student housing company, Aspen Heights, has changed that assumption forever. With the help of Sugar Gold Partner BrainSell Technologies, Aspen Heights has implemented Sugar as a hub for their business. Residents can log into a customer portal and pay their rent, submit a support case and even see if their credit application has cleared.
4 presentations in a single Serverless Meetup! Our "Birthday Special" consisted of:
1. “Show and tell of Cloudinary BaaS recipes and DAM for a dynamic world” by Eric Courville, Sr. Director – Business Development, Americas at Cloudinary
2. “AliPay/WeChatPay @ HBC/SAKS” by Danny Elisha, Sr. Systems Architect at HBC
3. “How to leverage AWS – Unofficial Guide for Startups” by Alan Williamson, Director – Solutions Architecture at Onica.com
4. “Spreading the JAM(stack) to build Static serverless websites” by Bhavana Srinivas, Solutions Engineer at Netlify.
Silicon Halton Meetup 79 - Chart of AccountsSilicon Halton
Presentation by SB Partners LLP (www.sbpartners.ca) on the importance for Tech Solopreneurs and Entrepreneurs to setup a proper Chart of Accounts. We learned:
- Why understanding what a chart of accounts is – is a big deal
- The right chart of accounts for technology companies
- How setting up your accounting systems properly will pay off in the long run
- Some of the sexy SaaS accounting software that's out there for smaller tech companies
- When to stop doing it by yourself and hire outside help
www.siliconhalton.com
twitter.com/siliconhalton
API’s and Identity: Enabling Optum to become the HealthCare cloudCA Technologies
Brief on how Optum is transforming itself to become the HealthCare Services Cloud and how APIs and Identity are the enablers to make this possible.
For more information, please visit http://cainc.to/Nv2VOe
How do I know when my organization has outgrown spreadsheets for managing cash and risk? What are best practices for improving visibility while standardizing risk controls? What are other companies doing to implement a global risk strategy and minimize the impact of financial, operational, and liquidity risk?
Kyriba's VP of Strategy, Bob Stark, addresses these questions and discusses how CFOs and finance teams face numerous risks and challenges every day. Key topics in this session include understanding FX risk, optimizing global cash, protecting against payments fraud, and tips & tricks to digitize your processes. Originally presented at SuiteWorld 2019 in Las Vegas.
Technology Primer: How to Achieve a Customer-Centric View in an Omni-Channel ...CA Technologies
Today, our customers are interacting with our business and services through mobile, web and increasingly via wearables. It is critical to understand and improve the user experience we are providing across all of these app channels. Are our customers getting good value from our apps? How are they interacting with us? How can we deliver more value via the channels and devices most important to our users? To stay competitive, you need to deliver great experiences to your users, quickly, securely and efficiently across all channels.
Discover how CA Technologies is providing a customer-centric view on your customer experience, empowering you to understand your users, quickly identify issues and deliver more business value via apps through the omni-channel.
For more information, please visit http://cainc.to/Nv2VOe
How kiwi.com transitioned to a remote-only recruiting strategyBeamery
Recruiting leaders have had to pack months, sometimes years of digital transformation plans into a few weeks to adapt to the Covid health crisis.
In this webinar, we invited Pavlína Schuster, Recruiting Operations Analyst at Kiwi.com, to describe how she worked with her team to accelerate the digital transformation agenda that her organization had started before the pandemic. She discussed remote events, at-home recruiting processes, and new ways to measure success for these adapted recruiting processes.
Senior finance executives know in their gut that cloud investments will be part of the future. Here are seven questions frequently asked by our CFO clients, some applying to cloud investments anywhere in the enterprise, some dealing specifically with Finance as a potential cloud user. These questions are relevant for any CFO in business today. https://deloi.tt/2FxRq7n
We are not the largest enterprise application development company out there. What we do bring to the table is a guerrilla team of laser focused Software Developers & experts capable of executing robust custom enterprise applications.
Deep Impact: Explore the Wide- Reaching Impact of a CyberattackPriyanka Aash
The impacts of a cyberattack are long-lasting and extend well beyond technology. In this cyber-wargame, participants will test their assumptions and incident response know-how against a cyberattack scenario with complex business impacts that unfolds over a simulated year.
(Source: RSA Conference USA 2017)
Similar to API Management, Meet Event Management
High-Velocity, Real-Time Connected Industry – From Edge to CloudSolace
Creating a global, high-velocity, real-time industrial connectivity fabric that combines local, real-time edge analytics with centralised, cloud-based management is not a trivial challenge. It gets even more complicated when we try to integrate local, cloud, and central SAP systems – event-enabled and in real time. Here we present a combined approach based on the Solace Hybrid IoT Event Mesh, the VMware Pulse IoT Center, and Altair's SmartWorks IoT platform, all of which enable manufacturers to stream high-velocity machine data and analyse it locally in real time using machine learning algorithms from Telchemy, whilst feeding digital twin and analytics information into the cloud for central processing. ASAPIO provides the SAP event-enabling capabilities. All of this is implemented in a fun setting – a Carrera race track. We detect crashes in real time, predict them before they happen, and update our digital twins in the cloud to reflect the state of the track. We also update quality control & work orders across your SAP estate.
Accelerate the Adoption of Event-Driven ArchitectureSolace
Mychelle Mollot, Solace CMO, and Michael Hilmen, Principal Architect at Solace, presented at BoomiWorld 19 on how to accelerate the adoption of event-driven architecture with Solace and Dell Boomi.
This was delivered by Sumeet Puri (Senior Vice President, Global Head of Systems Engineering) at the Singapore Cricket Club on September 18th, 2019.
Topics covered include: event-driven architecture, event brokers, event mesh, becoming an event-driven enterprise, real-time data streaming, event streaming, event management
What the Evolution of Connected Car Platforms Can Teach Us About Building Ada...Solace
"Are We There Yet?": Swen-Helge Huber, Principal IoT Architect at Solace, explains what the evolution of connected car platforms can teach us about building adaptable and extendable IoT platforms at the 2019 IoT Tech Expo Europe in Amsterdam.
Async API and Solace: Enabling the Event-Driven FutureSolace
Fran Méndez, founder of AsyncAPI, and Jonathan Schabowsky, senior architect at Solace, explain how the two companies are working together in this presentation from Gartner AADI.
Gartner CIO & IT Executive Summit -- Event Mesh: The Architecture Layer That ...Solace
Crispin Clarke, SVP Europe at Solace and Martin Bachmann, Head of Product Management, Connected Cores SAP SE, presented at Gartner CIO & IT Executive Summit 2019 in Munich.
This presentation from our Hong Kong user group, held on March 7, 2019, went over how the event-driven transformation is driving the Solace event mesh vision.
IDC Insights Awards 2018 - What is an Event Mesh?Solace
Sumeet Puri, Senior Vice President and Global Head of Systems Engineering at Solace, presented at the IDC Insights Awards in Chandigarh, India in December 2018. He explained what an event mesh is, and how the architecture layer can make a business event-driven.
Generating a custom Ruby SDK for your web service or Rails API using Smithyg2nightmarescribd
Have you ever wanted a Ruby client API to communicate with your web service? Smithy is a protocol-agnostic language for defining services and SDKs. Smithy Ruby is an implementation of Smithy that generates a Ruby SDK using a Smithy model. In this talk, we will explore Smithy and Smithy Ruby to learn how to generate custom feature-rich SDKs that can communicate with any web service, such as a Rails JSON API.
Key Trends Shaping the Future of Infrastructure.pdfCheryl Hung
Keynote at DIGIT West Expo, Glasgow on 29 May 2024.
Cheryl Hung, ochery.com
Sr Director, Infrastructure Ecosystem, Arm.
The keynote covers the key trends across hardware, cloud, and open source, exploring how these areas are likely to mature and develop over the short and long term, and considering how organisations can position themselves to adapt and thrive.
Neuro-symbolic is not enough, we need neuro-*semantic*Frank van Harmelen
Neuro-symbolic (NeSy) AI is on the rise. However, simply doing machine learning on just any symbolic structure is not sufficient to really harvest the gains of NeSy. These gains only materialize when the symbolic structures have an actual semantics. I give an operational definition of semantics as “predictable inference”.
All of this illustrated with link prediction over knowledge graphs, but the argument is general.
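The idea of semantics as "predictable inference" can be made concrete with a toy example (my own illustration, not from the talk): when a knowledge-graph relation such as `subClassOf` carries a defined transitive semantics, new links follow deterministically from the existing ones, which is exactly the kind of predictable link prediction the abstract refers to.

```python
# Toy illustration (not from the talk): semantics as "predictable inference".
# Because subClassOf has a defined transitive semantics, new links can be
# inferred deterministically via a transitive-closure fixpoint.

def infer_subclass_links(triples):
    """Return all (a, 'subClassOf', c) links entailed by transitivity."""
    direct = {(s, o) for s, p, o in triples if p == "subClassOf"}
    closure = set(direct)
    changed = True
    while changed:  # naive fixpoint iteration
        changed = False
        for a, b in list(closure):
            for c, d in direct:
                if b == c and (a, d) not in closure:
                    closure.add((a, d))
                    changed = True
    return {(a, "subClassOf", c) for a, c in closure}

# Hypothetical mini knowledge graph.
kg = [
    ("Poodle", "subClassOf", "Dog"),
    ("Dog", "subClassOf", "Mammal"),
    ("Mammal", "subClassOf", "Animal"),
]

inferred = infer_subclass_links(kg)
# The semantics make ("Poodle", "subClassOf", "Animal") a predictable inference.
print(("Poodle", "subClassOf", "Animal") in inferred)  # True
```

Without such a semantics attached to the relation, a learner has nothing principled to generalize from; with it, the entailed links are fully predictable.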
Smart TV Buyer Insights Survey 2024 by 91mobiles.pdf91mobiles
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, aspects they look at on a new TV, and their TV buying preferences.
Securing your Kubernetes cluster_ a step-by-step guide to success !KatiaHIMEUR1
Today, after several years of existence, an extremely active community and an ultra-dynamic ecosystem, Kubernetes has established itself as the de facto standard in container orchestration. Thanks to a wide range of managed services, it has never been so easy to set up a ready-to-use Kubernetes cluster.
However, this ease of use means that the subject of security in Kubernetes is often left for later, or even neglected. This exposes companies to significant risks.
In this talk, I'll show you step-by-step how to secure your Kubernetes cluster for greater peace of mind and reliability.
Builder.ai Founder Sachin Dev Duggal's Strategic Approach to Create an Innova...Ramesh Iyer
In today's fast-changing business world, companies that fail to adapt and embrace new ideas often struggle to keep up with the competition. Fostering a culture of innovation takes much work: it takes vision, leadership, and a willingness to take risks in the right proportion. Sachin Dev Duggal, co-founder of Builder.ai, has perfected the art of this balance, creating a company culture where creativity and growth are nurtured at each stage.
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo...James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. The constant focus on speed of releasing software to market, combined with traditionally slow and manual security checks, has created gaps in continuous security, an important piece of the software supply chain. Today, organizations feel more susceptible to external and internal cyber threats due to the vast attack surface of their application supply chains and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with a passion for making things work and a knack for helping others understand how things work. He brings around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations on CI/CD and application security integrated into the software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered QualityInflectra
In this insightful webinar, Inflectra explores how artificial intelligence (AI) is transforming software development and testing. Discover how AI-powered tools are revolutionizing every stage of the software development lifecycle (SDLC), from design and prototyping to testing, deployment, and monitoring.
Learn about:
• The Future of Testing: How AI is shifting testing towards verification, analysis, and higher-level skills, while reducing repetitive tasks.
• Test Automation: How AI-powered test case generation, optimization, and self-healing tests are making testing more efficient and effective.
• Visual Testing: Explore the emerging capabilities of AI in visual testing and how it's set to revolutionize UI verification.
• Inflectra's AI Solutions: See demonstrations of Inflectra's cutting-edge AI tools like the ChatGPT plugin and Azure Open AI platform, designed to streamline your testing process.
Whether you're a developer, tester, or QA professional, this webinar will give you valuable insights into how AI is shaping the future of software delivery.
GraphRAG is All You need? LLM & Knowledge GraphGuy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
13. We Need to Manage Events…
Because we need answers to…
• Where do you discover events/topics and the schema definition that defines the payload?
• What logical event address (topic) do you subscribe to in order to receive just the events you want to do something with?
• Why does a given event exist, i.e. what is its context and purpose? And if you can’t figure that out…
• Who do you contact to learn more about events and their context/purpose?
• When will a given event be available or deprecated?
• How do developers/architects define their event-driven application interfaces, and how do they generate code?
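The "what topic do you subscribe to" question above can be sketched with a minimal, broker-agnostic example (hypothetical topic names, not tied to any particular product): hierarchical topics are matched against a subscription string where, by assumption, "*" matches exactly one level and ">" matches one or more trailing levels, conventions similar to those used by common event brokers.

```python
# Minimal, broker-agnostic sketch (hypothetical topic names) of how a
# hierarchical topic subscription selects just the events you care about.
# Assumed wildcard semantics: "*" matches exactly one level,
# ">" matches one or more trailing levels.

def matches(subscription: str, topic: str) -> bool:
    sub_levels = subscription.split("/")
    top_levels = topic.split("/")
    for i, s in enumerate(sub_levels):
        if s == ">":                      # multi-level tail wildcard:
            return len(top_levels) > i    # needs at least one more level
        if i >= len(top_levels):
            return False                  # topic ran out of levels
        if s != "*" and s != top_levels[i]:
            return False                  # literal level mismatch
    return len(sub_levels) == len(top_levels)

events = [
    "retail/order/created/eu",
    "retail/order/cancelled/eu",
    "retail/payment/settled/us",
]

# Subscribe only to order events, any action, any region:
wanted = [t for t in events if matches("retail/order/*/*", t)]
print(wanted)  # ['retail/order/created/eu', 'retail/order/cancelled/eu']
```

A well-designed topic hierarchy is what makes this kind of fine-grained subscription possible, which is one reason event management tooling puts so much emphasis on discovering and documenting topic structure.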