Organisations are becoming event driven, building on streaming technologies and adopting Data Mesh and Event Mesh architectures. As this becomes pervasive, so do the challenges around runtime governance and lifecycle management. For example: do you know what streams exist, and who is producing and consuming them? What is the effect of upstream changes? How is this information kept up to date, and how do people collaborate efficiently across distributed teams and environments? Ever wish you could graphically visualize the relationships between schemas, topics and applications? In this talk we will show you how to do that and how to get more value from your Kafka streaming infrastructure using an Event Portal - an API portal specialised for event streams and publish/subscribe patterns. Join us to see how you can discover event streams from your Kafka clusters, import them into a catalog alongside other enterprise event streams, and leverage code generation capabilities to ease development.
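The discover-and-catalog flow mentioned above can be sketched with a small, self-contained example. The topic metadata below is hard-coded for illustration; in practice it would be fetched from a live cluster (for instance via Kafka's AdminClient API), and the catalog-entry shape is an assumption, not any portal's actual schema.

```python
# Sketch: turn discovered Kafka topic metadata into minimal catalog entries.
# The topic list is hard-coded here; a real discovery step would query the
# cluster for topics and a schema registry for their schemas.

discovered_topics = [
    {"name": "orders.created", "partitions": 6,
     "value_schema": {"type": "record", "name": "OrderCreated"}},
    {"name": "payments.settled", "partitions": 3,
     "value_schema": {"type": "record", "name": "PaymentSettled"}},
]

def to_catalog_entry(topic):
    """Build a minimal, illustrative event-catalog entry for one topic."""
    return {
        "event": topic["value_schema"]["name"],
        "channel": topic["name"],
        "partitions": topic["partitions"],
        "producers": [],   # filled in as teams register their applications
        "consumers": [],
    }

catalog = {t["name"]: to_catalog_entry(t) for t in discovered_topics}
print(sorted(catalog))                      # ['orders.created', 'payments.settled']
print(catalog["orders.created"]["event"])   # OrderCreated
```

Once streams sit in a catalog like this, relationships between schemas, topics, and the applications registered as producers and consumers can be rendered as a graph, which is the visualization the talk demonstrates.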
6. Data Mesh meets Event Mesh
• Data Mesh
– Addresses the democratisation of data ownership across the enterprise
– Evolved out of the analytical domain
– Data changes are events
– Log-style Event Broker
• Event Mesh
– An architectural layer that decouples applications, allowing events to flow across the enterprise
– Evolved out of the transactional / middleware domain; also found in IoT
– Events can carry data
– Queue / Pub-Sub-style Event Broker
• Both
– Monolith to microservices
– Decentralise / distribute compute
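The broker-style distinction between the two meshes can be made concrete with a toy in-memory sketch (the class and method names are illustrative, not any real broker's API): a log-style broker retains events and lets each consumer keep its own offset and replay independently, while a queue-style broker delivers each message to one consumer and removes it.

```python
from collections import deque

class LogBroker:
    """Log-style (Kafka-like): events are retained; each consumer
    tracks its own offset, so multiple consumers read independently."""
    def __init__(self):
        self.log = []
        self.offsets = {}          # consumer name -> next offset to read

    def publish(self, event):
        self.log.append(event)

    def poll(self, consumer):
        off = self.offsets.get(consumer, 0)
        events = self.log[off:]
        self.offsets[consumer] = len(self.log)
        return events

class QueueBroker:
    """Queue-style: each message is handed to exactly one consumer
    and removed from the queue on delivery."""
    def __init__(self):
        self.queue = deque()

    def publish(self, event):
        self.queue.append(event)

    def take(self):
        return self.queue.popleft() if self.queue else None

log = LogBroker()
log.publish("order.created")
log.publish("order.shipped")
print(log.poll("billing"))    # ['order.created', 'order.shipped']
print(log.poll("analytics"))  # independent cursor: both events again

q = QueueBroker()
q.publish("work-item-1")
print(q.take())               # 'work-item-1'
print(q.take())               # None: consumed once, then gone
```

The replayable log suits the Data Mesh's analytical workloads, where new consumers often need history; the consume-once queue suits the transactional workloads the Event Mesh grew out of.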