The Cisco WAN Automation Engine (WAE) is multivendor software designed to automate, plan, build and optimize your network. This deep dive session focuses on WAE, the problems it solves, and how it solves them.
DEVNET-1129 WAN Automation Engine - Develop Traffic Aware Applications Using ... | Cisco DevNet
The Cisco WAN Automation Engine (WAE) is multivendor software designed to automate, plan, build and optimize your network. This session will introduce WAE and how to leverage its REST APIs.
DEVNET-1129 WAN Automation Engine - Develop Traffic Aware Applications Using ... | Cisco DevNet
This document discusses the Cisco WAN Automation Engine (WAE), which provides network optimization, automation, modeling, and predictive analysis capabilities. It analyzes historical and real-time data to find hot spots and improve network efficiency. The WAE includes programmatic network control, extensible data models, and intelligent traffic balancing. It consists of a network planning suite, visualization/analytics tools, and optimization/prediction functions that are accessed through REST APIs. Example applications include optimizing bandwidth placement, automated tunnel management, inventory/maintenance tools, offline planning and analysis, and visual analytics. Use cases involve a unified multi-layer view and support for segment routing.
Should we manage events like APIs? | Alan Chatt and Kim Clark, IBM | HostedbyConfluent
APIs have become ubiquitous as a way of exposing the capabilities of the enterprise both internally and externally. However, are APIs alone enough? There is a strong resurgence of interest in asynchronous communication and event-driven architecture. Applications want to receive events immediately so they can respond in real time, and they also want the benefit of being decoupled from the availability and performance characteristics of the systems providing that data. However, while the way that APIs are socialised, exposed, and versioned is well matured in the form of API management technology, events have no equivalent yet. We are now on the cusp of seeing first-class support for event endpoint management to provide the same sophistication for discovering, exposing and consuming events.
How does LinkedIn monitor its network infrastructure?
Slides from the talk: Infrastructure Engineering @Scale, Meetup (https://www.meetup.com/Infrastructure-Engineering-Scale/events/243011551/)
4th Nov 2017
Bangalore, IN
Real-time Analytics with Upsert Using Apache Kafka and Apache Pinot | Yupeng ... | HostedbyConfluent
This document discusses real-time analytics using Apache Kafka and Apache Pinot. It describes how Uber uses Apache Kafka for data streaming and Apache Pinot for real-time queries. The key challenge discussed is implementing upserts (updates to existing data records) in Pinot. Various designs for a global or local coordinator to handle upserts are considered. The adopted design leverages Kafka's partitioning to distribute segments by primary key locally. Limitations and future work are noted around input partitioning, data retention, and partial updates.
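The adopted design can be sketched in a few lines of Python: key each Kafka record by its primary key so every version of a row lands on the same partition, then apply last-write-wins locally. The hash below is a toy stand-in for Kafka's murmur2 default partitioner, and the record shapes are invented for illustration:

```python
def partition_for(key: str, num_partitions: int) -> int:
    """Toy stand-in for Kafka's default partitioner (a murmur2 hash of
    the record key): the same key always maps to the same partition."""
    return sum(key.encode()) % num_partitions

# Records keyed by primary key land on one partition, so a single Pinot
# server sees every version of a key and can apply the upsert locally.
events = [
    {"order_id": "o1", "status": "created"},
    {"order_id": "o2", "status": "created"},
    {"order_id": "o1", "status": "shipped"},  # update to an existing record
]
partitions = {}
for e in events:
    p = partition_for(e["order_id"], num_partitions=4)
    partitions.setdefault(p, {})[e["order_id"]] = e  # upsert: last write wins

merged = {k: v for part in partitions.values() for k, v in part.items()}
print(merged["o1"]["status"])  # shipped
```

Because co-partitioning does the routing, no global coordinator is needed; the trade-off, as the abstract notes, is that correctness depends on the input topic staying partitioned by that key.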
Server-Sent Events using Reactive Kafka and Spring WebFlux | Gagan Solur Ven... | HostedbyConfluent
Server-Sent Events (SSE) is a server push technology in which clients receive automatic updates from the server over a secure HTTP connection. SSE suits apps such as live stock tickers that need one-way data flow, and it can replace long polling by maintaining a single connection and keeping a continuous event stream flowing through it. We used a simple Kafka producer to publish messages onto Kafka topics, and developed a reactive Kafka consumer, leveraging Spring WebFlux, that reads data from a Kafka topic in a non-blocking manner and sends it to clients registered with the consumer without closing any HTTP connections. This implementation lets us send data in a fully asynchronous, non-blocking manner and handle a massive number of concurrent connections. We’ll cover:
•Push data to external or internal apps in near real time
•Push data to files and securely copy them to any cloud service
•Handle multiple third-party app integrations
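The Kafka and WebFlux plumbing is beyond a summary, but the SSE wire format itself is simple enough to sketch. The helper below (a hypothetical name, not from the talk) renders one event frame exactly as a server would write it to the open HTTP response:

```python
def sse_event(data, event=None, event_id=None):
    """Format a message as a Server-Sent Events frame.

    Per the SSE wire format, a frame is a set of `field: value` lines
    terminated by a blank line; multi-line data becomes several
    `data:` lines that the browser rejoins.
    """
    lines = []
    if event_id is not None:
        lines.append(f"id: {event_id}")
    if event is not None:
        lines.append(f"event: {event}")
    for chunk in str(data).split("\n"):
        lines.append(f"data: {chunk}")
    return "\n".join(lines) + "\n\n"

frame = sse_event('{"symbol": "ACME", "price": 101.5}', event="quote", event_id=7)
print(frame)
```

In the architecture described, each record read from the Kafka topic would be encoded like this and flushed to every registered client over its single long-lived connection.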
Continuous Intelligence for Customer Service Using Kafka Event Streams | Simo... | HostedbyConfluent
Today’s products - devices, software and services - are well instrumented to permit users, vendors and service providers to gather maximum insight into how they are used, when they need repair, and many other operational matters. Ensuring that products can rapidly adapt to a constantly changing environment and changing customer needs requires that the events they generate are analyzed continuously and in context. Insights can be synthesized from many sources in context: geospatial and proximity, trajectory, and even predicted future states. Customers, vendors and service providers need to analyze, learn, and predict directly from streaming events because data volumes are huge and automated responses must often be delivered in milliseconds. To achieve insights quickly, we need to build models on the fly whose predictions are accurate and in sync with the real world, often to support automation. Many insights depend on analyzing the joint evolution of data sources whose behavior is correlated in time or space. In this talk we present Swim, an Apache 2.0 licensed platform for continuous intelligence applications. Swim builds a fluid model of data sources and their changing relationships in real time; Swim applications analyze, learn and predict directly from event data, and integrate with Apache Kafka for event streaming. Developers need nothing more than Java skills. Swim deploys natively or in containers on Kubernetes, with the same code in each instance. Instances link to build an application-layer mesh that facilitates distribution and massive scale without sacrificing consistency. We will present several continuous intelligence applications in use today that depend on real-time analysis, learning and prediction to power automation and deliver responses that are in sync with the real world.
We will show how easy it is to build, deploy and run distributed, highly available event streaming applications that analyze data from hundreds of millions of sources - petabytes per day. The architecture is intuitively appealing and blazingly fast.
Kafka & InfluxDB: BFFs for Enterprise Data Applications | Russ Savage, Influx... | HostedbyConfluent
Modern data processing applications built on Kafka and InfluxDB deliver the performance, reliability, and flexibility that customers need for robust real-time data pipeline solutions. As the saying goes, the pipeline is greater than the sum of its Kafka and InfluxDB parts. In this session, Russ Savage, Director of Product Management at InfluxData, will discuss basic concepts of integrating Kafka and InfluxDB while highlighting how companies are creating fault-tolerant, scalable and fast data pipelines with the power of InfluxDB and Kafka.
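A common shape for such a pipeline is a Kafka consumer that renders each record in InfluxDB's line protocol before writing it out. A simplified encoder is sketched below, assuming float and string fields only and skipping the escaping and integer-suffix rules that the official clients handle:

```python
def to_line_protocol(measurement, tags, fields, timestamp_ns):
    """Render one point in InfluxDB line protocol:
    measurement,tag=val field=val timestamp"""
    tag_str = ",".join(f"{k}={v}" for k, v in sorted(tags.items()))
    field_str = ",".join(
        f'{k}="{v}"' if isinstance(v, str) else f"{k}={v}"
        for k, v in sorted(fields.items())
    )
    return f"{measurement},{tag_str} {field_str} {timestamp_ns}"

line = to_line_protocol("cpu", {"host": "web01"}, {"usage": 0.64},
                        1_630_000_000_000_000_000)
print(line)  # cpu,host=web01 usage=0.64 1630000000000000000
```

In practice a consumer would batch many such lines per write to InfluxDB, which is where the fault tolerance and throughput of the combined pipeline come from.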
PCAP Graphs for Cybersecurity and System Tuning | Dr. Mirko Kämpf
This document discusses analyzing network traffic patterns in Hadoop clusters. Packet capture data was collected from example Hadoop workloads and analyzed using Gephi. Initial results show the network structure and communication between nodes for batch processing (TeraSort) and real-time streaming (Twitter collection). Further analysis aims to classify components, understand dependencies, and identify anomalies over time to better understand typical and atypical workload behavior.
Extracting Value from IoT using Azure Cosmos DB, Azure Synapse Analytics and ... | HostedbyConfluent
Due to the explosion of IoT, we have streaming data that needs to be processed in real time and made available both to applications and to analytics scenarios such as anomaly detection. This workshop presents a solution using Confluent Cloud on Azure, Azure Cosmos DB and Azure Synapse Analytics, connected securely within an Azure VNET using Azure Private Link configured on the Kafka clusters.
Driving a Digital Thread Program in Manufacturing with Apache Kafka | Anu Mis... | HostedbyConfluent
Forward-looking manufacturing companies have recognized the value of digital threads that bring together design and product information across the product life cycle, connecting the dots as information flows from design to manufacturing and on to services. Creating a reliable, scalable infrastructure to support digital thread programs can be a significant challenge, given the wide variety of legacy systems involved. At Mercury Systems we are using Kafka and Confluent to drive our digital thread program and put in place a product lifecycle management process for Industry 4.0. With the substantial year-on-year growth we were seeing, we needed a cloud-ready solution that goes beyond a basic, API-based integration layer based on Mulesoft or similar technology. If you’re wondering why Kafka makes sense for a digital thread, join us to learn how a real-time event streaming platform enables core strategies around ML/AI, microservices, model-based system engineering, and continuous improvement.
Streaming Data in the Cloud with Confluent and MongoDB Atlas | Robert Walters... | HostedbyConfluent
This document discusses streaming data between Confluent Cloud and MongoDB Atlas. It provides an overview of MongoDB Atlas and its fully managed database capabilities in the cloud. It then demonstrates how to stream data from a Python generator application to MongoDB Atlas using Confluent Cloud and its connectors. The presentation concludes by providing a reference architecture for connecting Confluent Platform to MongoDB.
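The source doesn't show the generator application itself; a minimal assumed shape is sketched below. In the demonstrated setup, each yielded document would be handed to a Kafka producer (e.g. `producer.produce(topic, msg)` with the confluent-kafka client), and a MongoDB sink connector in Confluent Cloud would write it on to Atlas:

```python
import json
import random
import time

def order_events(n):
    """Hypothetical generator application: yields JSON documents
    destined for a Kafka topic and, via a sink connector, MongoDB Atlas."""
    for i in range(n):
        yield json.dumps({
            "order_id": i,
            "amount": round(random.uniform(5, 500), 2),
            "ts": time.time(),
        })

for msg in order_events(3):
    print(msg)
```

The field names and value ranges here are invented; the point is only that the application emits self-describing JSON, which the connector can map onto MongoDB documents without extra transformation.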
Blockchain and Kafka - A Modern Love Story | Suhavi Sandhu, Guidewire Software | HostedbyConfluent
After familiarizing myself with blockchain over the past couple years, I noticed a few caveats and setbacks that make blockchain a ‘not-so-ideal’ solution to record tracking in a P2P network. It’s not as scalable and it’s quite slow in a busy network. Enter Apache Kafka, with its high performance and immutable logging. In this talk, I want to explore the relationship between Blockchain and Kafka and demonstrate how the two technologies can benefit from each other. If you’re interested in the future of blockchain and love Kafka, this is definitely up your alley.
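The comparison rests on what a blockchain adds to an append-only log like Kafka's: each block's hash covers its predecessor's hash, so rewriting any earlier record breaks every later link. A minimal sketch of that chaining (illustrative only, not either system's implementation):

```python
import hashlib
import json

def append_block(chain, payload):
    """Append a block whose hash covers the previous block's hash,
    making any later tampering with earlier records detectable."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = json.dumps({"prev": prev_hash, "payload": payload}, sort_keys=True)
    chain.append({"prev": prev_hash, "payload": payload,
                  "hash": hashlib.sha256(body.encode()).hexdigest()})
    return chain

chain = []
append_block(chain, {"tx": "alice->bob", "amount": 10})
append_block(chain, {"tx": "bob->carol", "amount": 4})

# Tamper with block 0: its recomputed hash no longer matches the
# `prev` hash that block 1 recorded at append time.
chain[0]["payload"]["amount"] = 999
body0 = json.dumps({"prev": chain[0]["prev"], "payload": chain[0]["payload"]},
                   sort_keys=True)
print(hashlib.sha256(body0.encode()).hexdigest() == chain[1]["prev"])  # False
```

Kafka's log gives the same ordering and immutability within a trusted cluster, without the per-block hashing and consensus overhead that slow a busy P2P blockchain down.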
This document discusses several Azure serverless services for building event-driven applications at massive scale including Event Hubs for high-volume data streams, Service Bus for critical workflows, Event Grid for business logic triggered by events, and IoT Hub. It highlights key capabilities like near real-time processing, high reliability, and massive throughput of these services.
Evolving the Engineering Culture to Manage Kafka as a Service | Kate Agnew, O... | HostedbyConfluent
Embracing open source software for critical platform operations is a tough organizational evolution for a company of any size. This is particularly daunting for technology teams accustomed to a fully supported managed service. Come learn about how we are using OSS to modernize Health Care at UnitedHealth Group as a roadmap to adopt and offer OSS in your own organization!
Over the last three years, Kafka as a Service within UnitedHealth Group has gone from non-existent to being centrally managed and utilized by over 200 internal application teams as an essential component of our ecosystem. In this session, I will share how to tactically implement a Kafka as a Service platform offering within any organization with a very lean team, and how to get broad adoption from engineers and leadership.
I'll discuss the engineering cultural changes needed, both on the DevOps team as well as more broadly, to adopt OSS. Spoiler: Documentation is the key to success. I will talk about some of our "aha" moments, including the importance of internal Terms of Service and how to encourage teams to "Google first." I will include things that haven't worked as well, such as requiring manual review of all topic creation PRs (this doesn't scale!).
Attendees will learn how to both stand up their own OSS offering as well as how to be a good internal consumer of other such offerings. Come ready to learn and laugh about my journey to offering OSS to thousands of people!
The document provides an agenda for the Government Track at the Kafka Summit 2021. The agenda includes sessions on topics like improving veteran benefit services through efficient data streaming, Kafka migration for satellite event streaming data, Kafka powered near real-time data pipelines at extreme scale, transformation during a global pandemic, securing the message bus with Kafka streams, Kafka for connected vehicle research, and driving a digital thread program in manufacturing with Apache Kafka. Speakers include representatives from Booz Allen Hamilton, ASRC Federal, University of California San Diego, Confluent, Raft LLC, Leidos, Ohio Department of Transportation, and Mercury Systems.
Event & Data Mesh as a Service: Industrializing Microservices in the Enterpri... | HostedbyConfluent
Kafka is widely positioned as the proverbial "central nervous system" of the enterprise. In this session, we explore how the central nervous system can be used to build a mesh topology & unified catalog of enterprise wide events, enabling development teams to build event driven architectures faster & better.
The central theme also draws on idioms from API management, service meshes, workflow management and service orchestration; we compare how these approaches can be harmonized with Kafka.
We will also touch upon the topic of how this relates to Domain Driven Design, CQRS & other patterns in microservices.
Some potential takeaways for the discerning audience:
1. Opportunities in a platform approach to Event Driven Architecture in the enterprise
2. Adopting a product mindset around Data & Event Streams
3. Seeking harmony with allied enterprise applications
ACDKOCHI19 - Complete Media Content Management System and Website on Serverless | AWS User Group Kochi
AWS Community Day Kochi 2019 - Technical Session
Complete Media Content Management System and Website on Serverless by Anoop Mohan, Associate Director Of Technology at Asianet
Accelerating Innovation with Apache Kafka, Heikki Nousiainen | Heikki Nousiai... | HostedbyConfluent
As a pioneer in the interactive gaming industry, Sony PlayStation has played a vital role in technological advancement, helping bring the global video gaming community together. With the recent launch of the next-generation PS5 console, built in partnership with thousands of game developers and millions of video gamers across the globe, the PlayStation servers inevitably generate huge volumes of data. This presentation talks about how we leveraged big data technologies along with Apache Kafka to solve some real-time data analytics problems. Two important case studies we carried out recently are: "Competitive pricing analysis of game titles across online video game marketplaces" and "Understanding gamer sentiment by streaming data from social feeds and performing NLP".
Along with Apache Kafka, the technologies we used to architect the solution are: REST APIs, ZooKeeper, D3.js visualization, Domo, Python, SQL, NLP, AWS Cloud and JSON.
How to Define and Share your Event APIs using AsyncAPI and Event API Products... | HostedbyConfluent
Defining Asynchronous APIs and sharing them with your developer community is the most effective way for internal app developers and partners to create new services using real-time event streams. But how do you do it? What specification do you use to define the APIs? What are the best practices for sharing them with the developer community? What framework can you use to code? And what’s next? How do you manage the lifecycle of these APIs? In this talk, Fran Mendez, founder of AsyncAPI and Jonathan Schabowsky, Solace CTO Architect will introduce you to the AsyncAPI specification and show you two different methods to define and share your event APIs, quickly get up to speed, and more. You will learn how to create a Kafka application using asynchronous APIs in minutes!
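As a taste of what the specification looks like, here is a hypothetical AsyncAPI 2.x document describing a single Kafka topic as an event API; the channel name, operation and payload schema are invented for illustration:

```yaml
asyncapi: '2.6.0'
info:
  title: Order Events API
  version: '1.0.0'
channels:
  orders.created:          # the Kafka topic, exposed as a channel
    subscribe:             # consumers subscribe to these events
      operationId: onOrderCreated
      message:
        name: OrderCreated
        contentType: application/json
        payload:
          type: object
          properties:
            orderId:
              type: string
            amount:
              type: number
```

A document like this plays the role an OpenAPI file plays for REST: it can be published to a developer portal, versioned, and used to generate consumer code.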
How a Data Mesh is Driving our Platform | Trey Hicks, Gloo | HostedbyConfluent
At Gloo.us, we face a challenge in providing platform data to heterogeneous applications in a way that eliminates access contention, avoids high latency ETLs, and ensures consistency for many teams. We're solving this problem by adopting Data Mesh principles and leveraging Kafka, Kafka Connect, and Kafka streams to build an event driven architecture to connect applications to the data they need. A domain driven design keeps the boundaries between specialized process domains and singularly focused data domains clear, distinct, and disciplined. Applying the principles of a Data Mesh, process domains assume the responsibility of transforming, enriching, or aggregating data rather than relying on these changes at the source of truth -- the data domains. Architecturally, we've broken centralized big data lakes into smaller data stores that can be consumed into storage managed by process domains.
This session covers how we’re applying Kafka tools to enable our data mesh architecture. This includes how we interpret and apply the data mesh paradigm, the role of Kafka as the backbone for a mesh of connectivity, the role of Kafka Connect to generate and consume data events, and the use of KSQL to perform minor transformations for consumers.
Simple infrastructure monitoring, from ingest to visualization | Elasticsearch
Visibility into your infrastructure is essential, whether it runs on your own machines or in the cloud, virtualized, in containers, or in a hybrid environment. The Elastic (ELK) Stack, historically known for its logging capabilities, also lets you monitor your metrics with the same performance. Discover how we make data ingestion easy with hundreds of prebuilt integrations, improve your day-to-day with alerting and machine learning, and enhance your visualizations with new tools built for monitoring use cases.
SingleStore & Kafka: Better Together to Power Modern Real-Time Data Architect... | HostedbyConfluent
To remain competitive, organizations need to democratize access to fast analytics, not only to gain real-time insights on their business but also to power smart apps that need to react in the moment. In this session, you will learn how Kafka and SingleStore enable a modern, yet simple, data architecture to analyze both fast-paced incoming data and large historical datasets. In particular, you will understand why SingleStore is well suited to processing data streams coming from Kafka.
TBD Data Governance | David Araujo and Michael Agnich, Confluent | HostedbyConfluent
The document discusses Confluent Stream Governance, a solution for governing data in motion with metadata. It introduces tools for managing schemas, classifying metadata, tracking lineage, and monitoring data quality. This helps bring order to what would otherwise be a "giant mess" of ungoverned data by enforcing standards and providing visibility into data flows and definitions.
Data Con LA 2019 - Large scale streaming analytics using cloud based managed ... | Data Con LA
Ingest, store, analyze, and monitor a large volume of real-time streaming data using cloud-based managed services. The reference architecture showcases the following solution areas using managed services/serverless technologies:
•Set up and manage Multi-Region infrastructure via CI/CD
•Ingest and store a large volume of streaming data (e.g. 100K records/second)
•Analyze and derive insights in near real time
•Monitor the infrastructure and pipelines
DataOps Automation for a Kafka Streaming Platform (Andrew Stevenson + Spiros ... | HostedbyConfluent
DataOps challenges us to build data experiences in a repeatable way. For those with Kafka, this means finding a means of deploying flows in an automated and consistent fashion.
The challenge is to make the deployment of Kafka flows consistent across different technologies and systems - the topics, the schemas, the monitoring rules, the credentials, the connectors, the stream processing apps - and ideally not coupled to a particular infrastructure stack.
In this talk we will discuss the different approaches and benefits/disadvantages to automating the deployment of Kafka flows including Git operators and Kubernetes operators. We will walk through and demo deploying a flow on AWS EKS with MSK and Kafka Connect using GitOps practices: including a stream processing application, S3 connector with credentials held in AWS Secrets Manager.
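A flow definition of the kind demoed might include a Kafka Connect S3 sink configured roughly as below. The bucket, topic and secret references are placeholders, and the exact `${...}` syntax depends on which ConfigProvider is wired up to AWS Secrets Manager:

```json
{
  "name": "orders-s3-sink",
  "config": {
    "connector.class": "io.confluent.connect.s3.S3SinkConnector",
    "topics": "orders",
    "s3.bucket.name": "example-flow-bucket",
    "s3.region": "eu-west-1",
    "storage.class": "io.confluent.connect.s3.storage.S3Storage",
    "format.class": "io.confluent.connect.s3.format.json.JsonFormat",
    "flush.size": "1000",
    "aws.access.key.id": "${secrets:connect/s3:access-key-id}",
    "aws.secret.access.key": "${secrets:connect/s3:secret-access-key}"
  }
}
```

Keeping a declaration like this in Git is what makes the GitOps approach work: the operator reconciles the cluster against the committed file rather than against ad hoc REST calls.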
Here are the slides from Farid Jiandani & Joe Onisick's PuppetConf 2016 presentation called PuppetConf 2016: Application Centric Automation with Puppet & Cisco. Watch the videos at https://www.youtube.com/playlist?list=PLV86BgbREluVjwwt-9UL8u2Uy8xnzpIqa
Rome 2017: Building advanced voice assistants and chat bots | Cisco DevNet
If it takes minutes to code a simple bot, building professional bots represents quite a challenge. Soon you realize you need serious programming and API architecture experience but also “Bot” specific skills. In this session, we'll first show the code of advanced Chat and Voice interactions, and then explore the challenges faced when building advanced Bots (Context storage, NLP approaches, Bot Metadata, OAuth scopes), and discuss interesting opportunities from latest industry trends (Bot platforms, Serverless, Microservices). This talk is about showing the code and sharing lessons learned.
PCAP Graphs for Cybersecurity and System TuningDr. Mirko Kämpf
This document discusses analyzing network traffic patterns in Hadoop clusters. Packet capture data was collected from example Hadoop workloads and analyzed using Gephi. Initial results show the network structure and communication between nodes for batch processing (TeraSort) and real-time streaming (Twitter collection). Further analysis aims to classify components, understand dependencies, and identify anomalies over time to better understand typical and atypical workload behavior.
Extracting Value from IOT using Azure Cosmos DB, Azure Synapse Analytics and ...HostedbyConfluent
Due to explosion of IoT, we have streaming data that needs to be processed in real-time. This needs to be made available for applications as well as analytics scenarios such as anomaly detection. This workshop presents a solution using Confluent Cloud on Azure, Azure Cosmos DB and Azure Synapse Analytics which can be connected in a secure way within Azure VNET using Azure Private link configured on Kafka clusters.
Driving a Digital Thread Program in Manufacturing with Apache Kafka | Anu Mis...HostedbyConfluent
Forward-looking manufacturing companies have recognized the value of digital threads that bring together design and product information across the product life cycle, connecting the dots as information flows from design to manufacturing and on to services. Creating a reliable, scalable infrastructure to support digital thread programs can be a significant challenge, given the wide variety of legacy systems involved. At Mercury Systems we are using Kafka and Confluent to drive our digital thread program and put in place a product lifecycle management process for Industry 4.0. With the substantial year-on-year growth we were seeing, we needed a cloud-ready solution that goes beyond a basic, API-based integration layer based on Mulesoft or similar technology. If you’re wondering why Kafka makes sense for a digital thread, join us to learn how a real-time event streaming platform enables core strategies around ML/AI, microservices, model-based system engineering, and continuous improvement.
Streaming Data in the Cloud with Confluent and MongoDB Atlas | Robert Walters...HostedbyConfluent
This document discusses streaming data between Confluent Cloud and MongoDB Atlas. It provides an overview of MongoDB Atlas and its fully managed database capabilities in the cloud. It then demonstrates how to stream data from a Python generator application to MongoDB Atlas using Confluent Cloud and its connectors. The presentation concludes by providing a reference architecture for connecting Confluent Platform to MongoDB.
Blockchain and Kafka - A Modern Love Story | Suhavi Sandhu, Guidewire SoftwareHostedbyConfluent
After familiarizing myself with blockchain over the past couple years, I noticed a few caveats and setbacks that make blockchain a ‘not-so-ideal’ solution to record tracking in a P2P network. It’s not as scalable and it’s quite slow in a busy network. Enter Apache Kafka, with its high performance and immutable logging. In this talk, I want to explore the relationship between Blockchain and Kafka and demonstrate how the two technologies can benefit from each other. If you’re interested in the future of blockchain and love Kafka, this is definitely up your alley.
This document discusses several Azure serverless services for building event-driven applications at massive scale including Event Hubs for high-volume data streams, Service Bus for critical workflows, Event Grid for business logic triggered by events, and IoT Hub. It highlights key capabilities like near real-time processing, high reliability, and massive throughput of these services.
Evolving the Engineering Culture to Manage Kafka as a Service | Kate Agnew, O... (Hosted by Confluent)
Embracing open source software for critical platform operations is a tough organizational evolution for a company of any size. This is particularly daunting for technology teams accustomed to a fully supported managed service. Come learn about how we are using OSS to modernize Health Care at UnitedHealth Group as a roadmap to adopt and offer OSS in your own organization!
Over the last three years, Kafka as a Service within UnitedHealth Group has gone from non-existent to being centrally managed and utilized by over 200 internal application teams as an essential component to our ecosystem. In this session, I will share how to tactically implement a Kafka as a Service platform offering within any organization with a very lean team and how to get broad adoption from engineers and leadership.
I'll discuss the engineering cultural changes needed, both on the DevOps team as well as more broadly, to adopt OSS. Spoiler: Documentation is the key to success. I will talk about some of our "aha" moments, including the importance of internal Terms of Service and how to encourage teams to "Google first." I will include things that haven't worked as well, such as requiring manual review of all topic creation PRs (this doesn't scale!).
Attendees will learn how to both stand up their own OSS offering as well as how to be a good internal consumer of other such offerings. Come ready to learn and laugh about my journey to offering OSS to thousands of people!
The document provides an agenda for the Government Track at the Kafka Summit 2021. The agenda includes sessions on topics like improving veteran benefit services through efficient data streaming, Kafka migration for satellite event streaming data, Kafka powered near real-time data pipelines at extreme scale, transformation during a global pandemic, securing the message bus with Kafka streams, Kafka for connected vehicle research, and driving a digital thread program in manufacturing with Apache Kafka. Speakers include representatives from Booz Allen Hamilton, ASRC Federal, University of California San Diego, Confluent, Raft LLC, Leidos, Ohio Department of Transportation, and Mercury Systems.
Event & Data Mesh as a Service: Industrializing Microservices in the Enterpri... (Hosted by Confluent)
Kafka is widely positioned as the proverbial "central nervous system" of the enterprise. In this session, we explore how the central nervous system can be used to build a mesh topology & unified catalog of enterprise wide events, enabling development teams to build event driven architectures faster & better.
The central theme of this topic is also aligned to seeking idioms from API Management, Service Meshes, Workflow management and Service orchestration. We compare how these approaches can be harmonized with Kafka.
We will also touch upon the topic of how this relates to Domain Driven Design, CQRS & other patterns in microservices.
Some potential takeaways for the discerning audience:
1. Opportunities in a platform approach to Event Driven Architecture in the enterprise
2. Adopting a product mindset around Data & Event Streams
3. Seeking harmony with allied enterprise applications
ACDKOCHI19 - Complete Media Content Management System and Website on Serverless (AWS User Group Kochi)
AWS Community Day Kochi 2019 - Technical Session
Complete Media Content Management System and Website on Serverless by Anoop Mohan, Associate Director of Technology at Asianet
Accelerating Innovation with Apache Kafka | Heikki Nousiainen (Hosted by Confluent)
As a pioneer in the interactive gaming industry, Sony PlayStation has played a vital role in bringing the global video gaming community together through technological advancement. With the recent launch of the next-generation PS5 console, built in partnership with thousands of game developers and serving millions of gamers across the globe, PlayStation servers inevitably generate enormous volumes of data. This presentation describes how we leveraged big data technologies along with Apache Kafka to solve real-time data analytics problems. Two recent case studies are covered: competitive pricing analysis of game titles across online video game marketplaces, and understanding gamer sentiment by streaming data from social feeds and applying NLP.
Along with Apache Kafka, the technologies used to architect the solution include REST APIs, ZooKeeper, D3.js visualization, Domo, Python, SQL, NLP, AWS Cloud, and JSON.
How to Define and Share your Event APIs using AsyncAPI and Event API Products... (Hosted by Confluent)
Defining asynchronous APIs and sharing them with your developer community is the most effective way for internal app developers and partners to create new services using real-time event streams. But how do you do it? What specification do you use to define the APIs? What are the best practices for sharing them with the developer community? What framework can you use to code? And what's next? How do you manage the lifecycle of these APIs? In this talk, Fran Mendez, founder of AsyncAPI, and Jonathan Schabowsky, CTO Architect at Solace, will introduce you to the AsyncAPI specification and show you two different methods to define and share your event APIs, quickly get up to speed, and more. You will learn how to create a Kafka application using asynchronous APIs in minutes!
How a Data Mesh is Driving our Platform | Trey Hicks, Gloo (Hosted by Confluent)
At Gloo.us, we face a challenge in providing platform data to heterogeneous applications in a way that eliminates access contention, avoids high latency ETLs, and ensures consistency for many teams. We're solving this problem by adopting Data Mesh principles and leveraging Kafka, Kafka Connect, and Kafka streams to build an event driven architecture to connect applications to the data they need. A domain driven design keeps the boundaries between specialized process domains and singularly focused data domains clear, distinct, and disciplined. Applying the principles of a Data Mesh, process domains assume the responsibility of transforming, enriching, or aggregating data rather than relying on these changes at the source of truth -- the data domains. Architecturally, we've broken centralized big data lakes into smaller data stores that can be consumed into storage managed by process domains.
This session covers how we’re applying Kafka tools to enable our data mesh architecture. This includes how we interpret and apply the data mesh paradigm, the role of Kafka as the backbone for a mesh of connectivity, the role of Kafka Connect to generate and consume data events, and the use of KSQL to perform minor transformations for consumers.
Simple infrastructure monitoring, from ingestion to visualization (Elasticsearch)
Visibility into your infrastructure is essential, whether it runs on your own machines or in the cloud, virtualized, in containers, or in a hybrid environment. The Elastic (ELK) Stack, historically known for its logging capabilities, can also monitor your metrics with the same performance. Discover how we simplify data ingestion with hundreds of prebuilt integrations, improve your day-to-day work with alerting and machine learning, and enhance your visualizations with new tools built for monitoring use cases.
SingleStore & Kafka: Better Together to Power Modern Real-Time Data Architect... (Hosted by Confluent)
To remain competitive, organizations need to democratize access to fast analytics, not only to gain real-time insights on their business but also to power smart apps that need to react in the moment. In this session, you will learn how Kafka and SingleStore enable a modern yet simple data architecture for analyzing both fast-paced incoming data and large historical datasets. In particular, you will understand why SingleStore is well suited to processing data streams coming from Kafka.
TBD Data Governance | David Araujo and Michael Agnich, Confluent (Hosted by Confluent)
The document discusses Confluent Stream Governance, a solution for governing data in motion with metadata. It introduces tools for managing schemas, classifying metadata, tracking lineage, and monitoring data quality. This helps bring order to what would otherwise be a "giant mess" of ungoverned data by enforcing standards and providing visibility into data flows and definitions.
Data Con LA 2019 - Large scale streaming analytics using cloud based managed ... (Data Con LA)
Ingest, store, analyze, and monitor a large volume of real-time streaming data using cloud-based managed services. The reference architecture showcases the following solution areas using managed/serverless technologies:
- Set up and manage multi-region infrastructure via CI/CD
- Ingest and store a large volume of streaming data (e.g., 100K records/second)
- Analyze and derive insights in near real time
- Monitor the infrastructure and pipelines
DataOps Automation for a Kafka Streaming Platform (Andrew Stevenson + Spiros ...) (Hosted by Confluent)
DataOps challenges us to build data experiences in a repeatable way. For those with Kafka, this means finding a means of deploying flows in an automated and consistent fashion.
The challenge is to make the deployment of Kafka flows consistent across different technologies and systems: the topics, the schemas, the monitoring rules, the credentials, the connectors, the stream processing apps. And ideally not coupled to a particular infrastructure stack.
In this talk we will discuss the different approaches to automating the deployment of Kafka flows, including Git operators and Kubernetes operators, and their benefits and disadvantages. We will walk through and demo deploying a flow on AWS EKS with MSK and Kafka Connect using GitOps practices, including a stream processing application and an S3 connector with credentials held in AWS Secrets Manager.
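The core of a GitOps operator for Kafka flows is a reconciliation loop: compare the desired state declared in Git against the cluster's current state and compute a plan. A minimal sketch of that diff step is below; the topic names and config shape are hypothetical, not taken from the talk.

```python
def plan_topic_changes(desired, current):
    """Compute the create/delete/update plan a GitOps operator would apply.

    `desired` is the state declared in Git, `current` is what the cluster
    reports; both map topic name -> config dict.
    """
    to_create = {t: cfg for t, cfg in desired.items() if t not in current}
    to_delete = [t for t in current if t not in desired]
    to_update = {t: cfg for t, cfg in desired.items()
                 if t in current and current[t] != cfg}
    return to_create, to_delete, to_update


if __name__ == "__main__":
    # Hypothetical example: "payments" is new, "legacy" was removed from
    # Git, and "orders" changed its partition count.
    desired = {"orders": {"partitions": 6}, "payments": {"partitions": 3}}
    current = {"orders": {"partitions": 3}, "legacy": {"partitions": 1}}
    print(plan_topic_changes(desired, current))
```

A real operator would apply this plan through the Kafka admin API and repeat on every Git commit, which is what keeps the deployment consistent and decoupled from any one infrastructure stack.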
Here are the slides from Farid Jiandani & Joe Onisick's PuppetConf 2016 presentation called PuppetConf 2016: Application Centric Automation with Puppet & Cisco. Watch the videos at https://www.youtube.com/playlist?list=PLV86BgbREluVjwwt-9UL8u2Uy8xnzpIqa
Rome 2017: Building advanced voice assistants and chat bots (Cisco DevNet)
If it takes minutes to code a simple bot, building professional bots represents quite a challenge. Soon you realize you need serious programming and API architecture experience but also “Bot” specific skills. In this session, we'll first show the code of advanced Chat and Voice interactions, and then explore the challenges faced when building advanced Bots (Context storage, NLP approaches, Bot Metadata, OAuth scopes), and discuss interesting opportunities from latest industry trends (Bot platforms, Serverless, Microservices). This talk is about showing the code and sharing lessons learned.
Coding 100 session that took place a week before the Coding Camp, Berlin event (13-14 Feb 2016), to teach people to code!
See http://hackathon.cisco.com/event/codingcamp-Berlin-2016 for the Coding Camp event
This document presents a guide for teachers on attention to diversity. It explains that attention to diversity arises from the need to offer an education adapted to the characteristics and needs of all students. It defines attention to diversity and describes factors such as diversity of interests, abilities, cognitive styles, needs, motivation, and culture. It recommends that teachers train in curricular adaptation, promote activities that allow students to express themselves, and embrace the relevant legislation.
The document describes the activities of a gnome named Pronto during his visit to the Bank of Powell. Pronto got acquainted with Betty, rode in the bank's drive-up tube, and greeted customers. He then went rock climbing, worked on his hunting skills and caught pheasants for dinner. Later, Pronto relaxed in the sun, took a motorcycle ride, and entered a fish catching contest at the bank before saying farewell on his last day.
DEVNET-1186 Harnessing the Power of the Cloud to Detect Advanced Threats: Cog... (Cisco DevNet)
This presentation starts by outlining key characteristics of advanced threats, helping to define these threats in an industry where they are most often associated with nation-state attacks. Smart malware, and recent examples of advanced threats such as Qakbot and Cryptolocker demonstrate the true nature of advanced threats as both persistent and subtle. New threats are also launched every day, requiring a security method designed to detect named and unnamed advanced threats that successfully penetrate the network. The presentation explains how CTA provides the visibility necessary to identify those infections. The explanation includes the history and technique of CTA in terms of telemetry and machine learning. The presentation also goes into depth on CTA's layered approach which combines anomaly detection, trust modeling, classification and entity modeling in an ensemble approach. The viewer will come away with an understanding of why CTA is a natural fit with AMP on CWS in the CWS Premium product offering. CWS Premium begins the customer's journey towards identifying zero day advanced threats in their network.
This document discusses the Radboud University Medical Center (Radboudumc) in Nijmegen, Netherlands. It provides details about:
1. Radboudumc's mission to have a significant impact on healthcare through personalized healthcare and the patient as partner approach.
2. The core activities of patient care, research, and education conducted by 11,000 colleagues across 52 departments and serving 3,300 students.
3. The 18 Technology Centers at Radboudumc that provide technological expertise and resources to approximately 1,600 internal and external users across 140 consortia working in areas like genomics, imaging, and clinical trials.
Are you afraid that without Matric you won't be able to study, get a qualification, and get ahead in life?
With an FET college like Skills Academy you don't have to worry about not having Matric. You can still study with us and get ahead in life.
View the presentation to find out more about how you can do this, and what life without Matric can be like if you have a college education.
The document proposes a magazine called "Evolution of DBZ" targeted at Dragon Ball Z fans. The core audience would be anyone interested in Dragon Ball Z willing to spend £1 on the magazine. A secondary audience would include those interested in learning about anime or curious about Dragon Ball Z. The front cover would appeal to audiences with colorful images of a DBZ character and the creator in a fighting pose, along with a martial arts symbol relating to DBZ and the creator. The main article would provide bright, colorful information about the newest Dragon Ball franchise iteration in an engaging style. Advertisements would use big fonts to promote related anime games and cinema systems. The total proposed annual budget is £240,000.
God spent extra time to carefully craft woman, as she had to meet many complex specifications - she had to be washable but not plastic, have over 200 movable parts, function on all foods, embrace and heal children with only two hands. An angel was impressed by her design but thought she seemed too fragile. However, God explained that though soft, woman is also strong and can endure much. He gave her the ability to think, reason, negotiate and express a range of emotions through tears. The angel was amazed at God's creation of such a marvelous being.
Contact To "Mumbai Academics" by mumbai.academics.blogspot.com
These topics are among the most popular final-year project topics of recent years. Choose an appropriate one for your project, and remember to align your aspirations with your project, since your first employer may view your project as a reflection of your interests.
Enabling SDN for Service Providers by Khay Kid Chow (MyNOG)
1. The document discusses how programmable networks and network functions virtualization (NFV) enable new use cases and business models for service providers by making networks software-defined and services elastic.
2. Key aspects covered include centralizing network control, virtualizing network functions, and using orchestration to dynamically provision and monitor virtualized services across compute and network infrastructure on demand.
3. The benefits highlighted are automating network operations, enabling new self-service capabilities, and decreasing time to revenue through agile service creation.
Colt is evolving its VPN portfolio towards a hybrid of MPLS and SD WAN to address customer challenges around needs for higher bandwidth, faster network delivery, and more network agility. It is initially focusing on an SD WAN minimum viable product to directly address new market demand. Longer term, Colt aims to develop a unified, automated on-demand platform powered by network virtualization, orchestration, and artificial intelligence to further optimize services.
V like Velocity, Predicting in Real-Time with Azure ML (Barbara Fusinska)
This document discusses using Azure Machine Learning and stream processing to enable predictive maintenance for aircraft engines. It describes a use case of predicting whether a device will fail within the next two weeks using real-time sensor data streams. It then outlines the challenges of stream processing and applying machine learning to streaming data. The proposed solution architecture involves using Event Hub for data ingestion, Stream Analytics for stream processing and aggregations, Machine Learning for model training and predictions, and DocumentDB for storing prediction results. It provides examples of the Stream Analytics and Machine Learning workflows used to enable predictive maintenance from real-time sensor data streams.
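The stream-side aggregation step described above (Stream Analytics computing windowed aggregates over sensor readings before they reach the ML model) can be sketched in plain Python as a tumbling-window average per device. The event shape and window size are assumptions for illustration, not details from the talk.

```python
from collections import defaultdict


def tumbling_window_averages(events, window_seconds):
    """Average sensor values per device over fixed, non-overlapping
    (tumbling) time windows, roughly what a Stream Analytics
    GROUP BY ... TumblingWindow query computes.

    `events` is an iterable of (timestamp_seconds, device_id, value).
    Returns {(window_start, device_id): average}.
    """
    sums = defaultdict(lambda: [0.0, 0])  # key -> [running sum, count]
    for ts, device, value in events:
        window_start = int(ts // window_seconds) * window_seconds
        bucket = sums[(window_start, device)]
        bucket[0] += value
        bucket[1] += 1
    return {key: total / count for key, (total, count) in sums.items()}
```

In the architecture described, these per-window aggregates (rather than raw readings) would be the features handed to the Machine Learning scoring step and persisted to DocumentDB.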
This document discusses security automation through SDN and NFV. It begins with an overview of security challenges from a service provider perspective, such as growing traffic and threats. It then discusses how SDN can automate and accelerate DDoS mitigation by redirecting traffic. The document outlines Cisco's Firepower 9300 platform for integrated security services and its use with Radware virtual DDoS protection. It also discusses how the Cisco Application Centric Infrastructure automates security policy and service chains in the data center.
Cloud Experience: Data-driven Applications Made Simple and Fast (Databricks)
Implementing a complex real-time data workflow is very challenging. This session describes the architecture of a data platform that provides a single, secure, high-performance system that can be deployed in hybrid cloud architectures. We will present how to support simultaneous, consistent, high-performance access through multiple open source and cloud-compatible standards: streaming, table, TSDB, object, and file APIs. A new serverless technology is also used in the architecture to support dynamic and flexible implementations. The presenter will also outline how the platform was integrated with the Spark ecosystem, including AI and ML tools, to simplify the development process.
Path to the Future #4 - Real-time data ingestion, processing, and analysis (Amazon Web Services LATAM)
In this session we will demonstrate air traffic control and defense using real-time processing.
We will cover best practices for ingesting, storing, processing, and visualizing data with AWS services such as Kinesis, DynamoDB, Lambda, Redshift, QuickSight, and Amazon Machine Learning.
With the advent of SDN driven network programmability and abstraction, IT operations management is poised for a transformation to higher levels of agility and automation.
The document lists over 850 project codes for various technology projects related to areas like cloud computing, big data, Android, machine learning, computer vision, and networking. Each project listing includes a project code, title, application area, and relevant technologies. The projects involve developing techniques for tasks such as automatically detecting errors in databases, analyzing marketing data, securing exam systems, generating cryptographic keys, and more. The document provides a high-level overview of the wide range of technical projects available.
Splunk App for Stream - Insights into Your Network Traffic (Georg Knon)
The document discusses the Splunk App for Stream, which enables real-time insights into private, public and hybrid cloud infrastructures by capturing and analyzing critical events from wire data not found in logs or with other collection methods. It provides an overview of the app, what's new, important features, architecture and deployment, customer success examples, and FAQs.
Combining Logs, Metrics, and Traces for Unified Observability (Elasticsearch)
Learn how Elasticsearch efficiently combines data in a single store and how Kibana is used to analyze it. Plus, see how recent developments help identify, troubleshoot, and resolve operational issues faster.
Introducing ONAP for OpenStack St Louis Meetup (djzook)
An introduction to the Open Networking Automation Platform (ONAP) a new Linux Foundation Project for SDN/NFV, as presented to the OpenStack St Louis Meetup on June 20, 2017
The document discusses reference architectures for building big data applications with Internet of Things (IoT) technologies. It describes an IoT reference architecture that includes components for device connectivity, data processing/analytics, and business connectivity. It provides examples of device types, connectivity options, and how to use Azure services for device identity/registry, stream processing, analytics, and presentation. Guiding principles are also outlined for building scalable, secure, and flexible IoT solutions.
Analytics for automating critical infrastructures (Adtran)
This document discusses using network analytics and machine learning to automate critical infrastructures. It outlines a path towards network automation using simplified service ordering, streamlined operations, and intuitive interfaces. Analytics can provide insights to optimize network performance through continuous data collection, analysis using AI models, and automated orchestration. Open challenges include high data requirements, multi-vendor data sharing, and integrating ML solutions into network operations. Case studies demonstrate ML-based optical network monitoring and a data-sovereign telemetry broker. Overall, network analytics and AI will enhance optical network control and automation.
QNAP Systems, Inc., headquartered in Taipei, Taiwan, provides a comprehensive range of cutting-edge Network-attached Storage (NAS) and video surveillance solutions based on the principles of usability, high security, and flexible scalability. QNAP offers quality NAS products for home and business users, providing solutions for storage, backup/snapshot, virtualization, teamwork, multimedia, and more. QNAP envisions NAS as being more than "simple storage", and has created many NAS-based innovations to encourage users to host and develop Internet of Things, artificial intelligence, and machine learning solutions on their QNAP NAS.
Similar to DEVNET-1159 Deep Dive with the Cisco WAN Automation Engine
Learn how and why John McDonough contributes to Ansible and how you can too. We’ll arm you with what you need to know, things like Python, Git, and YAML.
How to Build Advanced Voice Assistants and Chatbots (Cisco DevNet)
Learn more about the CodeMotion Voice Machine and Cisco DevNet Chatbot. Understand what a typical bot journey is and where to go to get more information about Cisco Spark and Tropo.
Cisco Spark and Tropo and the Programmable Web (Cisco DevNet)
This document discusses integration platforms as a service (iPaaS) and provides examples of how Cisco Spark, Tropo, and Webex can be integrated using iPaaS solutions. It outlines key iPaaS concepts, popular iPaaS solutions like IFTTT, Zapier and Built.io, and use cases for both consumers and enterprises. It also describes an anatomy of a potential iPaaS solution using Built.io and highlights opportunities to learn more through Cisco DevNet labs and sessions.
Device Programmability with Cisco Plug-n-Play Solution (Cisco DevNet)
Cisco Open Plug-n-Play solution allows customers to reduce the costs associated with deployment/installation of network devices, increase the speed and reduce the complexity of deployments without compromising the security. Using Cisco Plug-n-Play solution, customers can do Zero Touch Installs of Cisco gear in various deployment scenarios and deployment locations.
Watch the DevNet 2052 replay from the Cisco Live On-Demand Library at: https://www.ciscolive.com/online/connect/sessionDetail.ww?SESSION_ID=91108&backBtn=true
Check out more and register for Cisco DevNet: http://ow.ly/jCNV3030OfS
Building a WiFi Hotspot with NodeJS: Cisco Meraki - ExCap API (Cisco DevNet)
This document discusses building a WiFi hotspot using Node.js and the Cisco Meraki ExCap API. It describes using Node.js and Express to create web services that handle click-through, sign-on, and social login splash pages. Sessions are stored in MongoDB. Templates are rendered using Handlebars. The API provides parameters like login URLs and splash page URLs. Code examples show routing and passport authentication strategies for social logins.
Application Visibility and Experience through Flexible Netflow (Cisco DevNet)
The world of applications is changing rapidly in the enterprise, from the way applications are increasingly hosted in the cloud to the diverse nature of apps and the way they are consumed by many devices. The need for organizations and network administrators to focus on "Fast IT" and innovation in the enterprise is growing, which means spending less time on daily operations, maintenance, and troubleshooting and more time on delivering business value with newer services. Cisco AVC with its NBAR2 technology is designed to detect applications and measure application performance by measuring round-trip time, retransmission rates, jitter, delay, packet loss, MOS, URL statistics, and more. Those details are exported using Flexible NetFlow/IPFIX, so partners can leverage the data for application usage reporting, performance reporting, and troubleshooting application issues to deliver the best possible application experience.
Watch the DevNet 2047 replay from the Cisco Live On-Demand Library at: https://www.ciscolive.com/online/connect/sessionDetail.ww?SESSION_ID=92664&backBtn=true
The WAN Automation Engine (WAE) is a software platform that provides multivendor and multilayer visibility and analysis for service provider and large enterprise networks. It plays a critical role in answering key questions of network resource availability, and when appropriate can automate and simplify Traffic Engineering mechanisms such as RSVP-TE and Segment Routing. This session will focus on use-cases and APIs for developers.
Watch the DevNet 2035 replay from the Cisco Live On-Demand Library at: https://www.ciscolive.com/online/connect/sessionDetail.ww?SESSION_ID=92720&backBtn=true
Cisco's Open Device Programmability Strategy: Open Discussion (Cisco DevNet)
Cisco DNA is an open and extensible, software-driven architecture built on a set of design principles with the objective of providing:
- Insights & Actions to drive faster business innovation
- Automation & Assurance to lower IT costs and complexity while meeting business and user expectations
- Security & Compliance to reduce risk as the organization continues to expand and grow. The architecture extends to Cisco network elements.
This session will focus on the open, model-driven, programmable interfaces available across Cisco's network elements which enable you to leverage and extend your network through applications that directly access the routers and switches in your network.
Watch the DevNet 1028 replay from the Cisco Live On-Demand Library at: https://www.ciscolive.com/online/connect/sessionDetail.ww?SESSION_ID=91041&backBtn=true
Open Device Programmability: Hands-on Intro to RESTCONF (and a bit of NETCONF) (Cisco DevNet)
In this small-group, hands-on workshop session you'll learn how to write your first Python application that uses YANG, NETCONF, and RESTCONF to access operational and configuration data on a device.
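The first step in such a Python application is usually building the RESTCONF (RFC 8040) request for a YANG data resource. A minimal sketch is below; the device address, port, and YANG path are placeholder assumptions, and a real script would issue the GET with an HTTP client such as the requests library against a lab device.

```python
def restconf_url(host, yang_path, port=443):
    """Build a RESTCONF data-resource URL for a YANG path such as
    'ietf-interfaces:interfaces/interface=GigabitEthernet1'.

    '/restconf/data' is the data-resource root commonly exposed by
    devices; RFC 8040 allows discovering the root via
    /.well-known/host-meta.
    """
    return f"https://{host}:{port}/restconf/data/{yang_path}"


# Standard RESTCONF media type for YANG-modeled JSON payloads.
RESTCONF_HEADERS = {
    "Accept": "application/yang-data+json",
    "Content-Type": "application/yang-data+json",
}

if __name__ == "__main__":
    # Placeholder device address; a real script would now send
    # GET restconf_url(...) with RESTCONF_HEADERS and HTTP basic auth.
    print(restconf_url("10.0.0.1", "ietf-interfaces:interfaces"))
```

The same YANG paths work for both reads (GET) and configuration changes (PUT/PATCH), which is what makes the model-driven interface uniform.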
Watch the DevNet 2044 replay from the Cisco Live On-Demand Library at: https://www.ciscolive.com/online/connect/sessionDetail.ww?SESSION_ID=92725&backBtn=true
NETCONF & YANG Enablement of Network Devices (Cisco DevNet)
A technical discussion and a demo showing how Tail-f's ConfD management agent can be used to implement NETCONF and YANG, the industry-leading solution for providing a programmable management interface in a network element. ConfD is recognized as the best-in-breed embedded software for implementing management functions in network elements, including physical devices and virtualized network functions (VNF) for NFV.
This Workshop is a best fit for engineers who are involved in the design and development of embedded software for network devices. Attendees will gain a basic understanding of what NETCONF and YANG are and how ConfD provides a solution for embedding this technology in the network devices. More information about ConfD can be found at: https://developer.cisco.com/site/confD/
Watch the DevNet 1216 replay from the Cisco Live On-Demand Library at: https://www.ciscolive.com/online/connect/sessionDetail.ww?SESSION_ID=92703&backBtn=true
UCS Management APIs: A Technical Deep Dive (Cisco DevNet)
The document provides an overview and technical details of the UCS Management APIs:
- It discusses the structure, features, object model, and workflow of the UCS XML API. It also covers methods for sessions, queries, filters, and configurations.
- The API uses HTTP/HTTPS and XML, with role-based authentication and a published object model hierarchy. It supports transactions, high availability, and event subscriptions.
- Key methods and functionality covered include sessions, queries with filtering, resolving objects by DN/class/scope, configurations, and events/statistics. Understanding the low-level UCS API enables programmatic access to UCS environments.
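To make the shape of those methods concrete, here is a hedged sketch of building two of the XML API payloads in Python. The credentials, cookie, and class name are illustrative; consult the UCS XML API reference for the authoritative method set and attributes:

```python
import xml.etree.ElementTree as ET

def aaa_login(name, password):
    """Payload for the aaaLogin method, which opens a session and returns a cookie."""
    el = ET.Element("aaaLogin", inName=name, inPassword=password)
    return ET.tostring(el, encoding="unicode")

def config_resolve_class(cookie, class_id, hierarchical=False):
    """Payload for configResolveClass: query all objects of a class in the object model."""
    el = ET.Element(
        "configResolveClass",
        cookie=cookie,
        classId=class_id,
        inHierarchical="true" if hierarchical else "false",
    )
    return ET.tostring(el, encoding="unicode")

# Example: log in, then ask for every compute blade (cookie value is illustrative)
print(aaa_login("admin", "password"))
print(config_resolve_class("cookie-123", "computeBlade", hierarchical=True))
```

In a real session these XML documents would be POSTed over HTTPS to the UCS Manager endpoint, and the `outCookie` from the login response would be carried in every subsequent query.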
The DevOps model is rapidly transforming IT operations and development practices. But what are the precursors necessary to implement DevOps? To achieve an agile, virtualized, and highly automated IT environment, what technological requirements need to be in place? OpenStack has the potential to facilitate DevOps implementation and practices at several different layers in the data center. In this session we'll quickly discuss what DevOps is, then discuss many components that are logically required to move towards DevOps in your environment. Finally we'll explore in depth several ways OpenStack can provide these baseline components.
Watch the DevNet 1104 replay from the Cisco Live On-Demand Library at: https://www.ciscolive.com/online/connect/sessionDetail.ww?SESSION_ID=92695&backBtn=true
Check out more and register for Cisco DevNet: http://ow.ly/jCNV3030OfS
NetDevOps for the Network Dude: How to get started with API's, Ansible and Py... — Cisco DevNet
This document provides an agenda and overview for a presentation on network automation using APIs, Ansible, and Python. The presentation introduces network programmability and automation tools like Ansible, discusses using infrastructure as code approaches, and provides examples of automating network device configurations and modules using Python and Jinja templates. It aims to help network engineers get started with network automation.
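The template-driven configuration idea behind that workflow can be sketched with the standard library alone; the interface fields below are illustrative, and in a real setup Jinja2 (as used by Ansible) would replace `string.Template`:

```python
from string import Template

# Stand-in for a Jinja2 template: one interface stanza with fill-in-the-blank fields
IFACE_TEMPLATE = Template(
    "interface $name\n"
    " description $desc\n"
    " ip address $ip $mask\n"
)

def render_configs(interfaces):
    """Render one config stanza per interface dict (keys: name, desc, ip, mask)."""
    return [IFACE_TEMPLATE.substitute(i) for i in interfaces]

configs = render_configs([
    {"name": "Gi0/1", "desc": "uplink", "ip": "192.0.2.1", "mask": "255.255.255.0"},
])
print(configs[0])
```

The point of the pattern is that device-specific values live in structured data (YAML inventory, in Ansible's case) while the configuration syntax lives once in the template.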
The document outlines an agenda for a presentation on developing Tropo applications. The presentation covers topics like making incoming and outgoing calls, text messaging, call control features, and advanced speech concepts. Sample code is provided for different programming languages.
The document describes a Cisco Spark & Tropo API workshop that covers setting up a quiz application using the Cisco Spark and Tropo APIs. The workshop includes touring a demo quiz app, setting up an interactive voice response system with Tropo, adding a SMS bridge to onboard participants to a Cisco Spark room, and connecting an interactive assistant bot to a Spark room. Hands-on exercises guide attendees on configuring the various components.
Coding 102: REST API Basics Using Spark — Cisco DevNet
This document provides an overview and agenda for a workshop on REST API basics using the Cisco Spark API. The agenda includes an introduction to REST APIs and what makes them useful, a tour of the Cisco Spark API and its endpoints, and hands-on exercises for interacting with the Cisco Spark API using Postman and JavaScript examples. Attendees will learn how to retrieve room and membership data, add messages to rooms, and call API functions from JavaScript code. The workshop aims to help developers get started using the Cisco Spark API and provides resources for continuing their education on API design and development.
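A minimal Python counterpart to the workshop's Postman and JavaScript exercises might look like the sketch below. The token and room ID are placeholders, and the `rooms`/`messages` resources follow the public Spark API paths of that era:

```python
import json
import urllib.request

SPARK_BASE = "https://api.ciscospark.com/v1"

def spark_request(token, resource, payload=None):
    """Build an authenticated Spark API request; GET when payload is None, else POST."""
    data = json.dumps(payload).encode() if payload is not None else None
    return urllib.request.Request(
        f"{SPARK_BASE}/{resource}",
        data=data,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )

# List rooms (GET), then post a message to one room (POST); token/roomId are placeholders
list_rooms = spark_request("YOUR_TOKEN", "rooms")
post_msg = spark_request("YOUR_TOKEN", "messages",
                         {"roomId": "ROOM_ID", "text": "Hello from the API!"})
print(list_rooms.get_method(), post_msg.get_method())
```

With a valid token, `urllib.request.urlopen` on either request would return JSON listing rooms or echoing the created message.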
Cisco APIs: An Interactive Assistant for the Web2Day Developer Conference — Cisco DevNet
Stève Sfartz is an API evangelist at Cisco who presented on Cisco APIs and leveraging them through examples. The presentation covered Cisco technologies like Connected Mobile Experience (CMX), Mobility IQ, and Cisco Spark which have REST APIs that can be used to access location data, analytics, and collaboration features. It encouraged developers to join the Cisco DevNet community to learn about APIs, take labs, and interact with other developers.
DevNet Express - Spark & Tropo API - Lisbon May 2016 — Cisco DevNet
Direct from the Cisco DevNet Lisbon Portugal Express event in May 2016. Learn about Cisco DevNet, Spark and Tropo APIs and why there's never been a better time to innovate with Cisco.
Direct from DevNet@TAG in Milan and Rome in May 2016! Learn about Cisco DevNet, Spark and Tropo APIs and why there's never been a better time to innovate with Cisco.
Choosing PaaS: Cisco and Open Source Options: an overview — Cisco DevNet
This document discusses container platforms and PaaS. It provides context on containers and supporting technologies like Docker. It describes how containers are limited when confined to a single host, and how schedulers can distribute containers across multiple hosts. It outlines common production tools used with containers like configuration management, monitoring, and logging. It compares PaaS and containers, noting how PaaS consumed containers before they were widely known, and how the lines between the two are blurring as container platforms provide more services. It introduces Mantl as Cisco's container stack designed to run container workloads and big data applications across clouds.
Salesforce Integration for Bonterra Impact Management (fka Social Solutions A... — Jeffrey Haguewood
Sidekick Solutions uses Bonterra Impact Management (fka Social Solutions Apricot) and automation solutions to integrate data for business workflows.
We believe integration and automation are essential to user experience and the promise of efficient work through technology. Automation is the critical ingredient to realizing that full vision. We develop integration products and services for Bonterra Case Management software to support the deployment of automations for a variety of use cases.
This video focuses on integration of Salesforce with Bonterra Impact Management.
Interested in deploying an integration with Salesforce for Bonterra Impact Management? Contact us at sales@sidekicksolutionsllc.com to discuss next steps.
In the rapidly evolving landscape of technologies, XML continues to play a vital role in structuring, storing, and transporting data across diverse systems. The recent advancements in artificial intelligence (AI) present new methodologies for enhancing XML development workflows, introducing efficiency, automation, and intelligent capabilities. This presentation will outline the scope and perspective of utilizing AI in XML development. The potential benefits and the possible pitfalls will be highlighted, providing a balanced view of the subject.
We will explore the capabilities of AI in understanding XML markup languages and autonomously creating structured XML content. Additionally, we will examine the capacity of AI to enrich plain text with appropriate XML markup. Practical examples and methodological guidelines will be provided to elucidate how AI can be effectively prompted to interpret and generate accurate XML markup.
Further emphasis will be placed on the role of AI in developing XSLT, or schemas such as XSD and Schematron. We will address the techniques and strategies adopted to create prompts for generating code, explaining code, or refactoring the code, and the results achieved.
The discussion will extend to how AI can be used to transform XML content. In particular, the focus will be on the use of AI XPath extension functions in XSLT, Schematron, Schematron Quick Fixes, or for XML content refactoring.
The presentation aims to deliver a comprehensive overview of AI usage in XML development, providing attendees with the necessary knowledge to make informed decisions. Whether you’re at the early stages of adopting AI or considering integrating it in advanced XML development, this presentation will cover all levels of expertise.
By highlighting the potential advantages and challenges of integrating AI with XML development tools and languages, the presentation seeks to inspire thoughtful conversation around the future of XML development. We’ll not only delve into the technical aspects of AI-powered XML development but also discuss practical implications and possible future directions.
Building Production Ready Search Pipelines with Spark and Milvus — Zilliz
Spark is a widely used ETL tool for processing, indexing, and ingesting data into the serving stack for search. Milvus is a production-ready open-source vector database. In this talk we will show how to use Spark to process unstructured data, extract vector representations, and push the vectors to the Milvus vector database for search serving.
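The retrieval step this pipeline feeds can be illustrated without Spark or Milvus at all: brute-force cosine similarity over a handful of vectors, which is what a vector database does efficiently at scale. The embeddings below are toy values, not real model output:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def search(query, index, top_k=2):
    """Return the top_k (id, score) pairs, as a vector database search would."""
    scored = [(doc_id, cosine(query, vec)) for doc_id, vec in index.items()]
    return sorted(scored, key=lambda p: p[1], reverse=True)[:top_k]

# Toy index of three "document embeddings"
index = {"doc_a": [1.0, 0.0], "doc_b": [0.7, 0.7], "doc_c": [0.0, 1.0]}
print(search([1.0, 0.1], index))
```

In the production pipeline, Spark computes the embeddings in bulk and Milvus replaces the linear scan with an approximate nearest-neighbor index.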
Fueling AI with Great Data with Airbyte Webinar — Zilliz
This talk will focus on how to collect data from a variety of sources, leverage this data for RAG and other GenAI use cases, and finally chart your course to production.
For the full video of this presentation, please visit: https://www.edge-ai-vision.com/2024/06/building-and-scaling-ai-applications-with-the-nx-ai-manager-a-presentation-from-network-optix/
Robin van Emden, Senior Director of Data Science at Network Optix, presents the “Building and Scaling AI Applications with the Nx AI Manager,” tutorial at the May 2024 Embedded Vision Summit.
In this presentation, van Emden covers the basics of scaling edge AI solutions using the Nx tool kit. He emphasizes the process of developing AI models and deploying them globally. He also showcases the conversion of AI models and the creation of effective edge AI pipelines, with a focus on pre-processing, model conversion, selecting the appropriate inference engine for the target hardware, and post-processing.
Van Emden shows how Nx can simplify the developer's life and facilitate a rapid transition from concept to production-ready applications. He provides valuable insights into developing scalable and efficient edge AI solutions, with a strong focus on practical implementation.
GraphRAG for Life Science to increase LLM accuracy — Tomaz Bratanic
GraphRAG for the life science domain, where you retrieve information from biomedical knowledge graphs using LLMs to increase the accuracy and performance of generated answers.
HCL Notes and Domino License Cost Reduction in the World of DLAU — panagenda
Webinar Recording: https://www.panagenda.com/webinars/hcl-notes-und-domino-lizenzkostenreduzierung-in-der-welt-von-dlau/
DLAU and licensing under the CCB and CCX model have been a hot topic for many in the HCL community since last year. As a Notes or Domino customer, you may be struggling with unexpectedly high user counts and license fees. You may be wondering how this new kind of licensing works and what benefits it brings you. Above all, you surely want to stay within budget and save costs wherever possible. We understand that, and we want to help!
We explain how to resolve common configuration problems that can cause more users to be counted than necessary, and how to identify and remove superfluous or unused accounts to save money. There are also some practices that can lead to unnecessary spending, e.g. using a person document instead of a mail-in for shared mailboxes. We show you such cases and their solutions. And of course we explain the new license model.
Join this webinar, in which HCL Ambassador Marc Thomas and guest speaker Franz Walder introduce you to this new world. It gives you the tools and know-how to keep an overview. You will be able to reduce your costs through an optimized Domino configuration and keep them low in the future.
These topics will be covered
- Reducing license costs by finding and fixing misconfigurations and superfluous accounts
- How do CCB and CCX licenses really work?
- Understanding the DLAU tool and how best to use it
- Tips for common problem areas, such as team mailboxes, functional/test users, etc.
- Practical examples and best practices to implement right away
Generating privacy-protected synthetic data using Secludy and Milvus — Zilliz
During this demo, the founders of Secludy will demonstrate how their system utilizes Milvus to store and manipulate embeddings for generating privacy-protected synthetic data. Their approach not only maintains the confidentiality of the original data but also enhances the utility and scalability of LLMs under privacy constraints. Attendees, including machine learning engineers, data scientists, and data managers, will witness first-hand how Secludy's integration with Milvus empowers organizations to harness the power of LLMs securely and efficiently.
Taking AI to the Next Level in Manufacturing — ssuserfac0301
Read Taking AI to the Next Level in Manufacturing to gain insights on AI adoption in the manufacturing industry, such as:
1. How quickly AI is being implemented in manufacturing.
2. Which barriers stand in the way of AI adoption.
3. How data quality and governance form the backbone of AI.
4. Organizational processes and structures that may inhibit effective AI adoption.
5. Ideas and approaches to help build your organization's AI strategy.
UiPath Test Automation using UiPath Test Suite series, part 6 — DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 6. In this session, we will cover Test Automation with generative AI and Open AI.
The UiPath Test Automation with generative AI and OpenAI webinar offers an in-depth exploration of leveraging cutting-edge technologies for test automation within the UiPath platform. Attendees will delve into the integration of generative AI as a test automation solution with OpenAI's advanced natural language processing capabilities.
Throughout the session, participants will discover how this synergy empowers testers to automate repetitive tasks, enhance testing accuracy, and expedite the software testing life cycle. Topics covered include the seamless integration process, practical use cases, and the benefits of harnessing AI-driven automation for UiPath testing initiatives. By attending this webinar, testers, and automation professionals can gain valuable insights into harnessing the power of AI to optimize their test automation workflows within the UiPath ecosystem, ultimately driving efficiency and quality in software development processes.
What will you get from this session?
1. Insights into integrating generative AI.
2. Understanding how this integration enhances test automation within the UiPath platform
3. Practical demonstrations
4. Exploration of real-world use cases illustrating the benefits of AI-driven test automation for UiPath
Topics covered:
- What is generative AI
- Test automation with generative AI and OpenAI
- UiPath integration with generative AI
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
TrustArc Webinar - 2024 Global Privacy Survey — TrustArc
How does your privacy program stack up against your peers? What challenges are privacy teams tackling and prioritizing in 2024?
In the fifth annual Global Privacy Benchmarks Survey, we asked over 1,800 global privacy professionals and business executives to share their perspectives on the current state of privacy inside and outside of their organizations. This year’s report focused on emerging areas of importance for privacy and compliance professionals, including considerations and implications of Artificial Intelligence (AI) technologies, building brand trust, and different approaches for achieving higher privacy competence scores.
See how organizational priorities and strategic approaches to data security and privacy are evolving around the globe.
This webinar will review:
- The top 10 privacy insights from the fifth annual Global Privacy Benchmarks Survey
- The top challenges for privacy leaders, practitioners, and organizations in 2024
- Key themes to consider in developing and maintaining your privacy program
HCL Notes and Domino License Cost Reduction in the World of DLAU — panagenda
Webinar Recording: https://www.panagenda.com/webinars/hcl-notes-and-domino-license-cost-reduction-in-the-world-of-dlau/
The introduction of DLAU and the CCB & CCX licensing model caused quite a stir in the HCL community. As a Notes and Domino customer, you may have faced challenges with unexpected user counts and license costs. You probably have questions on how this new licensing approach works and how to benefit from it. Most importantly, you likely have budget constraints and want to save money where possible. Don’t worry, we can help with all of this!
We’ll show you how to fix common misconfigurations that cause higher-than-expected user counts, and how to identify accounts which you can deactivate to save money. There are also frequent patterns that can cause unnecessary cost, like using a person document instead of a mail-in for shared mailboxes. We’ll provide examples and solutions for those as well. And naturally we’ll explain the new licensing model.
Join HCL Ambassador Marc Thomas in this webinar with a special guest appearance from Franz Walder. It will give you the tools and know-how to stay on top of what is going on with Domino licensing. You will be able to lower your costs through an optimized configuration and keep them low going forward.
These topics will be covered
- Reducing license cost by finding and fixing misconfigurations and superfluous accounts
- How do CCB and CCX licenses really work?
- Understanding the DLAU tool and how to best utilize it
- Tips for common problem areas, like team mailboxes, functional/test users, etc
- Practical examples and best practices to implement right away
Programming Foundation Models with DSPy - Meetup Slides — Zilliz
Prompting language models is hard, while programming language models is easy. In this talk, I will discuss the state-of-the-art framework DSPy for programming foundation models with its powerful optimizers and runtime constraint system.
DEVNET-1159 Deep Dive with the Cisco WAN Automation Engine
1. Deep-Dive with the Cisco WAN Automation Engine
   Josh Peters (joshpete@cisco.com), Derek Tay (derekta@cisco.com)
   Cisco Live 2015 – San Diego | DevNet
2. WAN Automation Engine: Delivering Optimization and Automation
   - Modeling: what-if/predictive analysis, global optimization
   - Assess historical and real-time data: find and manage hot spots, network efficiency analysis
   - Programmatic network control: extensible, open data models; real-time traffic balancing; intelligent bandwidth scheduling; automated service delivery
   (diagram: the WAE cycle)
3. WAN Automation Engine: WAN Automation Software Suite
   - Design and Network Planning: network planning, optimization, failure analysis
   - Visualization, Analytics, BI, Inventory: weather map, business intelligence, network inventory
   - Service, Network, and Analytics REST APIs
   - Optimization and Prediction: current model and new model, collector and deployer, analytics and calendaring
   - Collection drivers: SNMP, CLI, NetFlow, NMS/EMS, BGP-LS, PCEP, OSC, NETCONF/YANG, ...
4. WAN Automation Applications
   - Optimize bandwidth placement: Bandwidth Calendaring, Bandwidth on Demand
   - Automated tunnel creation and traffic load management: Tunnel Builder, Tunnel Splitter, Tunnel Balancer
   - Managing resource inventory, security, and maintenance: Inventory, Maintenance Window Scheduler, Network ACL Manager
   - Offline planning, design, and analysis: Offline Planning, IGP Convergence Analyzer, Failure Analysis
   - Online visualization, analytics, and business intelligence: Weather Map, BGP Route Visualizer, Business Intelligence
   - Extensible application integration: Application Latency Routing, Segment Routing Optimizer
5. WAE Use Cases
   - Unified multi-layer: global network view, optimization across layers; futures: add OTN to activation, planning, and optimization
   - Segment routing: built for SDN, the foundation for Application Engineered Routing; applications will have the ability to direct network behavior
   - WAE applications: Coordinated Maintenance, Bandwidth Calendaring
   (diagram: traffic between Data Centre A and Data Centre B)
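The slides note that the suite's planning and optimization functions are reached through its Service, Network, and Analytics REST APIs. As a rough sketch of what such a client might look like in Python — the host name and the `networks/current-model` resource path below are hypothetical placeholders, not documented WAE endpoints:

```python
import urllib.request

def wae_request(host, resource, token=None):
    """Build a GET request for a WAE-style REST resource (path layout is hypothetical)."""
    headers = {"Accept": "application/json"}
    if token:
        headers["Authorization"] = f"Bearer {token}"  # auth scheme is also an assumption
    return urllib.request.Request(f"https://{host}/api/v1/{resource}", headers=headers)

# e.g. fetch the current network model (endpoint name is an assumption)
req = wae_request("wae.example.com", "networks/current-model")
print(req.full_url)
```

The real resource names and authentication mechanism should be taken from the WAE API documentation; the sketch only shows the general shape of an HTTP client against such an API.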