This document provides an agenda and overview for a Sumo Logic webinar training session. The agenda includes sections on data collection, search and analysis, and visualizing and monitoring. It discusses Sumo Logic's analytics platform and data flow. It also provides instructions for logging into a training environment and demonstrates examples of searching log data and creating dashboards and alerts.
Designed for Sumo Administrators, this course shows you how to set up your data collection according to your organization’s data sources. Best practices around deployment options ensure you choose a deployment that scales as your organization grows. Because metadata is so important to a healthy environment, learn how to design and set up a naming convention that works best for your teams. Use Chef, Puppet or the likes? Learn how to automate your deployment. Test your deployment with simple searches, and learn to take advantage of optimization tools that can help you stay on top of your deployment.
Installation of Grafana on Linux; connecting Grafana to a Prometheus data source; installation of Prometheus, node_exporter, and a Tomcat exporter; installation and configuration of Alertmanager. A detailed, step-by-step walkthrough of the installation and how it all works.
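The stack described above can be wired together with a minimal prometheus.yml sketch. The node_exporter and Alertmanager ports below are the defaults; the Tomcat exporter port is a stand-in and depends on which exporter you install:

```yaml
# prometheus.yml - scrape the two exporters and route alerts to Alertmanager
global:
  scrape_interval: 15s

scrape_configs:
  - job_name: "node"
    static_configs:
      - targets: ["localhost:9100"]   # node_exporter default port
  - job_name: "tomcat"
    static_configs:
      - targets: ["localhost:9404"]   # hypothetical Tomcat/JMX exporter port

alerting:
  alertmanagers:
    - static_configs:
        - targets: ["localhost:9093"] # Alertmanager default port
```

After editing the file, restart Prometheus and check the Targets page to confirm both exporters are up before pointing Grafana at Prometheus.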
Presentation for Pervasive Systems class lectured by prof. Ioannis Chatzigiannakis, a.y. 2015-16, about the No-SQL database InfluxDB. The course is intended for students of MS in Engineering in Computer Science at Sapienza - University of Rome.
The complete code for the demo is available on Github:
https://github.com/RobGaud/PervasiveSystemsPersonal
You can also find me on LinkedIn:
https://www.linkedin.com/in/roberto-gaudenzi-4b0422116
Brand new to Sumo Logic? Get started with these 5 easy steps and get certified!
Learn the basics for how to search, parse and analyze the logs and metrics that are important to your organization. This session will guide you through running searches, simple parsing and basic analytics on your data. Learn how to convert your queries to charts and add them to Dashboards to help you visualize trends and easily identify anomalies. Lastly, learn how Alerts can help you stay on top of your critical events.
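The search, parse, and analyze flow described here can be sketched outside of any product with plain Python over hypothetical access-log lines; the log format and field names are illustrative, not Sumo Logic query syntax:

```python
import re
from collections import Counter

# Hypothetical Apache-style access log lines standing in for collected log data.
LOGS = [
    '10.0.0.1 - - [12/Jan/2016:10:00:00 +0000] "GET /home HTTP/1.1" 200 512',
    '10.0.0.2 - - [12/Jan/2016:10:00:01 +0000] "GET /login HTTP/1.1" 500 128',
    '10.0.0.1 - - [12/Jan/2016:10:00:02 +0000] "POST /login HTTP/1.1" 200 256',
]

# "Parse": extract method, path, and status code from each line.
PATTERN = re.compile(r'"(?P<method>\w+) (?P<path>\S+) [^"]+" (?P<status>\d{3})')

def count_by_status(lines):
    """Basic analytics: count requests per HTTP status code."""
    counts = Counter()
    for line in lines:
        m = PATTERN.search(line)
        if m:
            counts[m.group("status")] += 1
    return dict(counts)
```

The same three steps (filter, parse fields, aggregate) are what a log search query performs server-side before the results are charted on a dashboard.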
Prometheus: What it is, what is new, what is coming (Julien Pivotto)
Prometheus is a metrics-based monitoring and alerting system and also the project with the second longest tenure within the CNCF. As such you have probably heard about it by now. We will give you a short introduction to Prometheus, what it is and why it was such a big deal when it was initially released. In all those years since then, the project has only gained speed, which provides us with the opportunity to tell you about all the exciting new features that have just been released or are in the pipeline, including opportunities to contribute to the project and its wider ecosystem.
Talk at kubecon 2021
Presenter: Kenn Knowles, Software Engineer, Google & Apache Beam (incubating) PPMC member
Apache Beam (incubating) is a programming model and library for unified batch & streaming big data processing. This talk will cover the Beam programming model broadly, including its origin story and vision for the future. We will dig into how Beam separates concerns for authors of streaming data processing pipelines, isolating what you want to compute from where your data is distributed in time and when you want to produce output. Time permitting, we might dive deeper into what goes into building a Beam runner, for example atop Apache Apex.
In this training webinar, we will walk you through the basics of InfluxDB – the purpose-built time series database. InfluxDB has everything you need from a time series platform in a single binary – a multi-tenanted time series database, UI and dashboarding tools, background processing and monitoring agent. This one-hour session will include the training and time for live Q&A.
What you will learn
Core concepts of time series databases
An overview of the InfluxDB platform
How to ingest and query data in InfluxDB
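Ingest into InfluxDB is built on its line protocol (measurement, tag set, field set, timestamp). As a minimal sketch, here is a helper that renders one point; the measurement and tag names are illustrative:

```python
def to_line_protocol(measurement, tags, fields, timestamp_ns):
    """Render one point in InfluxDB line protocol:
    measurement,tag=value field=value timestamp"""
    tag_part = ",".join(f"{k}={v}" for k, v in sorted(tags.items()))
    field_part = ",".join(
        f'{k}="{v}"' if isinstance(v, str) else f"{k}={v}"
        for k, v in sorted(fields.items())
    )
    return f"{measurement},{tag_part} {field_part} {timestamp_ns}"
```

A real client library builds these lines for you; writing one by hand mainly helps when reasoning about tag cardinality versus field values.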
CNIT 126: 10: Kernel Debugging with WinDbg (Sam Bowne)
Slides for a college course at City College San Francisco. Based on "Practical Malware Analysis: The Hands-On Guide to Dissecting Malicious Software", by Michael Sikorski and Andrew Honig; ISBN-10: 1593272901.
Instructor: Sam Bowne
Class website: https://samsclass.info/126/126_F19.shtml
Fundamentals of Stream Processing with Apache Beam (Tyler Akidau, Frances Perry, Confluent)
Apache Beam (unified Batch and strEAM processing!) is a new Apache incubator project. Originally based on years of experience developing Big Data infrastructure within Google (such as MapReduce, FlumeJava, and MillWheel), it has now been donated to the OSS community at large.
Come learn about the fundamentals of out-of-order stream processing, and how Beam’s powerful tools for reasoning about time greatly simplify this complex task. Beam provides a model that allows developers to focus on the four important questions that must be answered by any stream processing pipeline:
What results are being calculated?
Where in event time are they calculated?
When in processing time are they materialized?
How do refinements of results relate?
Furthermore, by cleanly separating these questions from runtime characteristics, Beam programs become portable across multiple runtime environments, both proprietary (e.g., Google Cloud Dataflow) and open-source (e.g., Flink, Spark, et al).
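As a toy illustration (not Beam code) of the first two questions - what is computed and where in event time - here is a fixed-window count in plain Python; Beam's actual model adds triggers for the "when" and accumulation modes for the "how":

```python
from collections import defaultdict

def fixed_window_counts(events, window_size):
    """Assign each event to the fixed window containing its event-time
    timestamp, then aggregate (a count) per window. Arrival order does
    not matter, which is the point of event-time windowing."""
    windows = defaultdict(int)
    for event_time, _value in events:
        window_start = (event_time // window_size) * window_size
        windows[window_start] += 1
    return dict(windows)

# Out-of-order events as (event_time, value) pairs.
events = [(1, "a"), (12, "b"), (3, "c"), (11, "d")]
```

With a window size of 10, events at times 1 and 3 land in window 0 and events at 11 and 12 in window 10, regardless of the order they arrived in.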
Apache Spark Listeners: A Crash Course in Fast, Easy Monitoring (Databricks)
The Spark Listener interface provides a fast, simple and efficient route to monitoring and observing your Spark application - and you can start using it in minutes. In this talk, we'll introduce the Spark Listener interfaces available in core and streaming applications, and show a few ways in which they've changed our world for the better at SpotX. If you're looking for a "Eureka!" moment in monitoring or tracking of your Spark apps, look no further than Spark Listeners and this talk!
I did this presentation for one of my java user groups at work.
Basically, this is a mashed up version of various presentations, slides and images that I gathered over the internet.
I've quoted the sources in the end. Feel free to reuse it as you like.
Dynamically Scaling Data Streams across Multiple Kafka Clusters with Zero Fli... (Flink Forward)
Flink Forward San Francisco 2022.
Flink consumers read from Kafka as a scalable, high throughput, and low latency data source. However, there are challenges in scaling out data streams where migration and multiple Kafka clusters are required. Thus, we introduced a new Kafka source to read sharded data across multiple Kafka clusters in a way that conforms well with elastic, dynamic, and reliable infrastructure. In this presentation, we will present the source design and how the solution increases application availability while reducing maintenance toil. Furthermore, we will describe how we extended the existing KafkaSource to provide mechanisms to read logical streams located on multiple clusters, to dynamically adapt to infrastructure changes, and to perform transparent cluster migrations and failover.
by Mason Chen
Building a Streaming Microservice Architecture: with Apache Spark Structured ... (Databricks)
As we continue to push the boundaries of what is possible with respect to pipeline throughput and data serving tiers, new methodologies and techniques continue to emerge to handle larger and larger workloads.
Presentation at Strata Data Conference 2018, New York
The controller is the brain of Apache Kafka. A big part of what the controller does is to maintain the consistency of the replicas and determine which replica can be used to serve the clients, especially during individual broker failure.
Jun Rao outlines the main data flow in the controller—in particular, when a broker fails, how the controller automatically promotes another replica as the leader to serve the clients, and when a broker is started, how the controller resumes the replication pipeline in the restarted broker.
Jun then describes recent improvements to the controller that allow it to handle certain edge cases correctly and increase its performance, which allows for more partitions in a Kafka cluster.
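The leader-election step the talk describes can be sketched as a toy model: when the current leader fails, the controller promotes a surviving in-sync replica (ISR). This is a simplification of the real controller logic, which also handles epochs, fencing, and unclean-election policy:

```python
def elect_leader(replicas, isr, failed_broker):
    """Toy sketch of controller failover: drop the failed broker from
    the ISR and promote the first surviving in-sync replica as leader."""
    new_isr = [b for b in isr if b != failed_broker]
    if not new_isr:
        return None, new_isr  # no live in-sync replica; partition offline
    return new_isr[0], new_isr

# Partition replicated on brokers 1, 2, 3; all in sync; leader (broker 1) fails.
leader, isr = elect_leader([1, 2, 3], [1, 2, 3], failed_broker=1)
```

The empty-ISR branch is why an all-replica failure leaves the partition unavailable until a replica comes back.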
Building Cloud-Native App Series - Part 2 of 11
Microservices Architecture Series
Event Sourcing & CQRS,
Kafka, Rabbit MQ
Case Studies (E-Commerce App, Movie Streaming, Ticket Booking, Restaurant, Hospital Management)
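Event sourcing, one of the patterns listed above, can be shown in miniature with the ticket-booking case study: state is never stored directly, only derived by folding the event log. The event names here are hypothetical:

```python
def apply(state, event):
    """Fold one event into the current state."""
    kind, payload = event
    if kind == "TicketBooked":
        return {**state, "booked": state.get("booked", 0) + payload}
    if kind == "TicketCancelled":
        return {**state, "booked": state.get("booked", 0) - payload}
    return state

def replay(events):
    """Rebuild state from scratch by replaying the full event log."""
    state = {}
    for event in events:
        state = apply(state, event)
    return state

log = [("TicketBooked", 2), ("TicketBooked", 1), ("TicketCancelled", 1)]
```

In a CQRS split, `replay` would feed a read model, while commands append new events to the log (typically via Kafka or RabbitMQ, as in the series).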
Secret Management with HashiCorp's Vault (AWS Germany)
When running a Kubernetes Cluster in AWS there are secrets like AWS and Kubernetes credentials, access information for databases or integration with the company LDAP that need to be stored and managed.
HashiCorp’s Vault secures, stores, and controls access to tokens, passwords, certificates, API keys, and other secrets. It handles leasing, key revocation, key rolling, and auditing.
This talk will give an overview of secret management in general and Vault’s concepts. The talk will explain how to make use of Vault’s extensive feature set and show patterns that implement integration between Kubernetes applications and Vault.
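The access-control side of Vault is driven by policies. A minimal policy sketch granting a Kubernetes application read-only access to its own secrets might look like this (the mount point and path are illustrative):

```hcl
# Read-only access to the app's secrets in a KV v2 engine mounted at "secret/"
path "secret/data/myapp/*" {
  capabilities = ["read"]
}
```

A policy like this would be attached to the role the application authenticates as, so each workload sees only its own slice of the secret tree.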
Kafka is the most popular messaging queue.
Key Areas:
What is a Messaging Queue?
Why a Messaging Queue?
Kafka: basic terminology
Kafka: architecture (message flow)
AWS SQS vs. Apache Kafka
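The message flow covered above can be sketched in a few lines: a producer maps a record key to a partition, and the broker appends the record at the next offset in that partition. This is a toy model; real Kafka uses murmur2 hashing and persistent segment files:

```python
import zlib

def partition_for(key, num_partitions):
    """Map a record key to a partition. Kafka uses murmur2; crc32 here
    just keeps the sketch deterministic. Equal keys share a partition,
    which is what preserves per-key ordering."""
    return zlib.crc32(key.encode()) % num_partitions

def append(partitions, key, value, num_partitions=3):
    """Append a record to its partition and return (partition, offset),
    mimicking how a broker assigns monotonically increasing offsets."""
    p = partition_for(key, num_partitions)
    partitions.setdefault(p, []).append(value)
    return p, len(partitions[p]) - 1
```

Consumers then read each partition in offset order, which is why ordering guarantees in Kafka are per-partition, not global.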
Designed for Administrators, this course shows you how to set up your data collection according to your organization’s data sources. Best practices around deployment options ensure you choose a deployment that scales as your organization grows. Because metadata is so important to a healthy environment, learn how to design and set up a naming convention that works best for your teams. Use Chef, Puppet or the likes? Learn how to automate your deployment. Test your deployment with simple searches, and learn to take advantage of optimization tools that can help you stay on top of your deployment.
Brand new to Sumo Logic? Learn how to get started and get the most out of your service. Learn how to capitalize on critical capabilities that can amplify your log analytics and monitoring experience while providing you with meaningful business and IT insights.
Video: https://www.sumologic.com/online-training/#QuickStart
Sumo Logic QuickStart Webinar - Jan 2016 (Sumo Logic)
QuickStart your Sumo Logic service with this exclusive webinar. At these monthly live events you will learn how to capitalize on critical capabilities that can amplify your log analytics and monitoring experience while providing you with meaningful business and IT insights.
https://www.sumologic.com/online-training/#start
Video: https://www.sumologic.com/online-training/#QuickStart
Sumo Logic QuickStart Webinar - Dec 2016 (Sumo Logic)
QuickStart your Sumo Logic service with this exclusive webinar. At these monthly live events you will learn how to capitalize on critical capabilities that can amplify your log analytics and monitoring experience while providing you with meaningful business and IT insights.
Video: https://www.sumologic.com/online-training/#start
Get Certified as a Sumo Power User!
Video: https://www.sumologic.com/online-training/#Start
Designed for users, this series deep-dives into every aspect of analyzing your data. Run as a "how-to" webinar, this session walks viewers through data searching, filtering, parsing, and advanced analytics. The series concludes with "how-to" details on creating dashboards and alerts to monitor your data and get Sumo Logic to work for you.
More info: sumologic.com/training
Level 2 Certification: Using Sumo Logic - Oct 2018 (Sumo Logic)
Get Certified as a Sumo Power User!
Sumo Logic Quickstart Training 10/14/2015 (Sumo Logic)
QuickStart your Sumo Logic service with this exclusive webinar. At these monthly live events you will learn how to capitalize on critical capabilities that can amplify your log analytics and monitoring experience while providing you with meaningful business and IT insights.
Webinar: https://www.sumologic.com/online-training/#SettingUpSumo
Designed for Administrators, this course shows you how to set up your data collection according to your organization’s data sources. Best practices around deployment options ensure you choose a deployment that scales as your organization grows. Because metadata is so important to a healthy environment, learn how to design and set up a naming convention that works best for your teams. Use Chef, Puppet or the likes? Learn how to automate your deployment. Test your deployment with simple searches, and learn to take advantage of optimization tools that can help you stay on top of your deployment.
Live Webinar is found here: https://youtu.be/Q1yWlInxWVs
Sumo Logic QuickStart Webinar - Get Certified (Sumo Logic)
Video: https://www.sumologic.com/online-training/#start
Brand new to Sumo Logic?
Get started with these 5 easy steps. Learn how to capitalize on critical capabilities that can amplify your log analytics and monitoring experience while providing you with meaningful business and IT insights.
Sumo Logic Cert Jam - Security Analytics (Sumo Logic)
With security threats on the rise, join our Security and Compliance experts to learn how Sumo Logic’s Threat Intelligence can help you stay on top of your environment by matching IOCs such as IP addresses, domain names, URLs, email addresses, MD5 hashes and more, to increase the velocity and accuracy of threat detection. Hands-on labs help cement the knowledge learned.
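The core of IOC matching can be sketched in plain Python: compare indicators extracted from log lines against a threat-intel feed. The feed, log lines, and IP-only scope here are illustrative; the product matches many IOC types server-side:

```python
import re

# Hypothetical threat-intel feed of known-bad IP addresses.
IOC_IPS = {"203.0.113.7", "198.51.100.9"}

IP_PATTERN = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")

def flag_ioc_hits(log_lines):
    """Return the lines that mention a known-bad IP - the essence of
    matching IOCs against ingested logs."""
    hits = []
    for line in log_lines:
        if any(ip in IOC_IPS for ip in IP_PATTERN.findall(line)):
            hits.append(line)
    return hits

logs = [
    "ACCEPT src=10.0.0.5 dst=10.0.0.9",
    "ACCEPT src=203.0.113.7 dst=10.0.0.9",
]
```

Scaling this same lookup to high-volume streams and many IOC types is exactly what the hands-on labs practice inside the product.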
Designed for all Sumo users, this series deep-dives into every aspect of analyzing your data. Run as a "how-to" webinar, this session walks viewers through data searching, filtering, parsing, and advanced analytics. The series concludes with "how-to" details on creating dashboards and alerts to monitor your data and get Sumo Logic to work for you.
Security Certification: Security Analytics using Sumo Logic - Oct 2018 (Sumo Logic)
Get Certified as a Sumo Security Power User!
Level 3 Certification: Setting up Sumo Logic - Oct 2018 (Sumo Logic)
Get Certified as a Sumo Power Admin!
You Build It, You Secure It: Introduction to DevSecOps (Sumo Logic)
In this presentation, DevOps and DevSecOps expert John Willis dives into how to implement DevSecOps, including:
- Why traditional DevOps has shifted and what this shift means
- How DevSecOps can change the game for your team
- Tips and tricks for getting DevSecOps started within your organization
Making the Shift from DevOps to Practical DevSecOps | Sumo Logic Webinar (Sumo Logic)
In this webinar, Sumo Logic VP of Security and Compliance George Gerchow dives into how to make the shift to DevSecOps, discussing how to:
- Incorporate fundamental and high impact security best practices into your current DevOps operations
- Gain visibility into your compliance posture
- Identify potential risks and threats in your environments
Machine Analytics: Correlate Your Logs and Metrics (Sumo Logic)
To effectively manage your application, it’s critical to have visibility into both logs and metrics. Metrics provide application and infrastructure KPIs, while logs provide context into application and infrastructure execution. Managing one without the other leaves you with incomplete data; you need both to troubleshoot application issues quickly and efficiently.
This webinar will feature a live demo of Sumo Logic’s Unified Logs and Metrics machine data analytics platform and show how to:
Natively ingest your logs, host metrics, AWS metrics and Graphite-compatible metrics
Proactively set alerts based on logs and metrics thresholds
Analyze and correlate logs and metrics in real-time and in a unified way to reduce mean time to problem resolution (MTTR)
Scaling Your Tools for Your Modern Application (Sumo Logic)
In this presentation, we discuss Hootsuite - a Sumo Logic customer and the leading provider of social media management services for enterprises - and their journey from open source tools to Sumo Logic, including:
- The challenges in running & managing solutions like ELK and Graphite
- Sumo Logic unified logs and metrics monitoring solution and its advanced analytics, dashboarding and troubleshooting capabilities
- How Hootsuite was able to leverage Sumo Logic to deliver excellent user experience to their end customers
Sumo Logic exposes the Search Job API for access to resources and log data from third-party scripts and applications.
Targeting experienced Sumo Administrators, this webinar shows you how to leverage the Search Job API to interact with the Sumo Logic service. Everyone attending should be familiar with the concepts of RESTful web services and JSON. Through theory and demo, this webinar covers:
Creating a Search Job
Checking Status of a Search Job
Paging through messages and records
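The three steps above map to three REST calls. As a sketch, here are the request shapes only; the endpoint paths follow Sumo Logic's documented pattern, while the query, time range, and job id are illustrative, and a real client would send these with an HTTP library plus its access credentials:

```python
API = "https://api.sumologic.com/api/v1"

def create_search_job(query, from_time, to_time):
    """Step 1: POST the query and time range to create a search job."""
    return {
        "method": "POST",
        "url": f"{API}/search/jobs",
        "json": {"query": query, "from": from_time, "to": to_time},
    }

def job_status(job_id):
    """Step 2: poll the job until its state shows results are ready."""
    return {"method": "GET", "url": f"{API}/search/jobs/{job_id}"}

def page_messages(job_id, offset, limit):
    """Step 3: page through messages with offset/limit parameters."""
    return {
        "method": "GET",
        "url": f"{API}/search/jobs/{job_id}/messages"
               f"?offset={offset}&limit={limit}",
    }
```

The same offset/limit paging applies to aggregate records, which the webinar covers alongside raw messages.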
Bring your Graphite-compatible metrics into Sumo Logic (Sumo Logic)
If you use open source Graphite software to monitor mission-critical applications, you know well the challenges of running, managing and scaling Graphite. Graphite may be fine to get started with, but it creates significant cost, complexity and total-cost-of-ownership headaches as your environment scales.
Sumo Logic provides the industry’s first machine data analytics platform to natively ingest, index and analyze metrics and log data together in real-time.
In this webinar, we will show a live demo of how to:
Ingest graphite compatible metrics into the Sumo Logic service
Analyze and dashboard the metrics to get real-time insights
Correlate Graphite metrics and logs to troubleshoot issues faster
See how easy it is to migrate from Graphite to Sumo Logic.
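"Graphite-compatible" here refers to Graphite's plaintext protocol, which any collector can emit. A minimal formatter, with an illustrative metric path:

```python
def graphite_line(path, value, timestamp):
    """Graphite plaintext protocol: 'metric.path value timestamp\n'.
    A collector sends these lines over TCP (typically port 2003)."""
    return f"{path} {value} {timestamp}\n"
```

Because the format is just dotted paths with a value and a Unix timestamp, any backend that accepts these lines can ingest metrics from existing Graphite tooling unchanged.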
Dashboards are fantastic, but how do I get notified of critical events? This webinar will cover how to create alerts that will allow your team to effectively monitor business-critical events. Alert channels include email or webhooks into Slack, PagerDuty, DataDog, ServiceNow, or any other webhook you want to develop. What about running custom scripts triggered from alerts? Let's do it.
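Webhook alert channels ultimately come down to posting a JSON body to a URL. As a sketch, here is a minimal Slack-style payload builder; "text" is Slack's required key for incoming webhooks, and the other details are illustrative:

```python
import json

def slack_alert_payload(monitor, count, threshold):
    """Build a minimal Slack incoming-webhook body for an alert
    notification; the monitor name and numbers are placeholders."""
    return json.dumps({
        "text": f"Alert: {monitor} fired - {count} events "
                f"exceeded threshold {threshold}",
    })
```

A custom script triggered by an alert would do the inverse: receive a payload like this, parse it, and kick off whatever remediation you want.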
2. Sumo Logic confidential
Course Agenda
- Overview: 5 min.
- Data collection: 15 min.
- Search and analyze: 15 min.
- Visualize and monitor: 15 min.
- Q & A: 15 min.
3. Sumo Logic Continuous Intelligence Platform™
[Platform diagram] Your data (apps, microservices, cloud systems & SaaS, mobile devices, infrastructure, 3rd-party systems & feeds) flows into our analytics: a multi-tenant, scalable and elastic platform with machine learning, APIs, advanced analytics, and security and monitoring. It serves your people (DevOps, Engineering, IT Ops, SecOps, Customer Success, Product, Data Scientist) across Operational, Security, Business, and Global Intelligence.
5. Login to the training environment
URL: service.sumologic.com
Email: training+analyst###@sumologic.com
Password: Security21!
### is a number between 001 and 999, for example training+analyst057@sumologic.com.
Note: Post your ### number in chat so that everyone knows not to use the one you selected.
6. Section 1: Data Collection (Collectors, Sources)
Data Collection Activity 1
1. In the left navigation pane, click Manage Data > Collection > Collection.
2. In the magnifying-glass search box, enter okta and press Return.
3. Notice the two collectors and the sources of data below them.
8. Installed Collector Overview
The Installed Collector is a Java agent that:
• Collects logs and metrics from its Sources
• Encrypts and compresses the data
• Sends the data to the Sumo service
9. Sending Data ⇨ Metadata
Metadata tags are associated with each log message that is collected.

Tag | Description
_collector | Name of the collector (defaults to hostname)
_sourceHost | Hostname of the server (defaults to hostname)
_sourceName | Name and path of the log file
_source | Name of the source this data came through
_sourceCategory | Freely configurable; the main metadata tag (e.g. labs/apache/access)
10. Metadata: Source Category Best Practices and Benefits
Common components (in any combination):
• Environment (Prod/UAT/DEV)
• Application name
• Geographic information (East vs. West datacenter, office location, etc.)
• AWS region
• Business unit
The highest-level components should group the data by how it is most often searched together, for example environment-first:
Prod/Web/Apache/Access
Dev/Web/Apache/Access
Prod/DB/MySQL/Error
Dev/DB/MySQL/Error
or component-first:
Web/Apache/Access/Prod
Web/Apache/Access/Dev
DB/MySQL/Error/Prod
DB/MySQL/Error/Dev
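One way to keep a naming convention like this consistent across teams is a tiny helper that assembles the category string from its components. This is purely illustrative and not part of Sumo Logic; the environment-first ordering matches the first example group above.

```python
# Hypothetical helper enforcing an environment-first source category
# convention, joining components with "/" (e.g. Prod/Web/Apache/Access).
def source_category(environment, *components):
    parts = [environment] + list(components)
    return "/".join(p.strip("/") for p in parts)

source_category("Prod", "Web", "Apache", "Access")  # -> "Prod/Web/Apache/Access"
```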
11. What Data can I Analyze?
Option 1: Explore your Collectors
Option 2: Search for source categories
12. Section 2: Search & Analyze (Operators, Charts)
Search & Analyze Activity 2
1. To open a query, at the top click +New and select Log Search.
2. In the query builder at the top, paste the following query (I will put it in chat for you to copy):
_sourceCategory=Labs/Apache/Access and "Mozilla"
| parse "\"GET *\" * " as url, status_code
| where status_code matches "5*"
| count by status_code
| sort by _count
3. Modify the time range to Last 60 minutes and click Start.
4. A table is displayed; you can click on other chart types.
13. Data Analytics ⇨ Query Syntax
A query starts with metadata and keywords, followed by operators, separated by pipes, that build on top of each other: parse, filter, aggregate, format.
_sourceCategory=Labs/Apache/Access and "Mozilla"
| parse "\"GET *\" * " as url, status_code
| where status_code matches "5*"
| count by status_code
| sort by _count
| limit 3
14. Data Analytics ⇨ Query Syntax: metadata + keywords
Use metadata and keywords to narrow your search scope before the parse, filter, aggregate, and format stages:
_sourceCategory=Labs/Apache/Access and "Mozilla"
Here _sourceCategory=Labs/Apache/Access is the metadata and "Mozilla" is the keyword.
15. Data Analytics ⇨ Query Syntax: | parse
Extract meaningful fields to provide structure to your data.
Parse anchor:
| parse " *@* " as user, domain
Parse regex:
| parse regex "(?<src_ip>\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})"
Other parse operators: csv, json, keyvalue, split, xml
Learn more: Parse Operators
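To build intuition for the parse anchor, here is a rough Python equivalent: each `*` becomes a lazy capture group, just as the anchor captures the text between its literal pieces. This is an approximation of the operator's behavior, not its actual implementation.

```python
# Approximate the parse anchor '| parse " *@* " as user, domain':
# escape the literal text and turn each * into a lazy capture group.
import re

def parse_anchor(pattern, line):
    regex = re.escape(pattern).replace(r"\*", "(.*?)")
    match = re.search(regex, line)
    return match.groups() if match else None

parse_anchor(" *@* ", "user alice@example.com logged in")
# -> ("alice", "example.com")
```

As in Sumo Logic, a message that does not contain the anchor's literal text simply yields no fields (here, `None`).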
16. Data Analytics ⇨ Query Syntax: | filter
Further filter results using your extracted fields.
where operator:
| where !(status_code=304)
in operator:
| if(status_code in("501","502"), "Error", "OK") as code_type
Other filter operators: join, lookup, matches, filter, isEmpty, isNull, isBlank
Learn more: Filter operator example
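The same filtering logic, mimicked in plain Python over a few illustrative parsed records, can help clarify what `where` and `if ... in` do:

```python
# Plain-Python analogue of the filters above (sample records are invented):
# 'where !(status_code=304)' drops 304s; the if/in expression tags each row.
records = [{"status_code": "200"},
           {"status_code": "304"},
           {"status_code": "502"}]

kept = [r for r in records if r["status_code"] != "304"]         # where
for r in kept:
    r["code_type"] = ("Error" if r["status_code"] in ("501", "502")
                      else "OK")                                  # if ... in
```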
17. Data Analytics ⇨ Query Syntax: | aggregate
Evaluate messages and place them into groups.
count operator:
| count by status_code
avg operator:
| avg(size) by src_ip
pct operator:
| pct(filesize, 75) by _sourceHost
Other aggregation operators: sum, count_distinct, stddev, min, max
Learn more: Aggregation operators
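Conceptually, `count by` is a group-and-count. A plain Python analogue over field values a parse step might have extracted (sample data is illustrative):

```python
# Group-and-count analogue of '| count by status_code'.
from collections import Counter

status_codes = ["200", "500", "200", "404", "500", "500"]
counts = Counter(status_codes)      # like: | count by status_code
busiest = counts.most_common(1)     # like: | sort by _count | limit 1
```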
18. Data Analytics ⇨ Query Syntax: | format
Format to display desired results succinctly.
top operator:
| top 5 src_ip by avg_size
fields operator:
| fields src_ip, avg_size
transpose operator:
| transpose row src_ip column url
Other formatting operators: format, formatdate, limit, sort
Learn more: Trends over time using transpose
19. Advanced Analytics
Geo Lookup:
_sourceCategory=Labs/Apache/Access
| lookup latitude, longitude from geo://default on ip=src_ip
| count by latitude, longitude
Outlier:
_sourceCategory=Labs/Apache/Access and status_code=404
| timeslice 1m
| count(status_code) as error_count by _timeslice
| outlier error_count
Predict:
_sourceCategory=Labs/Apache/Access
| timeslice 5m
| count as requests by _timeslice
| predict requests by 5m forecast=12
Log Operators Cheat Sheet: https://help.sumologic.com/05Search/Search-Cheat-Sheets/Log-Operators-Cheat-Sheet
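As a mental model for the `outlier` operator, the sketch below flags values that sit far from the series mean in standard-deviation terms. The real operator uses a sliding window and its own threshold parameters, so this is a simplification of the idea, not the actual algorithm.

```python
# Simplified outlier flagging: mark points further than `threshold` standard
# deviations from the mean of the series (sample error counts per timeslice).
import statistics

def flag_outliers(series, threshold=3.0):
    mean = statistics.mean(series)
    spread = statistics.pstdev(series)
    if spread == 0:
        return []
    return [i for i, value in enumerate(series)
            if abs(value - mean) > threshold * spread]

flag_outliers([4, 5, 3, 4, 5, 40], threshold=2.0)  # -> [5]
```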
20. Advanced Analytics
Find the "needle in the haystack" by identifying patterns, and compare today's patterns with patterns in the past.
LogReduce:
_sourceCategory=Labs/snort
| logreduce
LogCompare:
_sourceCategory=Labs/snort
| logcompare timeshift -24h
21. Section 3: Visualize & Monitor (Dashboards, Alerts)
Visualize & Monitor Activity 3
1. To create a dashboard, on the middle right click Add to Dashboard.
2. In the popup window, under Panel Title enter Track 500s; under Dashboard enter Apache_db_<your initials###>.
3. At the bottom of the popup, click Add.
4. To create an alert, in the left navigation pane click Manage Data > Alert.
5. In the upper right corner, click Add and select New Monitor.
22. Monitoring - Dashboards
• Each Panel processes results from a single search
• Drill down into the corresponding query or link to another Dashboard
• Live Mode provides a live stream of data
• Use Dashboards as templates with Filters
23. Sumo Logic Data Flow
1. Data Collection: Collectors, Sources
2. Search & Analyze: Operators, Charts
3. Visualize & Monitor: Dashboards, Alerts
What else may we address for you?
24. Want to learn more?
• Onboarding Checklist: https://help.sumologic.com/01Start-Here/Onboarding_Checklist
• Take the training: https://www.sumologic.com/learn/training/
• Read the docs: https://help.sumologic.com
26. Which topic would you like covered next? (Single Choice)
Answer 1: LogReduce Operator
Answer 2: New Dashboards
Answer 3: Best Practices for Search Queries
Answer 4: Partitions
Answer 5: Collector Setup Optimization
27. LogReduce – Foundation for Anomaly Detection
Reduce hundreds of thousands of pages of results into a single page of meaningful patterns.
Under the covers:
• LogReduce deconstructs log messages into their most basic patterns, at the printf level, to facilitate overall behavioral analysis
• Log messages are converted into unique hashed signatures; these signatures are the building blocks of anomaly detection, surfacing:
– Events that occur more often than others (e.g. errors flooding your logs)
– Events that occur very infrequently but are important (e.g. a rare exception)
• Edit each signature to tailor the experience
• Benefit from machine learning that improves over time based on your data and activity
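The "hashed signature" idea can be sketched in a few lines: mask the variable tokens so messages collapse to a printf-level template, then hash the template. This illustrates the concept only; Sumo Logic's actual masking and clustering are far more sophisticated.

```python
# Collapse messages to a template by masking numeric tokens (incl. IPs),
# then hash the template; messages with the same shape share a signature.
import hashlib
import re

def signature(message):
    template = re.sub(r"\b\d+(?:\.\d+)*\b", "*", message)
    return hashlib.sha1(template.encode()).hexdigest(), template

sig_a, tmpl_a = signature("Request from 10.0.0.1 took 532 ms")
sig_b, tmpl_b = signature("Request from 192.168.1.9 took 7 ms")
# Both collapse to "Request from * took * ms", so sig_a == sig_b.
```

Counting messages per signature is then what surfaces both the flooding patterns and the rare ones described above.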
28. New Dashboards

Classic Dashboard | New Dashboard
Basic charts, like time series and categorical | New charts, like Honeycomb
Few color and font choices | Full control over look and feel with JSON
Panels created from the search and metrics tabs | Build panels directly in the dashboard
Limited filters and queries | Advanced filtering and metrics query building
Still supported |

About New Dashboards: https://help.sumologic.com/Visualizations-and-Alerts/Dashboard_(New)/About_Dashboard_(New)