This document provides an overview of Microsoft's StreamInsight Complex Event Processing (CEP) platform. It discusses CEP concepts and benefits, the StreamInsight architecture and development environment, and deployment scenarios. The presentation aims to introduce IT professionals to CEP and Microsoft's StreamInsight solution for building event-driven applications that process streaming data with low latency.
This document compiles the August 2020 update information from the following sources:
- https://azure.microsoft.com/ja-jp/updates/
Of the Azure updates, the following eight categories:
DevOps, Web, Containers, Blockchain, Media, Mobile, Developer Tools, Integration
- https://azure.github.io/AppService/
Posts from the Azure App Service Team Blog that are feature updates
(excluding editorial articles and the like)
World of Watson 2016 - Put your Analytics on Cloud 9 (Keith Redman)
Wikipedia defines Cloud 9 as the state of euphoria. Wouldn’t we all like to experience euphoria more often? IBM analytics in the cloud is making that a possibility. Check out these sessions to learn how to put your business on Cloud 9.
Servizi Cloud Computing: Scenario, Strategia e Mercato (Nicoletta Maggiore, Apulian ICT Living Labs)
Presentation given at the workshop OPEN DATA E CLOUD COMPUTING: OPPORTUNITÀ DI BUSINESS. Una vista internazionale - 15 September 2014, Pavilion 152 of the Regione Puglia, 78th Fiera del Levante, Bari
Tiarrah Computing: The Next Generation of Computing (IJECE IAES)
The evolution of the Internet of Things (IoT) has brought several challenges for existing hardware, network, and application development. Some of these are handling real-time streaming and batch big data, real-time event handling, dynamic cluster resource allocation for computation, and wired and wireless networks of things. To combat these technicalities, many new technologies and strategies are being developed. Tiarrah Computing integrates the concepts of Cloud Computing, Fog Computing, and Edge Computing. Its main objectives are to decouple application deployment and to achieve high performance, flexible application development, high availability, and ease of development and maintenance. Tiarrah Computing focuses on using existing open-source technologies to overcome the challenges that evolve along with IoT. This paper gives an overview of the technologies, discusses how to design your application, and elaborates on how to overcome most of the existing challenges.
Presentation at the International Industry-Academia Workshop on Cloud Reliability and Resilience. 7-8 November 2016, Berlin, Germany.
Organized by EIT Digital and Huawei GRC, Germany.
Twitter: @CloudRR2016
Microsoft Telecommunications Newsletter | September 2021 (Rick Lievano)
Monetizing the edge continues to be a top priority for telcos, and not a day goes by where we don’t have a meaningful conversation on the topic with a telco partner. While the edge’s killer app continues to elude the industry, private mobile networks and video analytics are quickly becoming the critical building blocks for bringing it to market – whatever it is.
So where can you learn more about monetizing the edge? The TM Forum Digital Transformation World Series 2021 provides a collaborative environment for operators and suppliers to come together, share ideas, and solve the industry’s toughest problems. Microsoft is an active participant in this year’s event, sharing best practices, successes, and industry insights across wide-ranging areas including edge, artificial intelligence, cloud transformation, and customer experience.
See the Events section for details on how Microsoft is participating at this year’s show. We look forward to seeing you virtually at the event!
Smart, Secure and Efficient Data Sharing in IoT (Angelo Corsaro)
The value of the Internet of Things is the data and the insights derived from it to optimise and improve potentially every aspect of our modern society. As IoT extends its application from consumer to ever more demanding industrial applications, the ability to smartly, securely and efficiently share data makes the difference between success and failure.
This presentation will (1) introduce the data sharing challenges posed by a large class of IoT applications often referred to as Industrial IoT (IIoT) applications, (2) highlight how the standards identified by the Industrial Internet of Things Reference Architecture, such as DDS, address the need for smart, secure and efficient data sharing, and (3) showcase how this technology is used today in several IoT systems to ensure smart, secure and efficient data sharing.
Azure IoT - A Practical Entry to New Retail (Daniel Li)
New Retail is in fact a far-reaching horizontal integration that connects consumers to the manufacturers of highly personalized products. It was first proposed by Alibaba and is rapidly being put into action by tier 1 retailers. How can retailers and retail device manufacturers take practical steps to embrace this sea change? The presentation offers some hints with Azure IoT.
Ray Velez of Razorfish discussed how marketers can more effectively manage the peaks and valleys of marketing campaigns by employing cloud technology at the Razorfish Client Summit in Boston, October 12-14, 2010.
Cloud based cyber-physical systems in manufacturing (Giriraj Mannayee)
Cyber-Physical Systems (CPS) are integrations of computation, cloud based networking, and physical processes. CPS integrates the dynamics of the physical processes with those of the software and networking, providing abstractions and modeling, design, and analysis techniques for the integrated whole.
Microsoft Telecommunications Industry News | October 2020 (Rick Lievano)
The Microsoft Worldwide Telecommunications Industry team is pleased to share with you the October 2020 Telecommunications Industry Newsletter, available to both internal and external audiences. We encourage you to share it with your colleagues and distribute it to your customers and partners as appropriate. As always, we welcome your input, feedback, and suggestions!
Microsoft StreamInsight, part of the recent SQL Server 2008 R2 release, is a new platform for building rich applications that can process high volumes of event stream data with near-zero latency.
Mark Simms of Microsoft's SQLCAT will demonstrate the core skill sets and technologies needed to deliver StreamInsight-enabled solutions, and discuss some of the core scenarios.
Mark will provide a detailed walkthrough of the three major components of StreamInsight: input and output adapters, the StreamInsight engine runtime, and the semantics of the continuous standing queries hosted in the StreamInsight engine.
This presentation features hands-on demos, including building out a real-time data processing solution interacting with SQL Server and SharePoint.
You will learn:
• The new capabilities StreamInsight brings to data processing and analytics, unlocking the ability to extract real-time business intelligence from streaming data.
• How StreamInsight interacts with and complements other components of SQL Server and the rest of the Microsoft technology stack.
• How to ramp up on the skills and technology necessary to build out end-to-end solutions leveraging streaming data sources.
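StreamInsight standing queries are written in LINQ over C#. As a language-neutral sketch of the underlying idea (not the StreamInsight API, and the function name here is hypothetical), the following Python snippet groups timestamped events into tumbling windows and aggregates a count per window, which is what a simple standing query does continuously as events arrive:

```python
from collections import defaultdict

def standing_query(events, window_size):
    """Group timestamped events into tumbling windows and count per window.

    events: iterable of (timestamp_seconds, payload) tuples.
    Returns {window_start: event_count}.
    """
    counts = defaultdict(int)
    for ts, _payload in events:
        # Each event falls into exactly one fixed-size, non-overlapping window.
        window_start = ts - (ts % window_size)
        counts[window_start] += 1
    return dict(counts)

readings = [(0, "a"), (3, "b"), (7, "c"), (12, "d")]
print(standing_query(readings, 5))  # {0: 2, 5: 1, 10: 1}
```

In the real engine the query stays resident and emits a result as each window closes, rather than running once over a finished collection.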
In this presentation we review the basic architecture behind SQL Server StreamInsight.
Regards,
Ing. Eduardo Castro Martínez, PhD – Microsoft SQL Server MVP
http://mswindowscr.org
http://comunidadwindows.org
Costa Rica
Recent trends in Big Data and the Internet of Things pose challenges to our current computational paradigms, such as event processing systems. While three dimensions of Big Data are commonly identified, Volume, Variety and Velocity, we think more attention should be given to the Variety aspect within distributed and event-based systems, as it poses profound challenges to our computational systems. Event-based systems follow an interaction model based on three decoupling dimensions: space, time, and synchronization. However, event producers and consumers remain tightly coupled by event semantics: types, attributes, and values. That limits scalability in large-scale heterogeneous environments such as the Internet of Things (IoT), because semantic agreements are difficult to establish at such scales. This tutorial studies this problem from various perspectives and investigates the suitability of semantic models, such as vector space models, for tackling the issue.
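As a toy illustration of the vector-space idea the tutorial mentions (this sketch is an assumption about the approach, not code from the tutorial), events and subscriptions can be represented as term-weight vectors and matched approximately by cosine similarity instead of requiring exact agreement on event types:

```python
import math

def cosine(a, b):
    # a, b: dicts mapping attribute terms to weights.
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# A published event and two subscriptions, as term-weight vectors.
pub = {"temperature": 1.0, "celsius": 1.0, "room": 1.0}
sub1 = {"temperature": 1.0, "indoor": 1.0}   # semantically related
sub2 = {"humidity": 1.0, "percent": 1.0}     # unrelated

# The related subscriber scores higher even without identical terms.
assert cosine(pub, sub1) > cosine(pub, sub2)
```

The point is that producers and consumers no longer need a prior, exact semantic agreement; matching degrades gracefully with vocabulary overlap.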
Introduction to Microsoft SQL Server 2008 R2 (Eduardo Castro)
In this presentation we review the new features in SQL 2008 R2.
Regards,
Ing. Eduardo Castro Martinez, PhD
Semantic Complex Event Processing with Reaction RuleML 1.0 and Prova 3.0 (Adrian Paschke)
Seminar presented by Visiting Professor Adrian Paschke from Freie Universitaet Berlin as part of the BPM EduNet (http://bpmedu.net) staff exchange at University of Toronto, McGill University, Ryerson University, University of Ontario Institute of Technology
Semantic Complex Event Processing at Sem Tech 2010 (Adrian Paschke)
Semantic Complex Event Processing - The Future of Dynamic IT
Presentation by Paul Vincent, Adrian Paschke, Harold Boley
at the RuleML Semantic Rules Track of the Semantic Technologies Conference 2010 (SemTech 2010), San Francisco, CA, USA
http://semtech2010.semanticuniverse.com/rules
Seminar about Semantic Complex Event Processing and Reaction RuleML presented at the School of Computer Science at McGill University on Sept. 9th, 2013 as part of the Transatlantic Business Process Management Education Network (http://bpmedu.net/) and presented at the DemAAL 2013 - Dem@Care Summer School on Ambient Assisted Living, 16-20 September 2013, Chania, Crete, Greece.
Replicate Salesforce Data in Real Time with Change Data Capture (Salesforce Developers)
Migrate your batch processing, scheduled ETL, and nightly workloads to event-driven, real-time integrations using Change Data Capture. CDC means data change events are published to an event stream, allowing businesses to have up-to-date information across systems and applications. Join us to learn how to configure Change Data Capture and subscribe to the stream of change events, streamlining your architectures and processes.
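The subscribe-and-apply pattern behind CDC can be sketched as follows. This is a generic illustration only: the event shape (`op`, `id`, `fields`) is hypothetical and does not match the real Salesforce CDC payload format, and the function name is made up for this example:

```python
def apply_change_events(replica, events):
    """Apply a stream of change events to a local replica (dict of records),
    keeping it up to date without any batch reload."""
    for ev in events:
        if ev["op"] == "DELETE":
            replica.pop(ev["id"], None)
        else:
            # CREATE or UPDATE: merge only the changed fields into the record.
            replica.setdefault(ev["id"], {}).update(ev["fields"])
    return replica

cache = {}
apply_change_events(cache, [
    {"op": "CREATE", "id": "001", "fields": {"Name": "Acme"}},
    {"op": "UPDATE", "id": "001", "fields": {"Phone": "555-0100"}},
])
print(cache)  # {'001': {'Name': 'Acme', 'Phone': '555-0100'}}
```

Because each event carries only the delta, the consumer stays current continuously instead of re-synchronizing on a nightly schedule.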
Unify Analytics: Combine Strengths of Data Lake and Data Warehouse (Paige_Roberts)
ODSC West Presentation, Oct 2020: Technical and spiritual unification of BI and Data Science teams can powerfully benefit businesses. The evolution of data architectures is making that possible.
Combining Logs, Metrics, and Traces for Unified Observability (Elasticsearch)
Learn how Elasticsearch efficiently combines data in a single store and how Kibana is used to analyze it. Plus, see how recent developments help identify, troubleshoot, and resolve operational issues faster.
SplunkLive! Frankfurt 2018 - Data Onboarding Overview (Splunk)
Presented at SplunkLive! Frankfurt 2018:
Splunk Data Collection Architecture
Apps and Technology Add-ons
Demos / Examples
Best Practices
Resources and Q&A
Observability foundations in dynamically evolving architectures (Boyan Dimitrov)
Holistic application health monitoring, request tracing across distributed systems, instrumentation, business process SLAs - all of them are integral parts of today’s technical stacks. Nevertheless, many teams integrate observability last, which makes it an almost impossible challenge, especially when you have to deal with hundreds or thousands of services. Starting early is therefore essential, and in this talk we are going to see how to solve those challenges early and explore the foundations of building and evolving complex microservices platforms with respect to observability.
We are going to share some of the best practices and quick wins that allow us to correlate different telemetry systems and gradually build up towards more sophisticated use-cases.
We are also going to look at some of the standard AWS services, such as X-Ray and CloudWatch, that help us get going "for free", and then discuss more complex tooling and integrations, building up towards a fully integrated ecosystem. As part of this talk we will also share some of the lessons we have learned at Sixt on this topic and introduce some of the solutions that help us operate our microservices stack.
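One of the quick wins for correlating telemetry across systems is propagating a correlation id through every service hop and log line. A minimal Python sketch (the `x-correlation-id` header name is a common convention assumed here for illustration, not something taken from the talk):

```python
import uuid

def handle_request(headers, log):
    """Reuse an incoming correlation id, or mint one at the edge, then
    attach it to every log line and to outgoing calls so telemetry from
    different services can be joined later."""
    cid = headers.get("x-correlation-id") or str(uuid.uuid4())
    log.append({"correlation_id": cid, "msg": "request received"})
    # Propagate the same id to downstream services.
    downstream_headers = {"x-correlation-id": cid}
    log.append({"correlation_id": cid, "msg": "calling downstream"})
    return downstream_headers

log = []
out = handle_request({"x-correlation-id": "abc-123"}, log)
assert out["x-correlation-id"] == "abc-123"
assert all(line["correlation_id"] == "abc-123" for line in log)
```

With the id present everywhere, log search, tracing, and metrics systems can all be pivoted on the same key.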
Stream Processing – Concepts and Frameworks (Guido Schmutz)
More and more data sources today provide a constant stream of data, from IoT devices to social media streams. It is one thing to collect these events at the velocity they arrive, without losing a single message. An event hub and a data flow engine can help here. It’s another thing to do some (complex) analytics on the data. There is always the option to first store it in a data sink of choice and analyze it later. Storing even a high-volume event stream is feasible and no longer a challenge. But this adds to the end-to-end latency, and it takes minutes if not hours to present results. If you need to react fast, you simply can’t afford to first store the data. You need to process it directly on the data stream. This is called Stream Processing or Stream Analytics. In this talk I will present the important concepts a Stream Processing solution should support, then dive into some of the most popular frameworks available on the market and how they compare.
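The core of processing directly on the stream, rather than storing first, is keeping constant-size state per event instead of buffering the whole stream. A minimal Python sketch of this idea (illustrative only, not from any particular framework) computes a running mean one event at a time:

```python
def streaming_mean(stream):
    """Process events one at a time, keeping only constant state
    (a count and a running mean) instead of storing the stream."""
    count, mean = 0.0, 0.0
    for x in stream:
        count += 1
        mean += (x - mean) / count  # incremental (Welford-style) update
        yield mean                  # a result is available after every event

# The stream is consumed incrementally; no buffering of past events.
print(list(streaming_mean([2, 4, 6])))  # [2.0, 3.0, 4.0]
```

Because a fresh result is emitted after every event, latency is per-event rather than per-batch, which is exactly the trade-off the talk describes.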
Event Driven Architecture (EDA), November 2, 2006 (Tim Bass)
Event Driven Architecture (EDA), SOA Seminar Crystal City, Virginia, November 2nd, 2006, Tim Bass, CISSP, Principal Global Architect, Director. Co-Chair, Event Processing Reference Architecture Working Group (EPRAWG)
Most data visualisation solutions today still work on data sources stored persistently in a data store, using the so-called “data at rest” paradigm. More and more data sources today provide a constant stream of data, from IoT devices to social media streams. These data streams publish with high velocity, and messages often have to be processed as quickly as possible. For the processing and analytics on the data, so-called stream processing solutions are available. But these provide minimal or no visualisation capabilities. One way is to first persist the data into a data store and then use a traditional data visualisation solution to present the data.
If latency is not an issue, such a solution might be good enough. Another question is which data store solution is necessary to keep up with the high load on write and read. If it is not an RDBMS but a NoSQL database, then not all traditional visualisation tools may integrate with that specific data store. Another option is to use a streaming visualisation solution. These are specially built for streaming data and often do not support batch data. A much better solution would be one tool capable of handling both batch and streaming data. This talk presents different architecture blueprints for integrating data visualisation into a fast data solution and highlights some of the products available to implement these blueprints.
Data Con LA 2020
Description
The data warehouse has been an analytics workhorse for decades. Unprecedented volumes of data, new types of data, and the need for advanced analyses like machine learning brought on the age of the data lake. But Hadoop by itself doesn't really live up to the hype. Now, many companies have a data lake, a data warehouse, or a mishmash of both, possibly combined with a mandate to go to the cloud. The end result can be a sprawling mess, a lot of duplicated effort, a lot of missed opportunities, a lot of projects that never made it into production, and a lot of financial investment without return. Technical and spiritual unification of the two opposed camps can make a powerful impact on the effectiveness of analytics for the business overall. Over time, different organizations with massive IoT workloads have found practical ways to bridge the artificial gap between these two data management strategies. Look under the hood at how companies have gotten IoT ML projects working, and how their data architectures have changed over time. Learn about new architectures that successfully supply the needs of both business analysts and data scientists. Get a peek at the future. In this area, no one likes surprises.
*Look at successful data architectures from companies like Philips, Anritsu, Uber,
*Learn to eliminate duplication of effort between data science and BI data engineering teams
*Avoid some of the traps that have caused so many big data analytics implementations to fail
*Get AI and ML projects into production where they have real impact, without bogging down essential BI
*Study analytics architectures that work, why and how they work, and where they're going from here
Speaker
Paige Roberts,Vertica, Open Source Relations Manager
Independent of the source of data, the integration of event streams into an Enterprise Architecture gets more and more important in the world of sensors, social media streams and the Internet of Things. Events have to be accepted quickly and reliably, and they have to be distributed and analysed, often with many consumers or systems interested in all or part of the events. Storing such huge event streams into HDFS or a NoSQL datastore is feasible and no longer such a challenge. But if you want to be able to react fast, with minimal latency, you cannot afford to first store the data and do the analysis/analytics later. You have to be able to include part of your analytics right after you consume the data streams. Products for doing event processing, such as Oracle Event Processing or Esper, have been available for quite a long time and used to be called Complex Event Processing (CEP). In the past few years, another family of products appeared, mostly out of the Big Data technology space, called Stream Processing or Streaming Analytics. These are mostly open source products/frameworks such as Apache Storm, Spark Streaming, Flink, and Kafka Streams, as well as supporting infrastructure such as Apache Kafka. In this talk I will present the theoretical foundations of Stream Processing, discuss the core properties a Stream Processing platform should provide, and highlight the differences you might find between the more traditional CEP and the more modern Stream Processing solutions.
EDA Meets Data Engineering – What's the Big Deal? (confluent)
Presenter: Guru Sattanathan, Systems Engineer, Confluent
Event-driven architectures have been around for many years, much like Apache Kafka®, which was first open sourced in 2011. The reality is that the true potential of Kafka is only being realised now. Kafka is becoming the central nervous system of many of today’s enterprises, bringing a profound paradigm shift to the way we think about enterprise IT. What has changed in Kafka to enable this paradigm shift? Is it more than just a message broker, and how are enterprises using it today? This session will explore these key questions.
Sydney: https://content.deloitte.com.au/20200221-tel-event-tech-community-syd-registration
Melbourne: https://content.deloitte.com.au/20200221-tel-event-tech-community-mel-registration
Enterprise guide to building a Data Mesh (Sion Smith)
Making Data Mesh simple, open source, and available to all: without vendor lock-in, without complex tooling, using an approach centered on ‘specifications’ and existing tools, with a ‘domain’ model baked in.
Emerging Prevalence of Data Streaming in Analytics and its Business Signific... (Amazon Web Services)
Learning Objectives:
- Get an overview of streaming data and its application in analytics and big data.
- Understand the factors driving the accelerating transformation of batch processing to real-time.
- Learn how you should plan for incorporating data streaming in your analytics and processing workloads.
Businesses can now easily perform real-time analytics on data that has traditionally been analyzed using batch processing in data warehouses or with Hadoop frameworks, and react to new information in minutes or seconds instead of hours or days. In this webinar, Forrester analyst Mike Gualtieri and Amazon Kinesis GM Roger Barga will discuss this prevalent trend, its business significance, and how you should plan for it. You will also learn about the AWS services that can help you get started quickly with real-time, streaming applications for your analytics and big data workloads.
Introduction to Time Series Analytics with Microsoft Azure (Codit)
Improve operations and decision-making by using real-time data insights and interactive analytics to accelerate IoT data use throughout your organization.
Discover the webinar here: https://bit.ly/38sMcrP
Similar to Microsoft SQL Server - StreamInsight Overview Presentation
Migrating its entire virtualized environment from VMware to Hyper-V has helped Miele & Cie., an appliance manufacturer, shrink its server space by more than 50 percent and improve productivity.
To add to the new flexible and energy-efficient data center, Miele has also ‘saved an estimated 35 percent in licensing costs by going with a Microsoft virtualization solution’.
Hyper-V has helped AcXess fill that much-needed virtualization gap with improved, more capable functionality and reduced data center and hardware costs.
According to Tom Elowson, President and Cofounder, AcXess, “Hyper-V has made all the difference. It is amazing to think we have grown our business 300 percent over the last few years with Hyper-V. ”
There are many misconceptions surrounding Cloud Computing and what it has to offer.
Tell apart the facts from the myths with Cloud Computing Myth Busters and develop a deeper understanding of the Cloud.
Download Myth Busters >>
UiPath Test Automation using UiPath Test Suite series, part 4DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 4. In this session, we will cover Test Manager overview along with SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test management principles while learning practical approaches to optimize testing processes in SAP environments using heatmap visualization techniques
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
Execution from the test manager
Orchestrator execution result
Defect reporting
SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Connector Corner: Automate dynamic content and events by pushing a buttonDianaGray10
Here is something new! In our next Connector Corner webinar, we will demonstrate how you can use a single workflow to:
Create a campaign using Mailchimp with merge tags/fields
Send an interactive Slack channel message (using buttons)
Have the message received by managers and peers along with a test email for review
But there’s more:
In a second workflow supporting the same use case, you’ll see:
Your campaign sent to target colleagues for approval
If the “Approve” button is clicked, a Jira/Zendesk ticket is created for the marketing design team
But—if the “Reject” button is pushed, colleagues will be alerted via Slack message
Join us to learn more about this new, human-in-the-loop capability, brought to you by Integration Service connectors.
And...
Speakers:
Akshay Agnihotri, Product Manager
Charlie Greenberg, Host
Builder.ai Founder Sachin Dev Duggal's Strategic Approach to Create an Innova...Ramesh Iyer
In today's fast-changing business world, Companies that adapt and embrace new ideas often need help to keep up with the competition. However, fostering a culture of innovation takes much work. It takes vision, leadership and willingness to take risks in the right proportion. Sachin Dev Duggal, co-founder of Builder.ai, has perfected the art of this balance, creating a company culture where creativity and growth are nurtured at each stage.
Accelerate your Kubernetes clusters with Varnish CachingThijs Feryn
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
Key Trends Shaping the Future of Infrastructure.pdfCheryl Hung
Keynote at DIGIT West Expo, Glasgow on 29 May 2024.
Cheryl Hung, ochery.com
Sr Director, Infrastructure Ecosystem, Arm.
The key trends across hardware, cloud and open-source; exploring how these areas are likely to mature and develop over the short and long-term, and then considering how organisations can position themselves to adapt and thrive.
GraphRAG is All You need? LLM & Knowledge GraphGuy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...Jeffrey Haguewood
Sidekick Solutions uses Bonterra Impact Management (fka Social Solutions Apricot) and automation solutions to integrate data for business workflows.
We believe integration and automation are essential to user experience and the promise of efficient work through technology. Automation is the critical ingredient to realizing that full vision. We develop integration products and services for Bonterra Case Management software to support the deployment of automations for a variety of use cases.
This video focuses on the notifications, alerts, and approval requests using Slack for Bonterra Impact Management. The solutions covered in this webinar can also be deployed for Microsoft Teams.
Interested in deploying notification automations for Bonterra Impact Management? Contact us at sales@sidekicksolutionsllc.com to discuss next steps.
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024Tobias Schneck
As AI technology is pushing into IT I was wondering myself, as an “infrastructure container kubernetes guy”, how get this fancy AI technology get managed from an infrastructure operational view? Is it possible to apply our lovely cloud native principals as well? What benefit’s both technologies could bring to each other?
Let me take this questions and provide you a short journey through existing deployment models and use cases for AI software. On practical examples, we discuss what cloud/on-premise strategy we may need for applying it to our own infrastructure to get it to work from an enterprise perspective. I want to give an overview about infrastructure requirements and technologies, what could be beneficial or limiting your AI use cases in an enterprise environment. An interactive Demo will give you some insides, what approaches I got already working for real.
DevOps and Testing slides at DASA ConnectKari Kakkonen
My and Rik Marselis slides at 30.5.2024 DASA Connect conference. We discuss about what is testing, then what is agile testing and finally what is Testing in DevOps. Finally we had lovely workshop with the participants trying to find out different ways to think about quality and testing in different parts of the DevOps infinity loop.
18. A Comprehensive Platform
- Scalable relational database platform
- Consistent, familiar model & tools
- Self-managed, highly available cloud services
- MPP support for 10s to 100s TB data warehouses
- Highly scalable appliances
- Seamless integration with Microsoft BI
- Managed self-service BI
- Multi-server management
- Virtualization & Live Migration
19. Comprehensive Platform for IT Value
Trusted, scalable platform:
- Enterprise-level security and scalability
- High-scale, complex event processing
- Data consistency across heterogeneous systems
- MPP support for 10s-100s TB data warehouses; highly scalable appliances
- Scalable relational database platform with a consistent, familiar model & tools; self-managed, highly available cloud service
IT & developer efficiency:
- Multi-server management
- Virtualization & Live Migration
- Accelerated development & deployment
Managed self-service BI:
- Self-service analytics and self-service reporting
- Streamlined collaboration & management
- Seamless BI integration
20. What is CEP? Complex Event Processing (CEP) is the continuous and incremental processing of event streams from multiple sources, based on declarative query and pattern specifications, with near-zero latency. (Diagram contrasts request/response processing with input-stream/output-stream processing.)
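The key phrase in this definition is "continuous and incremental": a standing query updates its answer as each event arrives, rather than storing events first and querying them later. A minimal Python sketch of that idea (conceptual only; StreamInsight itself is a .NET engine and its API looks nothing like this):

```python
# Minimal sketch of a "standing query": the result is maintained
# incrementally as each event arrives, instead of re-scanning a
# stored history. (Illustrative only, not the StreamInsight API.)

class RunningAverage:
    """Incrementally maintained aggregate over an event stream."""

    def __init__(self):
        self.count = 0
        self.total = 0.0

    def on_event(self, value):
        # Each incoming event updates the result in O(1);
        # no previously stored events are re-read.
        self.count += 1
        self.total += value
        return self.total / self.count  # current query result

query = RunningAverage()
results = [query.on_event(v) for v in [10.0, 20.0, 30.0]]
print(results)  # running average after each event: [10.0, 15.0, 20.0]
```

The same contrast drives the latency argument above: a store-then-analyze pipeline pays the cost of persistence before any analysis can begin, while the standing query answers as the data flows through.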
21. Latency Scenarios for Event Processing (chart: latency vs. aggregate data rate in events/sec). Relational database applications and data warehousing applications sit at higher latencies; CEP target scenarios include operational analytics applications (e.g., logistics), web analytics applications, manufacturing applications, financial trading applications, and monitoring applications.
56. Event Types
- Events in Microsoft's CEP platform use the .NET type system
- Events are structured and can have multiple fields
- Fields are typed using .NET Framework types
- Engine-provisioned timestamp fields capture all the different temporal event characteristics
- Event sources populate the timestamp fields
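As a conceptual illustration (in Python rather than .NET, with hypothetical field names), an event can be pictured as a typed, multi-field payload wrapped in an engine-managed temporal envelope:

```python
# Sketch of a structured event: a typed payload plus timestamp fields.
# In StreamInsight the payload is a plain .NET class and the engine
# provisions the timestamp fields; all names below are hypothetical.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class SensorReading:          # payload: structured, multi-field, typed
    sensor_id: int
    value: float

@dataclass
class IntervalEvent:          # temporal envelope around the payload
    start_time: datetime      # populated by the event source
    end_time: datetime        # validity interval of the payload
    payload: SensorReading

e = IntervalEvent(
    start_time=datetime(2009, 7, 1, 12, 0, 0),
    end_time=datetime(2009, 7, 1, 12, 0, 0) + timedelta(seconds=1),
    payload=SensorReading(sensor_id=7, value=21.5),
)
print(e.payload.value)  # 21.5
```

Separating the payload type from the temporal envelope is what lets the engine reason about time (windows, ordering, intervals) without knowing anything about the application's domain fields.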
61. Adapter features: adapters expose properties to indicate their features to the engine (table pairs each Feature with the Problem it addresses).
62. Core CEP Query Engine
- The CEP engine hosts "standing queries", fed by input adapters and delivering results through output adapters
- Operators consume and produce streams
- Queries are composable
- Query results are computed incrementally
- Query instance management: submit, start, stop
- Runtime statistics
Takeaway: the CEP engine does the heavy lifting for you when processing temporal event data.
63. Typical CEP Queries
Typical CEP queries require a combination of functionality:
- A complex type describes event properties
- Calculations introduce additional event properties
- Grouping by one or more event properties
- Aggregation for each event group over a pre-defined period of time, typically a window
- Multiple event groups monitored by the same query
- Correlating event streams
- Checking for absence of activity from a data source
- Enriching events with reference data
- The collection of monitored assets may change over time
We want to make writing and maintaining those queries easy or even effortless.
64. CEP Query Features
Operators over streams:
- Calculations (PROJECT)
- Correlation of streams from different data sources (JOIN)
- Check for absence of activity from a data source (EXISTS)
- Selection of events from streams (FILTER)
- Stream partitioning (GROUP & APPLY)
- Aggregation (SUM, COUNT, …)
- Ranking and heavy hitters (TOP-K)
- Temporal operations: hopping window, sliding window
- Extensibility, to add new domain-specific operators
Queries are written over specific event types, so they can be evaluated on all data sources with the same event type. Support for streaming data, reference data (lookup), and historical data (replay).
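To make the windowing operators concrete, here is a rough Python sketch of GROUP & APPLY combined with a hopping-window aggregate: events are partitioned by key, and an average is computed per key for each fixed-size window that hops forward by a fixed amount (so consecutive windows may overlap). The function name and event representation are illustrative, not StreamInsight's LINQ surface:

```python
# Conceptual sketch: GROUP & APPLY + hopping-window aggregation.
# events: list of (timestamp, key, value); timestamps in seconds.
# (Illustrative Python, not StreamInsight's LINQ query syntax.)
from collections import defaultdict

def hopping_window_avg(events, window_size, hop_size, end):
    results = []  # (window_start, key, average) tuples
    start = 0
    while start < end:
        groups = defaultdict(list)
        for ts, key, value in events:
            if start <= ts < start + window_size:
                groups[key].append(value)        # stream partitioning
        for key, values in sorted(groups.items()):
            results.append((start, key, sum(values) / len(values)))
        start += hop_size                        # windows may overlap
    return results

events = [(5, "a", 10.0), (20, "a", 30.0), (40, "b", 8.0)]
print(hopping_window_avg(events, window_size=60, hop_size=30, end=60))
# [(0, 'a', 20.0), (0, 'b', 8.0), (30, 'b', 8.0)]
```

A sliding window is the limiting case where a new window is evaluated at every event arrival rather than at fixed hop boundaries.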
66. Extensibility
Built-in operators do not cover all functionality:
- Need for domain-specific extensions
- Integrate with functionality available in existing libraries
Support for extensions in the CEP platform:
- User-defined operators, functions, and aggregates
- Code is written in .NET and deployed as a .NET assembly
- Query operators can refer to functionality of the assembly
- LINQ queries can easily refer to user-defined functionality
Temporal snapshot operator framework:
- Interface to implement user-defined operators
- Manages operator state and snapshot changes
- The framework does the heavy lifting to deal with intricate temporal behavior such as out-of-order events
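The extension model can be pictured as registering a domain-specific aggregate that the engine then applies once per window, with the framework (not the extension author) handling the temporal bookkeeping. The sketch below is conceptual Python with made-up names; actual StreamInsight extensions are .NET classes deployed as assemblies:

```python
# Conceptual sketch of a user-defined aggregate (UDA): the platform
# invokes the registered function once per window with that window's
# event values; temporal state management stays in the framework.
# (Hypothetical names; not the StreamInsight extensibility API.)

def register_aggregate(registry, name, fn):
    registry[name] = fn          # analogous to deploying an assembly

def apply_to_window(registry, name, window_values):
    return registry[name](window_values)

# A domain-specific aggregate not covered by SUM/COUNT/AVG:
def midrange(values):
    return (min(values) + max(values)) / 2.0

registry = {}
register_aggregate(registry, "midrange", midrange)
print(apply_to_window(registry, "midrange", [3.0, 9.0, 6.0]))  # 6.0
```

The point of the design is the division of labor: the user supplies only the per-window computation, while out-of-order arrival and snapshot maintenance remain the engine's problem.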
68. CEP Development Environment
- Builds on top of .NET, with Microsoft Visual Studio as the IDE
- Applications are written in C#; queries are written in LINQ
- Compelling story for domain-specific extensions in server applications
Example LINQ query:
var queryFilter = from c in TestEventStream
                  where c.Field1 > 1
                  select c;
73. CEP Query Configuration
- Query binding: coupling adapters with a query
- Event types required by the query need to be matched by the types delivered by the adapters
- A query can be re-used for all data sources of the same type; no changes to the query are necessary for re-use
(Diagram: input adapters feed typed streams into a query template of operators such as project and join, which delivers a typed stream to an output adapter.)
74. Recap: CEP Platform from Microsoft
- A CEP platform from Microsoft to build event-driven applications connecting event sources to event targets
- Event-driven applications are fundamentally different from traditional database applications: queries are continuous, consume and produce streams, and compute results incrementally
- The CEP engine hosts standing queries between input and output adapters and can incorporate static reference data
- Development experience with .NET, C#, LINQ and Visual Studio 2008
- Flexible adapter SDK with high performance to connect to different event sources and sinks
- The CEP platform does the heavy lifting for you to deal with the temporal characteristics of event stream data
75. CEP Deployment Scenarios
- Scenario 1: Custom CEP application, developed with .NET, C#, and LINQ against the CEP engine
- Scenario 2: CEP embedded in an application, i.e., an ISV application shipping with the CEP engine
- Scenario 3: CEP-enabled device, with the CEP engine and reference data embedded on the device
- Scenario 4: Operational intelligence with CEP: CEP engines in an ETL pipeline feeding KPIs and KPI mining, with reference data and a Madison CEP engine
76. CEP Platform Roadmap
Focus on:
- Custom development platform for CEP applications
- CEP platform for Microsoft partners
The TAP (Technology Adopter Program) will launch in July 2009; product availability is targeted for 2010.
Timeline: July 2009, CTP2 and TAP program launch; late 2009, CTP3 while TAP continues; 2010, general availability.
Now more than ever, organizations need to use information management to compete and grow in a difficult market by reducing costs and identifying the highest-value opportunities. As we engage with customers, you tell us about the pressure you are under to deliver more real-time information through rich applications while also reducing costs in this new economy. The converging IT trends of virtually free storage space, the rapid adoption of virtualization, the increasing capabilities of industry-standard hardware, the emergence of the cloud as a deployment option, and the need for real-time business information for all employees through easy-to-use tools are driving the data explosion we see today. You need a complete approach to managing, accessing and delivering information across your organization to accelerate and improve business decisions.

We also understand that at the center of delivering on these business needs there are people: IT professionals, who support the expanding information needs of the business through IT services; developers, who build solutions in time to capture business opportunities in an increasingly competitive market; and business users, who quickly mine volumes of data for business insights to help increase customer satisfaction and drive business results. Microsoft, along with our worldwide partners, is committed to delivering an Information Platform which enables your people and provides you with a complete set of technologies and tools to help you realize more value from your information at the lowest total cost of ownership.
As we invest in the Information Platform we think about what we need to deliver to you in each of these 4 areas:

Mission-critical platform: a mission-critical platform for all your application requirements, delivered with the industry's best TCO. Ensuring your applications and systems are reliable, highly available, secure and deliver superior, predictable performance is a top priority for IT and for Microsoft. At our core, we are focused on delivering an information platform which meets the needs for performance, scalability, availability, and security for your most mission-critical business applications at a lower cost of ownership, including acquisition costs and the ongoing costs of support, management and maintenance. From the server to the datacenter we offer our customers a reliable information platform that grows with their needs at a third of the cost of our enterprise DB competitor.

Secure and available infrastructure: we have seen tremendous growth in our enterprise business, investing significantly in our datacenter capability on the Windows Server platform and building out the industry's most widely deployed database platform. The largest enterprises on the planet are now running their top-tier applications on the Microsoft platform and seeing sustained high performance and uptime that directly benefits their businesses' bottom line.

Scales to all business requirements: with storage costs declining and data volumes increasing, IT departments need the ability to scale up seamlessly to process an increasing number of transactions, as well as to store and process larger volumes of information. Microsoft's Information Platform provides high scalability and performance for OLTP and BI, as well as new data warehousing capabilities that scale to hundreds of terabytes, to deliver on the most critical business needs.
A complete and interoperable platform that empowers IT to be more productive and agile. The boundaries of the IT environment continue to be stretched, and IT professionals have increased responsibility to manage application requirements for the datacenter, across mobile devices and the desktop, and now out in the cloud. To successfully manage these resources across the enterprise, IT professionals require a consistent and productive platform, including easy-to-use and robust tools.

Increased DBA productivity and control: SQL Server has long had a focus on DBA productivity, providing rich management tools that empower administrators to take control of their environment. We continue to invest in tools that enable administrators to manage multiple servers at once via policies and to proactively manage system performance. We're also investing to provide complementary tools for areas such as data integration and master data management.

Consolidation and virtualization: consolidation and virtualization can deliver significant resource and operational benefits in the datacenter, particularly at a time when IT professionals are facing budget pressure along with the need to maintain the same level of service. Together, Windows Server, System Center and SQL Server can deliver a virtualized datacenter, providing increased utilization of server resources to reduce costs, greater standardization to reduce administration overhead, and greater agility to dynamically scale to support changing and growing business needs.

The power of choice, harnessing the cloud: the cloud offers new ways for IT to reduce the friction of connecting with customers and partners, the ability to quickly provision resources, and a way to reduce operational costs.
Microsoft is delivering SQL Azure as part of the Windows Azure Platform to enable IT to manage these resources with a platform and tools that are consistent with their existing SQL Server environment.

Dynamic development: integrated tools to help developers more quickly build rich, intuitive and connected applications. Developers are continually asked to deliver richer applications and services while lowering the time-to-solution to adapt to business needs and opportunities. With Visual Studio, the .NET Framework, and SQL Server, developers have a highly productive platform to deliver data through their applications and to collaborate more efficiently with IT on deployment needs and requirements. The consistent development experience also gives developers the power to build applications for deployment across devices, desktops, the datacenter, and the cloud.

Tight integration across the stack: SQL Server has deep integration with Visual Studio and .NET to provide developers with a rich, seamless development experience. Developers can use familiar tools and the power of a model-driven approach to vastly improve their productivity, by abstracting away from database specifics and instead focusing on the business logic and outcomes required.

Platform for all data: being able to bring unstructured data alongside structured data in a single application gives developers the power to bring the relational model to new types of data. They can extend existing applications and develop new types of applications that incorporate location-aware services and real-time streaming data, all in a secure, synchronized way.

Rich and embeddable user experiences: developers can embed powerful user experiences into their applications through components such as spatial data for rich mapping scenarios, embeddable BI controls and data synchronization.
Looking ahead, developers can integrate value-added cloud services into applications for a differentiated value proposition and additional services opportunities.

Pervasive insight: a complete business intelligence platform that enables users across the organization to derive greater insights through familiar tools. Business intelligence continues to be the #1 priority for CIOs. Even as budgets are cut, IT is being asked to do more to deliver information to the business. Getting the right information to the right person at the right time is critical for success now more than ever. This requires providing access through familiar tools to end users while also providing IT the ability to manage and secure access to the information. By enabling access for all types of users through familiar tools, in a secure and well-managed way, businesses can get more value from their information than ever before. Microsoft helps companies consolidate information, deliver access and empower users at all levels of the business: strategic, tactical and operational decision makers.

Integrate all your data and scale your reporting systems: with SQL Server, you can integrate the many sources of data in your organization into large-scale data warehouses and deliver rich, high-performance reporting to your whole organization.

Enable users to derive insights through familiar tools: increased processing power and memory for desktops and laptops are enabling information workers to do more advanced data mining and analysis to improve decision making. We are extending our capabilities to deliver the power of self-service business intelligence to individual users through the familiar tools they use every day, such as Microsoft Excel.

Enable collaborative decision making: today, organizations have relevant data that lives in the datacenter, on user devices and desktops, and in external services.
It is critical for users to be able to draw upon the wealth of information that is distributed throughout the organization and to do analysis and reporting with the data sources that are most relevant. The real value is that individual insight can be easily shared across work groups and across the organization. With Microsoft Office SharePoint, individual information workers can share and publish insights and analysis tools to scale reach and deliver greater value to the organization.

Closing: looking across the Information Platform, we think about how these capabilities can better support our customers' information needs. For example, a national retailer could use technology to track the demographics, volume and location of customers throughout stores across the country. By processing large volumes of data in real time, they could then evaluate the needs of their customer base at any given moment and make changes to promotions or the store environment, like the background music, to better appeal to the audience and drive real-time purchases. Looking across the Information Platform, the ability to dynamically respond to customer needs could be built by working with your development organization to develop a new application or extend an existing one, partnering with IT to support the application through greater utilization of existing assets, and delivering information to business users through familiar tools. By providing business information pervasively throughout the organization in real time, this organization would be able to much more quickly identify and capture business opportunities.

The Information Platform delivers the rich capabilities of SQL Server and SQL Azure to enable IT, developers and end users to deliver on your most critical business needs and maximize the value of your information throughout the organization at a low cost of ownership.
The purpose of this slide is to communicate the next wave of SQL innovation and how it should be messaged. There are three key areas of innovation: SQL Server 2008 R2, the next-generation release of the SQL Server database platform; Parallel Data Warehouse, appliances for high-scale data warehousing; and SQL Azure, a scalable relational database platform service in the cloud. When you talk to each of these innovations, you want to emphasize that each is built on the solid foundation of SQL Server 2008.

SQL Server 2008 R2 enables businesses to make more timely, better-informed decisions by enabling users to access, integrate, analyze and share information using tools they are already familiar with: Microsoft Office. DBAs can more efficiently manage databases and applications at scale with multi-server management. Enhanced support for virtualization will provide opportunities to further optimize resources through consolidation.

SQL Azure delivers on the Microsoft Data Platform vision of extending SQL Server capabilities to the cloud as web-based services. SQL Azure provides an internet-facing database and advanced query processing services and is an ideal solution for customers building new applications or integrating with existing investments in the cloud. Built on SQL Server technologies, SQL Azure provides the same enterprise-class availability, scalability, and security with the benefits of built-in data protection, self-healing and disaster recovery, as well as a consistent SQL Server programming model and tools. SQL Azure is scheduled for release later this year.

Parallel Data Warehouse is a highly scalable data warehouse appliance. It uses Massively Parallel Processing (MPP) to deliver high performance and scalability on SQL Server 2008, Windows Server 2008 and industry-standard hardware. Parallel Data Warehouse is a separate release/edition but will launch jointly with R2 in the first half of 2010.
The “hero” offering that we'll really sell broadly is SQL Server 2008 R2. You'll notice that the messaging pillars for R2 are aligned with the Trusted, Productive, Intelligent messaging pillars for SQL Server 2008. To provide a comprehensive view of the R2 release (especially for SA discussions), this slide focuses on the key technology investments for each of the pillars.

Trusted, scalable platform: it's important to re-emphasize the enterprise-level security, high availability and scale delivered in SQL Server 2008 for mission-critical applications. What's new to the R2 release is StreamInsight for complex event processing, Master Data Services for data consistency across heterogeneous systems, SQL Server support for up to 256 logical processors for greater flexibility in hardware choice, and enhanced data compression, including Unicode compression, for greater storage efficiency.

IT & developer efficiency: the release-defining scenarios for R2 are virtualization and multi-server management, to increase IT productivity and reduce data management costs. This is enabled by the new manageability enhancements delivered in Management Studio plus R2 support for Windows Server 2008 R2 Hyper-V virtualization and Live Migration. With R2, we introduce a new concept to the market, a single unit of deployment, which will significantly increase the efficiencies around data-tier application deployments and upgrades, resulting in dramatically streamlined consolidation and management efforts.

Managed self-service BI: self-service analytics is delivered with PowerPivot, which is about enabling users to get the data they need without IT having to do a six-month project to create a new application. PowerPivot gives users the power and flexibility to take internal data, integrate it with other information, and, within Excel, do all the slicing and dicing needed to get to the view of information you really need for decision making.
This is all done “in memory” through really high compression in RAM, which is very powerful; none of the other major BI vendors can do this. Users can easily share their applications with others by publishing them to SharePoint.

Once published, a PowerPivot operations dashboard in SharePoint provides IT pros with the ability to manage business intelligence through SharePoint as if it were just another service they provide. This new dashboard allows IT pros to see which reports are being run, who's running them, how often they are run, whether the server is being over-utilized, and how to move the app dynamically to another server. BI is basically another managed object for IT: the reports the users publish are the objects, and we enable IT to understand how they are being used and to apply rules and policies to them.
Your free PASS membership enables access to 12 Virtual Chapters, 24 Hours of PASS broadcasts, local PASS Chapters, professional development resources, newsletters, an events calendar, and much more.

Meet up with like-minded professionals throughout the year at face-to-face PASS Chapter meetings, discuss topical SQL Server issues, share tips and tricks, network, enjoy special guest speakers and get access to all things PASS.

At PASS Summit, meet top industry experts, learn about best practices and effective troubleshooting, and find out how to prevent issues, save money, and build a better SQL Server environment.