Streaming Analytics Patterns for
Your Digital Enterprise
Sriskandarajah Suhothayan
Associate Director/Architect, WSO2
• Introduction to streaming analytics
• About WSO2 Analytics Offering
– WSO2 Stream Processor
• Streaming analytics patterns
• Managing patterns
• Deployment of streaming analytics
Outline
What is
Streaming Analytics?
Software that provides analytical operators to
orchestrate data flow, calculate analytics, and
detect patterns on event data from multiple,
disparate live data sources to allow developers
to build applications that sense, think, and act
in real time.
- Forrester
Streaming Analytics
WSO2 Analytics Offering
Stream Processor Core
Events
JMS, Thrift, SMTP, HTTP, MQTT, Kafka
Analytics Fabric
Complex Event
Processing
Incremental Time Series
Aggregation
Machine
Learning
Extension Store
Financial and Banking
Analytics
Retail Analytics
Location Analytics
Operational Analytics
Smart Energy Analytics
Custom
Analytics
Solutions
...
Solutions
Status Monitoring
Rule
Mgmt.
Dashboard
WSO2 Stream Processor
• Lightweight, lean & cloud native
• Easy to learn Streaming SQL
• Native support for streaming Machine Learning
• Long term aggregations without batch analytics
• High performance analytics with just 2 nodes (HA)
• Highly scalable deployment with exactly-once
processing
• Tools for development, monitoring and business users
Overview of WSO2 Stream Processor
Siddhi App
Single configuration
for Analytics!
Stream Processing
from Sales#window.time(1 hour)
select region, brand, avg(quantity) as AvgQuantity
group by region, brand
insert into LastHourSales ;
Stream
Processor
Siddhi App
{ Siddhi }
Input Streams Output Streams
Filter Aggregate
Join Transform
Pattern
Siddhi Extensions
Streaming SQL
Streaming Analytics Patterns
• To know what stream processing can do!
• To understand the difference between
– database applications & stream processing
• Where to use what?
• Best practices
Why Patterns ?
1. Streaming data pre processing
2. Data store integration
3. Streaming data summarization
4. KPI analysis and alerts
5. Event correlation
6. Trend analysis
7. Real-time prediction
8. Streaming machine learning
Streaming Analytics Patterns
Use Case :
Sweet Factory Management
Use Case :
Sweet Factory Management
• Monitor Supply, Production & Sales
• Optimize resource utilization
• Detect and alert failures
• Predict demand
• Manage processing rules online
• Visualise realtime performance
Use Case : Sweet Factory Management
• Collect events from multiple sources
• Convert them to streams
• Filter events
• Add defaults to missing fields
• Change event stream structure
1. Streaming Data Pre Processing
Filtering, Add Defaults, and Projection
Filter
Transform
Process
Add
Defaults
1. Streaming Data Pre Processing
@app:name('Sweet-Factory-Analytics')
define stream SweetProductionStream(name string, amount double);
Define Stream
1. Streaming Data Pre Processing
@app:name('Sweet-Factory-Analytics')
@source(type = 'mqtt', …, @map(type = 'json', …))
define stream SweetProductionStream(name string, amount double);
Consume Events :
MQTT, HTTP, TCP, Kafka, JMS, RabbitMQ, etc.
Map Events to Streams :
JSON, XML, Text, Binary, WSO2Event, KeyValue, etc.
1. Streaming Data Pre Processing
@app:name('Sweet-Factory-Analytics')
@source(type = 'mqtt', …, @map(type = 'json', …))
define stream SweetProductionStream(name string, amount double);
from SweetProductionStream
select *
insert into LowCandyProductionStream ;
Write Query
1. Streaming Data Pre Processing
@app:name('Sweet-Factory-Analytics')
@source(type = 'mqtt', …, @map(type = 'json', …))
define stream SweetProductionStream(name string, amount double);
from SweetProductionStream[amount < 100 and name == 'candy']
select *
insert into LowCandyProductionStream ;
Filter
1. Streaming Data Pre Processing
@app:name('Sweet-Factory-Analytics')
@source(type = 'mqtt', …, @map(type = 'json', …))
define stream SweetProductionStream(name string, amount double);
from SweetProductionStream[amount < 100 and name == 'candy']
select name, (amount * 0.05) + 5 as cost, 'GBP' as currency
insert into LowCandyProductionCostStream ;
Transformation and Defaults
1. Streaming Data Pre Processing
@app:name('Sweet-Factory-Analytics')
@source(type = 'mqtt', …, @map(type = 'json', …))
define stream SweetProductionStream(name string, amount double);
from SweetProductionStream[amount < 100 and name == 'candy']
select name, calculateCost(amount, name) as cost, 'GBP' as currency
insert into LowCandyProductionCostStream ;
Functions :
Inbuilt, Custom UDF or Siddhi Extensions
• Use incoming events to perform
operations on data stores
• Optimizations with
Primary and Indexing Keys
2. Data Store Integration
Store, Retrieve and Modify
• Update
• Contains
• Search
• Insert
• Delete
2. Data Store Integration
define stream SweetProductionStream(name string, amount double);
define table ProductionTable(id string, name string, amount double);
Define Table (In-Memory)
2. Data Store Integration
define stream SweetProductionStream(name string, amount double);
@primaryKey('id')
@Index('amount')
define table ProductionTable(id string, name string, amount double);
Index and Primary Keys
2. Data Store Integration
define stream SweetProductionStream(name string, amount double);
@store(type='rdbms', … )
@primaryKey('id')
@Index('amount')
define table ProductionTable(id string, name string, amount double);
Table Backed by:
RDBMS, MongoDB, HBase, Cassandra, Solr, Hazelcast, etc.
2. Data Store Integration
define stream SweetProductionStream(name string, amount double);
@store(type='rdbms', … )
@primaryKey('id')
@Index('amount')
define table ProductionTable(id string, name string, amount double);
from SweetProductionStream
select UUID() as id, name, amount
insert into ProductionTable; Insert into Table
2. Data Store Integration
define stream SweetProductionStream(name string, amount double);
@store(type=‘rdbms’, … )
@primaryKey(‘id’)
@Index(amount)
define table ProductionTable(id string, name string, amount double);
from SweetProductionStream
select UUID() as id, name, amount
update or insert into ProductionTable
set ProductionTable.amount = amount
on ProductionTable.name == name;
Update or Insert into Table
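Beyond inserts and updates, a table can also be joined with a stream to enrich events in flight. A minimal sketch against the same stream and table definitions (the aliases and the output stream name here are illustrative, not from the deck):

```sql
from SweetProductionStream as s
join ProductionTable as t
    on s.name == t.name
select s.name, s.amount, t.amount as storedAmount
insert into EnrichedProductionStream ;
```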
• Sum, Count, Min, Max, etc.
within the
– last 5 minutes
– last 20 events
3. Streaming Data Summarization
Aggregations Over Short Time Periods
3. Streaming Data Summarization
define stream SweetProductionStream(name string, amount double);
from SweetProductionStream#window.time(1 min)
select name, sum(amount) as totalSweets, currentTimeMillis() as timestamp
group by name
insert into LastMinProdStream;
Windows Sliding and Batch for Time, Length, etc.
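The bullets above also mention summarizing the last 20 events; a length window covers that case. A sketch on the same stream (the output stream name is illustrative):

```sql
from SweetProductionStream#window.length(20)
select name, avg(amount) as avgAmount
group by name
insert into Last20EventsStream ;
```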
• Incremental Aggregation for every
– Seconds, Minutes, Hours, Days, …, Year
• Support for out-of-order event arrival
• Fast data retrieval from memory and disk for
realtime update
3. Streaming Data Summarization
Aggregations Over Long Time Periods
[Diagram: incremental aggregation buckets at second, minute, and hour granularity, with the current minute rolling up into the current hour]
3. Streaming Data Summarization
Aggregations Over Long Time Periods
define stream SweetProductionStream(name string, amount double);
define aggregation ProductionAggregation
from SweetProductionStream
select name, sum(amount) as totalSweets, count(*) as noOfPacks
group by name
aggregate every seconds ... years ;
Predefined Aggregation
3. Streaming Data Summarization
Aggregations Over Long Time Periods
from ProductionAggregation
select name, totalSweets, noOfPacks
on name == 'candy'
within 2017/01/01 2017/02/01
per day ;
Data Retrieval for Dashboards
• Generate dashboard and widgets
• Fine grained permissions
– Dashboard level
– Widget level
– Data level
• Localization support
• Inter widget communication
• Shareable dashboards with widget
state persistence
Dashboard for
Business Users
• Identify KPIs using
– Filter, ifThenElse, having, etc.
• Send alerts using Sinks
4. KPI Analysis and Alerts
Generate Alerts Based on KPIs
4. KPI Analysis and Alerts
define stream LastMinProdStream (name string, totalSweets long,
timestamp long);
@sink(type='email', to='manager@sf.com',
@map(type='text',
@payload('''
Low Production of {{name}} at factory {{factoryId}}.
''')))
define stream LowProdAlertStream (name string, factoryId int, ...);
from LastMinProdStream[totalSweets < 5000]
insert into LowProdAlertStream;
Publishing with Mapping via :
Email, HTTP, TCP, Kafka, RabbitMQ, MQTT, etc.
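The KPI bullets mention ifThenElse; instead of filtering events out, it can tag every event with a status. A sketch (the threshold and stream names are illustrative):

```sql
from LastMinProdStream
select name, totalSweets,
       ifThenElse(totalSweets < 5000, 'LOW', 'NORMAL') as status
insert into ProdStatusStream ;
```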
• Identify complex patterns
– Followed by, non-occurrence, etc.
• Identify trends
– Peak, triple bottom, etc.
Event Correlation and Trend Analysis
Complex Event Processing
5. Event Correlation
Patterns
define stream RawMaterialStream (name string, amount double);
define stream ProductionInputStream (name string, amount double);
from every (e1 = RawMaterialStream)
-> not ProductionInputStream[name == e1.name and
amount == e1.amount] for 15 min
select e1.name, e1.amount
insert into ProductionStartDelayed ;
Detecting the non-occurrence of an event
6. Trend Analysis
Sequences
define stream LastMinProdStream (name string, totalSweets long,
timestamp long);
partition with (name of LastMinProdStream)
begin
from every e1=LastMinProdStream,
e2=LastMinProdStream[timestamp - e1.timestamp < 10
and e1.totalSweets > totalSweets]*,
e3=LastMinProdStream[timestamp - e1.timestamp > 10
and e2[last].totalSweets > totalSweets]
select e1.name, e1.totalSweets as initialAmount, e3.totalSweets as finalAmount
insert into ContinuousProdReductionStream ;
end;
Identify decreasing
trend for 10 mins
• Import Machine Learning Models
for realtime predictions
– PMML, TensorFlow, etc
• Streaming Machine Learning
– Clustering, Classification, Regression
– Markov models, anomaly detection, etc.
Realtime Learning and Predictions
Machine Learning
define stream SugarSyrupDataStream (temperature double, density double);
from SugarSyrupDataStream#pmml:predict("/home/user/ml.model", "string")
select *
insert into PredictionStream ;
7. Real-Time Predictions
Run ML Models In Realtime
define stream SugarSyrupDataStream (temperature double, density double);
define stream SugarSyrupResultStream (temperature double, density double,
decision string);
from SugarSyrupResultStream
#streamingml:hoeffdingTreeTrain('Model', temperature, density, decision)
...
from SugarSyrupDataStream
#streamingml:hoeffdingTreeClassifier('Model', temperature, density)
...
8. Streaming Machine Learning
Continuous Learning and Feedback
Online Training !
Online Prediction !
Managing Patterns
Business
Rules for
Non-Technical
Users
Define your own
Business Rules from
scratch
Modify templated
complex Business
Rules using rule
parameters
Editor
Debugger
Simulation
Testing
All in one
Development
Studio
for Siddhi Apps
Development
Studio
Editing
Simulation
Debugging
Testing
All in one
Streaming Analytics
Deployment
• High Performance
– Process around 100k events/sec
– Just 2 nodes
– While most others need 5+
• Zero Downtime
• Zero Event Loss
• Simple deployment with RDBMS
– No ZooKeeper, Kafka, etc.
• Multi Data Center Support
Stream Processor
Stream Processor
Minimum HA With 2 Nodes
Event Sources
Event
Store
Dashboard
Notification
Invocation
Data Source
Siddhi App
Siddhi App
Siddhi App
Siddhi App
Siddhi App
Siddhi App
• Exactly-once processing
• Fault tolerance
• Highly scalable
• No backpressure
• Distributed development configurations via
annotations
• Pluggable distribution options (YARN, K8s, etc.)
Distributed Deployment with Kafka
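The annotation-driven distribution mentioned above works by marking queries with @dist, which assigns each query to an execution group with a parallelism count. A sketch (the group names and parallelism values here are illustrative):

```sql
@dist(parallel = '3', execGroup = 'filtering')
from SweetProductionStream[amount < 100]
select *
insert into FilteredStream ;

@dist(parallel = '1', execGroup = 'aggregation')
from FilteredStream#window.timeBatch(1 min)
select name, sum(amount) as total
group by name
insert into TotalProductionStream ;
```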
Stream Processor
Stream Processor
Distributed Deployment with Kafka ...
Siddhi App
Siddhi App
Kafka
Event Sources
Siddhi App
Event
Store
Dashboard
Notification
Invocation
Data Source
Siddhi App
Siddhi App
Stream Processor
Siddhi App
Worker
Siddhi App
Siddhi App
Siddhi App
Stream Processor
Job Manager
Worker
Worker
Monitor Apps, Nodes and Clusters
• Gives you everything you need to build
Streaming Analytics
– Manage data streams
– Powerful Streaming SQL language
– Dashboards and more
• Can process 100K+ events per second with a
two-node HA deployment (most alternatives need
5+ nodes) and can scale further on top of Kafka
WSO2 Stream Processor
• What streaming analytics is
• Patterns of streaming analytics
• How WSO2 Stream Processor can help you implement
analytics patterns
• The business benefits of manageable rules and dashboards
• The ease of development, multi-mode deployment and
monitoring with WSO2 Stream Processor
Summary
wso2.com

[WSO2Con EU 2017] Streaming Analytics Patterns for Your Digital Enterprise
