Vincenzo Ferme
Faculty of Informatics
USI Lugano, Switzerland
WORKFLOW ENGINE
PERFORMANCE BENCHMARKING
WITH BENCHFLOW
Vincenzo Ferme
2
The BenchFlow Project
"Design and implement the first benchmark to assess and compare the performance of WfMSs that are compliant with the Business Process Model and Notation 2.0 standard."
3
What is a Workflow Management System?
[Diagram: WfMS users and applications submit work to the WfMS; process instances (A, B, C, D) run inside an application server, persist their state in an instance database through a DBMS, and invoke external web services.]
4
Many Vendors of BPMN 2.0 WfMSs
https://en.wikipedia.org/wiki/List_of_BPMN_2.0_engines
Timeline: Aug 2009 — BPMN 2.0 beta; Jan 2011 — BPMN 2.0; Jan 2014 — BPMN 2.0.2 (ISO/IEC 19510)
[Bar chart: number of BPMN 2.0 WfMSs by year of the first version supporting BPMN 2.0 (2009–2016); grand total of 23 by 2016.]
5
Why do We Need a Benchmark?
end-users, vendors, developers
1. How to choose the best WfMS according to the company's technical requirements?
2. How to choose the best WfMS according to the company's business process models (workflows)?
3. How to evaluate performance improvements during the WfMS's development?
4. How to identify the WfMS's bottlenecks?
6
The BenchFlow Benchmarking Process
Input/Process/Output Model
• Input — Workload Model: Workload Mix (e.g., 80% of one process model, 20% of another), Load Functions, Test Data; WfMS Configurations; Test Types
• Process — Measure: Containers, Performance Data
• Output — Workload Metrics and KPIs: e.g., Process Instance Duration, Throughput
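The workload mix above amounts to a weighted draw over the deployed process models. The following is an illustrative Python sketch, not BenchFlow code; the model names are invented, and only the 80%/20% split comes from the slide.

```python
import random

# Illustrative sketch (not BenchFlow code): draw which process model to start
# next according to a workload mix (model name -> share of started instances).
def sample_workload(mix, n, rng):
    """Return n process-model names drawn according to the given mix."""
    models = list(mix)
    weights = [mix[m] for m in models]
    return [rng.choices(models, weights=weights)[0] for _ in range(n)]

rng = random.Random(42)
mix = {"process_ABC": 0.8, "process_D": 0.2}  # hypothetical names, 80%/20% mix
sample = sample_workload(mix, 1000, rng)
print(round(sample.count("process_ABC") / 1000, 2))  # close to 0.8
```

Over a long run, the observed share of each model converges to its configured weight.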
7
Container Based Methodology and Framework (Methodology + Framework)

Figure 4: Benchmarking Methodology Choreography
1. BenchFlow provides the benchmarking methodology → Benchmarking Methodology Agreement Proposal
2. The vendor agrees on adding the vendor's WfMS to the benchmark → Signed Agreement
3. BenchFlow issues a Containerized WfMS Request; the vendor provides a containerized distribution of the WfMS → Containerized WfMS
4. BenchFlow provides draft benchmark results → Results Verification Request
5. The vendor verifies the benchmark results → Results Verification Outcome
6. Are the results valid? If not valid, return to step 4; if valid, BenchFlow publishes the Verified Benchmark Results to the community

[Excerpt from the paper:] …pending on the WfMS's architecture. The DBMS Container can refer to an existing publicly available Container distribution. The containerized WfMS should be publicly available (e.g., at the Docker Hub registry), or the BenchFlow team should be granted access to a private registry used by the vendor. The same applies to the Containers' definition file, i.e., the Dockerfile (Turnbull, 2014, ch. 4). While private registries are a solution that can work with vendors of closed-source systems, they impact the reproducibility of the results. For each WfMS version to be included in the benchmark, there must be a default configuration, i.e., the configuration in which the WfMS Containers start without modifying any configuration parameters, except the ones required to correctly start the WfMS, e.g., the database connection. However, if vendors want to benchmark the performance of their WfMS with different configurations, for example, the configuration provided to users as a "getting started" configuration, or production-grade configurations for real-life usage, they can also provide different configurations. To do that, the Containers must allow issuing the WfMS configurations through additional environment variables and/or by accessing the configuration files. Exposing the volumes allows access, on the host operating system, to the files defined inside the Containers. Precisely, the WES Container has to, at least, enable the configuration of: 1) the used DB, i.e., DB driver, URL, username, and password for the connection; 2) the WES itself; and 3) the logging level of the WES and of the application stack layers on which it may depend. Alternatively, instead of providing configuration details, the vendors can provide different Containers for different configurations. However, even in that case enabling the DB configuration is required.
In order to access relevant data, all WfMS Containers have to specify the volumes to access the WfMS log files, and to access all the data useful to set up the system. For instance, if the WfMS defines examples as part of the deployment, we may want to remove those examples by overriding the volume containing them. Moreover, the WES Container (or the DBMS Container) has to create the WfMS's DB and populate it with data required to run the WfMS. The vendors need to provide authentication configuration details of the WfMS components (e.g., the user…

BenchFlow Framework architecture
[Diagram: Faban harness and drivers exercise the WfMS (web service, instance database, DBMS) deployed in Containers on Servers; a Monitor and Collectors (with Adapters) gather data into Minio; Data Transformers and Analysers compute Performance Metrics and Performance KPIs across Test Execution Analyses.]
Provides Great Advantages for Automation
and Reproducibility of Results, while
Ensuring Negligible Performance Impact
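The methodology's configuration requirements (DB connection via environment variables, logging level, volumes exposing log files) can be pictured with a small sketch that assembles a `docker run` invocation. Everything here is hypothetical: the image name, environment variable names, and container paths are invented for illustration, not taken from any vendor's actual container.

```python
# Hypothetical sketch of the configuration surface the methodology requires:
# a WfMS container started with DB settings passed as environment variables
# and a volume exposing its log files to the host.
def wfms_run_command(image, db_url, db_user, db_password, logs_hostdir):
    """Build a docker run command line (as an argument list)."""
    return [
        "docker", "run", "-d",
        "-e", f"DB_URL={db_url}",            # 1) DB connection
        "-e", f"DB_USER={db_user}",
        "-e", f"DB_PASSWORD={db_password}",
        "-e", "LOG_LEVEL=INFO",              # 3) logging level
        "-v", f"{logs_hostdir}:/wfms/logs",  # volume exposing log files
        image,
    ]

cmd = wfms_run_command("vendor/wfms:1.0", "jdbc:mysql://db:3306/wfms",
                       "wfms", "secret", "/tmp/wfms-logs")
print(" ".join(cmd))
```

Providing the same knobs through a different mechanism (configuration files on an exposed volume, or one container per configuration) satisfies the methodology equally, as long as the DB configuration stays adjustable.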
8
[CLOSER '16]
Container-centric Methodology for Benchmarking Workflow Management Systems.
9
BenchFlow Benchmarking Methodology
requirements from the WfMS
Availability of APIs
• Deploy Process
• Start Process Instance
• Users API
• Web Service APIs
• Events APIs
Availability of Timing Data
• Workflow & Construct: Start Time, End Time, [Duration]
Availability of Execution State
• State of the workflow execution, e.g., running, completed, error
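The timing and execution-state requirements can be captured in a small record like the following sketch. The names are hypothetical, not BenchFlow's data model; it also shows why the bracketed [Duration] is optional: it can be derived from the start and end times.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

# Hypothetical record; BenchFlow's actual data model is not shown in the talk.
@dataclass
class WorkflowExecution:
    state: str                      # e.g., "running", "completed", "error"
    start_time: datetime
    end_time: Optional[datetime] = None

    def duration(self) -> Optional[float]:
        """Derive the duration (seconds) when the WfMS reports only
        start/end timestamps and no explicit duration."""
        if self.end_time is None:
            return None
        return (self.end_time - self.start_time).total_seconds()

run = WorkflowExecution("completed",
                        datetime(2016, 1, 1, 12, 0, 0),
                        datetime(2016, 1, 1, 12, 0, 3))
print(run.duration())  # 3.0
```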
10
BenchFlow Framework
architecture — benchflow [BPM '15] [ICPE '16]
[Diagram: Faban harness and drivers exercise the WfMS (web service, instance database, DBMS) deployed in Containers on Servers; a Monitor and Collectors (with Adapters) gather data into Minio; Data Transformers and Analysers compute Performance Metrics and Performance KPIs across Test Execution Analyses.]
11
BenchFlow Framework
WfMSs' specific characteristics
• Manages the automatic WfMS deployment;
• Provides a "plugin" mechanism to add new WfMSs;
• Performs automatic deployment of process models;
• Collects client- and server-side data and metrics;
• Automatically computes performance metrics and statistics.
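The "plugin" idea above can be sketched as a small adapter interface that every supported WfMS implements, so the framework can drive any engine the same way. The interface and method names below are illustrative assumptions, not BenchFlow's actual API.

```python
from abc import ABC, abstractmethod

# Hypothetical sketch of a WfMS plugin interface (names are invented).
class WfMSPlugin(ABC):
    @abstractmethod
    def deploy(self, model_path: str) -> str:
        """Deploy a BPMN 2.0 process model; return its definition id."""

    @abstractmethod
    def start_instance(self, definition_id: str) -> str:
        """Start a process instance; return its instance id."""

class InMemoryWfMS(WfMSPlugin):
    """Toy implementation, only to show that the framework can drive
    any engine through the same interface."""
    def __init__(self):
        self.deployed, self.instances = {}, []

    def deploy(self, model_path):
        definition_id = f"def-{len(self.deployed)}"
        self.deployed[definition_id] = model_path
        return definition_id

    def start_instance(self, definition_id):
        instance_id = f"inst-{len(self.instances)}"
        self.instances.append((instance_id, definition_id))
        return instance_id

engine = InMemoryWfMS()
d = engine.deploy("sequence.bpmn")
print(engine.start_instance(d))  # inst-0
```

Adding a new WfMS then means writing one adapter class against the engine's deploy/start APIs (listed in the requirements slide) rather than changing the framework.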
12
Applications of the Methodology and Framework
[CAiSE '16] Micro-Benchmarking BPMN 2.0 Workflow Management Systems with Workflow Patterns.
[Overview: 3 WfMSs, each backed by MySQL; a workload of process models with empty script tasks; a load function ramping from 0 to 1500 over a 10:00 run.]
Metrics: Engine Level, Process Level, Environment
13
Micro-Benchmarking with Workflow Patterns
research questions
Selected three WfMSs: popular open-source WfMSs actually used in industry
1. What is the impact of individual workflow patterns, or a mix of them, on the performance of each of the benchmarked BPMN 2.0 WfMSs?
2. Are there performance bottlenecks in the selected WfMSs?
14
Micro-Benchmarking with Workflow Patterns
workload mix
Sequence Pattern [SEQ]: Empty Script 1 → Empty Script 2
Exclusive Choice and Simple Merge [EXC]: generate number → (case_1: Empty Script 1 | case_2: Empty Script 2) → merge
Pattern Reference: N. Russell et al., Workflow Control-Flow Patterns: A Revised View, 2006
15
Micro-Benchmarking with Workflow Patterns
workload mix
Arbitrary Cycle [CYC]: generate 1 or 2 → (case_1: Empty Script 1 | case_2: Empty Script 2) → i++; loop back while i < 10, exit when i >= 10
Pattern Reference: N. Russell et al., Workflow Control-Flow Patterns: A Revised View, 2006
16
Micro-Benchmarking with Workflow Patterns
workload mix
Parallel Split and Synchronisation [PAR]: Empty Script 1 and Empty Script 2 in parallel, then synchronised
Explicit Termination Pattern [EXT]: Empty Script 1 in parallel with Wait 5 Sec, ending with a terminate event
Pattern Reference: N. Russell et al., Workflow Control-Flow Patterns: A Revised View, 2006
17
Micro-Benchmarking with Workflow Patterns
workload mix design decisions
1. Maximise the simplicity of the model expressing the workflow pattern
2. Omit interactions with external systems by implementing all tasks as script tasks
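The control flow the [CYC] model encodes can be simulated in a few lines. This is a plain-Python sketch of the pattern's logic (not benchmark code): an exclusive choice between two no-op script tasks inside a loop that increments a counter until it reaches 10.

```python
import random

def empty_script():
    """Stand-in for the benchmark's no-op script tasks."""
    pass

def run_cyc_instance(rng):
    """Simulate one instance of the Arbitrary Cycle [CYC] pattern:
    repeatedly choose between two empty script tasks ("generate 1 or 2"),
    incrementing i each iteration, until i >= 10."""
    i, iterations = 0, 0
    while i < 10:
        if rng.choice([1, 2]) == 1:
            empty_script()  # case_1
        else:
            empty_script()  # case_2
        i += 1
        iterations += 1
    return iterations

print(run_cyc_instance(random.Random(0)))  # always 10 iterations
```

Because every iteration increments `i`, each CYC instance executes exactly ten script tasks, roughly five times the work of a SEQ instance, which helps explain CYC's lower throughput in the results that follow.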
18
Micro-Benchmarking with Workflow Patterns
load function
Test Type: Load test
[Chart: number of concurrent Instance Producers (0–1500) over time (0:00–10:00); example for u=1500.]
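A load function like the one charted above can be sketched as a ramp to u concurrent Instance Producers followed by a plateau. The ramp duration and shape here are assumptions for illustration; the talk does not specify the exact ramp used.

```python
def instance_producers(t_sec, u=1500, ramp_sec=60, duration_sec=600):
    """Hypothetical load function: ramp linearly from 0 to u concurrent
    Instance Producers during ramp_sec, then hold u until the run ends.
    (The actual ramp shape used in the experiments is not specified here.)"""
    if t_sec < 0 or t_sec > duration_sec:
        return 0
    if t_sec < ramp_sec:
        return int(u * t_sec / ramp_sec)
    return u

print(instance_producers(30))   # mid-ramp: 750
print(instance_producers(300))  # plateau: 1500
```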
19
Micro-Benchmarking with Workflow Patterns
configurations: WfMS environments
                 WfMS A              WfMS B              WfMS C
O.S.:            Ubuntu 14.04.01     Ubuntu 14.04.01     Ubuntu 14.04.01
J.V.M.:          Oracle Serv. 7u79   Oracle Serv. 7u79   Oracle Serv. 7u79
App. Server:     Ap. Tomcat 7.0.62   Ap. Tomcat 7.0.62   Wildfly 8.1.0.Final
DBMS: MySQL Community Server 5.6.26 (all)
Max Java Heap: 32GB; Max DB Connections Number: 100
WfMS's Containers: official ones, if available on DockerHub
WfMS's Configuration: as suggested by vendor's documentation
20
Micro-Benchmarking with Workflow Patterns
configurations: WfMS deployment
TEST ENVIRONMENT (machines connected at 10 Gbit/s)
Load Drivers: CPU 64 cores @ 1400 MHz, RAM 128 GB
WfMS:         CPU 12 cores @ 800 MHz,  RAM 64 GB
DBMS:         CPU 64 cores @ 2300 MHz, RAM 128 GB
O.S.: Ubuntu 14.04.3 LTS on all machines; Docker 1.8.2 on all machines
21
Micro-Benchmarking with Workflow Patterns
performed experiments
Three runs for each experiment; data collected in the first minute of execution is not included in the analysis
Individual patterns
1. 1500 (max) concurrent Instance Producers starting each pattern [SEQ, EXC, CYC, EXT, PAR]
Mix of patterns
2. 1500 concurrent Instance Producers starting an equal number of pattern instances (20% mix of [SEQ, EXC, CYC, EXT, PAR])
22
Computed Metrics and Statistics
engine level metrics
• Number of Process Instances
• Throughput
• Execution Time
• Database Size
[Diagram: metrics computed over the whole WfMS — load driver, application server, process instances (A–D), instance database/DBMS, web services.]
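Two of the engine-level metrics above can be computed directly from the start/end timestamps of completed process instances. This is a minimal illustrative sketch, not BenchFlow's actual analysers.

```python
from datetime import datetime, timedelta

def engine_metrics(instances):
    """Compute the number of process instances and the throughput
    (instances/sec) from (start, end) timestamps of completed instances.
    Illustrative sketch, not BenchFlow's implementation."""
    starts = [s for s, _ in instances]
    ends = [e for _, e in instances]
    span = (max(ends) - min(starts)).total_seconds()
    return {"instances": len(instances),
            "throughput": len(instances) / span if span > 0 else float("inf")}

t0 = datetime(2016, 1, 1)
sample = [(t0, t0 + timedelta(seconds=2)),
          (t0 + timedelta(seconds=1), t0 + timedelta(seconds=4))]
print(engine_metrics(sample))  # {'instances': 2, 'throughput': 0.5}
```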
23
Micro-Benchmarking with Workflow Patterns
individual patterns: throughput (bp/sec)
            SEQ       EXC       EXT       PAR       CYC
WfMS A    1456.79   1417.17   1455.68   1433.12    327.99
WfMS B    1447.66   1436.03   1426.35   1429.65    644.02
WfMS C      63.31     49.04      0.38     48.89     13.46
(u=600, u=800, u=1500)
24
Micro-Benchmarking with Workflow Patterns
individual patterns: WfMS C scalability for the SEQ pattern
WfMS C uses a synchronous instantiation API: load drivers must wait for workflows to end before starting new ones
[Charts: response time and throughput vs. u Instance Producers (500–2000).]
We have found a bottleneck! (~60 bp/sec)
25
Computed Metrics and Statistics
process level metrics
• Throughput, per process definition
• Instance Duration, per process definition
[Diagram: metrics computed for each process definition inside the WfMS.]
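Per-process-definition metrics boil down to grouping completed instances by their definition and aggregating. The following is a minimal sketch (not BenchFlow's implementation) for the mean instance duration; the definition names and durations are made up.

```python
from statistics import mean

def per_definition_mean_duration(instances):
    """Group (definition_name, duration_seconds) pairs by definition and
    compute the mean instance duration for each. Illustrative sketch only."""
    by_def = {}
    for definition, duration in instances:
        by_def.setdefault(definition, []).append(duration)
    return {d: mean(v) for d, v in by_def.items()}

data = [("SEQ", 0.010), ("SEQ", 0.030), ("CYC", 0.200), ("CYC", 0.400)]
print(per_definition_mean_duration(data))  # SEQ ~0.02 s, CYC ~0.3 s
```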
26
Micro-Benchmarking with Workflow Patterns
individual patterns: mean (instance duration time)
[Charts of mean instance duration for SEQ (Empty Script 1 → Empty Script 2), EXT (Empty Script 1 in parallel with Wait 5 Sec), PAR (Empty Script 1 and Empty Script 2 in parallel), and CYC (generate 1 or 2, loop while i < 10); CYC Instance Producers: u=600.]
27
Computed Metrics and Statistics
environment metrics
• CPU/RAM usage per second
• IO operations/network usage per second
[Diagram: metrics collected from the environment hosting the WfMS and the DBMS.]
28
Micro-Benchmarking with Workflow Patterns
individual patterns: mean (RAM) and mean (CPU) usage
[Charts: RAM and CPU usage for each WfMS.]
29
Micro-Benchmarking with Workflow Patterns
performed experiments (recap)
1. Individual patterns: 1500 (max) concurrent Instance Producers starting each pattern [SEQ, EXC, CYC, EXT, PAR]
2. Mix of patterns: 1500 concurrent Instance Producers starting an equal number of pattern instances (20% mix of [SEQ, EXC, CYC, EXT, PAR])
Three runs for each experiment; data collected in the first minute of execution is not included in the analysis
30
Micro-Benchmarking with Workflow Patterns
mix of patterns: duration, RAM, CPU
31
Micro-Benchmarking with Workflow Patterns
summary
Even though the executed workflows are simple:
• There are relevant differences in workflow duration and throughput among the WfMSs
• There are relevant differences in resource usage (RAM, CPU) among the WfMSs
• One WfMS has bottlenecks
Highlights
• Benchmarking Process — the BenchFlow input/process/output model: workload model (workload mix, load functions, test data), WfMS configurations, and test types drive containerized measurement producing performance data, metrics, and KPIs
• Benchmarking Methodology — the vendor/BenchFlow choreography for containerized WfMS distributions and result verification [CLOSER '16]
• BenchFlow Framework — the Faban-based harness with monitor, collectors, data transformers, and analysers computing performance metrics and KPIs [BPM '15] [ICPE '16]
• Micro-Benchmark with Workflow Patterns — mean instance duration for SEQ, EXT, PAR, and CYC
33
Call for Collaboration
WfMSs, process models, process logs
Process Models
• We want to characterise the Workload Mix using real-world process models
• Share your executable BPMN 2.0 process models, even anonymised
Execution Logs
• We want to characterise the Load Functions using real-world behaviours
• Share your execution logs, even anonymised
WfMSs
• We want to add more and more WfMSs to the benchmark
• Contact us for collaboration, and BenchFlow framework support
benchflow
vincenzo.ferme@usi.ch
http://benchflow.inf.usi.ch
WORKFLOW ENGINE PERFORMANCE BENCHMARKING WITH BENCHFLOW
Vincenzo Ferme
Faculty of Informatics
USI Lugano, Switzerland
BACKUP SLIDES
Vincenzo Ferme
Faculty of Informatics
USI Lugano, Switzerland
36
Published Work
[BTW '15] C. Pautasso, V. Ferme, D. Roller, F. Leymann, and M. Skouradaki. Towards Workflow Benchmarking: Open Research Challenges. In Proc. of the 16th Conference on Database Systems for Business, Technology, and Web, BTW 2015, pages 331–350, 2015.
[SSP '14] M. Skouradaki, D. H. Roller, F. Leymann, V. Ferme, and C. Pautasso. Technical Open Challenges on Benchmarking Workflow Management Systems. In Proc. of the 2014 Symposium on Software Performance, SSP 2014, pages 105–112, 2014.
[ICPE '15] M. Skouradaki, D. H. Roller, F. Leymann, V. Ferme, and C. Pautasso. On the Road to Benchmarking BPMN 2.0 Workflow Engines. In Proc. of the 6th ACM/SPEC International Conference on Performance Engineering, ICPE '15, pages 301–304, 2015.
37
Published Work
[CLOSER '15] M. Skouradaki, V. Ferme, F. Leymann, C. Pautasso, and D. H. Roller. "BPELanon": Protect Business Processes on the Cloud. In Proc. of the 5th International Conference on Cloud Computing and Service Science, CLOSER 2015. SciTePress, 2015.
[SOSE '15] M. Skouradaki, K. Goerlach, M. Hahn, and F. Leymann. Application of Sub-Graph Isomorphism to Extract Reoccurring Structures from BPMN 2.0 Process Models. In Proc. of the 9th International IEEE Symposium on Service-Oriented System Engineering, SOSE 2015, 2015.
[BPM '15] V. Ferme, A. Ivanchikj, C. Pautasso. A Framework for Benchmarking BPMN 2.0 Workflow Management Systems. In Proc. of the 13th International Conference on Business Process Management, BPM '15, pages 251–259, 2015.
Vincenzo Ferme
38
Published Work
[BPMD ’15]
A. Ivanchikj, V. Ferme, C. Pautasso. BPMeter: Web Service and Application for Static
Analysis of BPMN 2.0 Collections. In Proc. of the 13th International Conference on
Business Process Management [Demo], BPM ’15, pages 30-34, 2015.
[ICPE ’16]
V. Ferme, and C. Pautasso. Integrating Faban with Docker for Performance
Benchmarking. In Proc. of the 7th ACM/SPEC International Conference on Performance
Engineering, ICPE ’16, 2016.
[CLOSER ’16]
V. Ferme, A. Ivanchikj, C. Pautasso, M. Skouradaki, F. Leymann. A Container-centric
Methodology for Benchmarking Workflow Management Systems. In Proc. of the
6th International Conference on Cloud Computing and Service Science, CLOSER 2016.
SciTePress, 2016.
Vincenzo Ferme
[CAiSE ’16]
M. Skouradaki, V. Ferme, C. Pautasso, F. Leymann, A. van Hoorn. Micro-Benchmarking
BPMN 2.0 Workflow Management Systems with Workflow Patterns. In Proc. of
the 28th International Conference on Advanced Information Systems Engineering, CAiSE
’16, 2016.
39
Published Work
[ICWE ’16]
J. Cito, V. Ferme, H.C. Gall. Using Docker Containers to Improve Reproducibility
in Software and Web Engineering Research. In Proc. of the 16th International
Conference on Web Engineering, 2016.
[ICWS ’16]
M. Skouradaki, V. Andrikopoulos, F. Leymann. Representative BPMN 2.0 Process Model
Generation from Recurring Structures. In Proc. of the 23rd IEEE International
Conference on Web Services, ICWS '16, 2016.
Vincenzo Ferme
40
Published Work
[SummerSOC ’16]
M. Skouradaki, T. Azad, U. Breitenbücher, O. Kopp, F. Leymann. A Decision Support
System for the Performance Benchmarking of Workflow Management Systems.
In Proc. of the 10th Symposium and Summer School On Service-Oriented Computing,
SummerSOC '16, 2016.
[OTM ’16]
M. Skouradaki, V. Andrikopoulos, O. Kopp, F. Leymann. RoSE: Reoccurring Structures
Detection in BPMN 2.0 Process Models Collections. In Proc. of On the Move to
Meaningful Internet Systems Conference, OTM '16, 2016. (to appear)
[BPM Forum ’16]
V. Ferme, A. Ivanchikj, C. Pautasso. Estimating the Cost for Executing Business
Processes in the Cloud. In Proc. of the 14th International Conference on Business
Process Management, BPM Forum ’16, 2016. (to appear)
Vincenzo Ferme
41
Docker Performance
[IBM ’14]
W. Felter, A. Ferreira, R. Rajamony, and J. Rubio. An updated performance comparison
of virtual machines and Linux containers. IBM Research Report, 2014.
“Although containers themselves have almost no overhead, Docker is not without performance gotchas. Docker volumes have noticeably better performance than files stored in AUFS. Docker’s NAT also introduces overhead for workloads with high packet rates. These features represent a tradeoff between ease of management and performance and should be considered on a case-by-case basis.”
“Our results show that containers result in equal or better performance than VMs in almost all cases.”
BenchFlow Configures Docker for Performance by Default
Vincenzo Ferme
42
Benchmarking Requirements
• Relevant
• Representative
• Portable
• Scalable
• Simple
• Repeatable
• Vendor-neutral
• Accessible
• Efficient
• Affordable
• K. Huppler, The art of building a good benchmark, 2009
• J. Gray, The Benchmark Handbook for Database and Transaction Systems, 1993
• S. E. Sim, S. Easterbrook et al., Using benchmarking to advance research: A challenge to software engineering, 2003
Vincenzo Ferme
43
BenchFlow Benchmarking Methodology
WfMS Configurations
requirements from the WfMS
CORE / NON-CORE
WfMS
Initialisation APIs
Deploy Process
Start Process
Instance
Event APIs
Issue Event
WfMS
Pending Event Tasks
+ WfMS
Claim Task Complete Task
User APIs
Create User Create Group
Pending User Tasks
Invoke WS
Web Service APIs
Vincenzo Ferme
44
BenchFlow Benchmarking Methodology
WfMS Configurations
requirements from the WfMS
Table 1: Summary of Core and Non-core APIs to be implemented by the WfMS
Functionality → Min Response Data
Core APIs — Initialisation APIs:
• Deploy a process → Deployed process ID
• Start a process instance → Process instance ID
Non-core APIs — User APIs:
• Create a user → User ID
• Create a group of users → User group ID
• Access pending tasks → Pending tasks IDs
• Claim a task*
• Complete a task
Non-core APIs — Event APIs:
• Access pending events → Pending events IDs
• Issue events
Non-core APIs — Web service APIs:
• Map tasks to Web service endpoints
*Optional depending on the WfMS implementation
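The two core operations in the table above — deploy a process, then start instances of it — form the minimal contract the benchmark drives against every WfMS. A sketch of that contract in Python; all class and method names here are hypothetical illustrations, not the actual BenchFlow interface:

```python
from abc import ABC, abstractmethod


class WfMSAdapter(ABC):
    """Hypothetical adapter exposing the core APIs a benchmarked WfMS must provide."""

    @abstractmethod
    def deploy_process(self, model_path: str) -> str:
        """Deploy a BPMN 2.0 model; return the deployed process ID."""

    @abstractmethod
    def start_process_instance(self, process_id: str) -> str:
        """Start an instance of a deployed process; return the instance ID."""


class InMemoryAdapter(WfMSAdapter):
    """Toy implementation used only to illustrate the contract."""

    def __init__(self):
        self._deployed = {}
        self._instances = 0

    def deploy_process(self, model_path: str) -> str:
        pid = f"proc-{len(self._deployed) + 1}"
        self._deployed[pid] = model_path
        return pid

    def start_process_instance(self, process_id: str) -> str:
        if process_id not in self._deployed:
            raise KeyError(process_id)
        self._instances += 1
        return f"inst-{self._instances}"
```

The non-core APIs (user tasks, events, Web service bindings) would extend the same adapter, which is why they are optional per WfMS.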
Vincenzo Ferme
45
BenchFlow Framework
system under test
Docker Swarm
Docker Engine
Docker Compose
Docker Machine
provides
manages
Containers
Servers
SUT’s Deployment Conf.
deploys
Vincenzo Ferme
Users
D
A
B
C
Start
Application Server
Web
Service
Load Driver
DBMS
WfMS
Instance
Database
46
Server-side Data and Metrics Collection
WfMS
Start Workflow
End
asynchronous execution of workflows
Vincenzo Ferme
47
Server-side Data and Metrics Collection
monitors
DBMS
Faban Drivers
Containers
Servers
harness
WfMS
Test Execution
Web
Service
MONITOR
Vincenzo Ferme
48
Server-side Data and Metrics Collection
monitors
DBMS
Faban Drivers
Containers
Servers
harness
WfMS
Test Execution
Web
Service
CPU Monitor
DB Monitor
Examples of Monitors:
• CPU usage
• Database state
Monitors’ Characteristics:
• RESTful services
• Lightweight (written in Go)
• Minimally invasive on the SUT
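The actual monitors are Go services; purely to illustrate what a lightweight RESTful probe looks like, here is a stdlib-only Python sketch. The endpoint path and payload field are made up, and the load-average call stands in for a real CPU-usage probe:

```python
import json
import os
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer


class MonitorHandler(BaseHTTPRequestHandler):
    """Toy RESTful monitor: GET /data returns one CPU-load sample as JSON."""

    def do_GET(self):
        if self.path == "/data":
            # os.getloadavg() (Unix-only) stands in for the real CPU probe.
            payload = json.dumps({"load_avg_1m": os.getloadavg()[0]}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(payload)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        # Stay quiet: a monitor should perturb the SUT as little as possible.
        pass


def serve_monitor(port: int = 0) -> HTTPServer:
    """Bind on an ephemeral port and serve requests from a daemon thread."""
    server = HTTPServer(("127.0.0.1", port), MonitorHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

Keeping the probe behind a tiny HTTP API is what lets the harness poll any server in the SUT uniformly, regardless of which WfMS runs on it.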
Vincenzo Ferme
49
Server-side Data and Metrics Collection
collect data
DBMS
Faban Drivers
Containers
Servers
harness
WfMS
Test Execution
Web
Service
Minio
Instance
Database
Analyses
COLLECTORS
Vincenzo Ferme
50
Server-side Data and Metrics Collection
collect data
DBMS
Faban Drivers
Containers
Servers
harness
WfMS
Test Execution
Web
Service
Stats Collector
DB Collector
Examples of Collectors:
• Container’s stats (e.g., CPU usage)
• Database dumps
• Application logs
Collectors’ Characteristics:
• RESTful services
• Lightweight (written in Go)
• Two types: online and offline
• Buffer data locally
Vincenzo Ferme
51
Test Execution
A high-throughput
distributed messaging
system
Minio
Instance
Database
Analyses
DBMS
Faban Drivers
Containers
Servers
harness
WfMS
Web
Service
Stats Collector
DB Collector
COLLECT
Publish
Subscribe
Read
Synchronisation with the Analyses
schedule analyses
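The collect → publish → subscribe → read loop above can be mimicked with an in-process topic queue. In production the slide's "high-throughput distributed messaging system" is a Kafka-style broker; `MiniBus`, the topic name, and the message fields below are all illustrative:

```python
import queue


class MiniBus:
    """Toy stand-in for the broker that links collectors and analyses."""

    def __init__(self):
        self._topics = {}

    def subscribe(self, topic: str) -> "queue.Queue":
        """Return the queue a consumer reads for the given topic."""
        return self._topics.setdefault(topic, queue.Queue())

    def publish(self, topic: str, message: dict):
        """Deliver a message to the topic's subscribers."""
        self.subscribe(topic).put(message)


# A collector announces that a database dump has been stored.
bus = MiniBus()
analyses_inbox = bus.subscribe("collected")
bus.publish("collected", {"collector": "db", "object": "trial-1/dump.sql"})

event = analyses_inbox.get(timeout=1)
# On this event, the analyses would fetch "trial-1/dump.sql"
# from the object store (Minio) and schedule the metric computation.
```

Decoupling collectors from analyses through the broker is what lets analyses be scheduled asynchronously, after the test run has finished perturbing the SUT.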
Vincenzo Ferme
52
Performance Metrics and KPIs
amount of data
REALISTIC
DATA
Number of
Repetitions
1
10
5
3
Number of
WfMSs
1
Vincenzo Ferme
53
Micro-Benchmarking with Workflow Patterns
mix of patterns: throughput (bp/sec)
MIX
WfMS A
WfMS B
WfMS C
1061.27
1402.33
1.78
Vincenzo Ferme
54
Computed Metrics and Statistics
feature level metrics
WfMS Users
Load Driver
Instance
Database
Application Server
Web
Service
D
A
B
C
DBMS
Throughput
…
Instance Duration
Per Construct (type, name)
Vincenzo Ferme
55
Computed Metrics and Statistics
interactions metrics
Response Time
Latency
…
WfMS Users
Load Driver
Instance
Database
Application Server
DBMS
Web
Service
D
A
B
C
Vincenzo Ferme
56
Computed Metrics and Statistics
descriptive and homogeneity statistics
• Descriptive Statistics (e.g., Mean, Confidence Interval, Standard Deviation, Percentiles, Quartiles, …)
• Coefficient of Variation across trials
• Levene Test on trials
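A sketch of the cross-trial homogeneity check with the coefficient of variation; the trial numbers are illustrative, and the Levene test itself would come from `scipy.stats.levene` applied to the raw per-trial samples:

```python
import statistics


def coefficient_of_variation(samples):
    """Sample standard deviation divided by the mean (assumes a non-zero mean)."""
    return statistics.stdev(samples) / statistics.mean(samples)


# Throughput (bp/sec) measured in three repeated trials of the same test.
trial_throughputs = [1050.0, 1061.27, 1072.5]
cv = coefficient_of_variation(trial_throughputs)
# A small CV across trials indicates the repetitions are homogeneous,
# so their aggregated metrics (mean, percentiles, ...) can be trusted.
```

This is why the benchmark runs multiple trials per WfMS: a single run gives a number, while the spread across trials tells you whether that number is stable.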

Platformless Horizons for Digital AdaptabilityWSO2
 
Repurposing LNG terminals for Hydrogen Ammonia: Feasibility and Cost Saving
Repurposing LNG terminals for Hydrogen Ammonia: Feasibility and Cost SavingRepurposing LNG terminals for Hydrogen Ammonia: Feasibility and Cost Saving
Repurposing LNG terminals for Hydrogen Ammonia: Feasibility and Cost SavingEdi Saputra
 
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...Connector Corner: Accelerate revenue generation using UiPath API-centric busi...
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...DianaGray10
 
Exploring Multimodal Embeddings with Milvus
Exploring Multimodal Embeddings with MilvusExploring Multimodal Embeddings with Milvus
Exploring Multimodal Embeddings with MilvusZilliz
 
How to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerHow to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerThousandEyes
 
Navigating the Deluge_ Dubai Floods and the Resilience of Dubai International...
Navigating the Deluge_ Dubai Floods and the Resilience of Dubai International...Navigating the Deluge_ Dubai Floods and the Resilience of Dubai International...
Navigating the Deluge_ Dubai Floods and the Resilience of Dubai International...Orbitshub
 
Apidays New York 2024 - Accelerating FinTech Innovation by Vasa Krishnan, Fin...
Apidays New York 2024 - Accelerating FinTech Innovation by Vasa Krishnan, Fin...Apidays New York 2024 - Accelerating FinTech Innovation by Vasa Krishnan, Fin...
Apidays New York 2024 - Accelerating FinTech Innovation by Vasa Krishnan, Fin...apidays
 
JohnPollard-hybrid-app-RailsConf2024.pptx
JohnPollard-hybrid-app-RailsConf2024.pptxJohnPollard-hybrid-app-RailsConf2024.pptx
JohnPollard-hybrid-app-RailsConf2024.pptxJohnPollard37
 
Cloud Frontiers: A Deep Dive into Serverless Spatial Data and FME
Cloud Frontiers:  A Deep Dive into Serverless Spatial Data and FMECloud Frontiers:  A Deep Dive into Serverless Spatial Data and FME
Cloud Frontiers: A Deep Dive into Serverless Spatial Data and FMESafe Software
 
DBX First Quarter 2024 Investor Presentation
DBX First Quarter 2024 Investor PresentationDBX First Quarter 2024 Investor Presentation
DBX First Quarter 2024 Investor PresentationDropbox
 

Recently uploaded (20)

TrustArc Webinar - Unlock the Power of AI-Driven Data Discovery
TrustArc Webinar - Unlock the Power of AI-Driven Data DiscoveryTrustArc Webinar - Unlock the Power of AI-Driven Data Discovery
TrustArc Webinar - Unlock the Power of AI-Driven Data Discovery
 
Six Myths about Ontologies: The Basics of Formal Ontology
Six Myths about Ontologies: The Basics of Formal OntologySix Myths about Ontologies: The Basics of Formal Ontology
Six Myths about Ontologies: The Basics of Formal Ontology
 
Rising Above_ Dubai Floods and the Fortitude of Dubai International Airport.pdf
Rising Above_ Dubai Floods and the Fortitude of Dubai International Airport.pdfRising Above_ Dubai Floods and the Fortitude of Dubai International Airport.pdf
Rising Above_ Dubai Floods and the Fortitude of Dubai International Airport.pdf
 
Understanding the FAA Part 107 License ..
Understanding the FAA Part 107 License ..Understanding the FAA Part 107 License ..
Understanding the FAA Part 107 License ..
 
ICT role in 21st century education and its challenges
ICT role in 21st century education and its challengesICT role in 21st century education and its challenges
ICT role in 21st century education and its challenges
 
Introduction to use of FHIR Documents in ABDM
Introduction to use of FHIR Documents in ABDMIntroduction to use of FHIR Documents in ABDM
Introduction to use of FHIR Documents in ABDM
 
presentation ICT roal in 21st century education
presentation ICT roal in 21st century educationpresentation ICT roal in 21st century education
presentation ICT roal in 21st century education
 
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
 
Artificial Intelligence Chap.5 : Uncertainty
Artificial Intelligence Chap.5 : UncertaintyArtificial Intelligence Chap.5 : Uncertainty
Artificial Intelligence Chap.5 : Uncertainty
 
Strategies for Landing an Oracle DBA Job as a Fresher
Strategies for Landing an Oracle DBA Job as a FresherStrategies for Landing an Oracle DBA Job as a Fresher
Strategies for Landing an Oracle DBA Job as a Fresher
 
Platformless Horizons for Digital Adaptability
Platformless Horizons for Digital AdaptabilityPlatformless Horizons for Digital Adaptability
Platformless Horizons for Digital Adaptability
 
Repurposing LNG terminals for Hydrogen Ammonia: Feasibility and Cost Saving
Repurposing LNG terminals for Hydrogen Ammonia: Feasibility and Cost SavingRepurposing LNG terminals for Hydrogen Ammonia: Feasibility and Cost Saving
Repurposing LNG terminals for Hydrogen Ammonia: Feasibility and Cost Saving
 
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...Connector Corner: Accelerate revenue generation using UiPath API-centric busi...
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...
 
Exploring Multimodal Embeddings with Milvus
Exploring Multimodal Embeddings with MilvusExploring Multimodal Embeddings with Milvus
Exploring Multimodal Embeddings with Milvus
 
How to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerHow to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected Worker
 
Navigating the Deluge_ Dubai Floods and the Resilience of Dubai International...
Navigating the Deluge_ Dubai Floods and the Resilience of Dubai International...Navigating the Deluge_ Dubai Floods and the Resilience of Dubai International...
Navigating the Deluge_ Dubai Floods and the Resilience of Dubai International...
 
Apidays New York 2024 - Accelerating FinTech Innovation by Vasa Krishnan, Fin...
Apidays New York 2024 - Accelerating FinTech Innovation by Vasa Krishnan, Fin...Apidays New York 2024 - Accelerating FinTech Innovation by Vasa Krishnan, Fin...
Apidays New York 2024 - Accelerating FinTech Innovation by Vasa Krishnan, Fin...
 
JohnPollard-hybrid-app-RailsConf2024.pptx
JohnPollard-hybrid-app-RailsConf2024.pptxJohnPollard-hybrid-app-RailsConf2024.pptx
JohnPollard-hybrid-app-RailsConf2024.pptx
 
Cloud Frontiers: A Deep Dive into Serverless Spatial Data and FME
Cloud Frontiers:  A Deep Dive into Serverless Spatial Data and FMECloud Frontiers:  A Deep Dive into Serverless Spatial Data and FME
Cloud Frontiers: A Deep Dive into Serverless Spatial Data and FME
 
DBX First Quarter 2024 Investor Presentation
DBX First Quarter 2024 Investor PresentationDBX First Quarter 2024 Investor Presentation
DBX First Quarter 2024 Investor Presentation
 

Workflow Engine Performance Benchmarking with BenchFlow

  • 2. Vincenzo Ferme 2 The BenchFlow Project “Design and implement the first benchmark to assess and compare the performance of WfMSs that are compliant with the Business Process Model and Notation (BPMN) 2.0 standard.”
  • 3. Vincenzo Ferme 3 What is a Workflow Management System? WfMS Users Applications Application Server Instance Database DBMS Web Service D A B C
  • 4. Vincenzo Ferme 4 Many Vendors of BPMN 2.0 WfMSs https://en.wikipedia.org/wiki/List_of_BPMN_2.0_engines Timeline: Aug 2009 BPMN 2.0 beta; Jan 2011 BPMN 2.0; Jan 2014 BPMN 2.0.2 (ISO/IEC 19510). [Chart: number of BPMN 2.0 WfMSs by year of the first version supporting BPMN 2.0, 2009 to 2016; grand total: 23]
  • 5. Vincenzo Ferme 5 end-users, vendors, developers Why do We Need a Benchmark?
  • 6. Vincenzo Ferme 5 end-users, vendors, developers Why do We Need a Benchmark? 1. How to choose the best WfMS according to the company’s technical requirements? 2. How to choose the best WfMS according to the company’s business process models (workflows)? B C D
  • 7. Vincenzo Ferme 5 end-users, vendors, developers Why do We Need a Benchmark? 1. How to choose the best WfMS according to the company’s technical requirements? 2. How to choose the best WfMS according to the company’s business process models (workflows)? B C D 3. How to evaluate performance improvements during WfMS’s development? 4. How to identify WfMS’s bottlenecks?
  • 8. Vincenzo Ferme 6 The BenchFlow Benchmarking Process Workload Model WfMS Configurations Workload Metrics KPIs Measure Input/Process/Output Model
  • 9. Vincenzo Ferme 6 The BenchFlow Benchmarking Process Workload Model WfMS Configurations Workload Metrics KPIs Measure Input/Process/Output Model Workload Mix 80% C A B 20% D
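The 80%/20% workload mix above can be sketched as the weighted sampler a load driver would use to pick the next process model to instantiate. The model names "C" and "D" and the sample count are illustrative, taken only from the percentages on the slide:

```python
import random

# Hypothetical workload mix: 80% of started instances use process model C,
# 20% use process model D (percentages from the slide; names illustrative).
WORKLOAD_MIX = {"C": 0.8, "D": 0.2}

def pick_process(rng=random):
    """Pick the next process model to instantiate, weighted by the mix."""
    models = list(WORKLOAD_MIX)
    weights = [WORKLOAD_MIX[m] for m in models]
    return rng.choices(models, weights=weights, k=1)[0]

# Over many picks, the observed mix converges to the configured one.
counts = {"C": 0, "D": 0}
for _ in range(10_000):
    counts[pick_process()] += 1
```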
  • 10. Vincenzo Ferme 6 The BenchFlow Benchmarking Process Workload Model WfMS Configurations Test Types Workload Metrics KPIs Measure Input/Process/Output Model Workload Mix 80% C A B 20% D Load Functions Test Data
  • 11. Vincenzo Ferme 6 The BenchFlow Benchmarking Process Workload Model WfMS Configurations Test Types Workload Metrics KPIs Measure Input/Process/Output Model Workload Mix 80% C A B 20% D Load Functions Test Data Containers
  • 12. Vincenzo Ferme 6 The BenchFlow Benchmarking Process Workload Model WfMS Configurations Test Types Workload Metrics KPIs Measure Input/Process/Output Model Workload Mix 80% C A B 20% D Load Functions Test Data Performance Data Process Instance Duration Throughput Containers
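A minimal sketch of how the two output measures named on this slide (process instance duration and throughput) can be derived from per-instance start/end timestamps; the four timestamps below are made up for illustration:

```python
from statistics import mean

# Per-instance (start, end) timestamps in seconds, collected server-side.
instances = [
    (0.0, 1.2), (0.5, 1.9), (1.0, 2.1), (1.5, 3.0),
]

# Process instance duration: end minus start, per instance.
durations = [end - start for start, end in instances]
mean_duration = mean(durations)

# Throughput: completed instances per second over the observation window.
observation_window = max(end for _, end in instances) - min(start for start, _ in instances)
throughput = len(instances) / observation_window
```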
  • 13. Vincenzo Ferme 7 Methodology Framework + Container Based Methodology and Framework. Figure 4: Benchmarking Methodology Choreography (Vendor and BenchFlow: BenchFlow provides the Benchmarking Methodology; the parties agree on adding the Vendor’s WfMS to the Benchmark; the Vendor provides a Containerized Distribution of the WfMS; BenchFlow provides Draft Benchmark Results; the Vendor verifies the Benchmark Results; if valid, BenchFlow publishes the Benchmark Results to the Community). Depending on the WfMS’s architecture, the DBMS Container can refer to an existing publicly available Container distribution. The containerized WfMS should be publicly available (e.g., at the Docker Hub registry), or the BenchFlow team should be granted access to a private registry used by the vendor. The same applies to the Containers’ definition file, i.e., the Dockerfile (Turnbull, 2014, ch. 4). While private registries are a solution that can work with vendors of closed-source systems, they impact the reproducibility of the results. For each WfMS version to be included in the benchmark, there must be a default configuration, i.e., the configuration in which the WfMS Containers start without modifying any configuration parameters, except the ones required to correctly start the WfMS, e.g., the database connection. However, if vendors want to benchmark the performance of their WfMS with different configurations, for example the “getting started” configuration provided to users, or production-grade configurations for real-life usage, they can also provide different configurations. To do that, the Containers must allow issuing the WfMS configurations through additional environment variables, and/or by accessing the configuration files. Exposing the volumes allows access, on the host operating system, to the files defined inside the Containers. Precisely, the WES Container has to, at least, enable the configuration of: 1) the used DB, i.e., DB driver, URL, username and password for the connection; 2) the WES itself; and 3) the logging level of the WES and of the application stack layers on which it may depend. Alternatively, instead of providing configuration details, the vendors can provide different Containers for different configurations; however, even in that case enabling the DB configuration is required. In order to access relevant data, all WfMS Containers have to specify the volumes to access the WfMS log files and all the data useful to set up the system. For instance, if the WfMS defines examples as part of the deployment, we may want to remove those examples by overriding the volume containing them. Moreover, the WES Container (or the DBMS Container) has to create the WfMS’s DB and populate it with the data required to run the WfMS. The vendors need to provide authentication configuration details of the WfMS components (e.g., the user). BenchFlow Framework architecture: Faban Drivers, WfMS (Instance Database, DBMS), Web Service, Monitor, Adapters, Collectors, Data Transformers, Analysers, Performance Metrics, Performance KPIs; Faban harness + Minio.
  • 14. Vincenzo Ferme 7 Methodology Framework + Container Based Methodology and Framework: Provides Great Advantages for Automation and Reproducibility of Results, while Ensuring Negligible Performance Impact
  • 15. Vincenzo Ferme 8 [CLOSER ’16] Container-centric Methodology for Benchmarking Workflow Management Systems.
  • 16. Vincenzo Ferme 9 BenchFlow Benchmarking Methodology requirements from the WfMS Availability of APIs • Deploy Process • Start Process Instance • Users API • Web Service APIs • Events APIs
  • 17. Vincenzo Ferme 9 BenchFlow Benchmarking Methodology requirements from the WfMS Availability of APIs • Deploy Process • Start Process Instance • Users API • Web Service APIs • Events APIs • Workflow & Construct: • Start Time • End Time • [Duration] Availability of Timing Data
  • 18. Vincenzo Ferme 9 BenchFlow Benchmarking Methodology requirements from the WfMS Availability of APIs • Deploy Process • Start Process Instance • Users API • Web Service APIs • Events APIs • Workflow & Construct: • Start Time • End Time • [Duration] Availability of Timing Data Availability of Execution State State of the workflow execution. E.g., running, completed, error
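The third requirement above, availability of execution state, is what lets a load driver decide when a started workflow instance has finished. A minimal sketch of such a polling loop, assuming a generic `get_state(instance_id)` callable; the state names follow the slide, while the callable, instance id, and intervals are hypothetical, not a real WfMS API:

```python
import time

TERMINAL_STATES = {"completed", "error"}

def wait_for_completion(get_state, instance_id, poll_interval=0.5, timeout=60.0):
    """Poll get_state(instance_id) until a terminal state or timeout."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        state = get_state(instance_id)
        if state in TERMINAL_STATES:
            return state
        time.sleep(poll_interval)
    raise TimeoutError(f"instance {instance_id} still running after {timeout}s")

# Usage with a fake state source that completes on the third poll:
_polls = iter(["running", "running", "completed"])
result = wait_for_completion(lambda _id: next(_polls), "inst-42", poll_interval=0.0)
```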
  • 19. Vincenzo Ferme 10 BenchFlow Framework architecture [BPM ’15] [ICPE ’16]: Faban Drivers (Faban harness + Web Service) drive the WfMS (Instance Database, DBMS); Monitor, Adapters, and Collectors gather data from the Servers and Containers; Data Transformers and Analysers produce the Performance Metrics and Performance KPIs (stored in Minio); phases: Test, Execution, Analyses
  • 20. Vincenzo Ferme 11 BenchFlow Framework WfMSs’ specific characteristics •Manages the Automatic WfMS deployment; •Provides a “plugin” mechanism to add new WfMSs; •Performs automatic Process Models deployment; •Collects Client and Server-side data and metrics; •Automatically computes performance metrics and statistics.
  • 21. Vincenzo Ferme 12 Applications of the Methodology and Framework [CAiSE ’16] Micro-Benchmarking BPMN 2.0 Workflow Management Systems with Workflow Patterns. WfMS MySQL 3 WfMSsWorkload Empty Script 1 Empty Script 2 … 0 750 1500 0:00 4:00 10:00 Metrics • Engine Level • Process Level • Environment
  • 22. Vincenzo Ferme Micro-Benchmarking with Workflow Patterns 13 research questions Selected three WfMSs: popular open-source WfMSs actually used in industry 1. What is the impact of individual or a mix of workflow patterns on the performance of each one of the benchmarked BPMN 2.0 WfMSs? 2. Are there performance bottlenecks for the selected WfMSs?
  • 23. Vincenzo Ferme 14 Micro-Benchmarking with Workflow Patterns workload mix: Workload Mix: Sequence Pattern [SEQ] (Empty Script 1 → Empty Script 2). Pattern Reference: N. Russell et al., Workflow Control-Flow Patterns: A Revised View, 2006
  • 24. Vincenzo Ferme 14 Micro-Benchmarking with Workflow Patterns workload mix: Workload Mix: Sequence Pattern [SEQ]; Exclusive Choice and Simple Merge [EXC] (generate number → case_1: Empty Script 1 / case_2: Empty Script 2). Pattern Reference: N. Russell et al., Workflow Control-Flow Patterns: A Revised View, 2006
  • 25. Vincenzo Ferme 15 Micro-Benchmarking with Workflow Patterns workload mix: Workload Mix: Arbitrary Cycle [CYC] (generate 1 or 2 → case_1: Empty Script 1 / case_2: Empty Script 2; i++; loop while i < 10, exit when i >= 10). Pattern Reference: N. Russell et al., Workflow Control-Flow Patterns: A Revised View, 2006
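The [CYC] model above can be expressed directly in code, which also makes explicit why every instance executes exactly ten script tasks before the loop condition i >= 10 terminates it; the function name is illustrative:

```python
import random

def run_cyc_instance(rng=random):
    """Simulate one Arbitrary Cycle [CYC] instance from the slide."""
    executed = []
    i = 0
    while i < 10:                  # i < 10: loop back; i >= 10: exit
        case = rng.choice([1, 2])  # "generate 1 or 2" picks the branch
        executed.append(f"Empty Script {case}")
        i += 1                     # i++
    return i, executed

final_i, trace = run_cyc_instance()
```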
  • 26. Vincenzo Ferme 16 Micro-Benchmarking with Workflow Patterns workload mix: Workload Mix: Parallel Split and Synchronisation [PAR] (Empty Script 1 in parallel with Empty Script 2). Pattern Reference: N. Russell et al., Workflow Control-Flow Patterns: A Revised View, 2006
  • 27. Vincenzo Ferme 16 Micro-Benchmarking with Workflow Patterns workload mix: Workload Mix: Parallel Split and Synchronisation [PAR] (Empty Script 1 in parallel with Empty Script 2); Explicit Termination Pattern [EXT] (Empty Script 1; Wait 5 Sec). Pattern Reference: N. Russell et al., Workflow Control-Flow Patterns: A Revised View, 2006
  • 28. Vincenzo Ferme 17 Micro-Benchmarking with Workflow Patterns workload mix design decisions 1. Maximise the simplicity of the model expressing the workflow pattern 2. Omit the interactions with external systems by implementing all tasks as script tasks
  • 29. Vincenzo Ferme 18 Load Functions Micro-Benchmarking with Workflow Patterns load function: Test Type: Load test. [Chart: number of concurrent Instance Producers (0 to 1500) over time (0:00 to 10:00); example for u=1500]
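The load function on this slide ramps the concurrent Instance Producers up to u=1500 and then holds them for the rest of the 10-minute load test. A sketch, assuming a linear 60-second ramp (the exact ramp shape and duration are not stated on the slide):

```python
U_MAX = 1500        # target concurrent Instance Producers (from the slide)
RAMP_SECONDS = 60   # assumed ramp-up duration
TEST_SECONDS = 600  # 10-minute load test (from the slide)

def active_producers(t_seconds):
    """Concurrent Instance Producers at time t: linear ramp, then plateau."""
    if t_seconds < 0 or t_seconds > TEST_SECONDS:
        return 0
    if t_seconds < RAMP_SECONDS:
        return int(U_MAX * t_seconds / RAMP_SECONDS)
    return U_MAX
```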
  • 30. Vincenzo Ferme 19 WfMS Configurations Micro-Benchmarking with Workflow Patterns configurations: WfMS environments WfMS A WfMS B WfMS C
  • 31. Vincenzo Ferme 19 WfMS Configurations Micro-Benchmarking with Workflow Patterns configurations: WfMS environments WfMS A WfMS B WfMS C MySQL MySQL MySQL MySQL: Community Server 5.6.26
  • 32. Vincenzo Ferme 19 WfMS Configurations Micro-Benchmarking with Workflow Patterns configurations: WfMS environments WfMS A WfMS B WfMS C MySQL MySQL MySQL MySQL: Community Server 5.6.26 O.S.: Ubuntu 14.04.01 Ubuntu 14.04.01 Ubuntu 14.04.01 J.V.M.: Oracle Serv. 7u79 Oracle Serv. 7u79 Oracle Serv. 7u79
  • 33. Vincenzo Ferme 19 WfMS Configurations Micro-Benchmarking with Workflow Patterns configurations: WfMS environments WfMS A WfMS B WfMS C MySQL MySQL MySQL MySQL: Community Server 5.6.26 O.S.: Ubuntu 14.04.01 Ubuntu 14.04.01 Ubuntu 14.04.01 J.V.M.: Oracle Serv. 7u79 Oracle Serv. 7u79 Oracle Serv. 7u79 App. Server: Ap.Tomcat 7.0.62 Ap.Tomcat 7.0.62 Wildfly 8.1.0. Final
  • 34. Vincenzo Ferme 19 WfMS Configurations Micro-Benchmarking with Workflow Patterns configurations: WfMS environments WfMS A WfMS B WfMS C MySQL MySQL MySQL MySQL: Community Server 5.6.26 O.S.: Ubuntu 14.04.01 Ubuntu 14.04.01 Ubuntu 14.04.01 J.V.M.: Oracle Serv. 7u79 Oracle Serv. 7u79 Oracle Serv. 7u79 App. Server: Ap.Tomcat 7.0.62 Ap.Tomcat 7.0.62 Wildfly 8.1.0. Final Max Java Heap: 32GB
  • 35. Vincenzo Ferme 19 WfMS Configurations Micro-Benchmarking with Workflow Patterns configurations: WfMS environments WfMS A WfMS B WfMS C MySQL MySQL MySQL MySQL: Community Server 5.6.26 O.S.: Ubuntu 14.04.01 Ubuntu 14.04.01 Ubuntu 14.04.01 J.V.M.: Oracle Serv. 7u79 Oracle Serv. 7u79 Oracle Serv. 7u79 App. Server: Ap.Tomcat 7.0.62 Ap.Tomcat 7.0.62 Wildfly 8.1.0. Final Max Java Heap: 32GB Max DB Connections Number: 100
  • 36. Vincenzo Ferme 19 WfMS Configurations Micro-Benchmarking with Workflow Patterns configurations: WfMS environments WfMS A WfMS B WfMS C MySQL MySQL MySQL MySQL: Community Server 5.6.26 WfMS’s Containers: official ones, if available on DockerHub O.S.: Ubuntu 14.04.01 Ubuntu 14.04.01 Ubuntu 14.04.01 J.V.M.: Oracle Serv. 7u79 Oracle Serv. 7u79 Oracle Serv. 7u79 App. Server: Ap.Tomcat 7.0.62 Ap.Tomcat 7.0.62 Wildfly 8.1.0. Final Max Java Heap: 32GB Max DB Connections Number: 100
  • 37. Vincenzo Ferme 19 WfMS Configurations Micro-Benchmarking with Workflow Patterns configurations: WfMS environments WfMS A WfMS B WfMS C MySQL MySQL MySQL MySQL: Community Server 5.6.26 WfMS’s Containers: official ones, if available on DockerHub WfMS’s Configuration: as suggested by vendor’s documentation O.S.: Ubuntu 14.04.01 Ubuntu 14.04.01 Ubuntu 14.04.01 J.V.M.: Oracle Serv. 7u79 Oracle Serv. 7u79 Oracle Serv. 7u79 App. Server: Ap.Tomcat 7.0.62 Ap.Tomcat 7.0.62 Wildfly 8.1.0. Final Max Java Heap: 32GB Max DB Connections Number: 100
  • 38. Vincenzo Ferme 20 WfMS Configurations Micro-Benchmarking with Workflow Patterns configurations: WfMS deployment: TEST ENVIRONMENT: Load Drivers (CPU 64 Cores @ 1400 MHz, RAM 128 GB); WfMS (CPU 12 Cores @ 800 MHz, RAM 64 GB); DBMS (CPU 64 Cores @ 2300 MHz, RAM 128 GB)
  • 39. Vincenzo Ferme 20 WfMS Configurations Micro-Benchmarking with Workflow Patterns configurations: WfMS deployment: TEST ENVIRONMENT: Load Drivers (CPU 64 Cores @ 1400 MHz, RAM 128 GB); WfMS (CPU 12 Cores @ 800 MHz, RAM 64 GB); DBMS (CPU 64 Cores @ 2300 MHz, RAM 128 GB); connected via 10 Gbit/s links
  • 40. Vincenzo Ferme 20 WfMS Configurations Micro-Benchmarking with Workflow Patterns configurations: WfMS deployment: TEST ENVIRONMENT: Load Drivers (CPU 64 Cores @ 1400 MHz, RAM 128 GB); WfMS (CPU 12 Cores @ 800 MHz, RAM 64 GB); DBMS (CPU 64 Cores @ 2300 MHz, RAM 128 GB); connected via 10 Gbit/s links; O.S.: Ubuntu 14.04.3 LTS on all three servers
  • 41. Vincenzo Ferme 20 WfMS Configurations Micro-Benchmarking with Workflow Patterns configurations: WfMS deployment: TEST ENVIRONMENT: Load Drivers (CPU 64 Cores @ 1400 MHz, RAM 128 GB); WfMS (CPU 12 Cores @ 800 MHz, RAM 64 GB); DBMS (CPU 64 Cores @ 2300 MHz, RAM 128 GB); connected via 10 Gbit/s links; O.S.: Ubuntu 14.04.3 LTS on all three servers; 1.8.2 on all three servers
  • 42. Vincenzo Ferme 21 Micro-Benchmarking with Workflow Patterns performed experiments Individual patterns 1. 1500 (max) concurrent Instance Producers starting each pattern [SEQ, EXC, CYC, EXT, PAR] Mix of patterns 2. 1500 concurrent Instance Producers starting an equal number of pattern instances (20% mix of [SEQ, EXC, CYC, EXT, PAR])
  • 43. Vincenzo Ferme 21 Micro-Benchmarking with Workflow Patterns performed experiments Three runs for each execution of the experiments Individual patterns 1. 1500 (max) concurrent Instance Producers starting each pattern [SEQ, EXC, CYC, EXT, PAR] Mix of patterns 2. 1500 concurrent Instance Producers starting an equal number of pattern instances (20% mix of [SEQ, EXC, CYC, EXT, PAR])
  • 44. Vincenzo Ferme 21 Micro-Benchmarking with Workflow Patterns performed experiments Three runs for each execution of the experiments The data collected in the first minute of the execution has not been included in the analysis Individual patterns 1. 1500 (max) concurrent Instance Producers starting each pattern [SEQ, EXC, CYC, EXT, PAR] Mix of patterns 2. 1500 concurrent Instance Producers starting an equal number of pattern instances (20% mix of [SEQ, EXC, CYC, EXT, PAR])
  • 46. Vincenzo Ferme 22 Number of Process Instances Computed Metrics and Statistics engine level metrics Throughput Execution Time Database Size WfMS Users Load Driver Instance Database Application Server DBMS Web Service D A B C …
  • 48. Vincenzo Ferme 23 individual patterns: throughput (bp/sec) Micro-Benchmarking with Workflow Patterns

               SEQ       EXC       EXT       PAR       CYC
    WfMS A   1456.79   1417.17   1455.68   1433.12    327.99
    WfMS B   1447.66   1436.03   1426.35   1429.65    644.02
    WfMS C     63.31     49.04      0.38     48.89     13.46

    (annotations on the slide: u=600, u=800, u=1500)
  • 49. Vincenzo Ferme 24 Micro-Benchmarking with Workflow Patterns individual patterns: WfMS C Scalability for SEQ pattern. WfMS C uses a synchronous instantiation API, so load drivers must wait for workflows to end before starting new ones
  • 50. Vincenzo Ferme 24 Micro-Benchmarking with Workflow Patterns individual patterns: WfMS C Scalability for SEQ pattern. WfMS C uses a synchronous instantiation API, so load drivers must wait for workflows to end before starting new ones. [Charts: Response Time and Throughput vs. u Instance Producers, 500 to 2000]
  • 51. Vincenzo Ferme 24 Micro-Benchmarking with Workflow Patterns individual patterns: WfMS C Scalability for SEQ pattern. WfMS C uses a synchronous instantiation API, so load drivers must wait for workflows to end before starting new ones. [Charts: Response Time and Throughput vs. u Instance Producers, 500 to 2000] We have found a bottleneck! (~60 bp/sec)
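The ~60 bp/sec ceiling is what Little's law predicts for a closed-loop (synchronous) workload: with u concurrent Instance Producers each waiting out a mean response time R before starting the next workflow, throughput cannot exceed u / R. The 25-second mean response time below is illustrative, chosen only to show how the cap arises:

```python
def closed_loop_throughput(producers, mean_response_time_s):
    """Little's law upper bound for a closed-loop workload: X = u / R."""
    return producers / mean_response_time_s

# u = 1500 Instance Producers, assumed mean response time of 25 s:
cap = closed_loop_throughput(1500, 25.0)  # 60.0 workflow instances / sec
```

Adding producers beyond this point only raises the response time, not the throughput, which is exactly the plateau visible on the slide's charts.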
  • 52. Vincenzo Ferme 25 Computed Metrics and Statistics process level metrics Throughput WfMS Users Load Driver Instance Database Application Server Web Service D A B C DBMS … Instance Duration PerProcess Definition
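Per-process-definition metrics like the instance duration above reduce to grouping completed instances by their process definition. A minimal sketch (the tuple layout is hypothetical, not BenchFlow's schema):

```python
from collections import defaultdict
from statistics import mean

def duration_by_definition(instances):
    """instances: iterable of (definition_id, duration_seconds) pairs.
    Returns {definition_id: mean duration}, i.e. the per-process-
    definition 'Instance Duration' metric sketched above."""
    groups = defaultdict(list)
    for def_id, duration in instances:
        groups[def_id].append(duration)
    return {def_id: mean(ds) for def_id, ds in groups.items()}
```

The same grouping with a count instead of a mean gives the per-definition throughput numerator.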
  • 54. Vincenzo Ferme 26 Micro-Benchmarking with Workflow Patterns individual patterns: mean (instance duration time)
  • 55. Vincenzo Ferme 26 Micro-Benchmarking with Workflow Patterns individual patterns: mean (instance duration time) Empty Script 1 Empty Script 2 [SEQ]
  • 56. Vincenzo Ferme 26 Empty Script 1 Wait 5 Sec [EXT] Micro-Benchmarking with Workflow Patterns individual patterns: mean (instance duration time)
  • 57. Vincenzo Ferme 26 Micro-Benchmarking with Workflow Patterns individual patterns: mean (instance duration time) Empty Script 1 Empty Script 2 [PAR]
  • 58. Vincenzo Ferme 26 Micro-Benchmarking with Workflow Patterns individual patterns: mean (instance duration time) [CYC] CYC Instance Producers: u=600; loop over Empty Script 1 and Empty Script 2 (i++; case_1 repeats while i < 10, case_2 exits when i >= 10)
  • 59. Vincenzo Ferme 27 Computed Metrics and Statistics environment metrics WfMS Users Load Driver Instance Database Application Server Web Service D A B C DBMS CPU/RAM Usage per Second IO Operations/Network Usage per Second …
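Per-second environment metrics like these are typically gathered by a sampler that polls the container runtime at a fixed interval. A generic sketch with a pluggable probe (the probe function is a stand-in; an actual monitor could read container CPU/RAM from the Docker stats API instead):

```python
import time

def sample_metric(probe, interval=1.0, samples=5):
    """Poll `probe()` every `interval` seconds and return a list of
    (elapsed_seconds, value) pairs, e.g. for CPU or RAM usage of a
    container under test."""
    t0 = time.monotonic()
    out = []
    for _ in range(samples):
        out.append((round(time.monotonic() - t0, 1), probe()))
        time.sleep(interval)
    return out
```

Keeping the probe pluggable matches the slide's point that the same pipeline serves CPU usage, RAM usage, IO operations, and network usage alike.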
  • 61. Vincenzo Ferme 28 Micro-Benchmarking with Workflow Patterns individual patterns: mean (RAM) and mean (CPU) usage RAM CPU
  • 62. Vincenzo Ferme 29 Micro-Benchmarking with Workflow Patterns performed experiments Individual patterns 1. 1500 (max) concurrent Instance Producers starting each pattern [SEQ, EXC, CYC, EXT, PAR] Mix of patterns 2. 1500 concurrent Instance Producers starting an equal number of pattern instances (20% mix of [SEQ, EXC, CYC, EXT, PAR]) Three runs for each experiment. Data collected during the first minute of execution (warm-up) is excluded from the analysis.
  • 63. Vincenzo Ferme 30 Micro-Benchmarking with Workflow Patterns mix of patterns: duration, RAM, CPU
  • 64. Vincenzo Ferme 31 Micro-Benchmarking with Workflow Patterns summary Even though the Executed Workflows are Simple
  • 65. Vincenzo Ferme 31 Micro-Benchmarking with Workflow Patterns summary There are relevant differences in workflows’ duration and throughput among the WfMSs Even though the Executed Workflows are Simple
  • 66. Vincenzo Ferme 31 Micro-Benchmarking with Workflow Patterns summary There are relevant differences in workflows’ duration and throughput among the WfMSs There are relevant differences in resource usage among WfMSs CPURAM Even though the Executed Workflows are Simple
  • 67. Vincenzo Ferme 31 Micro-Benchmarking with Workflow Patterns summary There are relevant differences in workflows’ duration and throughput among the WfMSs One WfMS has bottlenecks There are relevant differences in resource usage among WfMSs CPURAM Even though the Executed Workflows are Simple
  • 69. Highlights Vincenzo Ferme 6 The BenchFlow Benchmarking Process Workload Model WfMS Configurations Test Types Workload Metrics KPIs Measure Input/Process/Output Model Workload Mix 80% C A B 20% D Load Functions Test Data Performance Data Process Instance Duration Throughput Containers Benchmarking Process
  • 70. Highlights Vincenzo Ferme 6 The BenchFlow Benchmarking Process Workload Model WfMS Configurations Test Types Workload Metrics KPIs Measure Input/Process/Output Model Workload Mix 80% C A B 20% D Load Functions Test Data Performance Data Process Instance Duration Throughput Containers Benchmarking Process Vincenzo Ferme 8 [CLOSER ’16] Container-centric Methodology for Benchmarking Workflow Management Systems. Vendor BenchFlow Provide Benchmarking Methodology Vendor BenchFlow Agree on adding Vendor's WfMS to the Benchmark Benchmarking Methodology Agreement Proposal Signed Agreement Vendor BenchFlow Provide Containerized Distribution of WfMS Containerized WfMS Request Containerized WfMS Vendor BenchFlow Verify the Benchmark Results Results Verification Outcome Results Verification Request Vendor BenchFlow Provide Draft Benchmark Results Draft Benchmark ResultsVerified Benchmark Results Not Valid Are the Results Valid? Community BenchFlow Publish Benchmark Results Valid Benchmarking Methodology
  • 71. Highlights Vincenzo Ferme 6 The BenchFlow Benchmarking Process Workload Model WfMS Configurations Test Types Workload Metrics KPIs Measure Input/Process/Output Model Workload Mix 80% C A B 20% D Load Functions Test Data Performance Data Process Instance Duration Throughput Containers Benchmarking Process Vincenzo Ferme 8 [CLOSER ’16] Container-centric Methodology for Benchmarking Workflow Management Systems. Vendor BenchFlow Provide Benchmarking Methodology Vendor BenchFlow Agree on adding Vendor's WfMS to the Benchmark Benchmarking Methodology Agreement Proposal Signed Agreement Vendor BenchFlow Provide Containerized Distribution of WfMS Containerized WfMS Request Containerized WfMS Vendor BenchFlow Verify the Benchmark Results Results Verification Outcome Results Verification Request Vendor BenchFlow Provide Draft Benchmark Results Draft Benchmark ResultsVerified Benchmark Results Not Valid Are the Results Valid? Community BenchFlow Publish Benchmark Results Valid Benchmarking Methodology Vincenzo Ferme 10 BenchFlow Framework architecture Instance Database DBMSFaban Drivers ContainersServers DATA
 TRANSFORMERS ANALYSERS Performance Metrics Performance KPIs WfMS TestExecutionAnalyses Faban + Web Service Minio harness benchflow [BPM ’15] [ICPE ’16] MONITOR Adapters COLLECTORS BenchFlow Framework
  • 72. Highlights Vincenzo Ferme 6 The BenchFlow Benchmarking Process Workload Model WfMS Configurations Test Types Workload Metrics KPIs Measure Input/Process/Output Model Workload Mix 80% C A B 20% D Load Functions Test Data Performance Data Process Instance Duration Throughput Containers Benchmarking Process Vincenzo Ferme 8 [CLOSER ’16] Container-centric Methodology for Benchmarking Workflow Management Systems. Vendor BenchFlow Provide Benchmarking Methodology Vendor BenchFlow Agree on adding Vendor's WfMS to the Benchmark Benchmarking Methodology Agreement Proposal Signed Agreement Vendor BenchFlow Provide Containerized Distribution of WfMS Containerized WfMS Request Containerized WfMS Vendor BenchFlow Verify the Benchmark Results Results Verification Outcome Results Verification Request Vendor BenchFlow Provide Draft Benchmark Results Draft Benchmark ResultsVerified Benchmark Results Not Valid Are the Results Valid? Community BenchFlow Publish Benchmark Results Valid Benchmarking Methodology Vincenzo Ferme 10 BenchFlow Framework architecture Instance Database DBMSFaban Drivers ContainersServers DATA
 TRANSFORMERS ANALYSERS Performance Metrics Performance KPIs WfMS TestExecutionAnalyses Faban + Web Service Minio harness benchflow [BPM ’15] [ICPE ’16] MONITOR Adapters COLLECTORS BenchFlow Framework Vincenzo Ferme 26 Empty Script 1 Wait 5 Sec [EXT] Micro-Benchmarking with Workflow Patterns individual patterns: mean (instance duration time) Empty Script 1 Empty Script 2 [SEQ] Empty Script 1 Empty Script 2 [PAR] CYC Instance Producers: u=600 [CYC] generate 1 or 2 Empty Script 1 Empty Script 2 i++ i < 10 i >= 10 case_1 case_2 Micro-Benchmark with WP
  • 73. Vincenzo Ferme 33 •We want to characterise the Workload Mix using Real-World process models • Share your executable BPMN 2.0 process models, even anonymised Process Models •We want to characterise the Load Functions using Real-World behaviours • Share your execution logs, even anonymised Execution Logs •We want to add more and more WfMSs to the benchmark • Contact us for collaboration, and BenchFlow framework support WfMSs Call for Collaboration WfMSs, process models, process logs
  • 76. Vincenzo Ferme 36 Published Work [BTW ’15] C. Pautasso, V. Ferme, D. Roller, F. Leymann, and M. Skouradaki. Towards workflow benchmarking: Open research challenges. In Proc. of the 16th conference on Database Systems for Business, Technology, and Web, BTW 2015, pages 331–350, 2015. [SSP ’14] M. Skouradaki, D. H. Roller, F. Leymann, V. Ferme, and C. Pautasso. Technical open challenges on benchmarking workflow management systems. In Proc. of the 2014 Symposium on Software Performance, SSP 2014, pages 105–112, 2014. [ICPE ’15] M. Skouradaki, D. H. Roller, L. Frank, V. Ferme, and C. Pautasso. On the Road to Benchmarking BPMN 2.0 Workflow Engines. In Proc. of the 6th ACM/SPEC International Conference on Performance Engineering, ICPE ’15, pages 301–304, 2015.
  • 77. Vincenzo Ferme 37 Published Work [CLOSER ’15] M. Skouradaki, V. Ferme, F. Leymann, C. Pautasso, and D. H. Roller. “BPELanon”: Protect business processes on the cloud. In Proc. of the 5th International Conference on Cloud Computing and Service Science, CLOSER 2015. SciTePress, 2015. [SOSE ’15] M. Skouradaki, K. Goerlach, M. Hahn, and F. Leymann. Application of Sub-Graph Isomorphism to Extract Reoccurring Structures from BPMN 2.0 Process Models. In Proc. of the 9th International IEEE Symposium on Service-Oriented System Engineering, SOSE 2015, 2015. [BPM ’15] V. Ferme, A. Ivanchikj, C. Pautasso. A Framework for Benchmarking BPMN 2.0 Workflow Management Systems. In Proc. of the 13th International Conference on Business Process Management, BPM ’15, pages 251–259, 2015.
  • 78. Vincenzo Ferme 38 Published Work [BPMD ’15] A. Ivanchikj, V. Ferme, C. Pautasso. BPMeter: Web Service and Application for Static Analysis of BPMN 2.0 Collections. In Proc. of the 13th International Conference on Business Process Management [Demo], BPM ’15, pages 30–34, 2015. [ICPE ’16] V. Ferme and C. Pautasso. Integrating Faban with Docker for Performance Benchmarking. In Proc. of the 7th ACM/SPEC International Conference on Performance Engineering, ICPE ’16, 2016. [CLOSER ’16] V. Ferme, A. Ivanchikj, C. Pautasso, M. Skouradaki, F. Leymann. A Container-centric Methodology for Benchmarking Workflow Management Systems. In Proc. of the 6th International Conference on Cloud Computing and Service Science, CLOSER 2016. SciTePress, 2016.
  • 79. Vincenzo Ferme [CAiSE ’16] M. Skouradaki, V. Ferme, C. Pautasso, F. Leymann, A. van Hoorn. Micro-Benchmarking BPMN 2.0 Workflow Management Systems with Workflow Patterns. In Proc. of the 28th International Conference on Advanced Information Systems Engineering, CAiSE ’16, 2016. 39 Published Work [ICWE ’16] J. Cito, V. Ferme, H.C. Gall. Using Docker Containers to Improve Reproducibility in Software and Web Engineering Research. In Proc. of the 16th International Conference on Web Engineering, 2016. [ICWS ’16] M. Skouradaki, V. Andrikopoulos, F. Leymann. Representative BPMN 2.0 Process Model Generation from Recurring Structures. In Proc. of the 23rd IEEE International Conference on Web Services, ICWS '16, 2016.
  • 80. Vincenzo Ferme 40 Published Work [SummerSOC ’16] M. Skouradaki, T. Azad, U. Breitenbücher, O. Kopp, F. Leymann. A Decision Support System for the Performance Benchmarking of Workflow Management Systems. In Proc. of the 10th Symposium and Summer School On Service-Oriented Computing, SummerSOC '16, 2016. [OTM ’16] M. Skouradaki, V. Andrikopoulos, O. Kopp, F. Leymann. RoSE: Reoccurring Structures Detection in BPMN 2.0 Process Models Collections. In Proc. of On the Move to Meaningful Internet Systems Conference, OTM '16, 2016. (to appear) [BPM Forum ’16] V. Ferme, A. Ivanchikj, C. Pautasso. Estimating the Cost for Executing Business Processes in the Cloud. In Proc. of the 14th International Conference on Business Process Management, BPM Forum ’16, 2016. (to appear)
  • 81. Vincenzo Ferme 41 Docker Performance [IBM ’14] W. Felter, A. Ferreira, R. Rajamony, and J. Rubio. An updated performance comparison of virtual machines and Linux containers. IBM Research Report, 2014. “Although containers themselves have almost no overhead, Docker is not without performance gotchas. Docker volumes have noticeably better performance than files stored in AUFS. Docker’s NAT also introduces overhead for workloads with high packet rates. These features represent a tradeoff between ease of management and performance and should be considered on a case-by-case basis.” “Our results show that containers result in equal or better performance than VMs in almost all cases.” BenchFlow Configures Docker for Performance by Default
  • 82. Vincenzo Ferme 42 Benchmarking Requirements • Relevant • Representative • Portable • Scalable • Simple • Repeatable • Vendor-neutral • Accessible • Efficient • Affordable • K. Huppler, The art of building a good benchmark, 2009 • J. Gray, The Benchmark Handbook for Database and Transaction Systems, 1993 • S. E. Sim, S. Easterbrook et al., Using benchmarking to advance research: A challenge to software engineering, 2003
  • 83. Vincenzo Ferme 43 BenchFlow Benchmarking MethodologyWfMS Configurations requirements from the WfMS NON-CORECORE Context » Benchmarking Requirements » Methodology Overview » Methodology Details » Advantage of Containers » 1st Application » Future Work
  • 84. Vincenzo Ferme 43 BenchFlow Benchmarking MethodologyWfMS Configurations requirements from the WfMS NON-CORECORE WfMS Initialisation APIs Deploy Process Start Process Instance Context » Benchmarking Requirements » Methodology Overview » Methodology Details » Advantage of Containers » 1st Application » Future Work
  • 85. Vincenzo Ferme 43 BenchFlow Benchmarking MethodologyWfMS Configurations requirements from the WfMS NON-CORECORE WfMS Initialisation APIs Deploy Process Start Process Instance + WfMS Claim Task Complete Task User APIs Create User Create Group Pending User Tasks Context » Benchmarking Requirements » Methodology Overview » Methodology Details » Advantage of Containers » 1st Application » Future Work
  • 86. Vincenzo Ferme 43 BenchFlow Benchmarking MethodologyWfMS Configurations requirements from the WfMS NON-CORECORE WfMS Initialisation APIs Deploy Process Start Process Instance + WfMS Claim Task Complete Task User APIs Create User Create Group Pending User Tasks Context » Benchmarking Requirements » Methodology Overview » Methodology Details » Advantage of Containers » 1st Application » Future Work Invoke WS Web Service APIs
  • 87. Vincenzo Ferme 43 BenchFlow Benchmarking MethodologyWfMS Configurations requirements from the WfMS NON-CORECORE WfMS Initialisation APIs Deploy Process Start Process Instance Event APIs Issue Event WfMS Pending Event Tasks + WfMS Claim Task Complete Task User APIs Create User Create Group Pending User Tasks Context » Benchmarking Requirements » Methodology Overview » Methodology Details » Advantage of Containers » 1st Application » Future Work Invoke WS Web Service APIs
  • 88. Vincenzo Ferme 44 BenchFlow Benchmarking Methodology WfMS Configurations requirements from the WfMS Table 1: Summary of Core and Non-core APIs to be implemented by the WfMS
      Core APIs, Initialisation APIs: Deploy a process (returns deployed process ID); Start a process instance (returns process instance ID)
      Non-core APIs, User APIs: Create a user (User ID); Create a group of users (User group ID); Access pending tasks (Pending tasks IDs); Claim a task*; Complete a task
      Non-core APIs, Event APIs: Access pending events (Pending events IDs); Issue events
      Non-core APIs, Web service APIs: Map tasks to Web service endpoints
      *Optional depending on the WfMS implementation
    Context » Benchmarking Requirements » Methodology Overview » Methodology Details » Advantage of Containers » 1st Application » Future Work
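Table 1 effectively defines a small adapter interface that each WfMS must implement to join the benchmark. A sketch of what such an adapter could look like (method names are illustrative, not BenchFlow's actual interface):

```python
from abc import ABC, abstractmethod

class WfMSAdapter(ABC):
    """Core APIs every benchmarked WfMS must expose (cf. Table 1)."""

    @abstractmethod
    def deploy_process(self, model_path: str) -> str:
        """Deploy a process; returns the deployed process ID."""

    @abstractmethod
    def start_process_instance(self, process_id: str) -> str:
        """Start a process instance; returns the process instance ID."""

class UserAPIs(ABC):
    """Non-core user APIs, needed only for workloads with human tasks."""

    @abstractmethod
    def create_user(self, name: str) -> str: ...

    @abstractmethod
    def pending_user_tasks(self, user_id: str) -> list: ...

    @abstractmethod
    def complete_task(self, task_id: str) -> None: ...
```

A WfMS that implements only the core adapter can still run the instantiation-driven workloads; the non-core APIs unlock user-task and event workloads.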
  • 90. Vincenzo Ferme 45 BenchFlow Framework system under test Docker Engine Containers
  • 91. Vincenzo Ferme 45 BenchFlow Framework system under test Docker Engine Docker Machine provides Containers
  • 92. Vincenzo Ferme 45 BenchFlow Framework system under test Docker Swarm Docker Engine Docker Machine provides Containers
  • 93. Vincenzo Ferme 45 BenchFlow Framework system under test Docker Swarm Docker Engine Docker Machine provides manages Containers Servers
  • 94. Vincenzo Ferme 45 BenchFlow Framework system under test Docker Swarm Docker Engine Docker Compose Docker Machine provides manages Containers Servers
  • 95. Vincenzo Ferme 45 BenchFlow Framework system under test Docker Swarm Docker Engine Docker Compose Docker Machine provides manages Containers Servers SUT’s Deployment Conf. deploys
  • 96. Vincenzo Ferme 46 Server-side Data and Metrics Collection WfMS Start Workflow asynchronous execution of workflows
  • 97. Vincenzo Ferme Users D A B C Start Application Server Web Service Load Driver DBMS WfMS Instance Database 46 Server-side Data and Metrics Collection WfMS Start Workflow asynchronous execution of workflows
  • 98. Vincenzo Ferme Users D A B C Start Application Server Web Service Load Driver DBMS WfMS Instance Database 46 Server-side Data and Metrics Collection WfMS Start Workflow End asynchronous execution of workflows
  • 99. Vincenzo Ferme 47 Server-side Data and Metrics Collection monitors DBMSFaban Drivers ContainersServers harness WfMS TestExecution Web Service
  • 100. Vincenzo Ferme 47 Server-side Data and Metrics Collection monitors DBMSFaban Drivers ContainersServers harness WfMS TestExecution Web Service MONITOR
  • 101. Vincenzo Ferme 48 Server-side Data and Metrics Collection monitors DBMS Faban Drivers Containers Servers harness WfMS TestExecution Web Service Monitors’ Characteristics: • RESTful services • Lightweight (written in Go) • As unintrusive to the SUT as possible Examples of Monitors: • CPU usage • Database state
  • 102. Vincenzo Ferme 48 Server-side Data and Metrics Collection monitors DBMS Faban Drivers Containers Servers harness WfMS TestExecution Web Service CPU Monitor DB Monitor Monitors’ Characteristics: • RESTful services • Lightweight (written in Go) • As unintrusive to the SUT as possible Examples of Monitors: • CPU usage • Database state
  • 103. Vincenzo Ferme 49 Server-side Data and Metrics Collection collect data DBMSFaban Drivers ContainersServers harness WfMS TestExecution Web Service
  • 104. Vincenzo Ferme 49 Server-side Data and Metrics Collection collect data DBMSFaban Drivers ContainersServers harness WfMS TestExecution Web Service Minio Instance Database Analyses COLLECTORS
  • 105. Vincenzo Ferme 50 Server-side Data and Metrics Collection collect data DBMS Faban Drivers Containers Servers harness WfMS TestExecution Web Service Collectors’ Characteristics: • RESTful services • Lightweight (written in Go) • Two types: online and offline • Buffer data locally Examples of Collectors: • Container stats (e.g., CPU usage) • Database dump • Application logs
  • 106. Vincenzo Ferme 50 Server-side Data and Metrics Collection collect data DBMS Faban Drivers Containers Servers harness WfMS TestExecution Web Service Stats Collector DB Collector Collectors’ Characteristics: • RESTful services • Lightweight (written in Go) • Two types: online and offline • Buffer data locally Examples of Collectors: • Container stats (e.g., CPU usage) • Database dump • Application logs
  • 107. Vincenzo Ferme 51 TestExecution A high-throughput distributed messaging system Minio Instance Database Analyses DBMSFaban Drivers ContainersServers harness WfMS Web Service Stats Collector DB Collector Synchronisation with the Analyses schedule analyses
  • 108. Vincenzo Ferme 51 TestExecution A high-throughput distributed messaging system Minio Instance Database Analyses DBMSFaban Drivers ContainersServers harness WfMS Web Service Stats Collector DB Collector COLLECT Synchronisation with the Analyses schedule analyses
  • 109. Vincenzo Ferme 51 TestExecution A high-throughput distributed messaging system Minio Instance Database Analyses DBMSFaban Drivers ContainersServers harness WfMS Web Service Stats Collector DB Collector COLLECT Publish Synchronisation with the Analyses schedule analyses
  • 110. Vincenzo Ferme 51 TestExecution A high-throughput distributed messaging system Minio Instance Database Analyses DBMSFaban Drivers ContainersServers harness WfMS Web Service Stats Collector DB Collector COLLECT Publish Subscribe Synchronisation with the Analyses schedule analyses
  • 111. Vincenzo Ferme 51 TestExecution A high-throughput distributed messaging system Minio Instance Database Analyses DBMSFaban Drivers ContainersServers harness WfMS Web Service Stats Collector DB Collector COLLECT Publish Subscribe Read Synchronisation with the Analyses schedule analyses
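The collect, publish, subscribe, read handshake shown above can be sketched in-process with a plain queue standing in for the high-throughput messaging system (a toy stand-in; the real deployment would use an actual broker client instead):

```python
import queue

broker = queue.Queue()  # stand-in for a messaging topic

def collector_publish(data_location: str):
    """Collector stores its data (e.g. on Minio) and publishes the
    location so analyses can be scheduled asynchronously."""
    broker.put({"type": "collected", "location": data_location})

def analyser_consume() -> str:
    """Analyser subscribes to the topic, reads the data location,
    and kicks off the corresponding analysis."""
    msg = broker.get()
    return f"analysing {msg['location']}"

# A collector announces that a run's stats have been stored
# (the URL is hypothetical).
collector_publish("minio://run-42/stats.json")
```

Decoupling collectors and analysers through a broker is what lets analyses be scheduled without slowing down the test execution itself.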
  • 112. Vincenzo Ferme 52 Performance Metrics and KPIs amount of data [Chart: realistic data vs. number of repetitions (1, 3, 5, 10)]
  • 113. Vincenzo Ferme 52 Performance Metrics and KPIs amount of data [Chart: realistic data vs. number of repetitions (1, 3, 5, 10); number of WfMSs: 1]
  • 114. Vincenzo Ferme 53 Micro-Benchmarking with Workflow Patterns mix of patterns: throughput (bp/sec) MIX: WfMS A 1061.27, WfMS B 1402.33, WfMS C 1.78
  • 115. Vincenzo Ferme 54 Computed Metrics and Statistics feature level metrics WfMS Users Load Driver Instance Database Application Server Web Service D A B C DBMS Throughput … Instance Duration Per Construct (type, name)
  • 117. Vincenzo Ferme 55 Computed Metrics and Statistics interactions metrics Response Time Latency … WfMS Users Load Driver Instance Database Application Server DBMS Web Service D A B C
  • 119. Vincenzo Ferme 56 Computed Metrics and Statistics descriptive and homogeneity statistics • Descriptive Statistics (e.g., Mean, Confidence Interval, 
Standard Deviation, Percentiles, Quartiles, …) • Coefficient of Variation across trials • Levene Test on trials
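The homogeneity check across trials can be sketched with the standard library; the Levene test itself would require an external package (e.g. scipy.stats.levene), so only the descriptive side and the coefficient of variation are shown here (the sample values are hypothetical, not measured):

```python
from statistics import mean, stdev

def coefficient_of_variation(trial):
    """CV = stdev / mean, computed per trial; comparing CVs across
    trials indicates whether the repetitions are homogeneous enough
    to aggregate."""
    return stdev(trial) / mean(trial)

trials = [  # hypothetical per-trial throughput samples (bp/sec)
    [1456.8, 1440.2, 1449.5],
    [1447.7, 1452.1, 1439.9],
]
cvs = [coefficient_of_variation(t) for t in trials]
```

A small, stable CV across trials supports reporting the mean with a confidence interval; a formal variance-homogeneity check would then apply the Levene test to the trials.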