Workflow Engine Performance Benchmarking with BenchFlow

Unlike databases, for which established benchmarks have been driving the advancement of the field for a long time, workflow engines still lack a well-accepted benchmark that allows a fair comparison of their performance. In this talk we discuss how BenchFlow addresses the main challenges related to benchmarking these complex middleware systems at the core of business process automation and service composition solutions. In particular, we look at how to define suitable workloads and representative performance metrics, and how to fully automate the execution of the performance experiments.

Concerning the automation of experiment execution and analysis, we designed and implemented the BenchFlow framework. The framework automates the deployment of WfMSs packaged within Docker containers. This way, the initial configuration and conditions for each experiment can be precisely controlled and reproduced. Moreover, the BenchFlow framework supports heterogeneous WfMS APIs by providing an extensible plugin mechanism. During the benchmark execution it automatically deploys a set of BPMN models and invokes them according to parametric load functions specified declaratively. It automatically collects performance and resource consumption data, on both the driver and the servers during the experiment, as well as extracting them from the WfMS database afterwards. In addition to latency, throughput, and resource utilisation, we compute multiple performance metrics that characterise the WfMS performance at the engine level, at the process level, and at the BPMN construct level. To ensure reliability and improve the usefulness of the obtained results, we automatically compute descriptive statistics and perform statistical tests to assess the homogeneity of results obtained from different repetitions of the same experiment.

The talk also presents experimental results obtained while benchmarking popular open-source engines, using workflow patterns as a micro-benchmark.

Transcript

  1. Vincenzo Ferme Faculty of Informatics USI Lugano, Switzerland WORKFLOW ENGINE PERFORMANCE BENCHMARKING WITH BENCHFLOW
  2. Vincenzo Ferme 2 The BenchFlow Project “Design and implement the first benchmark to assess and compare the performance of WfMSs that are compliant with the Business Process Model and Notation 2.0 standard.”
  3. Vincenzo Ferme 3 What is a Workflow Management System? WfMS Users Applications Application Server Instance Database DBMS Web Service D A B C
  4. Vincenzo Ferme 4 Many Vendors of BPMN 2.0 WfMSs https://en.wikipedia.org/wiki/List_of_BPMN_2.0_engines [Chart: Number of BPMN 2.0 WfMSs by year of the first version supporting BPMN 2.0 (2009-2016); grand total: 23. Timeline: Aug 2009 BPMN 2.0 beta, Jan 2011 BPMN 2.0, Jan 2014 BPMN 2.0.2 / ISO/IEC 19510]
  5. Vincenzo Ferme 5 end-users, vendors, developers Why do We Need a Benchmark?
  6. Vincenzo Ferme 5 end-users, vendors, developers Why do We Need a Benchmark? 1. How to choose the best WfMS according to the company’s technical requirements? 2. How to choose the best WfMS according to the company’s business process models (workflows)? B C D
  7. Vincenzo Ferme 5 end-users, vendors, developers Why do We Need a Benchmark? 1. How to choose the best WfMS according to the company’s technical requirements? 2. How to choose the best WfMS according to the company’s business process models (workflows)? B C D 3. How to evaluate performance improvements during WfMS’s development? 4. How to identify WfMS’s bottlenecks?
  8. Vincenzo Ferme 6 The BenchFlow Benchmarking Process Workload Model WfMS Configurations Workload Metrics KPIs Measure Input/Process/Output Model
  9. Vincenzo Ferme 6 The BenchFlow Benchmarking Process Workload Model WfMS Configurations Workload Metrics KPIs Measure Input/Process/Output Model Workload Mix 80% C A B 20% D
  10. Vincenzo Ferme 6 The BenchFlow Benchmarking Process Workload Model WfMS Configurations Test Types Workload Metrics KPIs Measure Input/Process/Output Model Workload Mix 80% C A B 20% D Load Functions Test Data
  11. Vincenzo Ferme 6 The BenchFlow Benchmarking Process Workload Model WfMS Configurations Test Types Workload Metrics KPIs Measure Input/Process/Output Model Workload Mix 80% C A B 20% D Load Functions Test Data Containers
  12. Vincenzo Ferme 6 The BenchFlow Benchmarking Process Workload Model WfMS Configurations Test Types Workload Metrics KPIs Measure Input/Process/Output Model Workload Mix 80% C A B 20% D Load Functions Test Data Performance Data Process Instance Duration Throughput Containers
  13. Vincenzo Ferme Vendor BenchFlow [Figure: Benchmarking Methodology Choreography between Vendor and BenchFlow: Provide Benchmarking Methodology; Agree on adding Vendor's WfMS to the Benchmark; Provide Containerized Distribution of WfMS; Provide Draft Benchmark Results; Verify the Benchmark Results; Publish Benchmark Results. Shown over an excerpt of the CLOSER ’16 paper on containerization requirements] Methodology + Framework: Container Based Methodology and Framework [Figure: BenchFlow Framework architecture: Instance Database DBMS Faban Drivers Containers Servers DATA TRANSFORMERS ANALYSERS Performance Metrics Performance KPIs WfMS TestExecution Analyses Faban + Web Service Minio harness MONITOR Adapters COLLECTORS]
  14. Vincenzo Ferme [Figures as on slide 13] Provides Great Advantages for Automation and Reproducibility of Results, while Ensuring Negligible Performance Impact
  15. Vincenzo Ferme 8 [CLOSER ’16] Container-centric Methodology for Benchmarking Workflow Management Systems. [Figure: Benchmarking Methodology Choreography, shown over an excerpt of the paper's text on containerization requirements]
  16. Vincenzo Ferme 9 BenchFlow Benchmarking Methodology requirements from the WfMS Availability of APIs • Deploy Process • Start Process Instance • Users API • Web Service APIs • Events APIs
  17. Vincenzo Ferme 9 BenchFlow Benchmarking Methodology requirements from the WfMS Availability of APIs • Deploy Process • Start Process Instance • Users API • Web Service APIs • Events APIs • Workflow & Construct: • Start Time • End Time • [Duration] Availability of Timing Data
  18. Vincenzo Ferme 9 BenchFlow Benchmarking Methodology requirements from the WfMS Availability of APIs • Deploy Process • Start Process Instance • Users API • Web Service APIs • Events APIs • Workflow & Construct: • Start Time • End Time • [Duration] Availability of Timing Data Availability of Execution State State of the workflow execution. E.g., running, completed, error
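To make these API requirements concrete, the following is a minimal sketch of a WfMS adapter that satisfies them. It is an illustration only: the base URL, endpoint paths, and JSON field names are hypothetical placeholders, not the API of any particular engine or of the BenchFlow framework.

```python
# Hypothetical sketch of a WfMS adapter covering the core requirements:
# deploy a process, start an instance, read the execution state.
# Endpoint paths and field names are illustrative placeholders.
import requests

BASE_URL = "http://wfms.example:8080/api"  # hypothetical engine endpoint

def deploy_process(model_path: str) -> str:
    """Deploy a BPMN 2.0 model; return the deployed process ID."""
    with open(model_path, "rb") as model:
        resp = requests.post(f"{BASE_URL}/deployments", files={"model": model})
    resp.raise_for_status()
    return resp.json()["processId"]

def start_process_instance(process_id: str) -> str:
    """Start one instance of a deployed process; return its instance ID."""
    resp = requests.post(f"{BASE_URL}/process-instances",
                         json={"processId": process_id})
    resp.raise_for_status()
    return resp.json()["instanceId"]

def execution_state(instance_id: str) -> str:
    """Return the execution state, e.g. running, completed, error."""
    resp = requests.get(f"{BASE_URL}/process-instances/{instance_id}")
    resp.raise_for_status()
    return resp.json()["state"]
```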
  19. Vincenzo Ferme 10 BenchFlow Framework architecture Instance Database DBMS Faban Drivers Containers Servers DATA TRANSFORMERS ANALYSERS Performance Metrics Performance KPIs WfMS TestExecution Analyses Faban + Web Service Minio harness benchflow [BPM ’15] [ICPE ’16] MONITOR Adapters COLLECTORS
  20. Vincenzo Ferme 11 BenchFlow Framework WfMSs’ specific characteristics • Manages the Automatic WfMS deployment; • Provides a “plugin” mechanism to add new WfMSs; • Performs automatic Process Models deployment; • Collects Client and Server-side data and metrics; • Automatically computes performance metrics and statistics.
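The “plugin” mechanism mentioned on this slide suggests a small engine-neutral interface that each supported WfMS implements. Here is a minimal sketch of what such a surface could look like; the class and method names are assumptions for illustration, not BenchFlow's actual interface.

```python
# Hypothetical sketch of a plugin surface for adding new WfMSs.
# Names are assumptions for illustration only.
from abc import ABC, abstractmethod

class WfMSPlugin(ABC):
    """One subclass per supported WfMS, mapping the benchmark's
    engine-neutral operations onto the engine's own API."""

    @abstractmethod
    def deploy(self, model_path: str) -> str: ...

    @abstractmethod
    def start_instance(self, process_id: str) -> str: ...

PLUGINS: dict[str, type[WfMSPlugin]] = {}

def register(name: str):
    """Class decorator registering a WfMS plugin under a name."""
    def wrap(cls: type[WfMSPlugin]) -> type[WfMSPlugin]:
        PLUGINS[name] = cls
        return cls
    return wrap
```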
  21. Vincenzo Ferme 12 Applications of the Methodology and Framework [CAiSE ’16] Micro-Benchmarking BPMN 2.0 Workflow Management Systems with Workflow Patterns. WfMS MySQL 3 WfMSs Workload: Empty Script 1, Empty Script 2, … [Chart: load function, 0 to 1500 instance producers over 0:00 to 10:00] Metrics • Engine Level • Process Level • Environment
  22. Vincenzo Ferme Micro-Benchmarking with Workflow Patterns 13 research questions Selected three WfMSs: popular open-source WfMSs that are actually used in industry 1. What is the impact of individual workflow patterns, or of a mix of them, on the performance of each of the benchmarked BPMN 2.0 WfMSs? 2. Are there performance bottlenecks for the selected WfMSs?
  23. Vincenzo Ferme 14 Empty Script 1 Empty Script 2 Workload Mix Micro-Benchmarking with Workflow Patterns workload mix Sequence Pattern [SEQ] Pattern Reference: N. Russell et al., Workflow Control-Flow Patterns: A Revised View, 2006
  24. Vincenzo Ferme 14 Empty Script 1 Empty Script 2 Workload Mix Micro-Benchmarking with Workflow Patterns workload mix Sequence Pattern [SEQ] exclusiveChoicePattern: generate number, Empty Script 1, Empty Script 2, case_1, case_2. Exclusive Choice and Simple Merge [EXC] Pattern Reference: N. Russell et al., Workflow Control-Flow Patterns: A Revised View, 2006
  25. Vincenzo Ferme 15 arbitraryCyclesPattern: generate 1 or 2, Empty Script 1, Empty Script 2, i++, i < 10, i >= 10, case_1, case_2. Workload Mix Micro-Benchmarking with Workflow Patterns workload mix Arbitrary Cycle [CYC] Pattern Reference: N. Russell et al., Workflow Control-Flow Patterns: A Revised View, 2006
  26. Vincenzo Ferme 16 Workload Mix Micro-Benchmarking with Workflow Patterns workload mix Empty Script 1 Empty Script 2 Parallel Split and Synchronisation [PAR] Pattern Reference: N. Russell et al., Workflow Control-Flow Patterns: A Revised View, 2006
  27. Vincenzo Ferme 16 Workload Mix Micro-Benchmarking with Workflow Patterns workload mix explicitTerminationPattern: Empty Script 1, Wait 5 Sec. Explicit Termination Pattern [EXT] Empty Script 1 Empty Script 2 Parallel Split and Synchronisation [PAR] Pattern Reference: N. Russell et al., Workflow Control-Flow Patterns: A Revised View, 2006
  28. Vincenzo Ferme 17 Micro-Benchmarking with Workflow Patterns workload mix design decisions 1. Maximise the simplicity of the model expressing the workflow pattern 2. Omit the interactions with external systems by implementing all tasks as script tasks
  29. Vincenzo Ferme 18 Load Functions Micro-Benchmarking with Workflow Patterns load function Test Type: Load test [Chart: Instance Producers ramping from 0 to 1500 over time, 0:00 to 10:00. Example for u=1500]
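The chart on this slide ramps the instance producers up to u=1500 over the ten-minute load test. Below is a minimal sketch of such a parametric load function, assuming a simple linear ramp-up followed by a steady phase; the slide fixes only the peak, not the exact ramp shape.

```python
# Minimal sketch of a parametric load function: the number of active
# instance producers at time t, assuming a linear ramp-up to u concurrent
# producers followed by a steady phase. The ramp shape is an assumption
# for illustration; the talk only fixes the peak (e.g., u=1500).
def active_producers(t_sec: float, u: int = 1500,
                     ramp_up_sec: float = 240.0) -> int:
    if t_sec <= 0:
        return 0
    if t_sec < ramp_up_sec:
        return int(u * t_sec / ramp_up_sec)  # linear ramp-up
    return u  # steady state until the end of the test

# e.g. active_producers(120) -> 750, active_producers(600) -> 1500
```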
  30. Vincenzo Ferme 19 WfMS Configurations Micro-Benchmarking with Workflow Patterns configurations: WfMS environments WfMS A WfMS B WfMS C
  31. Vincenzo Ferme 19 WfMS Configurations Micro-Benchmarking with Workflow Patterns configurations: WfMS environments WfMS A WfMS B WfMS C MySQL MySQL MySQL MySQL: Community Server 5.6.26
  32. Vincenzo Ferme 19 WfMS Configurations Micro-Benchmarking with Workflow Patterns configurations: WfMS environments WfMS A WfMS B WfMS C MySQL MySQL MySQL MySQL: Community Server 5.6.26 O.S.: Ubuntu 14.04.01 Ubuntu 14.04.01 Ubuntu 14.04.01 J.V.M.: Oracle Serv. 7u79 Oracle Serv. 7u79 Oracle Serv. 7u79
  33. Vincenzo Ferme 19 WfMS Configurations Micro-Benchmarking with Workflow Patterns configurations: WfMS environments WfMS A WfMS B WfMS C MySQL MySQL MySQL MySQL: Community Server 5.6.26 O.S.: Ubuntu 14.04.01 Ubuntu 14.04.01 Ubuntu 14.04.01 J.V.M.: Oracle Serv. 7u79 Oracle Serv. 7u79 Oracle Serv. 7u79 App. Server: Ap. Tomcat 7.0.62 Ap. Tomcat 7.0.62 WildFly 8.1.0.Final
  34. Vincenzo Ferme 19 WfMS Configurations Micro-Benchmarking with Workflow Patterns configurations: WfMS environments WfMS A WfMS B WfMS C MySQL MySQL MySQL MySQL: Community Server 5.6.26 O.S.: Ubuntu 14.04.01 Ubuntu 14.04.01 Ubuntu 14.04.01 J.V.M.: Oracle Serv. 7u79 Oracle Serv. 7u79 Oracle Serv. 7u79 App. Server: Ap. Tomcat 7.0.62 Ap. Tomcat 7.0.62 WildFly 8.1.0.Final Max Java Heap: 32GB
  35. Vincenzo Ferme 19 WfMS Configurations Micro-Benchmarking with Workflow Patterns configurations: WfMS environments WfMS A WfMS B WfMS C MySQL MySQL MySQL MySQL: Community Server 5.6.26 O.S.: Ubuntu 14.04.01 Ubuntu 14.04.01 Ubuntu 14.04.01 J.V.M.: Oracle Serv. 7u79 Oracle Serv. 7u79 Oracle Serv. 7u79 App. Server: Ap. Tomcat 7.0.62 Ap. Tomcat 7.0.62 WildFly 8.1.0.Final Max Java Heap: 32GB Max DB Connections Number: 100
  36. Vincenzo Ferme 19 WfMS Configurations Micro-Benchmarking with Workflow Patterns configurations: WfMS environments WfMS A WfMS B WfMS C MySQL MySQL MySQL MySQL: Community Server 5.6.26 WfMS’s Containers: official ones, if available on DockerHub O.S.: Ubuntu 14.04.01 Ubuntu 14.04.01 Ubuntu 14.04.01 J.V.M.: Oracle Serv. 7u79 Oracle Serv. 7u79 Oracle Serv. 7u79 App. Server: Ap. Tomcat 7.0.62 Ap. Tomcat 7.0.62 WildFly 8.1.0.Final Max Java Heap: 32GB Max DB Connections Number: 100
  37. Vincenzo Ferme 19 WfMS Configurations Micro-Benchmarking with Workflow Patterns configurations: WfMS environments WfMS A WfMS B WfMS C MySQL MySQL MySQL MySQL: Community Server 5.6.26 WfMS’s Containers: official ones, if available on DockerHub WfMS’s Configuration: as suggested by vendor’s documentation O.S.: Ubuntu 14.04.01 Ubuntu 14.04.01 Ubuntu 14.04.01 J.V.M.: Oracle Serv. 7u79 Oracle Serv. 7u79 Oracle Serv. 7u79 App. Server: Ap. Tomcat 7.0.62 Ap. Tomcat 7.0.62 WildFly 8.1.0.Final Max Java Heap: 32GB Max DB Connections Number: 100
  38. Vincenzo Ferme 20 WfMS Configurations Micro-Benchmarking with Workflow Patterns configurations: WfMS deployment TEST ENVIRONMENT CPU 64 Cores @ 1400 MHz RAM 128 GB Load Drivers CPU 12 Cores @ 800 MHz RAM 64 GB WfMS CPU 64 Cores @ 2300 MHz RAM 128 GB DBMS
  39. Vincenzo Ferme 20 WfMS Configurations Micro-Benchmarking with Workflow Patterns configurations: WfMS deployment TEST ENVIRONMENT CPU 64 Cores @ 1400 MHz RAM 128 GB Load Drivers CPU 12 Cores @ 800 MHz RAM 64 GB WfMS CPU 64 Cores @ 2300 MHz RAM 128 GB DBMS 10 Gbit/s 10 Gbit/s
  40. Vincenzo Ferme 20 WfMS Configurations Micro-Benchmarking with Workflow Patterns configurations: WfMS deployment TEST ENVIRONMENT CPU 64 Cores @ 1400 MHz RAM 128 GB Load Drivers CPU 12 Cores @ 800 MHz RAM 64 GB WfMS CPU 64 Cores @ 2300 MHz RAM 128 GB DBMS 10 Gbit/s 10 Gbit/s O.S.: Ubuntu 14.04.3 LTS Ubuntu 14.04.3 LTS Ubuntu 14.04.3 LTS
  41. Vincenzo Ferme 20 WfMS Configurations Micro-Benchmarking with Workflow Patterns configurations: WfMS deployment TEST ENVIRONMENT CPU 64 Cores @ 1400 MHz RAM 128 GB Load Drivers CPU 12 Cores @ 800 MHz RAM 64 GB WfMS CPU 64 Cores @ 2300 MHz RAM 128 GB DBMS 10 Gbit/s 10 Gbit/s O.S.: Ubuntu 14.04.3 LTS Ubuntu 14.04.3 LTS Ubuntu 14.04.3 LTS 1.8.2 1.8.2 1.8.2
  42. Vincenzo Ferme 21 Micro-Benchmarking with Workflow Patterns performed experiments Individual patterns 1. 1500 (max) concurrent Instance Producers starting each pattern [SEQ, EXC, CYC, EXT, PAR] Mix of patterns 2. 1500 concurrent Instance Producers starting an equal number of pattern instances (20% mix of [SEQ, EXC, CYC, EXT, PAR])
  43. Vincenzo Ferme 21 Micro-Benchmarking with Workflow Patterns performed experiments Three runs for each execution of the experiments Individual patterns 1. 1500 (max) concurrent Instance Producers starting each pattern [SEQ, EXC, CYC, EXT, PAR] Mix of patterns 2. 1500 concurrent Instance Producers starting an equal number of pattern instances (20% mix of [SEQ, EXC, CYC, EXT, PAR])
  44. Vincenzo Ferme 21 Micro-Benchmarking with Workflow Patterns performed experiments Three runs for each execution of the experiments The data collected in the first minute of the execution has not been included in the analysis Individual patterns 1. 1500 (max) concurrent Instance Producers starting each pattern [SEQ, EXC, CYC, EXT, PAR] Mix of patterns 2. 1500 concurrent Instance Producers starting an equal number of pattern instances (20% mix of [SEQ, EXC, CYC, EXT, PAR])
  45. Vincenzo Ferme 21 Micro-Benchmarking with Workflow Patterns performed experiments Three runs for each execution of the experiments The data collected in the first minute of the execution has not been included in the analysis Individual patterns 1. 1500 (max) concurrent Instance Producers starting each pattern [SEQ, EXC, CYC, EXT, PAR] Mix of patterns 2. 1500 concurrent Instance Producers starting an equal number of pattern instances (20% mix of [SEQ, EXC, CYC, EXT, PAR])
  46. Vincenzo Ferme 22 Number of Process Instances Computed Metrics and Statistics engine level metrics Throughput Execution Time Database Size WfMS Users Load Driver Instance Database Application Server DBMS Web Service D A B C …
  47. Vincenzo Ferme 22 Number of Process Instances Computed Metrics and Statistics engine level metrics Throughput Execution Time Database Size WfMS Users Load Driver Instance Database Application Server DBMS Web Service D A B C …
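As a sketch of how engine-level metrics like these can be derived, assume per-instance start and end timestamps have been extracted from the WfMS's instance database; the record layout here is a hypothetical example, not BenchFlow's schema.

```python
# Sketch: computing engine-level metrics (number of process instances and
# throughput in business processes per second) from per-instance start/end
# timestamps extracted from the WfMS instance database. The record layout
# (dicts with numeric "start"/"end" seconds) is a hypothetical example.
def engine_level_metrics(instances: list[dict]) -> dict:
    completed = [i for i in instances if i["end"] is not None]
    if not completed:
        return {"num_instances": 0, "throughput_bp_sec": 0.0}
    first_start = min(i["start"] for i in completed)
    last_end = max(i["end"] for i in completed)
    duration = last_end - first_start  # observation window in seconds
    return {
        "num_instances": len(completed),
        "throughput_bp_sec": len(completed) / duration if duration else 0.0,
    }
```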
  48. Vincenzo Ferme 23 individual patterns: throughput (bp/sec) Micro-Benchmarking with Workflow Patterns
              SEQ      EXC      EXT      PAR      CYC
      WfMS A  1456.79  1417.17  1455.68  1433.12  327.99
      WfMS B  1447.66  1436.03  1426.35  1429.65  644.02
      WfMS C  63.31    49.04    0.38     48.89    13.46
      u=600 u=800 u=1500
  49. Vincenzo Ferme 24 Micro-Benchmarking with Workflow Patterns individual patterns: WfMS C Scalability for SEQ pattern WfMS C uses a synchronous instantiation API, so load drivers must wait for workflows to end before starting new ones
  50. Vincenzo Ferme 24 Micro-Benchmarking with Workflow Patterns individual patterns: WfMS C Scalability for SEQ pattern WfMS C uses a synchronous instantiation API, so load drivers must wait for workflows to end before starting new ones [Charts: Response Time and Throughput vs. number of Instance Producers u, for u = 500 to 2000]
  51. Vincenzo Ferme 24 Micro-Benchmarking with Workflow Patterns individual patterns: WfMS C Scalability for SEQ pattern WfMS C uses a synchronous instantiation API, so load drivers must wait for workflows to end before starting new ones [Charts: Response Time and Throughput vs. number of Instance Producers u, for u = 500 to 2000] We have found a bottleneck! (~60 bp/sec)
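The bottleneck follows from the closed-loop behaviour described on slide 49: with a synchronous instantiation API, each producer blocks until its workflow completes, so throughput is capped at roughly u divided by the mean response time no matter how many producers are added. Here is a toy illustration of that relationship; the numbers are made up, not measurements from the talk.

```python
# Toy sketch of a closed-loop (synchronous) load driver: each of the u
# instance producers blocks until its workflow completes, so throughput
# saturates at about u / response_time once the engine is the bottleneck.
# All numbers below are illustrative assumptions.
def closed_loop_throughput(u: int, response_time_sec: float) -> float:
    return u / response_time_sec

# If response time grows linearly with load (a saturated engine),
# adding producers no longer raises throughput:
for u in (500, 1000, 1500, 2000):
    rt = 0.4 + u * 0.016  # hypothetical: response time grows with u
    print(u, round(closed_loop_throughput(u, rt), 1))
# -> throughput stays flat at roughly 60 bp/sec across all u values
```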
  52. Vincenzo Ferme 25 Computed Metrics and Statistics process level metrics Throughput WfMS Users Load Driver Instance Database Application Server Web Service D A B C DBMS … Instance Duration Per Process Definition
  53. Vincenzo Ferme 25 Computed Metrics and Statistics process level metrics Throughput WfMS Users Load Driver Instance Database Application Server Web Service D A B C DBMS … Instance Duration Per Process Definition
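A sketch of the process-level metric named here (instance duration per process definition), computed with pandas over a flat table of instances; the column names and values are hypothetical.

```python
# Sketch: process-level metrics with pandas. Given one row per process
# instance with start/end timestamps and the process definition it
# belongs to, compute instance duration statistics per definition.
# Column names and the sample values are hypothetical.
import pandas as pd

df = pd.DataFrame({
    "definition": ["SEQ", "SEQ", "PAR", "PAR"],
    "start": [0.0, 0.5, 0.2, 0.9],
    "end":   [1.2, 1.4, 2.0, 2.4],
})
df["duration"] = df["end"] - df["start"]
per_definition = df.groupby("definition")["duration"].agg(
    ["count", "mean", "std"])
print(per_definition)
```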
  54. Vincenzo Ferme 26 Micro-Benchmarking with Workflow Patterns individual patterns: mean (instance duration time)
  55. Vincenzo Ferme 26 Micro-Benchmarking with Workflow Patterns individual patterns: mean (instance duration time) Empty Script 1 Empty Script 2 [SEQ]
  56. Vincenzo Ferme 26 Empty Script 1 Wait 5 Sec [EXT] Micro-Benchmarking with Workflow Patterns individual patterns: mean (instance duration time)
  57. Vincenzo Ferme 26 Micro-Benchmarking with Workflow Patterns individual patterns: mean (instance duration time) Empty Script 1 Empty Script 2 [PAR]
  58. Vincenzo Ferme 26 Micro-Benchmarking with Workflow Patterns individual patterns: mean (instance duration time) CYC Instance Producers: u=600 [CYC] arbitraryCyclesPattern: generate 1 or 2, Empty Script 1, Empty Script 2, i++, i < 10, i >= 10, case_1, case_2
  59. Vincenzo Ferme 27 Computed Metrics and Statistics environment metrics WfMS Users Load Driver Instance Database Application Server Web Service D A B C DBMS CPU/RAM Usage per Second IO Operations/Network Usage per Second …
  60. Vincenzo Ferme 27 Computed Metrics and Statistics environment metrics WfMS Users Load Driver Instance Database Application Server Web Service D A B C DBMS CPU/RAM Usage per Second IO Operations/Network Usage per Second …
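Because every component runs in a container, per-second CPU and RAM usage can be sampled from the Docker API. BenchFlow's own monitors and collectors are described later in the deck as Go services, so the following Python sketch only illustrates the data source; the container name is a placeholder.

```python
# Sketch: sampling environment metrics (RAM usage and the raw counters
# behind CPU usage) for one container via the Docker API, using the
# Docker SDK for Python. The container name is a hypothetical placeholder;
# BenchFlow's actual collectors are written in Go.
import docker

client = docker.from_env()
container = client.containers.get("wfms")  # hypothetical container name
stats = container.stats(stream=False)      # one snapshot, no streaming

mem_bytes = stats["memory_stats"]["usage"]
cpu_total_ns = stats["cpu_stats"]["cpu_usage"]["total_usage"]
print(f"RAM: {mem_bytes / 2**20:.1f} MiB, cumulative CPU: {cpu_total_ns} ns")
```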
  61. Vincenzo Ferme 28 Micro-Benchmarking with Workflow Patterns individual patterns: mean (RAM) and mean (CPU) usage RAM CPU
  62. Vincenzo Ferme 29 Micro-Benchmarking with Workflow Patterns performed experiments Individual patterns 1. 1500 (max) concurrent Instance Producers starting each pattern [SEQ, EXC, CYC, EXT, PAR] Mix of patterns 2. 1500 concurrent Instance Producers starting an equal number of pattern instances (20% mix of [SEQ, EXC, CYC, EXT, PAR]) Three runs for each execution of the experiments The data collected in the first minute of the execution has not been included in the analysis
  63. Vincenzo Ferme 30 Micro-Benchmarking with Workflow Patterns mix of patterns: duration, RAM, CPU
  64. Vincenzo Ferme 31 Micro-Benchmarking with Workflow Patterns summary Even though the Executed Workflows are Simple
  65. Vincenzo Ferme 31 Micro-Benchmarking with Workflow Patterns summary There are relevant differences in workflows’ duration and throughput among the WfMSs Even though the Executed Workflows are Simple
  66. Vincenzo Ferme 31 Micro-Benchmarking with Workflow Patterns summary There are relevant differences in workflows’ duration and throughput among the WfMSs There are relevant differences in resource usage among WfMSs RAM CPU Even though the Executed Workflows are Simple
  67. Vincenzo Ferme 31 Micro-Benchmarking with Workflow Patterns summary There are relevant differences in workflows’ duration and throughput among the WfMSs One WfMS has bottlenecks There are relevant differences in resource usage among WfMSs RAM CPU Even though the Executed Workflows are Simple
  68. Highlights
  69. Highlights Vincenzo Ferme 6 The BenchFlow Benchmarking Process Workload Model WfMS Configurations Test Types Workload Metrics KPIs Measure Input/Process/Output Model Workload Mix 80% C A B 20% D Load Functions Test Data Performance Data Process Instance Duration Throughput Containers Benchmarking Process
  70. Highlights Vincenzo Ferme 6 The BenchFlow Benchmarking Process Workload Model WfMS Configurations Test Types Workload Metrics KPIs Measure Input/Process/Output Model Workload Mix 80% C A B 20% D Load Functions Test Data Performance Data Process Instance Duration Throughput Containers Benchmarking Process Vincenzo Ferme 8 [CLOSER ’16] Container-centric Methodology for Benchmarking Workflow Management Systems. Vendor BenchFlow Provide Benchmarking Methodology Vendor BenchFlow Agree on adding Vendor's WfMS to the Benchmark Benchmarking Methodology Agreement Proposal Signed Agreement Vendor BenchFlow Provide Containerized Distribution of WfMS Containerized WfMS Request Containerized WfMS Vendor BenchFlow Verify the Benchmark Results Results Verification Outcome Results Verification Request Vendor BenchFlow Provide Draft Benchmark Results Draft Benchmark Results Verified Benchmark Results Not Valid Are the Results Valid? Community BenchFlow Publish Benchmark Results Valid Benchmarking Methodology
  71. Highlights Vincenzo Ferme 6 The BenchFlow Benchmarking Process Workload Model WfMS Configurations Test Types Workload Metrics KPIs Measure Input/Process/Output Model Workload Mix 80% C A B 20% D Load Functions Test Data Performance Data Process Instance Duration Throughput Containers Benchmarking Process Vincenzo Ferme 8 [CLOSER ’16] Container-centric Methodology for Benchmarking Workflow Management Systems. Vendor BenchFlow Provide Benchmarking Methodology Vendor BenchFlow Agree on adding Vendor's WfMS to the Benchmark Benchmarking Methodology Agreement Proposal Signed Agreement Vendor BenchFlow Provide Containerized Distribution of WfMS Containerized WfMS Request Containerized WfMS Vendor BenchFlow Verify the Benchmark Results Results Verification Outcome Results Verification Request Vendor BenchFlow Provide Draft Benchmark Results Draft Benchmark Results Verified Benchmark Results Not Valid Are the Results Valid? Community BenchFlow Publish Benchmark Results Valid Benchmarking Methodology Vincenzo Ferme 10 BenchFlow Framework architecture Instance Database DBMS Faban Drivers Containers Servers DATA TRANSFORMERS ANALYSERS Performance Metrics Performance KPIs WfMS TestExecution Analyses Faban + Web Service Minio harness benchflow [BPM ’15] [ICPE ’16] MONITOR Adapters COLLECTORS BenchFlow Framework
  72. Highlights Vincenzo Ferme 6 The BenchFlow Benchmarking Process Workload Model WfMS Configurations Test Types Workload Metrics KPIs Measure Input/Process/Output Model Workload Mix 80% C A B 20% D Load Functions Test Data Performance Data Process Instance Duration Throughput Containers Benchmarking Process Vincenzo Ferme 8 [CLOSER ’16] Container-centric Methodology for Benchmarking Workflow Management Systems. Vendor BenchFlow Provide Benchmarking Methodology Vendor BenchFlow Agree on adding Vendor's WfMS to the Benchmark Benchmarking Methodology Agreement Proposal Signed Agreement Vendor BenchFlow Provide Containerized Distribution of WfMS Containerized WfMS Request Containerized WfMS Vendor BenchFlow Verify the Benchmark Results Results Verification Outcome Results Verification Request Vendor BenchFlow Provide Draft Benchmark Results Draft Benchmark Results Verified Benchmark Results Not Valid Are the Results Valid? Community BenchFlow Publish Benchmark Results Valid Benchmarking Methodology Vincenzo Ferme 10 BenchFlow Framework architecture Instance Database DBMS Faban Drivers Containers Servers DATA TRANSFORMERS ANALYSERS Performance Metrics Performance KPIs WfMS TestExecution Analyses Faban + Web Service Minio harness benchflow [BPM ’15] [ICPE ’16] MONITOR Adapters COLLECTORS BenchFlow Framework Vincenzo Ferme 26 Empty Script 1 Wait 5 Sec [EXT] Micro-Benchmarking with Workflow Patterns individual patterns: mean (instance duration time) Empty Script 1 Empty Script 2 [SEQ] Empty Script 1 Empty Script 2 [PAR] CYC Instance Producers: u=600 [CYC] generate 1 or 2 Empty Script 1 Empty Script 2 i++ i < 10 i >= 10 case_1 case_2 Micro-Benchmark with WP
  73. Vincenzo Ferme 33 • We want to characterise the Workload Mix using Real-World process models • Share your executable BPMN 2.0 process models, even anonymised Process Models • We want to characterise the Load Functions using Real-World behaviours • Share your execution logs, even anonymised Execution Logs • We want to add more and more WfMSs to the benchmark • Contact us for collaboration, and BenchFlow framework support WfMSs Call for Collaboration WfMSs, process models, process logs
  74. benchflow benchflow vincenzo.ferme@usi.ch http://benchflow.inf.usi.ch WORKFLOW ENGINE PERFORMANCE BENCHMARKING WITH BENCHFLOW Vincenzo Ferme Faculty of Informatics USI Lugano, Switzerland
  75. BACKUP SLIDES Vincenzo Ferme Faculty of Informatics USI Lugano, Switzerland
  76. Vincenzo Ferme 36 Published Work [BTW ’15] C. Pautasso, V. Ferme, D. Roller, F. Leymann, and M. Skouradaki. Towards workflow benchmarking: Open research challenges. In Proc. of the 16th conference on Database Systems for Business, Technology, and Web, BTW 2015, pages 331–350, 2015. [SSP ’14] M. Skouradaki, D. H. Roller, F. Leymann, V. Ferme, and C. Pautasso. Technical open challenges on benchmarking workflow management systems. In Proc. of the 2014 Symposium on Software Performance, SSP 2014, pages 105–112, 2014. [ICPE ’15] M. Skouradaki, D. H. Roller, L. Frank, V. Ferme, and C. Pautasso. On the Road to Benchmarking BPMN 2.0 Workflow Engines. In Proc. of the 6th ACM/SPEC International Conference on Performance Engineering, ICPE ’15, pages 301–304, 2015.
  77. Vincenzo Ferme 37 Published Work [CLOSER ’15] M. Skouradaki, V. Ferme, F. Leymann, C. Pautasso, and D. H. Roller. “BPELanon”: Protect business processes on the cloud. In Proc. of the 5th International Conference on Cloud Computing and Service Science, CLOSER 2015. SciTePress, 2015. [SOSE ’15] M. Skouradaki, K. Goerlach, M. Hahn, and F. Leymann. Application of Sub-Graph Isomorphism to Extract Reoccurring Structures from BPMN 2.0 Process Models. In Proc. of the 9th International IEEE Symposium on Service-Oriented System Engineering, SOSE 2015, 2015. [BPM ’15] V. Ferme, A. Ivanchikj, C. Pautasso. A Framework for Benchmarking BPMN 2.0 Workflow Management Systems. In Proc. of the 13th International Conference on Business Process Management, BPM ’15, pages 251–259, 2015.
  78. Vincenzo Ferme 38 Published Work [BPMD ’15] A. Ivanchikj, V. Ferme, C. Pautasso. BPMeter: Web Service and Application for Static Analysis of BPMN 2.0 Collections. In Proc. of the 13th International Conference on Business Process Management [Demo], BPM ’15, pages 30–34, 2015. [ICPE ’16] V. Ferme, and C. Pautasso. Integrating Faban with Docker for Performance Benchmarking. In Proc. of the 7th ACM/SPEC International Conference on Performance Engineering, ICPE ’16, 2016. [CLOSER ’16] V. Ferme, A. Ivanchikj, C. Pautasso, M. Skouradaki, F. Leymann. A Container-centric Methodology for Benchmarking Workflow Management Systems. In Proc. of the 6th International Conference on Cloud Computing and Service Science, CLOSER 2016. SciTePress, 2016.
  79. Vincenzo Ferme [CAiSE ’16] M. Skouradaki, V. Ferme, C. Pautasso, F. Leymann, A. van Hoorn. Micro-Benchmarking BPMN 2.0 Workflow Management Systems with Workflow Patterns. In Proc. of the 28th International Conference on Advanced Information Systems Engineering, CAiSE ’16, 2016. 39 Published Work [ICWE ’16] J. Cito, V. Ferme, H. C. Gall. Using Docker Containers to Improve Reproducibility in Software and Web Engineering Research. In Proc. of the 16th International Conference on Web Engineering, 2016. [ICWS ’16] M. Skouradaki, V. Andrikopoulos, F. Leymann. Representative BPMN 2.0 Process Model Generation from Recurring Structures. In Proc. of the 23rd IEEE International Conference on Web Services, ICWS ’16, 2016.
  80. Vincenzo Ferme 40 Published Work [SummerSOC ’16] M. Skouradaki, T. Azad, U. Breitenbücher, O. Kopp, F. Leymann. A Decision Support System for the Performance Benchmarking of Workflow Management Systems. In Proc. of the 10th Symposium and Summer School On Service-Oriented Computing, SummerSOC ’16, 2016. [OTM ’16] M. Skouradaki, V. Andrikopoulos, O. Kopp, F. Leymann. RoSE: Reoccurring Structures Detection in BPMN 2.0 Process Models Collections. In Proc. of On the Move to Meaningful Internet Systems Conference, OTM ’16, 2016. (to appear) [BPM Forum ’16] V. Ferme, A. Ivanchikj, C. Pautasso. Estimating the Cost for Executing Business Processes in the Cloud. In Proc. of the 14th International Conference on Business Process Management, BPM Forum ’16, 2016. (to appear)
  81. Vincenzo Ferme 41 Docker Performance [IBM ’14] W. Felter, A. Ferreira, R. Rajamony, and J. Rubio. An updated performance comparison of virtual machines and Linux containers. IBM Research Report, 2014. “Our results show that containers result in equal or better performance than VMs in almost all cases.” “Although containers themselves have almost no overhead, Docker is not without performance gotchas. Docker volumes have noticeably better performance than files stored in AUFS. Docker’s NAT also introduces overhead for workloads with high packet rates. These features represent a tradeoff between ease of management and performance and should be considered on a case-by-case basis.” BenchFlow Configures Docker for Performance by Default
  82. Vincenzo Ferme 42 Benchmarking Requirements • Relevant • Representative • Portable • Scalable • Simple • Repeatable • Vendor-neutral • Accessible • Efficient • Affordable References: K. Huppler, The art of building a good benchmark, 2009; J. Gray, The Benchmark Handbook for Database and Transaction Systems, 1993; S. E. Sim, S. Easterbrook et al., Using benchmarking to advance research: A challenge to software engineering, 2003
  83. Vincenzo Ferme 43 BenchFlow Benchmarking Methodology WfMS Configurations requirements from the WfMS CORE / NON-CORE
  84. Vincenzo Ferme 43 BenchFlow Benchmarking Methodology WfMS Configurations requirements from the WfMS CORE / NON-CORE WfMS Initialisation APIs Deploy Process Start Process Instance
  85. Vincenzo Ferme 43 BenchFlow Benchmarking Methodology WfMS Configurations requirements from the WfMS CORE / NON-CORE WfMS Initialisation APIs Deploy Process Start Process Instance + WfMS Claim Task Complete Task User APIs Create User Create Group Pending User Tasks
  86. Vincenzo Ferme 43 BenchFlow Benchmarking Methodology WfMS Configurations requirements from the WfMS CORE / NON-CORE WfMS Initialisation APIs Deploy Process Start Process Instance + WfMS Claim Task Complete Task User APIs Create User Create Group Pending User Tasks Invoke WS Web Service APIs
  87. Vincenzo Ferme 43 BenchFlow Benchmarking Methodology WfMS Configurations requirements from the WfMS CORE / NON-CORE WfMS Initialisation APIs Deploy Process Start Process Instance Event APIs Issue Event WfMS Pending Event Tasks + WfMS Claim Task Complete Task User APIs Create User Create Group Pending User Tasks Invoke WS Web Service APIs
  88. Vincenzo Ferme 44 BenchFlow Benchmarking Methodology WfMS Configurations requirements from the WfMS
      Table 1: Summary of Core and Non-core APIs to be implemented by the WfMS
      Functionality                        Min Response Data
      Core APIs / Initialisation APIs:
        Deploy a process                   Deployed process ID
        Start a process instance           Process instance ID
      Non-core APIs / User APIs:
        Create a user                      User ID
        Create a group of users            User group ID
        Access pending tasks               Pending tasks IDs
        Claim a task*
        Complete a task
      Non-core APIs / Event APIs:
        Access pending events              Pending events IDs
        Issue events
      Non-core APIs / Web service APIs:
        Map tasks to Web service endpoints
      *Optional depending on the WfMS implementation
  89. Vincenzo Ferme 45 BenchFlow Framework system under test Docker Engine
  90. Vincenzo Ferme 45 BenchFlow Framework system under test Docker Engine Containers
  91. Vincenzo Ferme 45 BenchFlow Framework system under test Docker Engine Docker Machine provides Containers
  92. Vincenzo Ferme 45 BenchFlow Framework system under test Docker Swarm Docker Engine Docker Machine provides Containers
  93. Vincenzo Ferme 45 BenchFlow Framework system under test Docker Swarm Docker Engine Docker Machine provides manages Containers Servers
  94. Vincenzo Ferme 45 BenchFlow Framework system under test Docker Swarm Docker Engine Docker Compose Docker Machine provides manages Containers Servers
  95. Vincenzo Ferme 45 BenchFlow Framework system under test Docker Swarm Docker Engine Docker Compose Docker Machine provides manages Containers Servers SUT’s Deployment Conf. deploys
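A minimal sketch of the deployment step shown on slide 95: the SUT's deployment configuration is a Compose file, and the framework brings the containers up and down around each experiment. The file name is a placeholder, and driving the Docker Compose CLI from Python is one possible realisation, not necessarily how BenchFlow implements it.

```python
# Sketch: deploying a SUT described by a Docker Compose file, matching
# the slide's "SUT's Deployment Conf. -> deploys" arrow. The compose
# file path is a hypothetical placeholder.
import subprocess

def deploy_sut(compose_file: str = "sut-deployment.yml") -> None:
    """Start all SUT containers (WfMS, DBMS, ...) in the background."""
    subprocess.run(["docker", "compose", "-f", compose_file, "up", "-d"],
                   check=True)

def teardown_sut(compose_file: str = "sut-deployment.yml") -> None:
    """Stop and remove the SUT containers between experiments, so each
    run starts from the same initial conditions."""
    subprocess.run(["docker", "compose", "-f", compose_file, "down"],
                   check=True)
```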
  96. Vincenzo Ferme 46 Server-side Data and Metrics Collection WfMS Start Workflow asynchronous execution of workflows
  97. Vincenzo Ferme Users D A B C Start Application Server Web Service Load Driver DBMS WfMS Instance Database 46 Server-side Data and Metrics Collection WfMS Start Workflow asynchronous execution of workflows
  98. Vincenzo Ferme Users D A B C Start Application Server Web Service Load Driver DBMS WfMS Instance Database 46 Server-side Data and Metrics Collection WfMS Start Workflow End asynchronous execution of workflows
  99. Vincenzo Ferme 47 Server-side Data and Metrics Collection monitors DBMS Faban Drivers Containers Servers harness WfMS TestExecution Web Service
  100. Vincenzo Ferme 47 Server-side Data and Metrics Collection monitors DBMS Faban Drivers Containers Servers harness WfMS TestExecution Web Service MONITOR
  101. Vincenzo Ferme 48 Server-side Data and Metrics Collection monitors DBMS Faban Drivers Containers Servers harness WfMS TestExecution Web Service Monitors’ Characteristics: • RESTful services • Lightweight (written in Go) • As non-invasive on the SUT as possible. Examples of Monitors: • CPU usage • Database state
  102. Vincenzo Ferme 48 Server-side Data and Metrics Collection monitors DBMS Faban Drivers Containers Servers harness WfMS TestExecution Web Service CPU Monitor DB Monitor Monitors’ Characteristics: • RESTful services • Lightweight (written in Go) • As non-invasive on the SUT as possible. Examples of Monitors: • CPU usage • Database state
  103. Vincenzo Ferme 49 Server-side Data and Metrics Collection collect data DBMS Faban Drivers Containers Servers harness WfMS TestExecution Web Service
  104. Vincenzo Ferme 49 Server-side Data and Metrics Collection collect data DBMS Faban Drivers Containers Servers harness WfMS TestExecution Web Service Minio Instance Database Analyses COLLECTORS
  105. Vincenzo Ferme 50 Server-side Data and Metrics Collection collect data DBMS Faban Drivers Containers Servers harness WfMS TestExecution Web Service Collectors’ Characteristics: • RESTful services • Lightweight (written in Go) • Two types: online and offline • Buffer data locally. Examples of Collectors: • Container’s Stats (e.g., CPU usage) • Database dump • Application Logs
  106. Vincenzo Ferme 50 Server-side Data and Metrics Collection collect data DBMS Faban Drivers Containers Servers harness WfMS TestExecution Web Service Stats Collector DB Collector Collectors’ Characteristics: • RESTful services • Lightweight (written in Go) • Two types: online and offline • Buffer data locally. Examples of Collectors: • Container’s Stats (e.g., CPU usage) • Database dump • Application Logs
  107. Vincenzo Ferme 51 TestExecution A high-throughput distributed messaging system Minio Instance Database Analyses DBMS Faban Drivers Containers Servers harness WfMS Web Service Stats Collector DB Collector Synchronisation with the Analyses schedule analyses
  108. Vincenzo Ferme 51 TestExecution A high-throughput distributed messaging system Minio Instance Database Analyses DBMS Faban Drivers Containers Servers harness WfMS Web Service Stats Collector DB Collector COLLECT Synchronisation with the Analyses schedule analyses
  109. Vincenzo Ferme 51 TestExecution A high-throughput distributed messaging system Minio Instance Database Analyses DBMS Faban Drivers Containers Servers harness WfMS Web Service Stats Collector DB Collector COLLECT Publish Synchronisation with the Analyses schedule analyses
  110. Vincenzo Ferme 51 TestExecution A high-throughput distributed messaging system Minio Instance Database Analyses DBMS Faban Drivers Containers Servers harness WfMS Web Service Stats Collector DB Collector COLLECT Publish Subscribe Synchronisation with the Analyses schedule analyses
  111. Vincenzo Ferme 51 TestExecution A high-throughput distributed messaging system Minio Instance Database Analyses DBMS Faban Drivers Containers Servers harness WfMS Web Service Stats Collector DB Collector COLLECT Publish Subscribe Read Synchronisation with the Analyses schedule analyses
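The "high-throughput distributed messaging system" on these slides decouples data collection from analysis via publish/subscribe. Assuming it is Apache Kafka (the slides quote Kafka's well-known tagline, but do not name it), here is a minimal sketch using the kafka-python client; the broker address, topic name, and payload are hypothetical.

```python
# Sketch: synchronising collectors and analyses over a publish/subscribe
# messaging system, assuming Apache Kafka and the kafka-python client.
# Broker address, topic, and payload are hypothetical placeholders.
from kafka import KafkaConsumer, KafkaProducer

# Collector side: publish a pointer to newly collected data (e.g., an
# object stored in Minio).
producer = KafkaProducer(bootstrap_servers="kafka:9092")
producer.send("collected-data", b"minio://test-42/wfms-db-dump.sql")
producer.flush()

# Analyses side: subscribe and schedule an analysis per message.
consumer = KafkaConsumer("collected-data", bootstrap_servers="kafka:9092")
for message in consumer:
    print("schedule analysis for", message.value.decode())
```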
  112. Vincenzo Ferme 52 Performance Metrics and KPIs amount of data REALISTIC DATA Number of Repetitions: 1, 3, 5, 10
  113. Vincenzo Ferme 52 Performance Metrics and KPIs amount of data REALISTIC DATA Number of Repetitions: 1, 3, 5, 10 Number of WfMSs: 1
  114. Vincenzo Ferme 53 Micro-Benchmarking with Workflow Patterns mix of patterns: throughput (bp/sec) MIX: WfMS A 1061.27, WfMS B 1402.33, WfMS C 1.78
  115. Vincenzo Ferme 54 Computed Metrics and Statistics feature level metrics WfMS Users Load Driver Instance Database Application Server Web Service D A B C DBMS Throughput … Instance Duration Per Construct (type, name)
  116. Vincenzo Ferme 54 Computed Metrics and Statistics feature level metrics WfMS Users Load Driver Instance Database Application Server Web Service D A B C DBMS Throughput … Instance Duration Per Construct (type, name)
  117. Vincenzo Ferme 55 Computed Metrics and Statistics interactions metrics Response Time Latency … WfMS Users Load Driver Instance Database Application Server DBMS Web Service D A B C
  118. Vincenzo Ferme 55 Computed Metrics and Statistics interactions metrics Response Time Latency … WfMS Users Load Driver Instance Database Application Server DBMS Web Service D A B C
  119. Vincenzo Ferme 56 Computed Metrics and Statistics descriptive and homogeneity statistics • Descriptive Statistics (e.g., Mean, Confidence Interval, Standard Deviation, Percentiles, Quartiles, …) • Coefficient of Variation across trials • Levene Test on trials
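As a short sketch of the two homogeneity checks named on this slide, here are the coefficient of variation across trials and Levene's test for equality of variances, computed with NumPy and SciPy on hypothetical per-trial duration samples.

```python
# Sketch: descriptive and homogeneity statistics across repeated trials,
# as listed on the slide: coefficient of variation across trials and the
# Levene test on trials. The sample data is hypothetical.
import numpy as np
from scipy import stats

trials = [
    np.array([1.20, 1.25, 1.22, 1.30]),  # instance durations, trial 1
    np.array([1.19, 1.28, 1.24, 1.27]),  # trial 2
    np.array([1.31, 1.22, 1.26, 1.25]),  # trial 3
]

means = np.array([t.mean() for t in trials])
cv = means.std(ddof=1) / means.mean()  # coefficient of variation across trials
levene_stat, p_value = stats.levene(*trials)  # homogeneity of variances

print(f"CV across trials: {cv:.3%}")
print(f"Levene test: W={levene_stat:.3f}, p={p_value:.3f}")
```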
