Invited talk at the CAiSE'2019 Workshop on Blockchains for Inter-Organizational Collaboration and Flexible Advanced Information Systems (BIOC & FAiSE 2019).
Interpreted Execution of Business Process Models on Blockchain - Marlon Dumas
Research paper presentation delivered at the IEEE Enterprise Computing Conference (EDOC), Paris, France, 30 October 2019. The paper introduces the technical details of Caterpillar's business process execution engine v3.0 (https://git.io/caterpillar). Paper available at https://arxiv.org/pdf/1906.01420.pdf
Process Mining and Predictive Process Monitoring: From Technology to Business... - Marlon Dumas
This document discusses process mining and predictive process monitoring. It begins with an overview of business process management and how process mining fits within the broader process architecture. It then covers the key techniques in process mining like process discovery, conformance checking, performance mining, and predictive process monitoring. Examples of process mining case studies in different domains are provided. The document concludes with a discussion of how process mining can be used to enable automated process improvement.
Process Mining 2.0: From Insights to Actions - Marlon Dumas
The document discusses several topics in process mining research including predictive process monitoring, prescriptive process monitoring, robotic process mining, data-driven simulation, and causal process mining. It provides references for further research on each topic, with links to relevant papers that outline techniques in each area.
Business Process Analytics: From Insights to Predictions - Marlon Dumas
Keynote talk at the 13th Baltic Conference on Databases and Information Systems, Trakai, Lithuania, 2 July 2018.
Abstract
Business process analytics is a body of methods for analyzing data generated by the execution of business processes in order to extract insights about weaknesses and improvement opportunities, at both the tactical and operational levels. Tactical process analytics methods (also known as process mining) allow us to understand how a given business process is actually executed, whether and how its execution deviates from expected or normative pathways, and what factors contribute to poor process performance or undesirable outcomes. Meanwhile, operational process analytics methods allow us to monitor ongoing executions of a business process in order to predict future states and undesirable outcomes at runtime (predictive process monitoring). Existing methods in this space allow us to answer questions such as: Which task will be executed next in a case, when, and who will perform it? When will an ongoing case complete? What will its outcome be, and how can negative outcomes be avoided? This keynote will present a framework for conceptualizing business process analytics methods and applications. The talk will provide an overview of state-of-the-art methods and tools in the field and will outline open challenges and research opportunities.
In Processes We Trust: Privacy and Trust in Business Processes - Marlon Dumas
This document discusses challenges and opportunities around privacy and trust in business processes. It begins by defining key concepts like security, privacy, and trust. It then outlines topics related to business process security and privacy, such as access control, flow analysis to detect unauthorized data access, and privacy-aware business process execution. The document proposes approaches for privacy-aware business processes using techniques like k-anonymization and multi-party computation. It describes a system called Pleak.io that aims to model stakeholders, data flows, and privacy-enhancing technologies to quantify privacy leaks and accuracy loss in processes. The document concludes by discussing challenges around collaborative processes with untrusted parties and the potential use of distributed ledgers and smart contracts to address issues of trust.
Apromore: Advanced Business Process Analytics on the Cloud - Marlon Dumas
Tutorial delivered at the 16th International Conference on Business Process Management (BPM'2018), Sydney, Australia, 13 September 2018. The tutorial provides an introduction to process mining and predictive process monitoring using Apromore.
Caterpillar: A Blockchain-Based Business Process Management System - Marlon Dumas
Caterpillar is a blockchain-based business process management system (BPMS) that uses smart contracts to store process state and drive process execution without a database or separate execution engine. Key process data like the process model and instance state are stored on the blockchain, providing a single source of truth. Process models designed in BPMN can be automatically translated to smart contracts. This ensures correct and transparent execution across participants on the blockchain network. While promising, challenges remain around transaction costs, throughput limits, and handling large amounts of process data efficiently.
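Caterpillar itself compiles BPMN models into Solidity contracts, but the core idea of keeping instance state on-chain can be sketched in plain Python (the task names and flow relation below are invented for illustration): each enabled task is one bit in a marking, and firing a task clears its bit and enables its successors, much as a compiled contract would update a stored state variable.

```python
# Illustrative sketch, not Caterpillar's actual code: process instance state
# held as a bitmask of enabled tasks, updated on each task execution.

class ProcessInstance:
    # Each task gets a bit position; FLOWS maps a task to its successors.
    TASKS = {"receive_order": 0, "check_stock": 1, "ship": 2}
    FLOWS = {"receive_order": ["check_stock"], "check_stock": ["ship"], "ship": []}

    def __init__(self):
        self.marking = 1 << self.TASKS["receive_order"]  # start task enabled

    def enabled(self, task):
        return bool(self.marking & (1 << self.TASKS[task]))

    def fire(self, task):
        if not self.enabled(task):
            raise ValueError(f"task {task} not enabled")
        self.marking &= ~(1 << self.TASKS[task])   # consume the task's token
        for succ in self.FLOWS[task]:              # enable its successors
            self.marking |= 1 << self.TASKS[succ]

inst = ProcessInstance()
inst.fire("receive_order")
inst.fire("check_stock")
inst.fire("ship")
assert inst.marking == 0  # all tasks consumed: process completed
```

Because the whole state fits in one integer, every state change is a single cheap update, which is one reason bitset encodings are attractive when every write to the blockchain costs gas.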
Process Mining and Predictive Process Monitoring - Marlon Dumas
This document discusses process mining and predictive process monitoring. It begins with an overview of offline process mining techniques like process discovery, conformance checking, and deviance mining. It then discusses applying these techniques online for predictive process monitoring, including predicting outcomes, deviations, or failures. Various techniques are presented like nearest neighbor classification of partial traces and clustering traces before classification. The goal is to accurately predict outcomes during process execution based on control flow, data attributes, and textual case data.
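The nearest-neighbour idea mentioned above can be sketched with standard-library Python (trace contents and outcome labels are invented): encode each trace prefix as a bag of activities and predict the outcome of a running case from its closest historical prefix.

```python
from collections import Counter

# Hedged sketch of 1-nearest-neighbour outcome prediction on partial traces.
# Real predictive monitoring methods use richer encodings (data attributes,
# timestamps, text); this shows only the control-flow core.

def encode(prefix):
    # Bag-of-activities encoding of a (partial) trace.
    return Counter(prefix)

def distance(a, b):
    # Euclidean distance over activity counts; Counter returns 0 for absent keys.
    keys = set(a) | set(b)
    return sum((a[k] - b[k]) ** 2 for k in keys) ** 0.5

def predict(history, running_prefix):
    # history: list of (completed trace, outcome) pairs.
    enc = encode(running_prefix)
    k = len(running_prefix)
    best = min(history, key=lambda pair: distance(enc, encode(pair[0][:k])))
    return best[1]

history = [
    (["register", "check", "approve", "pay"], "positive"),
    (["register", "check", "reject"], "negative"),
]
print(predict(history, ["register", "check", "approve"]))  # -> positive
```

Clustering traces first (as the summary notes) amounts to restricting `history` to the cluster nearest the running prefix before applying the same classification step.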
Split Miner: Discovering Accurate and Simple Business Process Models from Eve... - Marlon Dumas
Paper presentation delivered by Adriano Augusto at the IEEE International Conference on Data Mining (ICDM'2017) on 21 November 2017. The paper is available at: http://kodu.ut.ee/~dumas/pubs/icdm2017-split-miner.pdf
Automated Discovery of Data Transformations for Robotic Process Automation - Marlon Dumas
Paper presentation by Artem Polyvyanyy at the AAAI Workshop on Intelligent Process Automation (IPA), New York, 7 February 2020. Paper available at: https://arxiv.org/pdf/1912.01855.pdf
Multi-Perspective Comparison of Business Process Variants Based on Event Logs - Marlon Dumas
This document presents a method for multi-perspective comparison of business process variants based on event logs. The method involves constructing perspective graphs from different abstractions of event logs to analyze processes from different perspectives based on event attributes. Differential perspective graphs are then used to identify statistically significant differences between two event logs, representing different process variants. The method was experimentally applied to compare differences between divisions in an IT incident handling process using various abstractions and observations. The experiments revealed differences in activity statuses, control flows between countries, and control flow frequencies over time between the divisions.
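A minimal version of the "statistically significant difference" step can be sketched as follows (the counts and the 2x2 encoding are invented; the paper's differential perspective graphs compare many such statistics at once): test whether an activity occurs significantly more often in one log variant than the other with a chi-square test on a 2x2 contingency table.

```python
# Sketch, stdlib only: chi-square statistic for a 2x2 table. The critical
# value 3.841 corresponds to df=1 at significance level 0.05.

def chi_square_2x2(a_yes, a_no, b_yes, b_no):
    table = [[a_yes, a_no], [b_yes, b_no]]
    total = a_yes + a_no + b_yes + b_no
    stat = 0.0
    for i in range(2):
        for j in range(2):
            col = table[0][j] + table[1][j]
            expected = sum(table[i]) * col / total
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

# Variant A: activity occurs in 80 of 200 cases; variant B: in 30 of 180.
stat = chi_square_2x2(80, 120, 30, 150)
print(stat > 3.841)  # -> True: the frequency difference is significant
```

Repeating this per activity (or per edge of a perspective graph) and keeping only the significant differences yields the kind of differential view the method produces.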
Process Mining Reloaded: Event Structures as a Unified Representation of Proc... - Marlon Dumas
Keynote talk at the 36th International Conference on Application and Theory of Petri Nets and Concurrency (Petri Nets 2015).
Screencast available at: https://youtu.be/9bQr0r_WaoE
Automated Process Improvement: Status, Challenges, and Perspectives - Marlon Dumas
Automated process improvement uses process mining techniques to recommend optimizations to business processes. It can suggest changes to tasks, control flow, decisions, and resource allocation based on event log analysis. Process mining discovers predictive models and simulates the effects of different changes to identify sets of improvements that optimize given performance metrics. Key challenges include scaling to real processes, estimating impacts on multiple metrics, and usability of change recommendations.
Learning Accurate LSTM Models of Business Processes - Marlon Dumas
Presentation delivered at the 17th International Conference on Business Process Management (BPM), Vienna, Austria, 3 September 2019. Paper available at: http://kodu.ut.ee/~dumas/pubs/bpm2019lstm.pdf
Presenter: Manuel Camargo
Beyond Tasks and Gateways: Automated Discovery of BPMN Models with Subprocess... - Marlon Dumas
Paper presentation at the 12th International BPM Conference, Eindhoven, The Netherlands, September 2014. The corresponding paper can be found at: http://math.ut.ee/~dumas/pubs/bpm2014bpmnminer.pdf
White-box prediction of process performance indicators via flow analysis - Marlon Dumas
Presentation delivered by Ilya Verenich at the International Conference on Software Processes (ICSSP'2017), Paris, France, July 2017. This paper received the best paper award at the conference. Paper available at: http://kodu.ut.ee/~dumas/pubs/icssp2017whitebox.pdf
This document describes an FPGA CEP (complex event processing) appliance that can analyze streaming data and events in real-time with ultra-low latency. The appliance provides sub-microsecond latency, high throughput of over 1 million events per second, and low power consumption. It is useful for applications that require fast real-time decision making based on relationships between different data streams, such as high-frequency trading, network monitoring, and sensor data analysis. The appliance achieves its performance through the parallelism and lack of overhead provided by implementing the processing pipeline directly in an FPGA.
SERENE 2014 Workshop: Paper "Combined Error Propagation Analysis and Runtime ... - SERENEWorkshop
SERENE 2014 - 6th International Workshop on Software Engineering for Resilient Systems
http://serene.disim.univaq.it/
Session 4: Monitoring
Paper 3: Combined Error Propagation Analysis and Runtime Event Detection in Process-driven Systems
An innovative software framework and toolkit for process optimization deploye... - Sudhendu Rai
The document discusses how process optimization solutions were developed and delivered at scale to Xerox's document production outsourcing services business. An automated software toolkit was created that encapsulates advanced analytics, process modeling, optimization, and scheduling techniques. This allowed the delivery of process optimization services using less skilled personnel and enabled significant cost savings and improved customer satisfaction. The key aspects of the solution included modeling different components of the print production process, collecting shop floor data, and developing algorithms and modules for simulation, optimization, and monitoring.
Complex Event Processing (CEP) involves detecting patterns in streams of event data. CEP tools analyze multiple simple events to identify complex events inferred from them. Typical applications include monitoring for business anomalies and detecting fraud or security threats. CEP augments service-oriented architectures by allowing services to be triggered by events and to generate new event streams. Event processing engines use techniques like filtering, windows, and correlation to detect patterns across events over time.
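The filtering-window-correlation combination can be illustrated with a small stdlib sketch (the event shape and the "possible fraud" rule are invented): filter a stream down to one event type, keep a sliding time window per user, and raise a complex event when the window holds enough occurrences.

```python
from collections import deque

# Toy CEP rule: flag a user when 3 or more "login_failed" events
# fall within a 60-second window. Events arrive ordered by timestamp.

def detect(events, threshold=3, window=60):
    alerts, recent = [], {}
    for ts, user, kind in events:
        if kind != "login_failed":           # filtering
            continue
        q = recent.setdefault(user, deque()) # correlation by user
        q.append(ts)
        while q and ts - q[0] > window:      # evict events outside the window
            q.popleft()
        if len(q) >= threshold:              # complex event inferred
            alerts.append((ts, user, "possible_fraud"))
    return alerts

events = [(0, "alice", "login_failed"), (10, "alice", "login_failed"),
          (30, "bob", "login_ok"), (45, "alice", "login_failed"),
          (200, "alice", "login_failed")]
print(detect(events))  # -> [(45, 'alice', 'possible_fraud')]
```

Production CEP engines express such rules declaratively and handle out-of-order arrival, but the window/threshold mechanics are the same.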
This document proposes an algorithm to merge business processes from different organizations to facilitate collaboration. The algorithm begins by preprocessing the business processes into directed graph formats. It then identifies the maximum common regions between the graphs and computes added nodes and edges. A transitive reduction is applied to extract the minimum number of edges that maintains reachability. The merged graph is then converted back into a business process model and validated. The algorithm aims to increase compatibility between organizational processes to reduce collaboration costs and effort.
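The transitive-reduction step named above can be sketched in a few lines of stdlib Python (the graph encoding as an edge list is an assumption; the paper's algorithm operates on richer process graphs): drop every edge u->v whose target is still reachable from u through some other path.

```python
# Minimal transitive reduction for a DAG given as a list of (u, v) edges.

def reachable(adj, src, dst, skip_edge):
    # DFS from src to dst, ignoring the single edge being tested.
    stack, seen = [src], set()
    while stack:
        n = stack.pop()
        for m in adj.get(n, []):
            if (n, m) == skip_edge:
                continue
            if m == dst:
                return True
            if m not in seen:
                seen.add(m)
                stack.append(m)
    return False

def transitive_reduction(edges):
    adj = {}
    for u, v in edges:
        adj.setdefault(u, []).append(v)
    # Keep an edge only if removing it would break reachability.
    return [(u, v) for u, v in edges if not reachable(adj, u, v, (u, v))]

# a->b->c plus the redundant shortcut a->c: the shortcut is removed.
print(transitive_reduction([("a", "b"), ("b", "c"), ("a", "c")]))
```

This quadratic-ish approach is fine for process models of typical size; library implementations (e.g. graph toolkits) use the same reachability criterion.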
How to Tackle Barriers of Application Modernization (Red Hat Forum 2019) - Masahiko Umeno
This is a translated presentation at Red Hat Forum Tokyo 2019.
Every company faces problems with application modernization, and they all share the same issues. The talk covers three things: application architecture, granularity, and development method.
It also conveys what we have to do before containerizing.
QualityBPM@Heidelberg Innovation Forum 2014 - Tobias Unger
The document discusses QualityDrivenSimulations (QDS), a solution that uses data quality analysis and business process management techniques to optimize complex simulations. QDS executes simulations only once by analyzing data quality at runtime, adapting data and parameters based on the results, and transforming simulations into transparent, efficient business processes. This approach can reduce simulation time by 70% for applications like modeling human organ behavior. The founders seek funding to further develop their data quality framework and business consulting services for automotive and other industries.
IBM Blockchain Platform - Architectural Good Practices v1.0 - Matt Lucas
This document discusses architectural good practices for blockchains and Hyperledger Fabric performance. It provides an overview of key concepts like transaction processing in Fabric and performance metrics. It also covers optimizing different parts of the Fabric network like client applications, peers, ordering service, and chaincode. The document recommends using tools like Hyperledger Caliper and custom test harnesses for performance testing and monitoring Fabric deployments. It highlights lessons learned from real projects around reusing connections and load balancing requests.
Modeling is important for translating business concepts into technical concepts that can be implemented on a blockchain. Key elements to model include assets, contracts, transactions, the business network, and participants. Assets correspond to real-world items and are modeled as data structures. Contracts define the rules and algorithms for modifying asset states. Transactions invoke contracts and are recorded on the blockchain. The business network and participants are modeled as the technical network and identities. Modeling provides a bridge between business and technical views to enable blockchain solutions.
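The asset/contract/transaction triad can be made concrete with a short Python sketch (all names, the participant registry, and the transfer rule are invented for illustration; platforms such as Hyperledger Composer express the same ideas in their own modeling languages):

```python
from dataclasses import dataclass

# Assets are data structures; contracts are rules that modify asset state;
# transactions are recorded contract invocations.

@dataclass
class Asset:
    asset_id: str
    state: dict

@dataclass
class Transaction:
    contract: str
    asset_id: str
    payload: dict

PARTICIPANTS = {"grower", "shipper", "retailer"}  # the business network

def transfer_contract(asset, payload):
    # Rule: ownership may only move to a registered participant.
    if payload["new_owner"] not in PARTICIPANTS:
        raise ValueError("unknown participant")
    asset.state["owner"] = payload["new_owner"]

ledger = []  # stand-in for the blockchain's transaction log
shipment = Asset("A-1", {"owner": "grower"})
tx = Transaction("transfer", "A-1", {"new_owner": "shipper"})
transfer_contract(shipment, tx.payload)
ledger.append(tx)
print(shipment.state["owner"])  # -> shipper
```

The point of the exercise is the separation of concerns: business analysts reason about assets and rules, while the platform handles identities, consensus, and the append-only log.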
This document discusses key concepts and components related to blockchain solutions, including actors such as users, developers, operators, and architects. It describes various components that make up blockchain solutions such as ledgers, smart contracts, consensus mechanisms, and how applications interact with blockchains. It also covers considerations for blockchain developers and operators, and challenges around integrating blockchains with existing systems and achieving determinism.
Digital transformation is about rethinking your business models and industries, operating at higher levels of innovation to provide better business value for customers while lowering TCO. Whether a business is born digital or analog, an effective DT strategy is essential to remain competitive in this new era: multi-speed IT architectures and the latest technologies enable businesses to do more, providing a better experience to partners and customers.
Blockchain is a revolutionary technology. In this webinar, we'll explore the fundamentals of blockchain, how IBM Blockchain works on IBM's Digital Innovation Platform, Bluemix, and how it has the potential to increase trust, transparency, and efficiency in your business or industry.
Blockchain is a technology for a new generation of transactional applications that establishes trust, accountability, and transparency while streamlining processes in business networks. Think of it as an operating system for interactions between participants in a business network. It has the potential to vastly reduce the cost and complexity of getting things done.
This document provides an overview of blockchain technology and its applications for business. It begins with defining blockchain as a shared, immutable ledger for recording transactions across a network of nodes. It then discusses how blockchain can be used to build trust and transparency in business processes by enabling real-time sharing of information across organizations. The document provides examples of how blockchain is being applied in various industries like trade finance, food supply chains, and healthcare to improve processes like payments, provenance tracking, and data sharing. It also outlines factors to consider when selecting blockchain use cases and developing blockchain solutions.
Blockchain can improve business processes by functioning as a shared system of record that eliminates the need for reconciling disparate ledgers. Each member has access rights so confidential information is selectively shared. Consensus from all members is required, and validated transactions cannot be deleted, providing an immutable record. The document provides examples of how blockchain could track high-resolution photos and product details for diamonds throughout the supply chain and maintain real-time payment records.
Blockchain technology is increasingly being considered for applications in business contexts due to its key properties. It is also much hyped for its potential to transform existing industries and business models. In Part 1, we will introduce the key properties of blockchain, its limitations, the field, and its relevance for SAP and enterprises in general. In Part 2, we will focus on one of the prominent suites available today and provide a demonstration of the POC we’ve developed.
Blockchains and Smart Contracts: Architecture Design and Model-Driven Develop...Ingo Weber
The document discusses research conducted by Data61's Architecture and Analytics Platforms (AAP) team on blockchains and smart contracts. The research includes developing a taxonomy and design process for architecting applications on blockchain, comparing the cost of using blockchain versus cloud services for business process execution, using architectural modeling to predict latency for blockchain-based systems, and developing a model-driven approach to define and execute smart contracts for monitoring and executing collaborative business processes across untrusted organizations.
- Hyperledger Fabric now supports Ethereum smart contracts through integration with the Ethereum Virtual Machine (EVM). This will allow Ethereum developers to work with Hyperledger Fabric and migrate smart contracts and decentralized apps between the platforms.
- Hyperledger is an open source blockchain project hosted by the Linux Foundation. It includes various blockchain frameworks and tools including Fabric, Sawtooth, and Composer. Fabric is the most widely adopted Hyperledger blockchain framework.
- Hyperledger blockchain applications interact with peers to access and update the shared ledger. The ledger contains a growing list of immutable transaction records organized into blocks.
Blockchain technology provides benefits for supply chain processes by creating an immutable shared ledger that multiple parties can access in real-time. This removes costs and complexity from transactions. Potential supply chain use cases for blockchain include tracking shipments and inventory, managing supplier relationships, ensuring quality compliance, and providing supply chain visibility and traceability. Challenges include selecting appropriate use cases and investing in more complex processes that require higher performance. Blockchain is best applied to multi-party processes that involve exchanging assets and could benefit from increased trust, transparency and efficiency.
Slides for the talk by Dr Michael Zargham at the University of Pennsylvania's Warren Center for Network and Data Sciences on April 19, 2018. Concepts, formal theory, and data are presented.
The document provides an overview of IBM Blockchain Platform and Hyperledger Fabric. It discusses key concepts like transactions, endorsement policies, smart contracts, and the transaction lifecycle of execute-order-validate on the blockchain network. It also covers developer tools like SDKs, wallets, and how applications can interact with the blockchain network through submitting transactions and listening for events.
Hyperledger Composer is a framework for developing blockchain applications that focuses on business logic rather than technical implementation details. It allows users to model assets, participants, transactions and events using familiar programming concepts. These models are used to generate code for a distributed ledger that can integrate with existing systems. Composer provides tools for modeling, access control, transaction processing and deployment to distributed ledgers like Hyperledger Fabric.
The document discusses how a company integrated their Taleo Enterprise Edition cloud application with their on-premise Oracle E-Business Suite. They built lightweight, reusable integration components to enable bi-directional data transfers between the two systems. The integration was completed within three weeks by designing transactions from the destination system backwards and building the integration from the destination to the source.
Blockchain solution architecture deliverableSarmad Ibrahim
This document discusses key architectural decisions for designing blockchain solution networks using Hyperledger Fabric. It outlines considerations for direct vs indirect network participation, secure key management, certificate authority design, data storage choices regarding on-chain and off-chain data, endorsement policy design, integration with enterprise systems, and deployment models. The document provides guidance for solution architects in assessing these decisions and designing blockchain business networks.
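One of the decisions listed above, endorsement policy design, boils down to choosing which combinations of organizations must sign off on a transaction. A minimal sketch, assuming a simplified "n-out-of" policy structure rather than Fabric's actual policy syntax:

```python
def satisfies(policy, endorsements):
    """Evaluate a simplified endorsement policy against a set of endorsing orgs.

    policy is a nested structure:
      ("signed-by", org)           -- that org endorsed
      ("out-of", n, [subpolicies]) -- at least n subpolicies are satisfied
    This mirrors the spirit of Fabric's n-out-of policies, not its exact syntax.
    """
    kind = policy[0]
    if kind == "signed-by":
        return policy[1] in endorsements
    if kind == "out-of":
        n, subs = policy[1], policy[2]
        return sum(satisfies(p, endorsements) for p in subs) >= n
    raise ValueError(f"unknown policy kind: {kind}")

# "Any 2 of Org1, Org2, Org3 must endorse"
policy = ("out-of", 2, [("signed-by", "Org1"),
                        ("signed-by", "Org2"),
                        ("signed-by", "Org3")])
assert satisfies(policy, {"Org1", "Org3"})
assert not satisfies(policy, {"Org2"})
```

Nesting "out-of" nodes lets an architect express policies such as "a majority of banks AND the regulator", which is the kind of trade-off the endorsement-policy decision is about.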
IBM Cloud Côte d'Azur Meetup - Blockchain Business Processes & Rule-based Sm...IBM France Lab
The document discusses integrating IBM Operational Decision Manager (ODM) with blockchain and smart contracts to enable rule-based smart contracts. It provides examples of how ODM can be used to author, govern, and execute the business logic and rules that define smart contract terms and conditions. Key points include externalizing smart contract business logic in ODM for maintenance by business users, and integrating an ODM rule execution server with Hyperledger Fabric to allow calling decision services from smart contracts via REST APIs. This enables evolving smart contracts governed by business users without changing code.
Migrating from oracle soa suite to microservices on kubernetesKonveyor Community
Watch presentation recording: https://youtu.be/cxH6WjDZc2c
In this session, we’ll explore how Randoli helped a postal technology company migrate their payment gateway applications off Oracle SOA Suite to Camel/Spring Boot on Kubernetes.
The primary drivers for the migration were: moving to cloud-native technologies in keeping with the organizational digital transformation mandate; moving away from an outdated centralized platform to a decentralized architecture for efficiency, scalability, and manageability; and escaping the very high licensing costs of the existing platform.
We’ll discuss:
- The high-level approach we took during the migration including architecture and design decisions.
- How we used Camel/Spring Boot to implement the services.
- Why and how we used Drools for implementing business rules.
- The test-driven approach using Camel testing framework and how it helped reduce issues.
- CI/CD and build process on Kubernetes.
- How we tackled logging, monitoring, and tracing challenges.
Presenter: Rajith Attapattu, Managing Partner & CTO @ Randoli Inc.
Blockchain and distributed ledger technology (DLT) can help address trust issues in business networks. Blockchain provides a decentralized peer-to-peer network with a shared, distributed ledger that allows all participants to see a single system of record. It records all transactions across the business network, shares the ledger among participants, and replicates it so each participant has their own copy while maintaining privacy. This can enable use cases like maintaining shared reference data, tracking provenance in supply chains, creating immutable financial records for auditing, and facilitating faster letter of credit transactions with reduced costs and risk. DLTs like Hyperledger and Corda implement features of blockchains but without requiring a native digital asset.
Continuous Lifecycle London 2018 Event KeynoteWeaveworks
Today it’s all about delivering velocity without compromising on quality, yet it’s becoming increasingly difficult for organisations to keep up with the challenges of current release management and traditional operations. The demand for developers to own the end-to-end delivery, including operational ownership, is increasing. A “you build it, you own it” development process requires tools that developers know and understand. So I’d like to introduce “GitOps”: an agile software lifecycle for modern applications.
In this session, I will discuss these industry challenges, including current CI/CD trends and how they’re converging with operations and monitoring. I’ll also illustrate the GitOps model, identify best practices and tools to use, and explain how you can benefit from adopting this methodology, inherited from best practices going back 10-15 years.
Similar to Collaborative Business Process Execution on Blockchain: The Caterpillar Approach (20)
How GenAI will (not) change your business?Marlon Dumas
Not all new technology waves are the same. Some waves are vertical (3D printing, digital twins, blockchain) while others are horizontal (the PC in the 80s, the Web in the 90s). GenAI is a horizontal wave. The question is not whether GenAI will impact your business, but what the scope of this impact will be. In this talk, we will go through a journey of collisions: GenAI colliding with customer service, clerical work, information search, content production, IT development, product design, and other knowledge work. A common thread for understanding the impact of GenAI is to distinguish between descriptive use cases (search, summarize, expand, transcribe & translate) and creative ones.
Walking the Way from Process Mining to AI-Driven Process OptimizationMarlon Dumas
While generative AI grabs headlines, most organizations are yet to achieve continuous process improvement from predictive and prescriptive analytics.
Why? It’s largely about data, people, and a methodical approach to deploy AI to connect data and people. The good news is that if your organization has built a process mining capability, you are well placed to climb the ladder to achieve AI-driven process optimization. But to get there, you need a disciplined step-by-step approach along two tracks: a tactical management track and an operational management track.
First, it’s about predicting what will happen if you leave your process as-is, and what will happen if you implement a change in your process. At a tactical level, a predictive capability allows you to prioritize improvement opportunities. At an operational level, it allows you to predict issues, such as deadline violations. The challenges here are how to manage the inherent uncertainty of data-driven AI systems, and how to change your people and culture to manage processes proactively rather than reactively. It is one thing to deploy predictive dashboards; it is another thing entirely to get people to use them effectively to improve the processes.
Next, it’s about becoming preemptive: continuously optimizing your processes by leveraging streams of data-driven recommendations to trigger changes and actions. At the tactical level, this prescriptive capability allows you to implement the right changes to maximize competing KPIs. At the operational level, it means triggering interventions in your processes to “wow” customers and to meet SLAs in a cost-effective manner. The challenge here is how to help process owners, workers, and other stakeholders understand the causes of performance issues and how the recommendations generated by the AI-driven optimization system will tackle those causes.
And finally, as icing on the cake, generative AI allows you to produce improvement scenarios to adapt to external changes. Importantly, the transformative potential of generative AI in the context of process improvement does not come from its ability to provide question-and-answer interfaces to query data. It comes from its ability to support continuous process adaptation by generating and validating hypotheses based on a holistic view of your organization.
In this talk, we will discuss how organizations are driving sustainable business value by strategically layering predictive, prescriptive, and generative AI onto a process mining foundation, one brick at a time.
Industry keynote talk by Marlon Dumas at the 5th International Conference on Process Mining (ICPM'2023), Rome, Italy, 25 October 2023
Discovery and Simulation of Business Processes with Probabilistic Resource Av...Marlon Dumas
In the field of business process simulation, the availability of resources is captured by assigning a calendar to each resource, e.g., Monday-Friday 9:00-18:00. Resources are assumed to be always available to perform activities during their calendar. This assumption often does not hold due to interruptions, breaks, or because resources time-share across multiple processes. A simulation model that captures availability via crisp time slots (a resource is either on or off during a slot) does not capture these behaviors, leading to inaccuracies in the simulation output. This paper presents a simulation approach wherein resource availability is modeled probabilistically. In this approach, each availability time slot is associated with a probability, allowing us to capture, for example, that a resource is available on Fridays between 14:00-15:00 with 90% probability and between 17:00-18:00 with 50% probability. The paper proposes an algorithm to discover probabilistic availability calendars from event logs. An empirical evaluation shows that simulation models with probabilistic calendars discovered from event logs replicate the temporal distribution of activity instances and cycle times of a process more closely than simulation models with crisp calendars.
This presentation was delivered at the 5th International Conference on Process Mining (ICPM'2023), Rome, Italy, October 2023.
The paper is available at: https://easychair.org/publications/preprint/Rz9g
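The probabilistic calendars described in the abstract can be sketched in a few lines: each slot carries a probability, and the simulator samples availability per slot. This is an illustrative sketch, not the paper's algorithm; the calendar contents below are the hypothetical values from the abstract's own example.

```python
import random

# Hypothetical weekly calendar: (day, hour) -> probability the resource is available.
prob_calendar = {
    ("Friday", 14): 0.9,   # available 14:00-15:00 with 90% probability
    ("Friday", 17): 0.5,   # available 17:00-18:00 with 50% probability
}

def is_available(calendar, day, hour, rng):
    """Sample availability for one slot; slots not in the calendar are off."""
    return rng.random() < calendar.get((day, hour), 0.0)

rng = random.Random(42)
samples = [is_available(prob_calendar, "Friday", 14, rng) for _ in range(10_000)]
rate = sum(samples) / len(samples)
# The long-run availability rate approaches the slot's probability (~0.9).
assert abs(rate - 0.9) < 0.02
```

A crisp calendar is the special case where every probability is 0 or 1; the probabilistic version is what lets the simulator reproduce interruptions and time-sharing.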
Can I Trust My Simulation Model? Measuring the Quality of Business Process Si...Marlon Dumas
Business Process Simulation (BPS) is an approach to analyze the performance of business processes under different scenarios. For example, BPS allows us to estimate what would be the cycle time of a process if one or more resources became unavailable. The starting point of BPS is a process model annotated with simulation parameters (a BPS model). BPS models may be manually designed, based on information collected from stakeholders and empirical observations, or automatically discovered from execution data. Regardless of its origin, a key question when using a BPS model is how to assess its quality. In this paper, we propose a collection of measures to evaluate the quality of a BPS model w.r.t. its ability to replicate the observed behavior of the process. We advocate an approach whereby different measures tackle different process perspectives. We evaluate the ability of the proposed measures to discern the impact of modifications to a BPS model, and their ability to uncover the relative strengths and weaknesses of two approaches for automated discovery of BPS models. The evaluation shows that the measures not only capture how close a BPS model is to the observed behavior, but they also help us to identify sources of discrepancies.
Presentation delivered by David Chapela-Campa at the BPM'2023 conference, Utrecht, September 2023.
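One simple way to realize the kind of measure the paper calls for, comparing a BPS model's output against observed behavior on the time perspective, is a distance between simulated and observed cycle-time distributions. The sketch below uses an empirical 1-Wasserstein distance on equal-size samples; it is an illustrative proxy, not necessarily one of the paper's proposed measures.

```python
def wasserstein_1d(xs, ys):
    """Empirical 1-Wasserstein distance between two equal-size samples:
    the mean absolute difference of sorted values. A crude proxy for how
    closely simulated cycle times replicate observed ones (0 = identical)."""
    assert len(xs) == len(ys)
    return sum(abs(a - b) for a, b in zip(sorted(xs), sorted(ys))) / len(xs)

observed  = [5, 7, 8, 12, 30]    # observed cycle times (e.g., hours)
simulated = [6, 7, 9, 11, 28]    # cycle times produced by the BPS model
assert wasserstein_1d(observed, observed) == 0
assert wasserstein_1d(observed, simulated) == (1 + 0 + 1 + 1 + 2) / 5
```

Because the distance is broken down per sorted pair, it also hints at where the model diverges (here, mostly on the long-running cases), which matches the paper's point that good measures help identify sources of discrepancy.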
Business Process Optimization: Status and PerspectivesMarlon Dumas
For decades, business process optimization has been largely about art and craft (and sometimes wizardry). Apart from narrowly scoped approaches to optimize resource allocation (often assuming that workers behave like robots), a lot of business process optimization relies on high-level guidelines, with A/B testing for idea validation, which is hard to scale to complex processes. As a result, managers end up settling for a "good enough" process. Can we do more? In this talk, we review recent work on the use of high-fidelity simulation models discovered from execution data. The talk also explores the possibilities (and perils) that LLMs bring to the field of business process optimization.
This talk was delivered at the Workshop on Data-Driven Business Process Optimization at the BPM'2023 conference.
Learning When to Treat Business Processes: Prescriptive Process Monitoring wi...Marlon Dumas
Paper presentation at the 35th International Conference on Advanced Information Systems Engineering (CAiSE'2023).
Abstract.
Increasing the success rate of a process, i.e. the percentage of cases that end in a positive outcome, is a recurrent process improvement goal. At runtime, there are often certain actions (a.k.a. treatments) that workers may execute to lift the probability that a case ends in a positive outcome. For example, in a loan origination process, a possible treatment is to issue multiple loan offers to increase the probability that the customer takes a loan. Each treatment has a cost. Thus, when defining policies for prescribing treatments to cases, managers need to consider the net gain of the treatments. Also, the effect of a treatment varies over time: treating a case earlier may be more effective than later in a case. This paper presents a prescriptive monitoring method that automates this decision-making task. The method combines causal inference and reinforcement learning to learn treatment policies that maximize the net gain. The method leverages a conformal prediction technique to speed up the convergence of the reinforcement learning mechanism by separating cases that are likely to end up in a positive or negative outcome, from uncertain cases. An evaluation on two real-life datasets shows that the proposed method outperforms a state-of-the-art baseline.
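The net-gain reasoning in the abstract reduces to a simple criterion: treat a case when the expected uplift in the positive-outcome probability, times the outcome's value, exceeds the treatment cost. The probabilities and amounts below are hypothetical, and the paper itself learns such policies with causal inference and reinforcement learning rather than this fixed rule.

```python
def net_gain(p_pos_if_treated, p_pos_untreated, benefit, cost):
    """Expected net gain of treating a case now: the uplift in the
    positive-outcome probability times the outcome's benefit, minus cost."""
    return (p_pos_if_treated - p_pos_untreated) * benefit - cost

def should_treat(p_treated, p_untreated, benefit, cost):
    return net_gain(p_treated, p_untreated, benefit, cost) > 0

# Hypothetical loan case: a second offer lifts success from 40% to 55%;
# a successful loan is worth 1000, the extra offer costs 100.
assert should_treat(0.55, 0.40, benefit=1000, cost=100)      # gain = 50
# If the uplift is only 5 points, treating loses money.
assert not should_treat(0.45, 0.40, benefit=1000, cost=100)  # gain = -50
```

The timing aspect the abstract mentions follows from the same formula: since the uplift shrinks as a case progresses, the net gain of the same treatment can turn negative later in the case.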
Why am I Waiting Data-Driven Analysis of Waiting Times in Business ProcessesMarlon Dumas
Presentation of a research paper at the 35th International Conference on Advanced Information Systems Engineering (CAiSE) in Zaragoza, Spain. The paper presents a classification of causes of waiting times in business processes and a method to automatically detect and quantify the presence of each of these causes in a business process recorded in an event log.
This talk introduces the concept of Augmented Business Process Management System: an ABPMS is a process-aware information system that relies on trustworthy AI technology to reason and act upon data, within a set of restrictions, with the aim to continuously adapt and improve a set of business processes with respect to one or more key performance indicators.
The talk describes the transition from existing process mining technology to AI-Augmented BPM as a pyramid, where predictive, prescriptive, conversational and reasoning capabilities are stacked up incrementally to reach the level of Augmented BPM.
Talk delivered at the AAAI'2023 Workshop on AI for Business Process Management.
Process Mining and Data-Driven Process SimulationMarlon Dumas
Guest lecture delivered at Institut Teknologi Sepuluh on 8 December 2022.
This lecture gives an overview of process mining and simulation techniques, and how the two can be used together in process improvement projects.
Modeling Extraneous Activity Delays in Business Process SimulationMarlon Dumas
This paper presents a technique to enhance the fidelity of business process simulation models by detecting unexplained (extraneous) delays from business process execution data, and modeling these delays in the simulation model, via timer events.
The presentation was delivered at the 4th International Conference on Process Mining (ICPM'2022).
Paper available at: https://arxiv.org/abs/2206.14051
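The core idea, separating waiting that a resource's workload explains from waiting that it does not, can be sketched as a simple decomposition. This is an illustrative simplification of the paper's technique, with made-up timestamps.

```python
def extraneous_delay(enabled_at, started_at, resource_busy_until):
    """Split an activity's waiting time into a resource-contention part
    (the assigned resource was still busy) and an unexplained, extraneous part.
    Times are plain numbers (e.g., hours since some origin)."""
    waiting = started_at - enabled_at
    contention = max(0, min(started_at, resource_busy_until) - enabled_at)
    return waiting - contention

# Activity enabled at t=10, resource free from t=12, work only starts at t=15:
# 2 hours are explained by contention, 3 hours are extraneous.
assert extraneous_delay(10, 15, resource_busy_until=12) == 3
# If work starts as soon as the resource frees up, nothing is extraneous.
assert extraneous_delay(10, 12, resource_busy_until=12) == 0
```

In the paper's approach, it is these unexplained residues that get materialized as timer events in the simulation model, so the model reproduces delays that resource contention alone cannot explain.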
Business Process Simulation with Differentiated Resources: Does it Make a Dif...Marlon Dumas
Existing methods for discovering business process simulation models from execution data (event logs) assume that all resources in a pool have the same performance and share the same availability calendars. This paper proposes a method for discovering simulation models, wherein each resource is treated as an individual entity, with its own performance and availability calendar. An evaluation shows that simulation models with differentiated resources more closely replicate the distributions of cycle times and the work rhythm in a process than models with undifferentiated resources. The paper is available at: https://link.springer.com/chapter/10.1007/978-3-031-16103-2_24
Prescriptive Process Monitoring Under Uncertainty and Resource ConstraintsMarlon Dumas
This paper presents an approach to trigger interventions at runtime, in order to improve the success rate of a process when the number of resources who can perform these interventions is limited.
The paper is available at: https://link.springer.com/chapter/10.1007/978-3-031-16171-1_13
The presentation delivered at the 20th International Conference on Business Process Management (BPM'2022), in Muenster, Germany, September 2022.
Slides of a lecture delivered at the First Process Mining Summer School in Aachen, Germany, July 2022.
This lecture introduces techniques in the area of "task mining", with an emphasis on Robotic Process Mining. Robotic Process Mining (RPM) is a family of techniques to discover repetitive routines that can be automated using Robotic Process Automation (RPA) technology, by analyzing interactions between one or more workers and one or more software applications during the performance of one or more tasks in a business process. In general, RPM techniques take as input logs of user interactions (UI logs). These UI logs are recorded while workers interact with one or more applications, typically desktop applications. Based on these logs, RPM techniques produce specifications of one or more routines that can be automated using RPA or related tools.
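A first step toward discovering candidate routines from a UI log can be sketched as counting recurring contiguous action sequences. Real RPM techniques are considerably more sophisticated (they handle noise, interleaving, and data flow); the action names below are hypothetical.

```python
from collections import Counter

def routine_candidates(ui_log, length, min_support):
    """Count contiguous action n-grams in a UI log; n-grams that recur at
    least min_support times are candidate repetitive routines for RPA."""
    counts = Counter(tuple(ui_log[i:i + length])
                     for i in range(len(ui_log) - length + 1))
    return {gram for gram, c in counts.items() if c >= min_support}

# Hypothetical UI log: the worker repeatedly copies a cell and pastes it in a form.
log = ["openSheet", "copyCell", "switchApp", "pasteField",
       "copyCell", "switchApp", "pasteField", "saveForm"]
cands = routine_candidates(log, length=3, min_support=2)
assert ("copyCell", "switchApp", "pasteField") in cands
```

A recurring n-gram like copy-switch-paste is exactly the kind of repetitive, rule-like behavior worth handing to an RPA bot.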
Accurate and Reliable What-If Analysis of Business Processes: Is it Achievable?Marlon Dumas
This document discusses using event logs to generate business process simulation models. It describes traditional discrete event simulation approaches that discover simulation models from event logs recorded by information systems. Deep learning techniques are also discussed that can generate traces without an explicit process model. The document suggests that combining discrete event simulation and deep learning may produce more accurate simulations, but challenges remain around validating such hybrid approaches and testing them in previously unseen scenarios. More research is needed before these data-driven simulation methods can reliably predict the effects of interventions.
Learning Accurate Business Process Simulation Models from Event Logs via Auto...Marlon Dumas
Paper presentation at the International Conference on Advanced Information Systems Engineering (CAiSE).
This paper presents an approach to automatically discover business process simulation models from event logs by combining process mining and deep learning techniques.
Paper available at: https://link.springer.com/chapter/10.1007/978-3-031-07472-1_4
Process Mining: A Guide for PractitionersMarlon Dumas
This document presents a guide for practitioners on process mining. It introduces process mining and discusses its main use cases. These use cases are categorized into discovery oriented, future and change oriented, alignment oriented, variant oriented, and performance oriented. The document also provides a framework to classify use cases and discusses the business-oriented questions that can be answered using different process mining use cases, such as improving transparency, quality, agility, efficiency and conformance.
Process Mining for Process Improvement.pptxMarlon Dumas
Presentation of a research paper at the 16th International Conference on Research Challenges in Information Science (RCIS). The paper presents the results of an empirical study on how practitioners use process mining to identify business process improvement opportunities. The paper is available at: https://link.springer.com/chapter/10.1007/978-3-031-05760-1_13
Data-Driven Analysis of Batch Processing Inefficiencies in Business ProcessesMarlon Dumas
Slides of a research paper presentation at the 16th International Conference on Research Challenges in Information Science (RCIS).
The research paper presents an approach to analyze event logs of business processes in order to identify batched activities and to analyze the waiting times caused by these activities.
Paper available at: https://link.springer.com/chapter/10.1007/978-3-031-05760-1_14
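The essence of detecting batched activities can be sketched as grouping cases whose processing starts close together and measuring how long each case waited for its batch. This is a deliberately simplified illustration, not the paper's actual algorithm; timestamps are plain numbers.

```python
def batch_waiting_times(events, gap):
    """Group events of one activity into batches: consecutive start times at
    most `gap` apart belong to one batch. The batch-induced waiting of each
    case is the delay between its enablement and its start within the batch.

    events: list of (enabled_at, started_at), one per case, as plain numbers.
    """
    events = sorted(events, key=lambda e: e[1])
    batches, current = [], [events[0]]
    for ev in events[1:]:
        if ev[1] - current[-1][1] <= gap:
            current.append(ev)
        else:
            batches.append(current)
            current = [ev]
    batches.append(current)
    # Waiting caused by batching: time each case spent enabled before its batch ran.
    return [[started - enabled for enabled, started in b] for b in batches]

# Three cases enabled early but all processed together around t=10: one batch.
waits = batch_waiting_times([(1, 10), (2, 10), (3, 11)], gap=2)
assert waits == [[9, 8, 8]]
```

Large waiting times concentrated inside detected batches are the inefficiency signal the paper's analysis quantifies.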
Optimización de procesos basada en datosMarlon Dumas
Talk at BPM Day Lima 2021.
In this talk, we will discuss emerging methods and applications in the field of data-driven process optimization. We will cover advances in the area of process mining, methods for building digital twins of processes, and predictive monitoring methods. Through examples and case studies, we will show how these methods can guide digital transformation and continuous process improvement initiatives. In particular, we will illustrate the use of these methods to: (1) analyze the performance of business processes in order to identify frictions and automation opportunities; (2) predict the impact of changes, and in particular the impact of an automation initiative; (3) make predictions about process performance and adjust the execution of the process so as to prevent SLA violations, customer complaints, and other undesirable events.
Process Mining and AI for Continuous Process ImprovementMarlon Dumas
Talk delivered at BPM Day Rio Grande do Sul on 11 November 2021.
Abstract.
Process mining is a technology that marries methods from business process management and from data science, to support operational excellence and digital transformation. Process mining tools can transform data extracted from enterprise systems, into visualizations and reports that allow managers to improve organizational performance along different dimensions, such as efficiency, quality, and compliance. In this talk, we will give an overview of the capabilities of process mining tools, and we will illustrate the benefits of process mining via several case studies in the fields of insurance, manufacturing, and IT service management.
Authoring a personal GPT for your research and practice: How we created the Q...Leonel Morgado
Thematic analysis in qualitative research is a time-consuming and systematic task, typically done using teams. Team members must ground their activities on common understandings of the major concepts underlying the thematic analysis, and define criteria for its development. However, conceptual misunderstandings, equivocations, and lack of adherence to criteria are challenges to the quality and speed of this process. Given the distributed and uncertain nature of this process, we wondered if the tasks in thematic analysis could be supported by readily available artificial intelligence chatbots. Our early efforts point to potential benefits: not just saving time in the coding process, but better adherence to criteria and grounding, by increasing triangulation between humans and artificial intelligence. This tutorial will provide a description and demonstration of the process we followed, as two academic researchers, to develop a custom ChatGPT to assist with qualitative coding in the thematic data analysis process of immersive learning accounts in a survey of the academic literature: the QUAL-E Immersive Learning Thematic Analysis Helper. In the hands-on time, participants will try out QUAL-E and develop ideas for their own qualitative coding ChatGPT. Participants who have the paid ChatGPT Plus subscription can create a draft of their assistants. The organizers will provide course materials and a slide deck that participants will be able to use to continue development of their custom GPT. The paid subscription to ChatGPT Plus is not required to participate in this workshop, only for trying out personal GPTs during it.
The Thematic Appreciation Test is a psychological assessment tool used to measure an individual's appreciation and understanding of specific themes or topics. This test helps to evaluate an individual's ability to connect different ideas and concepts within a given theme, as well as their overall comprehension and interpretation skills. The results of the test can provide valuable insights into an individual's cognitive abilities, creativity, and critical thinking skills.
Or: Beyond linear.
Abstract: Equivariant neural networks are neural networks that incorporate symmetries. The nonlinear activation functions in these networks result in interesting nonlinear equivariant maps between simple representations, and motivate the key player of this talk: piecewise linear representation theory.
Disclaimer: No one is perfect, so please mind that there might be mistakes and typos.
dtubbenhauer@gmail.com
Corrected slides: dtubbenhauer.com/talks.html
Travis Hills' Endeavors in Minnesota: Fostering Environmental and Economic Pr...Travis Hills MN
Travis Hills of Minnesota developed a method to convert waste into high-value dry fertilizer, significantly enriching soil quality. By providing farmers with a valuable resource derived from waste, Travis Hills helps enhance farm profitability while promoting environmental stewardship. Travis Hills' sustainable practices lead to cost savings and increased revenue for farmers by improving resource efficiency and reducing waste.
This MS Word-generated PowerPoint presentation covers the major details of the micronuclei test: its significance and the assays used to conduct it. The test is used to detect micronuclei formation inside the cells of nearly every multicellular organism. Micronuclei formation takes place during chromosomal separation at metaphase.
Unlocking the mysteries of reproduction: Exploring fecundity and gonadosomati...AbdullaAlAsif1
The pygmy halfbeak Dermogenys colletei is known for its viviparous nature and presents an intriguing case of relatively low fecundity, raising questions about potential compensatory reproductive strategies employed by this species. Our study delves into the examination of fecundity and the Gonadosomatic Index (GSI) in the pygmy halfbeak, D. colletei (Meisner, 2001), an intriguing viviparous fish indigenous to Sarawak, Borneo. We hypothesize that the pygmy halfbeak, D. colletei, may exhibit unique reproductive adaptations to offset its low fecundity, thus enhancing its survival and fitness. To address this, we conducted a comprehensive study utilizing 28 mature female specimens of D. colletei, carefully measuring fecundity and GSI to shed light on the reproductive adaptations of this species. Our findings reveal that D. colletei indeed exhibits low fecundity, with a mean of 16.76 ± 2.01, and a mean GSI of 12.83 ± 1.27, providing crucial insights into the reproductive mechanisms at play in this species. These results underscore the existence of unique reproductive strategies in D. colletei, enabling its adaptation and persistence in Borneo's diverse aquatic ecosystems, and call for further ecological research to elucidate these mechanisms. This study contributes to a better understanding of viviparous fish in Borneo and to the broader field of aquatic ecology, enhancing our knowledge of species adaptations to unique ecological challenges.
Remote Sensing and Computational, Evolutionary, Supercomputing, and Intellige...University of Maribor
Slides from talk:
Aleš Zamuda: Remote Sensing and Computational, Evolutionary, Supercomputing, and Intelligent Systems.
11th International Conference on Electrical, Electronics and Computer Engineering (IcETRAN), Niš, 3-6 June 2024
Inter-Society Networking Panel GRSS/MTT-S/CIS Panel Session: Promoting Connection and Cooperation
https://www.etran.rs/2024/en/home-english/
Describing and Interpreting an Immersive Learning Case with the Immersion Cub...Leonel Morgado
Current descriptions of immersive learning cases are often difficult or impossible to compare. This is due to a myriad of different options on what details to include, which aspects are relevant, and on the descriptive approaches employed. Also, these aspects often combine very specific details with more general guidelines or indicate intents and rationales without clarifying their implementation. In this paper we provide a method to describe immersive learning cases that is structured to enable comparisons, yet flexible enough to allow researchers and practitioners to decide which aspects to include. This method leverages a taxonomy that classifies educational aspects at three levels (uses, practices, and strategies) and then utilizes two frameworks, the Immersive Learning Brain and the Immersion Cube, to enable a structured description and interpretation of immersive learning cases. The method is then demonstrated on a published immersive learning case on training for wind turbine maintenance using virtual reality. Applying the method results in a structured artifact, the Immersive Learning Case Sheet, that tags the case with its proximal uses, practices, and strategies, and refines the free text case description to ensure that matching details are included. This contribution is thus a case description method in support of future comparative research of immersive learning cases. We then discuss how the resulting description and interpretation can be leveraged to change immersion learning cases, by enriching them (considering low-effort changes or additions) or innovating (exploring more challenging avenues of transformation). The method holds significant promise to support better-grounded research in immersive learning.
When I was asked to give a companion lecture in support of ‘The Philosophy of Science’ (https://shorturl.at/4pUXz) I decided not to walk through the detail of the many methodologies in order of use. Instead, I chose to employ a long standing, and ongoing, scientific development as an exemplar. And so, I chose the ever evolving story of Thermodynamics as a scientific investigation at its best.
Conducted over a period of >200 years, Thermodynamics R&D, and application, benefitted from the highest levels of professionalism, collaboration, and technical thoroughness. New layers of application, methodology, and practice were made possible by the progressive advance of technology. In turn, this has seen measurement and modelling accuracy continually improved at a micro and macro level.
Perhaps most importantly, Thermodynamics rapidly became a primary tool in the advance of applied science/engineering/technology, spanning micro-tech, to aerospace and cosmology. I can think of no better a story to illustrate the breadth of scientific methodologies and applications at their best.
ESPP presentation to EU Waste Water Network, 4th June 2024 “EU policies driving nutrient removal and recycling
and the revised UWWTD (Urban Waste Water Treatment Directive)”
The technology uses reclaimed CO₂ as the dyeing medium in a closed loop process. When pressurized, CO₂ becomes supercritical (SC-CO₂). In this state CO₂ has a very high solvent power, allowing the dye to dissolve easily.
The use of Nauplii and metanauplii artemia in aquaculture (brine shrimp).pptxMAGOTI ERNEST
Although Artemia has been known to man for centuries, its use as a food for the culture of larval organisms apparently began only in the 1930s, when several investigators found that it made an excellent food for newly hatched fish larvae (Litvinenko et al., 2023). As aquaculture developed in the 1960s and ‘70s, the use of Artemia also became more widespread, due both to its convenience and to its nutritional value for larval organisms (Arenas-Pardo et al., 2024). The fact that Artemia dormant cysts can be stored for long periods in cans, and then used as an off-the-shelf food requiring only 24 h of incubation makes them the most convenient, least labor-intensive, live food available for aquaculture (Sorgeloos & Roubach, 2021). The nutritional value of Artemia, especially for marine organisms, is not constant, but varies both geographically and temporally. During the last decade, however, both the causes of Artemia nutritional variability and methods to improve poorquality Artemia have been identified (Loufi et al., 2024).
Brine shrimp (Artemia spp.) are used in marine aquaculture worldwide. Annually, more than 2,000 metric tons of dry cysts are used for cultivation of fish, crustacean, and shellfish larva. Brine shrimp are important to aquaculture because newly hatched brine shrimp nauplii (larvae) provide a food source for many fish fry (Mozanzadeh et al., 2021). Culture and harvesting of brine shrimp eggs represents another aspect of the aquaculture industry. Nauplii and metanauplii of Artemia, commonly known as brine shrimp, play a crucial role in aquaculture due to their nutritional value and suitability as live feed for many aquatic species, particularly in larval stages (Sorgeloos & Roubach, 2021).
The use of Nauplii and metanauplii artemia in aquaculture (brine shrimp).pptx
Collaborative Business Process Execution on Blockchain: The Caterpillar Approach
1. Collaborative Business Process Execution on Blockchain: The Caterpillar Approach
Orlenys López-Pintado, Marlon Dumas, Luciano García-Bañuelos, Ingo Weber
BIOC Workshop at CAiSE'2019, Rome, Italy, 3 June 2019
2. Background
BLOCKCHAIN
- P2P network: no central authority, untrusted parties
- Append-only chain: each node stores a copy
- Consensus: validate transactions, create/validate blocks, Proof of Work or Proof of Stake
- Smart Contracts
4. Drawbacks
• Bilateral message exchange is not geared to building consensus, hence:
  • Correct execution hinges on parties checking on each other
  • Traceability is a nightmare
  • Exceptions cause havoc
  • Disputes require manual resolution
  • Process change and new partner on-boarding are hard
  • Money flows happen outside the collaborative process
• EDI networks can help
  • But they rely on a central trust provider to which everyone must be connected
  • Quite inflexible
  • …and money flows still happen elsewhere
5. Collaborative Processes on Blockchain
• Blockchain enables us to approach collaborative process execution differently
  • No messages, only blockchain transactions
• Collaborative process models can be translated to smart contracts
  • Maintain execution state
  • Ensure that the process is executed correctly
• Process execution data on blockchain
  • Single source of truth across the participants
  • Full traceability
6. Blockchain for collaborative process execution
Two approaches:
• Blockchain as a recording medium
  • Parties still exchange messages, and each maintains the state of its part of the process
  • Each event (e.g. task completion) in the collaborative process is written on the blockchain
  • Supported by some BPMSs (e.g. Bizagi)
  • No need for smart contracts
  • Compliance by monitoring
• Blockchain as an execution medium
  • No more messages: all events are recorded as blockchain transactions
  • Smart contracts check compliance before changing the state of the process
  • Compliance by design
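The "compliance by design" idea above can be illustrated with a minimal sketch: the on-chain contract only accepts an event if the process model enables it in the current state, so an invalid step is rejected before it changes any state. This is a hypothetical Python model for illustration, not Caterpillar's actual Solidity code; the state names and task names are invented.

```python
# Minimal sketch of "compliance by design": an event is only recorded if
# the process model enables it in the current state. Illustrative only.

class ProcessContract:
    # Hypothetical two-task flow: Submit PO -> Verify PO -> done
    TRANSITIONS = {
        ("start", "Submit PO"): "submitted",
        ("submitted", "Verify PO"): "done",
    }

    def __init__(self):
        self.state = "start"
        self.log = []  # on chain, this would be the transaction history

    def execute(self, task):
        """Reject any task the model does not enable in the current state."""
        key = (self.state, task)
        if key not in self.TRANSITIONS:
            raise ValueError(f"task '{task}' not enabled in state '{self.state}'")
        self.state = self.TRANSITIONS[key]
        self.log.append(task)

p = ProcessContract()
p.execute("Submit PO")
p.execute("Verify PO")
assert p.state == "done"
```

Under the recording-medium approach, by contrast, the same check would run off-chain after the fact (compliance by monitoring), so a non-conforming event could still be written to the chain.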
7. Collaborative Business Process Execution on Blockchain
• 2018: Lorikeet (BPMN choreographies)
• 2018: Caterpillar (executable BPMN)
• 2018: M. Madsen et al. (declarative workflows)
• 2019: G. Falazi et al. (BPMN extension)
• 2018: H. Nakamura et al. (BPMN & statecharts)
[Diagram groups these approaches into two classes: compiled or interpreted, covering control-flow, data, and resources; versus compiled, control-flow only.]
8. Caterpillar
• A Business Process Management System (BPMS) without a DB and without a conventional execution engine
  • State stored on the blockchain
  • Execution by smart contracts
• Based on the Business Process Model and Notation (BPMN 2.0)
  • Supports >80% of BPMN constructs
• Runs on top of Ethereum
• Open-source
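One way a smart contract can keep process-instance state compactly on chain is to encode the token marking of the BPMN model as a bit set in a single integer, with one bit per sequence flow. The sketch below illustrates that idea in Python; the flow names and indices are invented for the example, and this is not Caterpillar's exact encoding.

```python
# Sketch: keep a BPMN token marking as a bit set in one integer, so a
# contract can check and update control-flow state cheaply. Illustrative
# assumption-based example, not Caterpillar's actual code.

# Hypothetical bit indices for the sequence flows of a tiny model
FLOWS = {"start->A": 0, "A->B": 1, "B->end": 2}

def fire(marking: int, consume, produce) -> int:
    """Consume tokens on input flows, produce tokens on output flows."""
    for f in consume:
        bit = 1 << FLOWS[f]
        if not marking & bit:
            raise ValueError(f"flow {f} carries no token")
        marking &= ~bit  # remove the consumed token
    for f in produce:
        marking |= 1 << FLOWS[f]  # add the produced token
    return marking

m = 1 << FLOWS["start->A"]           # token on the initial flow
m = fire(m, ["start->A"], ["A->B"])  # task A fires
m = fire(m, ["A->B"], ["B->end"])    # task B fires
assert m == 1 << FLOWS["B->end"]
```

Storing the marking as one word keeps on-chain storage (and hence gas cost) low compared with one storage slot per flow.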
9. Caterpillar: Design Principles
1. Transparent design of collaborative business processes
  • A collaborative process model looks exactly like a regular intra-organizational process
  • One lane = one role in the collaboration
  • Sequence flow across lanes = handover between two parties
  • Kiss goodbye to bilateral messages…
10. Collaborative Process on Blockchain
[BPMN diagram of an Order-to-Cash process with lanes Customer, Supplier, and Carrier. Tasks include: Submit PO to Supplier, Verify PO, Request carrier quote, Submit quote, Appoint carrier, Ship goods, Issue invoice for customer, Issue invoice for supplier, Approve invoice from supplier, Approve invoice from carrier, Resend invoice to customer, Resend invoice to supplier. Events include: PO created, PO accepted, PO rejected, PO cancelled, Invoice accepted, Goods delivering, Carrier selection.]
11. Caterpillar: Design Principles
1. Transparent design of collaborative business processes
  • A collaborative process model looks exactly like a regular intra-organizational process
  • One lane = one role in the collaboration
  • Sequence flow across lanes = handover between two parties
  • Kiss goodbye to bilateral messages
2. Everything needed to execute the process is on the blockchain
  • All process instance state is on the blockchain
  • All execution logic is encoded in smart contracts
  • Design-time components and tools are only needed to deploy the process
  • The off-chain runtime component is purely for convenience; BYO runtime is OK
3. Actors may be bound to roles at runtime
  • According to policies that are tied to the deployed process model and enforced by smart contracts
15. Caterpillar's Role Binding Model
[Class diagram: each Task is typed by exactly one Role (Task *–1 Role); each Role is bound to exactly one Actor per (sub-)process instance (Role 1–1 Actor); role-actor assignments are scoped per instance. An Actor may be a user, a group, a system, or an IoT device, identified by a blockchain account/identity. Example roles: Customer, Supplier.]
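The role binding model above can be sketched as follows: each task is typed by a role from the model, and each (instance, role) pair is bound to exactly one account, so authorization checks reduce to a lookup. This is an illustrative Python sketch under those assumptions; the task names, roles, and account addresses are invented.

```python
# Sketch of dynamic role binding scoped per process instance: each task
# is typed by a role, and each (instance, role) pair is bound to exactly
# one blockchain account. Illustrative only, not Caterpillar's code.

class RoleBindings:
    def __init__(self, task_roles):
        self.task_roles = task_roles  # task -> role (from the process model)
        self.bindings = {}            # (instance, role) -> account

    def bind(self, instance, role, account):
        """Bind an account to a role; at most one actor per role per instance."""
        key = (instance, role)
        if key in self.bindings:
            raise ValueError(f"role {role} already bound in instance {instance}")
        self.bindings[key] = account

    def can_execute(self, instance, task, account):
        """An account may execute a task only if it holds the task's role."""
        role = self.task_roles[task]
        return self.bindings.get((instance, role)) == account

rb = RoleBindings({"Submit PO": "Customer", "Verify PO": "Supplier"})
rb.bind("case-1", "Customer", "0xAAA")
rb.bind("case-1", "Supplier", "0xBBB")
assert rb.can_execute("case-1", "Submit PO", "0xAAA")
assert not rb.can_execute("case-1", "Verify PO", "0xAAA")
```

Because bindings are scoped per instance, the same account can play different roles in different cases without any change to the deployed model.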
18. Policy Consistency
Example binding policy:
• A is case-creator;
• A nominates B;
• A nominates C;
• C nominates D, endorsed-by A and B;
[Diagram: the policy is unfolded into per-role state machines with nomination (N) and endorsement (E) transitions for A, B, C, and D; D's binding requires endorsement by A & B. The consistency analysis verifies that the policy has NO DEADLOCKS.]
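The nomination-with-endorsement policy in the example can be sketched as follows: a nominee only becomes a bound member once every required endorser has approved. This is an illustrative Python model of the idea, with invented method names; it is not the smart-contract implementation.

```python
# Sketch of a nomination policy with endorsement: C may nominate D, but D
# is only bound once both A and B endorse. Illustrative only.

class NominationPolicy:
    def __init__(self, case_creator):
        self.members = {case_creator}  # actors already bound
        self.pending = {}              # nominee -> endorsers still required

    def nominate(self, nominator, nominee, endorsed_by=()):
        """Only a bound member may nominate; endorsements may be required."""
        if nominator not in self.members:
            raise ValueError("nominator is not a bound member")
        remaining = set(endorsed_by)
        if remaining:
            self.pending[nominee] = remaining
        else:
            self.members.add(nominee)

    def endorse(self, endorser, nominee):
        """Record an endorsement; bind the nominee once none are missing."""
        remaining = self.pending.get(nominee, set())
        remaining.discard(endorser)
        if nominee in self.pending and not remaining:
            del self.pending[nominee]
            self.members.add(nominee)

policy = NominationPolicy("A")
policy.nominate("A", "B")
policy.nominate("A", "C")
policy.nominate("C", "D", endorsed_by={"A", "B"})
assert "D" not in policy.members   # still awaiting endorsements
policy.endorse("A", "D")
policy.endorse("B", "D")
assert "D" in policy.members       # bound once A and B both endorse
```

The consistency check on the slide asks a stronger, static question: whether any reachable combination of nominations and endorsements can leave some role permanently unbindable (a deadlock).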
20. Compiled versus Interpreted?
COMPILED APPROACHES: lack of flexibility
• Code generation is model-dependent and redundant
• Full conformance with the model
• Immutable = secure = tamper-proof
• Prevents changes to the process model during its execution
How can we execute inter-organizational processes involving untrusted actors in a flexible and scalable manner on blockchain?
21. Interpreted Caterpillar: Overview
Related work: 2018, C. Sturm et al. (single-contract execution)
INTERPRETED EXECUTION
• BPMN interpreter: a single smart contract encoding the BPMN standard
• Dynamic data structures to store process-specific data
• Process perspectives decoupled in a modular architecture
• Flexibility for process participants to react to unexpected situations during execution:
  (1) keeping different variants of the same model,
  (2) deviating from the flow temporarily during execution,
  (3) permanently modifying a process model with impact on all future instances
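The key shift in interpreted execution is that the process model becomes data: one generic interpreter executes any model, so deploying a variant or modifying a model means updating data, not redeploying code. The sketch below illustrates this with an invented transition-table encoding; real BPMN interpretation handles gateways, events, and data, which are omitted here.

```python
# Sketch of interpreted execution: the model is stored as data and one
# generic interpreter runs any model. Changing the model changes behaviour
# without redeploying the interpreter. Illustrative only.

def interpret(model, trace):
    """Replay a trace of tasks against a model given as a transition table."""
    state = model["initial"]
    for task in trace:
        nxt = model["transitions"].get((state, task))
        if nxt is None:
            raise ValueError(f"'{task}' not enabled in state '{state}'")
        state = nxt
    return state

order_v1 = {"initial": "s0",
            "transitions": {("s0", "Submit PO"): "s1",
                            ("s1", "Verify PO"): "s2"}}
assert interpret(order_v1, ["Submit PO", "Verify PO"]) == "s2"

# A process variant is just new model data -- the interpreter is unchanged:
order_v2 = {"initial": "s0",
            "transitions": {**order_v1["transitions"],
                            ("s1", "Cancel PO"): "s3"}}
assert interpret(order_v2, ["Submit PO", "Cancel PO"]) == "s3"
```

This is what enables the three flexibility options on the slide: variants are alternative model data, temporary deviations relax the enablement check for one instance, and permanent changes replace the stored model for all future instances.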
29. Remaining Challenges
• Cost and throughput are major challenges
  • 0.02-0.04 Ether per process instance for real (but small) processes (EUR 5+)
• Encryption of case data and seamless access to encrypted case data
  • Given an RBAC model with dynamic role binding
• The current version of Caterpillar is tied to Ethereum
  • Open question: is it possible to support seamless interoperation with multiple blockchain platforms?
Caterpillar: A BPMS on the Blockchain. WU Vienna Research Seminar, 18 Dec 2017
30. Want to know more?
• O. López-Pintado, L. García-Bañuelos, M. Dumas, I. Weber, A. Ponomarev: "Caterpillar: A Business Process Execution Engine on the Ethereum Blockchain." Software: Practice and Experience, 2019. https://arxiv.org/abs/1808.03517
• O. López-Pintado, M. Dumas, L. García-Bañuelos, I. Weber: "Dynamic Role Binding in Blockchain-Based Collaborative Business Processes." In Proceedings of CAiSE 2019.
• O. López-Pintado, L. García-Bañuelos, M. Dumas, I. Weber: "Caterpillar: A Blockchain-Based Business Process Management System." In Proceedings of the BPM 2017 Demos (tool paper).