Presentation of a research paper at the 35th International Conference on Advanced Information Systems Engineering (CAiSE) in Zaragoza Spain. The paper presents a classification of causes of waiting times in business processes and a method to automatically detect and quantify the presence of each of these causes in a business process recorded in an event log.
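As a flavour of the kind of analysis the paper automates, the sketch below computes total waiting time between consecutive activities in a toy event log. This is only a minimal baseline for quantifying waiting, not the paper's cause-classification method; all field names, activities, and timestamps are hypothetical.

```python
from collections import defaultdict
from datetime import datetime

# Toy event log: (case_id, activity, start, end) -- hypothetical schema.
log = [
    ("c1", "Register", "2023-01-02 09:00", "2023-01-02 09:10"),
    ("c1", "Approve",  "2023-01-02 11:00", "2023-01-02 11:05"),
    ("c2", "Register", "2023-01-02 09:30", "2023-01-02 09:40"),
    ("c2", "Approve",  "2023-01-02 09:50", "2023-01-02 10:00"),
]

def parse(ts):
    return datetime.strptime(ts, "%Y-%m-%d %H:%M")

def waiting_times(events):
    """Total waiting time (seconds) per activity transition, summed over cases."""
    by_case = defaultdict(list)
    for case, act, start, end in events:
        by_case[case].append((parse(start), parse(end), act))
    totals = defaultdict(float)
    for steps in by_case.values():
        steps.sort()  # order each case's activities by start time
        for (s1, e1, a1), (s2, e2, a2) in zip(steps, steps[1:]):
            # Waiting time = gap between one activity ending and the next starting.
            totals[(a1, a2)] += max((s2 - e1).total_seconds(), 0.0)
    return dict(totals)

print(waiting_times(log))  # {('Register', 'Approve'): 7200.0}
```

Attributing such gaps to specific causes (e.g. batching or resource contention), as the paper does, requires substantially more context than this transition-level aggregate.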
Keynote talk by Marlon Dumas at the Bolzano Rules and Artificial INtelligence Summit (BRAIN 2019), RuleML+RR and GCAI Conferences, Bolzano, Italy, 17 September 2019. The talk gives an overview of state-of-the-art methods in the field of process mining and predictive process monitoring and spells out research challenges in the fields of prescriptive process monitoring and automated process improvement.
Almost all companies work with a constantly growing number of SKUs and raw materials, which leads to working with smaller and smaller batches, shorter and shorter lead times, and higher and higher quality standards, all of which bring high complexity to effective production flow management.
Managing and visualizing the information flow has become a necessary condition for effective production, business management, and any kind of improvement initiative, whether LEAN, TOC, Six Sigma, or TLS.
Data Warehouse Development Standardization Framework (DWDSF): A Way to Handle... (IOSRjournaljce)
Why do a large number of data warehousing projects fail? How can such failures be avoided? How can data warehousing solutions meet users' expectations and fulfil the data analysis needs of business managers? How can data warehousing projects be made successful? These are some of the key questions facing the data warehouse research community at present. The literature shows that a large number of data warehousing projects eventually end in failure. In this paper, we design a framework named the "Data Warehouse Development Standardization Framework" (DWDSF) to help the data warehouse developer community implement effective data warehousing solutions. We critically analyse the literature to find possible reasons for data warehouse project failure. Our framework is designed to overcome such issues and enable the implementation of successful data warehousing solutions. To verify the usefulness of our framework, we apply the DWDSF guidelines to design and implement a data warehousing solution for the National Rural Health Mission (NRHM) project, which offers various health services throughout the country. The developed solution produces results for all types of queries business managers want to run. We show the results of some sample queries executed over the implemented data warehouse repository. All results meet business managers' query expectations.
Architecting a Platform for Enterprise Use - Strata London 2018 (Mark Madsen)
The goal in most organizations is to build multi-use data infrastructure that is not subject to past constraints. This session will discuss hidden design assumptions, review design principles to apply when building multi-use data infrastructure, and provide a reference architecture to use as you work to unify your analytics infrastructure.
The focus in our market has been on acquiring technology, but that ignores the more important part: the larger IT landscape within which this technology lives and the data architecture that lies at its core. If one expects longevity from a platform, then it should be a designed rather than an accidental architecture.
Architecture is more than just software. It starts from use and includes the data, technology, methods of building and maintaining, and organization of people. What are the design principles that lead to good design and a functional data architecture? What are the assumptions that limit older approaches? How can one integrate with, migrate from or modernize an existing data environment? How will this affect an organization's data management practices? This tutorial will help you answer these questions.
Topics covered:
* A brief history of data infrastructure and past design assumptions
* Categories of data and data use in organizations
* Analytic workload characteristics and constraints
* Data architecture
* Functional architecture
* Tradeoffs between different classes of technology
* Technology planning assumptions and guidance
#strataconf
ACC 564 – Accounting Information Systems
(Prerequisite: ACC 562)
COURSE DESCRIPTION
Introduces the student to systems analysis and application of information systems concepts to the accounting process and accounting models, both manual and automated.
INSTRUCTIONAL MATERIALS
Required Resources
Romney, M. B., & Steinbart, P. J. (2012). Accounting information systems. (12th ed.). Upper Saddle River, NJ: Pearson.
Supplemental Resources
Dehghanzade, H., Moradi, M. A., & Raghibi, M. (2011). A Survey of Human Factors' Impacts on the Effectiveness of Accounting Information Systems. International Journal of Business Administration, 2(4), 166-174. doi: 10.5430/ijba.v2n4p166
Grabski, S. V., Leech, S. A., & Schmidt, P. J. (2011). A Review of ERP Research: A Future Agenda for Accounting Information Systems. Journal of Information Systems, 25(1), 37-78. doi: 10.2308/jis.2011.25.1.37
Guan, J., Levitan, A. S., & Kuhn, J. R. (2013). How AIS Can Progress Along with Ontology Research in IS. International Journal of Accounting Information Systems, 14(1), 21-38. doi: 10.1016/j.accinf.2012.08.002
Moorthy, M., Krishna, O. O. V., Samsuri, C. A. S. B., Gopalan, M., & King-Tak, Y. (2012). Application of Information Technology in Management Accounting Decision Making. International Journal of Academic Research in Business & Social Sciences, 2(3), 1-16.
Shamszadeh, B., & Sharif, A. A. (2012). Computerized Accounting Information Systems (CAIS) Versus Security Threats. Journal of Academic Research in Economics, 4(1), 69-79.
Soudani, S. N. (2012). The Usefulness of an Accounting Information System for Effective Organizational Performance. International Journal of Economics & Finance, 4(5), 136-145. doi: 10.5539/ijef.v4n5p136
Wilkin, C. L., & Chenhall, R. H. (2010). A Review of IT Governance: A Taxonomy to Inform Accounting Information Systems. Journal of Information Systems, 24(2), 107-146. doi: 10.2308/jis.2010.24.2.107
COURSE LEARNING OUTCOMES
1. Examine accounting information systems, activities, transactions, and their impact on organizational performance, strategy, and culture.
2. Analyze the business activities that comprise an accounting information system to determine the information needs that support the decision-making function.
3. Examine and use data flow diagrams and flowcharts to understand, evaluate, and design information systems.
4. Evaluate the approaches and techniques that are used to commit and prevent computer fraud.
5. Examine control and security concepts related to accounting information systems to ensure data integrity and safety.
6. Analyze the accounting information systems audit process.
7. Apply fundamental concepts related to database systems and management.
8. Examine the phases of the systems development life cycle and key issues related to systems analysis.
9. Analyze the systems design, implementation, and operational processes.
10. Use technology and information resources to research issues in accounting information systems.
Continuous auditing and monitoring ("continuous reviews") have been discussed for decades but, according to recent surveys, implemented only in moderation. It comes down to how deeply data analytics are integrated into audit processes in the first place, so that they can then become continuous. If a high degree of integration exists, then a good amount of continuous review is probably already happening in the organization.
However, most companies fall into the other camp and have not integrated analytics well enough or considered how to take full advantage of continuous reviews.
This course will explain the cultural changes audit departments must make to embrace continuous reviews and how those reviews can be implemented with ACL Desktop software techniques. Sample files and scripts will be provided to get you started down the road to continuous reviews.
As regulatory changes sweep the globe, auditors, risk management, and compliance professionals are using more sophisticated tools and methods.
Using a live/video training library approach, we help companies of all sizes use audit and assurance software to improve business intelligence, increase efficiencies, identify fraud, test controls, and achieve bottom-line savings.
AuditNet and Cash Recovery Partners webinar recording available at auditsoftwarevideos.com and AuditNet.tv (registration required); the recording is free to view.
Sample Data Files for All Courses are available for $49
To purchase access to all sample data files, Excel macros and ACL scripts associated with the free training visit AuditSoftwareVideos.
Improving Healthcare Operations Using Process Data Mining
It’s estimated that 80% of healthcare data is unstructured, which makes it challenging to do any sort of analytics to drive improvements in population health, patient care and operational efficiency. Machine learning techniques can be utilized to predict future events from similar past events, anticipate resource capacity issues and proactively identify bottlenecks and patient outcome risks. This session will provide an overview of how process data mining can be applied to healthcare and provide real-world examples of process data mining in action.
IT PROJECT SHOWSTOPPER FRAMEWORK: THE VIEW OF PRACTITIONERS (ijseajournal)
The study set out to unravel critical IT project showstoppers, which tend to halt IT projects temporarily or permanently and ultimately cause them to fail, by positioning them in the systems development life cycle (SDLC) framework. Through interviews with 8 IT project and program managers from the banking and telecommunications industries in Ghana, conducted individually and in a group, 19 critical showstoppers were identified spanning the whole SDLC. Generally, it was observed that for the successful completion of IT projects, the expertise and availability of project managers and team members are critical. Further, the project manager must be able to show that the project is in line with the objectives and strategic direction of the business, is being mounted to gain competitive advantage, and has a solid business case. Thirdly, funding is key at all stages of the cycle, as is approval for continuation at various stages.
Role of Operational System Design in Data Warehouse Implementation: Identifyi... (iosrjce)
The data warehouse design process takes input from the operational system of the organization, and the quality of a data warehousing solution depends on the design of that operational system. Often, organizations' operational system implementations have some limitations, so we cannot proceed with data warehouse design so easily. In this paper, we investigate the operational system of the organization to identify such limitations and to determine the role of operational system design in the process of data warehouse design and implementation. We have worked to find possible methods for handling such limitations and have proposed techniques to obtain a quality data warehousing solution under them. To base the work on a live example, the National Rural Health Mission (NRHM) project has been taken. It is a national health-sector project, managed by the Indian Government across the country. Its complex structure and high volume of data make it an ideal case for data warehouse implementation.
Process mining is a set of data analysis techniques and tools for extracting information from so-called event logs, which are commonly available in modern IT systems. Event logs register activities performed by an organization's employees. Such logs are typically created, inter alia, in document workflow systems, customer relationship management systems, and task management systems. Because information systems support the operation of many areas of an organization, event logs record its real manner of operation: real, in other words, not presumed. Process mining joins ideas of process modelling and analysis on the one hand and data mining and machine learning on the other.
A well-performing organisation consists of individuals collaborating in some social context to achieve common and individual goals. To achieve those goals effectively, organisations must address the challenges of a dynamic, turbulent, and competitive environment. This leads to constant, ongoing change in working methods, methods of goods and service delivery, and so on. Knowledge of the real shape of business processes is the first step towards performing such change effectively. Two characteristics of process mining facilitate efficient change in an organization and distinguish it from other data analytics techniques: (1) a focus on people and their decisions, interactions, collaboration patterns, and organizational dependencies; and (2) a focus on the activities performed by those people and the causal and time dependencies among those activities.
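To make the event-log idea concrete, the sketch below performs the most basic process mining step: counting the directly-follows relation between activities across cases. It is a toy illustration, not a full discovery algorithm, and the case IDs and activity names are invented.

```python
from collections import Counter

# Toy event log: each case is an ordered list of activities (illustrative only).
traces = {
    "c1": ["Receive", "Review", "Archive"],
    "c2": ["Receive", "Review", "Escalate", "Archive"],
}

def directly_follows(traces):
    """Count how often activity b directly follows activity a across all cases."""
    dfg = Counter()
    for trace in traces.values():
        for a, b in zip(trace, trace[1:]):
            dfg[(a, b)] += 1
    return dfg

dfg = directly_follows(traces)
print(dfg[("Receive", "Review")])  # 2: the pair occurs in both cases
```

Discovery algorithms build process models on top of exactly this kind of relation, which is why the "real, not presumed" behaviour in the log carries through to the discovered model.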
Discovery and Simulation of Business Processes with Probabilistic Resource Av... (Marlon Dumas)
In the field of business process simulation, the availability of resources is captured by assigning a calendar to each resource, e.g., Monday-Friday 9:00-18:00. Resources are assumed to be always available to perform activities during their calendar. This assumption often does not hold due to interruptions, breaks, or because resources time-share across multiple processes. A simulation model that captures availability via crisp time slots (a resource is either on or off during a slot) does not capture these behaviors, leading to inaccuracies in the simulation output. This paper presents a simulation approach wherein resource availability is modeled probabilistically. In this approach, each availability time slot is associated with a probability, allowing us to capture, for example, that a resource is available on Fridays between 14:00-15:00 with 90% probability and between 17:00-18:00 with 50% probability. The paper proposes an algorithm to discover probabilistic availability calendars from event logs. An empirical evaluation shows that simulation models with probabilistic calendars discovered from event logs, replicate the temporal distribution of activity instances and cycle times of a process more closely than simulation models with crisp calendars.
This presentation was delivered at the 5th International Conference on Process Mining (ICPM'2023), Rome, Italy, October 2023.
The paper is available at: https://easychair.org/publications/preprint/Rz9g
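The notion of a probabilistic calendar can be sketched as follows. This is a toy illustration of sampling slot-level availability, not the discovery algorithm proposed in the paper; the slot keys, probabilities, and default behaviour are all hypothetical.

```python
import random

# Hypothetical probabilistic calendar: (day, hour) -> probability the
# resource is available during that one-hour slot. A crisp calendar is the
# special case where every probability is 0.0 or 1.0.
calendar = {("Fri", 14): 0.9, ("Fri", 17): 0.5}

def is_available(slot, rng, default=0.0):
    """Sample whether the resource is available in the given slot."""
    return rng.random() < calendar.get(slot, default)

# Estimate availability of the Friday 14:00-15:00 slot by Monte Carlo sampling.
rng = random.Random(7)
rate = sum(is_available(("Fri", 14), rng) for _ in range(10_000)) / 10_000
print(rate)  # close to the configured 0.9
```

A simulator using such a calendar would draw availability per slot in this way, so that activity start times reflect the resource being intermittently unavailable rather than always on duty.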
Business Process Analytics: From Insights to Predictions (Marlon Dumas)
Keynote talk at the 13th Baltic Conference on Databases and Information Systems, Trakai, Lithuania, 2 July 2018.
Abstract
Business process analytics is a body of methods for analyzing data generated by the execution of business processes in order to extract insights about weaknesses and improvement opportunities, both at the tactical and operational levels. Tactical process analytics methods (also known as process mining) allow us to understand how a given business process is actually executed, if and how its execution deviates from expected or normative pathways, and what factors contribute to poor process performance or undesirable outcomes. Meantime, operational process analytics methods allow us to monitor ongoing executions of a business process in order to predict future states and undesirable outcomes at runtime (predictive process monitoring). Existing methods in this space allow us to predict, for example, which task will be executed next in a case, when, and by whom; when an ongoing case will complete; what its outcome will be; and how negative outcomes can be avoided. This keynote will present a framework for conceptualizing business process analytics methods and applications. The talk will provide an overview of state-of-the-art methods and tools in the field and will outline open challenges and research opportunities.
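As a flavour of predictive process monitoring, the sketch below trains a frequency-based next-activity predictor on toy traces. Real methods use far richer models (e.g. sequence models over case attributes); this baseline and its activity names are purely illustrative.

```python
from collections import Counter, defaultdict

# Toy completed traces from some historical log (invented for illustration).
traces = [
    ["Register", "Check", "Approve"],
    ["Register", "Check", "Reject"],
    ["Register", "Check", "Approve"],
]

def train(traces):
    """Count successors of each activity over the historical traces."""
    counts = defaultdict(Counter)
    for t in traces:
        for a, b in zip(t, t[1:]):
            counts[a][b] += 1
    return counts

def predict_next(model, activity):
    """Predict the most frequent historical successor of the current activity."""
    return model[activity].most_common(1)[0][0]

model = train(traces)
print(predict_next(model, "Check"))  # "Approve" (follows "Check" in 2 of 3 cases)
```

Even this crude baseline illustrates the runtime setting: given the prefix of an ongoing case, the model returns a prediction that a monitoring dashboard could surface before the case completes.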
OSTHUS-Allotrope presents "Laboratory Informatics Strategy" at SmartLab 2015 (OSTHUS)
Building your laboratory informatics strategy: The benefit of reference architectures & data standardization.
Presented by:
Wolfgang Colsman, OSTHUS
Dana Vanderwall, Bristol-Myers Squibb
Requirements engineering (RE), as part of the project development life cycle, has increasingly been recognized as the key to ensuring on-time, on-budget, and goal-based delivery of software projects. RE for big data projects is even more crucial because of the rapid growth of big data applications over the past few years. Data processing, as a part of big data RE, is essential to driving the big data RE process successfully. A business can be overwhelmed by data and underwhelmed by information, so data processing is very critical in big data projects. Traditional data processing techniques fail to extract useful information because of the main characteristics of big data: high volume, velocity, and variety. Data processing can benefit from process mining, which in turn helps to increase the productivity of big data projects. This paper highlights the capability of process mining in big data RE to discover valuable insights and business value from event logs and system processes. It also proposes a big data requirements engineering framework, named REBD, that helps software requirements engineers overcome many challenges of big data RE.
Splunk: How to Design, Build and Map IT Services (Splunk)
Your IT department supports critical business functions, processes and products. You're most effective when your technology initiatives are closely aligned and measured with specific business objectives. This session covers best practices and techniques for designing and building an effective service model, using the domain knowledge of your experts and capturing and reporting on key metrics that everyone can understand.
Process discovery and process mining have always been the opening chess move for most high-end IT automation consultancies and system integrators like us.
How GenAI will (not) change your business? (Marlon Dumas)
Not all new technology waves are the same. Some waves are vertical (3D printing, digital twins, blockchain) while others are horizontal (the PC in the 80s, the Web in the 90s). GenAI is a horizontal wave. The question is not whether GenAI will impact my business, but what the scope of this impact will be. In this talk, we will go through a journey of collisions: GenAI colliding with customer service, clerical work, information search, content production, IT development, product design, and other knowledge work. A common thread in understanding the impact of GenAI is to distinguish between descriptive use cases (search, summarize, expand, transcribe & translate) and creative use cases.
Walking the Way from Process Mining to AI-Driven Process Optimization (Marlon Dumas)
While generative AI grabs headlines, most organizations are yet to achieve continuous process improvement from predictive and prescriptive analytics.
Why? It’s largely about data, people, and a methodical approach to deploy AI to connect data and people. The good news is that if your organization has built a process mining capability, you are well placed to climb the ladder to achieve AI-driven process optimization. But to get there, you need a disciplined step-by-step approach along two tracks: a tactical management track and an operational management track.
First, it’s about predicting what will happen if you leave your process as-is, and what will happen if you implement a change in your process. At a tactical level, a predictive capability allows you to prioritize improvement opportunities. At an operational level, it allows you to predict issues, such as deadline violations. The challenges here are how to manage the inherent uncertainty of data-driven AI systems, and how to change your people and culture to manage processes proactively, rather than reactively. One thing is to deploy predictive dashboards, another entirely different thing is to get people to use them effectively to improve the processes.
Next, it’s about becoming preemptive: continuously optimizing your processes by leveraging streams of data-driven recommendations to trigger changes and actions. At the tactical level, this prescriptive capability allows you to implement the right changes to maximize competing KPIs. At the operational level, it means triggering interventions in your processes to “wow” customers and to meet SLAs in a cost-effective manner. The challenge here is how to help process owners, workers, and other stakeholders understand the causes of performance issues and how the recommendations generated by the AI-driven optimization system will tackle those causes.
And finally, as an icing on the cake, generative AI allows you to produce improvement scenarios to adapt to external changes. Importantly, the transformative potential of generative AI in the context of process improvement does not come from its ability to provide question-and-answer interfaces to query data. It comes from its ability to support continuous process adaptation by generating and validating hypotheses based on a holistic view of your organization.
In this talk, we will discuss how organizations are driving sustainable business value by strategically layering predictive, prescriptive, and generative AI onto a process mining foundation, one brick at a time.
Industry keynote talk by Marlon Dumas at the 5th International Conference on Process Mining (ICPM'2023), Rome, Italy, 25 October 2023
More Related Content
Similar to Why am I Waiting Data-Driven Analysis of Waiting Times in Business Processes
IT PROJECT SHOWSTOPPER FRAMEWORK: THE VIEW OF PRACTITIONERSijseajournal
The study intended to unravel critical IT project showstoppers which tend to halt IT projects temporarily or permanently, and ultimately cause them to fail, by positioning them in the systems development life cycle (SDLC) framework. Interviewing 8 IT project and program managers of the banking and telecommunications industries in Ghana individually and in a group, 19 critical showstoppers were identified spanning the whole SDLC. Generally, it was observed that for the successful completion of IT projects, the expertise and availability of project managers and team members are critical. Again, the project manager must be able to prove that the project is in line with the objectives and strategic direction of the business, is being mounted to gain competitive advantage, and has a solid business case. Thirdly, funding is key at all stages of the cycle, as well as approval for continuation at various stages.
Role of Operational System Design in Data Warehouse Implementation: Identifyi...iosrjce
The data warehouse design process takes input from an organization's operational system, and the quality of a data warehousing solution depends on the design of that operational system. Operational system implementations often have limitations, so data warehouse design cannot proceed straightforwardly. In this paper, we investigate the operational system of an organization to identify such limitations and determine the role of operational system design in the process of data warehouse design and implementation. We identify possible methods to handle these limitations and propose techniques for obtaining a quality data warehousing solution despite them. To ground the work in a live example, we use the National Rural Health Mission (NRHM) project, a national health-sector project managed by the Indian Government across the country. Its complex structure and high volume of data make it an ideal case for data warehouse implementation.
Process mining is a set of data analysis techniques and tools for extracting information from so-called event logs, which are commonly available in modern IT systems. Event logs register activities performed by an organization's employees. Such logs are typically created, inter alia, in document workflow systems, customer relationship management systems, and task management systems. Because information systems support the operation of many areas of an organization, event logs record its real manner of operation: real, as opposed to presumed. Process mining joins ideas of process modelling and analysis on the one hand with data mining and machine learning on the other.
A well-performing organisation consists of individuals collaborating in some social context to achieve common and individual goals. To achieve those goals effectively, organisations must address the challenges of a dynamic, turbulent, and competitive environment. This leads to constant, ongoing change in working methods, methods of goods and service delivery, and so on. Knowledge about the real shape of business processes is the first step to performing such change effectively. Two characteristics of process mining facilitate efficient organizational change and distinguish it from other data analytic techniques: (1) a focus on people and their decisions, interactions, collaboration patterns, and organizational dependencies; (2) a focus on the activities performed by those people and the causal and time dependencies among those activities.
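The notion of an event log above can be made concrete with a small sketch. The log and activity names below are hypothetical; the function counts directly-follows relations, one of the basic building blocks of process discovery:

```python
from collections import Counter

# Hypothetical event log: each trace is the ordered list of activities
# recorded for one case (e.g., one invoice).
event_log = [
    ["Register invoice", "Post invoice", "Pay invoice"],
    ["Register invoice", "Notify acceptance", "Post invoice", "Pay invoice"],
    ["Register invoice", "Post invoice", "Pay invoice"],
]

def directly_follows(log):
    """Count how often activity a is directly followed by activity b."""
    counts = Counter()
    for trace in log:
        for a, b in zip(trace, trace[1:]):
            counts[(a, b)] += 1
    return counts

counts = directly_follows(event_log)
print(counts[("Post invoice", "Pay invoice")])  # 3
```

Process discovery techniques build on such relations to reconstruct the real flow of a process from its recorded behavior.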
Discovery and Simulation of Business Processes with Probabilistic Resource Av...Marlon Dumas
In the field of business process simulation, the availability of resources is captured by assigning a calendar to each resource, e.g., Monday-Friday 9:00-18:00. Resources are assumed to be always available to perform activities during their calendar. This assumption often does not hold due to interruptions, breaks, or because resources time-share across multiple processes. A simulation model that captures availability via crisp time slots (a resource is either on or off during a slot) does not capture these behaviors, leading to inaccuracies in the simulation output. This paper presents a simulation approach wherein resource availability is modeled probabilistically. In this approach, each availability time slot is associated with a probability, allowing us to capture, for example, that a resource is available on Fridays between 14:00-15:00 with 90% probability and between 17:00-18:00 with 50% probability. The paper proposes an algorithm to discover probabilistic availability calendars from event logs. An empirical evaluation shows that simulation models with probabilistic calendars discovered from event logs, replicate the temporal distribution of activity instances and cycle times of a process more closely than simulation models with crisp calendars.
This presentation was delivered at the 5th International Conference on Process Mining (ICPM'2023), Rome, Italy, October 2023.
The paper is available at: https://easychair.org/publications/preprint/Rz9g
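The probabilistic-calendar idea can be illustrated with a short sketch. The slots and probabilities are taken from the example in the abstract; the sampling function is a simplification for illustration, not the paper's discovery algorithm:

```python
import random

# Probabilistic availability calendar: each weekly slot maps to the
# probability that the resource is actually available during that slot.
calendar = {
    ("Friday", 14): 0.9,  # available Fridays 14:00-15:00 with 90% probability
    ("Friday", 17): 0.5,  # available Fridays 17:00-18:00 with 50% probability
}

def is_available(calendar, day, hour, rng):
    """Sample availability for one slot; unlisted slots count as off-duty."""
    return rng.random() < calendar.get((day, hour), 0.0)

# Over many simulated weeks, the observed availability rate approaches
# the calendar's probability for that slot.
rng = random.Random(42)
rate = sum(is_available(calendar, "Friday", 14, rng) for _ in range(10_000)) / 10_000
```

A crisp calendar is the special case where every probability is 0 or 1, which is exactly the assumption the paper relaxes.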
Business Process Analytics: From Insights to PredictionsMarlon Dumas
Keynote talk at the 13th Baltic Conference on Databases and Information Systems, Trakai, Lithuania, 2 July 2018.
Abstract
Business process analytics is a body of methods for analyzing data generated by the execution of business processes in order to extract insights about weaknesses and improvement opportunities, both at the tactical and operational levels. Tactical process analytics methods (also known as process mining) allow us to understand how a given business process is actually executed, if and how its execution deviates with respect to expected or normative pathways, and what factors contribute to poor process performance or undesirable outcomes. Meantime, operational process analytics methods allow us to monitor ongoing executions of a business process in order to predict future states and undesirable outcomes at runtime (predictive process monitoring). Existing methods in this space allow us to predict, for example, which task will be executed next in a case, when, and who will perform it? When will an ongoing case complete? What will its outcome be and how can negative outcomes be avoided? This keynote will present a framework for conceptualizing business process analytics methods and applications. The talk will provide an overview of state-of-art methods and tools in the field and will outline open challenges and research opportunities.
OSTHUS-Allotrope presents "Laboratory Informatics Strategy" at SmartLab 2015OSTHUS
Building your laboratory informatics strategy: The benefit of reference architectures & data standardization.
Presented by:
Wolfgang Colsman, OSTHUS
Dana Vanderwall, Bristol-Myers Squibb
Requirements engineering (RE), as part of the project development life cycle, has increasingly been recognized as the key to on-time, on-budget, and goal-based delivery of software projects. RE of big data projects is even more crucial because of the rapid growth of big data applications over the past few years. Data processing, as part of big data RE, is essential to driving the big data RE process successfully. Businesses can be overwhelmed by data yet underwhelmed by information, so data processing is critical in big data projects. Traditional data processing techniques fail to uncover useful information because of the main characteristics of big data: high volume, velocity, and variety. Data processing can benefit from process mining, which in turn helps increase the productivity of big data projects. This paper highlights the capability of process mining in big data RE to discover valuable insights and business value from event logs and system processes. It also proposes a big data requirements engineering framework, named REBD, that helps software requirements engineers overcome many challenges of big data RE.
Splunk: How to Design, Build and Map IT ServicesSplunk
Your IT department supports critical business functions, processes and products. You're most effective when your technology initiatives are closely aligned and measured with specific business objectives. This session covers best practices and techniques for designing and building an effective service model, using the domain knowledge of your experts and capturing and reporting on key metrics that everyone can understand.
Process discovery and process mining have always been the opening chess move for most high-end IT automation consulting firms and system integrators like us.
Similar to Why am I Waiting Data-Driven Analysis of Waiting Times in Business Processes (20)
How GenAI will (not) change your business?Marlon Dumas
Not all new technology waves are the same. Some waves are vertical (3D printing, digital twins, blockchain) while others are horizontal (the PC in the 80s, the Web in the 90s). GenAI is a horizontal wave. The question is not whether GenAI will impact your business, but what the scope of this impact will be. In this talk, we will go through a journey of collisions: GenAI colliding with customer service, clerical work, information search, content production, IT development, product design, and other knowledge work. A common thread in understanding the impact of GenAI is to distinguish between descriptive use cases (search, summarize, expand, transcribe and translate) and creative use cases.
Walking the Way from Process Mining to AI-Driven Process OptimizationMarlon Dumas
While generative AI grabs headlines, most organizations are yet to achieve continuous process improvement from predictive and prescriptive analytics.
Why? It’s largely about data, people, and a methodical approach to deploy AI to connect data and people. The good news is that if your organization has built a process mining capability, you are well placed to climb the ladder to achieve AI-driven process optimization. But to get there, you need a disciplined step-by-step approach along two tracks: a tactical management track and an operational management track.
First, it’s about predicting what will happen if you leave your process as-is, and what will happen if you implement a change in your process. At a tactical level, a predictive capability allows you to prioritize improvement opportunities. At an operational level, it allows you to predict issues, such as deadline violations. The challenges here are how to manage the inherent uncertainty of data-driven AI systems, and how to change your people and culture to manage processes proactively rather than reactively. Deploying predictive dashboards is one thing; getting people to use them effectively to improve processes is another thing entirely.
Next, it’s about becoming preemptive: continuously optimizing your processes by leveraging streams of data-driven recommendations to trigger changes and actions. At the tactical level, this prescriptive capability allows you to implement the right changes to maximize competing KPIs. At the operational level, it means triggering interventions in your processes to “wow” customers and to meet SLAs in a cost-effective manner. The challenge here is how to help process owners, workers, and other stakeholders understand the causes of performance issues, and how the recommendations generated by the AI-driven optimization system will tackle those causes.
And finally, as the icing on the cake, generative AI allows you to produce improvement scenarios to adapt to external changes. Importantly, the transformative potential of generative AI in the context of process improvement does not come from its ability to provide question-and-answer interfaces to query data. It comes from its ability to support continuous process adaptation by generating and validating hypotheses based on a holistic view of your organization.
In this talk, we will discuss how organizations are driving sustainable business value by strategically layering predictive, prescriptive, and generative AI onto a process mining foundation, one brick at a time.
Industry keynote talk by Marlon Dumas at the 5th International Conference on Process Mining (ICPM'2023), Rome, Italy, 25 October 2023
Can I Trust My Simulation Model? Measuring the Quality of Business Process Si...Marlon Dumas
Business Process Simulation (BPS) is an approach to analyze the performance of business processes under different scenarios. For example, BPS allows us to estimate what would be the cycle time of a process if one or more resources became unavailable. The starting point of BPS is a process model annotated with simulation parameters (a BPS model). BPS models may be manually designed, based on information collected from stakeholders and empirical observations, or automatically discovered from execution data. Regardless of its origin, a key question when using a BPS model is how to assess its quality. In this paper, we propose a collection of measures to evaluate the quality of a BPS model w.r.t. its ability to replicate the observed behavior of the process. We advocate an approach whereby different measures tackle different process perspectives. We evaluate the ability of the proposed measures to discern the impact of modifications to a BPS model, and their ability to uncover the relative strengths and weaknesses of two approaches for automated discovery of BPS models. The evaluation shows that the measures not only capture how close a BPS model is to the observed behavior, but they also help us to identify sources of discrepancies.
Presentation delivered by David Chapela-Campa at the BPM'2023 conference, Utrecht, September 2023.
Business Process Optimization: Status and PerspectivesMarlon Dumas
For decades, business process optimization has been largely about art and craft (and sometimes wizardry). Apart from narrowly scoped approaches to optimize resource allocation (often assuming that workers behave like robots), a lot of business process optimization relies on high-level guidelines, with A/B testing for idea validation, which is hard to scale to complex processes. As a result, managers end up settling for a "good enough" process. Can we do more? In this talk, we review recent work on the use of high-fidelity simulation models discovered from execution data. The talk also explores the possibilities (and perils) that LLMs bring to the field of business process optimization.
This talk was delivered at the Workshop on Data-Driven Business Process Optimization at the BPM'2023 conference.
Learning When to Treat Business Processes: Prescriptive Process Monitoring wi...Marlon Dumas
Paper presentation at the 35th International Conference on Advanced Information Systems Engineering (CAiSE'2023).
Abstract.
Increasing the success rate of a process, i.e. the percentage of cases that end in a positive outcome, is a recurrent process improvement goal. At runtime, there are often certain actions (a.k.a. treatments) that workers may execute to lift the probability that a case ends in a positive outcome. For example, in a loan origination process, a possible treatment is to issue multiple loan offers to increase the probability that the customer takes a loan. Each treatment has a cost. Thus, when defining policies for prescribing treatments to cases, managers need to consider the net gain of the treatments. Also, the effect of a treatment varies over time: treating a case earlier may be more effective than later in a case. This paper presents a prescriptive monitoring method that automates this decision-making task. The method combines causal inference and reinforcement learning to learn treatment policies that maximize the net gain. The method leverages a conformal prediction technique to speed up the convergence of the reinforcement learning mechanism by separating cases that are likely to end up in a positive or negative outcome, from uncertain cases. An evaluation on two real-life datasets shows that the proposed method outperforms a state-of-the-art baseline.
This talk introduces the concept of an Augmented Business Process Management System (ABPMS): a process-aware information system that relies on trustworthy AI technology to reason and act upon data, within a set of restrictions, with the aim of continuously adapting and improving a set of business processes with respect to one or more key performance indicators.
The talk describes the transition from existing process mining technology to AI-augmented BPM as a pyramid, where predictive, prescriptive, conversational, and reasoning capabilities are stacked up incrementally to reach the level of Augmented BPM.
Talk delivered at the AAAI'2023 Workshop on AI for Business Process Management.
Process Mining and Data-Driven Process SimulationMarlon Dumas
Guest lecture delivered at the Institut Teknologi Sepuluh Nopember on 8 December 2022.
This lecture gives an overview of process mining and simulation techniques, and how the two can be used together in process improvement projects.
Modeling Extraneous Activity Delays in Business Process SimulationMarlon Dumas
This paper presents a technique to enhance the fidelity of business process simulation models by detecting unexplained (extraneous) delays from business process execution data, and modeling these delays in the simulation model, via timer events.
The presentation was delivered at the 4th International Conference on Process Mining (ICPM'2022).
Paper available at: https://arxiv.org/abs/2206.14051
Business Process Simulation with Differentiated Resources: Does it Make a Dif...Marlon Dumas
Existing methods for discovering business process simulation models from execution data (event logs) assume that all resources in a pool have the same performance and share the same availability calendars. This paper proposes a method for discovering simulation models, wherein each resource is treated as an individual entity, with its own performance and availability calendar. An evaluation shows that simulation models with differentiated resources more closely replicate the distributions of cycle times and the work rhythm in a process than models with undifferentiated resources. The paper is available at: https://link.springer.com/chapter/10.1007/978-3-031-16103-2_24
Prescriptive Process Monitoring Under Uncertainty and Resource ConstraintsMarlon Dumas
This paper presents an approach to trigger interventions at runtime, in order to improve the success rate of a process, when the number of resources who can perform these interventions is limited.
The paper is available at: https://link.springer.com/chapter/10.1007/978-3-031-16171-1_13
The presentation delivered at the 20th International Conference on Business Process Management (BPM'2022), in Muenster, Germany, September 2022.
Slides of a lecture delivered at the First Process Mining Summer School in Aachen, Germany, July 2022.
This lecture introduces techniques in the area of "task mining", with an emphasis on Robotic Process Mining. Robotic Process Mining (RPM) is a family of techniques to discover repetitive routines that can be automated using Robotic Process Automation (RPA) technology, by analyzing interactions between one or more workers and one or more software applications during the performance of one or more tasks in a business process. In general, RPM techniques take as input logs of user interactions (UI logs). These UI logs are recorded while workers interact with one or more applications, typically desktop applications. Based on these logs, RPM techniques produce specifications of one or more routines that can be automated using RPA or related tools.
Accurate and Reliable What-If Analysis of Business Processes: Is it Achievable?Marlon Dumas
In this talk, I discuss the problem of how to discover simulation models that can be used to accurately and reliably predict the impact of a change on a business process, e.g., what if we automate an activity? What if 10% of our workers become unavailable? I focus on recent approaches that exploit the availability of data in enterprise systems to address this question.
Learning Accurate Business Process Simulation Models from Event Logs via Auto...Marlon Dumas
Paper presentation at the International Conference on Advanced Information Systems Engineering (CAiSE).
This paper presents an approach to automatically discover business process simulation models from event logs by combining process mining and deep learning techniques.
Paper available at: https://link.springer.com/chapter/10.1007/978-3-031-07472-1_4
Process Mining: A Guide for PractitionersMarlon Dumas
Paper presentation delivered at the Research Conference on Challenges in Information Science (RCIS 2022). The paper studies the following questions:
1) What are the most common use cases for process mining methods?
2) What business questions do process mining methods address?
Paper available at:
https://link.springer.com/chapter/10.1007/978-3-031-05760-1_16
Process Mining for Process Improvement.pptxMarlon Dumas
Presentation of a research paper at the 16th International Conference on Research Challenges in Information Science (RCIS). The paper presents the results of an empirical study on how practitioners use process mining to identify business process improvement opportunities. The paper is available at: https://link.springer.com/chapter/10.1007/978-3-031-05760-1_13
Data-Driven Analysis of Batch Processing Inefficiencies in Business ProcessesMarlon Dumas
Slides of a research paper presentation at the 16th International Conference on Research Challenges in Information Science (RCIS).
The research paper presents an approach to analyze event logs of business processes in order to identify batched activities and to analyze the waiting times caused by these activities.
Paper available at: https://link.springer.com/chapter/10.1007/978-3-031-05760-1_14
Optimización de procesos basada en datosMarlon Dumas
Talk at BPM Day Lima 2021.
In this talk, we discuss emerging methods and applications in the field of data-driven process optimization. We cover advances in process mining, methods for building digital twins of processes, and predictive monitoring methods. Through examples and case studies, we show how these methods can guide digital transformation and continuous process improvement initiatives. In particular, we illustrate the use of these methods to: (1) analyze the performance of business processes so as to identify frictions and automation opportunities; (2) predict the impact of changes, and in particular, the impact of an automation initiative; (3) make predictions about process performance and adjust process execution so as to prevent SLA violations, customer complaints, and other undesirable events.
Process Mining and AI for Continuous Process ImprovementMarlon Dumas
Talk delivered at BPM Day Rio Grande do Sul on 11 November 2021.
Abstract.
Process mining is a technology that marries methods from business process management and from data science, to support operational excellence and digital transformation. Process mining tools can transform data extracted from enterprise systems, into visualizations and reports that allow managers to improve organizational performance along different dimensions, such as efficiency, quality, and compliance. In this talk, we will give an overview of the capabilities of process mining tools, and we will illustrate the benefits of process mining via several case studies in the fields of insurance, manufacturing, and IT service management.
Prescriptive Process Monitoring for Cost-Aware Cycle Time ReductionMarlon Dumas
Paper presentation at the 3rd International Conference on Process Mining (ICPM), 4 November 2021.
The paper is available at: https://arxiv.org/abs/2105.07111
Mine Your Simulation Model: Automated Discovery of Business Process Simulatio...Marlon Dumas
Keynote talk by Marlon Dumas at the SIMULTECH 2021 conference. The talk gives an overview of ongoing research on automated construction of simulation models / digital twins from business process execution logs, including approaches that combine discrete event simulation with deep learning methods.
Explore our comprehensive data analysis project presentation on predicting product ad campaign performance. Learn how data-driven insights can optimize your marketing strategies and enhance campaign effectiveness. Perfect for professionals and students looking to understand the power of data analysis in advertising. for more details visit: https://bostoninstituteofanalytics.org/data-science-and-artificial-intelligence/
Adjusting primitives for graph : SHORT REPORT / NOTESSubhajit Sahu
Graph algorithms like PageRank commonly operate on Compressed Sparse Row (CSR), an adjacency-list-based graph representation.
Multiply with different modes (map)
1. Performance of sequential execution based vs OpenMP based vector multiply.
2. Comparing various launch configs for CUDA based vector multiply.
Sum with different storage types (reduce)
1. Performance of vector element sum using float vs bfloat16 as the storage type.
Sum with different modes (reduce)
1. Performance of sequential execution based vs OpenMP based vector element sum.
2. Performance of memcpy vs in-place based CUDA based vector element sum.
3. Comparing various launch configs for CUDA based vector element sum (memcpy).
4. Comparing various launch configs for CUDA based vector element sum (in-place).
Sum with in-place strategies of CUDA mode (reduce)
1. Comparing various launch configs for CUDA based vector element sum (in-place).
Techniques to optimize the PageRank algorithm usually fall into two categories. One is to try reducing the work per iteration, and the other is to try reducing the number of iterations. These goals are often at odds with one another. Skipping computation on vertices which have already converged has the potential to save iteration time. Skipping in-identical vertices, with the same in-links, helps reduce duplicate computations and thus could help reduce iteration time. Road networks often have chains which can be short-circuited before PageRank computation to improve performance: the final ranks of chain nodes can be easily calculated. This could reduce both the iteration time and the number of iterations. If a graph has no dangling nodes, the PageRank of each strongly connected component can be computed in topological order. This could help reduce the iteration time and the number of iterations, and also enable multi-iteration concurrency in PageRank computation. The combination of all of the above methods is the STICD algorithm [sticd]. For dynamic graphs, unchanged components whose ranks are unaffected can be skipped altogether.
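The "skip converged vertices" optimization mentioned above can be sketched as follows. The graph and damping factor are illustrative, the sketch assumes no dangling nodes, and skipping is a heuristic: a skipped vertex's in-neighbors may still change, so production implementations need extra care.

```python
def pagerank_skip_converged(out_links, d=0.85, tol=1e-12, max_iter=100):
    """Power-iteration PageRank that stops updating vertices whose rank
    has already converged (assumes every vertex has at least one out-link)."""
    n = len(out_links)
    in_links = {v: [] for v in out_links}
    for u, outs in out_links.items():
        for v in outs:
            in_links[v].append(u)
    rank = {v: 1.0 / n for v in out_links}
    converged = set()
    for _ in range(max_iter):
        new_rank = {}
        for v in out_links:
            if v in converged:
                new_rank[v] = rank[v]  # skip recomputation for converged vertices
                continue
            r = (1 - d) / n + d * sum(rank[u] / len(out_links[u]) for u in in_links[v])
            if abs(r - rank[v]) < tol:
                converged.add(v)
            new_rank[v] = r
        rank = new_rank
        if len(converged) == n:
            break
    return rank

# A 3-cycle: by symmetry, all vertices end up with rank 1/3.
ranks = pagerank_skip_converged({"a": ["b"], "b": ["c"], "c": ["a"]})
```

The same skeleton is where the other optimizations (short-circuiting chains, per-component computation in topological order) would plug in.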
Opendatabay - Open Data Marketplace.pptxOpendatabay
Opendatabay.com unlocks the power of data for everyone. Open Data Marketplace fosters a collaborative hub for data enthusiasts to explore, share, and contribute to a vast collection of datasets.
It is the first open hub for data enthusiasts to collaborate and innovate: a platform to explore, share, and contribute to a vast collection of datasets. Through robust quality control and innovative technologies like blockchain verification, Opendatabay ensures the authenticity and reliability of datasets, empowering users to make data-driven decisions with confidence. It leverages cutting-edge AI technologies to enhance the data exploration, analysis, and discovery experience.
From intelligent search and recommendations to automated data productisation and quotation, Opendatabay's AI-driven features streamline the data workflow. Finding the data you need shouldn't be complex: Opendatabay simplifies the data acquisition process with an intuitive interface and robust search tools. Effortlessly explore, discover, and access the data you need, allowing you to focus on extracting valuable insights. Opendatabay also breaks new ground with dedicated, AI-generated synthetic datasets.
Leverage these privacy-preserving datasets for training and testing AI models without compromising sensitive information. Opendatabay prioritizes transparency by providing detailed metadata, provenance information, and usage guidelines for each dataset, ensuring users have a comprehensive understanding of the data they're working with. By leveraging a powerful combination of distributed ledger technology and rigorous third-party audits, Opendatabay ensures the authenticity and reliability of every dataset. Security is at the core of Opendatabay: the marketplace implements stringent security measures, including encryption, access controls, and regular vulnerability assessments, to safeguard your data and protect your privacy.
Why am I Waiting? Data-Driven Analysis of Waiting Times in Business Processes
1. Why am I Waiting?
Data-Driven Analysis of Waiting Times
in Business Processes
Katsiaryna Lashkevich, Fredrik Milani,
David Chapela-Campa, Ihar Suvorau and Marlon Dumas
35th International Conference on Advanced Information
Systems Engineering (CAiSE ’23)
2. Introduction
Waiting time visualization using process mining.
"Why Am I Waiting? Data-Driven Analysis of Waiting Times in Business Processes" by Lashkevich et al.
[Diagram: IT systems → Event logs → Process mining techniques → Process insights]
Images: Flaticon.com
3. Research questions
1. What are the direct causes of waiting time between activity instances in a process?
2. How can these waiting time causes be automatically discovered from an event log?
3. How can we effectively describe the contribution of each waiting time cause to the temporal efficiency of a process?
4. Why am I waiting?
Direct causes of waiting time for an enabled activity instance:
- Batching
- Resource contention
- Prioritization
- Resource unavailability (off-duty)
- Extraneous factors
5. Approach
Overview of the proposed approach.
6. Approach
A heuristic concurrency oracle identifies concurrent activity pairs, yielding causal directly-follows relations between activities.
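A common heuristic for such a concurrency oracle (an assumption here: the paper's oracle may be more refined) flags two activities as concurrent when they are observed in both orders across the log:

```python
def concurrent_pairs(log):
    """Flag activity pairs as concurrent if a directly precedes b in some
    trace and b directly precedes a in another (both orders observed)."""
    follows = set()
    for trace in log:
        follows.update(zip(trace, trace[1:]))
    return {frozenset(p) for p in follows if (p[1], p[0]) in follows and p[0] != p[1]}

# Hypothetical traces: "Notify acceptance" and "Post invoice" occur in both
# orders, so they are flagged as concurrent rather than causally ordered.
log = [
    ["Register invoice", "Notify acceptance", "Post invoice"],
    ["Register invoice", "Post invoice", "Notify acceptance"],
]
pairs = concurrent_pairs(log)
```

Transitions between concurrent activities are excluded when building the causal graph of each case.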
7. Approach
Causal graph of a case. Activity instance transitions:
- Register invoice → Notify acceptance
- Register invoice → Post invoice
- Post invoice → Pay invoice
8. Approach
Causal graph of a case (continued): same activity instance transitions as the previous slide.
9. Approach
10. Approach
Waiting time due to batching, identified with the batching discovery technique of: Lashkevich, K., Milani, F., Chapela-Campa, D., Dumas, M.: Data-driven analysis of batch processing inefficiencies in business processes. In: RCIS, pp. 231–247. Springer (2022).
11. Approach
Waiting time due to resource contention and due to prioritization.
12. Approach
Waiting time due to resource unavailability and due to extraneous factors, identified with the working calendar mining technique of: López-Pintado, O., Dumas, M.: Business process simulation with differentiated resources: Does it make a difference? In: BPM, pp. 361–378. Springer (2022).
13. Approach
Output: waiting time causes per transition. Example row:
Source activity: Post invoice | Target activity: Pay invoice | Total waiting time: 100 h | Case frequency: 100% | Total frequency: 1 000
Breakdown of the 100 h of waiting time across causes (from the slide's chart): 8 h, 30 h, 45 h, 5 h, and 12 h.
14. Approach
Cycle time efficiency (CTE) = PT / (PT + WT)
Impact of a waiting time cause = the CTE obtained if that particular waiting time is eliminated.
Metrics:
1. impact of each waiting time cause on the process CTE,
2. impact of each transition on the process CTE,
3. impact of each waiting time cause in each transition on the process CTE.
15. Evaluation
Waiting time causes in a real-life manufacturing process. The log includes 225 cases with start and end timestamps, and resources.
Total waiting time: 20 yrs 7 mths 27 d, split across causes (from the slide's chart): 11 yrs 2 mths 28 d; 5 yrs 2 mths 29 d; 1 yr 6 mths 13 d; 10 mths 11 d; 1 yr 9 mths 10 d.
[Process map of the real-life manufacturing process.]
16. Evaluation
Potential CTE improvement per waiting time cause in the manufacturing process.
17. Evaluation
Waiting time causes in activity transitions of the manufacturing process.
18. Summary
1. What are the direct causes of waiting time between activity instances in a process?
- Batching, resource contention, prioritization, resource unavailability, and extraneous factors.
2. How can these waiting time causes be automatically discovered from an event log?
- With the developed technique, which decomposes waiting time into direct causes from an activity instance log containing enabled, start, and end times, and resources.
3. How can we effectively describe the contribution of each waiting time cause to the temporal efficiency of a process?
- By quantifying the impact of each waiting time cause on temporal performance using CTE.
Tool implementation:
http://kronos.cloud.ut.ee/
https://github.com/AutomatedProcessImprovement/waiting-time-analysis/