Presentation of a research paper at the 16th International Conference on Research Challenges in Information Science (RCIS). The paper presents the results of an empirical study on how practitioners use process mining to identify business process improvement opportunities. The paper is available at: https://link.springer.com/chapter/10.1007/978-3-031-05760-1_13
IT Projects are typically complex undertakings in which gathering requirements from multiple stakeholders can be difficult. The practice of Quality Management during the course of IT Projects should, in theory, lead to better governance and better overall outcomes. This paper explores four individual IT Projects and details the methods of requirements gathering, stakeholder engagement and overall Quality Management employed in each. It then reviews the common themes found across the projects and discusses their application from a Quality Management perspective.
Impact assessment of factors affecting information technology projects in riv...eSAT Publishing House
IJRET: International Journal of Research in Engineering and Technology is an international, peer-reviewed online journal published by eSAT Publishing House for the enhancement of research in various disciplines of Engineering and Technology. The aim and scope of the journal is to provide an academic medium and an important reference for the advancement and dissemination of research results that support high-level learning, teaching and research in the fields of Engineering and Technology. We bring together scientists, academicians, field engineers, scholars and students of related fields of Engineering and Technology.
In this webinar, we will talk about some of the outcomes of recent projects, along with the implications of those outcomes and how you can adopt some of the ideas into your own projects.
How can a data scientist expert solve real world problems? priyanka rajput
Expert data scientists are essential in today's data-driven world for solving challenging real-world problems in a variety of fields. Their broad skill set, which spans data collection, preparation, modelling, validation, and deployment, equips them to extract useful information from large, complex datasets. You can opt for a data science course in Hisar, Delhi, Pune, Chennai, or other parts of India.
“RAKTANCHAL”: A Blood Bank Finder Application (Priyanka Tiwari)ArpitMishra139426
HR FINAL YEAR PROJECT
This idea came to mind from my own family's situation. Some time ago, a relative of mine met with an accident. He needed blood, but at that time there was no blood bank available in my town.
So I thought I would build an app that helps people communicate easily with blood banks.
This idea is not only for business purposes; it can also save human lives and make life easier and happier.
Chapter 3: FEASIBILITY STUDY
A feasibility study is a high-level capsule version of the entire systems analysis and design process. The study begins by clarifying the problem definition; the purpose of the feasibility analysis is to determine whether the project is worth doing. Once an acceptable problem definition has been generated, the analyst develops a logical model of the system, and alternatives are searched for and analyzed carefully. There are five parts to a feasibility study.
i. Managerial feasibility
ii. Technical feasibility
iii. Financial feasibility
iv. Commercial and economic feasibility
v. Social feasibility or acceptability
1. Managerial feasibility
Operational feasibility is the measure of how well a proposed system solves the problems, and takes advantage of the opportunities identified during scope definition and how it satisfies the requirements identified in the requirements analysis phase of system development.
The operational feasibility assessment focuses on the degree to which the proposed development project fits in with the existing business environment and objectives with regard to development schedule, delivery date, corporate culture, and existing business processes. To ensure success, desired operational outcomes must be imparted during design and development. These include such design-dependent parameters as reliability, maintainability, supportability, usability, producibility, disposability, sustainability, affordability and others. These parameters must be considered at the early stages of design if the desired operational behaviours are to be realised.
A system design and development requires appropriate and timely application of engineering and management efforts to meet the previously mentioned parameters.
A system may serve its intended purpose most effectively when its technical and operating characteristics are engineered into the design. Therefore, operational feasibility is a critical aspect of systems engineering that needs to be an integral part of the early design phases.
2. Technical feasibility
This involves questions such as whether the technology needed for the system exists, how difficult it will be to build, and whether the firm has
The GICT Certified Predictive Modeler (CPM) course covers the concepts of business analytics, with a particular focus on predictive analytics.
Find out more: https://globalicttraining.com
Business Capability-centric Management of Services and Business Process ModelsWassim Derguech
With the advent of Industry 4.0, more and more companies are actively working on digitising their assets (i.e., services, processes, etc.) for better control, collaboration, modularity, analysis, etc. By 2020 more than 80% of companies will have digitised their business processes and value chains. This creates more services and processes, making their indexing, discovery, configuration, etc. more challenging. Thus, digitised assets need a data model to describe them, together with algorithms for indexing, discovery and configuration.
This thesis details a concept model for describing the business capability of services and business processes from a functional perspective, in terms of what they achieve, together with related business properties. Furthermore, this work proposes the aggregation, indexing, discovery and configuration of services and business processes using the concept of business capability.
SharePoint "Moneyball" - The Art and Science of Winning the SharePoint Metric...Susan Hanley
Measurement is not just about looking for a bottom-line result to justify investments. It’s also a tool to provide feedback about where the organization is along the road to successfully leveraging investments in SharePoint and the business outcomes it provides. At every stage in the development of your solution, metrics provide a valuable means for focusing attention on desired behaviors and results. This presentation showcases a practical and realistic framework for SharePoint metrics based on real world examples and successes.
How GenAI will (not) change your business?Marlon Dumas
Not all new technology waves are the same. Some waves are vertical (3D printing, digital twins, blockchain), while others are horizontal (the PC in the 80s, the Web in the 90s). GenAI is a horizontal wave. The question is not whether GenAI will impact my business, but what the scope of this impact will be. In this talk, we will go through a journey of collisions: GenAI colliding with customer service, clerical work, information search, content production, IT development, product design, and other knowledge work. A common thread for understanding the impact of GenAI is to distinguish between descriptive use cases (search, summarize, expand, transcribe & translate) and creative use cases.
Walking the Way from Process Mining to AI-Driven Process OptimizationMarlon Dumas
While generative AI grabs headlines, most organizations are yet to achieve continuous process improvement from predictive and prescriptive analytics.
Why? It’s largely about data, people, and a methodical approach to deploy AI to connect data and people. The good news is that if your organization has built a process mining capability, you are well placed to climb the ladder to achieve AI-driven process optimization. But to get there, you need a disciplined step-by-step approach along two tracks: a tactical management track and an operational management track.
First, it’s about predicting what will happen if you leave your process as-is, and what will happen if you implement a change in your process. At a tactical level, a predictive capability allows you to prioritize improvement opportunities. At an operational level, it allows you to predict issues, such as deadline violations. The challenges here are how to manage the inherent uncertainty of data-driven AI systems, and how to change your people and culture to manage processes proactively rather than reactively. It is one thing to deploy predictive dashboards; it is another thing entirely to get people to use them effectively to improve the processes.
Next, it’s about becoming preemptive: continuously optimizing your processes by leveraging streams of data-driven recommendations to trigger changes and actions. At the tactical level, this prescriptive capability allows you to implement the right changes to maximize competing KPIs. At the operational level, it means triggering interventions in your processes to “wow” customers and to meet SLAs in a cost-effective manner. The challenge here is how to help process owners, workers, and other stakeholders understand the causes of performance issues, and how the recommendations generated by the AI-driven optimization system will tackle those causes.
And finally, as icing on the cake, generative AI allows you to produce improvement scenarios to adapt to external changes. Importantly, the transformative potential of generative AI in the context of process improvement does not come from its ability to provide question-and-answer interfaces to query data. It comes from its ability to support continuous process adaptation by generating and validating hypotheses based on a holistic view of your organization.
In this talk, we will discuss how organizations are driving sustainable business value by strategically layering predictive, prescriptive, and generative AI onto a process mining foundation, one brick at a time.
Industry keynote talk by Marlon Dumas at the 5th International Conference on Process Mining (ICPM'2023), Rome, Italy, 25 October 2023
Discovery and Simulation of Business Processes with Probabilistic Resource Av...Marlon Dumas
In the field of business process simulation, the availability of resources is captured by assigning a calendar to each resource, e.g., Monday-Friday 9:00-18:00. Resources are assumed to be always available to perform activities during their calendar. This assumption often does not hold due to interruptions, breaks, or because resources time-share across multiple processes. A simulation model that captures availability via crisp time slots (a resource is either on or off during a slot) does not capture these behaviors, leading to inaccuracies in the simulation output. This paper presents a simulation approach wherein resource availability is modeled probabilistically. In this approach, each availability time slot is associated with a probability, allowing us to capture, for example, that a resource is available on Fridays between 14:00-15:00 with 90% probability and between 17:00-18:00 with 50% probability. The paper proposes an algorithm to discover probabilistic availability calendars from event logs. An empirical evaluation shows that simulation models with probabilistic calendars discovered from event logs, replicate the temporal distribution of activity instances and cycle times of a process more closely than simulation models with crisp calendars.
This presentation was delivered at the 5th International Conference on Process Mining (ICPM'2023), Rome, Italy, October 2023.
The paper is available at: https://easychair.org/publications/preprint/Rz9g
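The core idea of a probabilistic availability calendar can be sketched in a few lines. The slot granularity, dictionary layout, and probabilities below are illustrative assumptions for the sake of the example, not the paper's actual data model:

```python
import random

# A probabilistic availability calendar maps weekly time slots to the
# probability that the resource is actually available in that slot.
# Slots absent from the calendar are treated as "never available".
calendar = {
    ("Friday", "14:00-15:00"): 0.9,  # available with 90% probability
    ("Friday", "17:00-18:00"): 0.5,  # available with 50% probability
}

def is_available(cal, day, slot, rng=random):
    """Sample whether the resource is available in the given slot."""
    return rng.random() < cal.get((day, slot), 0.0)

# A crisp calendar is the special case where every probability is 0 or 1.
crisp = {("Monday", "09:00-10:00"): 1.0}
```

A simulator would call `is_available` each time it considers assigning work to the resource in a given slot, so low-probability slots contribute proportionally fewer worked hours.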
Can I Trust My Simulation Model? Measuring the Quality of Business Process Si...Marlon Dumas
Business Process Simulation (BPS) is an approach to analyze the performance of business processes under different scenarios. For example, BPS allows us to estimate what would be the cycle time of a process if one or more resources became unavailable. The starting point of BPS is a process model annotated with simulation parameters (a BPS model). BPS models may be manually designed, based on information collected from stakeholders and empirical observations, or automatically discovered from execution data. Regardless of its origin, a key question when using a BPS model is how to assess its quality. In this paper, we propose a collection of measures to evaluate the quality of a BPS model w.r.t. its ability to replicate the observed behavior of the process. We advocate an approach whereby different measures tackle different process perspectives. We evaluate the ability of the proposed measures to discern the impact of modifications to a BPS model, and their ability to uncover the relative strengths and weaknesses of two approaches for automated discovery of BPS models. The evaluation shows that the measures not only capture how close a BPS model is to the observed behavior, but they also help us to identify sources of discrepancies.
Presentation delivered by David Chapela-Campa at the BPM'2023 conference, Utrecht, September 2023.
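As a sketch of what one such distributional measure might look like (the paper defines its own collection of measures, which may differ), the empirical 1-D Wasserstein distance compares observed and simulated cycle times:

```python
def wasserstein_1d(xs, ys):
    """Empirical 1-D Wasserstein distance between two equal-size samples:
    the mean absolute difference between the sorted values."""
    assert len(xs) == len(ys) and xs
    xs, ys = sorted(xs), sorted(ys)
    return sum(abs(a - b) for a, b in zip(xs, ys)) / len(xs)

observed = [3.0, 5.0, 7.0, 9.0]    # cycle times (hours) from the event log
simulated = [3.5, 5.5, 7.5, 9.5]   # cycle times produced by the BPS model

distance = wasserstein_1d(observed, simulated)  # 0.5 hours of discrepancy
```

A lower distance means the BPS model replicates the observed cycle-time distribution more closely; separate measures would target other perspectives, such as control flow or temporal patterns.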
Business Process Optimization: Status and PerspectivesMarlon Dumas
For decades, business process optimization has been largely about art and craft (and sometimes wizardry). Apart from narrowly scoped approaches to optimize resource allocation (often assuming that workers behave like robots), a lot of business process optimization relies on high-level guidelines, with A/B testing for idea validation, which is hard to scale to complex processes. As a result, managers end up settling for a "good enough" process. Can we do more? In this talk, we review recent work on the use of high-fidelity simulation models discovered from execution data. The talk also explores the possibilities (and perils) that LLMs bring to the field of business process optimization.
This talk was delivered at the Workshop on Data-Driven Business Process Optimization at the BPM'2023 conference.
Learning When to Treat Business Processes: Prescriptive Process Monitoring wi...Marlon Dumas
Paper presentation at the 35th International Conference on Advanced Information Systems Engineering (CAiSE'2023).
Abstract.
Increasing the success rate of a process, i.e. the percentage of cases that end in a positive outcome, is a recurrent process improvement goal. At runtime, there are often certain actions (a.k.a. treatments) that workers may execute to lift the probability that a case ends in a positive outcome. For example, in a loan origination process, a possible treatment is to issue multiple loan offers to increase the probability that the customer takes a loan. Each treatment has a cost. Thus, when defining policies for prescribing treatments to cases, managers need to consider the net gain of the treatments. Also, the effect of a treatment varies over time: treating a case earlier may be more effective than later in a case. This paper presents a prescriptive monitoring method that automates this decision-making task. The method combines causal inference and reinforcement learning to learn treatment policies that maximize the net gain. The method leverages a conformal prediction technique to speed up the convergence of the reinforcement learning mechanism by separating cases that are likely to end up in a positive or negative outcome, from uncertain cases. An evaluation on two real-life datasets shows that the proposed method outperforms a state-of-the-art baseline.
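The net-gain reasoning above can be made concrete with a small sketch. The variable names and the closed-form rule are illustrative assumptions; the paper's method learns treatment policies via causal inference and reinforcement learning rather than this simple formula:

```python
def expected_net_gain(p_success_treated, p_success_untreated,
                      outcome_value, treatment_cost):
    """Expected net gain of treating a case: the uplift in the probability
    of a positive outcome, times the value of that outcome, minus the
    cost of the treatment."""
    uplift = p_success_treated - p_success_untreated
    return uplift * outcome_value - treatment_cost

# A naive policy: treat a case only when the expected net gain is positive.
worth_it = expected_net_gain(0.75, 0.25, 100.0, 10.0)      # 40.0: treat
not_worth_it = expected_net_gain(0.30, 0.25, 100.0, 10.0)  # negative: skip
```

This also makes the timing aspect visible: because the uplift typically shrinks as a case progresses, the same cost can flip the decision from "treat" to "skip" later in the case.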
Why am I Waiting? Data-Driven Analysis of Waiting Times in Business ProcessesMarlon Dumas
Presentation of a research paper at the 35th International Conference on Advanced Information Systems Engineering (CAiSE) in Zaragoza Spain. The paper presents a classification of causes of waiting times in business processes and a method to automatically detect and quantify the presence of each of these causes in a business process recorded in an event log.
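As a minimal sketch of the quantification step (the paper's classification of causes and detection method is considerably richer), the waiting time of each activity can be computed as the gap between when the activity was enabled and when it started:

```python
from datetime import datetime, timedelta

def waiting_times(events):
    """Total waiting time per activity, in seconds.
    events: dicts with 'activity', 'enabled', and 'start' timestamps."""
    out = {}
    for e in events:
        wait = (e["start"] - e["enabled"]).total_seconds()
        out[e["activity"]] = out.get(e["activity"], 0.0) + wait
    return out

t0 = datetime(2022, 1, 1, 9, 0)
log = [
    {"activity": "Review", "enabled": t0, "start": t0 + timedelta(minutes=30)},
    {"activity": "Review", "enabled": t0, "start": t0 + timedelta(minutes=90)},
]
totals = waiting_times(log)  # {"Review": 7200.0} -- 120 minutes in total
```

The method in the paper then attributes such waiting intervals to specific causes (e.g., batching, resource contention, or unavailability) rather than reporting only the totals.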
This talk introduces the concept of Augmented Business Process Management System: an ABPMS is a process-aware information system that relies on trustworthy AI technology to reason and act upon data, within a set of restrictions, with the aim to continuously adapt and improve a set of business processes with respect to one or more key performance indicators.
The talk describes the transition from existing process mining technology to AI-Augmented BPM as a pyramid, where predictive, prescriptive, conversational and reasoning capabilities are stacked up incrementally to reach the level of Augmented BPM.
Talk delivered at the AAAI'2023 Workshop on AI for Business Process Management.
Process Mining and Data-Driven Process SimulationMarlon Dumas
Guest lecture delivered at the Institut Teknologi Sepuluh on 8 December 2022.
This lecture gives an overview of process mining and simulation techniques, and how the two can be used together in process improvement projects.
Modeling Extraneous Activity Delays in Business Process SimulationMarlon Dumas
This paper presents a technique to enhance the fidelity of business process simulation models by detecting unexplained (extraneous) delays from business process execution data, and modeling these delays in the simulation model, via timer events.
The presentation was delivered at the 4th International Conference on Process Mining (ICPM'2022).
Paper available at: https://arxiv.org/abs/2206.14051
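An illustrative sketch of the underlying idea (not the paper's exact algorithm): the extraneous part of a waiting period is the time after both the case was enabled and the assigned resource became free, yet the activity still had not started:

```python
from datetime import datetime, timedelta

def extraneous_delay(enabled, resource_free, start):
    """Waiting time that is explained neither by the activity not yet being
    enabled nor by the resource being busy with other work."""
    ready = max(enabled, resource_free)
    return max(timedelta(0), start - ready)

t0 = datetime(2022, 6, 1, 10, 0)
d = extraneous_delay(
    enabled=t0,
    resource_free=t0 + timedelta(minutes=20),  # resource busy for 20 min
    start=t0 + timedelta(minutes=50),
)
# d == 30 minutes: that part of the wait has no visible explanation, so a
# timer event of roughly this duration would be added to the simulation model.
```
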
Business Process Simulation with Differentiated Resources: Does it Make a Dif...Marlon Dumas
Existing methods for discovering business process simulation models from execution data (event logs) assume that all resources in a pool have the same performance and share the same availability calendars. This paper proposes a method for discovering simulation models, wherein each resource is treated as an individual entity, with its own performance and availability calendar. An evaluation shows that simulation models with differentiated resources more closely replicate the distributions of cycle times and the work rhythm in a process than models with undifferentiated resources. The paper is available at: https://link.springer.com/chapter/10.1007/978-3-031-16103-2_24
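The difference between pooled and differentiated resources can be sketched as follows. The resource names, calendar encoding, and exponential duration model are hypothetical choices for illustration:

```python
import random

# With differentiated resources, each resource carries its own calendar and
# performance profile, instead of inheriting pool-level defaults.
resources = {
    "alice": {"calendar": {("Mon", "09-17")}, "mean_duration_min": 12.0},
    "bob":   {"calendar": {("Mon", "13-21")}, "mean_duration_min": 20.0},
}

def sample_duration(resources, name, rng):
    """Sample an activity duration from the resource's own mean
    (exponentially distributed, for the sake of the example)."""
    return rng.expovariate(1.0 / resources[name]["mean_duration_min"])

rng = random.Random(42)
fast = sum(sample_duration(resources, "alice", rng) for _ in range(1000)) / 1000
slow = sum(sample_duration(resources, "bob", rng) for _ in range(1000)) / 1000
# fast is close to 12 minutes, slow close to 20: the per-resource differences
# propagate into the simulated cycle times and work rhythm.
```
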
Prescriptive Process Monitoring Under Uncertainty and Resource ConstraintsMarlon Dumas
This paper presents an approach for triggering interventions at runtime, in order to improve the success rate of a process, when the number of resources who can perform these interventions is limited.
The paper is available at: https://link.springer.com/chapter/10.1007/978-3-031-16171-1_13
The presentation delivered at the 20th International Conference on Business Process Management (BPM'2022), in Muenster, Germany, September 2022.
Slides of a lecture delivered at the First Process Mining Summer School in Aachen, Germany, July 2022.
This lecture introduces techniques in the area of "task mining" with an emphasis on Robotic Process Mining. Robotic Process Mining (RPM) is a family of techniques to discover repetitive routines that can be automated using Robotic Process Automation (RPA) technology, by analyzing interactions between one or more workers and one or more software applications, during the performance of one or more tasks in a business process. In general, RPM techniques take as input logs of User Interactions (UI logs). These UI logs are recorded while workers interact with one or more applications, typically desktop applications. Based on these logs, RPM techniques produce specifications of one or more routines that can be automated using RPA or related tools.
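A toy approximation of the routine-discovery step: candidate automatable routines show up as frequently repeated sub-sequences of actions in a UI log. Real RPM techniques use far more context (action parameters, data flow between actions, etc.); the action names below are made up:

```python
from collections import Counter

def most_frequent_routine(ui_log, length):
    """Return the most frequent sub-sequence of the given length and its count."""
    grams = Counter(
        tuple(ui_log[i:i + length]) for i in range(len(ui_log) - length + 1)
    )
    return grams.most_common(1)[0]

ui_log = ["open_form", "copy_cell", "paste_field", "save",
          "open_form", "copy_cell", "paste_field", "save",
          "open_email"]
routine, count = most_frequent_routine(ui_log, 4)
# routine == ("open_form", "copy_cell", "paste_field", "save"), count == 2:
# a copy-paste routine repeated twice, a natural candidate for RPA automation.
```
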
Accurate and Reliable What-If Analysis of Business Processes: Is it Achievable?Marlon Dumas
In this talk, I discuss the problem of how to discover simulation models that can be used to, accurately and reliably, predict the impact of a change on a business process, e.g. what-if we automate an activity? what-if 10% of our workers become unavailable? I focus on recent approaches that exploit the availability of data in enterprise systems to address this question.
Learning Accurate Business Process Simulation Models from Event Logs via Auto...Marlon Dumas
Paper presentation at the International Conference on Advanced Information Systems Engineering (CAiSE).
This paper presents an approach to automatically discover business process simulation models from event logs by combining process mining and deep learning techniques.
Paper available at: https://link.springer.com/chapter/10.1007/978-3-031-07472-1_4
Process Mining: A Guide for PractitionersMarlon Dumas
Paper presentation delivered at the 16th International Conference on Research Challenges in Information Science (RCIS 2022). The paper studies the following questions:
1) What are the most common use cases for process mining methods?
2) What business questions do process mining methods address?
Paper available at:
https://link.springer.com/chapter/10.1007/978-3-031-05760-1_16
Data-Driven Analysis of Batch Processing Inefficiencies in Business ProcessesMarlon Dumas
Slides of a research paper presentation at the 16th International Conference on Research Challenges in Information Science (RCIS).
The research paper presents an approach to analyze event logs of business processes in order to identify batched activities and to analyze the waiting times caused by these activities.
Paper available at: https://link.springer.com/chapter/10.1007/978-3-031-05760-1_14
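The intuition behind batching analysis can be sketched in a few lines. This is a deliberately simplified illustration (the event shape, field names, and threshold are assumptions, not the paper's actual algorithm): instances of an activity that become ready at different times but all start at the same moment on the same resource suggest a batch, and the gap between enablement and start is the waiting time the batch causes.

```python
from collections import defaultdict

def batch_candidates(events, min_batch_size=3):
    """Flag groups of activity instances that look batched.

    events: list of dicts with keys case, activity, resource,
    enabled (time the activity became ready) and start (time work began).
    Returns {(activity, resource, start_time): total waiting time} for
    groups where at least min_batch_size cases start at the same instant.
    """
    groups = defaultdict(list)
    for e in events:
        groups[(e["activity"], e["resource"], e["start"])].append(
            e["start"] - e["enabled"]
        )
    return {
        key: sum(waits)
        for key, waits in groups.items()
        if len(waits) >= min_batch_size
    }

# Three claims become ready at different times but are all checked at t=10.
events = [
    {"case": "c1", "activity": "Check claim", "resource": "r1", "enabled": 2, "start": 10},
    {"case": "c2", "activity": "Check claim", "resource": "r1", "enabled": 5, "start": 10},
    {"case": "c3", "activity": "Check claim", "resource": "r1", "enabled": 9, "start": 10},
    {"case": "c4", "activity": "Pay claim", "resource": "r2", "enabled": 1, "start": 3},
]
batches = batch_candidates(events)
```

The flagged group accumulates 8 + 5 + 1 = 14 time units of waiting, which is the kind of batching-induced delay the paper's approach quantifies from real event logs.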
Optimización de procesos basada en datos (Marlon Dumas)
Talk delivered at BPM Day Lima 2021.
In this talk, we discuss emerging methods and applications in the field of data-driven process optimization. We cover advances in the area of process mining, methods for building digital twins of processes, and predictive monitoring methods. Through examples and case studies, we show how these methods can guide digital transformation and continuous process improvement initiatives. In particular, we illustrate the use of these methods to: (1) analyze the performance of business processes so as to identify friction points and automation opportunities; (2) predict the impact of changes, and in particular, predict the impact of an automation initiative; (3) make predictions about process performance and adjust process execution so as to prevent SLA violations, customer complaints, and other undesirable events.
Process Mining and AI for Continuous Process Improvement (Marlon Dumas)
Talk delivered at BPM Day Rio Grande do Sul on 11 November 2021.
Abstract.
Process mining is a technology that marries methods from business process management and from data science, to support operational excellence and digital transformation. Process mining tools can transform data extracted from enterprise systems, into visualizations and reports that allow managers to improve organizational performance along different dimensions, such as efficiency, quality, and compliance. In this talk, we will give an overview of the capabilities of process mining tools, and we will illustrate the benefits of process mining via several case studies in the fields of insurance, manufacturing, and IT service management.
Prescriptive Process Monitoring for Cost-Aware Cycle Time Reduction (Marlon Dumas)
Paper presentation at the 3rd International Conference on Process Mining (ICPM), 4 November 2021.
The paper is available at: https://arxiv.org/abs/2105.07111
Data Centers - Striving Within A Narrow Range - Research Report - MCG - May 2... (pchutichetpong)
M Capital Group ("MCG") expects demand to keep growing and supply to evolve, facilitated by institutional investment rotating out of offices and into work from home ("WFH"), while the need for data storage expands alongside global internet usage, with experts predicting 5.3 billion users by 2023. These market factors will be underpinned by technological changes, such as progressing cloud services and edge sites, allowing the industry to expect strong annual growth of 13% over the next 4 years.
Whilst competitive headwinds remain, exemplified by the recent second bankruptcy filing of Sungard, which blames "COVID-19 and other macroeconomic trends including delayed customer spending decisions, insourcing and reductions in IT spending, energy inflation and reduction in demand for certain services", the industry has seen key adjustments, and MCG believes that engineering cost management and technological innovation will be paramount to success.
MCG reports that the more favorable market conditions expected over the next few years, helped by the winding down of pandemic restrictions and a hybrid working environment, will drive market momentum forward. The continuous injection of capital by alternative investment firms, as well as growing infrastructure investment from cloud service providers and social media companies, whose revenues are expected to grow over 3.6x larger by value in 2026, will likely help propel data center provision and innovation. These factors paint a promising picture for the industry players that offset rising input costs and adapt to new technologies.
According to M Capital Group: “Specifically, the long-term cost-saving opportunities available from the rise of remote managing will likely aid value growth for the industry. Through margin optimization and further availability of capital for reinvestment, strong players will maintain their competitive foothold, while weaker players exit the market to balance supply and demand.”
Empowering the Data Analytics Ecosystem: A Laser Focus on Value
The data analytics ecosystem thrives when every component functions at its peak, unlocking the true potential of data. Here's a laser focus on key areas for an empowered ecosystem:
1. Democratize Access, Not Data:
Granular Access Controls: Provide users with self-service tools tailored to their specific needs, preventing data overload and misuse.
Data Catalogs: Implement robust data catalogs for easy discovery and understanding of available data sources.
2. Foster Collaboration with Clear Roles:
Data Mesh Architecture: Break down data silos by creating a distributed data ownership model with clear ownership and responsibilities.
Collaborative Workspaces: Utilize interactive platforms where data scientists, analysts, and domain experts can work seamlessly together.
3. Leverage Advanced Analytics Strategically:
AI-powered Automation: Automate repetitive tasks like data cleaning and feature engineering, freeing up data talent for higher-level analysis.
Right-Tool Selection: Strategically choose the most effective advanced analytics techniques (e.g., AI, ML) based on specific business problems.
4. Prioritize Data Quality with Automation:
Automated Data Validation: Implement automated data quality checks to identify and rectify errors at the source, minimizing downstream issues.
Data Lineage Tracking: Track the flow of data throughout the ecosystem, ensuring transparency and facilitating root cause analysis for errors.
5. Cultivate a Data-Driven Mindset:
Metrics-Driven Performance Management: Align KPIs and performance metrics with data-driven insights to ensure actionable decision making.
Data Storytelling Workshops: Equip stakeholders with the skills to translate complex data findings into compelling narratives that drive action.
Benefits of a Precise Ecosystem:
Sharpened Focus: Precise access and clear roles ensure everyone works with the most relevant data, maximizing efficiency.
Actionable Insights: Strategic analytics and automated quality checks lead to more reliable and actionable data insights.
Continuous Improvement: Data-driven performance management fosters a culture of learning and continuous improvement.
Sustainable Growth: Empowered by data, organizations can make informed decisions to drive sustainable growth and innovation.
By focusing on these precise actions, organizations can create an empowered data analytics ecosystem that delivers real value by driving data-driven decisions and maximizing the return on their data investment.
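To make the "automated data validation" point above concrete, here is a minimal, library-agnostic sketch (the rule names and record shape are illustrative assumptions): quality rules expressed as predicates and applied to each incoming record, so errors surface at the source rather than downstream.

```python
def validate(records, rules):
    """Apply named validation rules to each record.

    records: list of dicts; rules: {column: predicate}.
    Returns a list of (record_index, column) pairs that failed a rule.
    """
    violations = []
    for i, rec in enumerate(records):
        for column, is_valid in rules.items():
            if not is_valid(rec.get(column)):
                violations.append((i, column))
    return violations

# Illustrative rules: an age range check and a crude email shape check.
rules = {
    "age": lambda v: isinstance(v, int) and 0 <= v < 130,
    "email": lambda v: isinstance(v, str) and "@" in v,
}
records = [
    {"age": 34, "email": "ana@example.com"},
    {"age": -1, "email": "not-an-email"},
]
issues = validate(records, rules)
```

In a production ecosystem the same idea would typically run inside the ingestion pipeline, with failures routed back to the data owner identified in the catalog.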
Levelwise PageRank with Loop-Based Dead End Handling Strategy: SHORT REPORT ... (Subhajit Sahu)
Abstract — Levelwise PageRank is an alternative method of PageRank computation which decomposes the input graph into a directed acyclic block-graph of strongly connected components, and processes them in topological order, one level at a time. This enables calculation for ranks in a distributed fashion without per-iteration communication, unlike the standard method where all vertices are processed in each iteration. It however comes with a precondition of the absence of dead ends in the input graph. Here, the native non-distributed performance of Levelwise PageRank was compared against Monolithic PageRank on a CPU as well as a GPU. To ensure a fair comparison, Monolithic PageRank was also performed on a graph where vertices were split by components. Results indicate that Levelwise PageRank is about as fast as Monolithic PageRank on the CPU, but quite a bit slower on the GPU. Slowdown on the GPU is likely caused by a large submission of small workloads, and expected to be non-issue when the computation is performed on massive graphs.
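A minimal, sequential sketch of the idea summarized above (my simplification, assuming a graph with no dead ends; the report's actual method targets distributed execution): decompose the graph into strongly connected components and finalize ranks one component at a time in topological order. This is sound because a vertex's rank depends only on its in-neighbors, which sit in the same or an earlier component.

```python
def tarjan_sccs(adj):
    """Strongly connected components, emitted in reverse topological order."""
    n = len(adj)
    index, low, stack, on_stack, comps = {}, {}, [], set(), []
    counter = [0]

    def dfs(v):
        index[v] = low[v] = counter[0]
        counter[0] += 1
        stack.append(v)
        on_stack.add(v)
        for w in adj[v]:
            if w not in index:
                dfs(w)
                low[v] = min(low[v], low[w])
            elif w in on_stack:
                low[v] = min(low[v], index[w])
        if low[v] == index[v]:  # v is the root of a component
            comp = []
            while True:
                w = stack.pop()
                on_stack.discard(w)
                comp.append(w)
                if w == v:
                    break
            comps.append(comp)

    for v in range(n):
        if v not in index:
            dfs(v)
    return comps

def pagerank_monolithic(adj, d=0.85, tol=1e-12):
    """Standard power iteration over the whole graph (no dead ends)."""
    n = len(adj)
    r = [1.0 / n] * n
    while True:
        nxt = [(1 - d) / n] * n
        for u in range(n):
            share = d * r[u] / len(adj[u])
            for v in adj[u]:
                nxt[v] += share
        if max(abs(a - b) for a, b in zip(r, nxt)) < tol:
            return nxt
        r = nxt

def pagerank_levelwise(adj, d=0.85, tol=1e-12):
    """Finalize ranks one SCC at a time, in topological order."""
    n = len(adj)
    incoming = [[] for _ in range(n)]
    for u in range(n):
        for v in adj[u]:
            incoming[v].append(u)
    r = [1.0 / n] * n
    for comp in reversed(tarjan_sccs(adj)):  # sources first
        while True:  # iterate only within this component
            delta = 0.0
            for v in comp:
                new = (1 - d) / n + d * sum(r[u] / len(adj[u]) for u in incoming[v])
                delta = max(delta, abs(new - r[v]))
                r[v] = new
            if delta < tol:
                break
    return r

# Toy graph with three SCCs, {0,1} -> {2,3} -> {4,5}; no dead ends.
adj = [[1, 2], [0], [3], [2, 4], [5], [4]]
```

Both routines converge to the same fixed point; the levelwise variant just never revisits a component once its level is done, which is what removes the per-iteration global communication in the distributed setting.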
Opendatabay - Open Data Marketplace.pptx (Opendatabay)
Opendatabay.com unlocks the power of data for everyone. Open Data Marketplace fosters a collaborative hub for data enthusiasts to explore, share, and contribute to a vast collection of datasets.
Opendatabay is the first open hub for data enthusiasts to collaborate and innovate: a platform to explore, share, and contribute to a vast collection of datasets. Through robust quality control and innovative technologies like blockchain verification, Opendatabay ensures the authenticity and reliability of datasets, empowering users to make data-driven decisions with confidence. It leverages cutting-edge AI technologies to enhance the data exploration, analysis, and discovery experience.
From intelligent search and recommendations to automated data productisation and quotation, Opendatabay's AI-driven features streamline the data workflow. Finding the data you need shouldn't be complex: Opendatabay simplifies the data acquisition process with an intuitive interface and robust search tools. Effortlessly explore, discover, and access the data you need, allowing you to focus on extracting valuable insights. Opendatabay also breaks new ground with dedicated, AI-generated synthetic datasets.
These privacy-preserving datasets can be used for training and testing AI models without compromising sensitive information. Opendatabay prioritizes transparency by providing detailed metadata, provenance information, and usage guidelines for each dataset, ensuring users have a comprehensive understanding of the data they're working with. By leveraging a powerful combination of distributed ledger technology and rigorous third-party audits, Opendatabay ensures the authenticity and reliability of every dataset. Security is at the core of Opendatabay: the marketplace implements stringent security measures, including encryption, access controls, and regular vulnerability assessments, to safeguard your data and protect your privacy.
1. Process Mining for Process Improvement - An Evaluation of Analysis Practices
Kateryna Kubrak, Fredrik Milani, Alexander Nolte
University of Tartu, Narva mnt 18, 51009 Tartu, Estonia
16th International Conference on Research Challenges in Information Science (RCIS 2022)
2. Introduction
A day in Anna's life before process mining …
Anna, Process Analyst at ABC:
"I need to schedule a meeting to get the inputs from manager X."
"I still have those 50 pages of documentation left to read."
"The presentation of the diagrams is due Monday, so I better get to it."
3. Introduction
A day in Anna's life with process mining …
Anna, with a process mining tool:
"No extensive reading of documentation? No manual diagramming? Great!"
6. Problem Statement
Previous work
● Studies on practical aspects of process mining, such as process managers' perception of adopting, using, and managing process mining [1]
● Studies on how process mining is used by organizations [2]
● The need to research teams and skills needed for successful process mining projects has been highlighted [3]
[1] Grisold, T., Mendling, J., Otto, M., vom Brocke, J.: Adoption, use and management of process mining in practice. Bus. Process. Manag. J. 27(2), 369–387 (2021)
[2] Thiede, M., Fuerstenau, D., Barquet, A.P.B.: How is process mining technology used by organizations? A systematic literature review of empirical studies. Bus. Process. Manag. J. 24(4), 900–922 (2018)
[3] Martin, N., Fischer, D.A., Kerpedzhiev, G.D., Goel, K., Leemans, S.J.J., Röglinger, M., van der Aalst, W.M.P., Dumas, M., Rosa, M.L., Wynn, M.T.: Opportunities and challenges for process mining in organizations: Results of a Delphi study. Bus. Inf. Syst. Eng. 63(5), 511–527 (2021)
7. Problem Statement
Research objective: How do process analysts work with process mining when engaged in process improvement initiatives?
● RQ1 How do process analysts use process mining to identify improvement opportunities?
● RQ2 How do process analysts use process mining to select improvement opportunities to address?
● RQ3 How do process analysts use process mining to communicate their findings?
8. Method - Study Setup
Exploratory interview study with 7 participants.
Participants were selected based on their role (working internally in the company or as a consultant) and their experience as a process analyst.
Semi-structured online interviews.
9. Method - Study Setup
Code | Domain | Project | Study role (experience)
I-01 | Electrical Engineering | Improving order-to-fulfillment process | Internal process analyst (2 years)
I-02 | Insurance | Improving claim-to-resolution process | Internal process analyst (1 year)
I-03 | Public Services | Improving application-to-approval process for immigration | Internal process analyst (1 year)
I-04 | Data Science | Improving application-to-approval process | Consultant (4 years)
I-05 | Auditing | Analyzing claim-to-resolution process at a regional paying agency | Consultant (2 years)
I-06 | Process Mining | Analyzing standardization and harmonization of processes | Consultant (5 years)
I-07 | E-Commerce | Improving order-to-cash process | Internal process analyst (1 year)
10. Method - Data Collection
Interview guide based on RQs (sample questions):
RQ1 – Improvement opportunity identification
● "Were any visualizations used to help identify the improvement opportunity?"
● "What challenges did you have when you tried to analyze the process?"
RQ2 – Improvement opportunity prioritization
● "How was it decided which improvement opportunity to address?"
● "Who made this decision?"
RQ3 – Improvement opportunity communication
● "Who were the results presented to?"
● "How did you present your results?"
11. Results
RQ1 – Finding 1: Identify improvement opportunities using structured methods. Develop own methods based on experience.
12. Results
RQ1 – Finding 2: Divide the problem into sub-problems to investigate them separately.
13. Results
RQ1 – Finding 3: Visually analyze discovered process models in process mining tools. Use other tools for an advanced view into the analyzed data and visualizations.
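The simplest "discovered process model" that such tools visualize is the directly-follows graph, which takes only a few lines to compute. The log format here is a simplified assumption rather than any specific tool's input schema:

```python
from collections import Counter

def directly_follows_graph(log):
    """Count directly-follows pairs across all traces.

    log: {case_id: [activity, ...]} with activities in timestamp order.
    Returns a Counter mapping (activity_a, activity_b) -> frequency.
    """
    dfg = Counter()
    for trace in log.values():
        for a, b in zip(trace, trace[1:]):
            dfg[(a, b)] += 1
    return dfg

# Toy event log with a rework loop in case c2.
log = {
    "c1": ["Receive order", "Check stock", "Ship"],
    "c2": ["Receive order", "Check stock", "Reorder", "Check stock", "Ship"],
}
dfg = directly_follows_graph(log)
```

Process mining tools render exactly this structure as the arcs of their process map, usually annotated with these frequencies or with average durations.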
14. Results
RQ1 – Finding 4: Find a compromise between the domain knowledge and process mining outlook of the problem.
15. Results
RQ2 – Finding 1: Assess the impact of the finding on the process in terms of its location and the number of cases and variants involved.
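One way to operationalize this finding (the log format and the specific metrics are my assumptions, not the interviewees' exact practice): for an opportunity tied to a given activity, report the share of cases and of distinct variants that contain that activity.

```python
from collections import Counter

def impact(log, activity):
    """Share of cases and of distinct variants containing an activity.

    log: {case_id: [activity, ...]} with activities in timestamp order.
    """
    case_share = sum(activity in trace for trace in log.values()) / len(log)
    variants = Counter(tuple(trace) for trace in log.values())
    variant_share = sum(activity in v for v in variants) / len(variants)
    return case_share, variant_share

# "Rework" occurs in 1 of 3 cases and in 1 of 2 distinct variants.
log = {
    "c1": ["Apply", "Check", "Approve"],
    "c2": ["Apply", "Check", "Rework", "Check", "Approve"],
    "c3": ["Apply", "Check", "Approve"],
}
case_share, variant_share = impact(log, "Rework")
```

A finding that touches many cases but few variants points at a systematic problem in a mainstream path, while the reverse points at exceptional behavior.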
16. Results
RQ2 – Finding 2: Analyze the dependency on entities outside of the process or the organization.
18. Results
RQ3 – Finding 1: Use storytelling to present the improvement opportunities and select visual representations according to the story.
19. Results
RQ3 – Finding 2: Adjust the communication to the client's needs (i.e., including more technical or business details).
20. Discussion
To identify improvement opportunities (RQ1)
● Take a structured approach and consider adjusting an existing method to your own projects.
● Decompose the big problem into separate problems and investigate them separately.
● Apply a business-driven rationale when using process mining for analysis.
● Consult other tools for deeper insight into the data, where required.
21. Discussion
To select improvement opportunities (RQ2)
● Analyze the improvement opportunity with respect to its location and the proportion of cases/variants affected.
● Consider the improvement opportunity's dependency on outside entities.
● Estimate potential savings and costs.
22. Discussion
To communicate improvement opportunities (RQ3)
● Consider structuring the communication using storytelling.
● Adjust the story with respect to the audience's knowledge of process mining and the domain.
23. Implications
For developers of process mining tools
Process mining tools can be enhanced to use visualization and patterns to:
● facilitate analysis of dependencies of a process with other processes,
● improve visualizations for communication purposes,
● and incorporate support for financial implications of the processes.
24. Implications
For researchers
Directions for further research to facilitate process analysts' work with improvement opportunities:
● process mining templates to help identify common improvement opportunities,
● algorithms for data-driven discovery of improvement opportunities.
25. Conclusion
To identify improvement opportunities: Take a structured approach and consider adjusting an existing method to your own projects. Decompose the big problem into separate problems and investigate them separately. Apply a business-driven rationale when using process mining for analysis. Consult other tools for deeper insight into the data, where required.
To prioritize improvement opportunities: Analyze the improvement opportunity with respect to its location and the proportion of cases/variants affected. Consider the improvement opportunity's dependency on outside entities. Estimate potential savings and costs.
To communicate improvement opportunities: Consider structuring the communication using storytelling. Adjust the story with respect to the audience's knowledge of process mining and the domain.
Future work: exploring how visualization can be used to facilitate the identification of improvement opportunities.
26. Thank you!
Kateryna Kubrak
kateryna.kubrak@ut.ee
PhD Student, Institute of Computer Science, University of Tartu
Icons by Freepik on flaticon.com
Link to paper: https://link.springer.com/chapter/10.1007/978-3-031-05760-1_13