While generative AI grabs headlines, most organizations have yet to achieve continuous process improvement from predictive and prescriptive analytics.
Why? It’s largely about data, people, and a methodical approach to deploying AI to connect the two. The good news is that if your organization has built a process mining capability, you are well placed to climb the ladder toward AI-driven process optimization. But to get there, you need a disciplined, step-by-step approach along two tracks: a tactical management track and an operational management track.
First, it’s about predicting what will happen if you leave your process as-is, and what will happen if you implement a change to your process. At a tactical level, a predictive capability allows you to prioritize improvement opportunities. At an operational level, it allows you to predict issues, such as deadline violations. The challenges here are how to manage the inherent uncertainty of data-driven AI systems, and how to change your people and culture to manage processes proactively rather than reactively. Deploying predictive dashboards is one thing; getting people to use them effectively to improve processes is another thing entirely.
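To make the operational side of this concrete, here is a minimal, hypothetical sketch of predictive monitoring: it estimates the remaining time of a running case from historical cases that passed through the same activity, and flags a likely deadline (SLA) violation. The event log, activity names, and averaging heuristic are illustrative assumptions, not the method of any particular tool.

```python
from statistics import mean

# Hypothetical historical event log: case_id -> list of (activity, timestamp in hours)
history = {
    "c1": [("Submit", 0), ("Review", 10), ("Approve", 30)],
    "c2": [("Submit", 0), ("Review", 26), ("Approve", 60)],
    "c3": [("Submit", 0), ("Review", 12), ("Approve", 34)],
}

def avg_remaining_after(activity):
    """Average hours from `activity` to case completion in the historical log."""
    deltas = []
    for events in history.values():
        times = dict(events)
        if activity in times:
            deltas.append(events[-1][1] - times[activity])
    return mean(deltas)

def predict_violation(elapsed_hours, last_activity, deadline_hours):
    """Flag a running case whose predicted total duration exceeds its deadline."""
    predicted_total = elapsed_hours + avg_remaining_after(last_activity)
    return predicted_total > deadline_hours

# A running case that reached "Review" after 20h, against a 45h SLA:
print(predict_violation(20, "Review", 45))  # → True (predicted ~45.3h)
```

A real predictive monitoring system would replace the historical average with a trained model over case features, but the shape of the decision is the same: predicted outcome versus target, raised while the case can still be influenced.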
Next, it’s about becoming preemptive: continuously optimizing your processes by leveraging streams of data-driven recommendations to trigger changes and actions. At the tactical level, this prescriptive capability allows you to implement the right changes to maximize competing KPIs. At the operational level, it means triggering interventions in your processes to “wow” customers and to meet SLAs in a cost-effective manner. The challenge here is how to help process owners, workers, and other stakeholders understand the causes of performance issues, and how the recommendations generated by the AI-driven optimization system will tackle those causes.
And finally, as the icing on the cake, generative AI allows you to produce improvement scenarios to adapt to external changes. Importantly, the transformative potential of generative AI in the context of process improvement does not come from its ability to provide question-and-answer interfaces to query data. It comes from its ability to support continuous process adaptation by generating and validating hypotheses based on a holistic view of your organization.
In this talk, we will discuss how organizations are driving sustainable business value by strategically layering predictive, prescriptive, and generative AI onto a process mining foundation, one brick at a time.
Industry keynote talk by Marlon Dumas at the 5th International Conference on Process Mining (ICPM'2023), Rome, Italy, 25 October 2023
This talk introduces the concept of an Augmented Business Process Management System (ABPMS): a process-aware information system that relies on trustworthy AI technology to reason and act upon data, within a set of restrictions, with the aim to continuously adapt and improve a set of business processes with respect to one or more key performance indicators.
The talk describes the transition from existing process mining technology to AI-Augmented BPM as a pyramid, where predictive, prescriptive, conversational and reasoning capabilities are stacked up incrementally to reach the level of Augmented BPM.
Talk delivered at the AAAI'2023 Workshop on AI for Business Process Management.
Process Mining and AI for Continuous Process Improvement - Marlon Dumas
Talk delivered at BPM Day Rio Grande do Sul on 11 November 2021.
Abstract.
Process mining is a technology that marries methods from business process management and data science to support operational excellence and digital transformation. Process mining tools can transform data extracted from enterprise systems into visualizations and reports that allow managers to improve organizational performance along different dimensions, such as efficiency, quality, and compliance. In this talk, we will give an overview of the capabilities of process mining tools, and we will illustrate the benefits of process mining via several case studies in the fields of insurance, manufacturing, and IT service management.
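As a concrete illustration of the kind of transformation these tools perform, the sketch below builds a directly-follows graph (the counts behind the typical process-map visualization) from a toy event log. The traces and activity names are hypothetical.

```python
from collections import Counter

# Toy event log: each trace is the ordered sequence of activities in one case.
traces = [
    ["Receive", "Check", "Approve", "Pay"],
    ["Receive", "Check", "Reject"],
    ["Receive", "Check", "Approve", "Pay"],
]

def directly_follows(traces):
    """Count how often activity a is immediately followed by activity b."""
    dfg = Counter()
    for trace in traces:
        for a, b in zip(trace, trace[1:]):
            dfg[(a, b)] += 1
    return dfg

dfg = directly_follows(traces)
print(dfg[("Receive", "Check")])   # → 3
print(dfg[("Check", "Approve")])   # → 2
```

Commercial tools layer filtering, performance overlays, and conformance checks on top, but this edge-frequency map is the common starting point.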
Process Mining 2.0: From Insights to Actions - Marlon Dumas
Keynote talk at the workshop on Artificial Intelligence for Enterprise Process Transformation in conjunction with the PAKDD'2021 conference. The talk focuses on the move from process mining as a descriptive analytics approach, to process mining as a predictive and prescriptive analytics technology for automated process improvement.
Slides supporting the book "Process Mining: Discovery, Conformance, and Enhancement of Business Processes" by Wil van der Aalst. See also http://springer.com/978-3-642-19344-6 (ISBN 978-3-642-19344-6) and the website http://www.processmining.org/book/start providing sample logs.
Slides of a lecture delivered at the First Process Mining Summer School in Aachen, Germany, July 2022.
This lecture introduces techniques in the area of "task mining", with an emphasis on Robotic Process Mining. Robotic Process Mining (RPM) is a family of techniques to discover repetitive routines that can be automated using Robotic Process Automation (RPA) technology, by analyzing interactions between one or more workers and one or more software applications during the performance of one or more tasks in a business process. In general, RPM techniques take as input logs of user interactions (UI logs). These UI logs are recorded while workers interact with one or more applications, typically desktop applications. Based on these logs, RPM techniques produce specifications of one or more routines that can be automated using RPA or related tools.
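A heavily simplified illustration of the RPM idea: scan a hypothetical UI log for the most frequent window of consecutive actions, as a crude proxy for a repetitive routine that might be automatable. Real RPM techniques use far more robust discovery methods; the log and window length here are illustrative assumptions.

```python
from collections import Counter

# Hypothetical UI log: one worker's recorded stream of interactions.
ui_log = [
    "open_excel", "copy_cell", "switch_app", "paste_field", "save",
    "open_excel", "copy_cell", "switch_app", "paste_field", "save",
    "check_email",
    "open_excel", "copy_cell", "switch_app", "paste_field", "save",
]

def candidate_routines(log, n):
    """Count every length-n window of consecutive actions in the log."""
    return Counter(tuple(log[i:i + n]) for i in range(len(log) - n + 1))

# The most frequent 5-action window is a candidate routine for RPA automation.
routine, freq = candidate_routines(ui_log, 5).most_common(1)[0]
print(routine, freq)  # the copy-paste routine, seen 3 times
```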
Business Process Modelling research presentation introducing process mining and how it helps organizations, in the monitoring phase, measure their effectiveness and efficiency and further develop their To-Be models.
Introduction to Business Process Monitoring and Process Mining - Marlon Dumas
Two-day course delivered at the Chinese Business Process Management (BPM) Summer School in Jinan, China, 23-24 August 2018. The course introduces a range of techniques, tools, and algorithms for process monitoring and mining.
Gartner EA: The Rise of Data-driven Architectures - LeanIX GmbH
LeanIX CEO André Christ's presentation from the 2019 Gartner Enterprise Architecture & Technology Innovation Summit in Orlando: Changing demands on Enterprise Architects require different approaches to tooling. The need to provide fast, smart answers to challenging business questions means switching from diagram-driven to data-driven architecture. The switch takes architecture from being used by the few, to a point where your whole organization is benefiting from and using the architecture you create every day!
This describes a conceptual-model approach to designing an enterprise data fabric: the set of hardware and software infrastructure, tools, and facilities used to implement, administer, manage, and operate data operations across the entire span of the enterprise's data. It covers all data activities, including acquisition, transformation, storage, distribution, integration, replication, availability, security, protection, disaster recovery, presentation, analytics, preservation, retention, backup, retrieval, archival, recall, deletion, monitoring, and capacity planning, across all data storage platforms, enabling applications to meet the data needs of the enterprise.
The conceptual data fabric model represents a rich picture of the enterprise’s data context. It embodies an idealised and target data view.
Designing a data fabric enables the enterprise to respond to and take advantage of key related data trends:
• Internal and External Digital Expectations
• Cloud Offerings and Services
• Data Regulations
• Analytics Capabilities
It enables the IT function to demonstrate positive data leadership. It shows the IT function is able and willing to respond to business data needs. It allows the enterprise to meet data challenges:
• More and more data of many different types
• Increasingly distributed platform landscape
• Compliance and regulation
• Newer data technologies
• Shadow IT where the IT function cannot deliver IT change and new data facilities quickly
It is concerned with designing an open and flexible data fabric that improves the responsiveness of the IT function and reduces shadow IT.
DAS Slides: Enterprise Architecture vs. Data Architecture - DATAVERSITY
Enterprise Architecture (EA) provides a visual blueprint of the organization, and shows key inter-relationships between data, process, applications, and more. By abstracting these assets in a graphical view, it’s possible to see key interrelationships, particularly as they relate to data and its business impact across the organization. Join us for a discussion on how Data Architecture is a key component of an overall enterprise architecture for enhanced business value and success.
Use Case: Airbus and Process Mining Technology - Celonis
Within the framework of its digital transformation, Airbus is using Celonis software to gain a better understanding of its processes and maximize improvement. In this session we will discuss how our ERP Solution Center is organized to answer all the organization’s business needs for process mining. We will also provide you with an overview of business-area uses for Celonis at Airbus and a concrete use case of process improvement at the operational level.
Presenters:
Mr. Gildas Lavergne, Head of ERP Solution Center Technologies, Airbus
Ms. Xiaowei Jiang, Product Owner of Process Mining Solutions, Airbus
How to Take Advantage of the Unique Celonis Ecosystem - Celonis
What sets Celonis apart from standard process mining technology providers? Aside from our innovative intelligent process mining and cloud solution, we have an incredible resource in the form of talented professionals--employees, partners, and customers--who support and guide our efforts to facilitate successful business transformation. In this session, learn more about this unique ecosystem and how it can add value to your process mining journey.
Presenter:
Sebastian Walter, VP Professional Services & Customer Success, Celonis
Developing & Deploying Effective Data Governance Framework - Kannan Subbiah
This is the slide deck presented at the Customer Privacy and Data Protection India Summit 2019 held in Mumbai, India. The specific topics touched upon are the guiding principles, Aligning with Data Architecture, Data Quality & Compliance.
Presentation (jointly with Claudio Di Ciccio) on "Declarative Process Mining", as part of the 1st Summer School in Process Mining (http://www.process-mining-summer-school.org). The presentation summarizes 15 years of research in declarative process mining, covering declarative process modeling, reasoning on declarative process specifications, discovery of process constraints from event logs, conformance checking, and monitoring of process constraints at runtime. This is done without ad-hoc algorithms, but by relying on well-established techniques at the intersection of formal methods, artificial intelligence, and data science.
This slide deck was presented at the annual international conference of itSMF Slovensko on May 6th in Bratislava. It gives an introduction to Process Mining as a useful new approach to discovering real-life processes in IT Service Management and everywhere else processes are driven by tools that provide log-file information.
Many thanks to Anne Rozinat http://fluxicon.com for the graphs and information she provided to itSMF Austria. Many thanks to Celonis for providing a demo application.
Please see the further links and recommendations at the end of the presentation.
Creating business blueprints in Solution Manager has traditionally been such a time consuming task that most organizations have given up on the idea. But what if you could automatically create and update blueprints? During this webinar we discussed how the latest technology makes it possible.
Build Real-Time Applications with Databricks Streaming - Databricks
In this presentation, we will study a use case we recently implemented with a large, metropolitan fire department. Our company has already created a complete analytics architecture for the department based upon Azure Data Factory, Databricks, Delta Lake, Azure SQL, and Azure SQL Server Analysis Services (SSAS). While this architecture works very well for the department, they would like to add a real-time channel to their reporting infrastructure.
This channel should serve up the following information:
• The most up-to-date locations and status of equipment (fire trucks, ambulances, ladders, etc.)
• The current locations and status of firefighters, EMT personnel and other relevant fire department employees
• The current list of active incidents within the city
The above information should be visualized through an automatically updating dashboard. The central component of the dashboard will be a map that automatically updates with the locations and incidents. This view should be as close to real time as possible and will be used by the fire chiefs to assist with real-time decision-making on resource and equipment deployments.
In this presentation, we will leverage Databricks, Spark Structured Streaming, Delta Lake and the Azure platform to create this real-time delivery channel.
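Independently of the Spark/Databricks stack described above, the core "latest state" view the dashboard needs can be sketched in plain Python: fold a stream of status events into the most recent status per unit, which is essentially what a stateful streaming aggregation maintains. The event schema and unit names are hypothetical.

```python
# Hypothetical stream of status events from units in the field.
events = [
    {"unit": "truck-7", "status": "en_route", "ts": 1},
    {"unit": "amb-2",   "status": "idle",     "ts": 2},
    {"unit": "truck-7", "status": "on_scene", "ts": 3},
]

def latest_state(stream):
    """Reduce a stream of status events to the latest status per unit."""
    state = {}
    for e in sorted(stream, key=lambda e: e["ts"]):
        state[e["unit"]] = e["status"]
    return state

print(latest_state(events))  # → {'truck-7': 'on_scene', 'amb-2': 'idle'}
```

In the actual architecture, Structured Streaming would keep this state incrementally and Delta Lake would persist it for the dashboard, rather than re-sorting a batch as this sketch does.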
Slides supporting the book "Process Mining: Discovery, Conformance, and Enhancement of Business Processes" by Wil van der Aalst. See also http://springer.com/978-3-642-19344-6 (ISBN 978-3-642-19344-6) and the website http://www.processmining.org/book/start providing sample logs.
The Analytics CoE: Positioning your Business Analytics Program for Success - Cartegraph
This Loras College Business Analytics Symposium breakout session presentation by Kiran Garimella, Ph.D., president and founder of XBITALIGN, explored the analytics center of excellence (CoE).
A business analytics program is more than the application of data science and Big Data technology to data. Success should be measured not only by the valuable insights the program delivers, but also by how well it is sustained and how much the ‘analytics mindset’ becomes part of the company’s DNA. The journey is not only from data to information, but also from information to knowledge, and from knowledge to intelligence. The foundation for making this happen is a well-structured Analytics Center of Excellence (CoE).
Data Architecture, Solution Architecture, Platform Architecture — What’s the ... - DATAVERSITY
A solid data architecture is critical to the success of any data initiative. But what is meant by “data architecture”? Throughout the industry, there are many different “flavors” of data architecture, each with its own unique value and use cases for describing key aspects of the data landscape. Join this webinar to demystify the various architecture styles and understand how they can add value to your organization.
Got data?… now what? An introduction to modern data platforms - JamesAnderson599331
What are Data Analytics Platforms? What decision points are necessary in creating a modern, unified analytics data platform? What benefits are there to building your analytics data platform on Google Cloud Platform? Susan Pierce walks us through it all.
This deck provides a high-level framework to implement business process redesign within a business transformation initiative. It shows how to establish the team, define the approach, and identify some of the deliverables within this track of work.
We present our solution for building an AI Architecture that provides engineering teams the ability to leverage data to drive insight and help our customers solve their problems. We started with siloed data, entities that were described differently by each product, different formats, complicated security and access schemes, data spread over numerous locations and systems.
Product-thinking is making a big impact in the data world with the rise of Data Products, Data Product Managers, data mesh, and treating “Data as a Product.” But Honest, No-BS: What is a Data Product? And what key questions should we ask ourselves while developing them? Tim Gasper (VP of Product, data.world), will walk through the Data Product ABCs as a way to make treating data as a product way simpler: Accountability, Boundaries, Contracts and Expectations, Downstream Consumers, and Explicit Knowledge.
DataTalkClub Conference, Feb 12 2021
Creating a machine learning model is not an easy task.
Creating a useful machine learning model that gets into production and generates actual business value is an even harder one.
There are many ways for an ML project or product to fail even when the data is there and the model technically performs well. From the wrong problem statement to lack of trust from stakeholders, in this talk I will discuss what issues to look out for, and how to avoid them.
Governance, Risk and Compliance and you | CollabDays Bletchley Park 2022 - Nikki Chapple
5 October 2022: CollabDays Bletchley Park 2022 - October edition | In-person event United Kingdom
Governance, Risk and Compliance and you – Microsoft Purview and beyond | Simon Hudson & Nikki Chapple
Governance, Risk and Compliance: it’s not a nice-to-have, it’s the law. Every organisation needs to pay attention to GRC, but not everyone has the tools, expertise, or strategy. Microsoft Purview is a surprisingly capable tool in your organisation’s GRC tool bag when combined with a broad and competent approach. This session will provide:
• an overview of GRC obligations and approaches
• what’s in Purview
• pragmatic approaches to elevating your Compliance Score
• wider technical and business thinking for de-risking your operations and organisation
• thoughts on using the Maturity Model for Microsoft 365 GRC Competency to set your objectives
This describes a conceptual model approach to designing an enterprise data fabric. This is the set of hardware and software infrastructure, tools and facilities to implement, administer, manage and operate data operations across the entire span of the data within the enterprise across all data activities including data acquisition, transformation, storage, distribution, integration, replication, availability, security, protection, disaster recovery, presentation, analytics, preservation, retention, backup, retrieval, archival, recall, deletion, monitoring, capacity planning across all data storage platforms enabling use by applications to meet the data needs of the enterprise.
The conceptual data fabric model represents a rich picture of the enterprise’s data context. It embodies an idealised and target data view.
Designing a data fabric enables the enterprise respond to and take advantage of key related data trends:
• Internal and External Digital Expectations
• Cloud Offerings and Services
• Data Regulations
• Analytics Capabilities
It enables the IT function demonstrate positive data leadership. It shows the IT function is able and willing to respond to business data needs. It allows the enterprise to meet data challenges
• More and more data of many different types
• Increasingly distributed platform landscape
• Compliance and regulation
• Newer data technologies
• Shadow IT where the IT function cannot deliver IT change and new data facilities quickly
It is concerned with the design an open and flexible data fabric that improves the responsiveness of the IT function and reduces shadow IT.
DAS Slides: Enterprise Architecture vs. Data ArchitectureDATAVERSITY
Enterprise Architecture (EA) provides a visual blueprint of the organization, and shows key inter-relationships between data, process, applications, and more. By abstracting these assets in a graphical view, it’s possible to see key interrelationships, particularly as they relate to data and its business impact across the organization. Join us for a discussion on how Data Architecture is a key component of an overall enterprise architecture for enhanced business value and success.
Use Case: Airbus and Process Mining TechnologyCelonis
Within the framework of its digital transformation, Airbus is using Celonis software to gain a better understanding of its processes to maximize improvement. In this session we will discuss how our ERP Solution Center is organized to answer all the organization’s business needs for process mining. We will also provide you with an overview of business area uses for Celonis at Airbus and provide a concrete use case of process improvement at the operation level.
Presenters:
Mr. Gildas Lavergne, Head of ERP Solution Center Technologies, Airbus
Ms. Xiaowei Jiang, Product Owner of Process Mining Solutions, Airbus
How to Take Advantage of the Unique Celonis EcosystemCelonis
What sets Celonis apart from standard process mining technology providers? Aside from our innovative intelligent process mining and cloud solution, we have an incredible resource in the form of talented professionals--employees, partners, and customers--who support and guide our efforts to facilitate successful business transformation. In this session, learn more about this unique ecosystem and how it can add value to your process mining journey.
Presenter:
Sebastian Walter, VP Professional Services & Customer Success, Celonis
Developing & Deploying Effective Data Governance FrameworkKannan Subbiah
This is the slide deck presented at the Customer Privacy and Data Protection India Summit 2019 held in Mumbai, India. The specific topics touched upon are the guiding principles, Aligning with Data Architecture, Data Quality & Compliance.
Presentation (jointly with Claudio Di Ciccio) on "Declarative Process Mining", as part of the 1st Summer School in Process Mining (http://www.process-mining-summer-school.org). The Presentation summarizes 15 years of research in declarative process mining, covering declarative process modeling, reasoning on declarative process specifications, discovery of process constraints from event logs, conformance checking and monitoring of process constraints at runtime. This is done without ad-hoc algorithms, but relying on well-established techniques at the intersection of formal methods, artificial intelligence, and data science.
This Slide Deck was presented at the annual international conference of itSMF Slovensko on May, 6th. in Bratislava. It gives an introduction into Process Mining as a new useful approach to discover real life processes in IT Service Management end everywhere else where processes are driven by tools providing log file information.
Many thanks to Anne Rozinat http://fluxicon.com for the graphs and information she provided to itSMF Austria. Many thanks to Celonis for providing a demo application.
Please recognize the further links and recommendations at the end of the presentation.
Creating business blueprints in Solution Manager has traditionally been such a time consuming task that most organizations have given up on the idea. But what if you could automatically create and update blueprints? During this webinar we discussed how the latest technology makes it possible.
Build Real-Time Applications with Databricks StreamingDatabricks
In this presentation, we will study a recent use case we implemented recently. In this use case we are working with a large, metropolitan fire department. Our company has already created a complete analytics architecture for the department based upon Azure Data Factory, Databricks, Delta Lake, Azure SQL and Azure SQL Server Analytics Services (SSAS). While this architecture works very well for the department, they would like to add a real-time channel to their reporting infrastructure.
This channel should serve up the following information: •The most up-to-date locations and status of equipment (fire trucks, ambulances, ladders etc.)
• The current locations and status of firefighters, EMT personnel and other relevant fire department employees
• The current list of active incidents within the city The above information should be visualized through an automatically updating dashboard. The central component of the dashboard will be map which automatically updates with the locations and incidents. This view should be as real-time as possible and will be used by the fire chiefs to assist with real-time decision-making on resource and equipment deployments.
In this presentation, we will leverage Databricks, Spark Structured Streaming, Delta Lake and the Azure platform to create this real-time delivery channel.
Slides supporting the book "Process Mining: Discovery, Conformance, and Enhancement of Business Processes" by Wil van der Aalst. See also http://springer.com/978-3-642-19344-6 (ISBN 978-3-642-19344-6) and the website http://www.processmining.org/book/start providing sample logs.
The Analytics CoE: Positioning your Business Analytics Program for SuccessCartegraph
This Loras College Business Analytics Symposium breakout session presentation by Kiran Garimella, Ph.D., president and founder of XBITALIGN, explored the analytics center of excellence (CoE).
A business analytics program is more than the application of data science and Big Data technology to data. Success should be measured not only by the valuable insights the program delivers, but also by how well it is sustained and how much the ‘analytics mindset’ becomes part of the company’s DNA. The journey is not only from data to information, but also from information to knowledge, and from knowledge to intelligence. The foundation for making this happen is a well-structured Analytics Center of Excellence (CoE).
Data Architecture, Solution Architecture, Platform Architecture — What’s the ...DATAVERSITY
A solid data architecture is critical to the success of any data initiative. But what is meant by “data architecture”? Throughout the industry, there are many different “flavors” of data architecture, each with its own unique value and use cases for describing key aspects of the data landscape. Join this webinar to demystify the various architecture styles and understand how they can add value to your organization.
Got data?… now what? An introduction to modern data platformsJamesAnderson599331
What are Data Analytics Platforms? What decision points are necessary in creating a modern, unified analytics data platform? What benefits are there to building your analytics data platform on Google Cloud Platform? Susan Pierce walks us through it all.
This deck provides a high-level framework to implement business process redesign within a business transformation initiative. It shows how to establish the team, define the approach, and identify some of the deliverables within this track of work.
We present our solution for building an AI Architecture that provides engineering teams the ability to leverage data to drive insight and help our customers solve their problems. We started with siloed data, entities that were described differently by each product, different formats, complicated security and access schemes, data spread over numerous locations and systems.
Product-thinking is making a big impact in the data world with the rise of Data Products, Data Product Managers, data mesh, and treating “Data as a Product.” But Honest, No-BS: What is a Data Product? And what key questions should we ask ourselves while developing them? Tim Gasper (VP of Product, data.world), will walk through the Data Product ABCs as a way to make treating data as a product way simpler: Accountability, Boundaries, Contracts and Expectations, Downstream Consumers, and Explicit Knowledge.
DataTalkClub Conference, Feb 12 2021
Creating a machine learning model is not an easy task.
Creating a useful machine learning model that gets into production and generates actual business value - is an even harder one.
There are many ways for an ML project or product to fail even when the data is there and the model technically performs well. From the wrong problem statement to lack of trust from stakeholders, in this talk I will discuss what issues to look out for, and how to avoid them.
Governance, Risk and Compliance and you | CollabDays Bletchley Park 2022 | Nikki Chapple
5 October 2022: CollabDays Bletchley Park 2022 - October edition | In-person event United Kingdom
Governance, Risk and Compliance and you – Microsoft Purview and beyond | Simon Hudson & Nikki Chapple
Governance, Risk and Compliance; it’s not nice to have, It’s The Law. Every organisation needs to pay attention to GRC, but not everyone has the tools, expertise or strategy. Microsoft Purview is a surprisingly capable tool in your organisation’s GRC tool bag when combined with a broad & competent approach. This session will provide: – an overview of GRC obligations and approaches – what’s in Purview – pragmatic approaches to elevating your Compliance Score – wider technical and business thinking for de-risking your operations and organisation – thoughts on using the Maturity Model for Microsoft 365 GRC Competency to set your objectives.
Has your organization ever considered replacing a tester who did not write, for example, 15 test cases per day? Is the testing team blamed if defect leakage into production is greater than 5%? What drives decisions like these? The common thread in these examples is “Test Metrics”.
Test Metrics... Everyone has an opinion about them. Some believe they are the most valuable way to communicate the results of testing. Some think that they are useless, misleading, and damaging to the communication of test results. Some believe that without measurement you are not managing the effort. And some believe that bad metrics are worse than no metrics at all.
Where does your organization fit in the metrics and measurement debates? Is your team aligned? Do you agree with the team? Do you use a reporting process for test results? Are you forced to report on metrics you don't believe are valuable? Do you have dozens of metrics that you are reporting periodically that no one looks at, and when they do look at them, there is room for misinterpretation?
In this session, Mike Lyles and Jay Philips will challenge the audience to discuss the topic of metrics and measurement, review multiple viewpoints on the topic, and address many of the questions that organizations have today around metrics and measurement.
Takeaways:
- Top metrics that are misused or misunderstood in most every organization.
- Metrics that you should get rid of ASAP!
- Best and Worst metrics - based on opinions of the speakers & audience.
- Metrics that everyone should use – and how they compare to your organization’s metrics.
- Tools and processes that can help your organization better measure your testing.
** Presentation given at STPCon Spring 2014
According to a recent research report by the Wall Street Journal, AI project failure rates are near 50%, and more than 53% of projects terminate at the proof-of-concept level and never make it to production. A Gartner report says that nearly 80% of analytics projects do not deliver any business value. That means that for every 10 projects, only 2 are useful to the organization. Let us pause here for a moment: rather than looking at what makes AI projects fail, let us look at the challenges involved in AI projects and find ways to overcome them.
AI projects are different from traditional software projects. Typical software projects, as shown in Figure 1, consist of well-defined software requirements, high-level design, coding, unit testing, system testing, and deployment, along with beta testing or field testing. Organizations are now adopting Agile processes instead of the traditional V or waterfall models, but the steps mentioned remain valid.
However, the methodology of AI and machine learning projects differs from the above. Our experience working on many AI/ML projects has given us insights into some of the challenges of executing AI projects. We are also in regular touch with senior executives and thought leaders from different industries who understand the success formula. The following discussion is based on our practical experience and knowledge gained in the field.
Successful execution of AI projects depends on the following factors:
1. Clearly aligned Business Expectations
2. Clarity on Terminologies
3. Meeting Data Requirements
4. Tools and Technology
5. Right Resources
6. Understanding Output Results
7. Project Planning and the Process
Explainability for Natural Language Processing | Yunyao Li
Final deck for our popular tutorial on "Explainability for Natural Language Processing" at KDD'2021. See links below for downloadable version (with higher resolution) and recording of the live tutorial.
Title: Explainability for Natural Language Processing
Presenters: Marina Danilevsky, Shipi Dhanorkar, Yunyao Li, Lucian Popa, Kun Qian and Anbang Xu
Website: http://xainlp.github.io/
Recording: https://www.youtube.com/watch?v=PvKOSYGclPk&t=2s
Downloadable version with higher resolution: https://drive.google.com/file/d/1_gt_cS9nP9rcZOn4dcmxc2CErxrHW9CU/view?usp=sharing
@article{kdd2021xaitutorial,
title={Explainability for Natural Language Processing},
author = {Marina Danilevsky and Shipi Dhanorkar and Yunyao Li and Lucian Popa and Kun Qian and Anbang Xu},
journal={KDD},
year={2021}
}
Abstract:
This lecture-style tutorial, which mixes in an interactive literature browsing component, is intended for the many researchers and practitioners working with text data and on applications of natural language processing (NLP) in data science and knowledge discovery. The focus of the tutorial is on the issues of transparency and interpretability as they relate to building models for text and their applications to knowledge discovery. As black-box models have gained popularity for a broad range of tasks in recent years, both the research and industry communities have begun developing new techniques to render them more transparent and interpretable. Reporting from an interdisciplinary team of social science, human-computer interaction (HCI), and NLP/knowledge management researchers, our tutorial has two components: an introduction to explainable AI (XAI) in the NLP domain and a review of the state-of-the-art research; and findings from a qualitative interview study of individuals working on real-world NLP projects as they are applied to various knowledge extraction and discovery tasks at a large, multinational technology and consulting corporation. The first component will introduce core concepts related to explainability in NLP. Then, we will discuss explainability for NLP tasks and report on a systematic literature review of the state-of-the-art literature in AI, NLP and HCI conferences. The second component reports on our qualitative interview study, which identifies practical challenges and concerns that arise in real-world development projects that require the modeling and understanding of text data.
Building continuous auditing capabilities utilizing CAATs and data analytics technologies. Topics: continuous auditing (CA), data analytics (DA), ACL, audit guidelines, technology, audit innovation.
Despite increased adoption across industries, many people still have trouble defining and distinguishing between Agile, DevOps and product management. What’s the difference between these practices? Are they competing or complementary?
In this on-demand Agile Leadership Series webinar, we’ll explore what the Agile mindset is and how to develop it, taking a deep dive into the technical practices needed to build in quality at every step of the development process. Learn how, together, these approaches can improve quality and dramatically decrease time to market. We’ll also discuss how product management can help to ensure teams are building the right features for the right users.
What we’ll cover:
Defining Agile, DevOps and product management
The combined value of these approaches (and what happens when one is left out)
How to identify and prevent feature factories, technical debt and feature debt
Strategies for bringing these approaches to your organization
ML in GRC: Supporting Human Decision Making for Regulatory Adherence with Mac... | BigML, Inc
This is a real-life Machine Learning use case about integrated risk.
Speakers: Thomas Rengersen, Product Owner of the Governance Risk and Compliance Tool for Rabobank, and Thomas Alderse Baas, Co-Founder and Director of The Bowmen Group.
*ML in GRC 2021: Virtual Conference.
Ditch the Surplus Software and Hardware Spend that's Weighing you Down | Ivanti
Advance your ITAM Program with these Top 6 Best Practices
Are you still struggling to keep tabs on your software and hardware with spreadsheets? Break free in 2019! Spreadsheets are cumbersome and difficult to maintain. Let Ivanti ITAM help you to go beyond spreadsheets and basic inventory and asset tracking.
Join our ITAM experts to explore the top 6 Things to think about when starting or advancing your ITAM program to better help balance costs and risks in your organization.
New Model Testing: A New Test Process and Tool | TEST Huddle
In this webinar, Paul described his experiences of building and using a bot for paired testing and also proposed a new test process suitable for both high-integrity and agile environments. His bot – codenamed System Surveyor – builds a model of the system as you explore, captures test ideas, risks and questions, and generates structured test documentation as a by-product.
Now more than ever, organizations must capture what’s happening in the business and transform their data into faster, smarter decisions. In this deck, you’ll learn how Deloitte uses Workday Prism Analytics to harness financial, workforce, and operational data by unlocking key analytics at the most critical times.
View related videos:
Welcome to the New World of Analytics.
https://www.youtube.com/watch?v=DLOekjChar0
Build Belonging and Diversity | Insights https://www.youtube.com/watch?v=slhpTY5z68c
Agile and CMMI: Yes, They Can Work Together | TechWell
There is a common misconception that agile and CMMI cannot work together. CMMI is viewed as a documentation heavy, slow, process-driven model—the polar opposite of agile principles. The cost of documentation for an appraisal is viewed as another drawback. Join Ed Weller to see why a large organization chose to use the practices in the CMMI to complement agile, and a formal appraisal to improve and evaluate their performance. When mixing approaches that seem contradictory, the first step is to understand the benefits, drawbacks, and cost of each approach and then identify complementary additions. This includes myth busting the misperceptions about both agile and CMMI. The second step, using a formal CMMI appraisal to evaluate organizational performance, requires an understanding of the CMMI model that goes beyond a “checklist approach” requiring extensive documentation. Using lean principles, the appraisal team minimized “appraisal documentation” by using the day-to-day team output. Ed shows that agile and CMMI can be complementary due to executive leadership, lean implementation, and organization training, as demonstrated by a formal appraisal and business results.
How GenAI will (not) change your business? | Marlon Dumas
Not all new technology waves are the same. Some waves are vertical (3D printing, digital twins, blockchain) while others are horizontal (the PC in the 80s, the Web in the 90s). GenAI is a horizontal wave. The question is not if GenAI will impact my business, but what will be the scope of this impact. In this talk, we will go through a journey of collisions: GenAI colliding with customer service, clerical work, information search, content production, IT development, product design, and other knowledge work. A common thread to understand the impact of GenAI is to distinguish between descriptive use cases (search, summarize, expand, transcribe & translate) versus creative use.
Discovery and Simulation of Business Processes with Probabilistic Resource Av... | Marlon Dumas
In the field of business process simulation, the availability of resources is captured by assigning a calendar to each resource, e.g., Monday-Friday 9:00-18:00. Resources are assumed to be always available to perform activities during their calendar. This assumption often does not hold due to interruptions, breaks, or because resources time-share across multiple processes. A simulation model that captures availability via crisp time slots (a resource is either on or off during a slot) does not capture these behaviors, leading to inaccuracies in the simulation output. This paper presents a simulation approach wherein resource availability is modeled probabilistically. In this approach, each availability time slot is associated with a probability, allowing us to capture, for example, that a resource is available on Fridays between 14:00-15:00 with 90% probability and between 17:00-18:00 with 50% probability. The paper proposes an algorithm to discover probabilistic availability calendars from event logs. An empirical evaluation shows that simulation models with probabilistic calendars discovered from event logs, replicate the temporal distribution of activity instances and cycle times of a process more closely than simulation models with crisp calendars.
This presentation was delivered at the 5th International Conference on Process Mining (ICPM'2023), Rome, Italy, October 2023.
The paper is available at: https://easychair.org/publications/preprint/Rz9g
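The probabilistic calendars described in this abstract can be illustrated with a minimal sketch (the calendar structure and function names below are assumptions for illustration, not the paper's implementation):

```python
import random

# Hypothetical weekly calendar: (day, hour) slots mapped to an availability
# probability. Slots absent from the calendar are treated as unavailable.
calendar = {
    ("Friday", 14): 0.9,  # available 14:00-15:00 with 90% probability
    ("Friday", 17): 0.5,  # available 17:00-18:00 with 50% probability
}

def is_available(day, hour, rng=random):
    """Sample whether the resource is available in a given one-hour slot."""
    p = calendar.get((day, hour), 0.0)
    return rng.random() < p
```

Running many simulation replications with availability sampled this way reproduces the kind of intermittent resource behavior that crisp on/off calendars cannot capture.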
Can I Trust My Simulation Model? Measuring the Quality of Business Process Si... | Marlon Dumas
Business Process Simulation (BPS) is an approach to analyze the performance of business processes under different scenarios. For example, BPS allows us to estimate what would be the cycle time of a process if one or more resources became unavailable. The starting point of BPS is a process model annotated with simulation parameters (a BPS model). BPS models may be manually designed, based on information collected from stakeholders and empirical observations, or automatically discovered from execution data. Regardless of its origin, a key question when using a BPS model is how to assess its quality. In this paper, we propose a collection of measures to evaluate the quality of a BPS model w.r.t. its ability to replicate the observed behavior of the process. We advocate an approach whereby different measures tackle different process perspectives. We evaluate the ability of the proposed measures to discern the impact of modifications to a BPS model, and their ability to uncover the relative strengths and weaknesses of two approaches for automated discovery of BPS models. The evaluation shows that the measures not only capture how close a BPS model is to the observed behavior, but they also help us to identify sources of discrepancies.
Presentation delivered by David Chapela-Campa at the BPM'2023 conference, Utrecht, September 2023.
Business Process Optimization: Status and Perspectives | Marlon Dumas
For decades, business process optimization has been largely about art and craft (and sometimes wizardry). Apart from narrowly scoped approaches to optimize resource allocation (often assuming that workers behave like robots), a lot of business process optimization relies on high-level guidelines, with A/B testing for idea validation, which is hard to scale to complex processes. As a result, managers end up settling for a "good enough" process. Can we do more? In this talk, we review recent work on the use of high-fidelity simulation models discovered from execution data. The talk also explores the possibilities (and perils) that LLMs bring to the field of business process optimization.
This talk was delivered at the Workshop on Data-Driven Business Process Optimization at the BPM'2023 conference.
Learning When to Treat Business Processes: Prescriptive Process Monitoring wi... | Marlon Dumas
Paper presentation at the 35th International Conference on Advanced Information Systems Engineering (CAiSE'2023).
Abstract.
Increasing the success rate of a process, i.e. the percentage of cases that end in a positive outcome, is a recurrent process improvement goal. At runtime, there are often certain actions (a.k.a. treatments) that workers may execute to lift the probability that a case ends in a positive outcome. For example, in a loan origination process, a possible treatment is to issue multiple loan offers to increase the probability that the customer takes a loan. Each treatment has a cost. Thus, when defining policies for prescribing treatments to cases, managers need to consider the net gain of the treatments. Also, the effect of a treatment varies over time: treating a case earlier may be more effective than later in a case. This paper presents a prescriptive monitoring method that automates this decision-making task. The method combines causal inference and reinforcement learning to learn treatment policies that maximize the net gain. The method leverages a conformal prediction technique to speed up the convergence of the reinforcement learning mechanism by separating cases that are likely to end up in a positive or negative outcome, from uncertain cases. An evaluation on two real-life datasets shows that the proposed method outperforms a state-of-the-art baseline.
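The net-gain reasoning in this abstract (prescribe a treatment only when its expected benefit outweighs its cost) can be sketched as follows; this is a simplified illustration with assumed names, whereas the paper's actual method combines causal inference and reinforcement learning:

```python
# Expected gain of treating one case: the uplift in success probability times
# the value of a positive outcome, minus the cost of the treatment.
def net_gain(p_negative, treatment_effect, outcome_value, treatment_cost):
    uplift = p_negative * treatment_effect  # probability mass moved to positive
    return uplift * outcome_value - treatment_cost

def should_treat(p_negative, treatment_effect, outcome_value, treatment_cost):
    """Prescribe the treatment only when its expected net gain is positive."""
    return net_gain(p_negative, treatment_effect, outcome_value, treatment_cost) > 0
```

For instance, a case with a 60% risk of a negative outcome, a treatment that removes half of that risk, an outcome worth 100, and a treatment cost of 20 has a net gain of 10, so it would be treated.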
Why am I Waiting? Data-Driven Analysis of Waiting Times in Business Processes | Marlon Dumas
Presentation of a research paper at the 35th International Conference on Advanced Information Systems Engineering (CAiSE) in Zaragoza, Spain. The paper presents a classification of causes of waiting times in business processes and a method to automatically detect and quantify the presence of each of these causes in a business process recorded in an event log.
Process Mining and Data-Driven Process Simulation | Marlon Dumas
Guest lecture delivered at Institut Teknologi Sepuluh on 8 December 2022.
This lecture gives an overview of process mining and simulation techniques, and how the two can be used together in process improvement projects.
Modeling Extraneous Activity Delays in Business Process Simulation | Marlon Dumas
This paper presents a technique to enhance the fidelity of business process simulation models by detecting unexplained (extraneous) delays from business process execution data, and modeling these delays in the simulation model, via timer events.
The presentation was delivered at the 4th International Conference on Process Mining (ICPM'2022).
Paper available at: https://arxiv.org/abs/2206.14051
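The core idea of isolating extraneous delays can be sketched as follows (a simplified illustration with assumed names, not the paper's detection algorithm): the unexplained delay is whatever waiting time remains after the activity is enabled and the assigned resource is free.

```python
def extraneous_delay(enabled_time, resource_free_time, start_time):
    """Waiting time left unexplained once the activity is enabled and the
    assigned resource is free (times as numbers on a common scale)."""
    ready = max(enabled_time, resource_free_time)
    return max(0.0, start_time - ready)
```

A delay estimated this way could then be injected into the simulation model as a timer event preceding the activity, as the paper proposes.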
Business Process Simulation with Differentiated Resources: Does it Make a Dif... | Marlon Dumas
Existing methods for discovering business process simulation models from execution data (event logs) assume that all resources in a pool have the same performance and share the same availability calendars. This paper proposes a method for discovering simulation models, wherein each resource is treated as an individual entity, with its own performance and availability calendar. An evaluation shows that simulation models with differentiated resources more closely replicate the distributions of cycle times and the work rhythm in a process than models with undifferentiated resources. The paper is available at: https://link.springer.com/chapter/10.1007/978-3-031-16103-2_24
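As an illustration of the difference, a model with differentiated resources assigns each resource its own performance profile and availability calendar instead of pool-level values; a minimal sketch (all names and numbers below are made-up assumptions):

```python
import random

# Each resource carries its own mean processing time and availability slots,
# rather than sharing pool-level averages with the rest of the pool.
resources = {
    "clerk_1": {"mean_duration": 30.0, "calendar": {("Mon", 9), ("Mon", 10)}},
    "clerk_2": {"mean_duration": 45.0, "calendar": {("Mon", 10), ("Mon", 11)}},
}

def sample_duration(resource, rng=random):
    """Sample a processing time from the resource's own exponential profile."""
    return rng.expovariate(1.0 / resources[resource]["mean_duration"])

def available(resource, day, hour):
    return (day, hour) in resources[resource]["calendar"]
```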
Prescriptive Process Monitoring Under Uncertainty and Resource Constraints | Marlon Dumas
This paper presents an approach to trigger interventions at runtime, in order to improve the success rate of a process when the number of resources who can perform these interventions is limited.
The paper is available at: https://link.springer.com/chapter/10.1007/978-3-031-16171-1_13
The presentation was delivered at the 20th International Conference on Business Process Management (BPM'2022), in Muenster, Germany, September 2022.
Accurate and Reliable What-If Analysis of Business Processes: Is it Achievable? | Marlon Dumas
In this talk, I discuss the problem of how to discover simulation models that can be used to accurately and reliably predict the impact of a change on a business process, e.g., what if we automate an activity? What if 10% of our workers become unavailable? I focus on recent approaches that exploit the availability of data in enterprise systems to address this question.
Learning Accurate Business Process Simulation Models from Event Logs via Auto... | Marlon Dumas
Paper presentation at the International Conference on Advanced Information Systems Engineering (CAiSE).
This paper presents an approach to automatically discover business process simulation models from event logs by combining process mining and deep learning techniques.
Paper available at: https://link.springer.com/chapter/10.1007/978-3-031-07472-1_4
Process Mining: A Guide for Practitioners | Marlon Dumas
Paper presentation delivered at the International Conference on Research Challenges in Information Science (RCIS 2022). The paper studies the following questions:
1) What are the most common use cases for process mining methods?
2) What business questions do process mining methods address?
Paper available at:
https://link.springer.com/chapter/10.1007/978-3-031-05760-1_16
Process Mining for Process Improvement.pptx | Marlon Dumas
Presentation of a research paper at the 16th International Conference on Research Challenges in Information Science (RCIS). The paper presents the results of an empirical study on how practitioners use process mining to identify business process improvement opportunities. The paper is available at: https://link.springer.com/chapter/10.1007/978-3-031-05760-1_13
Data-Driven Analysis of Batch Processing Inefficiencies in Business ProcessesMarlon Dumas
Slides of a research paper presentation at the 16th International Conference on Research Challenges in Information Science (RCIS).
The research paper presents an approach to analyze event logs of business processes in order to identify batched activities and to analyze the waiting times caused by these activities.
Paper available at: https://link.springer.com/chapter/10.1007/978-3-031-05760-1_14
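A simple heuristic consistent with this idea (an illustrative sketch with assumed field names, not the paper's actual detection algorithm) is to flag instances of the same activity that the same resource starts at the same timestamp:

```python
from collections import defaultdict

def find_batches(events, min_size=2):
    """events: iterable of (activity, resource, start_time) tuples.
    Returns groups of size >= min_size that share activity, resource
    and start time, i.e. candidate batched executions."""
    groups = defaultdict(list)
    for event in events:
        activity, resource, start = event
        groups[(activity, resource, start)].append(event)
    return [group for group in groups.values() if len(group) >= min_size]
```

The waiting time attributable to batching can then be estimated per group, e.g., as the gap between each instance's enablement and the shared batch start.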
Data-driven process optimization (Optimización de procesos basada en datos) | Marlon Dumas
Talk delivered at BPM Day Lima 2021.
In this talk, we discuss emerging methods and applications in the field of data-driven process optimization. We cover advances in the area of process mining, methods for constructing digital twins of processes, and predictive monitoring methods. Through examples and case studies, we show how these methods can guide digital transformation and continuous process improvement initiatives. In particular, we illustrate the use of these methods to: (1) analyze the performance of business processes in order to identify frictions and automation opportunities; (2) predict the impact of changes, and in particular, predict the impact of an automation initiative; (3) make predictions about process performance and adjust process execution so as to prevent SLA violations, customer complaints, and other undesirable events.
Prescriptive Process Monitoring for Cost-Aware Cycle Time Reduction | Marlon Dumas
Paper presentation at the 3rd International Conference on Process Mining (ICPM), 4 November 2021.
The paper is available at: https://arxiv.org/abs/2105.07111
Mine Your Simulation Model: Automated Discovery of Business Process Simulatio... | Marlon Dumas
Keynote talk by Marlon Dumas at the SIMULTECH 2021 conference. The talk gives an overview of ongoing research on automated construction of simulation models / digital twins from business process execution logs, including approaches that combine discrete event simulation with deep learning methods.
On the Road to AI-Infused Process Execution | Marlon Dumas
Talk delivered at the ISSIP Discovery Summit: AI & Automation on 12 May 2021.
Abstract:
Traditionally, process automation and process monitoring have been living in two worlds. As AI technology reaches maturity, we are seeing a convergence between automation and monitoring. Data collected during the execution of business processes is fed into AI tools, which in turn drive the automation of these processes.
For example, predictive process monitoring techniques exploit events generated during the execution of a process (e.g. in an ERP or a BPMS) to make predictions about the future states of the process. These predictions are then used to trigger interventions to enhance the performance of the process. In the RPA world, UI-level events gathered in the background while workers perform their daily work are used to discover automatable routines that are then used to instrument RPA bots.
These and other applications of AI techniques are going to reshape the way we think about the BPM lifecycle. Monitoring and automation will be replaced by AI-Infused Process Execution.
Process Mining in Action: Self-service data science for business teams | Marlon Dumas
Talk delivered at University of Tartu's Data Science Seminar, 17 February 2021. The talk explains the role of process mining as a self-service data analytics technology for business teams.
Data Centers - Striving Within A Narrow Range - Research Report - MCG - May 2... | pchutichetpong
M Capital Group (“MCG”) expects demand to grow and supply to evolve, facilitated through institutional investment rotating out of offices and into work from home (“WFH”), while the need for data storage keeps expanding as global internet usage grows, with experts predicting 5.3 billion users by 2023. These market factors will be underpinned by technological changes, such as progressing cloud services and edge sites, allowing the industry to see strong expected annual growth of 13% over the next 4 years.
Whilst competitive headwinds remain, represented through the recent second bankruptcy filing of Sungard, which blames “COVID-19 and other macroeconomic trends including delayed customer spending decisions, insourcing and reductions in IT spending, energy inflation and reduction in demand for certain services”, the industry has seen key adjustments, where MCG believes that engineering cost management and technological innovation will be paramount to success.
MCG reports that the more favorable market conditions expected over the next few years, helped by the winding down of pandemic restrictions and a hybrid working environment will be driving market momentum forward. The continuous injection of capital by alternative investment firms, as well as the growing infrastructural investment from cloud service providers and social media companies, whose revenues are expected to grow over 3.6x larger by value in 2026, will likely help propel center provision and innovation. These factors paint a promising picture for the industry players that offset rising input costs and adapt to new technologies.
According to M Capital Group: “Specifically, the long-term cost-saving opportunities available from the rise of remote managing will likely aid value growth for the industry. Through margin optimization and further availability of capital for reinvestment, strong players will maintain their competitive foothold, while weaker players exit the market to balance supply and demand.”
Adjusting primitives for graph: SHORT REPORT / NOTES | Subhajit Sahu
These notes concern primitives for graph algorithms like PageRank. Compressed Sparse Row (CSR) is an adjacency-list based graph representation.
Multiply with different modes (map)
1. Performance of sequential vs OpenMP-based vector multiply.
2. Comparison of various launch configs for CUDA-based vector multiply.
Sum with different storage types (reduce)
1. Performance of vector element sum using float vs bfloat16 as the storage type.
Sum with different modes (reduce)
1. Performance of sequential vs OpenMP-based vector element sum.
2. Performance of memcpy vs in-place CUDA-based vector element sum.
3. Comparison of various launch configs for CUDA-based vector element sum (memcpy).
4. Comparison of various launch configs for CUDA-based vector element sum (in-place).
Sum with in-place strategies of CUDA mode (reduce)
1. Comparison of various launch configs for CUDA-based vector element sum (in-place).
9. Organizational Pitfalls
Lack of buy-in from operations
Lack of trust in predictions
Prescriptions not linked with actions
Lack of processes to validate, monitor, and maintain prescriptive models
Technical Pitfalls
Insufficient data availability & quality
Lack of uncertainty modeling
Drifts and out-of-distribution predictions
Correlation ≠ Causation
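The "lack of uncertainty modeling" pitfall can be mitigated by reporting prediction intervals rather than point estimates. A minimal sketch of one common approach (split-conformal-style intervals built from calibration residuals); the model, data, and interval width here are all toy assumptions, not part of the original deck:

```python
import numpy as np

def prediction_interval(model, X_calib, y_calib, X_new, alpha=0.1):
    """Point prediction +/- the (1 - alpha) quantile of absolute
    residuals observed on a held-out calibration set."""
    residuals = np.abs(y_calib - model.predict(X_calib))
    q = np.quantile(residuals, 1 - alpha)
    preds = model.predict(X_new)
    return preds - q, preds + q

# Toy stand-in model that always predicts the calibration mean.
class MeanModel:
    def fit(self, y):
        self.mean_ = float(np.mean(y))
        return self
    def predict(self, X):
        return np.full(len(X), self.mean_)

y_calib = np.array([10.0, 12.0, 9.0, 11.0])  # e.g. cycle times in days
model = MeanModel().fit(y_calib)
lo, hi = prediction_interval(model, np.zeros((4, 1)), y_calib, np.zeros((2, 1)))
```

Showing operational staff "the case will take 9 to 12 days" rather than "10.5 days" makes the inherent uncertainty of the prediction visible, which also helps with the trust pitfall above.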
10. No amount of organizational and technical readiness will save you from the sin of using simulation models without pilot testing or predictive models without A/B testing
11. Results achieved: Even though the prediction of the most likely customers to report incorrectly was reasonably accurate, the intervention did not have a preventive effect. No root cause was identified to explain why the intervention did not have the desired effect.
12.
13. Kairos: A Tool for Prescriptive Monitoring of Business Processes (demo)
https://kairos.cloud.ut.ee/
14. - Automate tasks X and Y
- Add +5 resources to task A
- Remove -2 resources from task B
…
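Prescriptions like the ones above only help if they are ranked by predicted impact. A minimal sketch of that ranking step; the intervention names and the `simulate` stub are hypothetical stand-ins for what a digital process twin or what-if simulation would provide:

```python
# Sketch: rank candidate interventions by simulated KPI improvement.
def rank_interventions(candidates, simulate, baseline_kpi):
    """Return (intervention, improvement) pairs sorted by predicted
    reduction of the KPI (here: cycle time in hours)."""
    scored = [(name, baseline_kpi - simulate(name)) for name in candidates]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Toy stand-in for a simulation model: predicted cycle time per change.
predicted_cycle_time = {
    "automate task X": 40.0,
    "add 5 resources to task A": 44.0,
    "remove 2 resources from task B": 49.0,
}
ranked = rank_interventions(predicted_cycle_time, predicted_cycle_time.get, 48.0)
```

Surfacing only the top-ranked changes, with their predicted deltas, is one way to avoid the "prescription flooding" pitfall discussed later in the deck.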
15.
16. Everywhere!
GenAI brings context recognition: What types of processes, activities, KPIs are we talking about?
LLMs enable conversational process optimization across all layers of the pyramid
Descriptive Process Mining
•Where are the sources of waiting time?
•Where are the rework loops?
•Where are we over-processing?
•What are the sources of variance?
•Which cases require the most touches?
•Where are we violating our KPIs?
•Are we abiding by our business rules and policies?
•…
Predictive Process Optimization
•By how much would we reduce order-to-delivery times if we:
•Shorten inter-batch cycles?
•Move resources to packaging?
•Automate verification steps?
•By how much would we reduce costs if we:
•Consolidate touchpoints?
•Reorder verification steps to reduce over-processing?
•Reduce rework rates?
•…
Assisted Process Optimization
•How can we slash KPI violation rates?
•What are the practices of the teams with the highest performance?
•What should we change to reduce the number of touches?
•Which checks should we add to reduce compliance violations?
•How can we cut the cycle time at constant cost?
•How to allocate resources to optimize time at constant capacity?
•…
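The descriptive question "where are the sources of waiting time?" reduces to a simple event-log computation that a conversational layer could generate on demand. A sketch with pandas; the column names (`case_id`, `activity`, `start`, `end`) and the tiny log are illustrative, real logs (e.g. XES exports) need to be mapped to this shape first:

```python
import pandas as pd

# Toy event log: two cases, each a sequence of timestamped activities.
log = pd.DataFrame({
    "case_id":  [1, 1, 1, 2, 2],
    "activity": ["Receive", "Check", "Ship", "Receive", "Ship"],
    "start": pd.to_datetime(["2024-01-01 09:00", "2024-01-01 11:00",
                             "2024-01-02 09:00", "2024-01-03 09:00",
                             "2024-01-03 15:00"]),
    "end":   pd.to_datetime(["2024-01-01 10:00", "2024-01-01 12:00",
                             "2024-01-02 10:00", "2024-01-03 10:00",
                             "2024-01-03 16:00"]),
})
log = log.sort_values(["case_id", "start"])
# Waiting time before an activity = its start minus the previous
# activity's end, computed within each case.
log["waiting_h"] = (
    (log["start"] - log.groupby("case_id")["end"].shift())
    .dt.total_seconds() / 3600
)
waiting_by_activity = log.groupby("activity")["waiting_h"].mean()
```

In this toy log the bottleneck is the queue before "Ship", which is exactly the kind of finding the descriptive layer of the pyramid surfaces.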
17. Faster or Cheaper Horses
• Structured data prep (ETL scripting)
• Data querying (if your tool requires SQL coding)
• Data synthesis / enhancement (simulation?)
Better Than (Hallucinating) Horses
• Preprocess unstructured data
• Explain (or mis-explain) findings, patterns
• Figure out which options/questions to consider
18.
19. Organizational Pitfalls
Lack of buy-in from operations
Lack of trust in suggestions / prescriptions
Prescription flooding / over-prescription
Prescriptions not linked with actions
Lack of processes to validate, monitor, and maintain prescriptive models
Technical Pitfalls
Insufficient data availability & quality
Neglecting inter-process dependencies
Lack of uncertainty modeling
Drifts and out-of-distribution predictions
Unreliable prescriptions
Lack of feedback loop
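The "drifts and out-of-distribution predictions" pitfall can be monitored with standard distribution-shift checks. A sketch using the Population Stability Index (PSI); the 0.2 drift threshold is a common rule of thumb, not a universal constant, and the samples here are synthetic:

```python
import numpy as np

def psi(expected, actual, bins=10):
    """Population Stability Index between a training-time feature
    sample and a recent production sample."""
    edges = np.histogram_bin_edges(expected, bins=bins)
    e_pct = np.histogram(expected, bins=edges)[0] / len(expected)
    a_pct = np.histogram(actual, bins=edges)[0] / len(actual)
    # Floor proportions at a small epsilon to avoid log(0).
    e_pct = np.clip(e_pct, 1e-6, None)
    a_pct = np.clip(a_pct, 1e-6, None)
    return float(np.sum((a_pct - e_pct) * np.log(a_pct / e_pct)))

rng = np.random.default_rng(0)
train = rng.normal(0, 1, 5000)      # feature at model-training time
same = rng.normal(0, 1, 5000)       # production data, no drift
shifted = rng.normal(1.0, 1, 5000)  # production data, mean has drifted
```

Running such a check on each model input, on a schedule, is one concrete form of the "processes to validate, monitor, and maintain prescriptive models" called for above.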
20. No amount of organizational and technical readiness will save you from the sin of using AI-driven improvement recommendations without validation and pilot testing or prescriptive models without A/B testing
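The call for A/B testing can be made concrete with a minimal check: compare SLA-violation rates between a treated group (intervention applied) and a control group. A sketch of a two-sided two-proportion z-test using only the standard library; the pilot counts are made-up numbers:

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-sided two-proportion z-test. Returns (z, p_value).
    Here 'success' = SLA violation, so a negative z with a small
    p-value means group a (treated) violates less often than b."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail
    return z, p_value

# Hypothetical pilot: 60/1000 violations with the intervention,
# 95/1000 without it.
z, p = two_proportion_z(60, 1000, 95, 1000)
```

A significant negative z is evidence the prescriptions work; the earlier slide 11 shows what happens when this validation step reveals no effect at all, which is precisely why it cannot be skipped.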
21.
22.
23.
24. No amount of organizational and technical readiness will save you from the sin of deploying proactive or adaptive process optimization without building a capability to derive sustained value from process mining, predictive, and prescriptive optimization
25.
26. • Lay the foundations, start climbing, keep climbing, don't hold off
Getting data for process mining is often a challenge. But there are both short-term benefits (bottom of the pyramid) and long-term ones (top)
• Don't skip the layers
The lower layers of the pyramid provide a foundation to draw business value from the upper layers.
• Align strategically and build governance incrementally
Apply these capabilities first and foremost to processes that matter.
Adopt these capabilities incrementally, one process at a time.
Build success stories internally, ensure each layer of the pyramid yields value.
Building an analytics engine to support different automation and enterprise platforms
Why are we having this SLA violation? (example of conversational bot used for diagnostics)
What if demand increases by 20%? (example of conversational bot used for prediction)
What intervention policies (same)
For each question, the bot will automatically run an A/B test of the improvement intervention, or compare cases with and without the issue, to show the impact and identify root causes
Lay the foundations, start climbing, keep climbing, don't hold off. Many managers postpone the adoption of process mining by stating "we don't have the data" or "our data is not good enough". Yes, getting the data for process mining is often a challenge. But the benefits have been demonstrated repeatedly, in thousands of successful deployments. And getting the data to do process mining opens many doors. The data that is used today for process mining can be used tomorrow for predictive process monitoring or to build digital process twins. Once the obstacle of data collection and curation has been overcome, the possibilities are endless. Note that task mining provides an additional channel for collecting data when the enterprise system does not allow us to do so.
Don't skip the layers. The lower layers of the Augmented BPM pyramid provide a foundation to derive business value from the upper layers. Organizations that wish to maximize the benefits of adopting the upper-layer capabilities need to master the lower layers.
Align strategically and build governance incrementally. Any process mining, predictive monitoring, or prescriptive process improvement initiative needs to be grounded on the strategic priorities of the organization. The capabilities in the augmented BPM pyramid should first and foremost be applied to business processes that matter to the organization. It is also important to adopt these technologies incrementally, one process at a time. Over time, a governance structure is needed to ensure that the technologies in the pyramid create value predictably and repeatably. But before getting there, it is important to have a few success stories internally, to gain executive support, and to keep this support by showing that every capability in the augmented BPM pyramid produces tangible value.