When Workflow Management Systems first began to proliferate in the 1990s, little attention was paid to the data generated by the running processes. Most regarded this data as an audit trail, not a source of information for process improvement.
We now understand that the historical record contains valuable information essential to a well-orchestrated continuous process improvement program.
Correctly designed analytics is the starting point for providing business process intelligence. The analytics drives both real-time monitoring and predictive optimization of the executing Business Process Management System.
[Overview diagram: an Event Bus collects events from ERP, BPM, ECM, Legacy, EAI, and Custom systems; Historical Analytics feeds Event Detection & Correlation, Predictive Simulation, Data Mining, and Optimization; Real-Time Dashboards and Alerts & Actions support Business Operations Control.]
[Actions & Alerts diagram: Process Metrics, Goals, Thresholds, and KPI Evaluation feed a Rules Engine, which drives Process Event Triggers, Risk Mitigation, an Action Schedule, email and cellphone notification, and Web Service call or script-execution actions.]
A stream of events produced by a variety of business process engines (ERP, Supply Chain Management, BPMS enactment) is fed to an Analytics engine which transforms the event data into usable information.
A Business Activity Monitoring module updates a set of KPIs in real time and, by applying a Rules Engine to these indicators, generates Alerts and Actions that inform managers of critical situations and alter the behavior of the running processes.
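A minimal sketch of this monitoring loop, not the product's API: events update a sliding-window KPI average, and simple threshold rules raise alerts. The event fields, window size, and rule format are all illustrative assumptions.

```python
AVG_WINDOW = 5  # number of recent values kept per KPI (assumed)

kpis = {}    # KPI name -> list of recent values
alerts = []  # alert messages produced by the rules

def update_kpi(name, value):
    window = kpis.setdefault(name, [])
    window.append(value)
    del window[:-AVG_WINDOW]          # keep only the sliding window
    return sum(window) / len(window)  # current moving average

# Threshold rules: (kpi, limit, message). A real Rules Engine would be richer.
rules = [("cycle_time_hours", 48.0, "End-to-end cycle time above 48h")]

def on_event(event):
    avg = update_kpi(event["kpi"], event["value"])
    for kpi, limit, message in rules:
        if event["kpi"] == kpi and avg > limit:
            alerts.append(f"ALERT: {message} (avg={avg:.1f})")

for e in [{"kpi": "cycle_time_hours", "value": 40.0},
          {"kpi": "cycle_time_hours", "value": 60.0}]:
    on_event(e)
print(alerts)
```

The second event pushes the moving average past the threshold, so exactly one alert fires; an action dispatcher (email, web service call) would consume the alert list.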
A simulation tool, using the historical data, provides What-If analysis in support of continuous process improvement. Integrated with a Work Force Management system it enables optimization of staff schedules.
But designing the what-if scenarios can be a challenging and labor-intensive task for a specialist.
Automatic Optimization uses Analytics and Simulation to generate and evaluate proposals for achieving a set of goals.
Analysis of Process structure in conjunction with historical data about processing delays and resource availability permits the intelligent exploration of improvement strategies.
Coupled with WorkForce Management technology, this approach helps optimize staff schedules.
Optimization examples:
Bottleneck Analysis: determine the task most understaffed; cross-train the most idle, feasible person, or alternatively hire a new one; predict the effect by simulating the altered scenario.
Wait-Time Reduction by Load Balancing: analyze the current situation; predict via simulation; alter the scenario; propose measures for improvement.
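The first step of bottleneck analysis, determining the most understaffed task, can be sketched from historical figures. All task names and rates below are invented for illustration.

```python
# For each task: (work items arriving per hour, staffed capacity per hour).
# The most understaffed task is the one with the highest utilization.
tasks = {
    "credit_check":  (30.0, 4 * 6.0),  # 4 people, 6 items/hour each
    "underwriting":  (20.0, 3 * 5.0),  # 3 people, 5 items/hour each
    "document_prep": (25.0, 5 * 6.0),  # 5 people, 6 items/hour each
}

def utilization(arrival_rate, capacity):
    return arrival_rate / capacity

bottleneck = max(tasks, key=lambda t: utilization(*tasks[t]))
print(bottleneck, round(utilization(*tasks[bottleneck]), 2))
```

Here underwriting has utilization above 1.0, so work queues up there; cross-training or hiring proposals would then be evaluated by re-running the simulation with altered capacity.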
Optimization, using goals formulated as KPIs, can analyze historical information and propose changes that are likely to help attain these goals. It can systematically evaluate the proposed changes, using the simulation tool as a component.
This can be performed in a totally automated manner, with termination upon satisfying the goal or recognizing that no proposed change results in further improvement.
Staff optimization, focusing on end-to-end cycle time and processing cost as the KPIs, is one example of the application of this technology.
We focus on the data generated by typical computer-based business processes, using Process Intelligence as the lens through which to view the data.
This process view is critical in developing a mining structure and mining models that expose correlations between Key Performance Indicators and other factors such as work item attributes, resource schedules, arrival patterns and other external business factors.
A marketing campaign is expected to increase the number of low end loan applications next month.
Simulation-based forecasting could be used to optimize work force management, but the simulation model needs accurate information about how long each step in the process takes; average duration values based on history will not do.
How can data mining provide better estimates for durations based on line-of-business attributes of the applications?
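One simple answer, sketched below with invented data: instead of a single global average, learn per-segment average durations keyed on a line-of-business attribute (here, loan tier). A real mining model would use richer attributes and algorithms; this only illustrates why segmentation matters.

```python
from collections import defaultdict

# Assumed training data: (loan_tier, underwriting_hours) per past application.
history = [
    ("low_end", 1.0), ("low_end", 1.4), ("low_end", 1.2),
    ("high_end", 6.0), ("high_end", 7.0),
]

def train(records):
    buckets = defaultdict(list)
    for tier, hours in records:
        buckets[tier].append(hours)
    return {tier: sum(h) / len(h) for tier, h in buckets.items()}

model = train(history)
global_avg = sum(h for _, h in history) / len(history)

# A surge of low-end applications is badly mis-modeled by the global average:
print(round(model["low_end"], 2), round(global_avg, 2))
```

For the forecast month dominated by low-end applications, feeding the simulator the segment estimate (about 1.2 hours) rather than the global average (about 3.3 hours) changes the staffing prediction substantially.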
Making Predictions using Simulation and Data Mining
Simulation and Data Mining can both be used to make predictions. Are they competing or complementary technologies?
We have already discussed the role of Data Mining in the preparation of information required for accurate simulations.
Apart from this, there are major differences.
The simulation model must be a sufficiently accurate representation of the collection of processes being executed. It can make predictions for situations not previously encountered so long as the underlying processes have not changed.
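A toy illustration of that point, under assumed parameters: because a simulation encodes the process itself, it can predict waiting times for an arrival rate never seen in the history. The deterministic single-server FIFO queue below is a deliberate simplification.

```python
def simulate_queue(arrival_gap, service_time, n_jobs):
    """Deterministic single-server FIFO queue: returns average wait per job."""
    server_free_at = 0.0
    total_wait = 0.0
    for i in range(n_jobs):
        arrival = i * arrival_gap
        start = max(arrival, server_free_at)  # wait if the server is busy
        total_wait += start - arrival
        server_free_at = start + service_time
    return total_wait / n_jobs

# Historical load (no queueing) vs. a heavier, never-observed load:
print(simulate_queue(arrival_gap=12.0, service_time=10.0, n_jobs=100),
      simulate_queue(arrival_gap=9.0, service_time=10.0, n_jobs=100))
```

Under the historical load no job waits, yet the same model predicts rapidly growing waits once arrivals outpace service, a regime absent from the training history and therefore invisible to a purely statistical model.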
The Data Mining predictions are based on a statistical analysis of what has already happened. A trained mining model assumes the historical patterns are still valid.
There are major differences in performance.
Simulation is computationally intensive. It takes significant time to obtain predictions.
In Data Mining, the training is computationally intensive. Once a model is trained predictions are extremely fast. Periodic retraining may be required to keep the model accurate.
BPMS generate event streams that provide the Analytics Data needed for Business Activity Monitoring in real time and for Continuous Process Improvement.
A customizable Optimizer, employing Data Mining and Simulation tool kits, derives from the Analytics Data a stream of recommendations for improving business operations, including:
Redeployment of resources
Optimization of business rules
The Data Mining component supports an alternative approach to prediction under changing business circumstances and generates critical information for use by the Simulator. It also provides Process Discovery capabilities useful in Process Re-Design.