
Irm Uk Process Intelligence Ppt2003 Final


Process Intelligence, understanding the actual temporal flow events in business processes. Why BI tools are not able to do this - they lack flow and time as analytical primitives, Given in London, November 3, 2009 by Neil Raden.

Published in: Technology, Business
  • Healthcare providers get bills collected from Medicare; this always models a flow/process.
  • There seems to be a difference in arrival time when looking at where attendees stay. Looking only at the measures taken at the final stage (arrival in the conference room), attendees from Caesars certainly have the lowest arrival time. Notice that three are in the front row, and three in the third row. But is it a meaningful difference? How do we know?
  • A Process Intelligence inquiry begins with a search against log files stored in an event data warehouse. The search and query functions of BI have been separated. The search against a columnar, time-indexed store is fast. Most process inquiries occur along precise, narrow time windows. Among other things, the log files preserve the distributions of flow behavior. From the log files, the process trace (or path) is also reconstructed.
  • When looking at the distributions of the arrival times for the 54 attendees one sees a different picture. The small number of attendees staying at Caesars combined with their large dispersion renders their mean arrival time meaningless. It appears that it was a fluke—three attendees decided to meet in the meeting room that morning at 7:30. The arrival time of the Venetian attendees is, however, significant. You can see from the distribution that they arrive consistently earlier than all the other delegates. This is a process signal. Can it be useful in helping us figure out how to get people into the meeting room earlier?
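The point of this note is that the distribution, not the mean, is the analytical primitive. A minimal Python sketch (using the standard library `statistics` module; the arrival times and counts are invented for illustration, not the talk's real data):

```python
from statistics import mean, stdev

# Hypothetical arrival-time samples (minutes past 7:00) per hotel; the
# values and counts are invented to illustrate the point.
arrivals = {
    "Caesars":  [30, 55, 65],              # few attendees, wide dispersion
    "Venetian": [50, 51, 52, 52, 53, 54],  # more attendees, tight dispersion
}

for hotel, times in arrivals.items():
    # A mean backed by few, widely scattered observations is not a reliable
    # signal; the sample size and spread are what make it meaningful.
    print(f"{hotel}: n={len(times)} "
          f"mean={mean(times):.1f} stdev={stdev(times):.1f}")
```

Here both hotels can show a similar mean, but only the tight, well-populated Venetian distribution is a process signal.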
  • Although the log files do not contain path information directly, the path can be derived from a real-time, on-the-fly examination of the logs. In analyzing complex causal flow systems, path is often important. With Process Intelligence, the process trace or flow path for each process instance is known and becomes a key analytical primitive. (Note: a BI approach might instead use the log files to create summary data, which is in turn stored in fact and dimension tables.) A key aspect of PI is precisely that path is not stored directly but constructed on the fly from examination of the logs.
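The on-the-fly trace reconstruction described above amounts to grouping log events by process-instance id and ordering them by timestamp. A minimal sketch, where the field names (`case_id`, `activity`, `ts`) and the events themselves are assumptions for illustration:

```python
from collections import defaultdict

# Invented event log; real log schemas will differ.
log = [
    {"case_id": "A1", "activity": "leave_hotel",  "ts": 1},
    {"case_id": "A2", "activity": "leave_hotel",  "ts": 2},
    {"case_id": "A1", "activity": "board_bus",    "ts": 3},
    {"case_id": "A2", "activity": "coffee_stand", "ts": 4},
    {"case_id": "A1", "activity": "enter_room",   "ts": 5},
    {"case_id": "A2", "activity": "enter_room",   "ts": 6},
]

def traces(events):
    """Reconstruct each process instance's path from raw events."""
    by_case = defaultdict(list)
    for e in events:
        by_case[e["case_id"]].append(e)
    return {
        cid: [e["activity"] for e in sorted(evts, key=lambda e: e["ts"])]
        for cid, evts in by_case.items()
    }

# A1's path skips the coffee stand; A2 detours through it.
print(traces(log))
```

Nothing is precomputed or stored: the path exists only as the ordered result of scanning the logs, which is the PI approach the note contrasts with BI-style summary tables.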
  • It turns out that there is a consistent bottleneck at the convention hall. Attendees have to queue up and the service is slow. Guests at the Venetian though are presented with a well-staffed coffee station at the lobby of the hotel; they even provide good coffee in attractive take-out containers. In general, guests at the Venetian are getting their coffee right before they get on the bus and skipping the coffee stand at the convention hall and going straight into the meeting room.
  • The process could be improved by (1) working with hotels to establish early morning coffee bars in their lobbies with fast, convenient take-out, or (2) greatly increasing the throughput of the convention coffee service. It required analyzing both the process path as well as the distributions of cycle time data to come to this fact-based conclusion on improving process performance.
  • Today, the major hurdles to clear for effective data warehousing are methodology and best practices, which have lagged behind. Practitioners draw up paper designs of the architecture, based on framework principles, and arrive at a structure that makes sense from a data-architecture perspective, but the complexity interferes with the smooth operation for which it was designed. The problem with these kinds of frameworks is that things tend to work fairly well at first, but as conditions change, the approach proves to have the fluidity of poured concrete: only fluid until set. These convoluted architectures get patched with layers of new structure, creating an overall design that is too brittle and inflexible to be useful in a BI environment. Each new artifact adds latency to each query, and each new patch adds latency to the delivery of needed features. Ultimately, the expense, delay and dysfunction overwhelm the process. The environments become too unwieldy to respond to changing business requirements, or even the original ones they were meant to support. A whole new approach is needed that makes the building and rebuilding of data stores for BI a fast and partially automatic process.
  • Besides rules, we need analytics to deliver on precision.
  • The range goes from 5% to 20% lift, with sophistication increasing from left to right. The application of analytics will move an organization’s BI efforts from simply informing decisions to taking action and tracking the effectiveness of those actions, thereby closing the loop. What do you need to effectively deploy sophisticated analytics in your organization? First of all, you need access to data, and lots of it. Descriptive and predictive modeling, also known as data mining, does not operate on small sets of aggregated data, such as that in most cleaned-up data marts. These tools need access to lots of detailed data, so that means a database that can slice and dice through mountains of data, fast. Because people from different domains, at different levels of skill, need to participate, you need software tools that can accommodate this diversity and, more importantly, can support collaborative and iterative use patterns, such as the ability to animate analyses and/or create guides, to explain the models to those who need to approve them but lack the ability to completely understand them.
  • Describe your event flows – not just to your analysts, but also to your systems. Delivering these models as rules is often the key to implementing them.
  • The risk or opportunity is assessed in the context of a single customer or transaction, not as an overall pattern, even if it is predictive. The model must be executable, with attention to timeliness, accuracy, operationalization and horizon. Implementation is hard, and it needs to be done rapidly.
  • Managing tradeoffs between multiple models – such as the risk of a product being returned and the likelihood of its being purchased – means trying to find the best rules given all your models. And how do you apply this to systems and transactions that were not modeled?
  • Descriptive analytics can be used to categorize customers into different categories – to find the relationships between customers – which can be useful in setting strategies and targeting treatment. But this analysis must be delivered not just to your analysts but also to your systems. Analysis is generally done offline, but the results can be used in automated decisions – such as offering a given product to a specific customer – often by developing rules that embody the analytics. For instance, a decision tree can be created where each end node identifies the segment for a particular member.
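A decision tree whose end nodes name segments is itself an executable ruleset. A hand-rolled sketch in Python, where the features, thresholds and segment names are invented for illustration (a real tree would be induced from member data):

```python
# Each branch walks to an end node, i.e. a segment label that downstream
# rules or systems can act on. All values here are hypothetical.
def segment(income, education_years, age):
    if income >= 80_000:
        return "high_income"
    if education_years >= 16:
        return "educated_moderate_income"
    if age < 35:
        return "young_low_income"
    return "other"

print(segment(90_000, 12, 50))  # high_income
print(segment(40_000, 16, 40))  # educated_moderate_income
```

Because each member lands in exactly one end node, the offline analysis can be deployed online as plain conditional rules, which is the delivery mechanism the note describes.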
  • Predictive analytics often rank-order individuals. For example, rank-order members by their likelihood of completing treatment – the higher the score, the more “completers” for every “non-completer”. The risk or opportunity is assessed in the context of a single customer or transaction, and these models are not an overall pattern, even if they are predictive. Models are called by a business rules engine to “score” an individual or transaction, often in real time, though the analysis is done offline. These models are often represented by a scorecard, where each characteristic of a member adds to the score and the total score can then be returned.
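The scorecard form mentioned here is simply additive: each characteristic contributes points, and the total rank-orders members. A minimal sketch with invented characteristics and point values:

```python
# Hypothetical scorecard: each characteristic a member has contributes
# points toward a completion-likelihood score.
POINTS = {
    "age_over_40": 12,
    "prior_completion": 25,
    "lives_near_clinic": 8,
}

def score(member):
    """Sum the points for every characteristic the member exhibits."""
    return sum(pts for feat, pts in POINTS.items() if member.get(feat))

m = {"age_over_40": True, "prior_completion": True, "lives_near_clinic": False}
print(score(m))  # 12 + 25 = 37
```

This is why a rules engine can call such a model in real time: scoring is a cheap table lookup and sum, even though fitting the point values happens offline.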
  • Any given customer has a current profitability trajectory or prediction. Today I could take one of several actions – different retention offers in our example – or do nothing. Each action, and doing nothing, results in a different profitability trajectory, shown with the Action A, Action B and Action C paths. When we make the decision we don’t know what these paths will look like – they are in the future – but we want to pick the most profitable action for each customer when we make the decision. At first sight this may seem impossible: how can we know the future outcomes of our actions? Well, we can’t. Fortunately, for those of us building information systems, there is a continuous stream of customers reaching any given decision point. If we have a set or cohort of customers reaching the “renewal offer decision point” today, then we will also have some more next week, and the next, and so on. This allows us, if we approach the problem correctly, to use the results of our decisions for the first cohort to improve how we treat the next cohort, and to continuously improve our results as a consequence. This is adaptive control.
  • This kind of approach also makes it easier to continuously improve. For any given decision there is some optimal decision “out there”. If we only try one approach then we explore a limited space. If on the other hand we use multiple challengers then we expand the space we explore, allowing us to move more rapidly towards our optimal approach. Even if (when) that optimal approach changes, we can use challengers to “follow” it.
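The champion/challenger assignment behind this can be sketched in a few lines: most of each cohort gets the current champion treatment, while a fixed exploration share goes to challengers so their outcomes can be compared and a winner promoted. The strategy names and the 10% split are assumptions for illustration:

```python
import random

STRATEGIES = ["champion", "challenger_1", "challenger_2"]

def assign(rng, explore_share=0.10):
    """Route one customer to a strategy."""
    if rng.random() < explore_share:
        return rng.choice(STRATEGIES[1:])  # explore one of the challengers
    return STRATEGIES[0]                   # exploit the current champion

rng = random.Random(42)  # seeded so the run is reproducible
counts = {s: 0 for s in STRATEGIES}
for _ in range(10_000):
    counts[assign(rng)] += 1
print(counts)  # roughly 9,000 to the champion, ~500 to each challenger
```

Comparing outcome metrics across the three groups, cohort after cohort, is what lets the process "follow" a moving optimum rather than freeze on a single approach.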
  • Transcript

    • 1. The Agility Imperative Rethinking BI for Processes Neil Raden CEO Hired Brains Inc. November, 2009
    • 2. A special thanks to John Patton, CEO, Sight Software, who introduced me to the idea of “Process Intelligence” and with whom I collaborated for this presentation. [email_address]
    • 3. Business Users Drive Analytics
      • Promise of BPM (Business Process Management ) :
      • Business users drive the creation, operation and improvement of strategic processes because of enabling technologies (BPM, BPMN, BPEL & SOA)
      • Also create, use, change and improve process analytics
      • DRM (Domain Reference Model) approach: SAP reference models: >4000 entity types, 1000 business processes using EPCs; Oracle similar
      • Only usable by DRM experts. Even configurable reference modeling languages suffer from the same knowledge transfer limitations – stakeholders don’t learn.
    • 4. DRM of the Sell Process
    • 5. Business Users Drive Analytics
      • Today’s BI* for Processes truth: with EPCs (Event Process Chains) and DRMs, people can muddle through with pre-defined KPIs and pre-configured BI data models.
      • Tomorrow’s BI for Processes truth: Today’s BI is ill-suited for a world of agile, interconnected (strategic) processes.
      • It’s like a square peg in a round hole
      • * For brevity, we include current data warehousing methods in the term BI
    • 6. The State of BI
    • 7. What Are the Analytical Needs?
      • The Analytic Environment is Different
      • Complex flow data among many agile internal and external complex processes.
      • There will be precise, complete end-to-end data on every process instance available in process log files.
      • Stakeholders require many different views of the processes.
      • This includes not only flow patterns of process instances, but the resources involved and the process instance data associated with them as well.
    • 8. What Are the Analytical Needs?
      • If the needs were only tactical, simple dashboard (BAM) instrumentation would suffice.
      • But strategic processes require a variety of stakeholders to know with certainty:
        • What happened?
        • What will happen?
        • What could happen?
      • Processes are complex causal systems with continuously measured flow. (One reason why simplified BI models are poor candidates for root cause analysis and other prescriptive/predictive analyses)
      This implies prediction
    • 9. What Are the Analytical Needs?
      • The Analytical Methods for Complex Causal Systems
      • The key strategic questions all involve analyzing process uncertainty .
      • These methods require that process data:
        • Be presented as distributions
        • Preserve flow (path) information
        • Allow robust time perspectives
      Not suited for current BI tools. A simple example follows.
    • 10. Process Intelligence IRM Hotels Convention Venue How can we get people into the meeting room earlier (so we can start on time)?
    • 11. Luxor Venetian Flamingo Caesars Bellagio 8:00 7:52 8:04 7:50 8:02 Mean Arrival Time IRM
    • 12. Process Intelligence Data Persistence Structure: Process Log Files LOG FILES LOG FILES IRM LOG FILES LOG FILES LOG FILES
    • 12. Process Intelligence Data Persistence Structure: Process Log Files (diagram: IRM surrounded by distributed log files)
    • 13. Process Intelligence Analytical Primitive: Distributions (chart: frequency count of arrival time by hotel – Caesars, Venetian, Bellagio, Luxor, Flamingo)
    • 15. EDM Convention Venue Hotels Process Intelligence Analytical Primitive: Process Trace
    • 16. Luxor Venetian Flamingo Caesars Bellagio 8:00 7:52 8:04 7:50 8:02 Mean Arrival Time EDM
    • 17. What Approach Will Work?
      • Answer: Process Intelligence
      • Subtext: Process Intelligence is a distinct product with a distinct market.
      • Tech Truth: Process event log files are different.
      • Traffic in distributions, not aggregated data
      • Today’s state-of-the-art solutions for business process intelligence depend on a traditional query-based approach against a relatively static data model and a tough-to-configure data warehouse
    • 18. Shortcomings of Current Data Warehousing Practices
      • Data warehouse data model designs are only fluid when being poured
      Then you have to jackhammer them up
    • 19.
      • The problems with this tortured OLAP cube-based approach to process analytics are:
      • Distributions cannot be used as analytical primitives
      • The process flow model is destroyed in creating the data model
      • Time is poorly represented
      • It is not agile
      • Process flow predictions are crude and inaccurate
      (diagram, labeled BPI: Process Log Files flow through mapping & correlation of events and data, then pre-processing, into a PDW schema, which feeds multiple Process Views)
    • 20.
      • Process Intelligence , a distinct technology, provides business and other users the tools necessary to use the proper methods of strategic process analysis:
      • Automatic instrumentation of BPMN-modeled processes
      • Analytical primitives that are distributions
      • Agile process views
      • Robust time perspectives
      • Embedded analytical methods (of complex causal systems)
      • Accurate flow predictions with full uncertainty
      • Simplicity
      What Approach Will Work?
    • 21.
      • Process Intelligence Is A Better Solution
      • Process Intelligence is the right solution for enterprises’ need for agile strategic analysis of process flow through heterogeneous process islands.
      What Approach Will Work? (diagram: Process Log Files feed mapping & correlation of events and data directly into Process Intelligence)
    • 22. What Are the Analytical Needs?
      • Summary
      • BI was not built for agile, strategic complex causal systems
      • Strategic processes require analytic agility not available in current BI
      • Analyzing complex causal systems in an agile environment is beyond the capability of current BI
    • 23. Three Basic Kinds of Analytics. They can be used together and often are.
      • 1. Descriptive Analytics – used to classify or categorize individuals or other entities (example: cluster model)
      • 2. Predictive Analytics – used to predict the future behavior of an individual (example: score)
      • 3. Decision Analytics – used to develop a superior ruleset or strategy (example: strategy optimization)
    • 24. More Sophisticated Analytics Improve Results. From knowledge/description to action/prescription: Descriptive Analytics (How do I use data to learn about my processes? Where are there areas for improvement?), Predictive Modeling (How are those processes likely to behave in the future? How do they react to the myriad ways instantiated?), Decision Optimization (How do I leverage that knowledge to extract maximum value from my operations?)
    • 25. Descriptive Models Identify Relations Use: Find the relationships between events Example: Sort events into groups with different characteristics and outputs. Operation: Analysis is generally done offline, but the results can be used in automated decisions – such as switching a supplier in real time Descriptive models can be used to categorize events into different categories – which can be useful in setting strategies and targeting treatment.
    • 26. Predictive Models Calculate Risk Or Opportunity Use: Identify the odds that a route will require a specified cycle time Example: Will the supplier deliver on time? Will the process modification deliver the desired result? Operation: Models are called by a business rules engine to “score” an individual or transaction, often in real time Predictive models often rank-order events. For example, cycle time scores rank-order suppliers by their risk – the higher the score, the more “good” supplier for every “bad” one.
    • 27. Decision Models Design Better Strategies Use: Design a ruleset that will deliver the right decisions to reach goals Example: Identify how much money to spend on each marketing channel to maximize sales in a given timeframe and budget Operation: Decision models are used offline to develop rules, which can then be deployed to operate in real time A decision model maps the relationships between the data available, the decision itself, the outcomes of the decision and the business objective. It is ideal for balancing multiple objectives and constraints.
    • 28. Data Mining - Improve Rules (chart: customer segments plotted by Age and Education – low-moderate income, young; high income; high income, low-moderate education; moderate-high education, low-moderate income; moderate education, low income, middle-aged; low education, low income)
    • 29. Predictive Analytics – Add Insight (chart: score bands at 10, 20, 30 and 40 separating members who complete treatment from members who fail to complete treatment)
    • 30. Impact May Take Time to Play Out
    • 31. Continuous Improvement with Adaptive Control (diagrams: a single approach considers a limited decision space around an unknown optimal approach; a champion plus challengers 1 and 2 expands the decision space considered)
    • 32. Contact Information Neil Raden Hired Brains, Inc. 1415 Kenwood Road Santa Barbara, CA 93109 www.hiredbrains.com [email_address] White papers: www.hiredbrains.com/Whitepapers.pdf LinkedIn: http://www.linkedin.com/in/neilraden Blog: http://www.intelligententerprise.com/blog/nraden.html (Office)   +1 805 962 7391  GMT - 08:00 PST (Mobile) +1 805 284 2322
