IBM Predictive analytics IoT Presentation


  1. 1. Predictive Analytics for IoT. Michael Adendorff, Architect (STSM), IBM, IBM Predictive Maintenance and Quality. michael.adendorff@ca.ibm.com
  2. 2. Evidence, Clues > Failure Prediction
  3. 3. Predictive Analytics, Valuable Insight
  4. 4. Maintenance Insight
  5. 5. Maintenance Insight. Failure Risk: under-maintained equipment
  6. 6. Maintenance Insight. Wasted $$$$$: over-maintained equipment
  7. 7. Work Order: Urgent Inspection Required. High probability of failure; high risk of failure before next scheduled maintenance.
  8. 8. Maintenance Schedule Update Request: bring forward scheduled maintenance to Jul 5; bring forward scheduled maintenance to Aug 7; delay scheduled maintenance to Dec 15.
  9. 9. Parts Requirements Forecast: Main Bearing (chart of monthly quantities for June, July, and August)
  10. 10. Business Results: Predictive Maintenance (chart comparing unplanned vs. planned downtime)
  11. 11. Predictive Analytics, Valuable Insight: How does it work?
  12. 12. Simplistic Illustration: correlate historic data (failure records, vibration levels). Chart: failure count vs. vibration level. More failures have been observed when vibration levels are high.
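The correlation step on slide 12 can be sketched in a few lines of pandas. The column names and sample values below are illustrative assumptions, not data from the deck.

```python
# Minimal sketch of slide 12: bucket historic vibration readings and count
# failures per bucket. Data and column names are illustrative assumptions.
import pandas as pd

history = pd.DataFrame({
    "vibration_level": [0.05, 0.3, 0.8, 1.6, 2.4, 3.8, 5.2, 6.1, 0.2, 4.9],
    "failed":          [0,    0,   0,   0,   0,   1,   1,   1,   0,   1],
})

buckets = pd.cut(history["vibration_level"], bins=[0, 0.1, 0.5, 2, 5, 10])
failures_per_bucket = history.groupby(buckets, observed=True)["failed"].sum()

print(failures_per_bucket)  # more failures observed in the high-vibration buckets
print("correlation:", history["vibration_level"].corr(history["failed"]))
```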
  13. 13. Univariate Model: p(fail) as a function of vibration level. Vibration level < 0.1: P(failure) 0.1%, confidence 2%; 0.1–0.5: 1%, 3%; 0.5–2: 3%, 5%; 2–5: 15%, 10%; > 5: 98%, 80%. Simple univariate models are generally not very accurate, and this one looks better than it is: high vibration is strongly correlated with failure because it is a lagging indicator. Leading indicators are needed to predict.
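A univariate p(fail) table like the one on slide 13 can be approximated with a single-feature logistic regression. This is a generic scikit-learn sketch on synthetic data; it does not reproduce the slide's numbers or the presenter's model.

```python
# Sketch of a univariate failure model (slide 13): one input, vibration level.
# Synthetic data for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
vibration = rng.uniform(0, 8, size=500).reshape(-1, 1)
# Failures become more likely as vibration rises (a lagging indicator).
p_true = 1 / (1 + np.exp(-(vibration.ravel() - 5)))
failed = rng.binomial(1, p_true)

model = LogisticRegression().fit(vibration, failed)

# p(fail) for representative vibration levels, cf. the slide's lookup table
for level in [0.05, 0.3, 1.0, 3.0, 6.0]:
    p_fail = model.predict_proba([[level]])[0, 1]
    print(f"vibration {level:>4}: p(fail) = {p_fail:.1%}")
```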
  14. 14. Multivariate Model: correlates failures with combinations of multiple input variables. Historic Data → p(fail). More accurate than the univariate model, but raw input data never reveals the whole story.
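For slide 14, a multivariate model simply takes several raw sensor and usage variables at once. Below is a generic gradient-boosting sketch on synthetic data; the feature names and target are assumptions for illustration.

```python
# Sketch of a multivariate failure model (slide 14): several raw inputs combined.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 2000
X = np.column_stack([
    rng.uniform(0, 8, n),     # vibration level
    rng.uniform(20, 90, n),   # temperature
    rng.uniform(0, 5000, n),  # operating hours
])
# Failure driven by a combination of inputs, not any single one.
score = 0.6 * X[:, 0] + 0.03 * X[:, 1] + 0.0008 * X[:, 2]
y = rng.binomial(1, 1 / (1 + np.exp(-(score - 7))))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)
clf = GradientBoostingClassifier().fit(X_train, y_train)
print("holdout accuracy:", clf.score(X_test, y_test))
print("p(fail) for one asset:", clf.predict_proba(X_test[:1])[0, 1])
```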
  15. 15. Advanced Data Prep + Ensemble Models: Historic Data → p(fail), E(fail date). More accurate than the univariate model, but raw input data never reveals the whole story.
  16. 16. Advanced Data Prep + Ensemble Models (cont.): derived input Cumulative Cycles = f(speed, operating hours)
  17. 17. Advanced Data Prep + Ensemble Models (cont.): derived input Cumulative Fatigue Load = f(cycles, speed)
  18. 18. Advanced Data Prep + Ensemble Models (cont.): Wear Damage Forecast
  19. 19. Advanced Data Prep + Ensemble Models (cont.): Wear Damage Forecast via Wear Modeling
  20. 20. Advanced Data Prep + Ensemble Models (cont.): Fatigue Damage Forecast
  21. 21. Advanced Data Prep + Ensemble Models (cont.): Fatigue Modeling alongside the Wear Damage Forecast
  22. 22. Advanced Data Prep + Ensemble Models: Time Series Forecast + Combination Model → p(fail), E(fail date). Building models like this requires brute-force number crunching as well as skills and knowledge. The payoff comes from more accurate predictions, but it doesn't end here.
  23. 23. Advanced Data Prep + Ensemble Models: the expected failure date is more actionable than the current probability of failure.
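Slides 15–23 add engineered inputs (cumulative cycles, cumulative fatigue load, wear and fatigue damage forecasts) and combine models so that both a failure probability and an expected failure date come out. The sketch below shows that general pattern on hypothetical columns; the feature formulas and model choices are placeholders, not IBM PMQ's actual definitions.

```python
# Sketch of the "advanced data prep + ensemble" pattern from slides 15-23.
# Feature formulas, targets, and model choices are placeholders.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor

rng = np.random.default_rng(2)
n = 1500
raw = pd.DataFrame({
    "speed": rng.uniform(500, 3000, n),
    "operating_hours": rng.uniform(0, 10000, n),
    "vibration_level": rng.uniform(0, 8, n),
})

# Data prep: derived, leading-indicator features (cf. slides 16-17).
raw["cumulative_cycles"] = raw["speed"] * raw["operating_hours"] / 60.0
raw["cumulative_fatigue_load"] = raw["cumulative_cycles"] * raw["speed"] * 1e-6

# Synthetic targets for illustration: failure flag and days until failure.
risk = 1 / (1 + np.exp(-(raw["cumulative_fatigue_load"] / 50 + raw["vibration_level"] - 8)))
failed = rng.binomial(1, risk)
days_to_failure = np.maximum(1, 365 * (1 - risk) + rng.normal(0, 10, n))

features = raw[["vibration_level", "cumulative_cycles", "cumulative_fatigue_load"]]

# Ensemble: one model for p(fail), one for the expected failure horizon.
p_model = RandomForestClassifier(random_state=2).fit(features, failed)
d_model = RandomForestRegressor(random_state=2).fit(features, days_to_failure)

asset = features.iloc[[0]]
p_fail = p_model.predict_proba(asset)[0, 1]
e_fail_date = pd.Timestamp.today().normalize() + pd.Timedelta(days=float(d_model.predict(asset)[0]))
print(f"p(fail) = {p_fail:.1%}, E(fail date) = {e_fail_date.date()}")
```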
  24. 24. Advanced Data Prep + Ensemble Models: sensors don't record every causal factor, so text analytics is used to fill in some of the blanks from free-text records. Historic Data → p(fail), E(fail date)
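Slide 24's point is that free-text records can supply causal factors the sensors miss. A minimal sketch of that idea: turn maintenance-note text into bag-of-words features that can be joined to the sensor features. The note text and column handling are invented for illustration, not the deck's text-analytics flow.

```python
# Sketch of slide 24: derive extra features from free-text maintenance notes.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer

notes = pd.Series([
    "operator reports grinding noise near main bearing",
    "routine inspection, no anomalies found",
    "oil leak observed, seal replaced",
])

vectorizer = TfidfVectorizer(stop_words="english", max_features=20)
text_features = pd.DataFrame(
    vectorizer.fit_transform(notes).toarray(),
    columns=vectorizer.get_feature_names_out(),
)
print(text_features.round(2))  # these columns join the sensor features before training
```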
  25. 25. Predictive Analytics, Valuable Insight: building models is only half the fun. Next step: OPERATIONALIZE.
  26. 26. Feed Data. APIs for: describing target data structures; describing calculations and aggregations; running analytics; exposing analytic results. Sources: REST, Historian DB, Web Service, MQTT, Other.
  27. 27. Data flows into the DB in real time: Event, Master Data, Profile, KPI.
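Slides 26–27 describe feeding events from sources such as REST, a historian DB, web services, or MQTT into an event store in real time. The sketch below uses a plain SQLite table as a stand-in for that store; the schema and the ingest_event() helper are hypothetical, not the product's API.

```python
# Sketch of real-time event ingest (slides 26-27) into a stand-in event store.
# The schema and ingest_event() helper are hypothetical, not IBM PMQ's API.
import json
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect("events.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS event (
        asset_id TEXT,
        ts TEXT,
        payload TEXT   -- raw sensor readings as JSON
    )
""")

def ingest_event(asset_id: str, readings: dict) -> None:
    """Append one event; a REST endpoint or MQTT callback would call this."""
    conn.execute(
        "INSERT INTO event (asset_id, ts, payload) VALUES (?, ?, ?)",
        (asset_id, datetime.now(timezone.utc).isoformat(), json.dumps(readings)),
    )
    conn.commit()

ingest_event("pump-17", {"vibration_level": 4.8, "temperature": 71.2})
```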
  28. 28. Predictive analytics done in real time: Event, Master Data, Profile, KPI → p(fail), E(fail date)
  29. 29. Predictive analytics done in real time: predictive outputs fed back as new events.
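Slides 28–29 score each incoming event as it arrives and write the predictive outputs back as new events. A minimal sketch of that loop, assuming a pre-trained classifier (e.g., from the earlier sketches) and the hypothetical event store above:

```python
# Sketch of slides 28-29: score each incoming event in real time and feed the
# predictive output back as a new event. `model` is a pre-trained classifier
# and ingest_event() is the hypothetical helper from the previous sketch.

def score_event(asset_id: str, readings: dict, model) -> dict:
    """Score one event and write the prediction back into the event store."""
    features = [[readings["vibration_level"], readings["temperature"]]]
    p_fail = float(model.predict_proba(features)[0, 1])

    prediction = {"type": "prediction", "p_fail": p_fail}
    ingest_event(asset_id, prediction)   # fed back as a new event (slide 29)
    return prediction
```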
  30. 30. Deciding on Recommended Actions: Event, Profile, Action, KPI.
  31. 31. Taking Action: REST, DB, Web Service, FTP, Other.
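Slides 30–31 turn the predictive outputs into a recommended action and push it out over channels such as REST, a DB, a web service, or FTP. The threshold rule, field names, and endpoint URL below are hypothetical examples of such decision logic, not the product's.

```python
# Sketch of slides 30-31: simple decision logic over the predictive outputs,
# then an outbound call. Thresholds, fields, and the URL are hypothetical.
from datetime import date
import requests

def recommend_action(p_fail: float, expected_fail_date: date, next_maintenance: date) -> dict:
    if p_fail > 0.8 and expected_fail_date < next_maintenance:
        return {"action": "URGENT_INSPECTION",
                "reason": "high risk of failure before next scheduled maintenance"}
    if expected_fail_date < next_maintenance:
        return {"action": "BRING_MAINTENANCE_FORWARD", "new_date": str(expected_fail_date)}
    return {"action": "NONE"}

action = recommend_action(0.92, date(2015, 7, 5), date(2015, 9, 1))
# Push the recommendation to a (hypothetical) work-order REST endpoint.
requests.post("https://example.com/work-orders", json=action, timeout=10)
```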
  32. 32. Valuable Insight. Build Models: 1) assemble historic data; 2) attempt to correlate historical data with a known target; 3) improve results by putting more thought into input preparation and algorithm selection. Operationalize: 1) feed raw data; 2) describe calculations and aggregations; 3) perform analytics; 4) carry out decision logic; 5) feed results; 6) retrain models regularly.
  33. 33. Questions? Michael Adendorff, Architect (STSM), IBM, IBM Predictive Maintenance and Quality. michael.adendorff@ca.ibm.com
