CMMI High Maturity Best Practices HMBP 2010: Process Performance Models: Not Necessarily Complex by Himanshu Pandey and Nishu Lohia

Process Performance Models: Not Necessarily Complex, by Himanshu Pandey and Nishu Lohia (Aricent Technologies), presented at the 1st International Colloquium on CMMI High Maturity Best Practices, held on May 21, 2010, and organized by QAI.

Presentation Transcript

  • Process Performance Models: Not Necessarily Complex – Himanshu Pandey and Nishu Lohia (Aricent Technologies)
  • Process Performance Models: Not Necessarily Complex – HMBP 2010, 21-May-2010
  • Who We Are
    • Aricent is the world's leading independent communications software company
      – Dedicated focus on communications software
      – Unmatched depth and breadth of services and products
      – Culture of innovation, excellence and results
    • More than 400 customers across the world: equipment manufacturers, device manufacturers and service providers
  • What We Offer
    • 8,000+ employees with expertise in all major communications categories
    • [Slide shows a grid of offering areas and associated technologies: Wireless (GSM, GPRS/EDGE, 3G, WCDMA, CDMA2000, WiMAX, UMA, Femtocell); Data Signaling (ISDN, SS7 and SIGTRAN); Packet Networks/VoIP (IMS, SIP, H.248, MGCP, Routers, VPN and QoS, VoWiFi, Interworking, ATM, IP, MPLS, GigE); Platform and Communication Engineering (ATCA, Network Processors); Applications (Messaging, Location Based Services, Workforce Automation, Voice Applications); Mobile Handsets (User Interface, Multimedia Applications, Physical Layer, Middleware, Multiple OS and Platforms, DSP); Billing and OSS (Multi-vendor Billing, OSS Integration, Service Activation, OSS/BSS Business Process Re-engineering); Broadband and Wireless Access (xDSL, Satellite, Cable, SONET/SDH, RPR, DWDM, 802.11a/b/g/i, WiMAX); Network Management (TL1, SNMP, CORBA, CLI); DSP, Video and Voice Processing (Transmission, Audio/Video Codecs, Network Processor Application, Microcode Design)]
  • Agenda
    • Problem Statement
    • Aricent's Way to Resolution
    • Process Performance Modelling – Overview
    • Models – Overview
      – Rayleigh's Defect Prediction Model
      – Test End Date Prediction Model
      – NHPP + Gompertz Model
    • Combinations of Models
    • Conclusion
    • Q&A
  • Problem Statement
    • CMMI Level 5 cannot be achieved without statistical process performance models in place.
    • Models available in the market are:
      – Too costly
      – Complex to understand and implement
      – Of limited availability
      – Non-customizable
    • The law of inertia applies to the software industry as well:
      – High resistance to change due to strict timelines
      – Implementation of models is seen as an overhead
  • Aricent's Way to Resolution
    • Identify generic life-cycle steps where PPMs could be applied across the maximum number of projects.
    • Identify in-house developed statistical tools that are used most frequently across the organization.
    • Identify the tools that can be used together effectively to:
      – Perform what-if analysis
      – Implement the PDCA cycle
  • Process Performance Modeling
    • The CMMI definition of a PPM: a description of the relationships among the attributes of a process and its work products, developed from historical process-performance data and calibrated using process and product measures collected from the project, which is used to predict the results to be achieved by following a process.
    • Makes quantitative predictions about a particular process.
    • May estimate resource consumption, effectiveness or efficiency, depending on the organization's goals.
  • In-house Statistical Tools
    • Rayleigh's Defect Prediction Model
    • Test End Date Prediction Model
    • NHPP + Gompertz Model
  • Rayleigh's Defect Prediction Model
    • Input to the tool:
      – Total number of detected/injected defects in past phases
      – Duration of the past and upcoming phases
    • Output from the tool:
      – Phase-wise estimated vs actual defect distribution
      – Estimate of defects passed to the customer
    • Benefits:
      – Forecast of defects passed to the customer
      – Comparison of estimated vs actual defects by phase
    • [Chart: "Phase wise & Cumulative Defect Distribution" – estimated vs actual defects, phase-wise and cumulative, across the SRS, Design, CUT, RT and PQT phases]
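
The tool itself is not shown in the deck; the following is a minimal, illustrative sketch of the underlying idea: fit a Rayleigh curve to the defects found in completed phases, then estimate the phase-wise distribution and the defects likely to escape to the customer. All phase names, durations and counts here are hypothetical, and the coarse grid-search fit stands in for whatever estimation method the actual tool uses.

```python
import math

def rayleigh_cum(t, k_total, t_peak):
    """Cumulative defects expected by time t on a Rayleigh curve with
    k_total total (injected) defects and peak discovery time t_peak."""
    return k_total * (1.0 - math.exp(-(t * t) / (2.0 * t_peak * t_peak)))

def fit_rayleigh(times, cum_defects):
    """Fit (k_total, t_peak) to cumulative defects observed at the end of each
    completed phase, with a coarse grid search to stay dependency-free."""
    best, best_err = None, float("inf")
    observed = cum_defects[-1]
    for k_total in range(observed, observed * 4):            # latent defects <= 3x observed
        for t_peak in (x / 10.0 for x in range(5, int(20 * times[-1]) + 1)):
            err = sum((rayleigh_cum(t, k_total, t_peak) - d) ** 2
                      for t, d in zip(times, cum_defects))
            if err < best_err:
                best, best_err = (k_total, t_peak), err
    return best

# Hypothetical data: five phases with their durations (weeks) and the defects
# actually found in the three phases completed so far.
phases    = ["SRS", "Design", "CUT", "RT", "PQT"]
durations = [2.0, 3.0, 4.0, 3.0, 2.0]
found     = [12, 35, 52]

phase_ends = [sum(durations[:i + 1]) for i in range(len(durations))]
cum_found  = [sum(found[:i + 1]) for i in range(len(found))]
k_total, t_peak = fit_rayleigh(phase_ends[:len(found)], cum_found)

prev = 0.0
for name, t_end in zip(phases, phase_ends):
    est = rayleigh_cum(t_end, k_total, t_peak)
    print(f"{name:6s} estimated defects: {est - prev:6.1f}")
    prev = est
print(f"Estimated defects passed to customer: {k_total - prev:.1f}")
```

In practice the fit would be recalibrated as each phase closes, which is what drives the estimated-vs-actual comparison shown on the slide.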
  • Test End Date Prediction Model – Overview
    • Performs simulation to forecast the testing end date
    • Uses EWMA control charts, which help in:
      – Observing and monitoring the variation in the current test execution rate
      – Generating results to predict the number of days needed to finish testing
    • [Chart: "Test Cases Executed Per Day" – exponentially weighted moving average of the daily execution rate with upper and lower control limits (UCL/LCL)]
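
As a rough illustration of the EWMA monitoring described above (the deck only shows the resulting chart), the sketch below computes an exponentially weighted moving average of hypothetical daily execution counts and its control limits; the smoothing constant and 3-sigma width are assumptions, not values taken from the tool.

```python
import math

def ewma_chart(daily_executions, lam=0.3, width=3.0):
    """EWMA series and control limits for daily test-execution counts;
    a point outside [LCL, UCL] signals a shift in the execution rate."""
    n = len(daily_executions)
    mu = sum(daily_executions) / n
    sigma = math.sqrt(sum((x - mu) ** 2 for x in daily_executions) / (n - 1))
    z, rows = mu, []
    for i, x in enumerate(daily_executions, start=1):
        z = lam * x + (1 - lam) * z
        half = width * sigma * math.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * i)))
        rows.append((i, x, z, mu - half, mu + half))
    return rows

# Hypothetical daily test-case execution counts.
for day, x, z, lcl, ucl in ewma_chart([32, 28, 35, 30, 22, 26]):
    flag = "" if lcl <= z <= ucl else "  <-- out of control"
    print(f"day {day}: executed={x:3d}  EWMA={z:5.1f}  LCL={lcl:5.1f}  UCL={ucl:5.1f}{flag}")
```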
  • Test End Date Prediction Model (contd.)
    • Input to the tool:
      – Total number of test cases planned for execution and actual test-case execution data
      – Revised test execution data including failed test cases
    • Output from the tool:
      – Predicted number of days remaining to finish the remaining planned test cases
    • Benefits:
      – Strategizing the remaining days in order to finish testing on time
    • [Chart: "Distribution of Possible Days to End Testing Date at 90%" – frequency distribution of simulated days to finish testing (X axis: days to finish, Y axis: frequency) across 20 simulation replicates, with the 90% line]
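
The simulation behind the "possible days to end testing" distribution is not spelled out in the deck; a plausible minimal version is to resample the observed daily execution rates over many replicates and read off the 90th percentile, as sketched below with hypothetical numbers.

```python
import random

def simulate_days_to_finish(remaining_cases, daily_rates, replicates=20, seed=1):
    """Monte Carlo sketch: resample observed daily execution rates to estimate
    how many more days are needed to run the remaining planned test cases."""
    rng = random.Random(seed)
    days_needed = []
    for _ in range(replicates):
        left, days = remaining_cases, 0
        while left > 0:
            left -= rng.choice(daily_rates)   # a day's throughput drawn from history
            days += 1
        days_needed.append(days)
    days_needed.sort()
    p90 = days_needed[int(0.9 * (len(days_needed) - 1))]
    return days_needed, p90

# Hypothetical inputs: 420 planned executions still to run (including re-runs
# of failed cases) and the daily execution counts observed so far.
days, p90 = simulate_days_to_finish(420, [32, 28, 35, 30, 22, 26])
print("simulated days to finish, per replicate:", days)
print("90th-percentile days to finish:", p90)
```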
  • NHPP + Gompertz Model – Overview
    • Performs forecasts of:
      – Reliability of the software under test, based on MTBF and the Gompertz reliability growth model
      – Expected number of failures in the remaining testing days, based on the NHPP equation
    • [Example output – current status of software reliability: date for which status is given 23-Mar-10; software reliability on that date 79.72%; 95% confidence interval from 50.94% to almost 100%]
  • NHPP + Gompertz Model (contd.)
    • Input to the tool:
      – Actual test-case execution data (each case run marked passed or failed)
      – Number of days of testing and average number of testers per day
    • Output from the tool:
      – Reliability growth pattern
      – Current reliability of the software
      – Failure forecast
    • Benefits:
      – Strategizing the remaining testing days in order to improve reliability and minimize failures passed to the customer
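
Neither the Gompertz nor the NHPP equations are written out in the deck; the sketch below shows one common formulation (a Gompertz growth curve for reliability and the Goel-Okumoto NHPP mean-value function for expected failures) evaluated with assumed, already-fitted parameters.

```python
import math

def gompertz_reliability(t, a, b, c):
    """Gompertz growth curve R(t) = a * b**(c**t); with 0 < b, c < 1 the
    reliability grows toward the asymptote a as testing time t increases."""
    return a * b ** (c ** t)

def nhpp_expected_failures(t, delta, n_total, rate):
    """Goel-Okumoto NHPP mean-value function m(t) = n_total * (1 - exp(-rate*t));
    the expected failures in the next `delta` days is m(t + delta) - m(t)."""
    m = lambda x: n_total * (1.0 - math.exp(-rate * x))
    return m(t + delta) - m(t)

# Hypothetical, already-fitted parameters; a real tool would estimate them from
# the pass/fail history (e.g. by least squares or maximum likelihood).
a, b, c = 1.0, 0.05, 0.85           # Gompertz asymptote and shape parameters
n_total, rate = 120.0, 0.06         # NHPP: eventual failures and detection rate
days_tested, days_remaining = 30, 10

print(f"reliability after {days_tested} days of testing: "
      f"{gompertz_reliability(days_tested, a, b, c):.2%}")
print(f"expected failures in the next {days_remaining} days: "
      f"{nhpp_expected_failures(days_tested, days_remaining, n_total, rate):.1f}")
```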
  • Process Performance Models (combinations)
    • NHPP-Gompertz + Test End Date Prediction
    • Rayleigh's Model + Test End Date Prediction
    • NHPP-Gompertz Model + Test End Date Prediction Model + Rayleigh's Model
  • NHPP-Gompertz + Test End Date Prediction
    • The Test End Date Prediction tool takes test execution data as input and provides the number of days remaining to finish testing.
    • The NHPP model, along with the execution data, needs to know how many more days testing will continue in order to predict failures.
    • The number of failed test cases predicted by NHPP then, in turn, acts as revised input for the Test End Date Prediction tool to re-calibrate the testing end date.
    • Multiple calibrations are performed to arrive at the most suitable strategy: the highest software reliability achievable in the least time with the fewest failures.
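
One way to read this combination is as a fixed-point iteration: the end-date predictor proposes a horizon, the NHPP forecast converts that horizon into expected failures, the failed cases are added back as re-runs, and the horizon is re-estimated until it stabilizes. The sketch below is a hypothetical reconstruction of that loop using the simplified models from the earlier sketches; none of the numbers or function shapes come from the actual tools.

```python
import math, random

# Hypothetical inputs: observed daily execution rates, the remaining planned
# executions, and assumed fitted NHPP parameters (as in the sketches above).
rng = random.Random(1)
daily_rates = [32, 28, 35, 30, 22, 26]
remaining_cases = 420
n_total, rate, days_tested = 120.0, 0.06, 30

def days_to_finish(cases):
    """Median of a small Monte Carlo estimate of the days needed to run `cases`."""
    sims = []
    for _ in range(50):
        left, d = cases, 0
        while left > 0:
            left -= rng.choice(daily_rates)
            d += 1
        sims.append(d)
    return sorted(sims)[len(sims) // 2]

m = lambda t: n_total * (1.0 - math.exp(-rate * t))   # NHPP mean-value function

# Alternate between the two predictors until the horizon stops changing:
# more days of testing -> more predicted failures -> more re-runs -> more days.
horizon = days_to_finish(remaining_cases)
for _ in range(10):
    expected_failures = m(days_tested + horizon) - m(days_tested)
    new_horizon = days_to_finish(remaining_cases + round(expected_failures))
    if new_horizon == horizon:
        break
    horizon = new_horizon

print(f"re-calibrated horizon: about {horizon} more days of testing, "
      f"with roughly {expected_failures:.0f} failures expected (to be re-run)")
```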
  • Rayleigh's Model + Test End Date Prediction
    • The Test End Date Prediction tool takes test execution data as input and provides the number of days remaining to finish testing.
    • Rayleigh's model needs to know for how many more days testing will continue in order to predict defects.
    • The number of test-case failures is derived from the number of defects predicted by Rayleigh's model using the baselined defects/failure rate, and this is used as input to the Test End Date Prediction tool for a revised number of days.
    • Multiple calibrations are performed to estimate the defect leakage to the customer and to strategize its prevention.
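
The defect-to-failure conversion described above is simple arithmetic once a baselined ratio is available; the figures below are purely illustrative.

```python
# Hypothetical figures: the Rayleigh fit predicts the defects still to be found
# in the remaining test phases, and an assumed organizational baseline converts
# defects into expected failed test cases (i.e. extra re-runs), which become the
# revised input to the end-date predictor.
predicted_defects_remaining = 45         # from the Rayleigh fit (illustrative)
baseline_failures_per_defect = 1.4       # assumed baselined TC-failures per defect
remaining_planned_cases = 380

expected_failed_cases = predicted_defects_remaining * baseline_failures_per_defect
revised_case_count = remaining_planned_cases + round(expected_failed_cases)
print(f"expected additional failed-case re-runs: {expected_failed_cases:.0f}")
print(f"revised case count for the end-date predictor: {revised_case_count}")
```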
  • NHPP-Gompertz + Test End Date Prediction + Rayleigh's Model
    • Using the Test End Date Prediction tool along with the NHPP and Rayleigh models separately has been discussed.
    • Comparing the results of the two leads to a useful analysis for verifying the defects/TC-failure rate against baseline figures.
    • Knowing the current project's performance in comparison to past projects' performance justifies our subjective confidence in how far from (or close to) our targets we are.
    • Re-calibration and verification analysis in a PDCA format helps strengthen our prevention strategies and hence achieve better results.
  • NHPP-Gompertz + Test End Date Prediction + Rayleigh's Model – Special Scenario
    • When Rayleigh's and NHPP's forecasts are not in sync:
      – Study the phase-wise defect distribution curve
      – Study the actual-value curve vs the estimated-value curve
      – Identify and analyze the phases outside our control
      – Study the coefficient of determination
      – Check the correctness of the available data
      – Check the availability of the supporting data used for the forecasts
    • Decision: choose the model for which most of the above parameters are satisfied
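
For the coefficient-of-determination check mentioned above, a small helper like the following (with hypothetical phase-wise counts) is enough to compare how well each model's estimates track the actuals.

```python
def r_squared(actual, estimated):
    """Coefficient of determination of a model's phase-wise estimates against
    the actual values; values closer to 1 indicate a better fit."""
    mean_a = sum(actual) / len(actual)
    ss_res = sum((a - e) ** 2 for a, e in zip(actual, estimated))
    ss_tot = sum((a - mean_a) ** 2 for a in actual)
    return 1.0 - ss_res / ss_tot

# Hypothetical phase-wise defect counts (SRS, Design, CUT, RT, PQT).
actual            = [12, 35, 52, 30, 14]
rayleigh_estimate = [15, 33, 49, 33, 12]
nhpp_estimate     = [10, 28, 58, 38, 20]

print("R^2, Rayleigh estimate:", round(r_squared(actual, rayleigh_estimate), 3))
print("R^2, NHPP estimate:    ", round(r_squared(actual, nhpp_estimate), 3))
```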
  • Conclusion
    • Yes! We applied these in-house developed PPMs and successfully achieved CMMI Level 5 (v1.2).
    • They have been applied on a wide range of projects.
    • The models applied, undoubtedly:
      – Have almost no cost
      – Are easy to understand and implement
      – Are easily customizable
      – Are not an overhead to implement
      – Meet little resistance to use
      – Have no constraints on availability
      – Satisfy the CMMI practices completely
  • Thank You!
  • Click here for: High Maturity Best Practices HMBP 2010 presentations organized by QAI