A practitioner’s approach to measuring the success of a centralized, independent test center


If your organization is considering the extensive undertaking of deploying a centralized, independent testing center, attend this presentation and learn how to get sponsorship and commitment from senior management, manage change and communication, and build a measurement framework to evaluate your success. Presenters Jim Foloky and Srinivas Yeluripaty will tell you how to go beyond the obvious objective of reducing costs and tackle the crucial problem of measuring cost, quality, and time to market, so that these pressures don't undermine the effectiveness of your testing center and jeopardize its success. They'll explain how, by adopting a systematic measurement framework and integrating it with the overall process framework, MetLife's IT department accurately measured the success of its test center, and they'll share best practices and results achieved over the last 18 months. You'll come away knowing how your test center can set new benchmarks and standards for centralized testing.


Transcript

  • 1. A practitioner's approach to measuring the success of a centralized, independent test center. Srinivas Yeluripaty, Senior Project Manager, Infosys Technologies Ltd.; Jim Foloky, Director, US Business Testing Services, MetLife. ©2010 Hewlett-Packard Development Company, L.P. The information contained herein is subject to change without notice.
  • 2. Knowledge Attendees Will Take Away
    • Importance of a Test Center of Excellence (TCoE)
    • Key considerations for deploying it successfully
    • Quantifying the success and value of a Test Center of Excellence
    • Strategies to link business/IT and QA goals
    • Effective utilization of a test management tool to deploy your key metrics
  • 3. Flow
    • Testing challenges and how a TCoE can address them
    • Test Center of Excellence (TCoE) deployment strategy
    • TCoE success measurement framework
    • Metrics program deployment strategy, integrated with the TCoE process deployment strategy
    • How HP Quality Center helps in deploying the metrics program
    • Results
  • 4. Testing Challenges – How a TCoE Addresses Them
    [Diagram: pre-centralization, each project (Proj A, B, C ... Z) has its own development team and embedded test team per LOB; post-centralization, development teams are served by a centralized QA organization with governance, uniform processes, and metrics, organized into LOB QA groups plus shared automation/SOA testing, knowledge and test process advisory, and tools/infrastructure functions.]
    Key characteristics of the centralized QA state:
    • Centralized structure for the QA team serving all projects, with uniform processes
    • A QA governance team defining and driving the testing strategy
    • QA operational efficiency achieved through metrics-based management
    • A well-defined QA organizational structure with a common pool of resources and a structured career path
    • Training and knowledge management for capability uplift of the QA team
    • Regression testing and automation competency centers for quality and productivity improvements
  • 5. Test Center of Excellence Deployment Strategy
    [Timeline: maturity increases across three phases: Create (transition and build centralized QA, Q1–Q2 2008), Standardize (process standardization initiatives, Q3–Q4 2008 through Q1 2009), and Improve (productivity improvement initiatives, Q1–Q4 2009).]
    Create:
    • Establish the TCoE vision, strategy, and governance structure (align teams based on LOB/product areas)
    • Form the testing team (recruitment), transition applications to centralized QA, and implement change management
    • Measure transition success and ensure business as usual (no releases are impacted)
    • Assess the current capability of the team and its processes
    Standardize:
    • Define and implement standardized, repeatable processes that can establish predictable quality
    • Train and uplift people capability
    • Implement metrics-based measurement
    Improve:
    • Baseline current capabilities using metrics
    • Benchmark against best in class
    • Implement a continuous improvement program
    • Share best practices, lessons learned, etc.
  • 6. Measurement Framework
    What are the TCoE success criteria?
    • How can I measure the "value of testing"?
    • Is a reduction in cost by XX% sufficient to prove the success of the TCoE?
    • When do we say the TCoE deployment is successful?
    Define the measurement framework's key drivers (goal oriented):
    • Strategy drivers: align the organization's objectives and goals with the TCoE's objectives and goals; build strategies and create a measurement framework to measure progress
    • Operational drivers: efficiently measure the day-to-day operations of QA through operational metrics
    Key features of the metrics framework (simple, uniform, integral, transparent):
    • The metrics framework should be simple and easy to understand
    • It must be uniform across the QA groups
    • The QA processes must be well integrated with metrics
    • Data and reporting should be transparent
    How to implement (assess, pilot, adopt, improve):
    • Assess the current state and the reporting needs of the various stakeholders: business, IT, management, QA, etc.
    • Pilot the metrics deployment by implementing a few metrics in a selected few projects
    • Adopt the framework across the QA function
    • Benchmark current capability and improve
  • 7. Measurement Framework: Objectives, Goals, Strategies, Metrics
    • "Cheaper" – Reduce QA cost
      – Goals: reduce QA cost by 25%; reduce estimation variance to +/- 8%
      – Strategies: improve productivity, estimation accuracy, reuse, automation, etc.
      – Metrics: % of QA cost, blended rate, effort adherence, etc.
    • "Better" – Improve quality
      – Goals: near-zero defect delivery; reduce the defect rejection rate to 10%
      – Strategies: measure DRE and implement defect RCA; improve the KM process
      – Metrics: DRE (defect removal efficiency), DRR (defect rejection rate)
    • "Faster" – Reduce cycle time
      – Goals: improve automation coverage in 45% of apps; faster delivery through a 14-hour test cycle time
      – Strategies: implement an ROI-driven automation strategy; leverage the Global Delivery Model
      – Metrics: schedule adherence, % automation, % reduction in test cycle
    • "Focused" – Flexible QA specialists
      – Goals: build 20% SME knowledge; ramp up the team within 2 weeks of lead time
      – Strategies: implement a core-flex model; partner with vendors
      – Metrics: % core, onboarding time
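    For reference, the slide names DRE and DRR without spelling out the formulas. The commonly used definitions (stated here as an assumption, since the deck only names the metrics) are:

        \[ \mathrm{DRE} = \frac{D_{\mathrm{test}}}{D_{\mathrm{test}} + D_{\mathrm{prod}}} \times 100\% \qquad \mathrm{DRR} = \frac{D_{\mathrm{rejected}}}{D_{\mathrm{raised}}} \times 100\% \]

    where D_test counts valid defects found in testing before release, D_prod counts defects that escape to production, D_rejected counts defects rejected as invalid, and D_raised counts all defects raised.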
  • 8. Metrics Program Deployment Strategy – Integrated with the TCoE Process Deployment Strategy
    [Roadmap, Q3 2008 through Q4 2009, in three tracks. QA thought leadership process (TCoE process & governance): Quality Center standardization, reuse initiative, regression test bed build, estimation model, uniform configuration management, risk management model, defined defect RCA and defect prevention process, defect prediction model, roles & responsibilities, communication management, onsite-offshore optimization, operating model definition. Performance management (metrics): metrics program phase 1 and phase 2, metrics portal prototype, reporting structure, quarterly/monthly AD metrics updates. Competency management (& KM): comprehensive training programs, product KM documents, competency management.]
    • The metrics program is aligned with the Test Center of Excellence deployment
    • Quality Center deployment and standardization is key to the success of the metrics program implementation
    • The metrics program is divided into two phases:
      – Phase 1: strategy metrics and key operational metrics
      – Phase 2: benchmarking performance and implementation of the remaining operational metrics
  • 9. Metrics Categorized Based on the Key Drivers
    Strategy metrics (supporting the independent QA vision):
    • Resource distribution: geographical distribution of the QA team
    • Blended rate: average unit cost (reduced cost)
    • Application spread: % of application coverage by QA
    • % of QA cost (top 5): cost effectiveness (reduced cost)
    • QA effectiveness: improvement in QA effectiveness, pre- vs. post-centralization (improved quality)
    Operational metrics (quality & productivity):
    • Effort deviation: accuracy of the estimation process (reduced cost)
    • Productivity, test planning and test execution (top 5): efficiency of the testing team (reduced cost)
    • % of test case reusability: the test case reusability factor
    • % of entry and exit criteria achieved (*) (top 5): % of test criteria met, i.e., how effectively the process performs against the standards
    • DRE (top 5): testing effectiveness of the testing process
    • Defect rejection rate: the testing group's understanding of the requirements (domain) and technology (improved quality)
    • Test case effectiveness: test coverage (improved quality)
    • Requirements stability index (*): accuracy of requirement definition
    • Code drops efficiency (*): efficiency of delivered code drops
    • Smoke testing success rate (*): quality of code drops
    • Schedule adherence: ability to deliver on time (increased agility / time to market)
    • % of automation, regression testing (top 5): overall automation coverage (increased agility)
    (*) Metrics identified to track quality gateways
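    To make the operational definitions above concrete, here is a minimal Python sketch, not from the presentation, for three of the simpler metrics; the function and parameter names are illustrative assumptions:

    ```python
    def effort_deviation(estimated_hours: float, actual_hours: float) -> float:
        """Effort deviation: % variance of actual effort from the estimate
        (slide 7 sets a goal of keeping this within +/- 8%)."""
        return 100.0 * (actual_hours - estimated_hours) / estimated_hours

    def schedule_adherence(milestones_on_time: int, milestones_planned: int) -> float:
        """Schedule adherence: % of planned milestones delivered on time."""
        if milestones_planned == 0:
            return 0.0
        return 100.0 * milestones_on_time / milestones_planned

    def automation_coverage(automated_cases: int, regression_cases: int) -> float:
        """% of automation: share of the regression suite that is automated."""
        if regression_cases == 0:
            return 0.0
        return 100.0 * automated_cases / regression_cases
    ```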
  • 10. Integration of Metrics with Project Life Cycle Stages (Requirements → Design → Build → Test → UAT & Deploy)
    • All life cycle stages: entry & exit criteria achieved, schedule adherence, effort deviation, productivity
    • Build & test phases: requirements stability index, smoke testing success rate, code drops efficiency
    • Test phase: % of test case reusability, test case effectiveness, defect rejection rate
    • Test, UAT, and production: defect removal efficiency
  • 11. Defining a Uniform and Transparent Metrics Data Source Is Key
    Metrics data sources feeding metrics generation:
    • QA process data (effort, schedule, entry/exit criteria, etc.)
    • Effort data (Infosys DART, FTE effort tracker)
    • Defect data (HP Quality Center)
    • Financial data (Business Management Office, AD)
    • Industry benchmarks or best-in-class benchmarks
    Communication model:
    • QA communication to AD senior management (quarterly; AD directors, VPs, and managers; chair: QA director): communicate QA updates, the latest initiatives, and metrics from QA, and take feedback from AD; helps reach the entire AD management in one common forum
    • QA managers' meeting with AD (monthly; AD directors, managers/teams; chair: QA managers): monthly QA updates and metrics; discussion of initiatives, DRE, defect prevention strategies, etc.; financial updates
    • Monthly metrics review (monthly; offshore and onshore management; chair: metrics team): review metrics and perform trend analysis; measure key improvements
  • 12. HP Quality Center (QC) Standardization Helping the Metrics Deployment Succeed
    • The strategy of adopting HP Quality Center as the test management tool and standardizing it across testing projects has proven to accelerate deployment of the metrics program
    • HP QC helped in:
      – Building consistency: implementing uniform defect management
      – Creating transparency: making data collection and reporting transparent
      – Aiding simplification: simplifying metrics data collection using automated queries
      – Integrating metrics with process deployment
    • HP QC is effectively used to generate the following metrics:
      – Defect removal efficiency
      – Defect rejection rate
      – QA effectiveness
      – Test case effectiveness
      – Test case reusability %
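    The deck does not show the automated queries themselves. Purely as a sketch of the idea, the snippet below computes two of the listed metrics from a hypothetical CSV export of QC defect records; the file name and column names are assumptions, not QC's actual schema:

    ```python
    import csv

    def load_defects(path):
        """Read defect records from a CSV export of Quality Center.
        Assumed columns: 'detected_in_phase' and 'status'."""
        with open(path, newline="") as f:
            return list(csv.DictReader(f))

    def defect_removal_efficiency(defects):
        """DRE: % of valid defects caught before production."""
        valid = [d for d in defects if d["status"] != "Rejected"]
        if not valid:
            return 0.0
        pre_prod = sum(1 for d in valid if d["detected_in_phase"] != "Production")
        return 100.0 * pre_prod / len(valid)

    def defect_rejection_rate(defects):
        """DRR: % of raised defects later rejected as invalid."""
        if not defects:
            return 0.0
        rejected = sum(1 for d in defects if d["status"] == "Rejected")
        return 100.0 * rejected / len(defects)

    if __name__ == "__main__":
        records = load_defects("qc_defect_export.csv")  # hypothetical export
        print(f"DRE: {defect_removal_efficiency(records):.1f}%")
        print(f"DRR: {defect_rejection_rate(records):.1f}%")
    ```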
  • 13. Reuse Tracking with Quality Center
    [Flow: business/functional/technical requirements and test scenarios feed test case preparation. Release-specific test cases are prepared; reusable test cases are identified and uploaded to a versioned "Reusable" suite in Quality Center; for each release, reusable test cases are pulled from the reusable suite into the release folder in the QC Test Plan, and release test cases are executed from the Test Lab.]
    Result: improved reuse from 0% to 10%
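    The slide reports the reuse result but not how the percentage is computed. One plausible reading (an assumption, not stated in the deck) is reused cases as a share of a release's total test cases:

    ```python
    def reuse_percentage(release_case_ids, reusable_suite_ids):
        """% of a release's test cases sourced from the reusable QC suite.
        Hypothetical definition; the deck reports only the 0% -> 10% result."""
        release = set(release_case_ids)
        if not release:
            return 0.0
        reused = len(release & set(reusable_suite_ids))
        return 100.0 * reused / len(release)

    # Example: 10 of 100 release cases come from the reusable suite -> 10.0
    print(reuse_percentage(range(100), range(90, 120)))
    ```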
  • 14. Results 2009
    Doing 20% more:
    [Charts: test case delivery, 2008 vs. 2009: 38,549 test cases in 2008; in 2009, 13,441 test cases were delivered beyond the 66,536 expected at 2008 productivity, roughly a 20% productivity improvement (13,441 / 66,536 ≈ 20%). Projects grew 34%, application coverage grew 26%, and effort fell 14%.]
    With less cost:
    [Charts: onsite-offshore consultant mix at end of year (2008 vs. 2009) and % of QC cost over IT cost (2007–2009), both trending down.]
    With improved quality:
    [Charts: DRE (yearly average) rising over 2007–2009, with the IT DRE and pre-production series spanning 75.5% to 97.8%; defect rejection rate for code defects falling from 22% (2007) and 21% (2008) to 9% (2009).]
  • 15. Q&A “The contents of this document are proprietary and confidential to Infosys Technologies Ltd. and may not be disclosed in whole or in part at any time, to any third party without the prior written consent of Infosys Technologies Ltd.” “© 2010 Infosys Technologies Ltd. All rights reserved. Copyright in the whole and any part of this document belongs to Infosys Technologies Ltd. This work may not be used, sold, transferred, adapted, abridged, copied or reproduced in whole or in part, in any manner or form, or in any media, without the prior written consent of Infosys Technologies Ltd.”
  • 16. About the authors
    Jim Foloky (jfoloky@metlife.com) is a QA Director with MetLife and has more than 19 years of IT experience within the insurance industry. He has held positions as a business analyst and an application development manager, and has handled multiple initiatives, including development, conversion, and migration projects. He took on responsibility for a centralized and independent test center about 20 months ago and helped successfully implement a centralized testing team and measurement framework that has proven beneficial to both IT and the business. He led the initiative by establishing the vision, goals, and direction, defining metrics, continuously tracking performance, and achieving results.
    Srinivas Yeluripaty (Srinivas_Yeluripaty@infosys.com) is a senior project manager with Infosys Technologies Ltd. and a testing centralization strategist with more than 11 years of experience in verification and validation services. Srini is PMP certified and a two-time certified Six Sigma Black Belt for testing productivity and business results improvement. Over the last five years he has been involved in Test Center of Excellence strategy development and implementation, and in consulting on building independent, centralized QC functions, setting goals, direction, and execution for large US corporations in the banking, financial services, and insurance testing space.
  • 17. Thank You “The contents of this document are proprietary and confidential to Infosys Technologies Ltd. and may not be disclosed in whole or in part at any time, to any third party without the prior written consent of Infosys Technologies Ltd.” “© 2010 Infosys Technologies Ltd. All rights reserved. Copyright in the whole and any part of this document belongs to Infosys Technologies Ltd. This work may not be used, sold, transferred, adapted, abridged, copied or reproduced in whole or in part, in any manner or form, or in any media, without the prior written consent of Infosys Technologies Ltd.”
  • 18. To learn more on this topic, and to connect with your peers after the conference, visit the HP Software Solutions Community: www.hp.com/go/swcommunity ©2010 Hewlett-Packard Development Company, L.P.