Agile Testing: Metrics that Matter
 

In an Agile project, what testing metrics should you care about? How do you best collect these metrics so they are meaningful?

Agile projects impose unique challenges compared to traditional methodologies because of the key tenets of the Agile Manifesto. With rapid execution cycles and a preference for working software over documentation, it may seem that effectively measuring testing on an Agile project is prohibitively difficult.

During this webinar, we will present an overview of the metrics that should be captured specifically in an Agile project to provide the business with a valuable snapshot of the quality of a release.

Presentation Transcript

  • Metrics that Matter: Measuring Quality in Agile Projects
    Sreekanth Singaraju, VP
    Sharon Lee, Director of Marketing

  • Agenda
    – Agile Overview
    – Testing in Agile
    – Importance of Metrics
    – Key Metrics: Test and Execution Coverage, Defect Identification Efficiency, Technical Debt, Test Confidence Level
    – Tools

  • Why Organizations Use Agile
    – Develop iteratively with business needs
    – Deliver working software rapidly
    – Flexibility to change with business needs
    – Incorporate end-user feedback
    – Highly productive technical teams
    Deliver what end-users really want in an iterative fashion.

  • Role of Testing in Agile
    What product or service does testing provide? "Tested, debugged software"? Wrong. Testing provides information about the status of the software under development so that informed decisions can be made.
    – Testing may be done by people who aren't called "QA" or "Tester", e.g. Business Analysts.
    – Some teams may use developers for system or acceptance testing.
    – Dedicated testers bring two benefits:
      – Focus on customer usage over technical implementation
      – Focus on uncovering flaws over confirming completeness

  • Testing in Agile Projects
    Agile Testing Quadrants provide a concise way to communicate the different layers of testing needed and how they can be executed.

  • Testing in Agile Projects: Unique Challenges
    – Testing is time-boxed, like development
    – Unlike development, the functionality to be tested is not limited
    – It is difficult to achieve a "fully tested" status
    – Defects found do not, by themselves, reflect test results
    – Testing is spread across Dev and Test teams

  • Metrics Objectives

  • Metrics Objectives
    For business stakeholders, provide:
    – All the information they need to make decisions, and no more
    – Information at the level of detail they can use
    – Information at the scope they care about (team, project, program, line of business, enterprise, industry)
    – Information pertaining to the time frame they care about (day, iteration, release, project, strategic milestone)

  • Metrics Objectives
    Communicate the following aspects:
    – The level of testing that was accomplished in a sprint, release, etc.
    – The efficiency with which defects have been identified in different testing stages
    – The cost of trade-offs being made in each sprint
    – The overall confidence level in the testing efforts

  • Coverage Metrics
    Principle: Communicate to all stakeholders the coverage provided by formal testing.
    Dimensions:
    – Use validation points to evaluate the application's compliance with requirements
    – Weight them by business criticality
    Definition (see the sketch after this slide):
    – Designed Coverage = Designed Validation Points / All Validation Points (Estimated)
    – Executed Coverage = Executed Validation Points / All Validation Points (Estimated)
    Best Practices:
    – Keep coverage metrics simple
    – Introduce granularity to address business needs
    – Trend to show progress (or declines – a great case for additional resources)

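To make the two coverage ratios concrete, here is a minimal Python sketch that assumes validation points are tracked as simple records with a business-criticality weight and designed/executed flags; the ValidationPoint structure and the sample data are illustrative assumptions, not part of the presentation.

```python
# Weighted designed/executed coverage over validation points.
# The record layout, weights, and sample data are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class ValidationPoint:
    name: str
    weight: float    # business-criticality weight (e.g. 1.0 normal, 3.0 critical)
    designed: bool   # a test case has been designed for this point
    executed: bool   # that test case has actually been run

def coverage(points, selector):
    """Weighted share of validation points matching `selector`."""
    total = sum(p.weight for p in points)
    covered = sum(p.weight for p in points if selector(p))
    return covered / total if total else 0.0

points = [
    ValidationPoint("login succeeds", 3.0, designed=True, executed=True),
    ValidationPoint("password reset", 1.0, designed=True, executed=False),
    ValidationPoint("audit log entry", 1.0, designed=False, executed=False),
]

print(f"Designed coverage: {coverage(points, lambda p: p.designed):.0%}")  # 80%
print(f"Executed coverage: {coverage(points, lambda p: p.executed):.0%}")  # 60%
```

Trending these two numbers sprint over sprint, rather than reporting a single snapshot, follows the best practice noted above.
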
  • Defect Identification Efficiency
    Principle: Track where defects are identified and created, and evaluate compliance with Agile principles.
    Dimensions:
    – Number of defects identified in different stages of the software lifecycle
    – Number of defects created in different stages of the software lifecycle
    – Use key software stages to draw valuable inferences
    Definition: The proportion of defects identified by each stage of the lifecycle, e.g. Testing 56%, UAT 18%, Production 4%, Other 22% (see the sketch after this slide).
    Best Practices:
    – Start with defects identified first, then introduce defects created
    – Use "Other" to catch exceptions and introduce granularity incrementally
    – Factor in defect severity if that information is readily available

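A similarly small sketch for defect identification efficiency, assuming each defect record carries the lifecycle stage in which it was found; the stage names and counts below simply reproduce the example proportions from the slide and are not real project data.

```python
# Proportion of defects identified per lifecycle stage.
# Stage names and counts are illustrative, chosen to match the slide's example.
from collections import Counter

defect_stages = (
    ["Testing"] * 56 + ["UAT"] * 18 + ["Production"] * 4 + ["Other"] * 22
)

counts = Counter(defect_stages)
total = sum(counts.values())

for stage, count in counts.most_common():
    print(f"{stage:<10} {count / total:.0%} of defects identified")
```
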
  • Technical Debt
    Principle: Provide the financial impact on the overall product of the technology, process, and implementation choices made during development.
    Dimensions:
    – Quality of the software code
    – Unresolved defects that are business- or quality-critical
    – Impact of technical or architectural choices
    Definition: Cost estimates for redressing the most critical (from a business perspective) defects or technical choices (see the sketch after this slide).
    Best Practices:
    – Explosive: handle and communicate with care
    – Be conservative: use very strong criteria to evaluate "criticality"
    – Use the many available tools to evaluate software code quality

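One possible way to turn that definition into a number is to sum remediation estimates for only those items the business has ranked as critical; the hourly rate, criticality scale, and backlog entries below are assumptions for illustration only.

```python
# Technical debt as the estimated cost of redressing business-critical items.
# Rate, threshold, and backlog entries are illustrative assumptions.
HOURLY_RATE = 95.0           # blended cost of one remediation hour (assumed)
CRITICALITY_THRESHOLD = 4    # only items the business ranks 4-5 are counted

backlog = [
    {"item": "unpatched auth library", "criticality": 5, "hours": 40},
    {"item": "hand-rolled CSV parser", "criticality": 2, "hours": 16},
    {"item": "missing DB failover",    "criticality": 4, "hours": 80},
]

technical_debt = sum(
    entry["hours"] * HOURLY_RATE
    for entry in backlog
    if entry["criticality"] >= CRITICALITY_THRESHOLD
)
print(f"Technical debt (critical items only): ${technical_debt:,.0f}")  # $11,400
```
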
  • Confidence Level
    Principle: Demonstrate the confidence of the testing team in the testing efforts executed on a release, given the resources provided for testing.
    Dimensions:
    – Test execution coverage
    – Expertise of the resources testing the application
    – Quantitative defect results, including defect identification efficiency
    Definition: Rank each of the three dimensions above on a scale of 0 (poor) to 1 (high). Overall confidence = (3 × Coverage × Resources × Results) / (Coverage + Resources + Results) (see the sketch after this slide).
    Best Practices:
    – Be prepared with supporting facts
    – Define clearly laid-out criteria for ranking

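The confidence formula translates directly into code; this sketch implements it exactly as stated on the slide, and the example rankings are illustrative assumptions.

```python
# Overall confidence as defined on the slide:
# (3 * coverage * resources * results) / (coverage + resources + results),
# with each input ranked between 0 (poor) and 1 (high).
def overall_confidence(coverage: float, resources: float, results: float) -> float:
    denominator = coverage + resources + results
    if denominator == 0:
        return 0.0
    return (3 * coverage * resources * results) / denominator

# Example rankings (assumed): good coverage, adequate resources, strong results.
print(round(overall_confidence(coverage=0.8, resources=0.7, results=0.9), 2))  # 0.63
```
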
  • Key Success Drivers
    – Alignment with ultimate stakeholders' goals
    – Metrics won't be perfect, so don't wait for them to be
    – Keep them simple
    – Collaborate with other functions
    – Trending is more valuable than snapshots
    These are the key best practices for succeeding at creating metrics.

  • Tools
    Quality Management Systems: HP Quality Center, Silk Central, Rally
    Automation Tools: QTP, Selenium, Silk
    Code Quality Analysis: CAST AIP, Coverity, Radar

  • Questions & Discussion
    www.allianceglobalservices.com

  • Thank You
    For more information, email us at: info@allianceglobalservices.com