©2015 InfoStretch Corporation. All rights reserved.
Liana Gevorgyan | May 6, 2015
Measuring
Quality
QA Metrics and Trends in Practice
US - UK - India
SECTION 1: TECHNOLOGY IN LIFE
Bugs Are Costly
1999
Mars Climate Orbiter Crash
Instead of using the specified metric units for navigation, the contractor carried out measurements in imperial units, and the spacecraft crashed into Mars.
COST
$135 Million
1996
ARIANE Failure
The Ariane 5 rocket exploded 36.7 seconds after takeoff. The rocket was much faster than previous models, but its navigation software carried a bug that went unnoticed.
COST
>$370 Million
2003
EDS Fails Child Support
EDS created an IT system
for a Child Support
Agency in the UK that had
many software
incompatibility errors.
COST
$1.1 Billion
2013
NASDAQ Trading Shutdown
On August 22, 2013, the NASDAQ stock market shut down trading for three hours because of a computer error.
COST
$2 Billion
1985-1987
Therac-25 Medical Accelerator
A software failure caused incorrect X-ray dosages, hundreds or thousands of times greater than normal, resulting in death or serious injury.
COST
5 Human Lives
Technology In Our Daily Life
Average usage of electronic systems in developed countries:
 One PC or desktop in each home
 80% of people use mobile phones
 40% of people drive cars with various electronic systems
 People travel by train or plane on average once a year
 Dozens of other embedded systems in our homes
 Dozens of software programs in our workplaces and service systems
The quality of all these systems is equal to the quality of life!
SECTION 2: DEFINING THE “WHAT”
Known QA Metrics & Trends
Defining “What”
 Metrics and Trends
 Measure to Understand
 Understand to Control
 Control to Improve
Several Known QA Metrics and Trends
 Manual & automation time ratio during regression cycle
 Scripts maintenance time during delivery iteration
 Daily test cases manual execution
 Automation effectiveness for issues identification
 Issues found per area during regression
 Areas impacted after new features integration
 Issues identification behavior after major refactoring
 Software process timetable metrics
 Delivery process productivity metrics
 Software system availability metrics
 Test cases coverage
 Automation coverage
 Defined issues based on gap analysis
 Ambiguities per requirement
 Identified issues by criticality
 Identified issues by area separation
 Issues resolution turnaround time
 Backlog growth speed
 Release patching tendency and costs
 Customer escalations by Blocker/Critical/Major issues per release
 QA engineer performance
 Continuous integration efficiency
Metrics Classification
PRODUCT METRICS
PROCESS METRICS
QA METRICS
Metrics Examples by Classification

PROCESS METRICS
 Delivery process productivity metrics
 Continuous integration efficiency
 Release patching tendency and costs
 Backlog growth speed
 QA engineer performance
 Software process timetable metrics

PRODUCT METRICS
 Software system stability metrics
 Identified issues by criticality
 Identified issues by area separation
 Customer escalations by Blocker/Critical/Major issues per release
 Ambiguities per requirement
 Backlog growth speed
Sample Metrics Visual

AUTOMATION COVERAGE (pie chart): Automated UI and BE 55%, Automated UI 20%, In Progress 15%, Pending Automation 7%, Not Feasible 3%

BUGS BY SEVERITY (pie chart): Blocker 3%, Critical 5%, High 12%, Medium 34%, Low 45%
Visual Depiction Of Sample Trends

ISSUE ESCALATIONS BY CRITICALITY - MONTHLY TREND (bar chart by severity: Blocker, Critical, High, Medium, Low; scale 0-40)

REJECTED BUGS % PER WEEK (line chart over 6 weeks; scale 0-5%)
Expectations
 Smooth releases
 Predefined risks with mitigation plans
 Nice feedback and appreciation
 Top notch and innovative products
Real Life
 Delivery is not always ideal
 We are all familiar with patching a release
 Lack of process tracking data for analysis
 Experimental delivery models that are not exactly best-practice models
SECTION 3: DELIVERY
PROCESSES & METRICS
Waterfall & Agile
Waterfall Process
REQUIREMENTS
VALIDATION
ARCHITECTURE
VERIFICATION
MODULE DESIGN
VERIFICATION
IMPLEMENTATION
SYSTEM TEST
OPERATIONS AND
MAINTENANCE
REVALIDATION
Agile Process
TESTING/VALIDATION/VERIFICATION
Product Backlog
 Client-prioritized product features
Sprint Backlog
 Features assigned to Sprint
 Estimated by team
 Team commitment
Time-Boxed Test/Develop
Scrum Meetings every 24 hours
Working Code Ready for DEPLOYMENT
Agile Process Metrics

SCRUM TEAM
 Scrum Team’s Understanding of Sprint Scope and Goal
 Scrum Team’s Adherence to Scrum Rules & Engineering Practices
 Scrum Team’s Communication
 Retrospective Process Improvement
 Team Enthusiasm

SPRINT METRICS
 Quality Delivered to Customer
 Team Velocity
 Technical Debt Management
 Actual Stories Completed vs. Planned
Processes Are Not Always Best Practices
 Unique way of Agile
 Transition from Waterfall to Agile
 Transition from Agile to Kanban
Metrics Set Definition for Your Project
 Process
 Technology
 Iterations
 Project/Team Size
 Goal
SECTION 4: WEIGHT BASED ANALYSIS
FOR QA METRICS & MEASUREMENTS
Mapping With Graph Theory
Metrics and Trends for Your Project
You are watching a set of metrics and trends, but are they the right ones?
Trends are in an acceptable range, but the product's quality is not improving?
You try to improve one metric, and another goes down?
How do you analyze and fix it?
Mapping QA Metrics Into Graph Theory
Process Metrics > A = Metric 1, B = Metric 2, …
Product Metrics > C = Metric 3, D = Metric 4, …
Actions/data sets that affect metrics > A1, A2, …
Edges show metric dependencies on specific actions
Preconditions & Definitions for the Metrics & Actions Mapped Model
 A node's initial weight is predefined and has a value from 1 to 10
 An edge's weight is predefined and has a value from 1 to 10
 Connections between nodes are defined based on the dependencies of metrics on each other and on actions
 All actions have a fixed weight of 1
Initial Metrics Model & Dependencies
Assume the current metric set is:
 2 Process Metrics -> M1, M2
 2 Product Metrics -> M3, M4
Where:
 M1 depends on M3
 M1 depends on M4
 M2 depends on M3
There are 3 actions or data sets that affect some of the metrics: A1, A2, A3
Where:
 M1 depends on A1 and A2
 M4 depends on A3
Initial priority based on best practices:
W(M1) = 5, W(M2) = 4, W(M3) = 3, W(M4) = 2
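The model above can be sketched as plain data; this is a minimal illustration, assuming a simple dict-and-list representation (the node names come from the slide, the helper function is illustrative):

```python
# Metric nodes with their initial (best-practice) weights, on the 1-10 scale.
metric_weights = {"M1": 5, "M2": 4, "M3": 3, "M4": 2}

# Action nodes carry a fixed weight of 1.
action_weights = {"A1": 1, "A2": 1, "A3": 1}

# Undirected dependency edges listed on the slide:
# M1-M3, M1-M4, M2-M3, plus action edges M1-A1, M1-A2, M4-A3.
edges = [("M1", "M3"), ("M1", "M4"), ("M2", "M3"),
         ("M1", "A1"), ("M1", "A2"), ("M4", "A3")]

def neighbors(node):
    """All nodes connected to `node` by a dependency edge."""
    return [b for a, b in edges if a == node] + [a for a, b in edges if b == node]

print(sorted(neighbors("M1")))  # ['A1', 'A2', 'M3', 'M4']
```

With the dependencies encoded once, each metric's neighborhood (the inputs to its priority calculation later in the deck) can be read off mechanically.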
Metrics Visualization via Graph

(Graph: metric nodes M1 (weight 5), M2 (4), M3 (3), M4 (2) connected to each other and to action nodes A1, A2, A3 according to the dependencies above.)
Weight Assignment on Undirected Graph

(Graph: the same nodes, now with weights between 1 and 6 assigned to the dependency edges.)
Calculation Formula for Metrics New Priority
 The priority of a node A is calculated as:

P(A) = W(A) × Σ w(e), summed over all edges e incident to A

where
 W(A) - the node weight assigned by the user (also the node's initial priority)
 Σ w(e) - the cumulative weight of the node's edges
New Priority Calculations for One Metric

(Subgraph: M2 (weight 4) connected to M3 by an edge of weight 2, and to A1 and A2 by edges of weight 1.)

Initial priority: M2 = W(M2) = 4
New priority: M2 = W(M2) × (W(M2-A1) + W(M2-A2) + W(M2-M3)) = 4 × (1 + 1 + 2) = 16
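The formula and this worked example can be checked with a few lines of Python (the function name is illustrative; the edge weights are the ones from the slide):

```python
def new_priority(node_weight, edge_weights):
    """New priority = node weight times the cumulative weight of its edges."""
    return node_weight * sum(edge_weights)

# M2 has node weight 4 and incident edges M2-A1 (1), M2-A2 (1), M2-M3 (2).
print(new_priority(4, [1, 1, 2]))  # 16
```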
New Priority Calculations for the Graph

CALCULATIONS
 M1: W = 5, edges to M3 (3) and M4 (5) -> 5 × (3 + 5) = 40
 M2: W = 4, edges to M3 (2), A1 (1), A2 (1) -> 4 × (2 + 1 + 1) = 16
 M3: W = 3, edges to M1 (3), M2 (2), and one edge of weight 6 -> 3 × (3 + 2 + 6) = 33
 M4: W = 2, edge to M1 (5) -> 2 × 5 = 10

Metrics by initial priority: M1, M2, M3, M4
Metrics by new priority: M1, M3, M2, M4
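The whole table can be reproduced with a short script; this is a sketch assuming the per-node incident edge weights read off the slide's calculation table (the dict layout is illustrative):

```python
# (node weight, incident edge weights) per metric, from the calculation table.
nodes = {
    "M1": (5, [3, 5]),
    "M2": (4, [2, 1, 1]),
    "M3": (3, [3, 2, 6]),
    "M4": (2, [5]),
}

# New priority = node weight x cumulative edge weight.
priorities = {name: w * sum(edges) for name, (w, edges) in nodes.items()}
print(priorities)  # {'M1': 40, 'M2': 16, 'M3': 33, 'M4': 10}

# Ranking by new priority reproduces the slide's ordering: M1, M3, M2, M4.
ranking = sorted(priorities, key=priorities.get, reverse=True)
print(ranking)  # ['M1', 'M3', 'M2', 'M4']
```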
Metrics Priorities: Current vs. Calculated

INITIAL PRIORITY (based on best practices): M1, M2, M3, M4
NEW PRIORITY (project-dependent, calculated): M1, M3, M2, M4
SECTION 5: METRICS WEIGHT
BASED ANALYSIS IN PRACTICE
Defining “How”
Metrics Definition for the Test Project
 Process – Agile with area ownership
 Technology – SaaS-based enterprise web & mobile app
 Iteration – 2 weeks
 Project Size – 5 Scrum teams
 Goal – customer satisfaction; no Blocker/Critical issues escalated by the customer
Key Metrics and Dependencies

Metrics
 M1 – Customer Escalations per defect severity – Product Metric
 M2 – Opened Valid Defects per Area – Product Metric
 M3 – Rejected Defects – Process Metric
 M4 – Test cases Coverage – Process Metric
 M5 – Automation Coverage – Process Metric
 M6 – Defect fixes per Criticality – Product Metric

Actions and Data Sets
 A1 – Customer types per investment and escalations per severity
 A2 – Most buggy areas
Metrics Initial Priority
Weight assignment and dependency analysis

Predefined node weights:
 M1 – Customer Escalations per defect severity: 8
 M2 – Opened Valid Defects per Area: 5
 M3 – Rejected Defects: 4
 M4 – Test cases Coverage: 3
 M5 – Automation Coverage: 2
 M6 – Defect fixes per Criticality per Team: 6

Metrics by initial priority: M1, M6, M2, M3, M4, M5

Incident edge weights per node (from the dependency matrix):
 M1 (W = 8): 2, 6, 4, 5, 2
 M2 (W = 5): 2, 1, 3
 M3 (W = 4): 1, 3
 M4 (W = 3): 6, 3, 3, 2
 M5 (W = 2): 2
 M6 (W = 6): 4, 2
Graph Creation

(Graph: metric nodes M1-M6 with weights 8, 5, 4, 3, 2, 6 and action nodes A1, A2, connected by weighted dependency edges.)
Calculations and Metrics Prioritization

 M1: 8 × (2 + 6 + 4 + 5 + 2) = 152
 M2: 5 × (2 + 1 + 3) = 30
 M3: 4 × (1 + 3) = 16
 M4: 3 × (6 + 3 + 3 + 2) = 42
 M5: 2 × 2 = 4
 M6: 6 × (4 + 2) = 36

Initial priority: M1, M6, M2, M3, M4, M5
Calculated priority: M1, M3, M6, M2, M4, M5
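The same calculation reproduces this table, assuming the incident edge weights read off the slide's dependency matrix (the dict layout is illustrative):

```python
# (node weight, incident edge weights) per metric, from the slide.
metrics = {
    "M1": (8, [2, 6, 4, 5, 2]),
    "M2": (5, [2, 1, 3]),
    "M3": (4, [1, 3]),
    "M4": (3, [6, 3, 3, 2]),
    "M5": (2, [2]),
    "M6": (6, [4, 2]),
}

# New priority = node weight x cumulative edge weight.
priorities = {m: w * sum(e) for m, (w, e) in metrics.items()}
print(priorities)
# {'M1': 152, 'M2': 30, 'M3': 16, 'M4': 42, 'M5': 4, 'M6': 36}
```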
Key Metric Changes & Improvement Plans

Metrics by calculated priority: M1 – Customer Escalations per defect severity; M3 – Rejected Defects; M6 – Defect fixes per Criticality per Team; M2 – Opened Valid Defects per Area; M4 – Test cases Coverage; M5 – Automation Coverage

 Group defects by severity and by customer investment to understand the real picture: 1,000 minor issues can cost more than one high-severity issue.
 Conduct trainings to lower defect rejection, so developers do not spend extra time analyzing invalid issues.
 Make sure defect fixes proceed in parallel with new feature development in each sprint.
 Continuously update test cases after each new issue to maintain good coverage.
 Automate as much as possible to cut costs and increase coverage.
Monitoring of Trend-Based Priority Metrics Based on Process Changes

(Line chart, January-April:)
 M1: 60, 70, 68, 75
 M3: 20, 25, 29, 34
 M6: 70, 80, 82, 78
Let the Challenge Begin… & Have FUN
Thank You
Global Footprint
About Us
A leading provider of next-gen mobile application lifecycle services, ranging from design and development to testing and sustenance.
Locations
Corporate HQ: Silicon Valley
Offices: Conshohocken (PA), Ahmedabad (India), Pune (India), London (UK)
InfoStretch Corporation
References
 Narsingh Deo, Graph Theory with Applications to Engineering and Computer Science, Prentice Hall, 1974.
 A.A. Shariff K, M.A. Hussain, and S. Kumar, "Leveraging unstructured data into intelligent information – analysis and evaluation," Int. Conf. Information and Network Technology, IPCSIT, vol. 4, IACSIT Press, Singapore, pp. 153-157, 2011.
 http://en.wikipedia.org/wiki/List_of_software_bugs
 http://www.starshipmodeler.com/real/vh_ari52.htm
 http://news.nationalgeographic.com/news/2011/11/pictures/111123-mars-nasa-rover-curiosity-russia-phobos-lost-curse-space-pictures/
 http://www.bloomberg.com/news/articles/2013-08-22/nasdaq-shuts-trading-for-three-hours-in-latest-computer-error
Q & A
Liana Gevorgyan
Sr. QA Manager
InfoStretch Corporation Inc.
liana.gevorgyan@infostretch.com
www.linkedin.com/in/lianag/en
Info@infostretch.com
www.infostretch.com

More Related Content

What's hot

Software quality metrics
Software quality metricsSoftware quality metrics
Software quality metrics
Sandeep Supal
 
Measurement systems analysis v1.1
Measurement systems analysis v1.1Measurement systems analysis v1.1
Measurement systems analysis v1.1Alexander Polyakov
 
Testing metrics
Testing metricsTesting metrics
Testing metricsprats12345
 
Attribute MSA
Attribute MSAAttribute MSA
Attribute MSA
dishashah4993
 
Attribute MSA presentation
Attribute MSA presentationAttribute MSA presentation
Attribute MSA presentation
PRASHANT KSHIRSAGAR
 
A lean model based outlook on cost & quality optimization in software projects
A lean model based outlook on cost & quality optimization in software projectsA lean model based outlook on cost & quality optimization in software projects
A lean model based outlook on cost & quality optimization in software projects
Sonata Software
 
Test Metrics
Test MetricsTest Metrics
Test Metrics
Devukjs
 
Maximo Oil and Gas 7.6.1 HSE: Permit to Work Overview
Maximo Oil and Gas 7.6.1 HSE: Permit to Work OverviewMaximo Oil and Gas 7.6.1 HSE: Permit to Work Overview
Maximo Oil and Gas 7.6.1 HSE: Permit to Work Overview
Helen Fisher
 
Arc flash August 2012 IE Aust JEEP
Arc flash  August 2012   IE Aust JEEPArc flash  August 2012   IE Aust JEEP
Arc flash August 2012 IE Aust JEEPEngineers Australia
 
Unit I Software Testing and Quality Assurance
Unit I Software Testing and Quality AssuranceUnit I Software Testing and Quality Assurance
Unit I Software Testing and Quality Assurance
VinothkumaR Ramu
 
Software Test Metrics and Measurements
Software Test Metrics and MeasurementsSoftware Test Metrics and Measurements
Software Test Metrics and Measurements
Davis Thomas
 
Autonomous control of a thermal distortion tester
Autonomous control of a thermal distortion testerAutonomous control of a thermal distortion tester
Autonomous control of a thermal distortion tester
Michael Sallmen
 
Measurement System Analysis
Measurement System AnalysisMeasurement System Analysis
Measurement System Analysis
Ronald Shewchuk
 

What's hot (18)

Software quality metrics
Software quality metricsSoftware quality metrics
Software quality metrics
 
Measurement systems analysis v1.1
Measurement systems analysis v1.1Measurement systems analysis v1.1
Measurement systems analysis v1.1
 
Testing metrics
Testing metricsTesting metrics
Testing metrics
 
Metrics used in testing
Metrics used in testingMetrics used in testing
Metrics used in testing
 
Attribute MSA
Attribute MSAAttribute MSA
Attribute MSA
 
Attribute MSA presentation
Attribute MSA presentationAttribute MSA presentation
Attribute MSA presentation
 
A lean model based outlook on cost & quality optimization in software projects
A lean model based outlook on cost & quality optimization in software projectsA lean model based outlook on cost & quality optimization in software projects
A lean model based outlook on cost & quality optimization in software projects
 
Test Metrics
Test MetricsTest Metrics
Test Metrics
 
Msa 5 day
Msa 5 dayMsa 5 day
Msa 5 day
 
Msa presentation
Msa presentationMsa presentation
Msa presentation
 
Maximo Oil and Gas 7.6.1 HSE: Permit to Work Overview
Maximo Oil and Gas 7.6.1 HSE: Permit to Work OverviewMaximo Oil and Gas 7.6.1 HSE: Permit to Work Overview
Maximo Oil and Gas 7.6.1 HSE: Permit to Work Overview
 
Lecture08
Lecture08Lecture08
Lecture08
 
Arc flash August 2012 IE Aust JEEP
Arc flash  August 2012   IE Aust JEEPArc flash  August 2012   IE Aust JEEP
Arc flash August 2012 IE Aust JEEP
 
Unit I Software Testing and Quality Assurance
Unit I Software Testing and Quality AssuranceUnit I Software Testing and Quality Assurance
Unit I Software Testing and Quality Assurance
 
Spc
SpcSpc
Spc
 
Software Test Metrics and Measurements
Software Test Metrics and MeasurementsSoftware Test Metrics and Measurements
Software Test Metrics and Measurements
 
Autonomous control of a thermal distortion tester
Autonomous control of a thermal distortion testerAutonomous control of a thermal distortion tester
Autonomous control of a thermal distortion tester
 
Measurement System Analysis
Measurement System AnalysisMeasurement System Analysis
Measurement System Analysis
 

Viewers also liked

Agile Metrics...That Matter
Agile Metrics...That MatterAgile Metrics...That Matter
Agile Metrics...That Matter
Erik Weber
 
Software team lead performance appraisal
Software team lead performance appraisalSoftware team lead performance appraisal
Software team lead performance appraisalvictorluxman
 
Web developer designer performance appraisal
Web developer designer performance appraisalWeb developer designer performance appraisal
Web developer designer performance appraisalgarymobile15
 
Computer software engineer performance appraisal
Computer software engineer performance appraisalComputer software engineer performance appraisal
Computer software engineer performance appraisal
jamespoter576
 
Agile Scrum Methodology
Agile Scrum MethodologyAgile Scrum Methodology
Agile Scrum Methodology
Rajeev Misra
 

Viewers also liked (6)

Agile Metrics...That Matter
Agile Metrics...That MatterAgile Metrics...That Matter
Agile Metrics...That Matter
 
Software team lead performance appraisal
Software team lead performance appraisalSoftware team lead performance appraisal
Software team lead performance appraisal
 
Web developer designer performance appraisal
Web developer designer performance appraisalWeb developer designer performance appraisal
Web developer designer performance appraisal
 
Agile metrics and quality
Agile metrics and qualityAgile metrics and quality
Agile metrics and quality
 
Computer software engineer performance appraisal
Computer software engineer performance appraisalComputer software engineer performance appraisal
Computer software engineer performance appraisal
 
Agile Scrum Methodology
Agile Scrum MethodologyAgile Scrum Methodology
Agile Scrum Methodology
 

Similar to Measuring Quality_Testing&Trends_Final _May 5

Software metrics by Dr. B. J. Mohite
Software metrics by Dr. B. J. MohiteSoftware metrics by Dr. B. J. Mohite
Software metrics by Dr. B. J. Mohite
Zeal Education Society, Pune
 
Otto Vinter - Analysing Your Defect Data for Improvement Potential
Otto Vinter - Analysing Your Defect Data for Improvement PotentialOtto Vinter - Analysing Your Defect Data for Improvement Potential
Otto Vinter - Analysing Your Defect Data for Improvement Potential
TEST Huddle
 
Aplication of on line data analytics to a continuous process polybetene unit
Aplication of on line data analytics to a continuous process polybetene unitAplication of on line data analytics to a continuous process polybetene unit
Aplication of on line data analytics to a continuous process polybetene unit
Emerson Exchange
 
AI Class Topic 2: Step-by-step Process for AI development
AI Class Topic 2: Step-by-step Process for AI developmentAI Class Topic 2: Step-by-step Process for AI development
AI Class Topic 2: Step-by-step Process for AI development
Value Amplify Consulting
 
Managing Vendor Issues on AMI Design
Managing Vendor Issues on AMI DesignManaging Vendor Issues on AMI Design
Managing Vendor Issues on AMI Design
TESCO - The Eastern Specialty Company
 
Managing Vendor Issues on AMI Design
Managing Vendor Issues on AMI DesignManaging Vendor Issues on AMI Design
Managing Vendor Issues on AMI Design
TESCO - The Eastern Specialty Company
 
Sema 2016- Managing Vendor Issues on AMI Design
Sema 2016- Managing Vendor Issues on AMI DesignSema 2016- Managing Vendor Issues on AMI Design
Sema 2016- Managing Vendor Issues on AMI Design
TESCO - The Eastern Specialty Company
 
Automation Essentials for the Age of Agile
Automation Essentials for the Age of AgileAutomation Essentials for the Age of Agile
Automation Essentials for the Age of Agile
Applause
 
Driving Innovation with Kanban at Jaguar Land Rover
Driving Innovation with Kanban at Jaguar Land RoverDriving Innovation with Kanban at Jaguar Land Rover
Driving Innovation with Kanban at Jaguar Land Rover
LeanKit
 
From sensor readings to prediction: on the process of developing practical so...
From sensor readings to prediction: on the process of developing practical so...From sensor readings to prediction: on the process of developing practical so...
From sensor readings to prediction: on the process of developing practical so...
Manuel Martín
 
Testing Metrics and why Managers like them
Testing Metrics and why Managers like themTesting Metrics and why Managers like them
Testing Metrics and why Managers like them
PractiTest
 
Doing Analytics Right - Designing and Automating Analytics
Doing Analytics Right - Designing and Automating AnalyticsDoing Analytics Right - Designing and Automating Analytics
Doing Analytics Right - Designing and Automating Analytics
Tasktop
 
Comparative Analysis of Machine Learning Algorithms for their Effectiveness i...
Comparative Analysis of Machine Learning Algorithms for their Effectiveness i...Comparative Analysis of Machine Learning Algorithms for their Effectiveness i...
Comparative Analysis of Machine Learning Algorithms for their Effectiveness i...
IRJET Journal
 
Mtc strategy-briefing-houston-pd m-05212018-3
Mtc strategy-briefing-houston-pd m-05212018-3Mtc strategy-briefing-houston-pd m-05212018-3
Mtc strategy-briefing-houston-pd m-05212018-3
Dania Kodeih
 
Metrics based Management
Metrics based ManagementMetrics based Management
Metrics based Management
SPIN Chennai
 
Testing Metrics: Project, Product, Process
Testing Metrics: Project, Product, ProcessTesting Metrics: Project, Product, Process
Testing Metrics: Project, Product, Process
TechWell
 
IJIRS_Improvement of Quality Sigma Level of Copper Terminal at Vertical Machi...
IJIRS_Improvement of Quality Sigma Level of Copper Terminal at Vertical Machi...IJIRS_Improvement of Quality Sigma Level of Copper Terminal at Vertical Machi...
IJIRS_Improvement of Quality Sigma Level of Copper Terminal at Vertical Machi...
Government engineering College- Banswara,Rajasthan
 
The Business Value of Data Modeling
The Business Value of Data ModelingThe Business Value of Data Modeling
The Business Value of Data Modeling
Embarcadero Technologies
 
Day 1 1620 - 1705 - maple - pranabendu bhattacharyya
Day 1   1620 - 1705 - maple - pranabendu bhattacharyyaDay 1   1620 - 1705 - maple - pranabendu bhattacharyya
Day 1 1620 - 1705 - maple - pranabendu bhattacharyyaPMI2011
 

Similar to Measuring Quality_Testing&Trends_Final _May 5 (20)

Software metrics by Dr. B. J. Mohite
Software metrics by Dr. B. J. MohiteSoftware metrics by Dr. B. J. Mohite
Software metrics by Dr. B. J. Mohite
 
Otto Vinter - Analysing Your Defect Data for Improvement Potential
Otto Vinter - Analysing Your Defect Data for Improvement PotentialOtto Vinter - Analysing Your Defect Data for Improvement Potential
Otto Vinter - Analysing Your Defect Data for Improvement Potential
 
PAPER_CODE__IE12
PAPER_CODE__IE12PAPER_CODE__IE12
PAPER_CODE__IE12
 
Aplication of on line data analytics to a continuous process polybetene unit
Aplication of on line data analytics to a continuous process polybetene unitAplication of on line data analytics to a continuous process polybetene unit
Aplication of on line data analytics to a continuous process polybetene unit
 
AI Class Topic 2: Step-by-step Process for AI development
AI Class Topic 2: Step-by-step Process for AI developmentAI Class Topic 2: Step-by-step Process for AI development
AI Class Topic 2: Step-by-step Process for AI development
 
Managing Vendor Issues on AMI Design
Managing Vendor Issues on AMI DesignManaging Vendor Issues on AMI Design
Managing Vendor Issues on AMI Design
 
Managing Vendor Issues on AMI Design
Managing Vendor Issues on AMI DesignManaging Vendor Issues on AMI Design
Managing Vendor Issues on AMI Design
 
Sema 2016- Managing Vendor Issues on AMI Design
Sema 2016- Managing Vendor Issues on AMI DesignSema 2016- Managing Vendor Issues on AMI Design
Sema 2016- Managing Vendor Issues on AMI Design
 
Automation Essentials for the Age of Agile
Automation Essentials for the Age of AgileAutomation Essentials for the Age of Agile
Automation Essentials for the Age of Agile
 
Driving Innovation with Kanban at Jaguar Land Rover
Driving Innovation with Kanban at Jaguar Land RoverDriving Innovation with Kanban at Jaguar Land Rover
Driving Innovation with Kanban at Jaguar Land Rover
 
From sensor readings to prediction: on the process of developing practical so...
From sensor readings to prediction: on the process of developing practical so...From sensor readings to prediction: on the process of developing practical so...
From sensor readings to prediction: on the process of developing practical so...
 
Testing Metrics and why Managers like them
Testing Metrics and why Managers like themTesting Metrics and why Managers like them
Testing Metrics and why Managers like them
 
Doing Analytics Right - Designing and Automating Analytics
Doing Analytics Right - Designing and Automating AnalyticsDoing Analytics Right - Designing and Automating Analytics
Doing Analytics Right - Designing and Automating Analytics
 
Comparative Analysis of Machine Learning Algorithms for their Effectiveness i...
Comparative Analysis of Machine Learning Algorithms for their Effectiveness i...Comparative Analysis of Machine Learning Algorithms for their Effectiveness i...
Comparative Analysis of Machine Learning Algorithms for their Effectiveness i...
 
Mtc strategy-briefing-houston-pd m-05212018-3
Mtc strategy-briefing-houston-pd m-05212018-3Mtc strategy-briefing-houston-pd m-05212018-3
Mtc strategy-briefing-houston-pd m-05212018-3
 
Metrics based Management
Metrics based ManagementMetrics based Management
Metrics based Management
 
Testing Metrics: Project, Product, Process
Testing Metrics: Project, Product, ProcessTesting Metrics: Project, Product, Process
Testing Metrics: Project, Product, Process
 
IJIRS_Improvement of Quality Sigma Level of Copper Terminal at Vertical Machi...
IJIRS_Improvement of Quality Sigma Level of Copper Terminal at Vertical Machi...IJIRS_Improvement of Quality Sigma Level of Copper Terminal at Vertical Machi...
IJIRS_Improvement of Quality Sigma Level of Copper Terminal at Vertical Machi...
 
The Business Value of Data Modeling
The Business Value of Data ModelingThe Business Value of Data Modeling
The Business Value of Data Modeling
 
Day 1 1620 - 1705 - maple - pranabendu bhattacharyya
Day 1   1620 - 1705 - maple - pranabendu bhattacharyyaDay 1   1620 - 1705 - maple - pranabendu bhattacharyya
Day 1 1620 - 1705 - maple - pranabendu bhattacharyya
 

Measuring Quality_Testing&Trends_Final _May 5

  • 1. ©2015 InfoStretch Corporation. All rights reserved. Liana Gevorgyan | May 6, 2015 Measuring Quality QA Metrics and Trends in Practice US - UK - India
  • 2. ©2015 InfoStretch Corporation. All rights reserved. SECTION 1: TECHNOLOGY IN LIFE Bugs Are Costly
  • 3. 1999 Mars Climate Orbiter Crash Instead of using the provided metric system for navigation, the contractor carried out measurements using empirical units and the space craft crashed into Mars. COST $135 Million
  • 4. 1996 ARIANE Failure Ariane 5 rocket exploded 36.7 seconds after take off. The engine of this satellite was much faster than that of the previous models, but it had a software bug that went unnoticed. COST >$370 Million
  • 5. 2003 EDS Fails Child Support EDS created an IT system for a Child Support Agency in the UK that had many software incompatibility errors. COST $1.1 Billion
  • 6. 2013 NASDAQ Trading Shutdown August 22, 2013 NASDAQ Stock Market Shut down trading for three hours because of a computer error. COST $2 Billion
  • 7. 1985-1987 Therac-25 Medical Accelerator A software failure caused wrong dosages of x-rays. These dosages were hundreds or thousands of times greater than normal, resulting in death or serious injury. COST 5 Human Lives
  • 8. Technology In Our Daily Life Average usage of electronic systems in developed countries:  One PC or desktop in each home.  80% of people are using mobile phones  40% of people are driving cars with various electronic systems  People are traveling via train, plane on an average once a year  Dozens of other embedded systems in our homes  Dozens of software programs in our work place, service systems Quality of all mentioned systems are equal to the Quality of life!
  • 9. SECTION 2: DEFINING THE “WHAT” Known QA Metrics & Trends
  • 10. Defining “What” 10  Metrics and Trends  Measure to Understand  Understand to Control  Control to Improve
  • 11. Several Known QA Metrics and Trends 11  Manual & automation time ratio during regression cycle  Scripts maintenance time during delivery iteration  Daily test cases manual execution  Automation effectiveness for issues identification  Issues found per area during regression  Areas impacted after new features integration  Issues identification behavior based on major refactoring.  Software process timetable metrics  Delivery process productivity metric  Software system availability metrics  Test cases coverage  Automation coverage  Defined issues based on gap analysis  Ambiguities per requirement  Identified issues by criticality  Identified issues by area separation  Issues resolution turnaround time  Backlog growth speed  Release patching tendency and costs  Customer escalations by Blocker/Critical/Major issues per release  QA engineer performance  Continuous integration efficiency
  • 13. Metrics Examples by Classification 13  Delivery process productivity metrics  Continuous integration efficiency  Release patching tendency and costs  Backlog growth speed  QA engineer performance  Software process timetable metrics  Software system stability metrics  Identified issues by criticality  Identified issues by area separation  Customer escalations by Blocker/Critical/Major issues per release  Ambiguities per requirement  Backlog growth speed PRODUCT METRICSPROCESS METRICS
  • 14. Sample Metrics Visual 14 55%20% 15% 7% 3% Automated UI and BE Automated UI In Progress Pending Automation Not Feasible AUTOMATION COVERAGE BUGS BY SEVERITY 3 % 5 % 12 % 34 % 45 % Blocker Critical High Medium Low
  • 15. Visual Depiction Of Sample Trends 15 Blocker High Low 0 10 20 30 40 Blocker Critical High Medium Low 0 0.5 1 1.5 2 2.5 3 3.5 4 4.5 5 % ISSUE ESCALATIONS BY CRITICALITY - MONTHLY TREND 1 2 3 4 5 6 WEEKS REJECTED BUGS % PER WEEK
  • 16. Expectations Smooth releases Predefined risks with mitigation plans Nice feedback and appreciation Top notch and innovative products 16
  • 17. Real Life  Delivery not always ideal  We are familiar what is patching the release  Lack of process tracking data for analysis  Experimental delivery models not exactly the Best practice models 17
  • 18. SECTION 3: DELIVERY PROCESSES & METRICS Waterfall & Agile
  • 20. Agile Process 20 TESTING/VALIDATION/VERIFICATION Product Backlog  Client prioritized product features Sprint Backlog  Features assigned to Sprint  Estimated by team  Team Commitment Working Code Ready For DEPLOYMENT Time-Boxed Test/Develop PRODUCT BACKLOG BACKLOG TASKS Scrum Meetings Every 24 hours
  • 21. Agile Process Metrics 21 SCRUM TEAM SPRINT METRICS Scrum Team’s Understanding of Sprint Scope and Goal Scrum Team’s Adherence to Scrum Rules & Engineering Practices Scrum Team’s Communication Retrospective Process Improvement Team Enthusiasm Quality Delivered to Customer Team Velocity Technical Debt Management Actual Stories Completed vs. Planned
  • 22. Processes Are Not Always Best Practices 22  Unique way of Agile  Transition from Waterfall to Agile  Transition from Agile to Kanban
  • 23. Metrics Set Definition for Your Project 23  Process  Technology  Iterations  Project/Team Size  Goal
  • 24. SECTION 4: WEIGHT BASED ANALYSIS FOR QA METRICS & MEASUREMENTS Mapping With Graph Theory
  • 25. Metric and Trends for your Project 25 You are watching Metrics/Trends set, are they the right ones? Trends are in an acceptable range, but the products quality is not improving? Trying to improve one metric and another is going down? How do you analyze and fix it?
  • 26. Mapping QA Metrics Info Graph Theory 26 Process Metrics > A= Metric 1, B=Metric 2… Actions/Data set that takes effect in Metrics > A1, A2… Metric dependencies of specific action Product Metrics > C= Metric 3, D=Metric 4…
  • 27. Preconditions & Definitions for the Metrics & Actions Mapped Model 27  A Node's initial weight is predefined and takes a value from 1-10  An Edge's weight is predefined and takes a value from 1-10  Connections between Nodes are defined based on the dependencies of Metrics on each other and on Actions  All Actions have a fixed weight of 1
  • 28. Initial Metrics Model & Dependencies 28 Assume the current Metric set is: 2 Process Metrics -> M1, M2; 2 Product Metrics -> M3, M4. Where: M1 depends on M3, M1 depends on M4, M2 depends on M3. There are 3 Actions or Data sets that affect some of the Metrics: A1, A2, A3. Where: M1 depends on A1 and A2, M4 depends on A3. Initial Priority based on Best Practices: W(M1) = 5, W(M2) = 4, W(M3) = 3, W(M4) = 2
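The model on this slide can be captured in a small data structure. A minimal Python sketch, with names assumed for illustration; node weights and edges are taken directly from the slide:

```python
# Node weights: initial priorities from best practices for the metrics,
# plus the fixed weight of 1 for every action node.
node_weights = {"M1": 5, "M2": 4, "M3": 3, "M4": 2, "A1": 1, "A2": 1, "A3": 1}

# Undirected dependency edges as listed on the slide.
edges = [
    ("M1", "M3"), ("M1", "M4"), ("M2", "M3"),  # metric-to-metric dependencies
    ("M1", "A1"), ("M1", "A2"), ("M4", "A3"),  # metric-to-action dependencies
]

# Sanity check: every edge endpoint must be a defined node.
assert all(a in node_weights and b in node_weights for a, b in edges)
```

Edge weights are assigned separately in a later step, so this sketch records only the topology and the initial node weights.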
  • 29. Metrics Visualization via Graph 29 [Graph: metric nodes M1 (weight 5), M2 (4), M3 (3), M4 (2) and action nodes A1, A2, A3, connected by the dependency edges defined on the previous slide]
  • 30. Weight Assignment On Undirected Graph 30 [Graph: same nodes as above, now with weights 1-6 assigned to the edges; e.g. the edges incident to M2 carry weights 2 (M2-M3), 1 (M2-A1), and 1 (M2-A2)]
  • 31. Calculation Formula for Metrics' New Priority 31  The new priority of a node A is calculated the following way: P(A) = W(A) * E(A), where  P(A) - new priority of the node  W(A) - node weight (initial priority) assigned by the user  E(A) - cumulative weight of the node's edges
  • 32. New Priority Calculations For One Metric 32 [Subgraph: M2 (weight 4) linked to M3 (weight 3) by an edge of weight 2, and to A1 and A2 by edges of weight 1 each] Initial Priority: M2 = W(M2) = 4. New Priority: M2 = W(M2) * (W(M2-A1) + W(M2-A2) + W(M2-M3)) = 4 * (1+1+2) = 4*4 = 16
  • 33. New Priority Calculations For Graph 33 Incident edge weights and resulting priorities: M1 (W=5): edges 3, 5 -> 5*8 = 40; M2 (W=4): edges 2, 1, 1 -> 4*4 = 16; M3 (W=3): edges 3, 2, 6 -> 3*11 = 33; M4 (W=2): edge 5 -> 2*5 = 10. Metrics by initial priority: M1, M2, M3, M4. Metrics by new priority: M1, M3, M2, M4.
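The whole-graph calculation above can be sketched in a few lines of Python. The structure and names are assumptions; the incident edge weights are taken from the slide's calculation table:

```python
# Initial node weights (priorities from best practices).
node_weights = {"M1": 5, "M2": 4, "M3": 3, "M4": 2}

# Weights of the edges incident to each metric node, as read off the table.
incident_edge_weights = {
    "M1": [3, 5],     # edges to M3 and M4
    "M2": [2, 1, 1],  # edges to M3, A1, A2
    "M3": [3, 2, 6],
    "M4": [5],
}

# New priority = node weight * cumulative weight of the node's edges.
new_priority = {
    m: node_weights[m] * sum(ws) for m, ws in incident_edge_weights.items()
}
print(new_priority)  # {'M1': 40, 'M2': 16, 'M3': 33, 'M4': 10}

# Rank metrics by their calculated priority, highest first.
ranked = sorted(new_priority, key=new_priority.get, reverse=True)
print(ranked)  # ['M1', 'M3', 'M2', 'M4']
```

Note how the ranking shifts: M3 overtakes M2 once the project-specific edge weights are factored in, which is the point of the calculated priority.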
  • 34. Metrics Priorities: Current Vs. Calculated 34 Initial Priority Based on Best Practices M1 M2 M3 M4 Project Dependent Calculated Priority M1 M3 M2 M4 INITIAL PRIORITY NEW PRIORITY
  • 35. SECTION 5: METRICS WEIGHT BASED ANALYSIS IN PRACTICE Defining “How”
  • 36. Metrics Definition For Test Project 36  Process - Agile with Area ownership  Technology - SaaS-based Enterprise Web & Mobile App  Iteration - 2 weeks  Project Size - 5 Scrum Teams  Goal - Customer Satisfaction, no Blocker/Critical issue escalations by the customer
  • 37. Key Metrics and Dependencies 37 Metrics M1 - Customer Escalations per defect severity – Product Metric M2 – Opened Valid Defects per Area – Product Metric M3 – Rejected Defects – Process Metric M4 - Test cases Coverage – Process Metric M5 - Automation Coverage – Process Metric M6 - Defect fixes per Criticality – Product Metric Actions and Data Sets A1 – Customer types per investment and escalations per severity A2 – Most Buggy areas
  • 38. Metrics Initial Priority Weight Assignment and Dependency Analysis 38 Predefined node weights: M1 (Customer Escalations per defect severity) = 8; M2 (Opened Valid Defects per Area) = 5; M3 (Rejected Defects) = 4; M4 (Test cases Coverage) = 3; M5 (Automation Coverage) = 2; M6 (Defect fixes per Criticality per Team) = 6. Metrics by initial priority: M1, M6, M2, M3, M4, M5. [Dependency matrix (rows: node and weight; columns: M1-M6, A1, A2): M1 = 8: 2 6 4 5 2; M2 = 5: 2 1 3; M3 = 4: 1 3; M4 = 3: 6 3 3 2; M5 = 2: 2; M6 = 6: 4 2]
  • 40. Calculations and Metrics Prioritization 40 [Matrix rows (node = weight: incident edge weights -> calculated priority): M1 = 8: 2 6 4 5 2 -> 152; M2 = 5: 2 1 3 -> 30; M3 = 4: 1 3 -> 16; M4 = 3: 6 3 3 2 -> 42; M5 = 2: 2 -> 4; M6 = 6: 4 2 -> 36] Initial Priority: M1, M6, M2, M3, M4, M5. Calculated Priority: M1, M3, M6, M2, M4, M5.
  • 41. Key Metric Changes & Improvement Plans 41 Metrics by Calculated Priority: M1 - Customer Escalations per defect severity; M3 - Rejected Defects; M6 - Defect fixes per Criticality per Team; M2 - Opened Valid Defects per Area; M4 - Test cases Coverage; M5 - Automation Coverage.  Group defects by severity and by customer investment to understand the real picture: 1000 Minor issues can cost more than one High-severity issue  Conduct trainings to lower the defect-rejection rate, so developers do not spend extra time analyzing invalid issues  Make sure defect fixes proceed in parallel with new-feature development in each sprint  Continuously update test cases after each new issue, to maintain good coverage  Automate as much as possible to cut costs and increase coverage
  • 42. Monitoring of Trend-Based Priority Metrics Based on Process Changes 42 [Chart: monthly values Jan-April: M1: 60, 70, 68, 75; M3: 20, 25, 29, 34; M6: 70, 80, 82, 78]
  • 43. Let the Challenge Begin… & Have FUN 43
  • 44. Thank You Global Footprint About Us A leading provider of next-gen mobile application lifecycle services ranging from design and development to testing and sustenance. Locations Corporate HQ: Silicon Valley Offices: Conshohocken (PA), Ahmedabad (India), Pune (India), London (UK) InfoStretch Corporation
  • 45. References  Narsingh Deo, Graph Theory with Applications to Engineering and Computer Science, Prentice Hall, 1974.  A.A. Shariff K, M.A. Hussain, and S. Kumar, Leveraging unstructured data into intelligent information - analysis and evaluation, Int. Conf. Information and Network Technology, IPCSIT, vol. 4, IACSIT Press, Singapore, pp. 153-157, 2011.  http://en.wikipedia.org/wiki/List_of_software_bugs  http://www.starshipmodeler.com/real/vh_ari52.htm  http://news.nationalgeographic.com/news/2011/11/pictures/111123-mars-nasa-rover-curiosity-russia-phobos-lost-curse- space-pictures/  http://www.bloomberg.com/news/articles/2013-08-22/nasdaq-shuts-trading-for-three-hours-in-latest-computer-error
  • 46. ©2015 InfoStretch Corporation. All rights reserved. 46 Q & A Liana Gevorgyan Sr. QA Manager InfoStretch Corporation Inc. liana.gevorgyan@infostretch.com www.linkedin.com/in/lianag/en Info@infostretch.com www.infostretch.com

Editor's Notes

  1. Original Title: Key Metrics and Dependencies For Our Project
  2. http://top-10-list.org/2010/05/03/ten-costliest-software-bugs/