Project quality (and test process) metrics

Presentation from lecture on Poznań University of Technology for IT students.

Speaker notes
  • Ask about Six Sigma. Marta asked me which company developed Six Sigma.
  • Who has used metrics so far?
  • Analyze
  • People are smart. They can start working to improve the metrics themselves.
  • Trends – a short story about how we set KPIs. Do not set them at the beginning. Observe trends, analyse them, talk about them: ask which release/sprint went fine and made us happy, which did not, and make decisions based on that. Do not set them arbitrarily. Of course, at the beginning you can set a base KPI, but you have to be careful: a KPI can change as the project phase, requirements, etc. change.
  • Quantitative vs. qualitative – Agile puts the emphasis on the qualitative. At the demo, ask for client feedback afterwards, talk with your client, keep close cooperation with the client, and deliver what the client needs rather than what the client wrote in the requirements/specification. Story about JLP.
  • We were like a Gilbert. Some time ago there was a quite big, Agile project for a very important client. We were quite happy about project quality: everything was smooth, with progressive UAT (yes, the client tested features after each sprint and accepted them). But one day there was a message from the client that our code quality was very low. When we started asking why, we didn't get a clear answer. After more detailed questions we realized that the issue was somewhere else: the migration/content team had problems using what we delivered, which caused delays; the client's development team wasn't able to take responsibility for maintenance; and so on. As a result, the client's management team had the feeling that we delivered low-quality code. We verified our metrics and they were fine – at almost the same level throughout the project. An external company did a code/architecture review and UAT tests. All went well, and we showed that the problems were somewhere else, on the client side. But we learned that we have to pay more attention to qualitative metrics and client feelings – a demo plus a short feedback session in which the client all but says "that's fine" is not enough.
  • Measure basic things that show you where something is wrong, then look for details. So do not use metrics for detailed information, and do not expect it from them. Problems can differ, so use metrics to find out that you have a problem, and then try to identify what caused it.
  • JIRA and other tools change the values stored on issues over time, so if you debug after a few days you can see totally different values (most defects may have been fixed by then, etc.). It is great to freeze data for debugging.
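The "freeze data" advice above can be sketched as a small daily snapshot job. This is a minimal illustration only; `fetch_open_defects` is a hypothetical stand-in for whatever your bug tracker's API actually returns:

```python
import json
from datetime import date
from pathlib import Path

def fetch_open_defects():
    """Hypothetical stand-in for a bug-tracker query; live values are mutable."""
    return [{"key": "PRJ-101", "status": "Open", "priority": "Major"}]

def freeze_snapshot(snapshot_dir="metric-snapshots"):
    """Dump today's raw values to a dated file, so that later debugging
    works against the data as it was, not as it is now."""
    Path(snapshot_dir).mkdir(exist_ok=True)
    out = Path(snapshot_dir) / f"defects-{date.today().isoformat()}.json"
    out.write_text(json.dumps(fetch_open_defects(), indent=2))
    return out

freeze_snapshot()
```

Run from cron (or any scheduler), this gives you an immutable series of daily files to debug against.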
  • We get a lot of questions about metrics – are they OK? Descriptions are fine, but if you need to go through several reports and read the details, it takes too much time. Visualisation allows you to understand them better.
  • So we as QA monitor values that are agreed with the Technical Lead, for example, or that are simply good practice. We can monitor, ask, require a task to be added to the next sprint, or just raise the fact that good practices are not being kept in this project.
  • Question to students – why do we test?
  • Testing doesn't improve quality, it measures it. When we fix a defect we improve quality, but there is a chance of introducing new ones.
  • How the testing process looks at Cognifide – the most important metrics for us are connected with our specific process. And remember: metrics are measured against a process, not …
  • Metric
  • Screenshot with the metric – QA velocity depends on the quality of what we receive. It is really the sum of all tests and retests. So if we receive something bad and reject it several times, our effectiveness is measured differently. Manager: "you tested only one story". QA: "we tested four stories (the same one each time), but every time we got it with a defect".
  • The number of passed test cases says nothing. We always add one more metric from the bug tracker. I have never been on a project where there was a full mapping between test cases and bugs; there were always some bugs reported against the application that were not covered by any test case. There is one more rule in the QA world – that the number of test cases means nothing. So I would be very careful with metrics based on test cases – unfortunately, most companies/test leads believe in the test-case world; I do not.
  • What is it?
  • Please look at the automation results. We have quite good coverage in this project – about 20% – but the automated tests did not find many defects. Do you know why? The pesticide effect – you still need to keep working on and updating your automation. But it doesn't mean that automation doesn't work: it minimises our test effort. Automated tests can be run on the dev environment too, and issues found there are not reported – so please be careful. The same during acceptance tests: we do not raise all defects. Mostly dev and tester are paired, and if an issue is found and can be fixed immediately, we do not raise it, as that would take more time. Why don't we care? If defects do not delay development, we can live with them.
  • A quite important metric. Do we test the right things? Do we raise the right defects? It's not only about the tester's time, but also the client who prioritises them, the developers who try to fix/reproduce them, and so on. Do we understand what is most important for the client and the business? The same goes for the external company doing UAT – we showed that they were raising a lot of issues, but not the correct ones. We lost time on debugging without adding any value to the project.
  • Queues are a very important metric in an agile environment – especially in the Kanban methodology.
  • Quality metrics are universal. Quality doesn't ask about methodology but measures what you deliver. The same goes for the testing approach. Sometimes there are very small differences in measurement, like requirement coverage, but it's still the same metric.

    1. Project quality and test process metrics Zbigniew Moćkun, Tomasz Rękawek © 2011 Cognifide Limited. In commercial confidence only.
    2. Who are we?
    3. Agenda • Introduction to metrics • Code quality metrics − Metrics − Sonar as example • Project quality and testing process metrics − Metrics − Example quality report
    4. Metrics What is it about?
    5. Why do we use metrics? • Project overview • Control a process • Control risks • Force the use of good practices • Project audits • Projects / methodologies / ... comparison
    6. I realized that metrics... encourage me to ask questions
    7. Be careful with metrics!! Processes. People.
    8. Type of measurements • Trend • KPI (Key Performance Indicator) − Lower threshold − Higher threshold − Value • How to set a correct KPI?
    9. Not only about numbers
    10. Qualitative vs. Quantitative – case study
    11. Level of detail • You can't measure everything • Choose the most important metrics • Monitor and ask questions • Look for details when you find a problem
    12. Metrics gathering • Analysis of live data (on request) • Analysis based on daily values (can be run as a cron job) • Data vary over time • Freeze data for analysis
    13. Visualize metrics • Use charts if possible • Use colours as statuses (green, amber, red) • Add descriptions • Remember about recommendations (your feeling) − Subjective assessment is important too
    14. Project quality metrics Quality is everywhere
    15. Code quality metrics Code quality metrics are covered by Tomek Rękawek's presentation, which can be found here: http://www.slideshare.net/TomaszRkawek/code-metrics
    16. QA vs. code quality metrics • Part of quality assurance • Define KPIs with the Technical Lead • Understand what you measure and why • Discuss trends and exceptions with the TL • Motivate developers to keep high coding standards
    17. Project quality metrics
    18. Why do we test? Software testing is an investigation conducted to provide stakeholders with information about the quality of the product or service under test
    19. Testing = measuring Software testing is an investigation conducted to provide stakeholders with information about the quality of the product or service under test. Source: Wikipedia
    20. Dev – QA cooperation
    21. Metrics by actions
    22. One measurement, two metrics • Acceptance rate / internal quality − Acceptance actions / all QA actions − All QA actions = accept or reject − Measures the quality of software sent to QA • QA velocity − Sum of all actions − Divided by the number of QA engineers assigned to the project − QA velocity depends on the quality of the software that comes to QA − Why does testing take so long?
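The two metrics on this slide can be sketched in a few lines. This is a minimal illustration with made-up numbers, not Cognifide's actual tooling:

```python
def acceptance_rate(accepts, rejects):
    """Acceptance actions divided by all QA actions (accepts + rejects)."""
    total = accepts + rejects
    return accepts / total if total else 0.0

def qa_velocity(accepts, rejects, qa_engineers):
    """All QA actions divided by the number of QA engineers on the project."""
    return (accepts + rejects) / qa_engineers

# One sprint: 12 stories accepted, 8 rejected, 2 QA engineers.
print(acceptance_rate(12, 8))  # 0.6 -> 40% of deliveries bounced back to dev
print(qa_velocity(12, 8, 2))   # 10.0 QA actions per engineer
```

Note how the same raw counts feed both metrics: a low acceptance rate inflates QA velocity, which is exactly the "why does testing take so long?" discussion from the speaker notes.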
    23. Test case execution metrics • Planned test cases vs. written test cases (preparation, UAT) • Number / percentage of executed test cases (acceptance, regression, UAT) • Number of passed/failed test cases (acceptance, regression, UAT) • Test case metrics mean nothing
    24. Requirement coverage • Requirement coverage − How many requirements have test cases? − How many requirements were tested? − How many requirements passed? − Can we measure coverage if we use an exploratory approach? • Requirement traceability − Defect – test case – requirement
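"How many requirements have test cases?" could be computed roughly as follows. This is a sketch; the data layout (a map from test-case id to the set of requirement ids it covers) is an assumption, not the deck's actual format:

```python
def requirement_coverage(requirements, test_cases):
    """Fraction of requirements covered by at least one test case.
    `test_cases` maps test-case id -> set of covered requirement ids."""
    covered = set().union(*test_cases.values()) if test_cases else set()
    return len(covered & set(requirements)) / len(requirements)

reqs = ["REQ-1", "REQ-2", "REQ-3", "REQ-4"]
tcs = {"TC-1": {"REQ-1"}, "TC-2": {"REQ-1", "REQ-3"}}
print(requirement_coverage(reqs, tcs))  # 0.5 -> half the requirements covered
```

The same traceability map, extended with defect links, answers the "defect – test case – requirement" bullet.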
    25. Bug metrics – S curve • S curve • Open against resolved • Weights − Blocker: 10 − Critical: 8 − Major: 6 • "No Major or higher" rule − 5 Trivials = 1 Minor − 5 Minors = 1 Major, ...
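The weighting scheme can be collapsed into a single score. A sketch: the Minor and Trivial weights below are derived from the 5:1 conversion rule, which is an assumption on my part, since the slide only lists Blocker/Critical/Major weights explicitly:

```python
# Blocker/Critical/Major weights are from the slide; Minor and Trivial are
# derived from the 5:1 rule (5 Minors = 1 Major, 5 Trivials = 1 Minor) --
# that derivation is an assumption, not stated on the slide.
WEIGHTS = {"Blocker": 10, "Critical": 8, "Major": 6,
           "Minor": 6 / 5, "Trivial": 6 / 25}

def weighted_bug_score(open_bugs):
    """`open_bugs` maps severity -> count of currently open defects."""
    return sum(WEIGHTS[sev] * n for sev, n in open_bugs.items())

print(weighted_bug_score({"Critical": 1, "Major": 2, "Minor": 5}))  # 26.0
```

Plotting this score per day gives the S-curve trend of open-against-resolved defects the slide refers to.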
    26. External quality
    27. The importance of metrics • Example A − Story comes to QA − QA finds 1 major issue and rejects (1 day) − Dev fixes it (1 day) − QA tests it again, finds 1 major issue and rejects (1 day) − Dev fixes it (1 day) − QA accepts (1 day) − Sum: 5 days, 2 rejects, only 2 issues • Example B − Story comes to QA − QA finds 1 critical, 1 major and 3 minor issues and rejects the story (1 day) − Dev fixes them (2 days) − QA tests it again and accepts (1 day) − Sum: 4 days, 1 reject, 5 issues • Acceptance rate against the raised-issues metric
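The two examples can be put side by side numerically, which is the slide's point: neither acceptance rate nor issue count should be read alone. A sketch using the numbers from the slide:

```python
def story_stats(rejects, issues, days):
    """Per-story stats: one final accept out of (1 + rejects) QA actions."""
    accepts = 1
    return {"acceptance_rate": accepts / (accepts + rejects),
            "issues": issues, "days": days}

a = story_stats(rejects=2, issues=2, days=5)  # Example A from the slide
b = story_stats(rejects=1, issues=5, days=4)  # Example B from the slide

# A has the worse acceptance rate (~0.33 vs 0.5) despite fewer issues,
# and it took longer end to end -- read the metrics together.
print(a)
print(b)
```

By issue count alone Example A looks healthier; by acceptance rate and elapsed days, Example B clearly is.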
    28. Bug taxonomy
    29. Bug taxonomy • Phases − Project: Discovery, Development, UAT, Live • Test case lifecycle − Requirements/documentation, test scenario, test execution • Risk areas • Application modules/components • Functional/non-functional • Product-specific areas − CMS: author, publish, frontend, backend
    30. Defect density – components
    31. Defect density – acceptance vs. regression (automation vs. manual)
    32. Defect density – application specific
    33. Defect density – won't fix Do we test in the right way?
    34. Automation – coverage • Defects found (manual vs. automated) • Coverage − Manual vs. automated test cases − High priority − Application specific: author vs. publish
    35. Other metrics • Performance monitoring trend as an example • Client-side grades (webpagetest.org as an example) • QA queue • Agile metrics − Lead time (open-to-resolved time) − Idle time (time spent in queues)
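Lead time and idle time as defined on this slide are straightforward to compute. A sketch with made-up timestamps:

```python
from datetime import datetime

def lead_time(opened, resolved):
    """Lead time: open-to-resolved, in days."""
    return (resolved - opened).total_seconds() / 86400

def idle_time(queue_intervals):
    """Idle time: total days the issue sat waiting in queues
    (e.g. 'Ready for QA'), given (entered, left) timestamp pairs."""
    return sum((end - start).total_seconds() / 86400
               for start, end in queue_intervals)

opened = datetime(2011, 11, 2, 9, 0)
resolved = datetime(2011, 11, 8, 9, 0)
waiting = [(datetime(2011, 11, 3, 9, 0), datetime(2011, 11, 5, 9, 0))]
print(lead_time(opened, resolved))  # 6.0 days open to resolved
print(idle_time(waiting))           # 2.0 of those days spent queued
```

A large idle-to-lead ratio points at queues, not at slow dev or QA work, which is why queues matter so much in Kanban, as the speaker notes say.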
    36. Qualitative metrics • Client feeling − Surveys − Demos − Just try to talk • Tester / team subjective opinion − Recommendation − Feeling − Usability
    37. Agile vs. Waterfall • Quality metrics are independent of: − Methodology − Approach • The two main test approaches can be used in both − Exploratory vs. scripted • Agile puts more emphasis on qualitative metrics
    38. Example report
    39. References • http://www.kaner.com/pdfs/BugTaxonomies.pdf • ISTQB Advanced Test Manager Syllabus − http://www.istqb.org/downloads/viewcategory/46.html • Michael Bolton blog post − http://www.developsense.com/blog/2009/01/meaningful-metrics/ • Douglas Hoffman − http://www.softwarequalitymethods.com/Papers/DarkMets%20Paper.pdf
    40. Poznań Testing and Quality Group • Local group • Next meeting: 28th of November − Bogdan Bereza-Jarociński: Quality in Agile – quality at all − Jakub Bryl: How to organize security testing
    41. Q&A
