Tester performance evaluation

• Speaker note: What exactly are system testing and solution testing? We could give a definition, but that would be too theoretical, so I would rather give an example: look at how companies that do this well actually do it.

1. Tester Performance Evaluation
   Liang Gao (lgao@sigma-rt.com)
2. Attributes of Good Testers
   • Stubborn but reasonable
   • Fearless in front of developers & management
   • Sense of smell (digs in the right place, profiling)
   • Common sense (knows when to trust developers)
   • Keeps knowledge broad & shallow (inter-feature dependencies)
   • Methodically follows the process/procedure
3. Definition of Performance Management
   • Define performance metrics that make testers deliver quality work.
   • The metrics should not be something testers can satisfy without doing real, useful work.
   • Bug counts?
   • Hours of work?
   • Certification?
   • Peer ratings?
   • Customer ratings?
4. Key Tasks of a Tester
   • Writing bug reports?
   • Designing, running, modifying test cases?
   • Developing test strategies / plans / ideas?
   • Editing technical documentation?
   • Interacting with developers?
   • Writing support materials for the help desk or field support?
   • Facilitating inspections / reviews?
   • Requirements analysis: meeting, interviewing, interpreting the needs of stakeholders?
   • Release management? Archiving? Configuration management?
   • Developing automation scripts?
5. Different Roles in the Test Team
   • Management
     – Dev test group manager
     – System and solution test manager
     – Regression test manager
     – Tools group manager
   • Engineer
     – New feature testing engineer
     – Regression engineer
     – System / solution testing engineer
     – Testing tools developer
6. Performance is Relative.
   [Diagram: testers fall into a Top Performance Group and a Bottom Performance Group; each group is further split into a Top Sub Group, a Middle Group, and a Bottom Sub Group.]
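   Slide 6 describes stack-style relative grouping. A minimal sketch of how such bucketing could be done, assuming each tester already has a composite score (the names, scores, and 20/60/20 cut-offs below are illustrative assumptions, not values from the slides):

       # Bucket testers into relative performance groups, then sub-groups.
       def split_relative(ranked, top_pct=0.2, bottom_pct=0.2):
           """Split a best-first list into (top, middle, bottom) groups."""
           n = len(ranked)
           top_n = max(1, int(n * top_pct))
           bottom_n = max(1, int(n * bottom_pct))
           return ranked[:top_n], ranked[top_n:n - bottom_n], ranked[n - bottom_n:]

       scores = {"alice": 91, "bob": 78, "carol": 85, "dave": 62, "erin": 70,
                 "frank": 88, "grace": 74, "heidi": 66, "ivan": 81, "judy": 59}
       ranked = sorted(scores, key=scores.get, reverse=True)

       top, middle, bottom = split_relative(ranked)
       # Each group can be split the same way again to get the sub-groups on the slide.
       print("top:", top)
       print("middle:", middle)
       print("bottom:", bottom)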
7. Quantity Metrics of a Dev Test Engineer
   • Number of initiatives taken (under minimum supervision) and delivered
   • Number of defects found + defects fixed
   • Number of test cases designed
   • Number of test cases manually executed
   • Lines of automation scripts developed
   • Number of review meetings attended / invited to as a reviewer, and number of feedback items given
   • Average defect response time
   • Number of test cases missed during the design phase
   • Number of defects missed during cross testing
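   Most of these counts can be pulled straight from the bug tracker's export. A minimal sketch of computing one of the less obvious metrics, average defect response time, from exported records (the field names and sample data are assumptions for illustration; real trackers have their own export formats):

       from datetime import datetime

       # Illustrative defect records; the field names are assumptions, not a real tracker schema.
       defects = [
           {"id": 101, "owner": "alice", "opened": "2011-04-01 09:00", "first_response": "2011-04-01 11:30"},
           {"id": 102, "owner": "alice", "opened": "2011-04-03 14:00", "first_response": "2011-04-04 10:00"},
           {"id": 103, "owner": "bob",   "opened": "2011-04-02 08:00", "first_response": "2011-04-02 08:45"},
       ]

       def avg_response_hours(records, tester):
           """Average hours from a defect being opened to the tester's first response."""
           fmt = "%Y-%m-%d %H:%M"
           deltas = [datetime.strptime(r["first_response"], fmt) - datetime.strptime(r["opened"], fmt)
                     for r in records if r["owner"] == tester]
           if not deltas:
               return None
           return sum(d.total_seconds() for d in deltas) / len(deltas) / 3600.0

       print("alice reported %d defects" % sum(r["owner"] == "alice" for r in defects))
       print("alice average response: %.1f h" % avg_response_hours(defects, "alice"))  # 11.2 h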
8. Quantity Metrics of a Regression Test Engineer
   • Number of initiatives taken and delivered
   • Number of regression defects found + defects fixed
   • Number of test cases executed
   • Number of testbeds integrated
   • Number of review meetings attended / invited to as a reviewer, and number of feedback items given
   • Average defect response time
9. Quantity Metrics of a Tools Engineer
   • Number of initiatives taken and delivered
   • Number of defects fixed
   • Number of tools developed
   • Lines of code developed
   • Number of users of the tools
   • Number of review meetings attended / invited to as a reviewer, and number of feedback items given
   • Average defect response time
10. Quantity Metrics of a System Testing Engineer
    • Number of initiatives taken and delivered
    • Number of defects reported
    • Number of test cases executed
    • Number of testbeds built
    • Number of review meetings attended / invited to as a reviewer, and number of feedback items given
    • Average defect response time
    • Number of test cases missed during the design phase
    • Number of defects missed during cross testing
11. Quality Metrics of Defects
    • Severity and priority of the defects
    • Customer-related defects
    • Check against the Defect Quality Checklist
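    Severity and priority can be folded into a single number for comparison across testers. A sketch of one possible weighting (all weights are illustrative assumptions, not values from the checklist):

        # Weight each defect by severity and priority; customer-related defects count double.
        SEVERITY_WEIGHT = {"critical": 5, "major": 3, "minor": 1}   # assumed scale
        PRIORITY_WEIGHT = {"p1": 3, "p2": 2, "p3": 1}               # assumed scale

        def defect_quality_score(defects):
            score = 0
            for d in defects:
                w = SEVERITY_WEIGHT[d["severity"]] * PRIORITY_WEIGHT[d["priority"]]
                if d.get("customer_related"):
                    w *= 2  # customer-related defects matter more
                score += w
            return score

        bugs = [{"severity": "critical", "priority": "p1"},
                {"severity": "minor", "priority": "p3", "customer_related": True}]
        print(defect_quality_score(bugs))  # 5*3 + 1*1*2 = 17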
12. Quality Metrics of Test Cases
    • Complexity and priority of the test cases
    • Check against the Test Case Quality Checklist
13. Quality Metrics of Scripts
    • Check against the Script Quality Checklist
14. Quality Review Process
    • Randomly pick 2 bugs, scripts, or test cases the tester produced this quarter.
    • The manager carries out a face-to-face review with the tester against the checklist and gives a score.
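    Making the random pick reproducible keeps the review auditable. A small sketch, with placeholder artifact IDs and an assumed per-tester, per-quarter seed:

        import random

        # Placeholder artifact IDs; a real list would come from the tracker / repository.
        artifacts = {
            "bugs":       ["BUG-101", "BUG-117", "BUG-130", "BUG-142"],
            "scripts":    ["regress_ospf.tcl", "setup_bgp.tcl", "smoke_l2.tcl"],
            "test_cases": ["TC-OSPF-003", "TC-BGP-010", "TC-QOS-021"],
        }

        def pick_for_review(artifacts, tester, quarter, k=2):
            # Seeding with tester + quarter makes the pick deterministic and re-checkable.
            rng = random.Random("%s-%s" % (tester, quarter))
            return {kind: rng.sample(items, min(k, len(items)))
                    for kind, items in artifacts.items()}

        print(pick_for_review(artifacts, "alice", "2011Q2"))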
15. Quality Review Process (cont.)
    • Missed test case ratio
      – During a review session, how many new test cases the reviewers propose, as a ratio against the existing test cases.
    • Defects missed during cross testing
      – For the same test case, if someone else finds bugs during cross testing, it counts against the previous tester.
    • Root cause analysis of customer-found bugs
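    Both review metrics reduce to simple ratios. A sketch, assuming the review minutes record how many new test cases the reviewers proposed and the cross-testing log records bugs found by others (the numbers are made up):

        # Missed test case ratio: new cases proposed by reviewers vs. cases the tester designed.
        def missed_test_case_ratio(new_cases_from_review, existing_cases):
            return new_cases_from_review / float(existing_cases)

        # Cross-testing misses: bugs someone else found re-running the same test cases.
        def cross_testing_miss_rate(bugs_found_by_others, cases_cross_tested):
            return bugs_found_by_others / float(cases_cross_tested)

        print("missed test case ratio: %.0f%%" % (100 * missed_test_case_ratio(6, 40)))    # 15%
        print("cross-testing miss rate: %.0f%%" % (100 * cross_testing_miss_rate(2, 25)))  # 8%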
