How to find more bugs? (Especially if you have no time)

SQA Days 11. Day 1. Track A
Yaron Tsubery
President of ISTQB, CEO, Smartest Technologies Ltd.
Givat Ze'ev, Israel

Presentation Transcript

    • Smartest Technologies Ltd. Providing the SMART solutions. How to find more bugs? (Especially if you have no time). 11th SQA Days & ELT Ukraine, April 2012. Yaron Tsubery. Website: www.dSmartest.com. Email: YaronTs@dSmartest.Com. Mobile: +972 52 8549255
    • Introduction; Presentation objectives; The challenges that we have; Take control over the process; Summary
    • Introduction Yaron Tsubery Homeland: Israel More than 19 years experience in Software Development and Testing, Director QA & Testing at Comverse, managed large testing groups and projects deployed to customers located in Israel, USA, Europe, and to the Fareast countries and... Currently, Managing Director of Smartest. President of ITCB (Israeli Testing Certification Board) and a formal member in ISTQB, President of ISTQB (International Software Testing Qualifications Board).25 April 2012 Smartest Technologies (c) 2010 3
    • Introduction; Presentation objectives; The challenges that we have; Take control over the process; Summary
    • ObjectivesWe will walkthrough some theoretical material and see the match between theory and reality,Present the challenges and their results,Show a way to improve your efficiency and effectiveness, through a mixed Approaches,The presentation is aimed at stimulating your mind and opening new views to the subject,25 April 2012 Smartest Technologies (c) 2010 5
    • Set ExpectationsThis presentation focuses on the testing process related to the Execution phase,Let’s have a dynamic and interactive session25 April 2012 Smartest Technologies (c) 2010 6
    • Introduction; Presentation objectives; The challenges that we have; Take control over the process; Summary
    • What Were The Challenges That We Had?
    • Challenges Very large and complex systems Systems that required to function 24/7 Frequent requests for changes Market change and required fast delivery Continue to keep the high quality level required for our systems Competitors…25 April 2012 Smartest Technologies (c) 2010 9
    • Background
    • Life-cycle Models (SDLCs). (Diagram slide comparing development life-cycle models: Big Bang, Waterfall, V-Model, Spiral, and a Time-box/Agile development model with high-level and fine planning, plan updates and reviews, daily builds with smoke tests, integration and system testing, and optional releases.)
    • Where Do We Usually Put The Effort In The Testing Process? How much time are you investing in: Planning, Design, Preparations, Execution (manual), Reporting?
    • Where Do We Invest Most of Our Time at The Execution Stage?
    • The Evolution Of Regression Testing. (Diagram: from full testing including ATP execution cycles, to full testing with repeated execution cycles including all regression tests, to a partial regression tests set, to using a risk-based approach.)
    • What Are We Looking For?
    • Stages That We Can Improve: Defect Detection, Test Execution, Code, Requirements.
    • Nothing New So Far… (-;
    • Introduction; Presentation objectives; The challenges that we have; Take control over the process; Summary
    • So, What Did We Do? What Did Our Strategy & Plan Look Like?
    • StrategyFocused plan at the Execution CyclesUse risk-based approachRapidly track and analyze the execution resultsManage through defect detection analysis.25 April 2012 Smartest Technologies (c) 2010 21
    • Planning Main dimensions to consider at execution:  Concentrate on each build we’re getting from the development  Design tests using quality risk categories analysis  Divide each build to 3 parts – following risk-based approach, using priority (High, Medium and Low)  What to cover in each part?  Which coverage level should we have? How can we benefit from previous builds and cycles? How can we benefit from the current build? …and even from the current part we’re testing!25 April 2012 Smartest Technologies (c) 2010 22
    • Level Of TestingRisk analysis and priorities of the features – taken from the initial plan we builtEvaluation of level according to sanity tests resultsDependencies between features (new & current)Coverage we had in previous build:  Did we cover that part?  How much and what did we cover?  What were the results of previous tests?  How many defects did we had? Their severity and priority!25 April 2012 Smartest Technologies (c) 2010 23
    • Level Of Testing (cont.). Don't forget the Platform and Infrastructure areas: new hardware, new kernel, O/S changes, changed or new 3rd-party parts, configurations, etc., plus any additional parts that fit your system. I'm sure that you now have concerns about the duration these activities may take…
    • Coverage AreasDefine what to cover per each set of priority (High, Medium and Low)SanityNew functionalityInfrastructure & system related testsRegressions:  Related to new functionality  Bug fixes / retests  Related to bug fixesATP related tests25 April 2012 Smartest Technologies (c) 2010 25
    • Adding The Focused Exploratory Part
    • Focused Exploratory Testing. After we have the plan for each priority test set, considering all elements of coverage and the level of coverage, add to each priority test set, at the end, a Focused Exploratory Testing part. Let's have a look at the Gantt.
    • Gantt Sample
    • Well… How To Leverage The Efficiency of Exploratory Testing?
    • Leveraging The Exploratory Testing. Perform an analysis of the defect density by severity and priority. Don't delay the analysis to the end of the testing cycle; perform it while you execute the test priority set! After analyzing the defect density and the infected areas, choose the most infected areas and focus your exploratory testing there. Now you're running focused exploratory testing. (A small sketch of this analysis is shown below.)
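As referenced above, a small illustrative sketch of the defect-density analysis: defects logged while executing the current priority set are tallied per area with severity weights, and the most "infected" areas become the focus of the exploratory part. The severity weights and the `top_n` cut-off are assumptions, not part of the original method.

```python
from collections import defaultdict

# Assumed severity weights; the talk only says to consider severity and priority.
SEVERITY_WEIGHT = {"Critical": 5, "Major": 3, "Minor": 1}

def infected_areas(defects, top_n=2):
    """defects: iterable of (area, severity) tuples logged so far in this cycle.
    Returns the top_n areas ranked by severity-weighted defect density."""
    score = defaultdict(int)
    for area, severity in defects:
        score[area] += SEVERITY_WEIGHT.get(severity, 1)
    return sorted(score, key=score.get, reverse=True)[:top_n]

# Invented defect log for illustration.
logged = [("A", "Major"), ("A", "Critical"), ("L", "Major"),
          ("L", "Major"), ("L", "Minor"), ("B", "Minor")]
print(infected_areas(logged))   # ['A', 'L'] -> focus the exploratory testing here
```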
    • Let’s Simplify The Defect Detection Analysis?
    • Defects Density Per Area. Assumptions: we have 2 new functionality areas, A & B, and 2 areas for regressions, L & M. We saw the following defect density behavior (C = Critical, Mj = Major, Mn = Minor):

      Area  Test Group  Critical  Major  Minor  Total (C+Mj)  Total (Mj+Mn)  Total
      A     A.1            -        -      3         0              3          3
      A     A.2            -        -      5         0              5          5
      A     A.3            -        1      8         1              9          9
      A     A.4            -        -      7         0              7          7
      A     A.5            -        1      8         1              9          9
      B     B.1            -        1      1         1              2          2
      B     B.2            1        2      3         3              5          6
      B     B.3            -        1      2         1              3          3
      L     L.1            1        2      -         3              2          3
      L     L.2            2        3      4         5              7          9
      L     L.3            1        2      -         3              2          3
      L     L.4            1        1      4         2              5          6
      M     M.1            -        1      3         1              4          4
      M     M.2            -        1      -         1              1          1
    • Divide To Priority Sets. Order the test groups using the defect density table and create priority sets: High, Medium and Low. Ordered by density (highest first): L.2, B.2, L.1, L.3, L.4, A.3, A.5, M.1, B.3, B.1, M.2, A.4, A.2, A.1. (A minimal sketch of this ordering and split is shown below.)
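A minimal sketch of the ordering and the split into priority sets, using the counts from the defect density table above (blank cells treated as zero). The sort key (Critical, then Major, then Minor, descending) reproduces the ordering shown on the slide; the equal three-way split into High/Medium/Low is an assumption, since the talk leaves the cut-offs open.

```python
# Defect counts per test group: (critical, major, minor), from the density table.
density = {
    "A.1": (0, 0, 3), "A.2": (0, 0, 5), "A.3": (0, 1, 8), "A.4": (0, 0, 7),
    "A.5": (0, 1, 8), "B.1": (0, 1, 1), "B.2": (1, 2, 3), "B.3": (0, 1, 2),
    "L.1": (1, 2, 0), "L.2": (2, 3, 4), "L.3": (1, 2, 0), "L.4": (1, 1, 4),
    "M.1": (0, 1, 3), "M.2": (0, 1, 0),
}

# Order by severity: Critical first, then Major, then Minor (all descending).
ordered = sorted(density, key=lambda g: density[g], reverse=True)

# Split the ordered list into High / Medium / Low priority sets
# (an equal three-way split is an assumed cut-off).
third = -(-len(ordered) // 3)        # ceiling division
priority_sets = {
    "High":   ordered[:third],
    "Medium": ordered[third:2 * third],
    "Low":    ordered[2 * third:],
}
print(priority_sets)
# {'High': ['L.2', 'B.2', 'L.1', 'L.3', 'L.4'],
#  'Medium': ['A.3', 'A.5', 'M.1', 'B.3', 'B.1'],
#  'Low': ['M.2', 'A.4', 'A.2', 'A.1']}
```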
    • More Complex Analysis Table. Analysis per tested module or feature; perform a deep analysis across the various tests executed. Table columns: Module/Feature, Risk Level, Defects (Sanity), Defects (Dependencies), Coverage Level in Previous Build, Infrastructure Impact, Final Average, Priority. Sample rows: Module A: Medium; Module B: High; Functionality A: High; Functionality B: Low.
    • More Complex Analysis Table (cont.). Measure the infected areas depending on the defect detection ratio; prioritize the areas to focus on when performing Exploratory Testing. (Same analysis table as on the previous slide; a hypothetical sketch of reducing such a table to a final priority follows below.)
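A hypothetical sketch of how such an analysis table might be reduced to a final average and priority; the factor names follow the table columns, while the 1-5 scoring scale, the unweighted average, and the thresholds are illustrative assumptions.

```python
# Each factor is scored 1 (low concern) .. 5 (high concern); scale and weights assumed.
FACTORS = ("risk_level", "defects_sanity", "defects_dependencies",
           "coverage_previous_build", "infrastructure_impact")

def final_priority(scores):
    """scores: dict mapping each factor name to a 1..5 value.
    Returns (final average, priority label); thresholds are assumed."""
    avg = sum(scores[f] for f in FACTORS) / len(FACTORS)
    if avg >= 4.0:
        label = "High"
    elif avg >= 2.5:
        label = "Medium"
    else:
        label = "Low"
    return avg, label

# Invented scores for one module, for illustration only.
module_b = {"risk_level": 5, "defects_sanity": 4, "defects_dependencies": 4,
            "coverage_previous_build": 3, "infrastructure_impact": 5}
print(final_priority(module_b))   # (4.2, 'High')
```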
    • Defect Detection RatioNormal S-Curve S-Curve S-Curve open S-Curve, defects ratio25 April 2012 Smartest Technologies (c) 2010 36
    • Improved Defect Detection Ratio. (Chart: the S-curve when using Focused Exploratory Testing, compared to the ratio of the regular S-curve.)
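For illustration only (the daily counts are invented), a tiny sketch of how the cumulative defect detection S-curve can be tracked per cycle, so the effect of the focused exploratory part shows up as an earlier, steeper rise of the curve:

```python
from itertools import accumulate

# Defects found per execution day (invented numbers, not from the talk).
regular_run  = [1, 2, 4, 6, 5, 3, 2, 1]
with_focused = [1, 2, 4, 7, 8, 5, 2, 1]   # focused exploratory assumed to start around day 4

# Cumulative totals form the S-curve; a steeper, earlier rise means
# defects are detected sooner in the cycle.
print(list(accumulate(regular_run)))   # [1, 3, 7, 13, 18, 21, 23, 24]
print(list(accumulate(with_focused)))  # [1, 3, 7, 14, 22, 27, 29, 30]
```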
    • Where Do We See The Impact?
    • The ImpactApproach method: Defects driven approachProject: delivery timelineQuality: product’s quality raisedEfficiency: less time invested at defect detectionPositioning: higher quality of the Testing Team/s – less escaped defects25 April 2012 Smartest Technologies (c) 2010 39
    • When To Expand The Focused Exploratory Tests?
    • When To Expand? At the time that we have many features to test and the probability to have defects is high At the time that we have less time for detailed description of the test cases At the time that we have many change requests that we didn’t cover with test cases Recommendations:  Document the skeleton of the Exploratory Tests you executed  Describe in more details those who encountered bugs (- :25 April 2012 Smartest Technologies (c) 2010 41
    • When To Reduce Or Not Perform The Focused Exploratory Tests?
    • When To Reduce Or Not Perform?At the time that we have only minor areas to cover,At the time that we see defect convergence – less bugs open,At the time that we see that there are less areas that are getting “infected”,At the time that the ROI is less effective.25 April 2012 Smartest Technologies (c) 2010 43
    • Introduction; Presentation objectives; The challenges that we have; Take control over the process; Summary
    • Before The Summary… What are the Next Steps?
    • Next StepsTry to Implement that in an Agile environment,Start to use Heuristic methods!Where do we find most of our bugs? (ODC analysis) How to reduce over-testing?Etc’ (-;25 April 2012 Smartest Technologies (c) 2010 46
    • SummaryAnalyze your testing project and decide if to activate this approachFeel free to Mix some Execution ApproachesFactors for successful implementation:  Initial risk analysis  Prioritize test areas – High, Medium and Low  Daily defect detection analysis – per area  Focus on areas with more potential to have defects  Communicate plan and results with your peers and managersGood Luck!25 April 2012 Smartest Technologies (c) 2010 47
    • Yaron Tsubery: yaronts@dsmartest.com Office: +972 2 6509255
    • System: Physical Architecture
    • System Layout
    • Low-Level Design. Each risk to be mitigated via testing will have one or more test cases associated with it. (I’ve not shown all traces, to avoid clutter.) Reference: Rex Black (STE)
    • Quality risks are potential system problems which could reduce user satisfaction. Technical risk: likelihood of the problem. Business (operational) risk: impact of the problem. Risk priority number: an aggregate measure of problem risk, the product of technical and business risk, from 1 to 25. Tracing information links back to requirements, design, or other risk bases. A hierarchy of risk categories (e.g. Risk Category 1: Risk 1, Risk 2, … Risk n) can help organize the list and jog your memory. Rating scale: 1 = Very high, 2 = High, 3 = Medium, 4 = Low, 5 = Very low. Extent of testing by risk priority number: 1-5 = Extensive, 6-10 = Broad, 11-15 = Cursory, 16-20 = Opportunity, 21-25 = Report bugs. Table columns: Quality Risk, Tech. Risk, Bus. Risk, Risk Pri. #, Extent of Testing, Tracing. Reference: Rex Black (STE)
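A small sketch that turns the risk scales from this slide into a risk priority number and the suggested extent of testing; the rating bands come directly from the slide, while the function name and the Python packaging are illustrative.

```python
def extent_of_testing(technical_risk, business_risk):
    """technical_risk / business_risk: 1 = very high ... 5 = very low risk.
    The bands below follow the slide (Rex Black's scheme)."""
    rpn = technical_risk * business_risk          # risk priority number, 1..25
    if rpn <= 5:
        return rpn, "Extensive"
    if rpn <= 10:
        return rpn, "Broad"
    if rpn <= 15:
        return rpn, "Cursory"
    if rpn <= 20:
        return rpn, "Opportunity"
    return rpn, "Report bugs"

# A very likely problem (1) with high business impact (2): test extensively.
print(extent_of_testing(1, 2))    # (2, 'Extensive')
# An unlikely problem (5) with very low impact (5): just report bugs if seen.
print(extent_of_testing(5, 5))    # (25, 'Report bugs')
```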