What is Risk? - lightning talk for software testers (2011)

Published on: British Computer Society Specialist Interest Group in Software Testing (SIGiST), Jun 2011, London.

Transcript

  • 1. [Title slide] SIGiST – Specialist Interest Group in Software Testing, 21 Jun 2011. Thompson information Systems Consulting Ltd. Photo credit: Axel Rouvin, Flickr, Creative Commons.
  • 2. What is Risk? Lightning Talk. Neil Thompson, Thompson information Systems Consulting Ltd. Some of this material courtesy of, or co-developed with, Paul Gerrard and (on another occasion) Testing Solutions Group. (v1.0)
  • 3. Risk is... bad things which may (or may not) happen: the bad things which could happen, the likelihood of each, and the consequence of each bad thing if it does happen. If the bad thing happens, it then becomes an “issue”.
  • 4. The simple way to “quantify” risk: risk EXPOSURE = LIKELIHOOD (“probability” of the bad thing occurring) × CONSEQUENCE (impact if the bad thing does occur). With each scored 1–3 this gives the familiar grid:

        likelihood 3:   3  6  9
        likelihood 2:   2  4  6
        likelihood 1:   1  2  3
        consequence:    1  2  3

    This is how most people quantify risk (though true quantification is notoriously difficult). “Probability” is (properly) a number between 0 and 1. Adding gives the same rank order as multiplying, but less differentiation (see the sketch below).
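To make the slide's arithmetic concrete, here is a minimal Python sketch; the risks and their 1–3 scores are invented for illustration, not from the talk. Multiplying spreads the exposure scores over 1–9, while adding compresses them into 2–6, producing more ties and coarser prioritisation:

```python
# Hypothetical risks scored on the slide's 1-3 likelihood/consequence scales.
risks = {
    "wrong tax calculation": (2, 3),  # (likelihood, consequence)
    "slow report screen":    (3, 1),
    "data loss on crash":    (1, 3),
}

for name, (likelihood, consequence) in risks.items():
    exposure = likelihood * consequence        # the usual definition
    exposure_added = likelihood + consequence  # coarser: more ties, less spread
    print(f"{name}: multiplied = {exposure}, added = {exposure_added}")
```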
  • 5. Does risk have any other dimensions? In addition to likelihood and consequence...
    – Undetectability: difficulty of seeing a bad thing if it does happen (eg insidious database corruption)
    – Urgency: advisability of looking for / preventing some bad things before other bad things (eg lack of requirements stability)
    Both of the above make a risk worse. Any others?
  • 6. Different types of software risk form a chain, each of which may cause the next:
    – Project risk (eg supplier may deliver late; key staff may leave) may cause...
    – Process risk (eg configuration management may install wrong version of product), which may cause...
    – Product risk (eg specifications may contain defects; software may contain faults).
  • 7. The “iron triangle” is really a tunable tetrahedron? [Diagram: the classic triangle of Scope, Cost and Time with Quality inside, redrawn as a tetrahedron with Risk as the fourth vertex; Quality and Risk are suggested as the best pair to fine-tune.]
  • 8. Risk on the Value Flow ScoreCard: SIX VIEWPOINTS of what stakeholders want – Supplier, Process, Product, Customer, Financial, Improvement & Infrastructure. The scorecard rows: Objectives (WHY we do things); Threats to success (respectively Project, Process, Product, Project, Project and Process risk across the six viewpoints); Measures & Targets (WHAT will constitute success); Initiatives (HOW to do things well).
  • 9. Product Risk dimensions for testing: MAGNITUDE = LIKELIHOOD (“probability” of the bad thing occurring) × CONSEQUENCE (impact if the bad thing does occur) × TESTABILITY (how feasible / convenient it is to test against this risk). With each dimension scored 1–5, the 5×5 likelihood × consequence grid (values 1–25), scaled by testability, gives magnitudes up to 125. This new three-way view is useful when prioritising risks for testing (see the sketch below).
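As an illustrative sketch of using the three-way score (the risk entries below are hypothetical, not from the talk), sorting by magnitude = likelihood × consequence × testability pushes risks that are both serious and feasible to test to the top of the queue:

```python
# Hypothetical product risks, each scored 1-5 per dimension as on the slide.
risks = [
    # (name, likelihood, consequence, testability)
    ("payment rounding error",        3, 5, 4),
    ("insidious database corruption", 2, 5, 1),  # severe but hard to test against
    ("cosmetic label typo",           4, 1, 5),
]

# MAGNITUDE = likelihood x consequence x testability (maximum 5*5*5 = 125).
for name, l, c, t in sorted(risks, key=lambda r: r[1] * r[2] * r[3], reverse=True):
    print(f"{name}: magnitude = {l * c * t}")
```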
  • 10. Brief digression: we really mean “uncertainty”! Decision theory distinguishes the situations under which decisions are made:
    – Certainty: alternatives A, B, C each have a single known consequence (a, b, c).
    – Risk: alternatives A, B, C each have several possible consequences (a1, a2; b1–b4; c1–c3), and the probability of each consequence – p(a1), p(a2), p(b1), ... p(c3) – is known.
    – Uncertainty: the consequences are known but their probabilities are not; and some alternatives may themselves be unknown!
    In software risk, we can only estimate the probabilities! And... we don't really know all the alternatives! (A worked example follows below.)
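A worked example of the middle (“Risk”) column may help; the alternatives, payoffs and probabilities here are invented. When the probabilities are known, each alternative has a computable expected value; under uncertainty the p(...) entries are unknown, so the same calculation is impossible:

```python
# Decision under "risk": consequences and their probabilities are both known.
# Alternatives A and B, with hypothetical (probability, payoff) pairs.
alternatives = {
    "A": [(0.7, 100), (0.3, -50)],
    "B": [(0.5, 80), (0.3, 40), (0.2, -10)],
}

for name, outcomes in alternatives.items():
    assert abs(sum(p for p, _ in outcomes) - 1.0) < 1e-9  # probabilities known
    expected_value = sum(p * payoff for p, payoff in outcomes)
    print(f"{name}: expected value = {expected_value:.1f}")

# A: 0.7*100 + 0.3*(-50) = 55.0;  B: 0.5*80 + 0.3*40 + 0.2*(-10) = 50.0.
# Under "uncertainty" the probabilities are unknown, so no expected value
# can be computed -- in software risk we can only estimate them.
```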
  • 11. Each step in the software lifecycle is threatened by risk. [Diagram, after Paul Jorgensen, SOFTWARE TESTING: A CRAFTSMAN'S APPROACH: the real world (desired) is simplified – with risk of distortion – into successive development models (Requirements, Functional Specification, Technical Design, Module Spec), then refined into the software itself – with risk of bugs in programming. Each development model pairs with a test model (Acceptance Test AT, System Test ST, Integration Test IT, Component Test CT: analysis & design, then execution). Verification testing checks development models against test models (expected vs verified/validated); validation testing checks back against the real world (observed vs desired).] So: remember the overlapping models; we need both verification & validation.
  • 12. Product risks have a cause-effect chain (shown on the same lifecycle diagram):
    – Mistake: a human action that produces an incorrect result (eg in spec-writing, program-coding)
    – Defect: incorrect results in specifications
    – Fault: an incorrect step, process or data definition in a computer program (ie executable software)
    – Error: the amount by which a result is incorrect
    – Failure: an incorrect result, with knock-on effects
    – Anomaly: an unexpected result during testing
    – Consequence: the impact of a risk becoming an issue.
    Likelihood applies all along the chain: of making mistakes, of defects causing faults, of faults causing failures, etc.
  • 13. Product Risk factors:
    – Consequence is usually seen in terms of potential impact on the business: direct financial (loss of profit, regulatory fines etc); indirect financial (eg reputation damage); frequency of use of the malfunctioning part/aspect of the system.
    – Likelihood is more associated with technical factors: complexity of the part/aspect of the system; newness, and degree to which changed; historical bugginess; etc etc; and frequency of use, again! (A scoring sketch follows below.)
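One simple way to combine such technical factors into a likelihood score is a weighted sum. This is an illustrative sketch only – the choice of factors and their weights below are invented assumptions, not a method from the talk:

```python
# Invented weights for combining 1-5 factor ratings into a 1-5 likelihood.
WEIGHTS = {
    "complexity": 0.3,
    "newness_or_change": 0.3,
    "historical_bugginess": 0.2,
    "frequency_of_use": 0.2,
}

def likelihood_score(ratings: dict) -> float:
    """Weighted sum of 1-5 factor ratings; weights sum to 1, so result is 1-5."""
    return sum(WEIGHTS[factor] * rating for factor, rating in ratings.items())

print(likelihood_score({"complexity": 5, "newness_or_change": 4,
                        "historical_bugginess": 2, "frequency_of_use": 3}))
# -> 3.7
```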
  • 14. More about quantification difficulties: in addition to the difficulty of assessing all the things which could possibly go wrong, and their likelihoods... consequences are also difficult to calculate. And... humans often have emotional / irrational biases in matters of risk.
  • 15. So, what does all this mean for testing?
    – 1a. Risk-assess: the importance of each part/aspect of the system, and the likelihood of risks at each lifecycle step (Requirements, Functional Specification, Technical Design, Module Spec, Software).
    – 1b. Prioritise: test items; features to be tested; test basis elements etc.
    – 2. Brainstorm / workshop: things which could go wrong, whether or not in the spec, and their likelihood & consequences.
    TEST CONDITIONS come from both 1 & 2; GRADE the coverage accordingly.
  • 16. What do I mean by GRADE test coverage? (Source: Testing Solutions Group.) Plotting test coverage & effort against riskiness: an even distribution and random / spurious priorities are both marked wrong; coverage should instead be risk-graded – but avoid using this as an excuse to omit some things completely! Also NB, risk information carries through the test process, to prioritise: defects & anomalies; retests; regression tests. (An allocation sketch follows below.)
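A sketch of what risk-graded allocation could look like in code, with a per-area floor so that low-risk areas still get some coverage rather than being omitted completely. The proportional-split rule and all the numbers are invented illustrations, not from the talk:

```python
# Split a test-effort budget in proportion to risk magnitude, keeping a
# minimum floor per area so nothing is omitted completely.
def grade_effort(risk_by_area: dict, budget_days: float, floor_days: float = 1.0):
    total_risk = sum(risk_by_area.values())
    spare = budget_days - floor_days * len(risk_by_area)
    assert spare >= 0, "budget too small to give every area the floor"
    return {area: round(floor_days + spare * risk / total_risk, 1)
            for area, risk in risk_by_area.items()}

# Hypothetical risk magnitudes per system area, 20 days of effort to spread:
print(grade_effort({"payments": 100, "reporting": 40, "admin": 10}, 20))
# -> {'payments': 12.3, 'reporting': 5.5, 'admin': 2.1}
```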
  • 17. References & acknowledgements:
    – James Bach: Heuristic Risk-Based Testing, Troubleshooting RBT, etc (www.satisfice.com)
    – Paul Gerrard: various presentations & papers (www.gerrardconsulting.com), leading to...
    – ...Paul Gerrard & Neil Thompson: Risk-Based E-Business Testing (Artech House, 2002)
    – Neil Thompson: Risk Mitigation Trees – Review Test Handovers with Stakeholders (EuroSTAR 2004)
    – Chris Comey & Testing Solutions Group: Risk Based Assurance & Acceptance (www.testing-solutions.com)
    Associated topics –
    Decision-making & risk:
    – Terje Aven: Foundations of Risk Analysis – a Knowledge and Decision-oriented Perspective (Wiley, 2003)
    Wider risk management:
    – Tom DeMarco & Timothy Lister: Waltzing with Bears – Managing Risk on Software Projects (Dorset House, 2003)
    Psychology & philosophy of risk:
    – Dan Gardner: Risk – the Science and Politics of Fear (Virgin Books, 2008)
    – Tim Lewens (ed.): Risk – Philosophical Perspectives (Routledge, 2007)
    Models in testing:
    – Paul Jorgensen: Software Testing – a Craftsman's Approach (CRC Press, 1995)