Classic Mistakes in Software Testing
  • Reducing altitude, he spotted a man on the ground and descended to shouting range. "Excuse me," he shouted. "Can you help me? I promised my friend I would meet him a half hour ago, but I don't know where I am." The man below responded: "Yes. You are in a hot air balloon, hovering approximately 30 feet above this field. You are between 40 and 42 degrees North Latitude, and between 58 and 60 degrees West Longitude." "You must be a tester," responded the balloonist. "I am," the man replied. "How did you know?" "Well," said the balloonist, "everything you have told me is technically correct, but I have no idea what to make of your information, and the fact is I am still lost." Whereupon the man on the ground responded, "You must be a manager." "That I am," replied the balloonist, "but how did you know?" "Well," said the man, "you don't know where you are, or where you're going. You have made a promise which you have no idea how to keep, and you expect me to solve your problem. The fact is, you are in the exact same position you were before we met, but now it is somehow my fault." Now - Why is that joke funny? Because it happens at work, every day. And all that time spent complaining about each other is time not spent building solutions – it's productivity lost. This talk is about those productivity time sinks – and what we can do about them.
  • I dislike the word “Taxonomy” and “Classes of mistakes” has an object-oriented feel. 
  • Just trying to be humble and mention the people this talk is derived from _before_ someone can preemptively accuse me of stealing from them. Every smarty I’ve ever talked to about this talk (except maybe you) has said “you better be careful. Marick did the seminal work …” I DID get Brian Marick’s explicit permission to re-visit the vault. The problem is that he hasn’t the right book, and I can’t seem to get a good cut-and-paste of his picture. I’ve settled for his other book. 
  • “When it comes to getting better, there are a couple of different approaches. Probably the best example of the first approach is exemplified in this book – In Search of Excellence. In this book, the authors looked at the best-run and most-admired companies in the United States and looked for themes and threads that ran through every company. It was a best-practices approach. The theory was that if you copied those practices, your company would be successful, too. … Thirty years later, lots and lots of people have read the book and tried to implement the practices, and the jury is still out. People have had mixed results – some success, some failure. The problem is that the companies in the book and your company are different, with different markets, different customers, different staff, different physical locations, different products, and lately, a different era. What worked for them might not work for you. So we want to move on to a second approach to ‘getting better …’
  • Microsoft won the software wars not because it was ‘better’ or ‘excellent’, but instead because it made fewer critical mistakes than its opponents. At least, Merrill Chapman believes as much, and he wrote a book on it – “In Search of Stupidity” – indicating a second approach: avoiding worst practices. Now, I’m a member of the context-driven school of software testing, so just as I avoid saying that certain best practices are always best, I want to be very careful when I say that a practice is always worst. There may be situations where some of the things I discuss are appropriate and helpful – I just haven’t seen them. I will say that in my experience, which includes shrink-wrapped, consulting, and internal software for business customers, I have noticed certain practices that are intended to solve a small problem and invariably create a larger one. So I believe that the practices that we discuss today are bad enough often enough to be listed here. After I finish the classic mistakes, we can move on to finding the root causes of these problems.
  • If you want a statistical survey that will blow your mind, think about this: If you want ONE single thing to dramatically reduce the mortality rate at hospitals, one simple best practice that is PROVEN to save lives … … wait for it … … It’s bedside manner. (And you heard that from a guy who works at an HMO. It is, quite literally, some of our business.) Want to dramatically improve productivity? Re-humanize the work. Encourage face-to-face communication. Too busy to do everything? INCREASE the amount of time you spend with your direct reports. If a 10% investment of your time yields a 10% increase in productivity in each direct report … Peopleware, by DeMarco/Lister, is a great place to start.
  • Make the people at the end of the process responsible for the work of everyone else before them. Yeah, right, that’s going to work. Problem #1: They can’t do a thing about bad requirements, bad designs, or bad code. Problem #2: Other people will tend to cut corners, as quality is someone else’s job In the end, the only group that is capable of balancing the razor’s edge of pain for low quality vs. pain for shipping late is general management. That’s why they are paid the big bucks. It is the testing group’s job to provide information to decision makers. (There _ARE_ other approaches to quality, eg prevention, but this is a talk about testing, an inspection-oriented approach)
  • Invariably, either there is a “real” ship date, and QA will be pressured into a release on that date … … Or they could hold the project hostage and miss a market window, force the project to be over-budget, etc. Ultimately, the person best qualified to balance the razor’s edge of shipping early (pain) and missing the market window (pain) is general management Exceptions occur, of course, for mission and life critical software … which most of the people at BetterSoftware Ain’t.
  • Reporting test status requires that you talk about risk, coverage, effort, and how that relates to the present build and the potential for future builds. Yet, I keep hearing that testing is merely "going good" "yes we tested that" "the testing should be 'done' by Tuesday" all without any context specified. If the test progress is challenged, most testers seem not to know what to say or do.
  • Time spent arguing on bug-or-not-ness is time NOT spent testing & developing!
  • Many people don’t even recognize testing as a technical task. They think it’s solely a non-technical thing that low-level people do. There’s a whole science called white-box testing that only developers can do. So when we think of testing as something someone else does … white-box testing never happens. Many shops turn over code that has never even been executed to test. This wastes the tester’s time on obvious bugs that could have been caught much earlier. You want a classic mistake? Have developers think that testing is someone else’s problem.
  • “Out of sight, out of mind.” To get people to fight, emphasize how they are different (e.g. different roles), not how they are the same (e.g. same project)
  • What if someone already knows the automation tool? Would that eliminate the first bullet’s relevance? Per the second bullet: what if you are using MBT so that you aren’t recording scripts? I often recommend that people do some small automation (typically, monkeys) no matter how late they are in the cycle, since creating the monkeys takes little time and you can run the monkeys on multiple machines while you continue your own testing. And the monkeys find serious issues that manual testers often miss. So, is the slide trying to convey that testers should understand that automation is an investment and they may not have time to recoup the cost? I don’t know what “truth & consequences” refers to. And I thought it was “truth OR consequences”?
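The “small automation (typically, monkeys)” recommendation in the note above can be made concrete. This is a minimal sketch, not any particular tool: `run_monkey` and its lambda target are hypothetical names, and a real monkey would drive a UI or API rather than a plain function.

```python
import random

def run_monkey(target, seed, steps=1000):
    """A minimal 'monkey': fire random inputs at the system under
    test and report the first input that crashes it."""
    rng = random.Random(seed)        # seeded, so any crash reproduces
    for _ in range(steps):
        value = rng.randint(-10, 10)
        try:
            target(value)
        except Exception as exc:
            return value, exc        # hand the crashing input to triage
    return None, None

# Hypothetical system under test with a lurking divide-by-zero:
crash_input, error = run_monkey(lambda x: 100 // x, seed=42)
# crash_input is almost certainly 0 -- an input scripted tests may skip
```

The point of the sketch matches the note: writing the monkey takes minutes, it runs unattended on spare machines, and it stumbles into inputs that scripted manual passes rarely try.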
  • Explain the classic problem of minefield regression testing. Suggest model-driven testing as an alternative.
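To illustrate the alternative the note suggests: in model-based testing you describe the system as a small state model and generate action sequences from it, so each run can walk fresh paths instead of re-treading the cleared lanes of a recorded script. The editor states and transitions below are purely illustrative assumptions.

```python
# A tiny model of a document editor: each state lists the actions
# that are legal from it. States and transitions are illustrative.
MODEL = {
    "closed": ["open"],
    "open":   ["edit", "close"],
    "edit":   ["save", "close"],
    "save":   ["edit", "close"],
    "close":  [],
}

def generate_paths(state="closed", depth=4):
    """Enumerate every action sequence up to `depth` steps long."""
    if depth == 0 or not MODEL[state]:
        return [[state]]
    return [[state] + rest
            for nxt in MODEL[state]
            for rest in generate_paths(nxt, depth - 1)]

for path in generate_paths():
    print(" -> ".join(path))
```

Because the tests are regenerated from the model rather than recorded, widening coverage means editing one dictionary, not re-recording a script library.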
  • How many times have I seen an ad for someone with “WinRunner Skills” that ran week after week after week in the paper. Why? Well, they wanted someone who could “hit the ground running” … … ok. But in the time it took to find him, couldn’t they just train someone else? And if he knows the tool, what’s to say he knows how to test?
  • Example: A 737 veered off a runway, once, because a plug came loose in a computer that controlled the reverse thruster on one side of the plane. The computer wasn't tested for getting a "NULL" input. I bet you anything that the thing was tested to the requirements, and the requirements didn't talk about what happens when plugs come unplugged.
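The anecdote above reduces to a few lines of test code. This is a hypothetical sketch, not the aircraft's real logic: `thrust_reverser_command` and its fail-safe convention are invented here to show how requirements-based tests can all pass while the unplugged-sensor case goes untested.

```python
def thrust_reverser_command(weight_on_wheels):
    """Hypothetical controller logic: act only on a confirmed sensor
    reading; treat a missing signal (the loose plug) as fail-safe."""
    if weight_on_wheels is None:    # the case no requirement mentioned
        return "FAIL_SAFE"
    return "ENGAGE" if weight_on_wheels else "HOLD"

# Requirements-based tests all pass:
assert thrust_reverser_command(True) == "ENGAGE"
assert thrust_reverser_command(False) == "HOLD"
# Only a test beyond the written requirements covers the loose plug:
assert thrust_reverser_command(None) == "FAIL_SAFE"
```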
  • If you don’t eat your own dogfood, are you really testing?
  • Talented, intuitive exploratory testing can yield positive results, but that’s not an excuse to give your people no training and say “have at it.” Training staff on “just” Equivalence Classes, Bounds Testing, State Transitions, and Decision Tables can have extremely large benefits.
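Two of the techniques the note names, bounds testing and equivalence classes, are cheap enough to teach in a sketch. The age-field range and the helper names below are illustrative assumptions, not standard library functions.

```python
def boundary_cases(lo, hi):
    """Bounds testing: probe just below, at, and just above each
    edge of a valid range [lo, hi]."""
    return [lo - 1, lo, lo + 1, hi - 1, hi, hi + 1]

def equivalence_cases(lo, hi):
    """Equivalence classes: one representative per class suffices --
    one invalid-low value, one valid value, one invalid-high value."""
    return [lo - 5, (lo + hi) // 2, hi + 5]

# For an age field that accepts 18..65:
print(boundary_cases(18, 65))     # [17, 18, 19, 64, 65, 66]
print(equivalence_cases(18, 65))  # [13, 41, 70]
```

Nine targeted inputs instead of “try some ages”: that is the kind of leverage a day of training buys.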
  • “We could use different people to do the test design and the test execution.” This allows us to hire monkeys for a pittance. If you really can separate these two, just automate execution. James Bach calls exploratory testing “simultaneous learning, test design, and test execution.” That’s cool. The extremo-opposite to exploratory testing is having the design and execution done at different times by different people. It screws up the feedback loop and circumvents learning. Joe is executing test case #5: Save a big file. He notices that when he tries to save, the screen flickers. Hmm. However, his test passed. If he felt ownership of the test cases, he might explore what caused that flicker, find a bug, and create a few new test cases. However, if he just has two days to get through a ‘test cycle’ …
  • The lost keys problem / Low ROI documentation activities
  • I’ve mixed classic management mistakes with classic test-doing mistakes, and that’s on purpose. Some of these will be outside your sphere of authority, or even influence. Work on the things you can actually change.
  • The Five Why’s will go here
  • A good way to know you’re encountering a communication problem is when you start complaining to co-workers that person XYZ is an idiot. He’s not an idiot; he’s either speaking a different language or facing pressure causing him to ignore you.
  • Talent and skills trump defined, institutionalized, stable, repeatable processes every time. How many more Burt Rutans do we need to prove the point? Historically, companies that took a process-oriented approach and competed in the free market have lost – producing buggy software that’s too expensive – but on-budget and on-schedule.
  • If management is heck-bent on some wild idea that you don’t like, but remember … they need you to do it. It won’t happen without you. That gives you quite a bit of personal power. If a staffer trying to build a resume is heck-bent on some new technology, focus on business impact and risk. How does this new technology impact the business? If it doesn’t, don’t fund it. Does using the new, unproven technology create business risk? If yes, is that risk balanced by positive business impact, or potential delivered value? Don’t let it become a me vs. you thing. If it degrades to that, call it, and move on.
  • Fred Taylor was the founder of scientific management. His basic idea was to systematize the work: to have a small band of elites monitor every task with a stopwatch and find ways to optimize it. This works great for unthinking work; it made Ford a billionaire and McDonald’s and Wal*Mart wildly successful. My fundamental assertion is that software testing is complex, critical-thinking, intellectual work. Fred Taylor’s 19th-century ideas are out of place and downright dangerous in a testing organization.
  • Root Cause: The Five Whys. Pareto: Make a list. Number them. Sort. If it helps, this is the ‘Toyota Production System’ or ‘Lean Software Development’.
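The “make a list, number them, sort” step above is exactly a tally-and-rank. A minimal sketch, with an invented defect log; the cause labels are hypothetical examples of where five rounds of “why?” might land.

```python
from collections import Counter

# Hypothetical defect log: each bug tagged with the root cause that
# asking "why?" five times arrived at.
root_causes = [
    "ambiguous requirement", "missed boundary case",
    "ambiguous requirement", "bad merge",
    "ambiguous requirement", "missed boundary case",
]

# Pareto analysis: tally the causes and attack the tallest bar first.
for rank, (cause, count) in enumerate(
        Counter(root_causes).most_common(), start=1):
    print(f"{rank}. {cause}: {count}")
```

With real data the top one or two causes usually account for most of the defects, which is what makes the sort worth doing before choosing what to fix.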
  • I think this stands on its own. Anyone have a graphic idea?
  • About the last two bullet points: As a QA/Testing person, you can always send an email that you are worried about the state of the software if it were shipped today. You recognize the pain of not shipping could hurt this quarter’s/year’s sales revenue, but you believe the cost of shipping in damaged reputation & damage to the clients would exceed the cost of being late. Now, as we discussed before, you aren’t qualified to make that decision, but as a professional, you can provide meaningful information to decision makers, to insist on a well-considered decision instead of a knee-jerk “we gotta make 4th-quarter numbers”. The final bullet point is a call to make people aware of the damages of short-term optimization. One way to do that is to get people to read articles about the companies that survive – like id Software – that “ship when it’s done”, vs. companies that fail. Hint: Ya gotta get them to think it’s their idea.
  • Transcript

    • 1. Classic Testing Mistakes: Revisited Matthew Heusser [email_address] Presented at the Better Software Conference San Francisco, CA - Sept. 21st, 2005 Contributing peer reviewers: James Bach Paul Carvalho Michael Kelly Harry Robinson
    • 2. Organization
      • Classic Mistakes: A different approach
      • The mistakes enumerated
        • Test Management Mistakes
        • Test Automation Mistakes
        • Development Mistakes
        • Test Strategy Mistakes
      • Root Causes
      • What to do tomorrow
    • 3. On the shoulders of pioneers
    • 4.  
    • 5.  
    • 6. Classic Mistake #1: De-humanize the test process Test Management Mistakes AKA Management by Spreadsheet, Management by Email, Management by MS Project …
    • 7. Classic Mistake #2: Testers Responsible for Quality “ It’s strange that QA let that bug slip through” Test Management Mistakes
    • 8. Classic Mistake #3: IV&V Determines ship date Do they really? Test Management Mistakes
    • 9. Classic Mistake #4: Task-based status reporting
      • Examples:
        • Testing is “on schedule”
        • Testing “should be done by Tuesday”
      • Consequences
        • Loss of credibility
        • Bad information for decision makers
      Test Management Mistakes
    • 10. Classic Mistake #5: Evaluating testers by bugs found
      • … and developers by number of bugs injected
      • Consequences:
        • Friction
        • Focus on easy-to-find yet trivial bugs (usability)
        • Information hiding
      Test Management Mistakes
    • 11. Classic Mistake #6: Inappropriate Models for Test Improvement
      • NO
    • 12. Classic Mistake #7: Lack of test training for developers
      • Testing is a skill .
      • It won’t appear like magic.
      Development Testing Mistakes
    • 13. Classic Mistake #8: Separate devs and testers
      • To create friction, emphasize division
      • Anything that increases the length of the feedback loop is bad .
      • To improve get rid of waste and tighten the feedback loop.
      Development Testing Mistakes
    • 14. Mistake #9: When late, add Test Automation
      • Someone has to learn the tool
      • Someone has to record the scripts
      Test Automation Mistakes
    • 15. Mistake #10: Mine Field Test Automation Test Automation Mistakes
    • 16. Mistake #11: Hiring for test tool skills
      • Technology skills can be taught
      • Talent can’t
      • The “Hit the Ground Running” Argument
      Test Automation Mistakes
    • 17. Classic Mistake #12: Insufficient diversity in test strategy
      • Examples:
        • Only requirements based testing
        • Only coverage testing
      • Consequence:
        • Missing entire classifications of defects
      Test Strategy Mistakes
    • 18. Classic Mistake #13: Over-reliance on scripted testing All the testing we did, meticulously pulling down every menu and seeing if it worked right, didn't uncover the showstoppers that made it impossible to do what the product was intended to allow. Trying to use the product, as a customer would, found these showstoppers in a minute. - Joel Spolsky, Test Strategy Mistakes
    • 19. Mistake #14: Untrained exploratory Testing
      • “Just think creatively”
      • “Try to break it”
      • Exploratory Testing is a discipline
      Test Strategy Mistakes
    • 20. Classic Mistake #15: Test ‘Engineers’ and ‘Executors’ Test Strategy Mistakes
    • 21. Classic Mistake #16: Vacuous Documentation
      • Examples:
        • The issue resolution document
        • Physical signoff/check marks
        • Elaborate test case templates
      • Consequence:
        • Time spent documenting is time not spent testing
      Test Strategy Mistakes
    • 22. Mistake #17: Trying to fix things beyond your reach The Meta-Mistake
    • 23. Don’t treat the symptoms Find & Fix the root cause!
    • 24. Root Cause #1 Lack of Systems Thinking in Testing
      • The law of unintended consequences
      Root Causes
    • 25. Root Cause #2: Translation Problems Example: - “You need to completely test this module” Root Causes
    • 26. Root Cause #3: Process Myopia
      • Example:
        • The [in]famous Issue Resolution Document
        • “ We don’t do things that way here”
        • Elevating process over skills
      • Solutions:
        • The ear of the king / History Lessons
        • If your boss doesn’t care – ignore it
      Root Causes
    • 27. Root Cause #4: Technology Myopia
      • Example
        • “ Use XML on the next project”
        • “ I just bought 5 copies of WinRunner …”
      • Solution:
        • If you’re technical, they need you to do it
        • If you’re a manager, focus on business impact and risk
      Root Causes
    • 28. Root Cause #5: Fred Taylor in the organization
      • Examples:
        • Factory Mentality
        • High Specialization
        • Mixing of skill sets is verboten
      • Solutions:
        • Peopleware, or anything by Weinberg
        • First, Break All the Rules – Marcus Buckingham & Curt Coffman
        • Lead. Insulate your team.
      Root Causes
    • 29. A ‘new’ methodology
      • Root Cause Analysis
      • Pareto Analysis
      • Drive out waste/tighten the feedback loop
      • Then worry about better practices
      • (Image from Rapid Development, © 1996 by Steve McConnell. Used with permission from the author)
      What to do tomorrow
    • 30. Why
      • New practices take permission
      • It is hard (but possible) to get more done by adding work
      • It is easy to get more done by subtracting work
      • So start by looking to remove worst practices
    • 31. What to do tomorrow
      • Discuss
      • Q&A
    • 32. Where to go for more
      • “Classic Testing Mistakes”, Brian Marick, STQE, 1997
      • Rapid Development, Steve McConnell
      • An Introduction to General Systems Thinking, Gerald Weinberg, 1975
      • Lessons Learned in Software Testing, Kaner, Bach, Pettichord
    • 33. Bonus Section
    • 34. Root Cause #6: Pressure for short-term results
      • Example:
        • “ Ship to make 4 th quarter numbers”
      • Putting off problems instead of addressing them
      • Solution:
        • Save your team
        • Professionalism means something
        • The Quake Example
      Root Causes