  1. CS 5150 Software Engineering, Lecture 23: Testing Large Systems
  2. Administration
     - Final presentations: sign up for your presentation now.
     - Weekly progress reports: remember to send your progress report to your TA.
  3. Testing Begins on Day 1
     A professional project team begins with:
     (a) An idea (usually mused over for a while between friends)
     (b) A program/product manager
     (c) A development manager
     (d) A test manager
     Today, we're going to focus on how the test manager thinks.
  4. Good Test Managers
     - Before the project starts, they build the first draft of the Test Exit Criteria, which answers: "How do we know when we're done?"
     - The test manager:
       - Drives the testing framework and approach
       - Is a great communicator
       - Understands the value of defect-fix-build-retest cycles that work like clockwork. THINK BOTTLENECKS.
  5. Example of a Large-Scale System Test: A Browser
     At one point, the second-largest automated software testing infrastructure in commercial software development was used to test Internet Explorer. The largest tested Windows. Why so many computers?
  6. A Test Pass
     - A test pass: all of the actions involved in assessing whether a software system meets its requirements.
     - This definition implies:
       - All of the requirements are expressed as "tests" (usually with instrumented indicators).
       - The team documents all of the steps required to run the tests, so that the test pass can be replicated.
       - The team runs the test pass multiple times. (Else why write it all down?)
  7. Automated Testing Methods
     - Two different types of automated testing:
     - Scripted testing
       - Programs that manipulate internal or external interfaces
     - "Screen scrapers"
       - WebLoad (http://www.webload.org) can automate tests by capturing mouse movements, keyboard events, etc.
     - Both methods can be "fed" into scripted load-testing solutions like WebLoad.
  8. The Test Harness
     - Test harness: the program that controls the automated testing scripts.
       - Usually first built for "smoke testing"
       - Expands to replicate the maximum required load (simultaneous transactions, MTBF, more scenarios)
     - Smoke testing: the minimal set of tests one runs to make certain that daily check-ins to the source control system didn't break critical functionality in the daily build.
       - I integrate the smoke test into the "make" command.
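A smoke-test harness can start as nothing more than a list of fast checks run after every daily build, with a failing check blocking the build. A minimal sketch in Python (the check functions and their names here are hypothetical placeholders, not part of any real build system):

```python
# Hypothetical smoke checks: each is a fast test of critical functionality.
def check_login():
    # Stand-in for a real check against the daily build.
    return True

def check_render_homepage():
    return True

SMOKE_TESTS = [check_login, check_render_homepage]

def run_smoke_pass(tests):
    """Run every check once; return (passed, failed) lists of test names."""
    passed, failed = [], []
    for test in tests:
        try:
            ok = test()
        except Exception:
            ok = False  # a crashing check counts as a failure
        (passed if ok else failed).append(test.__name__)
    return passed, failed

passed, failed = run_smoke_pass(SMOKE_TESTS)
# A failing smoke pass should block the daily build from going out.
build_ok = not failed
```

Hooking a script like this into the "make" command, as the slide suggests, means no developer can forget to run it.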
  9. Scenarios Make Great Test Cases
     - Success and failure scenarios and use cases are the foundation of great testing systems.
     - Over the course of product development, you usually uncover additional scenarios by watching people break the system.
     - How does this happen?
  10. Eating Your Own Dog Food
     - "Eating your own dog food" refers to a situation in which a team decides to use the software it is developing to conduct its daily business and operations.
     - If people use the system every day as part of their work, they find problems with it. These problems frequently become new scenarios for the test infrastructure.
     - With operating systems and browsers, this is easy. With IT systems (like human resource systems), it is harder. But get creative.
  11. Additional Sources of Test Scenarios
     - Ergonomics
     - Usability labs
     - In-home testing
     - Customer support
     - Consulting
     - Sales engineers
     - "Mail a PC to family members" program
     - Drive to Fry's and buy one of everything
     - Executive dogfood
     - Executive mission statements
     - Other
  12. User Testing
     - Most teams have figured out the basics of user testing.
     - Real users generate new testing scenarios for the test harness.
     - Condensing user feedback into scenarios enhances its usefulness.
     - An example ...
  13. Requirements FAIL
     - User: I don't like the interface.
     - UE: What don't you like about it?
     - User: I don't understand what to do.
     - UE: What do you think you're trying to do?
     - User: "Access the system."
     - UE: Have you used the system before?
     - User: No.
     - ...
     - Requirement failure: there is no "access the system for the first time" scenario.
  14. Dealing with Mixed User Interface Feedback
     - Market segmentation (try dividing the feedback by demographic groups to look for patterns)
     - Example 1: Men over age 65; income below the poverty line
     - Example 2: Ages 20-24; college educated; "the interface is not cool"
     - Ask questions like:
       - Why is this group buying the product?
       - Which demographics like the product, and why?
       - Is their issue related to the core "buying story"?
  15. Beta Testing
     - Beta testing: early release of a product for the purpose of accumulating feedback that can be integrated into the product.
     - Validates requirements
     - Uncovers buying issues
     - Generates testing scenarios
     - Users have a lot of diversity in both their uses and their environments. Exploit that diversity to get better data about the coverage of your test pass.
  16. Permutations and Combinations
     - How many versions of Microsoft Windows are there?
       - 7? 35? 1100?
     - A version is a build number plus an install set:
       - Windows XP SP2 with 7 of the 49 security patches
       - Win64 vs. Win32
       - Office installed vs. not installed
       - IDE vs. SCSI vs. SATA drivers
       - Video adapter driver
       - Etc.
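The size of the configuration matrix is just the product of the option counts along each dimension, which is why it explodes so quickly. A sketch in Python, using a few made-up dimensions loosely modeled on the slide's examples:

```python
from itertools import product

# Illustrative (made-up) configuration dimensions; a real Windows test
# matrix has many more, including build numbers and patch subsets.
dimensions = {
    "service_pack": ["SP1", "SP2", "SP3"],
    "architecture": ["Win32", "Win64"],
    "office_installed": [True, False],
    "disk_driver": ["IDE", "SCSI", "SATA"],
}

# Every combination of one choice per dimension is a distinct install set.
configs = list(product(*dimensions.values()))
total = len(configs)  # 3 * 2 * 2 * 3 = 36 configurations to cover
```

Add a dimension with k options and the total multiplies by k, which is one reason the browser and OS test labs mentioned earlier needed so many computers.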
  17. Durability
     - How long should you make the tests run?
       - Once?
       - Multiple times?
       - A number calculated from analysis?
     - Determining MTBF (mean time between failures) is important because it measures resilience (or frailty). These tests frequently insert random events into the system as part of the test harness.
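One way to see how random-event injection feeds an MTBF estimate: run the scenario many times, inject a random event on each run, and average the number of runs between observed failures. A sketch under assumptions (the system under test is a stand-in that fails a fixed 10% of the time; a real harness would drive actual software):

```python
import random

def run_scenario(rng):
    """Stand-in for one harness iteration with a random event injected.

    Pretend the injected event breaks the system 10% of the time.
    """
    return rng.random() >= 0.10

def estimate_mtbf(iterations, seed=0):
    """Estimate mean runs between failures over many iterations."""
    rng = random.Random(seed)  # seeded so the run is reproducible
    failures = 0
    for _ in range(iterations):
        if not run_scenario(rng):
            failures += 1
    # Mean runs between failures; infinite if no failure was observed.
    return iterations / failures if failures else float("inf")

mtbf = estimate_mtbf(10_000)  # close to 10 for a 10% failure rate
```

The more iterations you run, the tighter the estimate, which is one answer to "how long should the tests run?": long enough for the MTBF number to stabilize.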
  18. Acting on the Results
     - Once you run the test pass, what do you do?
       - File one bug for each failure.
     - Bug counts go up because:
       - Test coverage increased
       - Code quality went down
       - A major regression occurred
       - A code-base change requires a change in the test pass
     - Bug counts go down because:
       - The test team is not testing, or is losing creativity
  19. End Game
     - The end game: the process of balancing feedback from sales, marketing, executives, customers, engineering, testing, etc.
     - Usually involves:
       - A stabilization of bug find/fix ratios
       - A code freeze
       - A full test pass and subsequent analysis
       - Discussions among the stakeholders that begin with: "Who would it affect if corporate customers couldn't ..."
  20. How to Decide When to Release
     - Imagine this scenario. What would you do?
     - The browser has a bug that causes rendering to be off by 3 pixels for certain combinations of tags within <DIV>s.
     - The code works correctly on Windows Vista but not Windows 7.
     - Fixing the bug would require adding 2,000 lines of code to the rendering engine.
     - Failure to fix will cause W3C standards advocates to complain.
  21. How to Decide When to Release
     - Imagine this scenario. What would you do?
     - Three months prior to the scheduled release, you are in a code freeze.
     - Privacy advocates push Congress to pass legislation that would disable all cookies in the browser. The legislation passes.
     - The legislation makes it illegal to sell consumer products or publish web sites that use cookies.
  22. How to Decide When to Release
     - Imagine this scenario. What would you do?
     - The Final Four is in two weeks. You run ESPN's web site development, and the bracket downloader fails on the version of Windows 7 that was released yesterday.
  23. Integrating These Lessons into Your Projects
     - If you work with embedded devices, test the software on the embedded device as frequently as possible. Professional developers would build a test harness that automatically controls the devices.
     - If you do web site development, your tests can be automated. This may be overkill for your project, but you can at least write down all of the steps to automate and run them over and over again.
     - Web site database smoke tests can be built using wget.
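The slide suggests wget for web site smoke tests; the same idea can be scripted in Python with the standard library: fetch each critical URL and flag anything that does not return HTTP 200. This is a sketch, not a full harness, and the URLs and fetcher stub below are illustrative:

```python
from urllib.request import urlopen
from urllib.error import URLError

def smoke_test(urls, fetch=None):
    """Return the list of URLs that failed the smoke test.

    `fetch` maps a URL to an HTTP status code; it is injectable so the
    test can run without network access. By default it does a real GET.
    """
    if fetch is None:
        def fetch(url):
            return urlopen(url, timeout=10).status
    failed = []
    for url in urls:
        try:
            if fetch(url) != 200:
                failed.append(url)
        except URLError:
            failed.append(url)  # unreachable counts as a failure
    return failed

# Example with a stubbed fetcher (no network needed): /admin returns 500.
fake_status = {"/": 200, "/search": 200, "/admin": 500}
broken = smoke_test(fake_status, fetch=fake_status.__getitem__)
```

For a database-backed site, point the URL list at pages that exercise the database (search, login, a detail page), so a failed fetch catches a broken database connection as well as a broken web server.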
  24. Different Testing Methodologies
     - In large projects there are many stakeholders.
     - Usually, they believe in different schools or methodologies.
     - You need to balance this from the start.
     - An example of a methodology: Agile
     - http://pettichord.com/
     - http://www.io.com/~wazmo/papers/four_schools.pdf
     - Five different schools (from the PDF above):
       - Analytic, Standard, Quality, Context-Driven, Agile
  25. Hiring a Test Manager
     - Test managers are different:
       - Some are good at hiring testers.
       - Some are good at thinking through business processes.
       - Some have deep technical backgrounds in important areas.
       - Some have relationships with important players.
     - Test managers need to be able to articulate arguments in terms of the requirements. They also need to be able to argue for the importance of their employees during personnel reviews.
  26. Morale
  27. Thanks and Questions
     - Thanks to the following people for assistance:
       - Joel Maher (a great test engineer that I've worked with)
       - Bill Walrond (a great PM who provided feedback)
     - Questions?