Software Testing Fundamentals part I


  1. T-76.613 Software Testing and Quality Assurance, Lecture 1, 13.9.2005. Software Testing Fundamentals part I. Juha Itkonen, SoberIT, HELSINKI UNIVERSITY OF TECHNOLOGY
  2. Contents: Why do we need testing? What is software testing? Quality and quality assurance.
  3. Why do we need testing?
  4. Why do we need testing?
     - We need testing to reduce and mitigate risk. All software has faults.
     - We prefer to invest in testing in order to find the most important faults before the software is released to live operation.
     - Failures that occur during live operation are usually more expensive to deal with than failures that occur during testing.
     - The higher the risk, the more we need to test. No risk, no test.
  5. Someone will always test your software, and find the nastiest bugs:
     - Customer or user
     - Competitor
     - A government inspector, an industrial analyst, an insurance inspector
     - An editor of a widespread magazine
     - Channel reseller
     - Systems integrator
     - Potential partner
     - Potential subcontractor
     - Potential employee
     - Software tester
  6. Software testing case studies (1/3)
     - Disney's Lion King CD-ROM, 1994-1995: On 26 December, customer support phones began to ring. The multimedia CD had been tested only on a few systems and did not work on the most common home PCs. The result was a lot of bad publicity, newspaper stories and TV news.
     - Intel Pentium floating-point division bug, 1994: (4195835 / 3145727) * 3145727 - 4195835 should equal 0, but the flawed chip got it wrong. The problem was detected in Intel's own tests before the chip was released, but management decided that the problem was not severe or likely enough to fix, or even to publish. In the end, replacing the faulty chips cost $400 million.
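The Pentium test expression above can be checked with exact rational arithmetic: with a correct divider the result is exactly zero (flawed Pentiums famously returned 256 for it). A minimal sketch in Python:

```python
from fractions import Fraction

# The classic Pentium FDIV test expression: with exact division,
# (x / y) * y - x is always 0.
x, y = 4195835, 3145727
result = Fraction(x, y) * y - x
print(result)  # 0

# The same check in IEEE-754 double precision; rounding in x / y can
# leave a tiny residual, so compare against a small tolerance rather
# than demanding an exact zero.
fp_result = (x / y) * y - x
print(abs(fp_result) < 1e-6)  # True on a correct FPU
```

The exact-arithmetic version is the specification; the floating-point version is what a quick acceptance test on real hardware would look like.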
  7. Software testing case studies (2/3)
     - NASA Mars Polar Lander, 1999: There were multiple testing teams. One team tested the contact switch that turns the landing thrusters off after landing, and another team tested that the leg opening mechanisms worked. When the legs snapped open, the vibration caused the contact switch to turn off the thrusters 1800 feet above the Martian surface, and the Polar Lander smashed into the ground.
     - Patriot missile defense system, 1991: The system failed to defend against several Iraqi Scud missiles, one of which killed 28 U.S. soldiers. Analysis found a software bug: a small timing error accumulated to the point that after 14 hours of operation the tracking system was no longer accurate.
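The Patriot timing error can be reproduced numerically. The published analysis is that the constant 1/10 (the clock tick in seconds) was chopped to a 24-bit fixed-point value, losing roughly 0.000000095 seconds per tick. The sketch below assumes chopping after 23 fractional bits, which is the assumption that matches that widely cited per-tick error, and shows how the loss accumulates over hours of uptime:

```python
from fractions import Fraction

# The system counted time in 0.1 s ticks and multiplied the tick count
# by a stored approximation of 1/10. Chopping 1/10 after 23 fractional
# bits (an assumption chosen to match the widely cited ~9.5e-8 s
# per-tick error) gives:
stored_tenth = Fraction(int(Fraction(1, 10) * 2**23), 2**23)
error_per_tick = Fraction(1, 10) - stored_tenth
print(float(error_per_tick))  # ~9.54e-08 seconds lost per tick

def drift_seconds(hours):
    """Accumulated clock drift after `hours` of continuous operation."""
    ticks = hours * 3600 * 10  # ten 0.1 s ticks per second
    return float(error_per_tick * ticks)

print(drift_seconds(14))   # ~0.05 s
print(drift_seconds(100))  # ~0.34 s, enough to misplace a fast target by hundreds of meters
```

The point of the demo is that each individual error is far below any plausible tolerance; only long continuous operation, a condition the tests apparently never exercised, makes it fatal.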
  8. Software testing case studies (3/3)
     - The Y2K bug, early 1970s: A programmer was writing a payroll system for his company. The computer had very little memory for storage, so the programmer decided to use only 2 digits to store years, because this would save a lot of memory in a system that relies heavily on date processing. He briefly considered the problems that would occur when the year hit 2000, but decided that the system would surely be replaced by then.
     - It is estimated that several hundred billion dollars were spent to replace or update such systems to fix potential year 2000 failures.
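The failure mode of two-digit years is easy to demonstrate: arithmetic on the stored value silently goes wrong once dates cross the century boundary. A minimal sketch (the payroll-style service calculation is a hypothetical illustration, not code from any original system):

```python
# Years stored as two digits, as in the slide's payroll example.
def years_of_service(hire_year_2d, current_year_2d):
    # Hypothetical payroll logic: subtract the two-digit years directly.
    return current_year_2d - hire_year_2d

# Works fine within one century...
print(years_of_service(65, 99))  # 34 years: hired in '65, now '99

# ...but breaks the moment the year 2000 is stored as "00".
print(years_of_service(65, 0))   # -65 years: nonsense
```

Nothing crashes and no error is raised, which is exactly why such defects could sit undetected for decades.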
  9. Why do defects occur in software?
     - Software is written by human beings: people who know something but not everything, who have skills but aren't perfect, who don't usually use rigorous methods, and who do make mistakes (errors).
     - Developers work under increasing pressure to deliver to strict deadlines: there is no time to check, assumptions may be wrong, and systems may be incomplete.
     - Software is complex, abstract and invisible: hard to understand, and hard to see whether it is complete or working correctly. No one person can fully understand a large system, and there are numerous external interfaces and dependencies.
  10. Sources of defects
     - Education: the developer does not understand well enough what he or she is doing; lack of proper education leads to errors in specification, design, coding, and testing.
     - Communication: developers do not know enough, information does not reach all stakeholders, or information is lost.
     - Oversight: omitting to do necessary things.
     - Transcription: the developer knows what to do but simply makes a mistake.
     - Process: the process is not applicable to the actual situation, or the process places restrictions that cause errors.
  11. What do software faults cost?
     - Huge sums: Ariane 5 ($7 billion), the Mariner space probe to Venus ($250 million), the American Airlines seat booking system ($50 million), the Intel Pentium floating-point division bug ($400 million), Y2K (estimated at several hundred billion dollars worldwide).
     - Or very little or nothing at all: a minor inconvenience with no visible or physically detrimental impact.
     - Software is not "linear": a small input may have a very large effect.
     - "The national annual costs of an inadequate infrastructure for software testing is estimated to range from $22.2 to $59.5 billion." (NIST Final Report: The Economic Impacts of Inadequate Infrastructure for Software Testing, 2002)
  12. Safety-critical systems: software faults can cause death or injury.
     - Radiation treatment kills patients (Therac-25)
     - Aircraft crashes
     - Industrial process control software
     - Hardware controllers
     - ...
  13. Example from Finland: "The Helsinki health department's information system halted again. Waiting times grew longer, and doctors fear that patient safety is being endangered. The Helsinki health department's patient information system was brought back up on Wednesday morning, but by the afternoon it had halted again. The problems with the new Pegasos information system began last Thursday, and at the start of this week it froze completely. Experts from the system and database vendor worked on the fault all Tuesday evening, and on Wednesday morning a gradual restart of the system began." (Helsingin Sanomat, 6.2.2003, translated from Finnish)
  14. What is Software Testing?
  15. What is software testing?
     - Finding defects; trying to break the system; finding and reporting defects
     - Demonstrating correct functionality; demonstrating incorrect functionality
     - Demonstrating robustness, reliability, security, maintainability, ...
     - Measuring performance, reliability, ...
     - Evaluating and measuring quality
     - Proving the software correct
     - Executing pre-defined test cases
     - Automatic error detection
     - ...
  16. The growth of software testing (Gelperin and Hetzel, 1988, "The Growth of Software Testing", Communications of the ACM)
     - Until 1956: Debugging-oriented period. Testing was not distinguished from debugging.
     - 1957-1978: Demonstration-oriented period. The goal was to demonstrate that software satisfies its specification.
     - 1979-1982: Destruction-oriented period. The goal was to detect implementation faults.
     - 1983-1987: Evaluation-oriented period. The goal was to detect requirements, design, and implementation faults.
     - 1988-: Prevention-oriented period. The goal is to prevent requirements, design, and implementation faults.
  17. Definition of software testing (Glenford Myers, 1979): Testing is the execution of programs with the intent of finding defects.
  18. Definition of software testing (Ilene Burnstein, 2002): Testing is the process of exercising a software component using a selected set of test cases, with the intent of revealing defects and evaluating quality. (Testing can be described as a process used for revealing defects in software, and for establishing that the software has attained a specified degree of quality with respect to selected attributes.)
  19. Definition of software testing (Pol & van Veenendaal, 2002): Testing is a process of planning, preparation, execution and analysis, aimed at establishing the characteristics of an information system and demonstrating the difference between the actual and the required status.
  20. Definition of software testing (Cem Kaner, 2004): Software testing is a technical investigation of a product, i.e. an empirical search for quality-related information of value to a project's stakeholders.
  21. The context-driven school of testing (http://www.context-driven-testing.com/)
     1. The value of any practice depends on its context.
     2. There are good practices in context, but there are no best practices.
     3. People, working together, are the most important part of any project's context.
     4. Projects unfold over time in ways that are often not predictable.
     5. The product is a solution. If the problem isn't solved, the product doesn't work.
     6. Good software testing is a challenging intellectual process.
     7. Only through judgment and skill, exercised cooperatively throughout the entire project, are we able to do the right things at the right times to effectively test our products.
  22. Testing is an integral part of development
     - Often, testing is seen as a separate, last phase of the software development process: something that can be outsourced to a separate testing team, and that only needs to be done just before the release, if there is any time.
     - Testing cannot be separated from the rest of software development, and it is much more than the final acceptance testing phase.
     - Testing has to be involved from the beginning. Testers can, and should, contribute in each phase of the software development life cycle.
     - Testing is not a phase.
  23. Reputation of testing and testers
     - Traditionally, testers and their job have not been respected: bad programmers and poor designers are put into the testing team, these people do a bad job in testing, and so the job is not respected.
     - Testing is very creative work that, to be done well, requires professionals as skilled and experienced as any job.
     - Actually, good testing probably requires more skill and experience than programming; the skills, however, are different.
  24. We need professional testers
     - Developers can't find their own defects: they have no motivation, no skills, and no time for testing.
     - Testers have the tester's mindset: the goal is to find defects and break the software, then get the defects fixed together with the developers.
     - Testers are independent verifiers; they can test the software from the customer's or user's viewpoint.
     - Testers are professionals: they have the skills, tools, and experience.
  25. Some skills of a good tester
     - The tester's (destructive) attitude and mindset
     - Excellent communication skills
     - Ability to manage many details
     - Knowing the different testing techniques and strategies, and when and how to apply them
     - Familiarity with the domain the software is targeted at
     - Comprehensive knowledge of the software engineering discipline
     - Knowledge and experience of how software is specified, designed, and developed
     - Knowledge of fault and failure types
  26. Are testers destructive?
     - Finding defects is a destructive task that requires a destructive attitude and mindset.
     - The testers' goal, however, is to help the development team develop high-quality software; the ultimate goal is to construct software with the required level of quality.
     - The tester's job is not to be destructive: be destructive towards the software during testing tasks, but constructive towards people and the development project when reporting.
  27. Software Quality and Quality Assurance
  28. Describe a high quality software system: defect free, feature rich, high sales figures, award-winning product, fast to learn, fast to use, nice graphics, low system requirements, easy to install and upgrade, reliable customer support, robust, cool, adaptable, compatible, cheap, expensive, simple, big and complicated, small and compact, large and scalable, extensible, standardized, open-source, reliable vendor, fit-for-purpose, multipurpose, fast upgrade cycle, never crashes, open access, secure, maintainable, fast time-to-market, appeals to the majority, perfect for the target segment
  29. Definition of software quality
     - Quality: (1) the degree to which a system, component or process meets specified requirements; (2) the degree to which a system, component, or process meets customer or user needs or expectations. (IEEE Standard Glossary of Software Engineering Terminology, IEEE 610.12)
     - "Quality is value to some person(s)." (Gerald M. Weinberg, 1992, "Quality Software Management")
  30. Users' viewpoint differs from yours
     - All stakeholders have different viewpoints on quality: customer, end user, programmer, project manager, tester.
     - Note: a bug-free product can still be unacceptable to the user. The goal of a software system is to solve the customer's problem.
     - Therefore: requirements should be validated, usability needs to be considered, documentation should be tested, and the customer feedback process needs to be working.
  31. Process quality and product quality
     - Quality in the process should lead to quality in the product. There are numerous models, but this has not been proven to be always true.
     - A project is an instantiated process.
     - Quality according to ISO 9126: process quality contributes to improving product quality, which in turn contributes to improving quality in use.
  32. The McCall quality model
     - Product Revision: maintainability, flexibility, testability
     - Product Transition: portability, reusability, interoperability
     - Product Operations: correctness, reliability, efficiency, integrity, usability
  33. ISO 9126 quality characteristics (ISO/IEC 9126)
     - Functionality: the capability to provide functions that meet stated or implied needs.
     - Reliability: the capability to maintain a specified level of performance.
     - Usability: the capability to be understood, learned, used and attractive to the user.
     - Efficiency: the capability to provide appropriate performance, relative to the amount of resources used, under stated conditions.
     - Maintainability: the capability to be modified.
     - Portability: the capability to be transferred from one environment to another.
  34. ISO 9126 quality attributes
     - Functionality: suitability, accurateness, interoperability, security
     - Reliability: maturity, fault tolerance, recoverability
     - Usability: understandability, learnability, operability, attractiveness
     - Efficiency: time behavior, resource behavior
     - Maintainability: analyzability, changeability, stability, testability
     - Portability: adaptability, installability, conformance, replaceability
  35. Focus on different quality attributes in testing
     - Traditionally the emphasis has been on functionality.
     - There is growing emphasis on usability, efficiency, scalability, and security, since everything has to be accessible from the public web.
     - Internal quality attributes are important for products with a long life cycle: maintainability, portability, testability, ...
  36. Good Enough Quality (defined by James Bach, IEEE Computer, 1997, vol. 30, no. 8, pp. 96-98)
     To claim that any given thing is good enough is to agree with all of the following propositions:
     - It has sufficient benefits.
     - It has no critical problems.
     - The benefits sufficiently outweigh the problems.
     - In the present situation, and all things considered, further improvement would be more harmful than helpful.
  38. Quality Assurance
     1. A planned and systematic pattern of all actions necessary to provide adequate confidence that an item or product conforms to established technical requirements.
     2. A set of activities designed to evaluate the process by which products are developed or manufactured.
     QA activities include: application of sound technical methods and tools, formal technical reviews and inspections, software testing, enforcement of standards, documentation, control of change, extensive measurement, and record keeping and reporting of the process.
  39. QA vs. Testing
     - Different attitude: QA is focused on building in quality and preventing defects from ever happening (constructive); testing is focused on finding defects and showing the level of quality (destructive).
     - Testing can be seen as a part of QA.
     - QA also includes measuring and tracking, process improvement, and assessing quality and reacting to variances.
