Testing Software In An Integrated Environment
Testing Software In An Integrated Environment Presentation Transcript

  • 1. Testing Software In An Integrated Environment
  • 2. References
    • Gerald D. Everett and Raymond McLeod Jr., Software Testing: Testing Across the Entire Software Development Life Cycle, John Wiley & Sons, 2007, ISBN 978-0-471-79371-7
    • Marnie L. Hutcheson, Software Testing Fundamentals: Methods and Metrics, John Wiley & Sons, 2003, ISBN 0-471-43020-X
    • Rex Black, Critical Testing Processes: Plan, Prepare, Perform, Perfect, Addison-Wesley, 2004, ISBN 0-201-74868-1
  • 3. Topics
    • What is testing and why is it important?
    • Software development life cycle and the role of testing
    • Structured testing
    • Testing strategies
    • Test planning
    • Types of testing
      • Static
      • Functional
      • Structural
      • Performance
    • Testing environment
    • Automated testing tools
    • Analyzing and interpreting test results
  • 4. What is testing and why is it important?
    • Testing is about reducing risks and minimizing failures
    • The cost of software failures 1
      • $59.5 billion annually
      • $22.2 billion is avoidable
    1 Everett and McLeod, page 1
  • 5. Characteristics of Testing
    • Technical thinking
      • Ability to model technology
      • Understand technology model’s inherent cause/effect relationships
    • Creative thinking
      • Ability to generate ideas and see possibilities
    • Critical thinking
      • Evaluate ideas
      • Make inferences
    • Practical thinking
      • Ability to put ideas into practice
  • 6. Primary Objectives of Testing
    • Identify magnitude and sources of development risk reducible by testing
      • What is the risk?
      • Where does the risk occur?
      • How great is the risk?
    • Perform testing to reduce identified risks
      • What to test?
      • What not to test?
    • Know when testing is complete
      • Results and measurements
    • Manage testing as a standard project within the development project
  • 7. Testing as a Project
    • Use the WBS
    • Predecessors and successors
    • Critical path considerations
    • Task analysis
      • Requirements to begin
      • Inputs
      • Outputs
      • Measurements and results to show completion
  • 8. Including Testing in a Project Plan
    • In structuring a development project, should testing be:
      • A stand-alone WBS decomposed to include all testing aspects?
      • A sub-element of every WBS?
      • Something else?
    • Why?
  • 9. Alternate Approaches to Including Testing in the Overall Project Plan
    • Single WBS
    • WBS Sub-element
    [Diagram: two alternatives — a single stand-alone WBS element (e.g., 8. System Testing) covering all testing, versus a testing sub-element included within each WBS grouping (1.1.1 Testing, 1.2.1 Testing, and so on).]
  • 10. Primary Objectives of Testing
    • Identify magnitude and sources of reducible development risk
      • No testing is justified unless the business case supports it
      • How much can the risk be reduced by testing?
    • Reduce identified risks
      • Risks cannot be completely eliminated
    • Know when testing is complete
      • Validate all development requirements
    • Manage testing as a standard project within the development project
      • Application and commitment of resources
      • Task Analysis approach
    Quality must be built in because quality cannot be tested in!
  • 11. Value of Testing Versus its Cost
    • Testing requires same ROI decision as any other project
    • Kinds of loss
      • Revenue
      • Profit
      • Resources
      • Customers
      • Litigation
  • 12. Correcting New Software
    • Basili and Boehm’s Rule of Exponentiality
    [Chart: the cost of correcting defects rises exponentially across the development phases — design & code, compile or bind, pre-production integration, post release. Source: Everett & McLeod, page 14]
  • 13. Principles of Good Software Testing
    • Business risk can be reduced by finding defects.
    • Positive and negative testing both contribute to reducing risk
    • Static and execution testing both contribute to reducing risk
    • Automated test tools can contribute to reducing risk
    • Make the highest risks the initial testing priority
    • Make the most frequent business activities the second testing priority
    • Statistical analyses of defect arrival patterns and other defect characteristics are highly effective in forecasting testing complexity
    • Test the system the way it will be used
    • Assume that defects occur as the result of process rather than personality
    • Testing for defects is an investment as well as a cost
  • 14. Sources of Defects in Developed Software
    [Chart: percentage of defects by development phase — roughly 85% originate during design & code, with the remainder introduced at compile or bind, pre-production integration, and post release. Source: Everett & McLeod, page 21]
  • 15. Software Development Life Cycle and the Role of Testing
    [Diagram: waterfall development life cycle — planning, analysis, design, and implementation stages, ending in cutover. Source: Everett and McLeod, page 31]
  • 16. Software Development Life Cycle and the Role of Testing
    [Diagram: prototype development life cycle — preliminary investigation, design/analysis with a prototype accept/reject loop, construction of complete system components, and cutover. Source: Everett and McLeod, page 32]
  • 17. Software Development Life Cycle and the Role of Testing
    [Diagram: RAD life cycle spanning the information systems department and the user community — requirements planning, user design, construction, and cutover. Source: Everett and McLeod, page 33]
  • 18. Software Development Life Cycle and the Role of Testing
    [Diagram: prototyping development methodology — a preliminary investigation stage, then analysis, design, and preliminary construction stages repeated across phases 1 through 3, followed by final construction and installation stages. Source: Everett and McLeod, page 34]
  • 19. Prototype Development Methodology – Testing Types
    • Preliminary Investigation – none
    • Analysis – static testing of requirements
    • Design – static testing of all design documents
    • Preliminary construction – static testing of all code; functional tests; performance tests
    • Final construction – static testing of the users guide, operators guide, installation guide, and training material; performance tests; load tests; user acceptance testing; installation testing
    • Installation – none
    • Post-implementation evaluation – monitoring of operation and performance within boundaries
  • 20. The General Systems Model
    [Diagram: the general systems model — input physical resources flow through transformation processes to output physical resources within the physical system; an information processor exchanges data, information, and decisions with management, guided by standards, all set within the environment. Source: Everett and McLeod, page 38]
  • 21. Structured Testing
    • Specifications – written statement of expected software behavior
    • Premeditation – written test plans, test environments, test data, test scripts, and testing schedules
    • Repeatability – multiple executions producing identical results
    • Accountability – written statement of responsibility
    • Economy – addressing the cost/benefit issues with regard to testing
  • 22. Testing Strategies
    • Static testing
      • Pencil and paper
      • Documents
    • White box
      • Source code
      • Logic paths
    • Black box
      • Executable code
      • Behavior
    • Performance
      • Responsiveness
    • Single release versus multiple releases
      • Static
      • White box
      • Black box
      • Performance
      • Regression
  • 23. Test Planning
    • Defined and managed approach to demonstrating the workability of an application or system
    • Consists of
      • Overarching strategy (test plan)
      • Specific evaluations (test cases)
    • Must align with development methodology
  • 24. The Test Plan
    • Purpose – document the philosophy and strategy for assessing the suitability of an application
    • Elements
      • Application to be tested
      • Testing objectives and rationale
      • Test plan scope and limitations
      • Sources of expertise
        • Business
        • Technical
      • Sources of test data
      • Test environment definition and management
      • Testing strategy
      • Testing details
  • 25. Elements of the Test Plan
    • Objectives and rationale
      • Business needs
      • Business risk
      • Measurable
      • Achievable
    • Test plan scope and limitations
      • What will be tested
      • What will not be tested
    • Test environment and management
      • Separate and independent
      • Production environment mirror
      • Controlled/managed by test team
  • 26. Testing Details
    • Applicable development phase
    • Criteria to complete
    • Identification of test cases
      • Identification
      • Summary description
    • Test case schedules
      • Writing
      • Execution
      • Analysis and reporting
  • 27. Test Cases
    • Describes how an application or system will be assessed
      • To satisfy a business purpose or objective
      • To participate in a business process
      • To demonstrate workability
  • 28. Test Case Components
    • Unique test case identifier
    • Unique test case title
    • Test case description
    • Appropriate development phase cross reference
    • Testing goals and achievement measures
    • Test data to be used (recommended)
    • Test tools to be used (recommended)
    • Test start up procedure – required hardware, software, and data needed to begin
    • Test close down procedure – procedure for ending the test nominally
    • Test procedure for rerun – procedure for restarting or repeating this test
    • Test execution steps
      • Step number
      • Step action
      • Expected results
      • Actual results
    • Testing attempt description
    • Software defects identified
    • Successful (yes/no)
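As an illustration of the components listed above, here is a minimal Python sketch of a test case record. The class and field names are illustrative choices, not part of the original slides.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class TestStep:
    """One execution step: the action taken, plus expected and actual results."""
    number: int
    action: str
    expected_result: str
    actual_result: Optional[str] = None

@dataclass
class TestCase:
    """Structured record mirroring the test case components listed above."""
    identifier: str                        # unique test case identifier
    title: str                             # unique test case title
    description: str
    development_phase: str                 # cross reference to the development phase
    goals_and_measures: str                # testing goals and achievement measures
    test_data: Optional[str] = None        # recommended
    test_tools: Optional[str] = None       # recommended
    startup_procedure: str = ""            # hardware, software, and data needed to begin
    closedown_procedure: str = ""          # ending the test nominally
    rerun_procedure: str = ""              # restarting or repeating this test
    steps: List[TestStep] = field(default_factory=list)
    attempt_description: str = ""
    defects_identified: List[str] = field(default_factory=list)
    successful: Optional[bool] = None      # yes/no once the attempt is complete
```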
  • 29. Aligning Test Cases with Development Life Cycle
    [Diagram: test activities mapped to life-cycle phases (preliminary investigation & analysis, design, preliminary construction, final construction, installation, post implementation) — the test plan is drafted early and then updated, and test cases are written, ahead of preliminary construction, final construction, and post implementation; the corresponding test cases are then executed during each of those phases.]
  • 30. Types of Testing
    • Static Testing
    • Functional Testing
    • Structural Testing
    • Performance Testing
  • 31. Static Testing
    • Goal – reduce overall defects by removing defects from the documentation that drives development, so that the system operates correctly
    • Candidate documents to be tested
      • Software development management
      • Software development
      • Tester
      • Administrator
      • End user
  • 32. Static Testing - Documents
    • Software development manager
      • Software requirements
      • Software project plans
    • Software developers
      • Use cases
      • Software designs
      • Software specifications
      • Data flow diagrams
      • Database and file designs
      • Operating environment specifications
      • Interfaces
      • Network specifications
      • Security specifications
      • HCI/GUI specifications
      • Reports
      • Code
    • Testers
      • Test plans
      • Test cases
      • Test environment specifications
      • Test data sources and preparation
      • Test tool installation and operation
    • Administrator
      • Installation guides
      • Operation guides
      • Administration guides
    • End users
      • User guides
      • HELP screens
      • Training manuals
  • 33. Static Testing Techniques
    • Desk checking – editing and proofreading
      • Grammar
      • Spelling
      • Punctuation
    • Inspections – independent editing and proofreading
    • Walk-through – formal assessment
      • Content
      • Inconsistencies
  • 34. Functional Testing
    • Objective – validate the software behavior against the business functionality documented in the software requirements and specifications
    • Approaches
      • User navigation testing
      • Transaction screen testing
      • Transaction flow testing
      • Report screen testing
      • Report flow testing
      • Database CRUD * testing
    * Create, Retrieve, Update, Delete
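A minimal sketch of database CRUD testing in Python, using unittest and a hypothetical in-memory CustomerRepository as a stand-in for the real data layer under test.

```python
import unittest

class CustomerRepository:
    """Hypothetical stand-in for the data layer under test."""
    def __init__(self):
        self._rows = {}
        self._next_id = 1

    def create(self, name):
        row_id = self._next_id
        self._next_id += 1
        self._rows[row_id] = {"id": row_id, "name": name}
        return row_id

    def retrieve(self, row_id):
        return self._rows.get(row_id)

    def update(self, row_id, name):
        if row_id not in self._rows:
            raise KeyError(row_id)
        self._rows[row_id]["name"] = name

    def delete(self, row_id):
        self._rows.pop(row_id, None)

class DatabaseCrudTest(unittest.TestCase):
    """Exercise each CRUD operation and verify the observable behavior."""
    def test_create_retrieve_update_delete(self):
        repo = CustomerRepository()
        row_id = repo.create("Acme Corp")                              # Create
        self.assertEqual(repo.retrieve(row_id)["name"], "Acme Corp")   # Retrieve
        repo.update(row_id, "Acme Corporation")                        # Update
        self.assertEqual(repo.retrieve(row_id)["name"], "Acme Corporation")
        repo.delete(row_id)                                            # Delete
        self.assertIsNone(repo.retrieve(row_id))

if __name__ == "__main__":
    unittest.main()
```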
  • 35. Regression Testing
    • Objective – demonstrate repeatability of results across software changes
    • Test process consistency
      • Test cases
      • Test data
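One common way to get repeatable results across software changes is to rerun the same test cases with the same test data and compare the outputs against a stored baseline. The Python sketch below assumes a hypothetical price_with_tax function and a baseline.json file; both names are illustrative.

```python
import json
from pathlib import Path

def price_with_tax(amount, rate=0.07):
    """Function under regression test; its behavior must stay stable across releases."""
    return round(amount * (1 + rate), 2)

def run_regression(baseline_path="baseline.json"):
    """Re-run the same test cases with the same test data and compare to stored results."""
    test_data = [10.00, 19.99, 250.50]                   # same test data every run
    current = {str(x): price_with_tax(x) for x in test_data}
    baseline_file = Path(baseline_path)
    if not baseline_file.exists():                       # first run: record the baseline
        baseline_file.write_text(json.dumps(current, indent=2))
        return "baseline recorded"
    baseline = json.loads(baseline_file.read_text())
    mismatches = {k: (baseline[k], current[k]) for k in baseline if baseline[k] != current[k]}
    return mismatches or "results identical to baseline"

if __name__ == "__main__":
    print(run_regression())
```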
  • 36. White Box Testing Techniques
    • Statement coverage – determining the percentage of source code lines that were executed
    • Branch coverage – determining the percentage of source code branch logic that has been executed
      • Simple – true/false
      • Compound – true/false plus Boolean operators (AND, OR, NOT)
    • Path coverage – determining percentage of source code paths completely traversed
    • Loop coverage – determining the percentage of source code loops in a program completely executed
      • DO WHILE
      • DO UNTIL
    • Intuition and experience
      • Dates
      • Zero length elements
      • Buffer overflow
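A small Python example of the branch coverage idea: shipping_fee (a hypothetical function) contains one simple and one compound condition, and the test drives each branch outcome. A coverage tool such as coverage.py can report statement and branch coverage for runs like this (for example, coverage run --branch), though the slides do not name a specific tool.

```python
def shipping_fee(weight_kg, is_member):
    """Small function with simple and compound branch logic, used to illustrate coverage."""
    if weight_kg <= 0:                       # simple branch: true/false
        raise ValueError("weight must be positive")
    if is_member and weight_kg < 5:          # compound branch: Boolean AND
        return 0.0
    return 4.99 + 0.5 * weight_kg

def test_branches():
    # Each call is chosen to drive a different outcome of the branch conditions.
    try:
        shipping_fee(0, False)               # first condition true
    except ValueError:
        pass
    assert shipping_fee(2, True) == 0.0      # compound condition: both operands true
    assert shipping_fee(2, False) == 5.99    # compound condition: first operand false
    assert shipping_fee(10, True) == 9.99    # compound condition: second operand false

if __name__ == "__main__":
    test_branches()
    print("all branch outcomes exercised")
```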
  • 37. Zero Length Elements
    • Arrays
    • Blank inputs
    • Divide by zero
    • Loops
    • Pointers
    • Record lengths
    • Empty files
    • Sorts
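A short Python sketch showing how several of these zero-length cases can be probed directly; average() is a deliberately naive helper used only to make the divide-by-zero defect visible.

```python
import io

def average(values):
    """Deliberately naive helper that fails on zero-length input."""
    return sum(values) / len(values)

def test_zero_length_elements():
    # Empty array/list: division by zero inside average().
    try:
        average([])
    except ZeroDivisionError:
        print("empty list exposed a divide-by-zero defect")

    # Blank input: code that assumes at least one character is present.
    name = ""
    print("first initial:", name[0] if name else "(blank input handled)")

    # Empty file: loops that assume at least one record.
    records = [line for line in io.StringIO("")]   # stands in for an empty file
    print("records read from empty file:", len(records))

    # Sorting an empty sequence should simply return an empty result.
    print("sorted empty list:", sorted([]))

if __name__ == "__main__":
    test_zero_length_elements()
```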
  • 38. Black Box Testing Techniques
    • Equivalence class techniques – identifying input data groups that tend to cause the application under test to behave the same way for all values in the group
    • Boundary value techniques – assessing the beginning and ending values of an equivalence class
    • Expected results coverage techniques – assessing the output values associated with a set of input values
    • Intuition and experience
      • Error handling
      • Data type conversions
      • Dates
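A sketch of equivalence class and boundary value selection in Python. The red/yellow/green ranges are borrowed from the expected results in the slide 48 example; classify_power_value is a hypothetical stand-in for the application under test.

```python
def classify_power_value(value):
    """Hypothetical function under test: maps a power reading to a status color."""
    if 1 <= value <= 4:
        return "red"
    if 5 <= value <= 8:
        return "yellow"
    if 9 <= value <= 12:
        return "green"
    raise ValueError("value out of range")

# Equivalence classes: one representative value should behave like the whole group.
EQUIVALENCE_CLASSES = {
    "red": [2, 3],          # interior values of the 1-4 class
    "yellow": [6, 7],       # interior values of the 5-8 class
    "green": [10, 11],      # interior values of the 9-12 class
}

# Boundary values: the beginning and ending value of each equivalence class.
BOUNDARIES = {
    "red": [1, 4],
    "yellow": [5, 8],
    "green": [9, 12],
}

def test_black_box():
    for expected, values in list(EQUIVALENCE_CLASSES.items()) + list(BOUNDARIES.items()):
        for v in values:
            actual = classify_power_value(v)
            assert actual == expected, f"{v}: expected {expected}, got {actual}"
    print("all equivalence-class and boundary-value checks passed")

if __name__ == "__main__":
    test_black_box()
```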
  • 39. Structural Testing
    • Interface testing – addressing the data transferred among components under test
    • Security testing – addressing the integrity and safety of the application or system under test
    • Installation testing – focusing on the operation of the system in the production environment
    • Smoke test – verify that the installed configuration is basically operational
    • Administration testing – verify system management in a production environment
    • Backup/recovery testing – demonstrate recovery and restart following application or system failure
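A sketch of a smoke test in Python: a handful of fast checks that the installed configuration is basically operational before deeper structural testing begins. The module names, the APP_CONFIG variable, and the optional service check are illustrative assumptions, not from the slides.

```python
import importlib
import os
import socket
import tempfile

def smoke_test(required_modules=("json", "sqlite3"), service_host=None, service_port=None):
    """Quick post-installation checks: is the configuration basically operational?"""
    results = {}

    # Required libraries are installed and importable.
    for name in required_modules:
        try:
            importlib.import_module(name)
            results[f"import {name}"] = "ok"
        except ImportError:
            results[f"import {name}"] = "MISSING"

    # Expected configuration is present (APP_CONFIG is a hypothetical variable name).
    results["APP_CONFIG set"] = "ok" if os.environ.get("APP_CONFIG") else "not set"

    # Working storage is writable.
    with tempfile.NamedTemporaryFile() as handle:
        handle.write(b"smoke")
        results["temp storage writable"] = "ok"

    # Optional: can we open a TCP connection to a dependent service?
    if service_host and service_port:
        try:
            with socket.create_connection((service_host, service_port), timeout=2):
                results[f"{service_host}:{service_port} reachable"] = "ok"
        except OSError:
            results[f"{service_host}:{service_port} reachable"] = "UNREACHABLE"

    return results

if __name__ == "__main__":
    for check, outcome in smoke_test().items():
        print(f"{check}: {outcome}")
```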
  • 40. Performance Testing
    • Goal – demonstrate that the application will fully satisfy the service level requirements
      • Responsiveness
      • Throughput
      • Capacity utilization
    • Focus
      • Components
      • Subsystems
      • System overall
  • 41. Performance Testing Expectations
    • Requirements define user/customer expectations
      • Necessary functional capabilities
    • Simulations describe analytical expectations
      • Mathematical models to confirm viability of user requirements
      • Configuration constrained
    • Testing proves validity of expectations
  • 42. Performance Testing Levels of Test
    • Isolation –
      • No competing workload
      • Provides absolute performance values
      • Single instance
    • Under load –
      • Competing workload
      • Staged
        • Expected competing workload
        • Peak competing workload
        • Volume
  • 43. Performance Testing Metrics
    • Vendor-provided measurands
      • Processor
      • Input/Output
      • Communications
    • Supplementary tools
      • Third party vendors
    • Manual measures
      • Stop watch
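When vendor-provided measurands or third-party tools are not available, a software "stop watch" built on a high-resolution timer is often enough for isolation-level measurements. A Python sketch follows, with operation_under_test standing in for a real transaction against the system under test.

```python
import statistics
import time

def operation_under_test():
    """Hypothetical unit of work; replace with a real transaction against the system under test."""
    return sum(i * i for i in range(10_000))

def measure(iterations=200):
    """Software stopwatch: per-call responsiveness and overall throughput."""
    durations = []
    start = time.perf_counter()
    for _ in range(iterations):
        t0 = time.perf_counter()
        operation_under_test()
        durations.append(time.perf_counter() - t0)
    elapsed = time.perf_counter() - start

    print(f"iterations      : {iterations}")
    print(f"mean response   : {statistics.mean(durations) * 1000:.3f} ms")
    print(f"95th percentile : {sorted(durations)[int(0.95 * iterations)] * 1000:.3f} ms")
    print(f"throughput      : {iterations / elapsed:.1f} operations/second")

if __name__ == "__main__":
    measure()
```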
  • 44. Testing Environment
    • Goal – cause the application or system under test to exhibit true production behavior while being observed and measured in a controlled environment
    • Elements
      • Operating system
      • Security
      • File systems
      • Connectivity
      • Application/applications
    • Types
      • Simple
      • Complex
  • 45. Testing Environment - Simple
    [Diagram: simple testing environment — for each version (version 1, version 2, … version n), untested and corrected software moves from the development environment to the staging (test) environment, and only tested software moves on to the production environment; each next version repeats the flow.]
  • 46. Testing Environment - Complex
    [Diagram: complex testing environment — each subsystem (accounting, marketing, manufacturing, warehousing) moves its untested and corrected software from its own development environment into its own staging (test) environment; the enterprise-wide application is then assembled, still untested and corrected, in a final staging (test) environment, and only tested software reaches the production environment.]
  • 47. Automated Testing Tools
    • Test plan models
      • Available in text books
      • Available on the Internet
    • Test case models
      • Available in text books
      • Available on the Internet
    • Test data generation
      • Manual
      • Automated
    • Sampling considerations
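A minimal sketch of automated test data generation in Python. The customer fields and file name are illustrative; the fixed random seed addresses the sampling and repeatability consideration by making each generated data set reproducible.

```python
import csv
import random
import string

def generate_customers(count=10, seed=42):
    """Generate deterministic pseudo-random customer rows for use as test data."""
    rng = random.Random(seed)                      # fixed seed keeps runs repeatable
    rows = []
    for i in range(1, count + 1):
        name = "".join(rng.choices(string.ascii_uppercase, k=8))
        balance = round(rng.uniform(0, 10_000), 2)
        region = rng.choice(["north", "south", "east", "west"])
        rows.append({"id": i, "name": name, "balance": balance, "region": region})
    return rows

def write_test_data(path="test_customers.csv", count=10):
    """Write the generated rows to a CSV file for the test cases to consume."""
    rows = generate_customers(count)
    with open(path, "w", newline="") as handle:
        writer = csv.DictWriter(handle, fieldnames=["id", "name", "balance", "region"])
        writer.writeheader()
        writer.writerows(rows)
    return path

if __name__ == "__main__":
    print("wrote", write_test_data())
```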
  • 48. Analyzing and Interpreting Test Results
    • Test ID or WBS: SATHS 143
    • Test description: Using the approved test case, verify power source reporting for the cited values
    • Expected results: Values 1 – 4 report red; values 5 – 8 report yellow; values 9 – 12 report green
    • Actual results: Values 1 – 3 report red; values 4 – 8 report yellow; values 9 – 12 report green
    • Verdict: Test fails
    • Action: Investigate the test data for correctness; if correct, investigate the code. Make corrections as needed. Repeat the test.
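The same check can be automated. In the Python sketch below, expected_color encodes the expected results from the test case and power_source_color simulates the actual (defective) behavior recorded above, so running it reproduces the failing verdict for value 4; both functions are illustrative stand-ins.

```python
def power_source_color(value):
    """Simulates the system under test, reproducing the defect from the actual results:
    value 4 reports yellow instead of red."""
    if 1 <= value <= 3:
        return "red"
    if 4 <= value <= 8:
        return "yellow"
    if 9 <= value <= 12:
        return "green"
    raise ValueError("value out of range")

def expected_color(value):
    """Expected results from the approved test case: 1-4 red, 5-8 yellow, 9-12 green."""
    if 1 <= value <= 4:
        return "red"
    if 5 <= value <= 8:
        return "yellow"
    return "green"

def run_test_saths_143():
    """Compare actual against expected for every cited value and report a verdict."""
    failures = [(v, expected_color(v), power_source_color(v))
                for v in range(1, 13)
                if power_source_color(v) != expected_color(v)]
    print("Test fails" if failures else "Test passes")
    for value, expected, actual in failures:
        print(f"  value {value}: expected {expected}, reported {actual}")
    if failures:
        print("Action: investigate test data for correctness; if correct, investigate code.")

if __name__ == "__main__":
    run_test_saths_143()
```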