    Class 9 Presentation Transcript

    • CES 592 Telecommunications System Product Verification Sonoma State University
      • Class Lecture 9:
      • Software Testing Processes
      Fall 2004
    • Instructors
      • France Antelme
      • Ario Bigattini
      • Jaseem Masood
      • Steven Woody
      • Coordinator: Prof Ravi Kumar
      Website http://www.sonoma.edu/engineering/courses/CES592.shtml
    • Software Testing Process
      • Test Plans and Test Cases
      • Regression versus New Feature testing
      • Test Entrance / Exit Criteria
      • Version Control
      • Defect Tracking
    • Software Testing Process
      • Test Plans and Test Cases
      • A test plan is a document that describes the planned test activities. For a large software project, this may be divided into separate feature test plans.
      • A test case is a list of steps that tests a specific feature. It should be no more than one page long and must contain pass/fail criteria.
      • A test case matrix is commonly used in a test plan to identify which combinations/permutations of conditions will be tested, as in the sketch below.
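One way to realize a test case matrix in an automated harness is to enumerate the condition combinations as test parameters. The sketch below is a hypothetical pytest example; validate_frame and its boundary values are illustrative stand-ins, not part of the course material.

```python
# A minimal sketch of a test case matrix, assuming pytest is available.
import pytest

def validate_frame(size: int, checksum_ok: bool) -> bool:
    """Placeholder feature: accept frames of 64-1518 bytes with a good checksum."""
    return 64 <= size <= 1518 and checksum_ok

SIZES = [63, 64, 1518, 1519]      # boundary values for the size condition
CHECKSUMS = [True, False]         # checksum condition

# Each (size, checksum) pair is one cell of the test case matrix.
@pytest.mark.parametrize("checksum_ok", CHECKSUMS)
@pytest.mark.parametrize("size", SIZES)
def test_frame_matrix(size, checksum_ok):
    expected = (64 <= size <= 1518) and checksum_ok
    assert validate_frame(size, checksum_ok) == expected
```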
    • Software Testing Process: Test Plans
      • From IEEE Standard for Software Test Documentation IEEE 829-1998
      • Test plan identifier – Name / Number
      • Introduction – Brief description of product & test strategy
      • Test items – Description of item to be tested
      • Features to be tested - List
      • Features not to be tested – Very important to list, prevents assumptions
      • Approach – Describe your test strategy
      • Item pass/fail criteria – Must have pass/fail criteria for tests
      • Suspension criteria and resumption requirements – (Entry / Exit criteria)
      • Test deliverables – Results: Performance charts, bug lists, bug charts
      • Testing tasks
      • Environmental needs – Lab setup
      • Responsibilities – Define & agree
      • Staffing and training needs – Continuous training, eye on future
      • Schedule – Time to perform test cases vs. time for software to stabilize
      • Risks and contingencies – Identify real & meaningful risks
      • Approvals – Management, Customer Account team
    • Software Testing Process: Test Plans
      • Project Milestones:
        • Test plan approved
        • Testbed ready
        • Scripts complete
        • Testing Start
        • Testing Finish
        • Script transferred to regression
    • Software Testing Process: Test Plans
      • Schedule estimation is difficult!
      • Have a target schedule (make your best estimate based on prior experience)
      • When 75% of that schedule has elapsed, nail down a specific date for test completion, taking into account the progress thus far.
      • Keep track of your project history, so that you can make more accurate schedule estimates for the next project
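A minimal sketch of the 75% re-estimate described above, extrapolating from the test cases completed so far; every number in it is an assumption chosen for illustration.

```python
# Illustrative re-estimate at the 75% point of the target schedule.
target_weeks = 12
elapsed_weeks = 9            # 75% of the target schedule has elapsed
cases_planned = 200
cases_completed = 120        # progress so far

rate = cases_completed / elapsed_weeks                    # ~13.3 cases per week
weeks_remaining = (cases_planned - cases_completed) / rate
print(f"Revised completion: {elapsed_weeks + weeks_remaining:.1f} weeks total")
# Revised completion: 15.0 weeks total (vs. the original 12-week target)
```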
    • Software Testing Process: Test Cases
      • Test Case identifier – Name / Number
      • Test case owner – Who wrote it? Who is responsible for updating it?
      • Item to be tested – Describe
      • Input/Output Specification
      • Item pass/fail criteria – Must have pass/fail criteria for test case
      • Automation – Can it be automated? Is it automated? Point to file.
      • Testing tasks – one-page procedure
      • Environmental needs – Lab setup
      • Special procedural requirements
      • Inter-case dependencies
      • Test case priority – changes constantly, depends on many factors
      • Track which software version this test case is valid for
      • Schedule – Time to perform the test case – keep track of it
      • Bugs found by this test case – update every time a new bug is found
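The fields above can be kept consistent by capturing each test case as a simple record. This is a minimal sketch; the field names are illustrative and not prescribed by IEEE 829.

```python
# Illustrative test case record covering the fields listed above.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class TestCase:
    identifier: str                      # name / number
    owner: str                           # who wrote it / maintains it
    item_under_test: str
    pass_fail_criteria: str
    automated: bool = False
    automation_script: Optional[str] = None        # path to script, if automated
    procedure: str = ""                  # one-page procedure
    environment: str = ""                # lab setup
    dependencies: List[str] = field(default_factory=list)
    priority: int = 3                    # re-evaluate as the project evolves
    valid_for_versions: List[str] = field(default_factory=list)
    estimated_minutes: int = 0
    bugs_found: List[str] = field(default_factory=list)   # bug IDs

tc = TestCase(
    identifier="TC-042",
    owner="tester@example.com",
    item_under_test="Link failover",
    pass_fail_criteria="Traffic recovers within 50 ms of link failure",
)
```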
    • Software Testing Process: Test Cases
      • Start writing test cases as the product requirements are being defined.
      • Show your test cases to the developer who is doing the coding. Get him/her thinking about how the feature will be tested. Help prevent the bugs from ever getting into the code.
      • Test cases should give enough information to perform the test, but not so much detail that everyone will perform the test in the same way. You want to allow some variation and randomness in the test case.
    • Software Testing Process: Test Cases
      • Positive Test Cases: Does the feature work as required?
      • Negative Test Cases: Can the feature be broken?
          • Network: Disconnected, No Ports available…
          • Disk Storage: File not found, File in use, Disk Full, Invalid Path, CRC error
          • Memory: Not enough free memory, fragments too small…
      • Do the Positive test cases first, to verify the feature is working. Then try to break it with the Negative test cases, as in the sketch below.
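A minimal sketch of that ordering in pytest, assuming a hypothetical save_file() feature: one positive case verifies normal operation, then two negative cases try to break it with bad paths.

```python
# Hypothetical feature under test: save_file() writes data to disk.
import pytest

def save_file(path: str, data: bytes) -> None:
    with open(path, "wb") as f:
        f.write(data)

# Positive test case: the feature works as required.
def test_save_file_roundtrip(tmp_path):
    target = tmp_path / "out.bin"
    save_file(str(target), b"hello")
    assert target.read_bytes() == b"hello"

# Negative test cases: try to break the feature.
def test_save_file_missing_directory(tmp_path):
    target = tmp_path / "no_such_dir" / "out.bin"
    with pytest.raises(FileNotFoundError):        # "File not found" class of failures
        save_file(str(target), b"hello")

def test_save_file_path_is_a_directory(tmp_path):
    with pytest.raises(OSError):                  # invalid path: the target is a directory
        save_file(str(tmp_path), b"hello")
```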
    • Software Testing Process
      • Test Plans and Test Cases
      • Regression versus New Feature testing
      • Test Entrance / Exit Criteria
      • Version Control
      • Defect Tracking
    • Software Testing Process: Regression versus New Feature Testing
      • New Feature Testing: Testing the new features that have been added since the previous test cycle
      • Regression Testing: Re-testing “old” functionality to make sure that no features were inadvertently broken
    • Software Testing Process: Regression versus New Feature Testing
      • New Feature Testing: Usually a small percentage of the overall number of tests to be performed
      • Regression Testing: Number of tests grows with each new feature added; can easily consume all testing resources
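One way to keep the regression/new-feature split visible in an automated suite is to tag each test. The sketch below uses pytest markers; the marker names and the placeholder tests are assumptions, not a standard.

```python
# Hypothetical tagging scheme so the two suites can be run (and sized) separately.
# The marker names must be registered in pytest.ini / pyproject.toml to avoid warnings.
import pytest

@pytest.mark.new_feature
def test_vlan_tagging_added_this_cycle():
    assert (802, "1Q") == (802, "1Q")     # placeholder for a new-feature check

@pytest.mark.regression
def test_basic_forwarding_still_works():
    assert "frame".upper() == "FRAME"     # placeholder for an old-feature check

# Run only the regression suite:   pytest -m regression
# Run only the new-feature tests:  pytest -m new_feature
```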
    • Software Testing Process: Regression Test Priorities
      • Check that the bug was actually fixed
      • Check for related bugs
      • Check that the fix didn’t break something else
      (Testing Computer Software, Kaner, Falk, Nguyen)
    • Software Testing Process
      • Test Plans and Test Cases
      • Regression versus New Feature testing
      • Test Entrance / Exit Criteria
      • Version Control
      • Defect Tracking
    • Software Testing Process
      • Entrance / Exit Criteria
    • Software Testing Process
      • Software Testing Entrance Criteria
      • Feature list defined & completed
      • Code Review completed, documented
      • Static Analysis completed
      • Draft user documents ready
      • Unit testing plan completed
      • Does the software meet the entrance criteria?
      • If so, then software test will be complete in 12 weeks.
      • If not, then software test will not complete on time.
      • Measure the number & severity of bugs found after test start: if the bug count exceeds a limit, then the entrance criteria were not met and the 12-week schedule will not be met, as in the sketch below.
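A minimal sketch of that measurement; the severity weights and the limit are illustrative assumptions, not values from the lecture.

```python
# Illustrative check: were too many bugs found right after test start?
SEVERITY_WEIGHT = {"catastrophic": 10, "severe": 5, "moderate": 2, "minor": 1}
WEIGHTED_LIMIT = 25    # above this, treat the entrance criteria as not having been met

def entrance_criteria_still_credible(bug_severities):
    """bug_severities: list of severity strings for bugs found after test start."""
    score = sum(SEVERITY_WEIGHT[s] for s in bug_severities)
    return score <= WEIGHTED_LIMIT

print(entrance_criteria_still_credible(["severe", "severe", "minor"]))   # True
print(entrance_criteria_still_credible(["catastrophic"] * 3))            # False
```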
    • Software Testing Process
      • Exit Criteria
      • Is the software safe for people, property and data? (“First, do no harm”)
      • All test team members agree that it is fit for use?
      • Feature testing: At least one full test cycle with >75% pass rate
      • Regression Testing completed, >90% pass rate
      • Code Coverage analysis completed, >70% code coverage
      • No showstopper or catastrophic bugs open
      • Final review of all open bugs
      • Deliverables
      • Automated regression scripts
      • Updated test plan, test cases, test scripts (constantly re-evaluate)
      • Defect Counts, trend summaries
      • Release notes reviewed
      • User Docs reviewed
      • Customer support training
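A minimal sketch of gating on the numeric exit thresholds above; the function and its inputs are illustrative assumptions.

```python
# Illustrative exit-criteria gate using the numeric thresholds from the slide above.
def exit_criteria_met(feature_pass_rate, regression_pass_rate,
                      code_coverage, open_showstoppers):
    """Rates and coverage are fractions in [0, 1]; open_showstoppers is a count."""
    return (feature_pass_rate > 0.75 and
            regression_pass_rate > 0.90 and
            code_coverage > 0.70 and
            open_showstoppers == 0)

print(exit_criteria_met(0.82, 0.95, 0.73, 0))   # True: ready for the final bug review
print(exit_criteria_met(0.82, 0.95, 0.73, 1))   # False: a showstopper is still open
```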
    • Software Testing Process
      • A word about Entrance Criteria…
      • Don’t “burn your lead time” by waiting for the perfect, 100% complete software build to be delivered. Start testing as soon as you can.
      [Timeline diagram: Coding Start, Coding Finished, Testing Start, Testing Finished, Official Testing Start – must meet entrance criteria]
    • Software Testing Process
      • Test Plans and Test Cases
      • Regression versus New Feature testing
      • Test Entrance / Exit Criteria
      • Version Control
      • Defect Tracking
    • Software Testing Process: Version Control
      • Version control for software under test
      • One central source for software to be tested
      • Version control for test plans, test cases, test scripts
      • Keep track of what was tested on what version of software & hardware
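A minimal sketch of that bookkeeping: append one row per test run with the software and hardware versions it was executed against. The file name and field layout are assumptions.

```python
# Illustrative test-run log: record which software/hardware versions each test ran against.
import csv
from datetime import date

def log_test_run(path, test_case_id, sw_version, hw_version, result):
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow(
            [date.today().isoformat(), test_case_id, sw_version, hw_version, result])

log_test_run("test_runs.csv", "TC-042", "build-3.1.7", "chassis-revB", "pass")
log_test_run("test_runs.csv", "TC-043", "build-3.1.7", "chassis-revB", "fail")
```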
    • Software Testing Process: Version Control
      • CVS - Concurrent Versions System
      • RCS - Revision Control System
      • Subversion
      • ClearCase - Rational Software
      • PVCS - Merant
    • Software Testing Process
      • Test Plans and Test Cases
      • Regression versus New Feature testing
      • Test Entrance / Exit Criteria
      • Version Control
      • Defect Tracking
    • Software Testing Process: Defect Tracking
      • Agreed-upon definitions for bug severity, before the project starts
      • Quality is everyone’s job – if you see a problem, it is your responsibility to enter a bug report
      • Weekly bug review meeting - “bug scrub”
    • Software Testing Process: Defect Tracking
      • Bugzilla
      • GNATS
      • Debian Bug Tracking System
      • ClearDDTS – Rational Software
      • SilkRadar – Segue Software
      http://testingfaqs.org/t-track.html
    • Software Testing Process: Defect Reports
      • Title
      • Severity category
      • Inputs
      • Expected results
      • Actual results
      • Anomalies Observed
      • Bug Info: Error logs, debug traces
      • Date and time
      • Procedure step
      • Environment
      • Attempts to repeat / Steps to repeat
      • Testers
      • Observers
      • Last version observed to pass
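A minimal sketch that captures the fields above as a record and renders it to plain text for pasting into a tracker; the layout and the sample values are assumptions, not a Bugzilla or GNATS format.

```python
# Illustrative defect report covering the fields listed above.
from dataclasses import dataclass, asdict

@dataclass
class DefectReport:
    title: str                  # the most important field: keep it clear and concise
    severity: str
    inputs: str
    expected_results: str
    actual_results: str
    environment: str
    steps_to_repeat: str
    last_version_observed_to_pass: str

def render(report: DefectReport) -> str:
    return "\n".join(f"{name}: {value}" for name, value in asdict(report).items())

print(render(DefectReport(
    title="Failover drops traffic for ~2 s when the standby port is half-duplex",
    severity="severe",
    inputs="Continuous 64-byte frames at 100 Mb/s; pull the primary link",
    expected_results="Traffic recovers within 50 ms",
    actual_results="Traffic lost for roughly 2 s",
    environment="build-3.1.7, chassis-revB, lab rack 4",
    steps_to_repeat="Set port 2 to half-duplex, start traffic, pull the port 1 cable",
    last_version_observed_to_pass="build-3.1.5",
)))
```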
    • Software Testing Process: Defect Reports
      • The Title is the most important part of the Defect Report – make it clear and concise!
      • Find the simplest trigger for the bug
      • Find the most serious consequences of the bug
      • Try to determine if the bug is the tip of an ice cube or the tip of an iceberg
      • Always give the developer a chance to debug it in real time – it may take you hours to reproduce it later
    • Software Testing Process: Exit Criteria
      • A word about exit criteria: watch the defect trend to know if you are ready to release.
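A minimal sketch of watching that trend: track the open-bug count per week and check that it has been falling; the counts and the three-week window are assumptions.

```python
# Illustrative defect-trend check: is the open-bug count falling week over week?
weekly_open_bugs = [42, 38, 31, 27, 20, 14]   # example counts, newest last

def trending_down(counts, weeks=3):
    """True if the open-bug count fell in each of the last `weeks` weeks."""
    recent = counts[-(weeks + 1):]
    return all(earlier > later for earlier, later in zip(recent, recent[1:]))

print(trending_down(weekly_open_bugs))   # True: the trend supports releasing
```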
    • References
      • Testing Computer Software, by C. Kaner, J. Falk, and H. Nguyen
      • Managing the Testing Process, Rex Black
      • Software Test Automation, Fewster & Graham
      • Code Complete, Steve McConnell
      • “Software Testing and Quality Assurance”, Ross Collard
      • IEEE Standard for Software Test Documentation, IEEE Std 829-1998
      • Black-Box Testing: Techniques for Functional Testing of Software and Systems, Boris Beizer, Wiley, 1995
      • Classic Testing Mistakes, Brian Marick
      • http://www.testing.com/writings/classic/mistakes.pdf
      • Software QA / Test Resource Center
      • http://www.softwareqatest.com/index.html