CS527: Advanced Topics in Software Engineering
Transcript

  • 1. CS527: Advanced Topics in Software Engineering (Software Testing and Analysis) Darko Marinov August 23, 2007
  • 2. Course Overview
    • Graduate seminar on program analysis (for bug finding), emphasis: systematic testing
    • Focus on a (research) project
      • Papers: reading in advance, writing reports, presenting, and discussing
      • Project: initial proposal, progress report, final presentation, paper
    • One or two problem sets
  • 3. Administrative Info
    • Meetings: TR 2-3:15pm, 1302 SC
    • Credit: 4 graduate hours
    • Auditors welcome for discussions
      • Can come to any lecture; lectures are mostly self-contained
    • Prerequisites: some software engineering, programming languages, and logic
  • 4. Evaluation
    • Grading
      • Project [50%]
      • Presentation [20%]
      • Participation (reports and discussion) [20%]
      • Problem set(s) [10%]
    • Distribution
      • Grades will be A- centered
      • No guarantee of an A (or even a passing grade)!
    • Project is the most important
  • 5. Project
    • Proposal (due in about a month)
    • Progress report (around midterm time)
    • Presentation (around last week of classes)
    • Paper (by the grade submission deadline)
    • Six students who took CS598DM published five papers based on their projects
      • I’m happy to help, no co-authorship required
  • 6. Course Communication
    • Web site: http://agora.cs.uiuc.edu/display/cs527
    • Mailing list: <course-code>@mail.cs.uiuc.edu
  • 7. Personnel
    • TA: Brett Daniel
      • Office: 3217 SC, hours: by appointment
      • Email: b<last-name>3@cs.uiuc.edu
    • Instructor: Darko Marinov
      • Office: SC 3116, hours: by appointment
      • Phone number: 217-265-6117
      • Email: <last-name>@cs.uiuc.edu
  • 8. Signup Sheet
    • Name
    • Netid (please write it legibly)
    • Program/Year
    • Group
    • Looking for advisor/project/thesis topic?
    • Interests
    • Also sign up on Wiki
  • 9. This Lecture: Overview
    • Why look for bugs?
    • What are bugs?
    • Where do they come from?
    • How to detect them?
  • 10. Some Example “Bugs”
    • NASA Mars space missions
      • priority inversion (2004)
      • different metric systems (1999)
    • BMW airbag problems (1999)
      • recall of >15,000 cars
    • Ariane 5 crash (1996)
      • uncaught exception caused by numerical overflow
    • Your own favorite example?
  • 11. Economic Impact
    • “The Economic Impact of Inadequate Infrastructure for Software Testing” NIST Report, May 2002
    • $59.5B annual cost of inadequate software testing infrastructure
    • $22.2B annual potential cost reduction from feasible infrastructure improvements
  • 12. Estimates
    • Extrapolated from two studies (5% of total)
      • Manufacturing: transportation equipment
      • Services: financial institutions
    • A number of simplifying assumptions
    • “…should be considered approximations”
    • What is important for you?
      • Correctness, performance, functionality
  • 13. Terminology
    • Anomaly
    • Bug
    • Crash
    • Defect
    • Error
    • Failure, fault
    • G…
  • 14. Dynamic vs. Static
    • Incorrect (observed) behavior
      • Failure, fault
    • Incorrect (unobserved) state
      • Error, latent error
    • Incorrect lines of code
      • Fault, error
  • 15. “Bugs” in IEEE 610.12-1990
    • Fault
      • Incorrect lines of code
    • Error
      • Faults cause incorrect (unobserved) state
    • Failure
      • Errors cause incorrect (observed) behavior
    • Not used consistently in the literature!
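    To make the slide's distinction concrete, here is a minimal, hypothetical Java sketch (not from the lecture): the fault is the incorrect line of code, the error is the incorrect internal state it produces, and the failure is the incorrect behavior an observer can see.

      // Minimal illustration of fault vs. error vs. failure (hypothetical example).
      public class Average {

          // Intended to return the average of a and b.
          static int average(int a, int b) {
              int sum = a + a;       // FAULT: incorrect line of code (should be a + b)
              return sum / 2;        // ERROR: 'sum' holds an incorrect, so far unobserved, value
          }

          public static void main(String[] args) {
              // FAILURE: the incorrect state becomes observable behavior.
              System.out.println(average(2, 4));  // prints 2 instead of the expected 3
          }
      }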
  • 16. Correctness
    • Common (partial) properties
      • Segfaults, uncaught exceptions
      • Resource leaks (example after this list)
      • Data races, deadlocks
      • Statistics based
    • Specific properties
      • Requirements
      • Specification
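    As a hypothetical illustration of one of the common properties above, the following Java sketch shows a resource leak and a fixed version; checking such properties needs no program-specific specification, which is what makes them "common".

      import java.io.BufferedReader;
      import java.io.FileReader;
      import java.io.IOException;

      // Hypothetical illustration of a resource leak (a "common" partial property).
      public class FirstLine {

          // Leaky version: if readLine() throws, the reader is never closed.
          static String firstLineLeaky(String path) throws IOException {
              BufferedReader reader = new BufferedReader(new FileReader(path));
              String line = reader.readLine();
              reader.close();                    // skipped if an exception is thrown above
              return line;
          }

          // Fixed version: try-with-resources closes the reader on every path.
          static String firstLineSafe(String path) throws IOException {
              try (BufferedReader reader = new BufferedReader(new FileReader(path))) {
                  return reader.readLine();
              }
          }
      }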
  • 17. Traditional Waterfall Model (diagram): Requirements (Analysis) → Design (Checking) → Implementation (Unit Testing) → Integration (System Testing) → Maintenance (Verification)
  • 18. Phases (1)
    • Requirements
      • Specify what the software should do
      • Analysis: eliminate/reduce ambiguities, inconsistencies, and incompleteness
    • Design
      • Specify how the software should work
      • Split software into modules, write specifications
      • Checking: check conformance to requirements
  • 19. Phases (2)
    • Implementation
      • Specify how the modules work
      • Unit testing: test each module in isolation
    • Integration
      • Specify how the modules interact
      • System testing: test module interactions
    • Maintenance
      • Evolve software as requirements change
      • Verification: test changes, regression testing
  • 20. Testing Effort
    • Reported to be >50% of development cost [e.g., Beizer 1990]
    • Microsoft: 75% time spent testing
      • 50% are testers who spend all their time testing
      • 50% are developers who spend half their time testing
  • 21. When to Test
    • The later a bug is found, the higher the cost
      • Cost increases by orders of magnitude in later phases
      • Also, the later the bug, the smaller the chance of a proper fix
    • Old saying: test often, test early
    • New methodology: test-driven development (write tests before code)
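    The following is a minimal sketch of test-driven development, assuming JUnit 4 on the classpath; the class and method names are made up for illustration. The test is written first and fails; the production code is then written only to make it pass.

      import static org.junit.Assert.assertEquals;
      import org.junit.Test;

      // Step 1: write the test before any production code exists.
      public class RomanTest {
          @Test
          public void convertsSmallNumbers() {
              assertEquals("I",  Roman.fromInt(1));
              assertEquals("IV", Roman.fromInt(4));
              assertEquals("IX", Roman.fromInt(9));
          }
      }

      // Step 2: write just enough production code to make the test pass.
      class Roman {
          static String fromInt(int n) {
              String[] symbols = {"IX", "IV", "I"};
              int[]    values  = { 9,    4,    1 };
              StringBuilder out = new StringBuilder();
              for (int i = 0; i < values.length; i++)
                  while (n >= values[i]) { out.append(symbols[i]); n -= values[i]; }
              return out.toString();
          }
      }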
  • 22. Software is Complex
    • Malleable
    • Intangible
    • Abstract
    • Solves complex problems
    • Interacts with other software and hardware
    • Not continuous
  • 23. Software Still Buggy
    • Folklore: 1-10 (residual) bugs per 1000 non-blank, non-comment (NBNC) lines of code (after testing)
    • Consensus: total correctness impossible to achieve for complex software
      • Risk-driven finding/elimination of bugs
      • Focus on specific correctness properties
  • 24. Approaches for Detecting Bugs
    • Software testing
    • Model checking
    • (Static) program analysis
  • 25. Software Testing
    • Dynamic approach
    • Run code for some inputs, check outputs
    • Checks correctness for some executions
    • Main questions
      • Test-input generation
      • Test-suite adequacy
      • Test oracles
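    A tiny, hypothetical harness shows how the three questions show up even in a toy setting: inputs are generated (here randomly; systematic generation is a course topic), adequacy is reduced to a fixed test budget, and the oracle is a partial property of the output rather than a full specification.

      import java.util.Arrays;
      import java.util.Random;

      public class SortCheck {

          // Test-input generation: small random arrays.
          static int[] randomInput(Random rnd) {
              int[] a = new int[rnd.nextInt(8)];
              for (int i = 0; i < a.length; i++) a[i] = rnd.nextInt(100) - 50;
              return a;
          }

          // Test oracle: output must be sorted and a permutation of the input.
          static boolean oracle(int[] input, int[] output) {
              for (int i = 1; i < output.length; i++)
                  if (output[i - 1] > output[i]) return false;
              int[] in = input.clone(), out = output.clone();
              Arrays.sort(in);
              Arrays.sort(out);
              return Arrays.equals(in, out);
          }

          public static void main(String[] args) {
              Random rnd = new Random(42);
              for (int i = 0; i < 1000; i++) {            // "adequacy" as a fixed budget
                  int[] input = randomInput(rnd);
                  int[] output = input.clone();
                  Arrays.sort(output);                     // code under test
                  if (!oracle(input, output))
                      System.out.println("Failure on " + Arrays.toString(input));
              }
          }
      }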
  • 26. Other Testing Questions
    • Selection
    • Minimization
    • Prioritization
    • Augmentation
    • Evaluation
    • Fault Characterization
  • 27. Model Checking
    • Typically hybrid dynamic/static approach
    • Checks correctness for all executions
    • Some techniques
      • Explicit-state model checking (sketch after this list)
      • Symbolic model checking
      • Abstraction-based model checking
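    To show what "checks correctness for all executions" means in the explicit-state case, here is a hypothetical, minimal sketch (not any particular tool): it enumerates every interleaving of two processes that each run "t = x; x = t + 1" and checks in every reachable state that both processes cannot finish with x != 2, finding the classic lost-update bug.

      import java.util.ArrayDeque;
      import java.util.Arrays;
      import java.util.HashSet;
      import java.util.Set;

      public class TinyModelChecker {

          // State encoded as {pc1, pc2, t1, t2, x} so it can be stored in a set.
          static int[] step(int[] s, int proc) {
              int[] n = s.clone();
              if (s[proc] == 0)      { n[2 + proc] = s[4];     n[proc] = 1; }  // t = x
              else if (s[proc] == 1) { n[4] = s[2 + proc] + 1; n[proc] = 2; }  // x = t + 1
              return n;
          }

          public static void main(String[] args) {
              Set<String> visited = new HashSet<>();
              ArrayDeque<int[]> frontier = new ArrayDeque<>();
              frontier.add(new int[] {0, 0, 0, 0, 0});                 // initial state
              while (!frontier.isEmpty()) {
                  int[] s = frontier.poll();
                  if (!visited.add(Arrays.toString(s))) continue;      // already explored
                  if (s[0] == 2 && s[1] == 2 && s[4] != 2)             // property check
                      System.out.println("Violation (lost update): x = " + s[4]);
                  for (int proc = 0; proc < 2; proc++)
                      if (s[proc] < 2) frontier.add(step(s, proc));
              }
              System.out.println("Explored " + visited.size() + " states.");
          }
      }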
  • 28. Static Analysis
    • Static approach
    • Checks correctness for all executions
    • Some techniques
      • Abstract interpretation
      • Dataflow analysis
      • Verification-condition generation
  • 29. Comparison
    • Level of automation
      • Push-button vs. manual
    • Type of bugs found
      • Hard vs. easy to reproduce
      • High vs. low probability
      • Common vs. specific properties
    • Type of bugs (not) found
  • 30. Soundness and Completeness
    • Do we detect all bugs?
      • Impossible for dynamic analysis
    • Are reported bugs real bugs?
      • Easy for dynamic analysis
    • Most practical techniques and tools are both unsound and incomplete!
      • False positives
      • False negatives
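    Two hypothetical snippets illustrate the two kinds of mistakes. A simple, path-insensitive static null checker may report the first method (a false positive), because it does not relate the two identical conditions; a finite test suite can easily miss the single failing input of the second method (a false negative for testing).

      public class FalseAlarms {

          // Possible FALSE POSITIVE: the dereference of 't' is actually guarded,
          // but a path-insensitive checker may still warn that 't' can be null.
          static int lengthOrZero(boolean present) {
              String t = null;
              if (present) t = "hello";
              int n = 0;
              if (present) n = t.length();   // only reached when t was assigned
              return n;
          }

          // Possible FALSE NEGATIVE for testing: the failure (division by zero)
          // occurs for exactly one input value, which random tests rarely hit.
          static int fragile(int x) {
              if (x == 123456) return 10 / (x - x);
              return x;
          }
      }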
  • 31. Analysis for Performance
    • Static compiler analysis, profiling
    • Must be sound
      • Correctness of transformation: equivalence
    • Improves execution time
    • Programmer time is more important
    • Programmer productivity
      • Not only finding bugs
  • 32. Combining Dynamic and Static
    • Dynamic and static analyses are equal in the limit
      • Dynamic: try exhaustively all possible inputs
      • Static: model precisely every possible state
    • Synergistic opportunities
      • Static analysis can optimize dynamic analysis
      • Dynamic analysis can focus static analysis
      • More discussions than results
  • 33. Current Status
    • Testing remains the most widely used approach for finding bugs
    • A lot of recent progress (within last decade) on model checking and static analysis
      • Model checking: from hardware to software
      • Static analysis: from sound to practical
    • Vibrant research in the area
    • Gap between research and practice
  • 34. “Schools” of Software Testing
    • Bret Pettichord described four schools
      • Analytic (a branch of CS/Mathematics)
      • Factory (a managed process)
      • Quality (a branch of quality assurance)
      • Context-Driven (a branch of development)
    • Brian Marick may give a guest lecture
    • We will focus on artifacts, not process
  • 35. Testing Topics to Cover
    • Test coverage and adequacy criteria
    • Test-input generation
    • Test oracles
    • Model-based testing
    • Testing GUIs
    • Testing software with structural inputs
    • Testing in your domain?
  • 36. Topics Related to Finding Bugs
    • How to eliminate bugs?
      • Debugging
    • How to prevent bugs?
      • Programming language design
      • Software development processes
    • How to show absence of bugs?
      • Theorem proving
      • Model checking, program analysis
  • 37. Next Lecture
    • Tuesday, August 28, at 2pm, 1302 SC
    • Assignment (on Wiki)
      • Read a paper on writing papers: “Writing Good Software Engineering Research Papers” by Mary Shaw (ICSE 2003)
        • If you already read that one, read another
      • Read a paper on testing: “Parallel test generation and execution with Korat” by S. Misailovic, A. Milicevic, N. Petrovic, S. Khurshid, and D. Marinov (FSE 2007)