CS527: Advanced Topics in Software Engineering



  1. CS527: Advanced Topics in Software Engineering (Software Testing and Analysis)
     Darko Marinov, August 23, 2007
  2. Course Overview
     - Graduate seminar on program analysis (for bug finding); emphasis: systematic testing
     - Focus on a (research) project
       - Papers: reading in advance, writing reports, presenting, and discussing
       - Project: initial proposal, progress report, final presentation, paper
     - One or two problem sets
  3. Administrative Info
     - Meetings: TR 2-3:15pm, 1302 SC
     - Credit: 4 graduate hours
     - Auditors welcome for discussions
       - Can come to any lecture; lectures are mostly self-contained
     - Prerequisites: some software engineering, programming languages, and logic
  4. Evaluation
     - Grading
       - Project [50%]
       - Presentation [20%]
       - Participation (reports and discussion) [20%]
       - Problem set(s) [10%]
     - Distribution
       - Grades will be A- centered
       - No guarantee of an A (or even a passing grade)!
     - The project is the most important component
  5. Project
     - Proposal (due in about a month)
     - Progress report (around midterm time)
     - Presentation (around the last week of classes)
     - Paper (by the grade submission deadline)
     - Six students who took CS598DM published five papers based on their projects
       - I'm happy to help; no co-authorship required
  6. Course Communication
     - Web site: http://agora.cs.uiuc.edu/display/cs527
     - Mailing list: <course-code>@mail.cs.uiuc.edu
  7. Personnel
     - TA: Brett Daniel
       - Office: 3217 SC; hours: by appointment
       - Email: b<last-name>3@cs.uiuc.edu
     - Instructor: Darko Marinov
       - Office: 3116 SC; hours: by appointment
       - Phone: 217-265-6117
       - Email: <last-name>@cs.uiuc.edu
  8. Signup Sheet
     - Name
     - NetID (please write it legibly)
     - Program/year
     - Group
     - Looking for an advisor/project/thesis topic?
     - Interests
     - Also sign up on the Wiki
  9. This Lecture: Overview
     - Why look for bugs?
     - What are bugs?
     - Where do they come from?
     - How can we detect them?
  10. Some Example "Bugs"
     - NASA Mars space missions
       - Priority inversion (2004)
       - Mix-up between metric and imperial units (1999)
     - BMW airbag problems (1999)
       - Recall of >15,000 cars
     - Ariane 5 crash (1996)
       - Uncaught exception from a numerical overflow
     - Your own favorite example?
  11. Economic Impact
     - "The Economic Impact of Inadequate Infrastructure for Software Testing", NIST Report, May 2002
     - $59.5B: annual cost of inadequate software testing infrastructure
     - $22.2B: annual potential cost reduction from feasible infrastructure improvements
  12. Estimates
     - Extrapolated from two studies (covering about 5% of the total)
       - Manufacturing: transportation equipment
       - Services: financial institutions
     - Based on a number of simplifying assumptions
     - "...should be considered approximations"
     - What is important for you?
       - Correctness, performance, functionality
  13. Terminology
     - Anomaly
     - Bug
     - Crash
     - Defect
     - Error
     - Failure, fault
     - G...
     - ...
  14. Dynamic vs. Static
     - Incorrect (observed) behavior
       - Failure, fault
     - Incorrect (unobserved) state
       - Error, latent error
     - Incorrect lines of code
       - Fault, error
  15. 15. “Bugs” in IEEE 610.12-1990 <ul><li>Fault </li></ul><ul><ul><li>Incorrect lines of code </li></ul></ul><ul><li>Error </li></ul><ul><ul><li>Faults cause incorrect (unobserved) state </li></ul></ul><ul><li>Failure </li></ul><ul><ul><li>Errors cause incorrect (observed) behavior </li></ul></ul><ul><li>Not used consistently in literature! </li></ul>
  16. Correctness
     - Common (partial) properties
       - Segfaults, uncaught exceptions
       - Resource leaks
       - Data races, deadlocks
       - Statistics-based properties
     - Specific properties
       - Requirements
       - Specification
  17. Traditional Waterfall Model
     (diagram: each phase paired with its checking activity)
     - Requirements -> Analysis
     - Design -> Checking
     - Implementation -> Unit Testing
     - Integration -> System Testing
     - Maintenance -> Verification
  18. Phases (1)
     - Requirements
       - Specify what the software should do
       - Analysis: eliminate/reduce ambiguities, inconsistencies, and incompleteness
     - Design
       - Specify how the software should work
       - Split software into modules, write specifications
       - Checking: check conformance to the requirements
  19. Phases (2)
     - Implementation
       - Specify how the modules work
       - Unit testing: test each module in isolation
     - Integration
       - Specify how the modules interact
       - System testing: test module interactions
     - Maintenance
       - Evolve the software as requirements change
       - Verification: test changes, regression testing
  20. Testing Effort
     - Reported to be >50% of development cost [e.g., Beizer 1990]
     - Microsoft: 75% of time spent on testing
       - 50% are testers, who spend all of their time testing
       - 50% are developers, who spend half of their time testing
  21. When to Test
     - The later a bug is found, the higher the cost
       - Orders-of-magnitude increase in later phases
       - Also a smaller chance of a proper fix
     - Old saying: test often, test early
     - Newer methodology: test-driven development (write tests before code)
  22. Software is Complex
     - Malleable
     - Intangible
     - Abstract
     - Solves complex problems
     - Interacts with other software and hardware
     - Not continuous (a small change in input can cause a large change in behavior)
  23. Software Still Buggy
     - Folklore: 1-10 (residual) bugs per 1000 non-blank, non-comment lines of code, after testing
     - Consensus: total correctness is impossible to achieve for complex software
       - Risk-driven finding/elimination of bugs
       - Focus on specific correctness properties
  24. Approaches for Detecting Bugs
     - Software testing
     - Model checking
     - (Static) program analysis
  25. Software Testing
     - Dynamic approach
     - Run the code on some inputs, check the outputs
     - Checks correctness for some executions
     - Main questions
       - Test-input generation
       - Test-suite adequacy
       - Test oracles
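The three questions can be illustrated together in one small sketch (the sorting routine is a hypothetical system under test): inputs are generated randomly, adequacy is crudely approximated by the number of inputs tried, and the oracle states what any correct output must satisfy.

```python
import random

def insertion_sort(xs):
    # Hypothetical system under test.
    out = []
    for x in xs:
        i = 0
        while i < len(out) and out[i] < x:
            i += 1
        out.insert(i, x)
    return out

def oracle(inp, out):
    # Test oracle: the output must be the sorted permutation of the input.
    return out == sorted(inp)

# Test-input generation: here, simply random lists. A test-suite adequacy
# criterion (e.g. coverage) would decide when these inputs are "enough".
random.seed(0)
for _ in range(1000):
    inp = [random.randint(-50, 50) for _ in range(random.randint(0, 20))]
    assert oracle(inp, insertion_sort(inp)), f"failing input: {inp}"
```

Because only some executions are checked, passing all 1000 generated tests increases confidence but proves nothing about inputs that were never tried.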
  26. Other Testing Questions
     - Selection
     - Minimization
     - Prioritization
     - Augmentation
     - Evaluation
     - Fault characterization
     - ...
  27. Model Checking
     - Typically a hybrid dynamic/static approach
     - Checks correctness for all executions
     - Some techniques
       - Explicit-state model checking
       - Symbolic model checking
       - Abstraction-based model checking
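Explicit-state model checking, the first technique above, can be sketched in miniature (the toy protocol and state encoding are illustrative assumptions): enumerate every reachable state of two processes racing on a non-atomic "check, then set" lock, and check mutual exclusion in each state.

```python
from collections import deque

def swap(pcs, i, v):
    # Return pcs with process i's program counter set to v.
    new = list(pcs)
    new[i] = v
    return tuple(new)

def successors(state):
    # State: (lock, (pc0, pc1)). pc 0 = waiting; 1 = observed lock free,
    # about to set it; 2 = in the critical section.
    lock, pcs = state
    for i in (0, 1):
        if pcs[i] == 0 and lock == 0:   # observe the lock as free
            yield (lock, swap(pcs, i, 1))
        elif pcs[i] == 1:               # set the lock (non-atomically!)
            yield (1, swap(pcs, i, 2))
        elif pcs[i] == 2:               # leave and release the lock
            yield (0, swap(pcs, i, 0))

def mutual_exclusion(state):
    _, pcs = state
    return not (pcs[0] == 2 and pcs[1] == 2)

def model_check(init, succ, invariant):
    # Breadth-first search over the full reachable state space.
    seen, frontier = {init}, deque([init])
    while frontier:
        s = frontier.popleft()
        if not invariant(s):
            return s                    # counterexample state
        for t in succ(s):
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return None                         # invariant holds in every state

bad = model_check((0, (0, 0)), successors, mutual_exclusion)
print(bad)  # a reachable state with both processes in the critical section
```

Unlike testing, the search covers all interleavings, so it is guaranteed to find the race: both processes can observe the lock as free before either sets it.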
  28. Static Analysis
     - Static approach
     - Checks correctness for all executions
     - Some techniques
       - Abstract interpretation
       - Dataflow analysis
       - Verification-condition generation
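Abstract interpretation, the first technique above, can be sketched in miniature (the three-address mini-language and the sign domain are illustrative assumptions): instead of running the program on concrete numbers, run it once on abstract "signs", which soundly covers every concrete execution at the cost of precision.

```python
# Abstract domain: the sign of a value, with "top" meaning unknown.
def abs_add(a, b):
    # Abstract addition over signs.
    if a == "zero":
        return b
    if b == "zero":
        return a
    if a == b and a in ("neg", "pos"):
        return a
    return "top"          # e.g. pos + neg could be anything

def abs_mul(a, b):
    # Abstract multiplication over signs.
    if "zero" in (a, b):
        return "zero"     # 0 * anything = 0, even if the other sign is unknown
    if "top" in (a, b):
        return "top"
    return "pos" if a == b else "neg"

def analyze(program, env):
    # program: list of (target, op, x, y); env maps variables to signs.
    for target, op, x, y in program:
        f = abs_add if op == "+" else abs_mul
        env[target] = f(env[x], env[y])
    return env

# x stands for ANY positive input, y for ANY negative input:
env = analyze([("a", "*", "x", "x"),    # a = x*x  -> pos
               ("b", "*", "a", "y"),    # b = a*y  -> neg
               ("c", "+", "b", "y")],   # c = b+y  -> neg
              {"x": "pos", "y": "neg"})
print(env["c"])  # "neg": proven for every concrete x > 0, y < 0 at once
```

The imprecision is visible too: `pos + neg` yields `top`, so a property that actually holds may be unprovable in this domain, which is exactly how sound static analyses end up with false positives.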
  29. Comparison
     - Level of automation
       - Push-button vs. manual
     - Type of bugs found
       - Hard vs. easy to reproduce
       - High vs. low probability
       - Common vs. specific properties
     - Type of bugs (not) found
  30. Soundness and Completeness
     - Do we detect all bugs?
       - Impossible for dynamic analysis
     - Are reported bugs real bugs?
       - Easy for dynamic analysis
     - Most practical techniques and tools are both unsound and incomplete!
       - False positives (reported "bugs" that are not real bugs)
       - False negatives (real bugs that go unreported)
  31. Analysis for Performance
     - Static compiler analysis, profiling
     - Must be sound
       - Correctness of a transformation: equivalence
     - Improves execution time
     - But programmer time is more important
     - Programmer productivity
       - Not only finding bugs
  32. Combining Dynamic and Static
     - Dynamic and static analyses are equal in the limit
       - Dynamic: exhaustively try all possible inputs
       - Static: precisely model every possible state
     - Synergistic opportunities
       - Static analysis can optimize dynamic analysis
       - Dynamic analysis can focus static analysis
       - So far, more discussions than results
  33. Current Status
     - Testing remains the most widely used approach for finding bugs
     - A lot of recent progress (within the last decade) on model checking and static analysis
       - Model checking: from hardware to software
       - Static analysis: from sound to practical
     - Vibrant research in the area
     - Gap between research and practice
  34. 34. “Schools” of Software Testing <ul><li>Brett Pettichord described four schools </li></ul><ul><ul><li>Analytic (a branch of CS/Mathematics) </li></ul></ul><ul><ul><li>Factory (a managed process) </li></ul></ul><ul><ul><li>Quality (a branch of quality assurance) </li></ul></ul><ul><ul><li>Context-Driven (a branch of development) </li></ul></ul><ul><li>Brian Marick may give a guest lecture </li></ul><ul><li>We will focus on artifacts, not process </li></ul>
  35. Testing Topics to Cover
     - Test coverage and adequacy criteria
     - Test-input generation
     - Test oracles
     - Model-based testing
     - Testing GUIs
     - Testing software with structural inputs
     - Testing in your domain?
  36. Topics Related to Finding Bugs
     - How to eliminate bugs?
       - Debugging
     - How to prevent bugs?
       - Programming language design
       - Software development processes
     - How to show the absence of bugs?
       - Theorem proving
       - Model checking, program analysis
  37. Next Lecture
     - Tuesday, August 28, at 2pm, 1302 SC
     - Assignment (on the Wiki)
       - Read a paper on writing papers: "Writing Good Software Engineering Research Papers" by Mary Shaw (ICSE 2003)
         - If you have already read that one, read another
       - Read a paper on testing: "Parallel test generation and execution with Korat" by S. Misailovic, A. Milicevic, N. Petrovic, S. Khurshid, and D. Marinov (FSE 2007)
