Project Data Incorporating Qualitative Factors for Improved Software Defect Prediction

Norman Fenton, Martin Neil, William Marsh, Peter Hearty, Łukasz Radliński, Paul Krause

  • Joke about Savoy. Who I am – show my book. I’ll have something more to say about this book shortly. One of the best books on software metrics was written by Bob Grady of Hewlett Packard in 1987. Bob was responsible for what I believe was recognised as the first true company-wide metrics programme, and his book described the techniques and experiences associated with it at HP. A few years later I was at a meeting where Bob told an interesting story about that metrics programme. He said that one of the main objectives of the programme was to achieve process improvement by learning from metrics which process activities worked and which did not. To do this they looked at those projects that, in metrics terms, were considered most successful: the projects with especially low rates of customer-reported defects. The idea was to learn what processes characterised such successful projects. It turned out that what they learned from this was very different to what they had expected. I am not going to tell you what it was they learnt until the end of my presentation; by then you may have worked it out for yourselves.

    1. Project Data Incorporating Qualitative Factors for Improved Software Defect Prediction. Norman Fenton, Martin Neil, William Marsh, Peter Hearty, Łukasz Radliński and Paul Krause. PROMISE, 20 May 2007
    2. Overview
       • Background
       • The data
       • Results
       • Caveats
    3. Background
       • Predicting reliability
       • Statistical models
       • Causal models
    4. Causal model (Bayesian network). Node labels from the example network in the figure:
       • Probability of finding defect
       • Testing process effectiveness
       • Testing process quality
       • Testing effort
       • Testing staff experience
       • Quality of documented test cases
       • Testing process well-defined
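    The node labels above come from an illustrative Bayesian network fragment for the testing process. The sketch below shows how such a fragment could be expressed with the pgmpy library; the graph structure, the two-state variables and every probability value are illustrative assumptions, not the model described in the paper.

    # Sketch of a small causal (Bayesian network) fragment for testing quality.
    # Structure and numbers are illustrative only.
    from pgmpy.models import BayesianNetwork
    from pgmpy.factors.discrete import TabularCPD
    from pgmpy.inference import VariableElimination

    model = BayesianNetwork([
        ("staff_experience", "process_effectiveness"),
        ("testing_effort", "process_effectiveness"),
        ("process_effectiveness", "find_defect"),
    ])

    # Priors over the two root causes (state 0 = low, state 1 = high).
    cpd_exp = TabularCPD("staff_experience", 2, [[0.5], [0.5]])
    cpd_effort = TabularCPD("testing_effort", 2, [[0.5], [0.5]])

    # Effectiveness tends to be high only when experience and effort support it.
    cpd_eff = TabularCPD(
        "process_effectiveness", 2,
        values=[[0.9, 0.6, 0.7, 0.2],   # P(effectiveness = low | parents)
                [0.1, 0.4, 0.3, 0.8]],  # P(effectiveness = high | parents)
        evidence=["staff_experience", "testing_effort"],
        evidence_card=[2, 2],
    )

    # Probability of finding a defect given testing process effectiveness.
    cpd_find = TabularCPD(
        "find_defect", 2,
        values=[[0.8, 0.3],   # P(find = no | effectiveness)
                [0.2, 0.7]],  # P(find = yes | effectiveness)
        evidence=["process_effectiveness"],
        evidence_card=[2],
    )

    model.add_cpds(cpd_exp, cpd_effort, cpd_eff, cpd_find)
    assert model.check_model()

    # Example query: chance of finding a defect when staff experience is high
    # but testing effort is low.
    posterior = VariableElimination(model).query(
        ["find_defect"], evidence={"staff_experience": 1, "testing_effort": 0}
    )
    print(posterior)

    The point of the causal formulation, in contrast to a purely statistical fit on size alone, is that qualitative evidence such as staff experience can be entered at any node and propagated to update the defect-related quantities.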
    5. Background
       • AID
       • MODIST
    6. Schematic view of model. Component labels from the figure:
       • Existing code base
       • Defect insertion and recovery
       • Testing and rework
       • Design and development
       • Specification and documentation
       • Common influences
       • Scale of new required functionality
    7. How the model was developed
       • Qualitative and quantitative factors
       • Expert judgement
       • Published data
       • Company data
       • NOT the data presented here!
    8. Example question: “Relevant Experience of Spec & Doc Staff”
       • Very High: Over 3 years’ experience in requirements management, and extensive domain knowledge.
       • High: Over 3 years’ experience in requirements management, but limited domain knowledge.
       • Medium: 1-3 years’ experience in requirements management.
       • Low: 1-3 years’ experience, but no experience in requirements management.
       • Very Low: Less than one year’s experience, and no previous domain experience.
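    In the model, an answer to a question like this becomes the observed state of a discrete node. As a rough illustration (an assumption about the mechanics, not a description of the paper’s implementation, and with a hypothetical helper name), one common convention is to map the five ordinal labels onto equal intervals of a [0, 1] ranked scale:

    # Hypothetical encoding of a five-point qualitative answer as a ranked-scale value.
    RANKED_STATES = ["very_low", "low", "medium", "high", "very_high"]

    def encode_answer(label: str) -> float:
        """Return the midpoint of the label's interval on a [0, 1] ranked scale."""
        idx = RANKED_STATES.index(label.lower().replace(" ", "_"))
        width = 1.0 / len(RANKED_STATES)
        return (idx + 0.5) * width

    # Example: an assessor rates "Relevant Experience of Spec & Doc Staff" as High.
    print(encode_answer("High"))   # 0.7, the midpoint of the fourth of five equal intervals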
    9. How projects were selected
       • Reliable data
       • Satisfactory end
       • Key people available
       • Breadth
       • Depth
    10. Defects vs size
    11. Actual versus predicted defects
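    An actual-versus-predicted comparison like the one on this slide can be summarised with a standard accuracy measure. The sketch below uses the mean magnitude of relative error (MMRE) on hypothetical numbers; it is not the measure or the data reported in the paper, only an illustration of how such a comparison is typically quantified.

    # MMRE over hypothetical actual and predicted defect counts (illustrative only).
    actual    = [120, 45, 300, 80]
    predicted = [100, 50, 270, 95]

    mmre = sum(abs(a - p) / a for a, p in zip(actual, predicted)) / len(actual)
    print(f"MMRE = {mmre:.2f}")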
    12. Caveats
       • Biased priors
       • Structural aspects biased
       • Data accuracy
       • Projects overly ‘uniform’
    13. Conclusions
       • No ‘data fitting’
       • Dataset provided a validation
       • Good predictions with only a few of the inputs
       • Causal model provides genuine support for risk management
