A case study on effectively identifying technical debt

We identified and organized a number of statements about technical debt (a "TD Folklore" list) expressed by practitioners on websites, in blogs, and in published papers. We chose 14 statements and evaluated them through two surveys (37 practitioners answered the questionnaires), ranking them by agreement and consensus. The statements most agreed with show that TD is an important factor in software project management and not simply another term for "bad code". This study will help the research community identify folklore that can be translated into research questions to be investigated, thus targeting attempts to provide a scientific basis for TD management.
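The abstract mentions ranking statements by agreement and consensus but does not spell out the metrics. One plausible reading (an assumption, not the paper's definition) is mean Likert score for agreement and low response spread for consensus. A minimal sketch with invented responses:

```python
from statistics import mean, stdev

# Hypothetical 5-point Likert responses (1 = strongly disagree, 5 = strongly
# agree) for three invented folklore statements; the study's actual statements,
# data, and metrics may differ.
responses = {
    "TD is a management problem, not just bad code": [5, 4, 5, 4, 5],
    "TD is just another term for bad code": [2, 1, 3, 2, 1],
    "All TD should be paid off immediately": [1, 5, 2, 4, 3],
}

def agreement(scores):
    # Higher mean score = stronger agreement with the statement.
    return mean(scores)

def consensus(scores):
    # Lower spread across respondents = stronger consensus; negate the
    # standard deviation so that "more consensus" sorts higher.
    return -stdev(scores)

# Rank primarily by agreement, breaking ties by consensus.
ranked = sorted(
    responses,
    key=lambda s: (agreement(responses[s]), consensus(responses[s])),
    reverse=True,
)
```

With these invented numbers, the "management problem" statement ranks first, mirroring the abstract's headline finding.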

Published in: Technology, Economy & Finance
  • Key points:
      - Different developers know about very different kinds of debt (only one debt item was identified by two developers).
      - In almost all types of TD, subjects found a mix of intentionally and unintentionally incurred debt.
      - Defect debt was most prominent (in this project).
      - One new type of TD, "Usability Debt", was identified; it was introduced by one of the subjects to describe the lack of a common user interface template.
      - Other findings (not visible in the slide): principal, interest, and interest probability were fairly randomly distributed across TD types.
  • Anecdotes:
      - Documentation debt: "The Module of Allocation doesn't have a requirements specification document."
      - Testing debt not related to source code: "The lack of test plans can bring problems after system deployment."
      - Design debt (the item that was identified by two subjects): two databases were used (for legacy reasons) and some data is stored redundantly in both.
      - Defect debt: "This functionality is almost all incomplete."
  • Key points:
      - Using a mix of tool approaches identified all files that were linked with defect debt and a part of the files linked with design debt.
      - The three selected tools (obviously) do not find TD that is not related to source code.
  • Anecdotes: same as the previous slide.
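The tool-versus-developer comparison described in the notes above boils down to set operations over reported locations. A minimal sketch, with all file names invented for illustration (the study's actual data differs):

```python
# TD locations reported by the developers via the template, and files flagged
# by the automated tools (static analysis, code smells, metrics). All names
# here are hypothetical.
developer_reported = {"Order.java", "Allocation.java", "TestPlan.doc", "Billing.java"}
tool_flagged = {"Order.java", "Billing.java", "LegacyDb.java"}

overlap = developer_reported & tool_flagged      # debt found by both humans and tools
human_only = developer_reported - tool_flagged   # e.g. debt not tied to source code
tool_only = tool_flagged - developer_reported    # tool findings developers did not report
```

This mirrors the study's observation that tools cover some developer-reported debt (the overlap) but miss debt that lives outside the source code (the human-only set).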

    1. A Case Study on Effectively Identifying Technical Debt
       Nico Zazworka, Rodrigo O. Spínola, Antonio Vetró, Forrest Shull, Carolyn Seaman
       EASE 2013, Porto de Galinhas
    2. Abstract
       • A study focused on the identification of Technical Debt
       • Compared human elicitation of Technical Debt with tool-assisted identification
       • Results revealed that there is little overlap in the Technical Debt reported by different members of a development team
       • Results also showed that tools can identify some of the same types of Technical Debt that human developers consider important, but not all
       • Contribution to the Technical Debt Landscape
    3. What is Technical Debt?
       • Imperfections in a software system…
       • …that were caused by lack of time…
       • …and that run the risk of causing higher future maintenance cost
       • Examples:
         - Classes that need refactoring (design debt)
         - Known defects that were never fixed (defect debt)
         - Requirements that were only partially implemented (defect debt)
         - Inadequacies in the test suite (testing debt)
       • Some debt is obvious and explicit; other types are hidden and need to be identified, or detected
       • Current research focuses on techniques for identifying and managing Technical Debt
       • This study focuses on identification
    4. Research Goal and Questions
       • Goal: understand the human elicitation of Technical Debt and compare it to automated Technical Debt identification
       • Research Questions:
         - Do the Technical Debt identification tools find issues that are similar or different from those reported by developers?
         - How much overlap is there between the Technical Debt items reported by different developers?
         - How hard is the Technical Debt item template to fill in?
    5. Study Procedure
       • Group of 5 developers working on a small (25 KLOC) database-driven web application for the sea transportation domain
       • We asked them: "If you were given a week to work on this application, and were told not to add any new features or fix any bugs, but only to address Technical Debt, what would you spend your time on?"
       • Resulted in 21 Technical Debt items, documented using a template
       • At the same time, we ran an automated static analysis tool, a code smell detector, and a metrics calculator on the current code base
       • Compared the results
    6. Technical Debt Template
       • ID: TD identification number
       • Responsible: person or role who should fix this TD item
       • Type: design, documentation, defect, testing, or other type of debt
       • Location: list of files/classes/methods or documents/pages involved
       • Description: describes the anomaly and possible impacts on future maintenance
       • Estimated principal: how much work is required to pay off this TD item, on a three-point scale (High/Medium/Low)
       • Estimated interest amount: how much extra work will need to be performed in the future if this TD item is not paid off now, on a three-point scale (High/Medium/Low)
       • Estimated interest probability: how likely it is that this item, if not paid off, will cause extra work to be necessary in the future, on a three-point scale (High/Medium/Low)
       • Intentional?: Yes/No/Don't Know
    7. Results I
    8. Results II
    9. Discussion
       • Tools would have facilitated the discovery of all the important defect debt and about half of the design debt, as reported by developers
       • Tools would not have found any of the other types of debt deemed important by developers
       • Developers reported spending a reasonable amount of time documenting the Technical Debt items, but found principal and interest hard to assess
       • Bottom line: no silver bullet; a variety of tools and humans is needed
         - to identify Technical Debt
         - to interpret Technical Debt
         - to decide what's important
    10. Conclusion
        • A study investigating and comparing different approaches to identifying Technical Debt:
          - Tools: ASA, code smells, and metrics
          - Human elicitation: developers reporting the most important instances
        • Tools can't replace humans, but may find things that humans miss
        • Aggregation, not consensus, is an appropriate approach to combining the Technical Debt items reported by different developers
        • Next step: a focus group with the developers to get their feedback on the results
    11. Thank you! Questions?
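The Technical Debt template on slide 6 maps naturally onto a record type. A minimal Python sketch, with field names paraphrased from the slide (the study's actual artifact format is not shown here, and the example values are invented, loosely echoing the documentation-debt anecdote):

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class TechnicalDebtItem:
    """One row of the study's Technical Debt template (field names paraphrased)."""
    td_id: str                           # TD identification number
    responsible: str                     # person or role who should fix this item
    td_type: str                         # "design", "documentation", "defect", "testing", or other
    location: List[str]                  # files/classes/methods or documents/pages involved
    description: str                     # the anomaly and its possible maintenance impact
    estimated_principal: str             # "High" / "Medium" / "Low"
    estimated_interest_amount: str       # "High" / "Medium" / "Low"
    estimated_interest_probability: str  # "High" / "Medium" / "Low"
    intentional: Optional[bool]          # Yes / No / Don't Know (None)

# Hypothetical item, loosely based on the documentation-debt anecdote above.
item = TechnicalDebtItem(
    td_id="TD-01",
    responsible="developer",
    td_type="documentation",
    location=["Module of Allocation"],
    description="No requirements specification document exists for this module.",
    estimated_principal="Medium",
    estimated_interest_amount="High",
    estimated_interest_probability="Medium",
    intentional=None,
)
```

A structured record like this is what makes the study's later steps possible: items from different developers can be aggregated, grouped by type, and matched against tool output by location.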
