Introduction
Background
Technical debt
- Architecture debt: refers to problems encountered in the software architecture, for example, violations of modularity, which can affect architectural requirements such as performance and robustness. Normally, this type of debt cannot be paid off with simple interventions in the code, requiring more extensive development activities [6, 17].
- Build debt: refers to issues that make the build task harder and unnecessarily time consuming, such as when the build process needs to run ill-defined dependencies and becomes unnecessarily slow. It is also incurred when the build process involves code that does not add customer value [18].
- Code debt: refers to problems found in the source code that can negatively affect its legibility. Usually, this debt can be identified by examining the source code for issues related to bad coding practices [19].
- Defect debt: refers to known defects, usually identified by testing activities or by users and reported in bug tracking systems, that the Configuration Control Board (CCB) agrees should be fixed but, due to competing priorities and limited resources, have to be addressed at a later time [20].
- Documentation debt: refers to problems found in the project documentation, such as missing, inadequate, or incomplete documentation of any type [21].
- People debt: refers to people issues that, if present in the software organization, can delay or hinder some development activities. An example of this kind of debt is expertise concentrated in too few people, as an effect of delayed training or hiring [22].
- Requirement debt: refers to trade-offs made with respect to which requirements the development team needs to implement or how to implement them. Examples of this type of debt are partially implemented requirements or implementations that do not fully satisfy a non-functional requirement [6].
- Test debt: refers to issues found in testing activities that can affect the quality of those activities. Examples of this type of debt are planned tests that were not run, or known deficiencies in the test suite (e.g., low code coverage) [21].
Technical debt indicators
Software visualization
VisminerTD
RepositoryMiner
- Mining local software repositories (e.g., projects stored in Git) and remote ones (e.g., issues and milestones stored on github.com)
- Calculating 19 software metrics: AMW (Average Method Weight), ATFD (Access To Foreign Data), CYCLO (McCabe’s Cyclomatic Number), FDP (Foreign Data Provider), LAA (Locality Attribute Accesses), LOC (Lines of Code), LVAR (Number of Local Variables), MAXNESTING (Maximum Nesting Level), MLOC (Method Lines of Code), NOA (Number of Attributes), NOAM (Number of Accessor Methods), NOAV (Number of Accessed Variables), NOM (Number of Methods), NOPA (Number Of Public Attributes), NProtM (Number of Protected Members), PAR (Number of Parameters), TCC (Tight Class Cohesion), WMC (Weighted Method Count), and WOC (Weight Of a Class)
- Detecting seven types of code smells: brain class, brain method, conditional complexity, data class, feature envy, god class, and long method
- Detecting duplicated code using the Copy/Paste Detector (CPD) functionality of the PMD tool [48]
- Detecting possible defects in Java projects through static code analysis using the FindBugs tool [28]
- Detecting style problems in Java code using the CheckStyle tool [29]
- Detecting comments in Java code that indicate the existence of TD items, using the eXcomment tool [33]
- Detecting nine TD types: architecture, build, code, design, defect, documentation, requirements, people, and test
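To illustrate how metric-based code smell detection of the kind listed above typically works, the sketch below applies a classic god class detection strategy (in the style of Lanza and Marinescu) combining three of the 19 metrics: ATFD, WMC, and TCC. This is not RepositoryMiner's actual implementation, and the thresholds are illustrative assumptions; a real tool would make them configurable.

```python
from dataclasses import dataclass

@dataclass
class ClassMetrics:
    atfd: int    # Access To Foreign Data
    wmc: int     # Weighted Method Count
    tcc: float   # Tight Class Cohesion, in [0.0, 1.0]

# Illustrative thresholds, assumed here for the sketch.
ATFD_FEW = 5
WMC_VERY_HIGH = 47
TCC_ONE_THIRD = 1 / 3

def is_god_class(m: ClassMetrics) -> bool:
    # A god class accesses a lot of foreign data, is highly
    # complex, and has low cohesion -- all three at once.
    return (m.atfd > ATFD_FEW
            and m.wmc >= WMC_VERY_HIGH
            and m.tcc < TCC_ONE_THIRD)

print(is_god_class(ClassMetrics(atfd=12, wmc=60, tcc=0.1)))  # True
print(is_god_class(ClassMetrics(atfd=2, wmc=10, tcc=0.8)))   # False
```

A detection strategy like this is the reason the tool reports the underlying metric values alongside each smell: the same three numbers that triggered the rule explain the diagnosis.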
VisminerTD
VisminerTD views
TDAnalyzer View
TD form
TD Timeline
Metrics Graph
Code Smells
FindBugs
CheckStyle
CPD
eXcomment
TDEvolution View
TDManagement View
Feasibility study I
Study objective
Project context
Design
Results
Discussion
Study limitations
Feasibility study II
Study objective
Project context
Procedure and instrumentation
Task 1 (TDAnalyzer)

1. In version r4.12, identify the two debt items that contain both the “God Class” and “Duplicated Code” indicators.
2. For each item:
   (a) Report the metric values used to detect “God Class”.
   (b) Find the total amount of duplication.
   (c) On the TDForm tab, change the status of the debts found from “Not Analyzed” to “Doing” and click “Save”.

Task 2 (TDAnalyzer)

1. Indicate the number of different TD types present in the “Assert” item in the master version.
2. Using the Timeline tab for the “Assert” item, determine whether there has been any change in the number of indicators from version r4.12 to master. If so, name the indicators.
3. On the TDForm tab, still for the “Assert” item, change the status of the debts found from “Not Analyzed” to “Done”, then click the “Save” button.

Task 3 (TDAnalyzer)

1. In the master version, select the debt type “Defect Debt/Comment Analysis” and click “Update Query”.
2. Record the number of items found, then click the “Confirm All from Filter” button.

Task 4 (TDManagement)

1. From the TD items identified in the TDAnalyzer, select the master version and simulate the payment of the debt for ParametrizedTestTest (CODE_DEBT) and BaseTestRunner (DESIGN_DEBT), until both are in the DONE panel.
2. Find the total number of items across all three panels (TO DO, DOING, and DONE).

Task 5 (TDEvolution)

1. Use the slider to limit the view to versions r4.10 and r4.11.
2. Find the difference in the number of items having a Code Debt between versions r4.10 and r4.11.
3. Investigate whether there has been any abrupt change in the number of TD items between software versions, and in which versions this happened. If so, why do you think it happened?
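The status transitions exercised in the tasks above (items moving from “Not Analyzed” toward “Done” across the TO DO, DOING, and DONE panels) can be modeled with a minimal sketch. The class and method names below are hypothetical, not VisminerTD's actual API; the sketch only mirrors the Kanban-style workflow the tasks describe.

```python
# Hypothetical in-memory model of the three TDManagement panels.
PANELS = ("TO DO", "DOING", "DONE")

class TDBoard:
    def __init__(self):
        self.panels = {p: [] for p in PANELS}

    def add(self, item: str, panel: str = "TO DO") -> None:
        self.panels[panel].append(item)

    def move(self, item: str, dst: str) -> None:
        # Simulate "paying" a debt item by moving it between panels.
        for p in PANELS:
            if item in self.panels[p]:
                self.panels[p].remove(item)
        self.panels[dst].append(item)

    def total(self) -> int:
        # Total number of items across all three panels (cf. Task 4, step 2).
        return sum(len(items) for items in self.panels.values())

board = TDBoard()
board.add("ParametrizedTestTest (CODE_DEBT)")
board.add("BaseTestRunner (DESIGN_DEBT)")
board.move("ParametrizedTestTest (CODE_DEBT)", "DONE")
board.move("BaseTestRunner (DESIGN_DEBT)", "DONE")
print(len(board.panels["DONE"]), board.total())  # 2 2
```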
Results
Characterization of participants
Level of experience (1–5):

Knowledge area | 1 | 2 | 3 | 4 | 5
---|---|---|---|---|---
Project management | 7 | 13 | 3 | 1 | 4 |
Monitoring and correction of software defects | 8 | 8 | 2 | 3 | 7 |
Software maintenance | 8 | 9 | 1 | 3 | 7 |
Software architecture | 8 | 13 | 1 | 3 | 3 |
Software design | 6 | 11 | 4 | 3 | 4 |
Software documentation | 11 | 9 | 3 | 4 | 1 |
Requirement specification | 6 | 14 | 4 | 2 | 2 |
Implementation | 1 | 7 | 8 | 4 | 8 |
Software testing | 4 | 11 | 5 | 4 | 4 |
VisminerTD evaluation
# | Statement | Strongly agree | Agree | Neutral | Disagree | Strongly disagree
---|---|---|---|---|---|---
# | Usefulness | |||||
U1 | Using the proposed tool in my job, I would be able to identify and monitor TD items more quickly. (Quick) | 21 | 7 |
U2 | Using the proposed tool, I would improve my performance in identifying and monitoring TD items. (Job Performance) | 22 | 5 | 1 | ||
U3 | Using the proposed tool, I would increase my productivity. (Increase Productivity) | 15 | 11 | 2 | 1 | |
U4 | Using the proposed tool, I would improve my effectiveness in identifying and monitoring TD items. (Effectiveness) | 21 | 6 | 1 | ||
U5 | Using the proposed tool would make the identification and monitoring of TD items easier. (Makes job easier) | 22 | 5 | 1 |
U6 | I find the proposed tool useful for the management of TD items. (Useful) | 20 | 7 | 1 |
# | Ease of Use | |||||
E1 | Learning to operate the proposed tool would be easy for me. (Easy to learn) | 10 | 10 | 4 | 4 | |
E2 | My interaction with the proposed tool would be clear and understandable. (Clear and understandable) | 11 | 12 | 5 | ||
E3 | I would find it easy to use the proposed tool to do what I want it to do. (Controllable) | 14 | 9 | 5 | ||
E4 | It would be easy to become skillful in using the proposed tool. (Skillful) | 12 | 8 | 6 | 2 | |
E5 | It would be easy to remember how to perform TD identification and monitoring using the proposed tool. (Remember) | 18 | 8 | 2 | ||
E6 | I find the proposed tool easy to use. (Easy to use) | 13 | 11 | 3 | 1 |
# | Self-predicted future use | |||||
S1 | Assuming the proposed tool is available on my job, I predict that I will use it on a regular basis in the future. | 16 | 10 | 2 | ||
S2 | I would prefer using the proposed tool for identifying and monitoring TD items rather than not using it. | 12 | 6 | 8 | 1 | 1 |
Discussion
Threats to validity
Comparison to related works
Characteristics | VisminerTD | SonarQube | DebtFlag | CAST AIP |
---|---|---|---|---|
License | Free | Free | Not Found | Commercial |
Programming languages | Java | Java and 20 others | Java | Java and 50 others |
New parsers | Yes | Yes | No | No |
Repository mining | Yes | No | No | No |
Object-oriented metrics | Yes | Yes | No | Yes |
Style problems | Yes | Yes | No | Yes |
ASA issues | Yes | Yes | No | Yes |
Code Smells | Yes | Yes | No | No |
Custom TD thresholds | Yes | Yes | Yes | Yes |
Multiple versions analysis | Yes | No | No | No |
Source metrics analysis | Yes | Yes | No | No |
Code comments analysis | Yes | No | No | No |
Availability | Yes | Yes | No | Yes |
Standalone | Yes | Yes | No | Yes |