DOI: 10.1145/2162049.2162069

Are automatically-detected code anomalies relevant to architectural modularity?: an exploratory analysis of evolving systems

Published: 25 March 2012

ABSTRACT

As software systems are maintained, their architectural modularity often degrades through architectural erosion and drift. More directly, however, the modularity of software implementations degrades through the introduction of code anomalies, informally known as code smells. A number of strategies have been developed to support the automatic identification of implementation anomalies when only the source code is available. However, it is still unknown how reliable these strategies are at revealing code anomalies related to erosion and drift processes. In this paper, we present an exploratory analysis that investigates to what extent automatically-detected code anomalies are related to problems that occur in an evolving system's architecture. We analyzed code anomaly occurrences in 38 versions of 5 applications using existing detection strategies. The outcome of our evaluation suggests that many of the code anomalies detected by the employed strategies were not related to architectural problems. Even worse, over 50% of the anomalies missed by the employed techniques (false negatives) were found to be correlated with architectural problems.
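The detection strategies the abstract refers to are typically metrics-based rules that flag a class or method when several measurements jointly cross chosen thresholds. As an illustration only, the sketch below shows a God Class-style rule in the spirit of Lanza and Marinescu's detection strategies; the ClassMetrics record, the metric abbreviations (WMC, TCC, ATFD), and the threshold values are illustrative assumptions, not the tooling evaluated in the paper.

    // Minimal sketch (not the paper's tooling): a metrics-based detection
    // strategy for the "God Class" smell, combining three class-level metrics
    // against thresholds. Metric values and thresholds here are illustrative.
    public final class GodClassDetector {

        /** Per-class measurements the rule consumes (names are assumptions). */
        public record ClassMetrics(String className,
                                   int weightedMethodCount,    // WMC
                                   double tightClassCohesion,  // TCC, in [0, 1]
                                   int accessToForeignData) {} // ATFD

        // Illustrative thresholds; real strategies calibrate these per system.
        private static final int WMC_VERY_HIGH = 47;
        private static final double TCC_ONE_THIRD = 1.0 / 3.0;
        private static final int ATFD_FEW = 5;

        /** Flags a class that is complex, poorly cohesive, and accesses much foreign data. */
        public static boolean isGodClass(ClassMetrics m) {
            return m.weightedMethodCount() >= WMC_VERY_HIGH
                    && m.tightClassCohesion() < TCC_ONE_THIRD
                    && m.accessToForeignData() > ATFD_FEW;
        }

        public static void main(String[] args) {
            ClassMetrics suspect = new ClassMetrics("OrderManager", 63, 0.12, 9);
            System.out.println(suspect.className() + " flagged: " + isGodClass(suspect));
        }
    }

In practice such thresholds are calibrated per system, which is one reason automatically flagged anomalies may or may not line up with the architectural problems the study examines.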

References

  1. Aldrich, J. ArchJava: Connecting Software Architecture to Implementation. In Proc of the 24th ICSE, pp. 187--197, 2002. Google ScholarGoogle ScholarDigital LibraryDigital Library
  2. Alikacem, E.H and Sahraoui, H. Generic metric extraction framework. In Proc. of the 16th IWSM/MetriKon, 2006, pp. 383--390.Google ScholarGoogle Scholar
  3. Bieman, J.M. and Kang, B.K. Cohesion and Reuse in an Object Oriented System. In Proc of the ISSR, pp 259--262, 1995. Google ScholarGoogle ScholarDigital LibraryDigital Library
  4. Clements, P et al. Documenting Software Architectures: Views and Beyond. Addison-Wesley, 2nd Edition, 2010 Google ScholarGoogle ScholarDigital LibraryDigital Library
  5. Code smells study: http://www.inf.puc-rio.br/~ibertran/aosd12.Google ScholarGoogle Scholar
  6. D'Ambros, M. et al. the Impact of Design Flaws on Software Defects. In Proc. of the 10th QSIC, pp. 23 - 31, 2010. Google ScholarGoogle ScholarDigital LibraryDigital Library
  7. Dhambri et al. Visual Detection of Design Anomalies. In Proc. of the 12th CSMR, pp. 279--283, 2008. Google ScholarGoogle ScholarDigital LibraryDigital Library
  8. Eichberg, M. et al. Defining and Continuous Checking of Structural Program Dependencies. In Proc. of the 30th ICSE, 2008. Google ScholarGoogle ScholarDigital LibraryDigital Library
  9. Emden, E. and Moonen, L. Java quality assurance by detecting code smells. In Proceedings of the 9th ICRE, 2002.Google ScholarGoogle ScholarCross RefCross Ref
  10. FEAT tool, http://www.cs.mcgill.ca/~swevo/feat/Google ScholarGoogle Scholar
  11. Ferrari, F. et al. An exploratory study of error-proneness in evolving Aspect-Oriented Programs. In: Proc. of the 25th OOPSLA, USA, 2009.Google ScholarGoogle Scholar
  12. Figueiredo, E. et al. Evolving software product lines with aspects: An empirical study on design stability. In Proc of the 30th ICSE, 2008. Google ScholarGoogle ScholarDigital LibraryDigital Library
  13. Fowler, M. Refactoring: Improving the Design of Existing Code. Addison-Wesley, 1999. Google ScholarGoogle ScholarDigital LibraryDigital Library
  14. Garcia, J. et al. Identifying architectural bad smells. In Proc of the. 13th CSMR, pp 255--258, 2009. Google ScholarGoogle ScholarDigital LibraryDigital Library
  15. Greenwood, P. et al. On the impact of aspectual decompositions on design stability: An empirical study. In Proc. of the 21st ECOOP, 2007. Google ScholarGoogle ScholarDigital LibraryDigital Library
  16. Hochstein, L. and Lindvall, M. Combating architectural degeneration: A survey. Info. & Soft. Technology July, 2005. Google ScholarGoogle ScholarDigital LibraryDigital Library
  17. Hosmer, D. and Lemeshow, S. Applied Logistic Regression (2nd Edition). Wiley, 2000.Google ScholarGoogle ScholarCross RefCross Ref
  18. Khomh, K. et al. An exploratory study of the impact of code smells on software change-proneness. In Proc of the 16th WCRE, 2009. Google ScholarGoogle ScholarDigital LibraryDigital Library
  19. Kiczales, G.,et al. Aspect-oriented programming. In Proc. of the 11th ECOOP. LNCS, vol. 1241. Springer, Heidelberg. pp. 220--242, 1997.Google ScholarGoogle ScholarCross RefCross Ref
  20. Kitchenham, B. et al. Evaluating guidelines for empirical software engineering studies. ISESE pp 38--47, 2006 Google ScholarGoogle ScholarDigital LibraryDigital Library
  21. Lanza, M. and Marinescu, R. Object-Oriented Metrics in Practice. Springer, 2006. Google ScholarGoogle ScholarDigital LibraryDigital Library
  22. Lippert, M. and Roock, S. Refactoring in Large Software Projects: Performing Complex Restructurings Successfully. Wiley. 2006.Google ScholarGoogle Scholar
  23. Macia, I. et al. A. An Exploratory Study of Code Smells in Evolving Aspect-Oriented Systems. In Proc of the 10th AOSD, 2011. Google ScholarGoogle ScholarDigital LibraryDigital Library
  24. Malek, S. et al. Reconceptualizing a family of heterogeneous embedded systems via explicit architectural support. In Proc. of the 29th ICSE. 2007. Google ScholarGoogle ScholarDigital LibraryDigital Library
  25. Mantyla, M.V. and Lassenius, C. Subjective evaluation of software evolvability using code smells: An empirical study. Empirical Software Enggineering, vol. 11, no. 3, pp. 395--431, 2006. Google ScholarGoogle ScholarDigital LibraryDigital Library
  26. Mara, L. et al. Hist-Inspect: A Tool for History-Sensitive Detection of Code Smells. In Proc. of the 10th AOSD, 2011 Google ScholarGoogle ScholarDigital LibraryDigital Library
  27. Marinescu, R. Detection strategies: Metrics-based rules for detecting design flaws. In Proc. of the 20th ICSM, pp 350--359, 2004. Google ScholarGoogle ScholarDigital LibraryDigital Library
  28. Marinescu,R.; Ganea, G. and Veredi, I. inCode: Continuous Quality Assessment and Improvement. In Proc of the 14th CSMR, 2010. Google ScholarGoogle ScholarDigital LibraryDigital Library
  29. Martin, R. Agile Principles, Patterns, and Practices. Prentice Hall, 2002. Google ScholarGoogle ScholarDigital LibraryDigital Library
  30. McCabe, T.J. A Software Complexity Measure. IEEE Transactions on Software Engineering, 2 (4), pp 308--320, 1976. Google ScholarGoogle ScholarDigital LibraryDigital Library
  31. Meyer, B. Object-Oriented Software Construction. Prentice Hall Professional Technical 2nd edition, 2000. Google ScholarGoogle ScholarDigital LibraryDigital Library
  32. Moha, N. et al. DECOR: A Method for the Specification and Detection of Code and Design Smells. IEEE TSE, 2010. Google ScholarGoogle ScholarDigital LibraryDigital Library
  33. Munro, MJ. Product metrics for automatic identification of bad smell design problems in java source-code. In Proc of 11th METRICS, 2005 Google ScholarGoogle ScholarDigital LibraryDigital Library
  34. MuLATo tool, http://sourceforge.net/projects/mulato/ (3/08/2009)Google ScholarGoogle Scholar
  35. Murphy, G.C., et al.. Software Reflexion Models: Bridging the Gap between Design and Implementation. IEEE TSE, pp 364--380, 2001. Google ScholarGoogle ScholarDigital LibraryDigital Library
  36. Murphy-Hill, E. Scalable, expressive, and context-sensitive code smell display. In Proc of the 23rd OPSLA, 2008. Google ScholarGoogle ScholarDigital LibraryDigital Library
  37. Olbrich, S.M. et al. Are all code smells harmful? A study of God Classes and Brain Classes in the evolution of three open source systems. In Proc of the 26th ICSM pp 1--10, 2010. Google ScholarGoogle ScholarDigital LibraryDigital Library
  38. Olbrich, S.M. et al. The evolution and impact of code smells: A case study of two open source systems. In Proc of the 3rd ESEM, 2009. Google ScholarGoogle ScholarDigital LibraryDigital Library
  39. Perry, D.E. and Wolf, A.L. Foundations for the study of software architecture, ACM Software. Eng. Notes 17 (4) pp 40--52, 1992. Google ScholarGoogle ScholarDigital LibraryDigital Library
  40. Ratiu, D. et al. Using History Information to Improve Design Flaws Detection. In Proc of the 8th CSMR, 2004. Google ScholarGoogle ScholarDigital LibraryDigital Library
  41. Ratzinger, J. et al. Improving evolvability through refactoring. In Proc of the 5th IEEE MSR, 2005. Google ScholarGoogle ScholarDigital LibraryDigital Library
  42. Sant'anna, C. et al. On the modularity of software architectures: A Concern-Driven measurement framework. In Proc. of ECSA, 2007. Google ScholarGoogle ScholarDigital LibraryDigital Library
  43. Sonar: http://docs.codehaus.org/display/SONAR/Google ScholarGoogle Scholar
  44. Srivisut, K. and Muenchaisri, P. Bad-smell Metrics for Aspect-Oriented Software. In Proc of the 6th ICIS, 2007.Google ScholarGoogle ScholarCross RefCross Ref
  45. Together: http://www.borland.com/us/products/together/Google ScholarGoogle Scholar
  46. Tsantalis, N. and Chatzigeorgiou, A. Identification of move method refactoring opportunities. IEEE TSE, 35(3), pp 347--367, 2009. Google ScholarGoogle ScholarDigital LibraryDigital Library
  47. Understand: http://www.scitools.com/Google ScholarGoogle Scholar
  48. Wake, W.C. Refactoring Workbook. Boston, MA, USA: Addison-Wesley Longman Publishing Co., Inc., 2003. Google ScholarGoogle ScholarDigital LibraryDigital Library
  49. Wettel, R. and Lanza, M. Visually localizing design problems with disharmony maps. In Proc. of the 4th Softvis pp. 155--164, 2008. Google ScholarGoogle ScholarDigital LibraryDigital Library

      • Published in

        AOSD '12: Proceedings of the 11th annual international conference on Aspect-oriented Software Development
        March 2012
        286 pages
ISBN: 9781450310925
DOI: 10.1145/2162049

        Copyright © 2012 ACM

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

        Publisher

        Association for Computing Machinery

        New York, NY, United States

        Publication History

        • Published: 25 March 2012


        Qualifiers

        • research-article

        Acceptance Rates

AOSD '12 Paper Acceptance Rate: 20 of 79 submissions, 25%. Overall Acceptance Rate: 41 of 139 submissions, 29%.
