
Contemporary software development environments

Published: 1 May 1982

Abstract

There is a wide variety of software development tools and methods currently available or that could be built using current research and technology. These tools and methods can be organized into four software development environments, ranging in complexity from a simple environment containing few automated tools or expensive methods to a complete one including many automated tools and built around a software engineering database. The environments were designed by considering the life-cycle products generated during two classes of software development projects. Relative cost figures for the environments are offered, and related issues, such as standardization, effectiveness, and impact, are then addressed.
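The most complete environment described in the abstract is built around a software engineering database that stores and links life-cycle products. As a rough illustration of that idea (a hypothetical sketch, not the article's design; all entity names and relation labels below are invented), such a database can be modelled as typed entities connected by traceability relations:

```python
# Hypothetical sketch of a "software engineering database": life-cycle
# products (requirements, design modules, code units) stored as typed
# entities, linked by named relations for traceability. Illustrative only.

class SEDatabase:
    def __init__(self):
        self.entities = {}   # entity name -> kind ("requirement", "design-module", ...)
        self.relations = []  # (source, relation, target) triples

    def add_entity(self, name, kind):
        self.entities[name] = kind

    def relate(self, source, relation, target):
        # Both endpoints must already be registered products.
        assert source in self.entities and target in self.entities
        self.relations.append((source, relation, target))

    def trace(self, name, relation):
        """Entities reachable from `name` via `relation` (one hop)."""
        return [t for (s, r, t) in self.relations if s == name and r == relation]

# Example: trace a requirement through design to code.
db = SEDatabase()
db.add_entity("R1: log all transactions", "requirement")
db.add_entity("audit_module", "design-module")
db.add_entity("audit.c", "code-unit")
db.relate("R1: log all transactions", "satisfied-by", "audit_module")
db.relate("audit_module", "implemented-by", "audit.c")

print(db.trace("R1: log all transactions", "satisfied-by"))  # ['audit_module']
```

A real system of this kind (e.g., the entity-relationship development database mentioned in reference 19) would add persistence, version control, and consistency analysis on top of this basic store.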

References

  1. Alford, M.W. A requirements engineering methodology for real-time processing requirements. IEEE Trans. Software Eng. SE-3, 1 (1977), 60-68. General description of the TRW SREM (Software Requirements Engineering Methodology) requirements analysis system for real-time systems.
  2. Autoflow II Users Guide. Applied Data Res., Princeton, N.J. User manual for an automated flowchart-construction tool for programs.
  3. Balzer, R., Goldman, N., and Wile, D. On the transformational implementation approach to programming. Proc. 2nd Internat. Conf. Software Eng., IEEE, Long Beach, Calif., 1976, pp. 337-344. Advantages of program transformations and their use in changing a simple inefficient program into a more complicated but efficient one.
  4. Balzer, R., et al. Domain-independent automatic programming. Proc. IFIP Congress 74, Vol. 2, American Elsevier, New York, 1974, pp. 326-330. Approach to interactive automatic programming in which a semantic model is used to interpret user responses during program construction.
  5. Boehm, B., McClean, R.K., and Urfrig, D.B. Some experience with automated aids to the design of large-scale reliable software. IEEE Trans. Software Eng. SE-1, 1 (1975), 125-133. Modular approach to design and use of design analysis tools for verification of module interfaces and other module properties.
  6. Brown, J. Getting better software cheaper and quicker. In Practical Strategies for Developing Large Software Systems, E. Horowitz, Ed., Addison-Wesley, Reading, Mass., 1975. Use of different kinds of software analysis tools, especially verification and validation tools.
  7. Budd, T.A., Lipton, R.J., and Sayward, F.G. The design of a prototype mutation system for program testing. Proc. Nat. Comp. Conf., AFIPS, Arlington, Va., 1978, pp. 623-628. Major features of a Fortran-oriented program mutation system and the basic concepts underlying mutation testing.
  8. Caine, S.H., and Gordon, K.E. PDL--A tool for software design. Proc. Nat. Comp. Conf., AFIPS, Arlington, Va., 1975, pp. 271-276. Programming-language control constructs and embedded prose are used to construct PDL (Program Design Language) program designs.
  9. COBOL-TDG-II. Inform. Management Inc., San Francisco, Calif. Tool for generating Cobol program test files.
  10. Compare Facility, Network Operating System, CYBER Series. Control Data Corp. Automated tool for comparing data streams for equality.
  11. DeMarco, T. Structured Analysis and System Specification. Yourdon, New York, 1978. Use of data flow diagrams, data dictionaries, structured English, and database schemas in requirements modelling.
  12. Ehrman, J.R. The new Tower of Babel. Datamation 26, 3 (1980), 156-163. Discusses language proliferation in application and command languages, with particular attention to the problematic lack of homogeneity among the different levels of languages that programmers must use.
  13. Estrin, G., and Campos, I.A. Concurrent software system design supported by SARA at the age of one. Proc. 3rd Internat. Conf. Software Eng., IEEE, Long Beach, Calif., 1978, pp. 230-242. Basic design primitives (modules, control nodes, data sets, data arcs, etc.) used to build modular designs of systems. Tools for storage and design analysis.
  14. Green, C. The design of a program synthesis system. Proc. 2nd Internat. Conf. Software Eng., IEEE, Long Beach, Calif., 1976, pp. 4-18. PSI program synthesis system in which a user can describe a problem using traces and natural language. PSI can construct simple symbolic computation programs in LISP.
  15. Hamilton, M., and Zeldin, S. Higher order software--A methodology for defining software. IEEE Trans. Software Eng. SE-2, 1 (1976), 9-32. A formal functional specification language for describing modular designs and module interfaces. Oriented toward the formal treatment of real-time designs.
  16. Howden, W.E. Functional testing and design abstractions. J. Syst. and Software 1 (1980), 307-313. Use of design information (Jackson diagrams and structure diagrams) in the construction of program test sets.
  17. Howden, W.E. Applicability of software validation techniques to scientific programs. ACM Trans. Programming Languages and Syst. 2, 3 (1980), 307-320. Analysis of the effectiveness of a collection of static and dynamic analysis methods for finding the errors in a late release of a commercially available package of scientific subroutines.
  18. Howden, W.E. Symbolic testing and the DISSECT symbolic evaluation system. IEEE Trans. Software Eng. SE-3, 4 (1977), 266-278. User-oriented symbolic evaluation system for Fortran programs.
  19. Irvine, C.A., and Brackett, J.W. A system for software engineering. In Infotech State-of-the-Art Report on Structured Analysis and Design, Infotech, Maidenhead, England, 1978. An approach to software engineering systems that uses a simple entity-relationship type of database for managing development products.
  20. Jackson, M. Principles of Program Design. Academic Press, London, 1975. Describes abstraction diagrams for program design and guidelines for transforming abstractions into (Cobol) programs.
  21. Knudsen, D.B., Berofsky, A., and Satz, L.R. A modification request control system. Proc. 2nd Internat. Conf. Software Eng., IEEE, Long Beach, Calif., 1976, pp. 187-192. Tool in the UNIX-based Programmer's Workbench. Tracks and reports project change requests and resulting activity. Supports a modification request database.
  22. Librarian User Reference Manual. Applied Data Res., Princeton, N.J. General-purpose tool for document storage and retrieval.
  23. Miller, E., and Howden, W.E. Software Testing and Validation Techniques. IEEE, Cat. No. EHO-138-8, Los Alamitos, Calif., 1978. Collection of reprints describing the theory of testing, testing and validation methods, and the results of effectiveness studies.
  24. Miller, E.F., and Melton, R.A. Automated generation of test case data sets. Proc. Internat. Conf. Reliable Software, IEEE, Long Beach, Calif., 1975, pp. 51-58. Comprehensive system for test monitoring and test data generation. Facilities for helping a programmer find data that will cause a section of code to be executed.
  25. Nassi, I., and Shneiderman, B. Flowchart techniques for structured programming. ACM SIGPLAN Notices 8 (1973). Novel low-level design technique for graphical description of the structure and statements in a class of well-structured programs.
  26. Osterweil, L.J., and Fosdick, L.D. DAVE--A validation error detection and documentation system for FORTRAN programs. Software Practice and Experience 6, 4 (1976), 473-486. Fortran static analysis system for detecting references to uninitialized variables and variable assignments that are never referenced.
  27. Panzl, D.J. Automatic software test drivers. Computer 11, 4 (1978), 44-50. System for executing programs over stored sets of data and for checking the validity of output results against stored descriptions of expected output.
  28. Patrick, R.L. A checklist for system design. Datamation 26, 1 (1980), 147-154. Guidelines for design construction and desirable properties of design tools and methods.
  29. Program Problem Evaluator (PPE) Users Guide. #U-D503, Boole & Babbage, Palo Alto, Calif. Performance monitoring tool for determining the most heavily used sections of code during program execution.
  30. Ramamoorthy, C.V., and Ho, S.-B.F. Testing large software with automated software evaluation systems. IEEE Trans. Software Eng. SE-1, 1 (1975), 46-58. Collection of analysis and validation tools, including a tool for monitoring statement use during program execution.
  31. Rochkind, M.J. The source code control system. IEEE Trans. Software Eng. SE-1, 4 (1975), 364-369. Facilities for storing, updating, and retrieving versions of modules. Tracking and control of source code modifications.
  32. Ross, D.T., and Schoman, K.E. Structured analysis for requirements definition. IEEE Trans. Software Eng. SE-3, 1 (1977), 6-15. Introduction to SADT requirements representation diagrams and their use in requirements analysis.
  33. SADT, the Softech Approach to System Development. Softech (The Software Technology Co.), Waltham, Mass., Jan. 1976. Description of a graphical requirements language and a requirements analysis methodology.
  34. Schultz, D.J. A case study in system integration using the build approach. Proc. ACM Nat. Conf., New York, 1979, pp. 143-151. An approach to system building in which system functions are integrated and demonstrated to the user in a series of successive system increments, or builds.
  35. Snowden, R.A., and Henderson, P. The TOPD system for computer-aided system development. In Infotech State-of-the-Art Report on Structured Analysis and Design, Infotech, Maidenhead, England, 1978. An automated design system based on the use of state transition diagrams and models.
  36. Stay, J.F. HIPO and integrated program design. IBM Syst. J. 15, 2 (1976), 143-154. Hierarchy plus input-process-output functional design diagrams for modelling programming projects as levels of systems, programs, and modules.
  37. Stevens, S.A., and Tripp, L.L. Requirements expression and verification aid. Proc. 3rd Internat. Conf. Software Eng., IEEE, Long Beach, Calif., 1978, pp. 101-108. Systematic Activity Modelling Method approach to systems modelling. Uses activity-data flow diagrams, input/output condition charts, and tree-structured component context descriptions.
  38. Stucki, L.G. New directions in automated tools for improving software quality. In Current Trends in Programming Methodology, Vol. 2, R.T. Yeh, Ed., Prentice-Hall, Englewood Cliffs, N.J., 1978. Review of branch-testing coverage measures and the basic features of a tool that supports both branch monitoring during program execution and dynamic assertions for monitoring general data relationships.
  39. Teichroew, D., and Hershey, E.A. PSL/PSA: A computer-aided technique for structured documentation and analysis of information processing systems. IEEE Trans. Software Eng. SE-3, 1 (1977), 41-48. Introduction to the PSL (Problem Statement Language) and PSA (Problem Statement Analyzer) system. A formal, machine-readable requirements language and tools for analyzing the requirements for well-formedness. PSL models are stored in a specialized database.
  40. Yourdon, E. Structured Walkthroughs. Yourdon, New York, 1977. Systematic guidelines for setting up and managing structured walkthroughs of requirements, designs, and programs.
  41. Yourdon, E., and Constantine, L. Structured Design. Prentice-Hall, Englewood Cliffs, N.J., 1979. Text on structured design. Guidelines for evaluating hierarchical modular designs.



    • Published in

      Communications of the ACM, Volume 25, Issue 5 (May 1982), 32 pages
      ISSN: 0001-0782
      EISSN: 1557-7317
      DOI: 10.1145/358506

      Copyright © 1982 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

      Publisher

      Association for Computing Machinery

      New York, NY, United States
