
About this Book

Cost analysis and estimating is a vital part of the operation of all organizations, both commercial and governmental. This volume comprises the proceedings of the 1992 conference of the Society for Cost Estimating and Analysis. Individual chapters are written by experts in their respective fields; consequently, the volume as a whole provides an invaluable and up-to-date survey of the field.



Learning and Production Rate


Learning Curve and Rate Adjustment Models: An Investigation of Bias

Learning curve models have gained widespread acceptance as a technique for analyzing and forecasting the cost of items produced from a repetitive process. Considerable research has investigated augmenting the traditional learning curve model with a production rate variable, creating a rate adjustment model. This study compares the forecasting bias of the learning curve and rate adjustment models. A simulation methodology is used to vary conditions along seven dimensions. The magnitude and direction of errors in estimating future cost are analyzed and compared under the various simulated conditions using ANOVA. Overall results indicate that the rate adjustment model is generally unbiased. If the cost item being forecast contains any element that is not subject to learning, then the traditional learning curve model is consistently biased toward underestimation of future cost. Conditions under which the bias is strongest are identified.
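The two model forms under comparison can be sketched as follows. The 80% learning slope and 90% rate slope below are illustrative assumptions, not parameters from the study.

```python
import math

# Traditional unit learning curve: y = a * x**b, where x is the cumulative
# unit number and a is the first-unit cost.
def learning_curve(x, a=100.0, b=math.log(0.8) / math.log(2)):
    """Cost of the x-th unit given first-unit cost a and an 80% slope."""
    return a * x ** b

# Rate adjustment model: augments the learning curve with a production
# rate term, y = a * x**b * r**c, for production rate r.
def rate_adjusted(x, r, a=100.0, b=math.log(0.8) / math.log(2),
                  c=math.log(0.9) / math.log(2)):
    """Unit cost with a 90% rate slope: doubling r cuts cost a further 10%."""
    return a * x ** b * r ** c

print(learning_curve(1))                 # 100.0 -- first-unit cost
print(round(learning_curve(2), 4))       # 80.0  -- the 80% learning slope
print(round(rate_adjusted(2, r=2), 4))   # 72.0  -- learning plus rate effect
```

Each doubling of cumulative quantity multiplies unit cost by the learning slope; the rate term applies the same logic to changes in production rate.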
O. Douglas Moses

The Production Rate Impact Model: IRATE

The IRATE model is an analytic procedure for measuring the direct impact of production rate upon a cost improvement curve (CIC) slope. The IRATE process involves a steepening of the assembly's CIC slope, which is experienced in parallel with or after the installation of all non-recurring items (e.g., new setups, tool investments, facilities, process changes) specifically added for the higher production rate program. The IRATE effect abstracts the progressive characteristic of recurring events, bringing together the horizontal and vertical “learning” synergisms of production lines.
Schuyler C. Lawrence

A Bang-Bang Approximation to the Solution of a Learning Augmented Planning Model

In this paper we study production programs where a relatively small number of complex units are produced to contractual order, and unit costs fall over time in some systematic way. We call this situation made-to-order production. Aircraft production is a good example. We derive the time path of planned resource use and demonstrate how to obtain approximate solutions for these optimal trajectories. The use of the approximation avoids a “messy” nonlinear optimization problem while providing resource estimates.
James R. Dorroh, Thomas R. Gulledge, Norman Keith Womer

Software Economics


The Software Development Effort Estimation Exponent: An Effort Estimation Model Accounting for Learning and Integration

The study of estimation of the effort required for software development has progressed to the point where most currently used models employ an equation of the form y = ax^e, which relates the effort to the lines of code, or size, of a program exponentially through a constant, e. That constant, e, the effort estimation exponent, has been referred to as an entropy constant by Randall Jensen and is the subject of many analyses, both theoretical and empirical, to estimate its value. In fact, many models indicate that its value is greater than one, meaning that the effort estimation curve bends upward, or that larger programs take proportionally more effort to develop. Yet some estimates of the exponent’s value are less than one, and some SEL experience indicates a value of 0.92 and a curve that bends downward. This implies that larger programs take proportionally less effort to develop. Furthermore, if an exponent greater than one is employed, then the effort estimated to develop a large program will be greater than the sum of the effort required to develop each of its submodules individually.
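The exponent's effect can be illustrated directly. The coefficient a = 3.0 and the exponent 1.2 below are illustrative assumptions; 0.92 is the SEL experience value mentioned in the abstract.

```python
# Effort model of the form y = a * S**e for program size S (e.g., KSLOC).
def effort(size, a=3.0, e=1.2):
    """Estimated development effort for a program of the given size."""
    return a * size ** e

# With e > 1, a 100-unit program costs more than ten 10-unit modules,
# reflecting extra integration effort:
print(effort(100) > 10 * effort(10))                     # True

# With e < 1 (the SEL value of 0.92) the inequality reverses, and larger
# programs take proportionally less effort:
print(effort(100, e=0.92) < 10 * effort(10, e=0.92))     # True
```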
Everett Ayers

Software Cost Estimating Models: A Calibration, Validation, and Comparison

This study was a calibration, validation and comparison of four software effort estimation models. The four models evaluated were REVIC, SASET, SEER, and COSTMODL. A historical database was obtained from Space Systems Division, in Los Angeles, and used as the input data. Two software environments were selected, one used to calibrate and validate the models, and the other to show the performance of the models outside their environment of calibration.
REVIC and COSTMODL are COCOMO derivatives and were calibrated using Dr. Boehm’s procedure. SASET and SEER were found to be uncalibratable for this effort. Accuracy of all the models was poor; none of the models performed as expected. REVIC and COSTMODL actually performed better against the comparison data than against the calibration data. SASET and SEER were very inconsistent across both environments.
Gerald L. Ourada, Daniel V. Ferens

Cost Estimating for Automated Information Systems

The establishment of the Major Automated Information System Review Council (MAISRC) has had, and will continue to have, significant impacts on the cost estimating community. The three major sections of this paper examine the MAISRC review process in general, with emphasis on the role of the cost estimator.
William Richardson

The Complete COCOMO Model: Basic, Intermediate, Detailed, and Incremental Versions for the Original, Enhanced, Ada, and Ada Process Models of COCOMO

Since its publication in 1981, the COCOMO model presented in Software Engineering Economics (SEE) by Barry W. Boehm has been at the forefront of software cost models. Since 1984, the Constructive COst MOdel (COCOMO) User’s Group (CUG) has served to maintain the needed information exchange and to be the vehicle for subsequent updates to the COCOMO model (by Dr. Boehm). Its public acceptance is a matter of record. The Software Engineering Institute (SEI) has served as the sponsor of the CUG since about 1987. The Department of Defense and cost organizations both have a strong interest in using the best methodologies available for software costing. COCOMO is the best documented such method and has a wide range of uses. The COCOMO model has promoted the purposes of software engineering since before 1981. It has not become dated, it has more than 20 automated implementations, and many people are still discovering this model.
Ronnie E. Cooper

Force Costing


Calculating the Cost Savings From Reducing Military Force Structure: How to Formulate the Cost Problem

World events, such as the Gulf War, the breakup of the Soviet Union, and changes in NATO, have led to dramatic changes in the defense environment and in the associated requirements for an adequate defense. These changes have motivated a far-reaching reconsideration of U.S. military force structure that will lead to a major restructuring of military forces within the next few years. Proposed changes include: a major reduction in force size (i.e., a reduction in the number of military units, such as infantry battalions, aircraft squadrons, or ships of various types), along with a corresponding reorganization of force-wide support organizations and resource management systems; a realignment of the remaining units on existing bases, closing those bases that are no longer required; a reorganization of units into smaller, more mobile and flexible entities; and a change in the mix of active and reserve units. Plans to reduce the defense budget rely heavily on such changes in force structure.
Michael G. Shanley

Systems Cost Analysis


Cost Analysis of Prototyping Major Weapons Systems

IDA performed a major study of acquisition for the Department of Defense, Effective Initiatives in Acquiring Major Systems. The study concerned the highly publicized cost and schedule overruns that have plagued defense programs. Since the 1960s, the Defense Department and defense contractors have pioneered reviews and management initiatives to improve program outcomes. IDA reviewed program outcomes to determine whether there is a trend toward better outcomes overall, whether any specific management initiatives have improved outcomes, and what improvements could be made.
Karen W. Tyson, J. Richard Nelson, D. Calvin Gogerty, Bruce R. Harmon, Alec Salerno

Cost Estimating Relations for Space-Based Battle Management, Command, Control, and Communications Systems

This report documents an effort to develop a database and cost estimating relation (CER) for Battle Management, Command, Control and Communications (BM/C3) architectures.
Data on twenty-eight BM/C3 communications systems were collected, and a common Cost Work Breakdown Structure (WBS) was devised to eliminate cost accounting disparities among these architectures and organize the data into a single database.
The data were subjected to statistical regression analysis in order to estimate a linear equation that relates the system integration costs to system technical and performance characteristics, such as weight, data flow speed, and number of nodes.
Much of the data collection and analysis to date has concentrated on a relatively homogeneous collection of space-based architectures, resulting in a CER that is particularly appropriate to such systems but not to the full gamut of BM/C3 systems required for comprehensive cost estimation. Consequently, a recommendation arising from this study is to obtain data on additional, more heterogeneous, BM/C3 architectures in order to expand the usefulness of the cost estimating tool under development.
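A one-driver CER fit of the kind described can be sketched with ordinary least squares. The (weight, cost) pairs below are invented for illustration; the study's actual CER regressed system integration cost on several technical and performance characteristics.

```python
# Minimal one-variable CER fit by ordinary least squares.
data = [(120, 3.1), (250, 5.9), (400, 9.2), (610, 13.8)]  # (kg, $M), invented

n = len(data)
sx = sum(w for w, _ in data)       # sum of weights
sy = sum(c for _, c in data)       # sum of costs
sxx = sum(w * w for w, _ in data)
sxy = sum(w * c for w, c in data)

# Closed-form OLS estimates for cost = a + b * weight
b = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope ($M per kg)
a = (sy - b * sx) / n                          # intercept ($M)
print(f"cost = {a:.3f} + {b:.5f} * weight")
```

The fitted line passes through the mean of the data by construction; with more drivers (data flow speed, number of nodes) the same idea extends to multiple regression.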
Daniel A. Nussbaum, Elliot Feldman, Mark McLaughlin, Everett Ayers, Donald Strope

A Review of Estimate at Completion Research

The cancellation of the Navy’s A-12 program has increased interest in forecasting the completed cost of a defense contract, termed “Estimate at Completion” (EAC). In addition, popular software packages and electronic spreadsheets allow users to quickly compute a range of EACs. Analysts and managers are left with the task of deciding which EAC or range of EACs is most accurate. Although there have been many studies that either compare existing EAC formulas and models, or propose new ones, few have been published in journals or magazines, and there is little guidance regarding which formula or model is most accurate. This paper reviews 25 studies which either propose or compare EAC formulas and models. Each study is briefly described. Tables which summarize research results are provided. Results show that no one formula or model is always best. Additional research with regression-based models is needed.
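Two of the most common index-based EAC formulas compared in this literature can be sketched as follows; the earned-value figures in the example are hypothetical.

```python
# Index-based EAC formulas from earned-value data, using the standard
# definitions: ACWP = actual cost of work performed, BCWP = budgeted cost
# of work performed (earned value), BCWS = budgeted cost of work
# scheduled, BAC = budget at completion.
def eac_cpi(acwp, bcwp, bac):
    """EAC assuming the cost performance index CPI = BCWP/ACWP persists."""
    cpi = bcwp / acwp
    return acwp + (bac - bcwp) / cpi

def eac_composite(acwp, bcwp, bcws, bac):
    """EAC using the composite index CPI * SPI, with SPI = BCWP/BCWS."""
    cpi = bcwp / acwp
    spi = bcwp / bcws
    return acwp + (bac - bcwp) / (cpi * spi)

# A hypothetical contract that is 40% complete, over cost, and behind
# schedule; the composite index yields the more pessimistic forecast:
print(round(eac_cpi(acwp=440, bcwp=400, bac=1000)))                  # 1100
print(round(eac_composite(acwp=440, bcwp=400, bcws=500, bac=1000)))  # 1265
```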
David S. Christensen, Richard C. Antolini, John W. McKinney

A CER Approach to Estimating Aircraft Integration Costs

In the wake of the reduced threat in Europe, President Bush has promised significant reductions in the size of our armed forces (and DoD budgets) while continuing the development of high technology avionics subsystems. As they have in the past, future budget constraints will inevitably mean a further decrease in the number of new aircraft acquisition programs. Future challenges for the cost estimating community will shift from estimating the cost of developing and producing new aircraft to integrating new technology into existing aircraft. This paper presents the results of four CER studies developed for estimating the cost of integrating new avionics subsystems into existing A-10 aircraft.
The first study developed CERs for the following three cost elements: (1) integration engineering; (2) Group A Kit recurring production; and (3) Group A Kit nonrecurring production. Each of these CERs is, in reality, a summation of eight different weight driven CERs. The study is documented in Section A.
Section B describes how installation costs, the subject of the second study, were estimated as a function of modification complexity, as defined by the ELSIE (ELectronic Subsystem Integration Estimator) Model. The CER was the result of regression analysis on previous attack and fighter aircraft case histories.
Kitproof and Trial Installation labor (the third study) were estimated as a function of Installation labor costs. This third CER study is presented in Section C.
The fourth and final study is discussed in Section D. It expressed all other integration cost elements as a percentage factor of the Group A and B kit costs. The factors were based on 10 previous A-10 modification case histories.
William Richardson

Cost and Production Analysis


Integrating Cost Analysis and Production System Analysis Using Rapid Modeling Technology

The ability to forecast product costs and total costs as a function of changes in production mix, volumes, processes and strategy is a fundamental objective of cost analysis in manufacturing. The need to understand the production implications, which in turn drive the cost implications of changes, follows directly. Using a Rapid Modeling Technology (RMT) approach to estimate the factory changes, we can easily generate the underlying production data needed for a complete and systemic cost analysis. The decision paradigm starts by building a baseline factory and cost model. The analysis proceeds by comparing this “complete model” with other potential “complete models”. Through the use of a few examples we demonstrate the analysis method and its generality. Examples include (i) overtime decisions, (ii) make versus buy decisions, and (iii) implications of component quality.
Gregory Diehl, Rajan Suri

Policy Formation for National Industries: The Use of Non-Linear Programming Allocation of Joint Costs

This paper is written with the Icelandic fishing industry in mind, namely the freezing plants. The subject at first is how to allocate joint cost to finished products. A joint cost is a cost incurred up to the split-off point (S.O.P.), the point in the manufacturing process beyond which the individual products are clearly identifiable. The motivation for allocating joint cost to the individual products in this business, as in others, is the demand of financial and tax reporting to trace all product-related costs to finished goods. It is also necessary because prices are derived and/or determined from the cost of the goods.
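A common baseline for the allocation problem can be sketched as follows: apportion the joint cost in proportion to each product's sales value at the split-off point. This is a simple illustration with invented product names and figures, not the nonlinear-programming formulation developed in the paper.

```python
# Sales-value-at-split-off allocation of a joint cost (invented figures).
joint_cost = 100_000.0
sales_value = {"fillets": 180_000.0, "blocks": 90_000.0, "meal": 30_000.0}

total = sum(sales_value.values())
allocated = {p: joint_cost * v / total for p, v in sales_value.items()}
for product, share in allocated.items():
    print(f"{product}: {share:,.0f}")   # shares sum to the joint cost
```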
Dennis E. Kroll, Kristjan B. Gardarsson

Conceptual Model of an Activity-Based Cost Management System

Technological advances in manufacturing industries have changed the processes involved in producing end-products, the elements within the processes, the end-products themselves, and, in turn, the management and control of these processes.
Cost and performance information that reflects the actual nature of operations in high-tech environments is vital for effective management. Such information would facilitate continuous process improvement, effective product costing, and proactive cost estimating.
This paper presents a model for a cost management system in such an environment. This system relates cost and performance information to the individual activity level and associates activities with the processes by which goods and services are designed, procured, produced, delivered and supported.
Denise F. Jackson, Thomas G. Greenwood

Cost Sensitivity Analysis


Sensitivity Analysis Using Discrete Simulation Models

Simulation models are commonly constructed to permit the study of the behavior of a system when it may not be feasible or practical to study the real system, e.g. in the early stages of design and development of a complex system. It may, for example, be desirable to evaluate the effect on system performance of variations in subsystem performance parameters, operating conditions, or design configurations. It is frequently desirable to determine the sensitivity of total system performance to the performance of its individual subsystems so that subsequent testing and design improvement efforts can concentrate on those subsystems that have the greatest effect on system performance. Thus, the cost of extensive testing of those subsystems that have a lesser impact on total system performance can be avoided.
Alan S. Goldfarb, Arlene R. Wusterbarth, Patricia A. Massimini, Douglas M. Medville