
About This Book

This volume contains papers presented at the first joint conference of the Software Process Workshop and the International Workshop on Software Process Simulation and Modeling (SPW/ProSim 2006), held in Shanghai, P.R. China, on May 20-21, 2006. The theme of SPW/ProSim 2006 was “Software Process Change – Meeting the Challenge.”

Software developers are under ever-increasing pressure to deliver their products more quickly and with higher levels of quality. These demands are set in a dynamic context of frequently changing technologies, limited resources and globally distributed development teams. At the same time, global competition is forcing organizations that develop software to cut costs by rationalizing processes, outsourcing part or all of their activities, reusing existing software in new or modified applications and evolving existing systems to meet new needs, while still minimizing the risk of projects failing to deliver.

To address these difficulties, new or modified processes are emerging, including agile methods and plan-based product line development. Open Source, COTS and community-developed software are becoming more popular. Outsourcing coupled with 24/7 development demands well-defined processes and interfaces to support the coordination of organizationally and geographically separated teams. All of these challenges combine to increase demands on the efficiency and effectiveness of software processes.




A Value-Based Software Process Framework

This paper presents a value-based software process framework derived from the 4+1 theory of value-based software engineering (VBSE). The framework integrates the four component theories (dependency, utility, decision, and control) with the central theory W, and orients itself as a seven-step process guide for practicing value-based software engineering. We also illustrate the application of the framework to a supply chain organization through a case study analysis.

Barry Boehm, Apurva Jain

Exploring the Business Process-Software Process Relationship

This paper argues for the need for mechanisms to support the analysis and tracing of relationships between the business process and the software process used to instantiate elements of that business process in software. Evidence supporting this argument is drawn from research in software process and from industry actions and needs as stated in reports to government.

Ross Jeffery

Assessing 3-D Integrated Software Development Processes: A New Benchmark

The increasing complexity and dynamics of software development have become the most critical challenges for large projects. As one of the newly emerged methodologies addressing these problems, TRISO-Model uses an integrated three-dimensional structure to classify and organize the essential elements in software development. In order to simulate and evaluate the modeling ability of TRISO-Model, a new benchmark, called the SPW-2006 Example, is created in this paper by extending the ISPW-6 Example. It may be used to evaluate other software process models, and/or to evaluate software organizations, software projects and software development processes, particularly 3-D integrated ones. With the SPW-2006 Example and its evolution toward quantitative evaluation of 3-D integrated software development processes, a new approach of TRISO-Model based assessment and improvement is enabled.

Mingshu Li

Ubiquitous Process Engineering: Applying Software Process Technology to Other Domains

Software engineering has learned a great deal about how to create clear and precise process definitions, and how to use them to improve final software products. This paper suggests that this knowledge can also be applied to good effect in many other domains where effective application of process technology can lead to superior products and outcomes. The paper offers medical practice and government as two examples of such domains, and indicates how process technology, first developed for application to software development, is being applied with notable success in those areas of endeavor. The paper also notes that some characteristics of these domains are highlighting ways in which current process technology seems to be inadequate, thereby suggesting ways in which this research is adding to the agenda for research in software process.

Leon J. Osterweil

Process Tailoring and Decision-Support

Dependencies Between Data Decisions

In this paper we show that storing and transmitting data is a complex practice, especially in an inter-organizational setting. We identified 18 data aspects that require careful consideration and coordination during a software process. We present these data aspects and point out that they are dealt with at different levels within Extended Enterprises. A good software process embraces the idea that choices have to be made on these 18 data aspects, and it recognizes the dependencies between the aspects and between decisions made at different levels in the enterprise.

Frank G. Goethals, Wilfried Lemahieu, Monique Snoeck, Jacques Vandenbulcke

Tailor the Value-Based Software Quality Achievement Process to Project Business Cases

This paper proposes a risk-based process strategy decision-making approach. To improve flexibility in applying the Value-Based Software Quality Achievement (VBSQA) process framework, we embed this risk-based approach into the framework. It helps project managers tailor the VBSQA process framework to different project business cases (schedule-driven, product-driven, and market-trend-driven). A real-world ERP (Enterprise Resource Planning) software project (DIMS) in China is used as an example to illustrate different process strategies generated from process tailoring.

Liguo Huang, Hao Hu, Jidong Ge, Barry Boehm, Jian Lü

Optimizing Process Decision in COTS-Based Development Via Risk Based Prioritization

Good project planning requires the use of an appropriate process model as well as effective decision-support techniques. However, current software process models provide very little COTS-specific insight or guidance to help COTS-based application developers make better decisions in their particular project situations. This paper presents a risk-based prioritization approach used in the context of the COTS Process Decision Framework [6]. The method is particularly useful in supporting many dominant decisions during the COTS-based development process, such as establishing COTS assessment criteria, scoping and sequencing development activities, and prioritizing features to be implemented in incremental development. In this way, the method not only provides a basis for optimal COTS selection, but also helps to focus limited development resources on the more critical tasks that represent greater risks.

Ye Yang, Barry Boehm

Process Tools and Metrics

Project Replayer – An Investigation Tool to Revisit Processes of Past Projects

In order to help knowledge acquisition and accumulation from past experiences, we propose a KFC (Knowledge Feedback Cycle) framework among engineers and researchers. Three tools (Empirical Project Monitor, Simulator, and Replayer) are used to circulate captured knowledge in the KFC. Project Replayer, the most distinctive of these tools, is used to review data of past projects derived from development logs: version control records, bug reports and e-mails. With Project Replayer, past projects can be easily revisited and their complicated phenomena investigated. In preliminary experiments, we confirmed that Project Replayer helps researchers construct and validate hypotheses about software processes, and that developers acquired new knowledge about problems extracted from past projects.

Keita Goto, Noriko Hankawa, Hajimu Iida

Software Process Measurement in the Real World: Dealing with Operating Constraints

Process measurement occurs in an increasingly dynamic context, characterized by limited resources and by the need to deliver results at the pace of changing technologies, processes and products. Traditional measurement techniques (like the GQM) have been extensively and successfully employed in situations with little or no operating constraints. This paper reports on a measurement project in which, in order to limit the cost and duration of the activities, the team could not perform ad hoc measurements but had to rely almost exclusively on data that could be extracted automatically from development and measurement tools already in use. Exploiting the flexibility of the GQM technique, and with the support of a GQM tool, it was possible to define and execute the measurement plan, analyze the collected data, and formulate results in only three months and with very limited resources.

Luigi Lavazza, Marco Mauri

Evaluation of Project Quality: A DEA-Based Approach

The evaluation of project quality exhibits multivariable, VRS (variable return to scale) and decision-maker preference properties. In this paper, we present a Data Envelopment Analysis (DEA) based evaluation approach. The DEA VRS model, which handles multiple variables and VRS effectively, is used to measure project quality. The DEA cone ratio model, which uses the Analytic Hierarchy Process (AHP) to constrain quality metrics with respect to the decision maker’s preferences, is also adopted to analyze the return to scale of the projects. A case study, which assesses 10 projects from ITECHS and 20 “Top active” projects with the novel method, is demonstrated. The results indicate that our approach is effective for quality evaluation and yields accurate estimates of possible future improvements.

Shen Zhang, Yongji Wang, Jie Tong, Jinhui Zhou, Li Ruan
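
The cone ratio model described above uses AHP to turn a decision maker's pairwise preferences into weight constraints. As a small illustration of that AHP step only (not the authors' DEA model; the metrics and comparison values below are hypothetical), priority weights can be derived from a pairwise comparison matrix by power iteration toward its principal eigenvector:

```python
# Illustrative AHP priority computation (not the paper's DEA model):
# derive quality-metric weights from a pairwise comparison matrix by
# power iteration on its principal (Perron) eigenvector.

def ahp_weights(matrix, iters=100):
    """Return priority weights (summing to 1) for a pairwise comparison matrix."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iters):
        # multiply the matrix by the current weight vector, then renormalize
        nxt = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(nxt)
        w = [v / total for v in nxt]
    return w

# Hypothetical preferences over three quality metrics: the first metric is
# judged 3x as important as the second and 5x as important as the third.
comparisons = [
    [1.0,     3.0,     5.0],
    [1 / 3.0, 1.0,     2.0],
    [1 / 5.0, 1 / 2.0, 1.0],
]
weights = ahp_weights(comparisons)
```

The resulting weights can then serve as the preference cone constraining the DEA metric weights.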

Process Management

A Pattern-Based Solution to Bridge the Gap Between Theory and Practice in Using Process Models

In order to extend the use of software process improvement programs and make them independent of organizational features, this work describes the results obtained using a knowledge-based model and tool, and proposes a pattern-based solution using a SPEM (Software Process Engineering Metamodel) extension in order to improve the efficiency of use of the proposed knowledge-based model.

Antonio Amescua, Javier García, Maria-Isabel Sánchez-Segura, Fuensanta Medina-Domínguez

On Mobility of Software Processes

In this paper, the mobility of software processes is proposed as a novel concept. It is defined as the structural change in a software process resulting from interactions among linked process elements. The concept addresses the essential kind of change in a software process that brings high variability and unpredictability to process performance. Three categories of mobility that lead to structural change are identified and expounded upon. A reference model for describing the concept is put forward based on the polyadic π-calculus. With the mobility of software processes, it is possible to design a new PCSEE and associated PML with increased flexibility.

Mingshu Li, Qiusong Yang, Jian Zhai, Guowei Yang

Software Process Fusion: Uniting Pair Programming and Solo Programming Processes

The role of the pair programming process in software development is controversial. This controversy arises in part from pair and solo programming being presented as alternatives, yet it would be more helpful to see them as complementary software management tools. This paper describes the application of such a complementary model, software process fusion (SPF), in a real-world software management situation in China. Pair and solo programming are adopted at different stages of the process and according to the background of the programmers, as appropriate. Unlike the usual practice of eXtreme Programming, in which all production code must be written in pairs all the time, the proposed model encourages programmers to design code patterns of their own in pairs and then to use these patterns to build sub-modules solo. The report finds that the longer team members work alone, the more code patterns they develop for later reuse in pairs.

Kim Man Lui, Keith C. C. Chan

Towards an Approach for Security Risk Analysis in COTS Based Development

More and more companies tend to use secure products as COTS to develop their secure systems due to resource limitations. Security concerns add complexity as well as potential risks to the COTS selection process, and it is always a great challenge for developers to make the selection decisions. In this paper, we provide a method for security risk analysis in COTS-based development (CBD) based on the Common Criteria and our previous work in identifying general risk items for CBD. The result provides useful insights for developers in identifying security risks, so that it can be used to aid the COTS selection decision.

Dan Wu, Ye Yang

COCOMO-U: An Extension of COCOMO II for Cost Estimation with Uncertainty

It is well documented that the software industry suffers from frequent cost overruns, and software cost estimation remains a challenging issue. A contributing factor is, we believe, the inherent uncertainty in the assessment of cost. Considering the uncertainty of cost drivers and representing cost as a distribution of values can help us better understand the uncertainty of cost estimates and provide decision support for budget setting and cost control. In this paper, we use Bayesian belief networks to extend COCOMO II for cost estimation with uncertainty, and construct the probabilistic cost model COCOMO-U. This model can deal with the uncertainties of cost factors and estimate the cost probability distribution. We also demonstrate in a case study how COCOMO-U is used to provide decision support for software development budget setting and cost control.

Da Yang, Yuxiang Wan, Zinan Tang, Shujian Wu, Mei He, Mingshu Li
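
COCOMO-U itself uses a Bayesian belief network; as a much simpler sketch of the same underlying idea, uncertain cost drivers can be represented as distributions and propagated by Monte Carlo sampling through the COCOMO II effort equation PM = A · Size^E · ∏EM. The size range and effort-multiplier ranges below are hypothetical, and the constants are only nominal COCOMO II values:

```python
import random

# Monte Carlo sketch of cost estimation under uncertainty (illustrative,
# not the paper's Bayesian network model): sample uncertain cost drivers
# and obtain a distribution of effort instead of a point estimate.

A, E = 2.94, 1.10  # nominal COCOMO II constants

def sample_effort(rng):
    size = rng.triangular(80, 150, 100)  # KSLOC: (low, high, mode), hypothetical
    em_product = 1.0
    for low, high, mode in [(0.9, 1.3, 1.0),   # e.g. required reliability
                            (0.8, 1.2, 1.0)]:  # e.g. team capability
        em_product *= rng.triangular(low, high, mode)
    return A * size ** E * em_product  # effort in person-months

rng = random.Random(42)
samples = sorted(sample_effort(rng) for _ in range(10000))
p10, p50, p90 = (samples[int(len(samples) * q)] for q in (0.10, 0.50, 0.90))
```

The percentile spread (p10, p50, p90) is what supports budget setting: a budget at p50 is overrun roughly half the time, while p90 gives a conservative reserve.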

A Product Line Enhanced Unified Process

The Unified Process facilitates reuse for a single system, but falls short in handling multiple similar products. In this paper we present an enhanced Unified Process, called UPEPL, which integrates product line technology in order to alleviate this problem. In UPEPL, product-line-related activities are added and can be conducted side by side with the other classical UP activities. In this way the advantages of both the Unified Process and software product lines co-exist in UPEPL. We show how to use UPEPL with an industrial mobile device product line in our case study.

Weishan Zhang, Thomas Kunz

Process Representation, Analysis and Modeling

Automatic Fault Tree Derivation from Little-JIL Process Definitions

Defects in safety critical processes can lead to accidents that result in harm to people or damage to property. Therefore, it is important to find ways to detect and remove defects from such processes. Earlier work has shown that Fault Tree Analysis (FTA) [3] can be effective in detecting safety critical process defects. Unfortunately, it is difficult to build a comprehensive set of Fault Trees for a complex process, especially if this process is not completely well-defined. The Little-JIL process definition language has been shown to be effective for defining complex processes clearly and precisely at whatever level of granularity is desired [1]. In this work, we present an algorithm for generating Fault Trees from Little-JIL process definitions. We demonstrate the value of this work by showing how FTA can identify safety defects in the process from which the Fault Trees were automatically derived.

Bin Chen, George S. Avrunin, Lori A. Clarke, Leon J. Osterweil
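
The mapping from a process definition to fault trees can be pictured with a toy sketch (this is not the paper's Little-JIL algorithm; the step kinds and the medical example below are hypothetical): a sequential step fails if any substep fails (an OR gate), while a step whose substeps are alternatives fails only if all alternatives fail (an AND gate).

```python
# Illustrative sketch (not the Little-JIL derivation algorithm): map a
# process-step hierarchy to a fault tree. A 'seq' step fails if ANY
# substep fails (OR gate); an 'alt' step with alternative substeps
# fails only if ALL alternatives fail (AND gate).

def to_fault_tree(step):
    """step is a (name, kind, substeps) tuple; kind is 'seq', 'alt', or 'leaf'."""
    name, kind, subs = step
    if kind == "leaf":
        return {"event": f"{name} fails"}
    gate = "OR" if kind == "seq" else "AND"
    return {"event": f"{name} fails", "gate": gate,
            "children": [to_fault_tree(s) for s in subs]}

# Hypothetical safety-critical process fragment:
process = ("deliver dose", "seq", [
    ("verify prescription", "leaf", []),
    ("administer", "alt", [
        ("nurse administers", "leaf", []),
        ("physician administers", "leaf", []),
    ]),
])
tree = to_fault_tree(process)
```

Walking such a tree for minimal cut sets then points at single events (here, a failed prescription check) whose occurrence alone defeats the process.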

Workflows and Cooperative Processes

Workflows emphasize the partial order of activities and the flow of data between activities. In contrast, cooperative processes emphasize the sharing of artefacts and their gradual evolution toward the final product, under the cooperative and concurrent activities of all the involved actors.

This paper contrasts workflow and cooperative processes and shows that they are more complementary than conflicting and that, given some extensions, both approaches can fit into a single tool and formalism.

The paper presents Celine, a concurrent engineering tool that can also define and support classic workflows and software processes. We claim that the availability of both classes of features allows for the modelling and support of very flexible processes, closer to software engineering reality.

Jacky Estublier, Sergio Garcia

Spiral Lifecycle Increment Modeling for New Hybrid Processes

The spiral lifecycle is being extended to address new challenges for Software-Intensive Systems of Systems (SISOS), such as coping with rapid change while simultaneously assuring high dependability. A hybrid plan-driven and agile process has been outlined to address these conflicting challenges with the need to rapidly field incremental capabilities. A system dynamics model has been developed to assess the incremental hybrid process and support project decision-making. It estimates cost and schedule for multiple increments of a hybrid process that uses three specialized teams. It considers changes due to external volatility and feedback from user-driven change requests, and dynamically re-estimates and allocates resources in response to the volatility. Deferral policies and team sizes can be experimented with, and it includes tradeoff functions between cost and the timing of changes within and across increments, length of deferral delays, and others. Both the hybrid process and simulation model are being evolved on a very large scale incremental project and other potential pilots.

Raymond Madachy, Barry Boehm, Jo Ann Lane

Definition and Analysis of Election Processes

This paper shows that process definition and analysis technologies can be used to reason about the vulnerability of election processes with respect to incorrect or fraudulent behaviors by election officials. The Little-JIL language is used to model example election processes, and various election worker fraudulent behaviors. The FLAVERS finite-state verification system is then used to determine whether different combinations of election worker behaviors cause the process to produce incorrect election results or whether protective actions can be used to thwart these threats.

Mohammad S. Raunak, Bin Chen, Amr Elssamadisy, Lori A. Clarke, Leon J. Osterweil

The Design of a Flexible Software Process Language

We propose a flexible process language (FLEX) to specify both the process model and the meta-process model within a uniform framework based on a process ontology. A process ontology and some kernel meta-activities are presented as the foundation for process support, such as modeling, enaction and evolution. In contrast to other process languages, which can only evolve the process model while encoding meta-process logic within PSEEs, our approach can also evolve the meta-process model to adjust the process support mechanism flexibly.

Beijun Shen, Cheng Chen

Building Business Process Description and Reasoning Meta-model M_bp in A-Prolog

In order to precisely elicit and describe the business processes of Complex Information Systems (CIS) in the requirements analysis phase, avoid inconsistent or ambiguous process definitions, and help with reasoning about, checking and planning processes, a Business Process Meta-model M_bp is proposed, composed of three hierarchical representations: interactive multi-business processes, individual business processes, and their constituent business elements. This paper presents the applicability of M_bp to the representation of business processes and to multiple aspects of reasoning about processes and their effects. Finally, based on a system (Business Process Planning based on A-Prolog) that has been applied in CIS development, an example of applying business process reasoning to workflow planning demonstrates that M_bp can simplify and improve business process representation and analysis of CIS reasonably and effectively.

Hai Wan, Yunxiang Zheng, Yin Chen, Lei Li

A Process-Agent Construction Method for Software Process Modeling in SoftPM

Software development, unlike manufacturing, is highly dependent on the capabilities of individual software engineers and software development teams. The SEI presents PSP and TSP to establish personal and team capabilities in the software process, to maintain them, and to assist organizations in conducting CMMI-based process improvement. Thus, executors’ capabilities should be taken into account as a key issue in software process modeling. ISCAS conducts research on Organization-Entity capabilities-based software process modeling and presents a corresponding method. The Organization-Entities have definite capabilities and are called Process-Agents. The modeling method applies agent technology to organize the basic process units and to establish the project process system self-adaptively, according to the specific project goal and constraining environment. In this paper, we present the method for constructing a Process-Agent. Each Process-Agent comprises two parts: first, an infrastructure to describe the Process-Agent’s knowledge, and second, an engine driven by the external environment, used to reason about the Process-Agent’s behavior based on its knowledge.

Qing Wang, Junchao Xiao, Mingshu Li, M. Wasif Nisar, Rong Yuan, Lei Zhang

Applying Little-JIL to Describe Process-Agent Knowledge in SoftPM

In a software process modeling method based upon the Organization-Entity capability, the Process-Agent is a well-defined unit whose role is to encapsulate an entity’s knowledge, skill etc. The Process-Agent’s infrastructure comprises descriptive knowledge, process knowledge and an experience library. The process knowledge is represented by process steps, whose execution determines the behaviors of the Process-Agent. This causes Process-Agent knowledge to be precisely described and well organized. In this paper, Little-JIL, a well-known process modeling language, is used to define a Process-Agent’s process knowledge. Benefits for process element knowledge representation arising from Little-JIL’s simplicity, semantic richness, expressiveness, formal and precise yet graphical syntax etc., are described.

Junchao Xiao, Leon J. Osterweil, Lei Zhang, Alexander Wise, Qing Wang

Process Simulation Modeling

Reusable Model Structures and Behaviors for Software Processes

An organization of increasingly complex system dynamics model structures and behaviors has been developed to promote modeling reuse for software processes. It uses an object-oriented framework for describing structures in a class hierarchy with inheritance relationships. This original approach provides a set of common assets that can be referenced for a “product line” of software process models. The structures and their behaviors are process patterns that frequently occur, and the recurring structures are model building blocks that can be reused. They provide a framework for understanding, modifying and creating system dynamics models regardless of experience. Previous work can be understood more easily and the structures incorporated into new models with minimal modification. A goal of this work is to help accelerate software process modeling and simulation activities. Experience indicates that the model assets scale from small to large, complex models. Examples of the constructs and executable versions of associated models are also available.

Raymond Madachy

Organization-Theoretic Perspective for Simulation Modeling of Agile Software Processes

Software development is a team effort that requires cooperation among individuals via task allocation, coordination of actions, and if necessary avoidance and/or management of conflicts among members of the organization. This perspective contrasts with the production focused view of software development. That is, interaction becomes the central activity, not a side-effect of a method’s prescription. Understanding the principles and components of organizational behavior for inclusion in software process models improves the level of fidelity and credibility of existing process simulations. Furthermore, there are strong connections between the neo-information processing view of organizations and agile software development. This paper introduces the conceptual basis for an agent-based simulation modeling test-bed, Team-RUP, which is based on an organization-theoretic perspective for simulation modeling of agile software processes.

Levent Yilmaz, Jared Phillips

Semi-quantitative Simulation Modeling of Software Engineering Process

Software process simulation models hold out the promise of improving project planning and control. However, purely quantitative models require a very detailed understanding of the software process, i.e., process knowledge represented quantitatively. When such data is lacking, quantitative models impose severe constraints, restricting the model’s value. In contrast, qualitative models display all possible behaviors, but only in qualitative terms. This paper illustrates the value and flexibility of semi-quantitative modeling by developing a model of the software staffing process and comparing it with other quantitative staffing models. We show that the semi-quantitative model provides more insight into the staffing process, and more confidence in the outcomes, than the quantitative models by achieving a tradeoff between quantitative and qualitative simulation. In particular, the semi-quantitative simulation produces a set of possible outcomes with ranges of real numeric values. The semi-quantitative model allows us to determine the solution boundaries for specific scenarios under conditions of limited knowledge.

He Zhang, Barbara Kitchenham
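
The core idea, producing outcome ranges rather than point predictions when only parameter ranges are known, can be pictured with a toy interval calculation (this is not the authors' staffing model; the staffing figures below are hypothetical):

```python
# Semi-quantitative sketch: when only ranges of model parameters are
# known, propagate interval bounds through a trivial staffing model to
# obtain bounds on project duration. All numbers are hypothetical.

def duration_bounds(size_ksloc, staff, productivity):
    """Each argument is a (low, high) interval; returns (low, high) duration in months."""
    lo = size_ksloc[0] / (staff[1] * productivity[1])  # best case: small job, big fast team
    hi = size_ksloc[1] / (staff[0] * productivity[0])  # worst case: big job, small slow team
    return lo, hi

lo, hi = duration_bounds(size_ksloc=(90, 110),       # estimated system size
                         staff=(8, 12),              # available staff
                         productivity=(0.8, 1.2))    # KSLOC per person-month
```

A real semi-quantitative simulation tracks such bounds through the model's dynamics over time; the point here is only that the output is a solution envelope rather than a single trajectory.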

Process Simulation Applications

Analysis of Software-Intensive System Acquisition Using Hybrid Software Process Simulation

Many sources have reported that the technical and managerial maturity of the acquirer is the essential key to success in Software-Intensive System Acquisition (SISA) and have recommended adopting the best practices. However, the DoD has been slow to implement the SISA practices because it does not fully understand how and why the SISA practices affect the performance of software-intensive system development.

In this research, we analyze the effects of SISA practices on acquirer and developer using hybrid software process simulation modeling. Our approach represents the dynamic characteristics of SISA programs (e.g., the interactions of the acquisition and development organizations and the effects of several SISA practices) as well as their discrete characteristics (e.g., the specific characteristics of discrete phases). This research will contribute to revealing how the acquirer’s activities influence the performance of the developer’s process.

KeungSik Choi, Doo-Hwan Bae

Simulation-Based Stability Analysis for Software Release Plans

Release planning for incremental software development assigns features to releases such that the most important technical, resource, risk and budget constraints are met. The research presented in this paper is based on a three-stage procedure. In addition to an existing method for (i) strategic release planning, which maps requirements to subsequent releases, and (ii) more fine-grained planning, which defines resource allocations for each individual release, we propose a third step, (iii) stability analysis, which analyzes proposed release plans with regard to their sensitivity to unforeseen changes. Unforeseen changes can relate to alterations in expected personnel availability and productivity, feature-specific task size (measured in terms of effort), and degree of task dependency (measured in terms of work load that can only be processed once corresponding work in predecessor tasks has been completed). The focus of this paper is on the stability analysis of proposed release plans. We present the simulation model REPSIM (Release Plan Simulator) and illustrate its usefulness for stability analysis with the help of a case example.

Dietmar Pfahl, Ahmed Al-Emran, Günther Ruhe
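
A stability analysis of this kind can be pictured with a minimal Monte Carlo sketch (this is not the REPSIM model; the effort figures, capacity, and noise level below are hypothetical): perturb each feature's estimated effort and measure how often the plan still fits within the release's capacity.

```python
import random

# Illustrative release-plan stability check (not the REPSIM model):
# randomly perturb each feature's effort estimate and report the
# fraction of scenarios in which total effort stays within capacity.

def plan_stability(efforts, capacity, noise=0.25, trials=10000, seed=1):
    """Fraction of perturbed scenarios in which the plan remains feasible."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(trials):
        # scale each feature's effort by a uniform factor in [1-noise, 1+noise]
        total = sum(e * rng.uniform(1 - noise, 1 + noise) for e in efforts)
        if total <= capacity:
            ok += 1
    return ok / trials

# Hypothetical plan with 20% slack: 100 person-days estimated, 120 of capacity.
stability = plan_stability([30, 25, 20, 15, 10], capacity=120)
```

A plan whose stability drops sharply under modest noise is a candidate for re-planning, e.g., by deferring a feature to the next release.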

Exploring the Impact of Task Allocation Strategies for Global Software Development Using Simulation

We describe a hybrid computer simulation model of the software development process that is specifically architected to study alternative ways to configure global software development projects, including phase-based, module-based, and follow-the-sun allocation strategies. The model is a hybrid system dynamics and discrete event model. In this paper, test cases are developed for each allocation strategy, and project duration under each configuration is computed under a range of plausible assumptions for key parameters. The primary finding is that although follow-the-sun can produce impressive reductions in time-to-market under ideal assumptions, the reverse is true under more realistic assumptions, corroborating findings by other researchers. Further analysis reveals some interaction between the assumptions, but the results remain robust.

Siri-on Setamanit, Wayne Wakeland, David Raffo

Users and Developers: An Agent-Based Simulation of Open Source Software Evolution

We present an agent-based simulation model of open source software (OSS). To our knowledge, this is the first model of OSS evolution that includes four significant factors: productivity limited by the complexity of software modules, the software’s fitness for purpose, the motivation of developers, and the role of users in defining requirements. The model was evaluated by comparing the simulated results against four measures of software evolution (system size, proportion of highly complex modules, level of complexity control work, and distribution of changes) for four large OSS systems. The simulated results resembled all the observed data, including alternating periods of growth and stagnation. The fidelity of the model suggests that the factors included here have significant effects on the evolution of OSS systems.

Neil Smith, Andrea Capiluppi, Juan Fernández-Ramil

Simulating the Structural Evolution of Software

As functionality is added to an ageing piece of software, its original design and structure tend to erode. The underlying forces that cause such degradation have been the subject of much research. However, progress in this field is slow due to the difficulty of generating empirical data [6] and of attributing observed effects to the various points in the causal chain [7]. This paper tackles these problems by providing a framework for simulating the structural evolution of software. A complete model is built by incrementally adding modules to the framework, each of which contributes an individual evolutionary effect. These effects are then combined to form a multi-faceted simulation that evolves a fictitious code base approximating real-world behavior. Validation of a simple set of evolutionary parameters is provided, demonstrating agreement with current empirical observations.

Benjamin Stopford, Steve Counsell

Experience Report

An Empirical Study on SW Metrics for Embedded System

One of the most important reasons to measure software projects is to increase the visibility of the development process. High visibility enables the development team to estimate their project more accurately and to establish a reasonable project plan. It also provides a rationale for improving the software development process. We propose the standard metrics our organization needs to develop embedded systems and present the results of the related data collection and measurement. To define the software metrics, the GQM (Goal Question Metric) approach was used. This empirical study should benefit members of the SEPG (Software Engineering Process Group) of each software development division who are driving improvement of their software development process.

Taehee Gwak, Yoonjung Jang


Process-Family-Points

Software system families are characterized by structured reuse of components and a high degree of automation based on a common infrastructure. The efficiency of software system families can be increased by explicitly considering process flows in process-driven application domains. Based on this observation, this article briefly describes the approach of process family engineering. The Process-Family-Points metrics are then explained in detail; they are currently the only framework for measuring the size and estimating the effort of process families. Subsequently, this paper presents the first results from a validation of the Process-Family-Points in the application domains of eBusiness and Automotive. After an evaluation of these empirical data, the paper concludes with an outlook on future activities.

Sebastian Kiebusch, Bogdan Franczyk, Andreas Speck

Automated Recognition of Low-Level Process: A Pilot Validation Study of Zorro for Test-Driven Development

Zorro is a system designed to automatically determine whether a developer is complying with the Test-Driven Development (TDD) process. Automated recognition of TDD could benefit the software engineering community in a variety of ways, from pedagogical aids to support the learning of test-driven design, to support for more rigorous empirical studies on the effectiveness of TDD in practice. This paper presents the Zorro system and the results of a pilot validation study, which shows that Zorro was able to recognize test-driven design episodes correctly 89% of the time. The results also indicate ways to improve Zorro’s classification accuracy further, and provide evidence for the effectiveness of this approach to low-level software process recognition.

Hongbing Kou, Philip M. Johnson

Process Evolution Supported by Rationale: An Empirical Investigation of Process Changes

Evolving a software process model without a retrospective and, in consequence, without an understanding of the process evolution can lead to severe problems for the software development organization, e.g., inefficient performance as a consequence of arbitrarily introduced changes, or difficulty in demonstrating compliance with a given standard. Capturing information on the rationale behind changes can provide a means for better understanding process evolution. This article presents the results of an exploratory study with the goal of understanding the nature of process changes in a given context. It presents the most important issues that motivated process engineers to change important aerospace software process standards during an industrial project. The study is part of research work intended to incrementally define a systematic mechanism for process evolution supported by rationale information.

Alexis Ocampo, Jürgen Münch

Implementing Process Change in a Software Organization – An Experience Based Study

No matter how much better our software process becomes, it will not bring much benefit unless it is adopted by the targeted process users. Based on the author’s recent industrial working experience at Motorola Global Software Group (GSG) as an organizational technology deployment champion, with the implementation of Motorola’s Enterprise Project Management System in GSG during 2001–2003 as a background story, this paper presents a study of the real-world challenges in implementing process change in today’s software organizations. Reflecting on the good practices and the “do-differentlies,” the paper provides strategic as well as practical recommendations for dealing with the challenges, and points out the key practices for successful software process change management.

Shaowen Qin

Practical Experiences of Cost/Schedule Measure Through Earned Value Management and Statistical Process Control

Cost and schedule measurement are among the most important support activities for the success of a project; they provide the basis for process improvement and project management. This paper reports practical experiences of using EVM (Earned Value Management) and SPC (Statistical Process Control) for cost/schedule measurement. Analysis of the experience data indicates that the distributions of the CPI (Cost Performance Index) and SPI (Schedule Performance Index) generally follow the normal distribution. Consequently, it is reasonable and effective to employ SPC in EVM.

Qing Wang, Nan Jiang, Lang Gou, Meiru Che, Ronghui Zhang
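The EVM indices referenced in the abstract above are standard definitions (CPI = EV/AC, SPI = EV/PV), and SPC typically places control limits at the mean plus or minus three standard deviations. The following minimal sketch illustrates these textbook formulas; the function names and sample values are illustrative assumptions, not taken from the paper.

```python
# Illustrative sketch of the standard EVM indices and SPC 3-sigma limits.
# CPI = EV / AC (cost efficiency); SPI = EV / PV (schedule efficiency).
from statistics import mean, stdev


def cpi(earned_value: float, actual_cost: float) -> float:
    """Cost Performance Index: earned value per unit of actual cost."""
    return earned_value / actual_cost


def spi(earned_value: float, planned_value: float) -> float:
    """Schedule Performance Index: earned value per unit of planned value."""
    return earned_value / planned_value


def control_limits(samples: list[float]) -> tuple[float, float, float]:
    """Return (lower, center, upper) 3-sigma Shewhart control limits."""
    m, s = mean(samples), stdev(samples)
    return m - 3 * s, m, m + 3 * s


# Hypothetical per-period CPI observations, e.g. one per reporting cycle:
cpi_samples = [0.95, 1.02, 0.98, 1.05, 0.99]
lower, center, upper = control_limits(cpi_samples)
```

A period whose CPI falls outside `[lower, upper]` would be flagged as out of statistical control and investigated, which is the combination of EVM and SPC the paper describes.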

