
2010 | Book

New Modeling Concepts for Today’s Software Processes

International Conference on Software Process, ICSP 2010, Paderborn, Germany, July 8-9, 2010. Proceedings

Edited by: Jürgen Münch, Ye Yang, Wilhelm Schäfer

Publisher: Springer Berlin Heidelberg

Book series: Lecture Notes in Computer Science


About this book

2010 was the first time that the International Conference on Software Process was held autonomously and not co-located with a larger conference. This was a special challenge, and we are glad that the conference gained a lot of attention, a significant number of contributions, and many highly interested participants from industry and academia. This volume contains the papers presented at ICSP 2010, held in Paderborn, Germany, during July 8-9, 2010. ICSP 2010 was the fourth conference of the ICSP series. The conference provided a forum for researchers and industrial practitioners to exchange new research results, experiences, and findings in the area of software and system process modeling and management. The increasing distribution of development activities, new development paradigms such as cloud computing, new classes of systems such as cyber-physical systems, and short technology cycles are currently driving forces for the software domain. They require appropriate answers with respect to process models and management, suitable modeling concepts, and an understanding of the effects of the processes in specific environments and domains. Many papers in the proceedings address these issues.

Table of Contents

Frontmatter

Invited Talk

A Risk-Driven Decision Table for Software Process Selection

It is becoming increasingly clear that there is no one-size-fits-all process for software projects. Requirements-first, top-down waterfall or V-models work well when the requirements can be determined in advance, but are stymied by emergent requirements or bottom-up considerations such as COTS- or legacy-driven projects. Agile methods work well for small projects with emergent requirements, but do not scale well or provide high assurance of critical attributes such as safety or security. Formal methods provide high assurance but are generally expensive in calendar time and effort and do not scale well.

We have developed the Incremental Commitment Model (ICM) to include a set of explicit risk-driven decision points that branch in different ways to cover numerous possible project situations. As with other models trying to address a wide variety of situations, its general form is rather complex. However, its risk-driven nature has enabled us to determine a set of twelve common risk patterns and organize them into a decision table that can help new projects converge on a process that fits well with their particular process drivers. For each of the twelve special cases, the decision table provides top-level guidelines for tailoring the key activities of the ICM, along with suggested lengths between each internal system build and each external system increment delivery. This presentation will provide a summary of the ICM, show how its process decision drivers can generally be determined in the initial ICM Exploration phase, summarize each of the twelve cases, and provide representative examples of their use.

Barry W. Boehm
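To give a concrete feel for how such a risk-driven decision table might be encoded, the following minimal Python sketch maps a few assumed process drivers to tailoring guidelines; the driver categories, recommendations, and build/increment lengths are invented placeholders, not the twelve ICM cases from the talk.

```python
# Hypothetical sketch of a risk-driven process decision table.
# The risk patterns and recommendations are illustrative placeholders,
# not the actual twelve ICM special cases.

from dataclasses import dataclass

@dataclass
class ProcessGuideline:
    process: str                    # recommended process family
    internal_build_weeks: int       # suggested length between internal builds
    external_increment_months: int  # suggested length between external deliveries

DECISION_TABLE = {
    # (requirements, scale, criticality) -> guideline
    ("stable", "small", "low"): ProcessGuideline("plan-driven waterfall/V-model", 4, 6),
    ("emergent", "small", "low"): ProcessGuideline("architected agile", 2, 3),
    ("emergent", "large", "high"): ProcessGuideline("hybrid ICM agile/plan-driven mix", 3, 6),
    ("stable", "large", "high"): ProcessGuideline("rigorous plan-driven with formal assurance", 6, 12),
}

def select_process(requirements: str, scale: str, criticality: str) -> ProcessGuideline:
    """Look up the process guideline that matches the project's risk drivers."""
    return DECISION_TABLE[(requirements, scale, criticality)]

print(select_process("emergent", "small", "low"))
```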

Process Alignment

Using Process Definitions to Support Reasoning about Satisfaction of Process Requirements

This paper demonstrates how a precise definition of a software development process can be used to determine whether the process definition satisfies certain of its requirements. The paper presents a definition of a Scrum process written in the Little-JIL process definition language. The definition’s details facilitate understanding of this specific Scrum process (while also suggesting the possibility of many variants of the process). The paper also shows how these process details can support the use of analyzers to draw inferences that can then be compared to requirements specifications. Specifically, the paper shows how finite state verification can be used to demonstrate that the process protects the team from requirements changes during a sprint, and how analysis of a fault tree derived from the Little-JIL Scrum definition can demonstrate the presence of a single point of failure in the process, suggesting that this particular Scrum process may fail to meet certain process robustness requirements. A new Scrum process variant is then presented and shown to be more robust in that it lacks the single point of failure.

Leon J. Osterweil, Alexander Wise
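As a rough illustration of the kind of property a finite-state verifier would check against such a process definition, the toy trace checker below tests whether a requirements change is ever accepted inside a sprint; it is not Little-JIL or its analysis toolset, and the event names are assumptions made for the example.

```python
# Minimal illustration of the kind of property a finite-state verifier checks:
# "no requirements change is accepted while a sprint is in progress".
# This is a toy trace checker, not Little-JIL or its analyzers.

def violates_sprint_protection(trace):
    """Return True if an 'accept_requirement_change' event occurs inside a sprint."""
    in_sprint = False
    for event in trace:
        if event == "sprint_start":
            in_sprint = True
        elif event == "sprint_end":
            in_sprint = False
        elif event == "accept_requirement_change" and in_sprint:
            return True  # property violated
    return False

ok_trace = ["accept_requirement_change", "sprint_start", "develop", "sprint_end"]
bad_trace = ["sprint_start", "accept_requirement_change", "sprint_end"]

assert not violates_sprint_protection(ok_trace)
assert violates_sprint_protection(bad_trace)
```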
Early Empirical Assessment of the Practical Value of GQM+Strategies

The success of a measurement initiative in a software company depends on the quality of the links between metrics programs and organizational business goals. GQM+Strategies is a new approach designed to help in establishing links between organizational business goals and measurement programs. However, there are no reported industrial experiences that demonstrate the practical value of the method. We designed a five-step research approach in order to assess the method’s practical value. The approach utilized the revised Bloom’s taxonomy as a framework for assessing the practitioners’ cognition level of the concepts. The results of our study demonstrate that the method has practical value for the case company and is capable of addressing real-world problems. In the conducted empirical study, the cognition level of the practitioners was sufficient for a successful implementation of the method.

Vladimir Mandić, Lasse Harjumaa, Jouni Markkula, Markku Oivo
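The following small Python sketch illustrates the goal-strategy-measurement linkage that GQM+Strategies is designed to establish; the business goal, strategy, and metrics are invented examples, not material from the study.

```python
# Tiny data-structure sketch of a goal-strategy-measurement linkage.
# The goal, strategy, and metrics are invented examples.

from dataclasses import dataclass, field

@dataclass
class MeasurementGoal:
    question: str
    metrics: list

@dataclass
class Strategy:
    description: str
    measurement_goals: list = field(default_factory=list)

@dataclass
class BusinessGoal:
    statement: str
    strategies: list = field(default_factory=list)

goal = BusinessGoal(
    statement="Reduce customer-reported defects by 20% within one year",
    strategies=[
        Strategy(
            description="Strengthen peer reviews before system test",
            measurement_goals=[
                MeasurementGoal(
                    question="Are reviews finding defects early?",
                    metrics=["review coverage", "defects found per review hour"],
                )
            ],
        )
    ],
)

# Traceability from each metric back to the business goal it supports.
for s in goal.strategies:
    for mg in s.measurement_goals:
        for metric in mg.metrics:
            print(f"{metric!r} -> {mg.question!r} -> {s.description!r} -> {goal.statement!r}")
```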
Determining Organization-Specific Process Suitability

Having software processes that fit technological, project, and business demands is one important prerequisite for software-developing organizations to operate successfully in a sustainable way. However, many such organizations suffer from processes that do not fit their demands, either because they do not provide the necessary support, or because they provide features that are no longer necessary. This leads to unnecessary costs during the development cycle, a phenomenon that worsens over time. This paper presents the SCOPE approach for systematically determining the process demands of current and future products and projects, for analyzing existing processes aimed at satisfying these demands, and for subsequently selecting those processes that provide the most benefit for the organization. The validation showed that SCOPE is capable of adjusting an organization’s process scope in such a way that the most suitable processes are kept and the least suitable ones can be discarded.

Ove Armbrust
On Scoping Stakeholders and Artifacts in Software Process

Stakeholders and artifacts are considered two important elements in software engineering processes, but they are rarely systematically investigated in software process modeling and simulation. Inspired by the Workshop of Modeling Systems and Software Engineering Processes in 2008 at the University of Southern California and our previous studies on integrating stakeholders’ perspectives into software process modeling, we undertook a study on the application of these entities in software engineering, through both a systematic review and a complementary web survey within software process research and practice communities. Our results reveal that the portion of studies on process stakeholders and process artifacts in software engineering is unexpectedly small, and that there is a lack of consistent understanding of process stakeholder roles. By further analyzing stakeholder roles and artifact types based on our results, we define the stakeholder and artifact in the context of software process engineering, and differentiate stakeholder and artifact in different application scopes.

Xu Bai, LiGuo Huang, He Zhang
Critical Success Factors for Rapid, Innovative Solutions

Many of today’s problems are in search of new, innovative solutions. However, the development of new and innovative solutions has been elusive to many, resulting in considerable effort and expense with either no solution or a mediocre solution delivered late to the marketplace or customer. This paper describes the results of research conducted to identify the critical success factors employed by several successful, high-performance organizations in the development of innovative systems. These critical success factors span technical, managerial, people, and cultural aspects of the innovative environment.

Jo Ann Lane, Barry Boehm, Mark Bolas, Azad Madni, Richard Turner

Process Management

Evidence-Based Software Processes

Many software projects fail because they commit to a set of plans and specifications with little evidence that, if these are used on the project, they will lead to a feasible system being developed within the project’s budget and schedule. An effective way to avoid this is to make the evidence of feasibility a first-class developer deliverable that is reviewed by independent experts at key decision milestones: shortfalls in evidence are risks to be considered in going forward. This further implies that the developer will create and follow processes for evidence development. This paper provides processes for developing and reviewing feasibility evidence, and for using risk to determine how to proceed at major milestones. It also provides quantitative results on ”how much investment in evidence is enough,” as a function of the project’s size, criticality, and volatility.

Barry Boehm, Jo Ann Lane
SoS Management Strategy Impacts on SoS Engineering Effort

To quickly respond to changing business and mission needs, many organizations are integrating new and existing systems with commercial-off-the-shelf (COTS) products into network-centric, knowledge-based, software-intensive systems of systems (SoS). With this approach, system development processes to define the new architecture, identify sources to either supply or develop the required components, and eventually integrate and test these high level components are evolving and are being referred to as SoS Engineering (SoSE). This research shows that there exist conditions under which investments in SoSE have positive and negative returns on investment, provides the first quantitative determination of these conditions, and points out directions for future research that would strengthen the results.

Jo Ann Lane
Using Project Procedure Diagrams for Milestone Planning

This paper presents Project Procedure Diagrams (PPDs) as a technique for specifying and elaborating project milestone plans. The graphical syntax of PPDs is similar to UML activity diagrams. The operational semantics is based on token flow and resembles playing common board games. The base concepts can be easily grasped by project managers and lend themselves to experimentation and simulation.

In spite of their apparent simplicity, PPDs offer support for advanced concepts like parallel subprojects, multi-level iterations, forward and backward planning, and the composition of complex plans from modular process components. Furthermore, PPDs are based on a rigorous, formally founded syntax and semantics to ease the development of tools.

Klaus Bergner, Jan Friedrich
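A toy token-flow step in Python, loosely inspired by the board-game-like semantics described above, might look as follows; the milestone graph and firing rule are simplified placeholders (join and merge rules are omitted) rather than actual PPD syntax or semantics.

```python
# Toy token-flow step over a milestone graph. The graph and firing rule are
# simplified placeholders; real PPD semantics (e.g., joins) are omitted.

milestone_graph = {
    "project_start": ["design_done"],
    "design_done": ["build_1_done", "build_2_done"],  # parallel subprojects
    "build_1_done": ["integration_done"],
    "build_2_done": ["integration_done"],
    "integration_done": [],
}

def step(tokens):
    """Advance every token to the successors of its current milestone."""
    next_tokens = []
    for node in tokens:
        successors = milestone_graph[node]
        next_tokens.extend(successors if successors else [node])  # final nodes hold their token
    return next_tokens

tokens = ["project_start"]
for _ in range(4):
    tokens = step(tokens)
    print(tokens)
```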
A Framework for the Flexible Instantiation of Large Scale Software Process Tailoring

Due to the variety of concerns affecting software development in large organizations, generic software processes have to be adapted to project-specific needs to be effectively applicable in individual projects. We describe the architecture of a tool aiming to provide support for the tailoring and instantiation of reference processes. The goal is to minimize the effort for the individualisation of generic software processes to project-specific needs. In contrast to existing approaches, our prototype provides flexible support for adaptation decisions made by project managers while adhering to modelling constraints imposed by the modelling language used, enterprise policies, or business process regulations.

Peter Killisperger, Markus Stumptner, Georg Peters, Georg Grossmann, Thomas Stückl
A Methodological Framework and Software Infrastructure for the Construction of Software Production Methods

The theory of Method Engineering is becoming increasingly solid, but very few engineering tools have been developed to support the application of its research results. To overcome this limitation, this paper presents a methodological framework based on Model Driven Engineering techniques. The framework provides a method, supported by a software platform, for the construction of software production methods. It covers everything from the specification of the software production method to the generation of the CASE tool that supports it. This generation process has been semi-automated through model transformations. The CASE tool and the software platform are based on the Eclipse-based MOSKitt tool. MOSKitt’s plugin-based architecture and integrated modelling tools make it a suitable software platform to support our proposal. To validate the proposal we have applied the framework to a case study.

Mario Cervera, Manoli Albert, Victoria Torres, Vicente Pelechano
Software Factories: Describing the Assembly Process

Software Factories pose a paradigm shift that promises to make application assembly more cost-effective through systematic reuse. These advances in software industrialization have, however, reduced the cost of coding applications at the expense of increased assembly complexity, i.e., the process of coming up with the final end application. To alleviate this problem, we advocate a new discipline inside the general software development process, Assembly Plan Management, that makes it possible to cope with the complexity of assembly processes. A non-trivial case study application is presented.

Maider Azanza, Oscar Díaz, Salvador Trujillo
How to Welcome Software Process Improvement and Avoid Resistance to Change

Pressures for more complex products, customer dissatisfaction, and problems related to cost and schedule overruns increase the need for effective management response and for improvement of software development practices. In this context, cultural aspects can influence and interfere with the successful implementation of a software process improvement program. This paper explores cultural issues, discussing in detail one de-motivating factor for successfully implementing a software process improvement action. The analysis was carried out in a software development organization and provided some insights into how this organization could overcome it. We backed our studies by conducting a process simulation. Our findings suggest that, beyond finance, technology, and other issues, cultural aspects should be among the first concerns to be taken into account when implementing a Software Process Improvement program. Our main contribution is to provide evidence that a small change in the behavior of the software development team members can improve the quality of the product and reduce development rework.

Daniela C. C. Peixoto, Vitor A. Batista, Rodolfo F. Resende, Clarindo Isaías P. S. Pádua

Process Models

The Incremental Commitment Model Process Patterns for Rapid-Fielding Projects

To provide better services to customers and to avoid being left behind in a competitive business environment, a wide variety of ready-to-use software and technologies is available for building up software systems at a very fast pace. Rapid fielding plays a major role in developing software systems that provide a quick response to the organization. This paper investigates the appropriateness of current software development processes and develops new software development process guidelines, focusing on four process patterns: Use single Non-Developmental Item (NDI), NDI-intensive, Services-intensive, and Architected Agile. Currently, there is no single software development process model that is applicable to all four process patterns, but the Incremental Commitment Model (ICM) can help a new project converge on a process that fits its process drivers. This paper also presents process decision criteria in terms of these drivers and relates them to the ICM Electronic Process Guide.

Supannika Koolmanojwong, Barry Boehm
A Repository of Agile Method Fragments

Despite the prevalence of agile methods, few software companies adopt a prescribed agile process in its entirety. For many practitioners, agile software development is about picking up fragments from various agile methods, assembling them into a light-weight process, and deploying them in their software projects. Given the growing number of published empirical studies about using agile methods in different project situations, it is now possible to gain a more realistic view of what each agile method fragment can accomplish and the necessary requisites for its successful deployment. With the aim of making this knowledge more accessible, this paper introduces a repository of agile method fragments, which organizes the evidential knowledge according to the fragments’ objectives and requisites. The knowledge is gathered through a systematic review of empirical studies that investigated the enactment of agile methods in various project situations. In addition, the paper proposes a modeling paradigm for visualizing the stored knowledge of method fragments, to facilitate the subsequent assembly task.

Hesam Chiniforooshan Esfahani, Eric Yu
OAP: Toward a Process for an Open World

To deal with increasing complexity, volatility, and uncertainty, a process for engineering software-intensive systems in an open world, called the Organic Aggregation Process (OAP), is proposed. It is based on a concept called “Organic Aggregation” and consists of a coherent and flexible aggregation of carefully categorised, hierarchically organised, and inter-connected activities. It supports flexible process aggregations via a capacity reuse mechanism called Epitome. A model-driven method and a supporting tool environment are integrated into OAP, enabling efficient manipulation and utilisation of OAP models and automatic generation of concrete systems. The process supports, and promotes a focus on, high-level intellectual efforts such as perception of reality, rationalisation of problem situations, and derivation of “soft” desired ends. A proof-of-concept case study is briefly presented in this paper to illustrate the application of the OAP approach in a real-world setting.

Yuanzhi Wang
An Automatic Approach to Aid Process Integration within a Secure Software Processes Family

Defining secure processes is an important means of assuring software security. A wealth of dedicated secure processes has emerged in recent years. These processes are similar to some extent, while differing from one another in detail. Conceptually, they can be regarded as a so-called “Process Family”. In order to integrate practices from different family members, and to improve efficiency and effectiveness compared to using a single process, in this paper we propose an automatic approach to integrating three leading secure processes, namely CLASP, SDL, and Touchpoints. Moreover, we select a module from an e-government project in China and conduct an exploratory experiment to compare our approach with cases in which a single secure process is employed. The empirical results confirm the positive effects of our approach.

Jia-kuan Ma, Ya-sha Wang, Lei Shi, Hong Mei
Engineering Adaptive IT Service Support Processes Using Meta-modeling Technologies

IT service support is a process-oriented practice that strives to manage the efficient supply of IT services with guaranteed quality. Many organizations adopt best practices and tools in order to improve the maturity of their IT service support processes. However, when existing solutions and methodologies are applied in different organizations, the customization effort and cost are usually large. Moreover, dynamic changes during IT service support process execution are not supported by most existing tools. Furthermore, process model integration, reuse, and exchange remain challenges in the IT service support process modeling area. In this context, an IT service support process metamodel is presented in this paper. The metamodel extends a generic business process definition metamodel, BPDM, with domain-specific characteristics. Based on the proposed metamodel, we develop a flexible IT service support process engineering platform, which integrates and automates IT service support processes including incident, problem, change, release, and configuration management. In contrast to other IT service support tools, the platform goes beyond process modeling and enactment: it can evolve the metamodel, flexibly interchange process models with other tools through a metamodel parsing adapter, and support adaptive process management via metamodel-based ECA rules.

Beijun Shen, Xin Huang, Kai Zhou, Wenwei Tang
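To illustrate what a metamodel-based event-condition-action (ECA) rule for adaptive process management could look like in principle, here is a small Python sketch; the incident structure, rule content, and escalation action are invented for illustration and are not taken from the platform described in the paper.

```python
# Illustrative event-condition-action (ECA) rule evaluation for adaptive
# IT service support. Data structures and the escalation rule are invented.

from dataclasses import dataclass
from typing import Callable

@dataclass
class Incident:
    priority: str
    minutes_open: int
    assigned_group: str

@dataclass
class EcaRule:
    event: str
    condition: Callable
    action: Callable

def escalate(incident):
    """Reassign the incident to a higher support tier."""
    incident.assigned_group = "second-level-support"
    print("Escalated incident to", incident.assigned_group)

rules = [
    EcaRule(
        event="timer_tick",
        condition=lambda i: i.priority == "high" and i.minutes_open > 60,
        action=escalate,
    ),
]

def on_event(event_name, incident):
    """Fire every rule whose event matches and whose condition holds."""
    for rule in rules:
        if rule.event == event_name and rule.condition(incident):
            rule.action(incident)

on_event("timer_tick", Incident(priority="high", minutes_open=95, assigned_group="first-level-support"))
```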
Modeling a Resource-Constrained Test-and-Fix Cycle and Test Phase Duration

Software process simulation has been proposed as a means of studying software development processes and answering specific questions about them, such as, “How feasible is a schedule?” This study looked at the question of schedule feasibility for a testing process constrained by availability of test facilities. Simulation models of the testing process were used to identify the significant factors affecting the test duration, estimate a likely completion time for the testing, and provide an empirical basis for re-planning the testing.

Dan Houston, Man Lieu

Process Representation

Visual Patterns in Issue Tracking Data

Software development teams gather valuable data about features and bugs in issue tracking systems. This information can be used to measure and improve the efficiency and effectiveness of the development process. In this paper we present an approach that harnesses the extraordinary capability of the human brain to detect visual patterns. We specify generic visual process patterns that can be found in issue tracking data. With these patterns we can analyze information about effort estimation and the length and sequence of problem resolution activities. In an industrial case study we apply our interactive tool to identify instances of these patterns and discuss our observations. Our approach was validated through extensive discussions with multiple project managers and developers, as well as feedback from the project review board.

Patrick Knab, Martin Pinzger, Harald C. Gall
Disruption-Driven Resource Rescheduling in Software Development Processes

Real-world systems can be thought of as structures of activities that require resources in order to execute. Careful allocation of resources can improve system performance by enabling more efficient resource use. Resource allocation decisions are easier when process flow and estimates of time and resource requirements are statically determinable. But this information is hard to be certain of in disruption-prone systems, where unexpected events can necessitate process changes and make it difficult or impossible to be sure of time and resource requirements. This paper approaches the problems posed by such disruptions with a Time Window based INcremental resource Scheduling method (TWINS). We show how to use TWINS to respond to disruptions by doing reactive rescheduling over a relatively small set of activities. This approach uses a genetic algorithm. It is evaluated by using it to schedule resources dynamically during the simulation of some example software development processes. Results indicate that this dynamic approach produces good results at affordable cost.

Junchao Xiao, Leon J. Osterweil, Qing Wang, Mingshu Li
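The sketch below conveys the general idea of time-window-based incremental rescheduling; the activity data are invented, and a trivial greedy assignment stands in for the paper's genetic algorithm.

```python
# Rough sketch of time-window-based incremental rescheduling. Activity data
# are invented, and greedy assignment replaces the paper's genetic algorithm.

import random

activities = [  # (name, effort in person-days)
    ("implement_feature_A", 5),
    ("fix_defect_17", 2),
    ("write_tests_B", 3),
    ("review_module_C", 1),
]
resources = ["dev1", "dev2"]

def schedule_window(pending, window_size=2):
    """Greedily assign the next `window_size` activities to the least-loaded resource."""
    load = {r: 0 for r in resources}
    plan = []
    for name, effort in pending[:window_size]:
        r = min(load, key=load.get)
        plan.append((name, r))
        load[r] += effort
    return plan, pending[window_size:]

pending = list(activities)
disruptions_left = 2
while pending:
    plan, pending = schedule_window(pending)
    print("window plan:", plan)
    # A disruption (e.g., an urgent defect report) triggers rescheduling of the
    # remaining activities only, instead of re-planning the whole project.
    if disruptions_left and random.random() < 0.3:
        pending.insert(0, ("urgent_defect", 2))
        disruptions_left -= 1
```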
MODAL: A SPEM Extension to Improve Co-design Process Models

Process engineering has been applied to software development for several years, with varying success, thanks to the use of process languages that ease the analysis, improvement, and even execution of software processes. In this paper, we present a SPEM-derived metamodel that takes advantage of Model Based Engineering and innovative approaches to process engineering. Through this work, we try to clarify the definition of process components in order to be able to construct and execute MBE processes “On-Demand”.

Ali Koudri, Joel Champeau

Process Analysis and Measurement

Application of Re-estimation in Re-planning of Software Product Releases

Context: Re-planning of product releases is a very dynamic endeavor, and new research methods or improvements of existing methods are still required. This paper explores the role of re-estimation in the re-planning process of product releases.

Objective: The purpose of this study is to analyze the effects of defect and effort re-estimation in the process of release re-planning. In particular, two questions are answered. Question 1: In the absence of re-estimation, does conducting re-planning have any advantages over not conducting re-planning? Question 2: In the case of re-planning, does conducting re-estimation have any advantages over not conducting re-estimation?

Method: The proposed method H2W-Pred extends the existing H2W re-planning method by accommodating dynamic updates of defect and effort estimates whenever re-planning takes place. Based on the updates, the effort for developing new functionality needs to be re-adjusted and balanced against the additional effort necessary to ensure quality early. The proposed approach is illustrated by case examples with simulated data.

Results: The simulation results show that conducting re-planning yields better release value in terms of functionality than not conducting re-planning. Furthermore, performing re-estimation when re-planning generates a portfolio of solutions that helps balance trade-offs between several aspects of release value, e.g., between functionality and quality.

Conclusion: If the development of a product release requires balancing between potentially conflicting aspects, such as quality vs. functionality, then re-estimation in the re-planning process is beneficial.

Ahmed Al-Emran, Anas Jadallah, Elham Paikari, Dietmar Pfahl, Günther Ruhe
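A toy Python comparison in the spirit of the study's two questions is shown below: it contrasts how much planned functionality survives when defect-fixing effort is, or is not, re-estimated at a re-planning point; the capacity figures, features, and planning heuristic are invented and do not reproduce H2W-Pred.

```python
# Toy comparison of release plans with and without re-estimation of
# defect-fixing effort. All numbers and the heuristic are invented.

def plan_release(capacity, feature_efforts, defect_effort_estimate):
    """Greedily fill the remaining capacity with features after reserving quality effort."""
    remaining = capacity - defect_effort_estimate
    planned = []
    for name, effort in feature_efforts:
        if effort <= remaining:
            planned.append(name)
            remaining -= effort
    return planned

capacity = 100
features = [("F1", 30), ("F2", 25), ("F3", 20), ("F4", 15)]

initial_defect_estimate = 10   # estimate made at the start of the release
revised_defect_estimate = 35   # re-estimate after mid-release defect data arrives

without_reestimation = plan_release(capacity, features, initial_defect_estimate)
with_reestimation = plan_release(capacity, features, revised_defect_estimate)

print("plan without re-estimation:", without_reestimation)  # more features, quality risk
print("plan with re-estimation:   ", with_reestimation)     # fewer features, quality protected
```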
Software Process Model Blueprints

Explicitly defining a software process model is widely recognized as a good software engineering practice. However, having a defined process does not necessarily mean that this process is good, sound, and/or useful. There have been several approaches to software process evaluation, including testing, simulation, and metrics; the first requires software process enactment, i.e., an expensive, risky, and long process, and the others require high expertise for correctly interpreting their meaning. In this paper we propose a visual approach for software process model evaluation based on three architectural view types, each one focusing on basic process elements: Role Blueprint, Task Blueprint, and Work Product Blueprint. They enable visual evaluation of different perspectives of a software process, each being relevant for a particular stakeholder. We illustrate the proposed approach by applying it to the software process defined for a real-world company that develops software for retail, and we show how design errors were identified.

Julio Ariel Hurtado Alegría, Alejandro Lagos, Alexandre Bergel, María Cecilia Bastarrica
Measurement and Analysis of Process Audit: A Case Study

Process audit is a popular quality assurance activity. It helps individual projects stay in conformity with process definitions, standards, etc., and provides insights for process improvement. However, to the best of our knowledge, little quantitative analysis of process audit has been reported so far. This paper introduces a case study that quantitatively analyzes process audit based on the practice of a Chinese software research and development organization, CSRD. It presents a measurement schema for evaluating the effectiveness and efficiency of process audit from various facets. The study shows that the auditing process in CSRD has contributed to process management and improvement and brought value to the organization. The presented measurement and analysis methods and empirical results can provide a reference for other organizations in similar contexts.

Fengdi Shu, Qi Li, Qing Wang, Haopeng Zhang
A Fuzzy-Based Method for Evaluating the Trustworthiness of Software Processes

Software trustworthiness is built in during its development and maintenance processes. An effective software measurement system can help to assess and predict the trustworthiness of end products early in the software lifecycle in order to plan proper corrective actions to meet trustworthiness objectives. However, it is a challenging task to aggregate individual metric data and derive a single, overall trustworthiness indicator, given the hierarchies in a software measurement system as well as the heterogeneity within a mix of metric data. In addition, very little metric data can be collected precisely and without the subjective judgment inherent in data reporters or collection tools. In this paper, based on the theory of fuzzy sets and the AHP method, we propose an evaluation method supporting fuzzy data and multiple value types for evaluating process trustworthiness in the form of user-customized levels. A case study is also presented to illustrate the application of our method.

Haopeng Zhang, Fengdi Shu, Ye Yang, Xu Wang, Qing Wang
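The following sketch shows one plausible way to aggregate fuzzy metric ratings with AHP-style weights into a single trustworthiness indicator; the triangular fuzzy numbers, weights, and defuzzification rule are illustrative assumptions and may differ from the paper's actual formulation.

```python
# Illustrative aggregation of fuzzy metric ratings into a trustworthiness
# indicator. Weights, ratings, and defuzzification rule are assumptions.

def weighted_fuzzy_sum(ratings, weights):
    """Aggregate triangular fuzzy ratings (low, mode, high) with normalized weights."""
    total = sum(weights.values())
    low = sum(weights[m] * r[0] for m, r in ratings.items()) / total
    mode = sum(weights[m] * r[1] for m, r in ratings.items()) / total
    high = sum(weights[m] * r[2] for m, r in ratings.items()) / total
    return (low, mode, high)

def defuzzify(tri):
    """Centroid of a triangular fuzzy number, used as a crisp score."""
    return sum(tri) / 3.0

# AHP-style relative weights of three process metrics (assumed, not derived here).
weights = {"review_coverage": 0.5, "defect_removal": 0.3, "audit_conformance": 0.2}

# Fuzzy ratings on a 0..1 trustworthiness scale.
ratings = {
    "review_coverage": (0.6, 0.7, 0.8),
    "defect_removal": (0.4, 0.5, 0.7),
    "audit_conformance": (0.7, 0.8, 0.9),
}

score = defuzzify(weighted_fuzzy_sum(ratings, weights))
print(f"overall trustworthiness indicator: {score:.2f}")
```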

Process Simulation Modeling

Software Process Simulation Modeling: An Extended Systematic Review

Software Process Simulation Modeling (SPSM) research has increased in the past two decades, especially since the first ProSim Workshop held in 1998. Our research aims to systematically assess how SPSM has evolved during the past 10 years, in particular whether the purposes for SPSM, the simulation paradigms, tools, research topics, and the model scopes and outputs have changed. We performed a systematic literature review of SPSM research in two subsequent stages and identified 156 relevant studies in four categories. This paper reports the review process of the second stage and the preliminary results obtained by aggregating studies from the two stages. Although the bulk of SPSM studies was concentrated in the ProSim/ICSP community, research outside this community showed more diversity in some aspects. We also perceived an immediate need for refining and updating the reasons and the classification scheme for SPSM introduced by Kellner, Madachy and Raffo (KMR).

He Zhang, Barbara Kitchenham, Dietmar Pfahl
SimSWE – A Library of Reusable Components for Software Process Simulation

SimSWE is a library of components for modeling and simulating software engineering processes. It consists of a generic, implementation-independent description of the components and a reference implementation using the MATLAB® / Simulink® environment. By providing ready-to-use building blocks for typical functionality, the library should facilitate and ease the use of simulation to analyze software engineering issues. The goals are to provide a collection of reusable components within an open and evolvable framework to help practitioners develop simulation models faster and to enable researchers to implement complex modeling and simulation tasks. Currently, the library contains more than thirty components. It can be used under the LGPL license. The paper presents the current status of SimSWE to encourage its use and participation within the research community.

Thomas Birkhölzer, Ray Madachy, Dietmar Pfahl, Dan Port, Harry Beitinger, Michael Schuster, Alexey Olkov
Applications of a Generic Work-Test-Rework Component for Software Process Simulation

Software process simulation can be a valuable support for process analysis and improvement, provided the respective model development can focus on the issues at hand without spending effort on modeling everything from scratch. Rather, the modeling groundwork should readily be available as building blocks for (re)use. In the domain of software engineering, the work-test-rework cycle is one of the most important recurring patterns, at the level of individual tasks as well as at the level of process phases or projects in general. Therefore, a generic reusable simulation component that captures and represents this pattern has been realized as part of the SimSWE library. This WorkTestRework component is designed such that it provides comprehensive adaptability and flexibility to be used in different simulation scenarios and process contexts. This paper introduces the component and substantiates this claim by modeling typical, but very different, process scenarios based on this component.

Thomas Birkhölzer, Dietmar Pfahl, Michael Schuster
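For readers unfamiliar with the pattern, the minimal discrete-time loop below illustrates the work-test-rework cycle itself; the rates and the rework rule are invented, and the code is not the SimSWE WorkTestRework component.

```python
# Minimal discrete-time work-test-rework loop illustrating the recurring
# pattern the component captures. Rates and the rework rule are invented.

work_remaining = 100.0   # size units still to be developed
rework_backlog = 0.0     # size units that failed testing
productivity = 10.0      # size units completed per time step
defect_injection = 0.3   # fraction of completed work that fails testing
week = 0

while work_remaining > 0 or rework_backlog > 0.5:
    week += 1
    # Work: spend capacity first on rework, then on new work.
    reworked = min(rework_backlog, productivity)
    new_work = min(work_remaining, productivity - reworked)
    rework_backlog -= reworked
    work_remaining -= new_work
    # Test: a fraction of everything produced this step comes back as rework.
    rework_backlog += defect_injection * (reworked + new_work)
    print(f"week {week:2d}: remaining={work_remaining:5.1f} rework={rework_backlog:5.1f}")
```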

Experience Reports and Empirical Studies

An Empirical Study of Lead-Times in Incremental and Agile Software Development

Short lead-times are essential in order to gain a first-mover advantage and to be able to react to changes in a fast-paced market. Agile software development is a development paradigm that aims at being able to respond quickly to changes in customer needs. So far, to the best of our knowledge, no empirical study has investigated lead-times with regard to different aspects (distribution between phases, differences in lead-time with regard to architecture dependencies, and size). However, in order to improve lead-times it is important to understand their behavior. In this study the lead-times of a large-scale company employing incremental and agile practices are analyzed. The analysis focuses on 12 systems at Ericsson AB, Sweden.

Kai Petersen
Improving the ROI of Software Quality Assurance Activities: An Empirical Study

Review, process audit, and testing are three main Quality Assurance (QA) activities in the software development life cycle. They complement each other in examining work products for defects and improvement opportunities to the largest extent. Understanding the effort distribution and inter-correlation among them will facilitate project planning in software organizations, improve software quality within budget and schedule, and enable continuous process improvement. This paper reports some empirical findings on the effort distribution pattern of the three types of QA activities from a series of incremental projects in China. The results of the study offer implications for how to identify which type of QA activity is insufficient while others might be overdone, how to balance effort allocation and planning for future projects, how to improve the weak parts of each QA activity, and ultimately how to improve the Return On Investment (ROI) of QA activities and overall process effectiveness in a specific organizational context.

Qi Li, Fengdi Shu, Barry Boehm, Qing Wang
Benchmarking the Customer Configuration Updating Process of the International Product Software Industry

Product software vendors have trouble defining their customer configuration updating process; release, delivery, and deployment are generally underrated, and thus less attention is paid to them. This paper presents an international benchmark survey providing statistical evidence that product software vendors who invest in the customer configuration updating process perceive themselves as more successful than those who do not. Furthermore, the benchmark survey provides a vivid picture of the customer configuration updating practices of product software vendors and shows that customer configuration updating is an underdeveloped process.

Slinger Jansen, Wouter Buts, Sjaak Brinkkemper, André van der Hoek
Backmatter
Metadata
Title
New Modeling Concepts for Today’s Software Processes
Edited by
Jürgen Münch
Ye Yang
Wilhelm Schäfer
Copyright year
2010
Publisher
Springer Berlin Heidelberg
Electronic ISBN
978-3-642-14347-2
Print ISBN
978-3-642-14346-5
DOI
https://doi.org/10.1007/978-3-642-14347-2