
2011 | Book

Product-Focused Software Process Improvement

12th International Conference, PROFES 2011, Torre Canne, Italy, June 20-22, 2011. Proceedings

Edited by: Danilo Caivano, Markku Oivo, Maria Teresa Baldassarre, Giuseppe Visaggio

Publisher: Springer Berlin Heidelberg

Book series: Lecture Notes in Computer Science


About this book

This book constitutes the refereed proceedings of the 12th International Conference on Product-Focused Software Process Improvement, PROFES 2011, held in Torre Canne, Italy, in June 2011. The 24 revised full papers presented together with the abstracts of 2 keynote addresses were carefully reviewed and selected from 54 submissions. The papers are organized in topical sections on agile and lean practices, cross-model quality improvement, global and competitive software development, managing diversity, product and process measurements, product-focused software process improvement, requirement process improvement, and software process improvement.

Table of Contents

Frontmatter

Keynote Addresses

The Impact of Emerging Software Paradigms on Software Quality and User Expectations
Abstract
This talk will discuss emerging approaches to software development and evolution that enable organizations to respond quickly to new business needs while maintaining their legacy applications. These approaches include:
  • Service oriented architecture (SOA) which is a way of designing, developing, deploying, and managing systems where coarse-grained services represent reusable functionality, and service consumers compose applications or systems using the functionality provided by these services through standard interfaces. This approach enables the flexible composition and evolution of new services. Major barriers include lack of a long term strategy, lack of effective governance, unrealistic expectations, and inappropriate technical strategy.
  • Cloud computing, which is “a large-scale distributed computing paradigm that is driven by economies of scale, in which a pool of abstracted, virtualized, dynamically-scalable, managed computing power, storage, platforms, and services are delivered on demand to external customers over the Internet” [1]. This approach enables consumer organizations to have access to state-of-the-practice hardware and software without making many of the upfront infrastructure investments. The main drivers for cloud computing adoption include scalability, elasticity, virtualization, cost, mobility, collaboration, and risk reduction. Major barriers include security, interoperability, control, and performance.
  • Enterprise architecture planning and development which develops a comprehensive plan for using business functionality across an enterprise and building applications from shared resources. Because it takes a perspective that crosses an entire enterprise, it enables the breaking down of barriers that take place within individual organizations. Major barriers include lack of long term commitment, and a focus on completeness rather than practicality.
The combination of these three approaches offers potential for both success and failure. They can enable rapid responses to new business situations through the shared use of common resources, as well as the discipline of a common plan for implementing enterprise-wide priorities. However, the use of these approaches has often been over-hyped, resulting in failure. This talk uses lessons learned from current adoption efforts to identify the core concepts of these approaches and their potential, as well as common misconceptions and risks.
Dennis B. Smith
Acquiring Information from Diverse Industrial Processes Using Visual Analytics
Abstract
Visual analytics is an emerging technology that provides another technique to derive better information from data. As a technology, visual analytics is a new approach that complements more traditional methods in business intelligence, information visualization, and data mining. The basic concept is to provide highly interactive visual techniques that let people explore multiple heterogeneous data sets simultaneously. The key aspect is interactivity, which lets people follow paths to seek answers to questions normally unasked. The visual aspect allows people to detect the expected and discover the unexpected.
David J. Kasik

Agile and Lean Practices

Monitoring Bottlenecks in Agile and Lean Software Development Projects – A Method and Its Industrial Use
Abstract
In the face of growing competition, software projects have to deliver software products faster and with better quality – leaving little room for unnecessary activities or non-optimal capacity. To achieve the desired high speed and optimal capacity, bottlenecks existing in the projects have to be monitored and effectively removed. The objective of this research is to show experiences from a mature software development organization working according to Lean and Agile software development principles. By conducting a formal case study at Ericsson, we were able to elicit and automate the measures required to monitor bottlenecks in the software development workflow, evaluated in one of the projects. The project developed software for one of the telecom products and involved over 80 developers. The results of the case study include a measurement system with a number of measures/indicators which can indicate the existence of bottlenecks in the flow of work in the project, and a number of good practices that help other organizations start monitoring bottlenecks effectively – in particular, what to focus on when designing such a measurement system.
Miroslaw Staron, Wilhelm Meding
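The concrete measures in the paper's measurement system are not listed in the abstract; as a purely hypothetical illustration of flow-based bottleneck monitoring, one simple indicator flags any phase whose work-item queue grows over a measurement window (all phase names and numbers below are invented):

```python
# Hypothetical bottleneck indicator in the spirit of flow monitoring:
# for each phase, compare items flowing in vs. items flowing out over a
# measurement window; a phase whose queue grows is flagged as a candidate
# bottleneck. This is an illustrative sketch, not the paper's measures.

def bottleneck_phases(flow, min_growth=1):
    """flow maps phase name -> (items_in, items_out) for the window."""
    return [phase for phase, (arrived, done) in flow.items()
            if arrived - done >= min_growth]

# One week of (invented) flow data for a four-phase workflow:
weekly_flow = {
    "design": (12, 12),
    "implementation": (12, 12),
    "test": (12, 6),       # queue grows by 6: the likely bottleneck
    "release": (6, 6),
}
print(bottleneck_phases(weekly_flow))  # -> ['test']
```

In practice such indicators would be fed automatically from the project's work-tracking tools rather than entered by hand.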
Applying Agile and Lean Practices in a Software Development Project into a CMMI Organization
Abstract
This paper presents an approach based on practical experience in applying agile and lean practices to a software development process performed in an organization appraised at CMMI level 5. As a result of a theoretical review of agile and lean practices, and of the organization’s needs, an integrated proposal combining these practices with CMMI was devised and put into practice. The work carried out by the organization using this proposal led to a successful integration experience that served to innovate, improve product quality, satisfy clients and, most importantly, show the feasibility of CMMI and agile practices coexisting, resulting in a significant improvement for the organization.
Miguel Morales Trujillo, Hanna Oktaba, Francisco J. Pino, María J. Orozco
Adopting Agile Practices in Teams with No Direct Programming Responsibility – A Case Study
Abstract
To respond to the need for flexibility in reacting to customer needs, agile practices have been introduced in software development organizations. To gain the full benefit of agile, it has been proposed that agile and lean practices be adopted organization-wide. Agile practices are mainly used by programmers, but in large companies there can be work roles and tasks which do not directly include programming of the product but which support software development at the system level, such as system-level testing and test environment maintenance. In this study, the goal was to provide information on the progress of agile transformation in teams with no direct programming responsibility in a large-scale, distributed software development organization. A survey was conducted during the first year of agile adoption to collect data on the utilisation of agile practices within three different teams: the developers, the system-level testers, and people in test laboratory support. The results show that certain agile practices were adopted in the teams with no direct programming responsibility, and that there were differences between the teams.
Kirsi Korhonen

Cross-Model Quality Improvement

Proposing an ISO/IEC 15504-2 Compliant Method for Process Capability/Maturity Models Customization
Abstract
The customization of software process capability/maturity models (SPCMMs) to specific domains/sectors or development methodologies represents one of the most discussed and applied trends in ICT organizations. Nonetheless, little research appears to have been performed on how theoretically sound and widely accepted SPCMMs can be developed to a high level of quality. The aim of this paper is therefore to elicit the state of the art regarding the processes adopted to develop such models and to propose a systematic approach to support the customization of SPCMMs. The approach is developed on the basis of ISO/IEEE standard development processes, integrating Knowledge Engineering techniques and experience of how such models are currently developed in practice. Initial feedback from an expert panel indicates the usefulness and adequacy of the proposed method.
Jean Carlo Rossa Hauck, Christiane Gresse von Wangenheim, Fergal Mc Caffery, Luigi Buglione
Homogenization, Comparison and Integration: A Harmonizing Strategy for the Unification of Multi-models in the Banking Sector
Abstract
Information Technologies (IT) play a crucial role in the development of business processes in organizations. Acquiring the best technologies is quickly becoming as important as understanding and improving the business model of organizations. As a result, many (inter)national standards and models for IT Management, IT Governance and IT Security have been developed. This situation allows organizations to choose and improve their processes, selecting the models that best suit their needs. Since several relationships between these models can be found, harmonizing their similarities and differences will make it possible to reduce the time and effort involved in implementing them. In this paper, we present a harmonization strategy which has been defined to harmonize COBIT 4.1, Basel II, VAL IT, RISK IT, ISO 27002 and ITIL V3. This work intends to support organizations which are interested in knowing how to carry out the harmonization of these models. Furthermore, as a result of executing the harmonization strategy we have defined, a unified model for banking, called ITGSM, is presented. It resolves the conflicts between the models mentioned above and provides a useful reference model to organizations that are planning to adopt them.
César Pardo, Francisco J. Pino, Félix García, Mario Piattini, Maria Teresa Baldassarre, Sandra Lemus
Supporting Audits and Assessments in Multi-model Environments
Abstract
Software development organizations are adopting multiple improvement technologies to guide their improvement efforts. A recent trend is the simultaneous adoption of CMMI and ISO models in a single environment, giving rise to multi-model process solutions. Some of these models address similar areas of concern and share similar quality goals. Reusing organizationally implemented practices is an opportunity to establish compliance with multiple models and reduce implementation costs.
Audits and assessments can take advantage of practice reuse if information characterizing the similarities between the quality goals of different models is maintained. This paper proposes a conceptual model for managing quality-goal information in support of multi-model audits and assessments. An example is described of applying the proposed model to support the generation of data collection checklists for performing a multi-model audit process in a single effort. The conceptual model is being applied in a Portuguese software house with a multi-model process solution compliant with several improvement technologies.
Andre L. Ferreira, Ricardo J. Machado, Mark C. Paulk

Global and Competitive Software Development

Scrum Practices in Global Software Development: A Research Framework
Abstract
Project stakeholder distribution in Global Software Development (GSD) is characterized by temporal, geographical and socio-cultural distance, which creates challenges for communication, coordination and control. Practitioners constantly seek strategies, practices and tools to counter the challenges of GSD. There is increasing interest in using Scrum in GSD even though it originally assumed collocation. However, empirically, little is known about how Scrum practices respond to the challenges of GSD. This paper develops a research framework from the literature as a basis for future research and practice. The framework maps current knowledge and views on how Scrum practices can be used to mitigate commonly recognized challenges in GSD. This research is useful as a reference guide for practitioners who are seeking to understand how Scrum practices can be used effectively in GSD, and for researchers as a research framework to validate and extend current knowledge.
Emam Hossain, Paul L. Bannerman, D. Ross Jeffery
Towards the Competitive Software Development
Abstract
The concept of competitive software development is founded on the observation that the system’s owner and development companies have not only common interests, but conflicting interests as well. This is particularly true in the case of large-scale software systems. Competitive development is an answer to the syndrome of large-scale software systems evolution and maintenance being monopolised by the companies that originally developed these systems. Competitive development is founded on the idea that the entire system is divided into smaller units, which are independently developed by different companies, i.e. no co-operation is assumed between the various development organisations and there is no communication between them. However, strong and efficient co-ordination has to be performed on behalf of the system’s owner in order to make such an approach successful. These assumptions are radically different from the typical collaboration assumption of agile development. This prevents one software company from making a system owner entirely dependent on its services. We show that such demonopolisation can save large sums of money, making the prices of software development considerably lower than they would be in the case of a single software development company. Our experiments show that, if efficiently co-ordinated, such distributed, competitive development requires a similar effort to traditional approaches.
Andrzej Zalewski, Szymon Kijas
Defect Detection Effectiveness and Product Quality in Global Software Development
Abstract
Global software development (GSD) has become a common practice in the software development industry. The main challenge organizations have to overcome is to minimize the effect of organizational diversity on the effectiveness of their GSD collaboration. The objective of this study is to understand the differences in defect detection effectiveness among different organizations involved in the same GSD project, and how these differences, if any, are reflected in the delivered product quality. The case study was undertaken in a GSD project at Ericsson involving nine organizations jointly developing a software product for telecommunication exchanges. Comparing the effectiveness of defect detection on a sample of 216 software units developed by the nine organizations, it turns out that there are statistically significant differences in defect detection effectiveness among organizations. Moreover, defect density serves better as a measure of defect detection effectiveness than as a measure of product quality.
Tihana Galinac Grbac, Darko Huljenić

Managing Diversity

Managing Process Diversity by Applying Rationale Management in Variant Rich Processes
Abstract
Process diversity arises as software processes are influenced by organization, project and other contextual factors. Managing this diversity consists of considering how these factors actually modify the process. Variant rich processes offer support for process tailoring, but they do not currently link these changes with the business factors motivating them. The lack of decision traceability signifies that variant rich processes are not suitable for addressing process diversity. This article aims to fill this gap by applying rationale management to supporting decision-making when tailoring processes. Rationale management has become one of the main assets in variant rich process tailoring, since it handles how context-related factors are transformed into real variations in the tailoring process, as a consequence of well-reasoned and traceable steps. An application study shows how rationale provides useful mechanisms with which to tailor a process according to its context of enactment.
Tomás Martínez-Ruiz, Félix García, Mario Piattini
Usage of Open Source in Commercial Software Product Development – Findings from a Focus Group Meeting
Abstract
Open source components can be used as one type of software component in the development of commercial software. In development using this type of component, potential open source components must first be identified, then specific components must be selected, and the selected components may need to be adapted before they are included in the developed product. A company using open source components must also decide how to participate in the open source projects from which it uses software. These steps have been investigated in a focus group meeting with representatives from industry. Findings, in the form of recommendations to engineers in the field, are summarized for all the mentioned phases. The findings have been compared to published literature, and no major differences or conflicting facts have been found.
Martin Höst, Alma Oručević-Alagić, Per Runeson
Identifying and Tackling Diversity of Management and Administration of a Handover Process
Abstract
Software handover is a de facto process in all software organizations. It is one of the most business-critical and complex processes. It is also one of the most diverse processes, and thereby one of the most difficult processes to define. Despite this, software handover is not well recognized within academia. Right now, there are no software handover process models whatsoever, although software organizations desperately need guidelines for how to perform this important and critical task. To aid them in defining their handover process models, we are in the process of creating the Evolution and Maintenance Management Model (EM³): Software Handover, focusing on the handover (alias transition) of a software system from developer to maintainer. In this paper, we evaluate one of the EM³ components, Management and Administration (MA), addressing activities for planning and controlling the transition process, its schedule, budget and resources. We do this within 29 organizations. Our primary goal is to find out whether the component is realistic and whether it meets the needs and requirements of the software industry today. Using the feedback from industry, we tackle process diversity using the Context-Driven Process Orchestration Method (CoDPOM).
Ahmad Salman Khan, Mira Kajko-Mattsson

Product and Process Measurements

A Process Complexity-Product Quality (PCPQ) Model Based on Process Fragment with Workflow Management Tables
Abstract
In software development projects, large gaps exist between the planned development process and the actual development. A planned process is often gradually transformed into a complicated process comprising a base process and many process fragments, i.e., additional, piecemeal processes added over the course of a project. We therefore propose a metric of process complexity based on process fragments. The process complexity depends on three elements: the number of groups of developers, the number of simultaneous processes, and the ratio of a fragment’s execution period to the period of the whole project. The metric was applied to six industrial projects. As a result, the changes in process complexity in the six projects were clarified. In addition, we propose a procedure for building a PCPQ (Process Complexity-Product Quality) model that can predict post-release product quality during a project. Building a PCPQ model from the six projects showed that post-release product quality could be predicted.
Masaki Obana, Noriko Hanakawa, Hajimu Iida
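The three elements named above can be combined in many ways; as a purely illustrative assumption (the paper's actual formula is not reproduced here), the sketch below scores each fragment by the product of its developer groups, simultaneous processes, and execution-period ratio, and sums over all fragments:

```python
# Hypothetical fragment-based process complexity score. As an assumption
# for illustration only, each process fragment contributes the product of
# the three elements named in the abstract: developer groups involved,
# simultaneous processes, and the fraction of the whole project period
# during which the fragment runs.

def fragment_complexity(groups, simultaneous, fragment_days, project_days):
    return groups * simultaneous * (fragment_days / project_days)

def process_complexity(fragments, project_days):
    """fragments: list of (groups, simultaneous_processes, fragment_days)."""
    return sum(fragment_complexity(g, s, d, project_days)
               for g, s, d in fragments)

# A base process plus two piecemeal fragments added mid-project:
fragments = [(3, 1, 120), (2, 2, 30), (1, 3, 10)]
print(process_complexity(fragments, project_days=120))  # -> 4.25
```

Under such a scheme, a project whose plan stays intact keeps a low score, while each fragment bolted on mid-project pushes the score up in proportion to how many people and parallel processes it entangles.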
Using Web Objects for Development Effort Estimation of Web Applications: A Replicated Study
Abstract
The spreading of Web applications has motivated the definition of size measures suitable for such kind of software systems. Among the proposals existing in the literature, Web Objects were conceived by Reifer specifically for Web applications as an extension of Function Points. In this paper we report on an empirical analysis we performed exploiting 25 Web applications developed by an Italian software company. The results confirm the ones obtained in a previous study and extend them in several aspects, showing the robustness of the measure with respect to the size and technologies of the applications, and to the employed estimation techniques.
Sergio Di Martino, Filomena Ferrucci, Carmine Gravino, Federica Sarro
Applying EFFORT for Evaluating CRM Open Source Systems
Abstract
Free/Libre Open Source Software (FLOSS) for Customer Relationship Management (CRM) represents a great opportunity for small and medium enterprises, as it can significantly impact their competitiveness. These software solutions can be adopted with success whatever the enterprise’s size, and offer customized solutions for clients, even with few people working on them. This paper presents an analysis of the quality of CRM FLOSS systems using a predefined quality model. The analysis focuses on the most relevant aspects of both the open source domain and the application domain, proposing a trade-off assessment among them.
Lerina Aversano, Maria Tortorella

Product-Focused Software Process Improvement

A Factorial Experimental Evaluation of Automated Test Input Generation – Java Platform Testing in Embedded Devices
Abstract
Background. When delivering an embedded product, such as a mobile phone, third-party products, like games, are often bundled with it in the form of Java MIDlets. Verifying the compatibility between the runtime platform and the MIDlets is a labour-intensive task if input data must be manually generated for thousands of MIDlets. Aim. In order to make the verification more efficient, we investigate four automated input generation methods which do not require extensive modeling: random and feedback-based, each with and without a constant startup sequence. Method. We evaluate the methods in a factorial design experiment with manual input generation as a reference. One original experiment is run, together with a partial replication. Result. The results show that the startup sequence gives good code coverage values for the selected MIDlets. The feedback method gives somewhat better code coverage than the random method, but requires real-time code coverage measurements, which decrease the run speed of the tests. Conclusion. The random method with a startup sequence is the best trade-off in the current setting.
Per Runeson, Per Heed, Alexander Westrup
Automating and Evaluating Probabilistic Cause-Effect Diagrams to Improve Defect Causal Analysis
Abstract
Defect causal analysis (DCA) has shown itself to be an efficient means of obtaining product-focused software process improvement. A DCA approach, called DPPI, was assembled based on guidance acquired through systematic reviews and feedback from experts in the field. To our knowledge, DPPI represents an innovative approach integrating cause-effect learning mechanisms (Bayesian networks) into DCA meetings, by using probabilistic cause-effect diagrams. The experience of applying DPPI to a real Web-based software project showed its feasibility and provided insights into the requirements for tool support. Moreover, it was possible to observe that DPPI’s Bayesian diagnostic inference predicted the main defect causes efficiently, motivating further investigation. This paper describes (i) the framework built to support the application of DPPI and automate the generation of the probabilistic cause-effect diagrams, and (ii) the results of an experimental study aimed at investigating the benefits of using DPPI’s probabilistic cause-effect diagrams during DCA meetings.
Marcos Kalinowski, Emilia Mendes, Guilherme H. Travassos
A Genetic Algorithm to Configure Support Vector Machines for Predicting Fault-Prone Components
Abstract
In some studies, Support Vector Machines (SVMs) have turned out to be promising for predicting fault-prone software components. Nevertheless, the performance of the method depends on the setting of some parameters. To address this issue, we propose the use of a Genetic Algorithm (GA) to search for a configuration of the SVM parameters that allows us to obtain optimal prediction performance. The approach has been assessed by carrying out an empirical analysis based on jEdit data from the PROMISE repository. We analyzed both the inter- and the intra-release performance of the proposed method. As benchmarks we exploited SVMs with grid search and several other machine learning techniques. The results show that the proposed approach allows us to improve performance, increasing the Recall measure without worsening Precision. This behavior was especially remarkable in the inter-release use with respect to the other prediction techniques.
Sergio Di Martino, Filomena Ferrucci, Carmine Gravino, Federica Sarro
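SVM parameter search of this kind typically tunes the penalty C and the kernel width gamma. The sketch below shows only a generic GA loop over two such real-valued parameters; a synthetic stand-in fitness replaces the paper's actual cross-validated SVM evaluation, so all ranges, rates and values are illustrative assumptions:

```python
# Sketch of a GA parameter search only. The real fitness would train an
# SVM and score Recall/Precision; here a synthetic stand-in is used that
# peaks near C=10, gamma=0.1. All constants below are assumptions.
import random

random.seed(0)  # reproducible run

def fitness(c, gamma):
    # Stand-in for cross-validated SVM performance at (C, gamma).
    return -((c - 10) ** 2 + (gamma - 0.1) ** 2 * 100)

def evolve(pop_size=20, generations=30):
    # Random initial population of (C, gamma) pairs.
    pop = [(random.uniform(0.1, 100), random.uniform(0.001, 1))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: fitness(*ind), reverse=True)
        parents = pop[: pop_size // 2]              # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)   # crossover
            if random.random() < 0.2:                        # mutation
                child = (child[0] * random.uniform(0.8, 1.2),
                         child[1] * random.uniform(0.8, 1.2))
            children.append(child)
        pop = parents + children
    return max(pop, key=lambda ind: fitness(*ind))

best_c, best_gamma = evolve()
print(best_c, best_gamma)
```

The appeal of a GA over plain grid search, as the abstract suggests, is that it spends evaluations near promising regions instead of sampling the parameter space uniformly.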

Requirement Process Improvement

A Systematic Approach to Requirements Engineering Process Improvement in Small and Medium Enterprises: An Exploratory Study
Abstract
Requirements Engineering (RE) studies have demonstrated that requirements errors affect the quality of the software developed, making software requirements critical determinants of software quality. Requirements Engineering Process Improvement (REPI) models have been provided by different authors to improve the RE process. However, little success has been achieved in small and medium enterprise (SME) software companies, especially in transitional countries such as Uganda. This paper reports on an exploratory study which provides insights into current RE practices in four Ugandan SME software companies, and into the critical success factors and challenges that impede REPI. As a result, a systematic approach to REPI has been designed following the design science approach. It provides guidelines and steps for SMEs in improving their RE processes.
Edward Kabaale, Josephine Nabukenya
Understanding the Dynamics of Requirements Process Improvement: A New Approach
Abstract
Many software development organizations invest heavily in requirements engineering process programmes, and with good reason. They fail, however, to realize a healthy return on investment.
This paper explores factors that influence requirements process improvement (RPI), with the aim of explaining how the attributes of the underpinning process affect both the quality and the associated costs of the requirements specification delivered to the customer. Although several tools and techniques have been proposed and used for RPI, many lack a systematic approach or fail to provide RPI teams with the understanding required to assess their effectiveness.
The authors contend that the quality-cost RPI descriptive model they have developed provides a generic framework, discipline and language for an effective approach to RPI. This descriptive model allows a systematic enquiry that yields explanations and provides RPI stakeholders with a common decision-making framework. The descriptive model was validated by practicing process improvement consultants and managers, and makes a contribution towards the understanding of the quality-cost dynamics of RPI. To address the acknowledged deficiencies of RPI, the authors further suggest a generic RPI model and approach that integrates statistical process control (SPC) into system dynamics (SD). The approach enables RPI teams to steer toward cost-effective and successful RPI.
Aminah Zawedde, Martijn Klabbers, Ddembe Williams, Mark van den Brand
Precise vs. Ultra-Light Activity Diagrams - An Experimental Assessment in the Context of Business Process Modelling
Abstract
UML activity diagrams are a commonly used notation for modelling business processes in the fields of both workflow automation and requirements engineering. In this paper, we present a novel precise style for this notation. The effectiveness of this style has been investigated in the context of business process modelling through a controlled experiment conducted with master’s students in Computer Science at the Free University of Bolzano-Bozen. The results indicate that the subjects achieved a significantly better comprehension level when business processes were modelled using the precise style rather than a “lighter” variant, with no significant impact on the effort required to accomplish the tasks.
Francesco Di Cerbo, Gabriella Dodero, Gianna Reggio, Filippo Ricca, Giuseppe Scanniello

Software Process Improvement

If the SOK Fits, Wear It: Pragmatic Process Improvement through Software Operation Knowledge
Abstract
Knowledge of in-the-field software operation is nowadays acquired by many software-producing organizations. Vendors are effective in acquiring large amounts of valuable software operation data to improve the quality of their software products. For many vendors, however, it remains unclear how their actual product software processes can be advanced through the structural integration of such information. In this paper, we present a template method for integrating software operation information with product software processes, and present four lessons learned, identified through a ten-month canonical action research study during which the method was instantiated at a European software vendor. Results show that the template method contributes to a significant increase in software quality, through pragmatic but measurable improvement of software processes, without adhering to strict requirements from cumbersome maturity models or process improvement frameworks.
Henk van der Schuur, Slinger Jansen, Sjaak Brinkkemper
Critical Issues on Test-Driven Development
Abstract
During the last decade, Test-Driven Development (TDD) has been actively discussed in the software engineering community. It has been regarded as a useful and beneficial software development practice both in industry and in academia. After a decade of active research, there is still very little critical discussion of TDD in the literature. This paper is based on a literature review and focuses on identifying and introducing critical viewpoints on TDD. First, the current evidence on TDD’s benefits is still weak and involves several issues. Second, the paper presents a number of other possible issues and challenges with TDD that are referred to in the literature. Finally, based on the findings, a list of concrete research questions for future research is presented.
Sami Kollanus
On the Difficulty of Computing the Truck Factor
Abstract
In spite of its potential relevance for managers, and even though the Truck Factor definition has been well known in the “agile world” for many years, shared and validated measurements, algorithms, tools, thresholds and empirical studies on this topic are still lacking.
In this paper, we explore the situation by implementing the only approach proposed in the literature that is able to compute the Truck Factor. Then, using our tool, we conduct an exploratory study with 37 open source projects to discover limitations and drawbacks that could prevent its usage.
Lessons learnt from the execution of the exploratory study and open issues are drawn at the end of this work. The most important lesson we have learnt is that more research is needed to render the notion of Truck Factor operative and usable.
Filippo Ricca, Alessandro Marchetto, Marco Torchiano
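The abstract does not spell out the implemented algorithm; a common heuristic definition (assumed here purely for illustration, not the paper's exact approach) takes the Truck Factor as the smallest number of top file owners whose departure would orphan more than half of the files:

```python
# Illustrative Truck Factor heuristic (an assumption, not the paper's exact
# algorithm): each file is owned by its main author, and the Truck Factor is
# the smallest number of authors whose removal leaves more than half of the
# files without an owner.
from collections import Counter

def truck_factor(file_owner, threshold=0.5):
    """file_owner maps file path -> main author."""
    counts = Counter(file_owner.values())      # files owned per author
    lost, tf = 0, 0
    for _, owned in counts.most_common():      # remove heaviest owners first
        lost += owned
        tf += 1
        if lost / len(file_owner) > threshold:  # majority of files orphaned
            return tf
    return tf

# Invented ownership map for a five-file project:
ownership = {
    "core.py": "alice", "ui.py": "alice", "db.py": "alice",
    "api.py": "bob", "cli.py": "carol",
}
print(truck_factor(ownership))  # alice alone owns 3/5 of the files -> 1
```

Real implementations must first decide who "owns" each file (e.g. from version-control authorship), which is exactly where much of the difficulty discussed in the paper lies.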
Backmatter
Metadata
Title
Product-Focused Software Process Improvement
Edited by
Danilo Caivano
Markku Oivo
Maria Teresa Baldassarre
Giuseppe Visaggio
Copyright year
2011
Publisher
Springer Berlin Heidelberg
Electronic ISBN
978-3-642-21843-9
Print ISBN
978-3-642-21842-2
DOI
https://doi.org/10.1007/978-3-642-21843-9