
About this Book

This book constitutes the refereed proceedings of the scientific track of the 8th Software Quality Days Conference, SWQD 2016, held in Vienna, Austria, in January 2016.

The SWQD conference offers a range of comprehensive and valuable information by presenting new ideas from the latest research papers, keynote speeches by renowned academics and industry leaders, professional lectures, exhibits, and tutorials.

The five scientific full papers accepted for SWQD were each peer reviewed by three or more reviewers and selected from 13 high-quality submissions. Nine short papers were also presented and are included in this book, as is one keynote paper by Scott Ambler and Mark Lines.





The Disciplined Agile Process Decision Framework

The Disciplined Agile 2.0 process decision framework [1] provides lightweight guidance to help organizations streamline their information technology (IT) processes in a context-sensitive manner. It does this by showing how various activities such as solution delivery, operations, enterprise architecture, portfolio management, and many others work together as a cohesive whole. The framework also describes what these activities should address, provides a range of options for doing so, and describes the tradeoffs associated with each option. Every person, every team, and every organization is unique; therefore, process frameworks must provide choices, not prescribe answers.

Scott W. Ambler, Mark Lines

Software Engineering Processes and Process Modelling


How Scrum Tools May Change Your Agile Software Development Approach

A major problem for distributed Scrum teams is proper communication between the involved parties to ensure the quality of the final product. This is especially true for coordination issues such as sharing requirements, time schedules, to-dos and code artefacts. ScrumMasters therefore frequently complain that software tools do not suit their daily needs when supporting agile teamwork, which often leads to no Scrum tool being used at all. In this paper we describe extensive interviews with selected ScrumMasters in which they explained their current tools and the gap between those tools and their real needs, and defined the features and aspects they actually require. After collecting these requirements, we extracted the most wanted features, prioritised them and forged them into an Open Source tool called Scrumpy, a first solution that focuses on the agile philosophy of Scrum and the elements ScrumMasters need most. Features of Scrumpy include, for example, web-based access to the task board with real-time updates, advanced dashboard visualisation techniques and a sophisticated chat system that enables effortless communication for distributed teams. Although we already have first anecdotal feedback from users, we plan to improve the tool by adding more commodity features, performing additional mobile usability tests and systematically evaluating Scrumpy with a large number of end users.

Matthias Eckhart, Johannes Feiner

Towards Business Process Execution Adequacy Criteria

Monitoring of business process execution has been proposed for evaluating business process performance. An important aspect in assessing the thoroughness of business process execution is to monitor whether some entities have not been observed for some time and to check in a timely manner whether something is going wrong. In this paper we propose business process execution adequacy criteria and provide a proof-of-concept monitoring framework for their assessment. Similar to test adequacy, the purpose of our approach is to identify the main entities of the business process that are covered during its execution and to raise a warning if some entities are not covered. We provide a first assessment of the proposed approach on a case study in the learning context.

Antonia Bertolino, Antonello Calabrò, Francesca Lonetti, Eda Marchetti
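The adequacy idea above parallels test coverage: track which entities of the process model have been exercised and flag those unseen for too long. A minimal sketch in Python (the entity names, timestamps and staleness threshold are illustrative assumptions, not taken from the paper):

```python
from datetime import datetime, timedelta

def execution_coverage(all_entities, observed):
    """Fraction of business-process entities observed during execution."""
    covered = set(observed) & set(all_entities)
    return len(covered) / len(all_entities)

def stale_entities(last_seen, now, max_age):
    """Entities not observed within max_age -- candidates for a warning."""
    return sorted(e for e, t in last_seen.items() if now - t > max_age)

# Hypothetical process with four entities (tasks/branches of the model)
entities = ["submit", "review", "approve", "reject"]
now = datetime(2016, 1, 18, 12, 0)
last_seen = {
    "submit": now - timedelta(minutes=5),
    "review": now - timedelta(minutes=10),
    "approve": now - timedelta(hours=3),   # not seen for a while
}

print(execution_coverage(entities, last_seen))          # 0.75 -- "reject" never observed
print(stale_entities(last_seen, now, timedelta(hours=1)))  # ['approve']
```

A monitoring framework would update `last_seen` from execution events and evaluate both functions continuously rather than once.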

An Experience on Applying Process Mining Techniques to the Tuscan Port Community System

[Context & Motivation] Business Process Management is an important and widely adopted approach for modelling process specifications and developing an executable framework for managing the process itself. In particular, the monitoring facilities associated with online process execution provide an important means for controlling process evolution and quality. In this context, this paper reports an experience in applying business process modelling and process mining techniques to TPCS, the Tuscan Port Community System. TPCS is a web-services-based platform with multilevel access control and data recovery facilities, developed to support and strengthen the Motorways of the Sea and Italian regulations. The paper describes a storytelling approach applied to derive the TPCS business process model and the conformance checking techniques used to validate it and improve overall TPCS software quality.

Giorgio O. Spagnolo, Eda Marchetti, Alessandro Coco, Paolo Scarpellini, Antonella Querci, Fabrizio Fabbrini, Stefania Gnesi

Requirements Engineering


Preventing Incomplete/Hidden Requirements: Reflections on Survey Data from Austria and Brazil

[Context] Many software projects fail due to problems in requirements engineering (RE). [Goal] The goal of this paper is to analyze a specific and relevant RE problem in detail: incomplete/hidden requirements. [Method] We replicated a global family of RE surveys with representatives of software organizations in Austria and Brazil. We used the data to (a) characterize the criticality of the selected RE problem and (b) analyze the reported main causes and mitigation actions. Based on this analysis, we discuss how to prevent the problem. [Results] The survey covers 14 organizations in Austria and 74 in Brazil, including small, medium and large companies using both plan-driven and agile development processes. Respondents from both countries cited incomplete/hidden requirements as one of the most critical RE problems. We identified and graphically represented the main causes, documented solution options to address these causes, and compiled a list of reported mitigation actions. [Conclusions] From a practical point of view, this paper provides further insights into common causes of incomplete/hidden requirements and into how to prevent this problem.

Marcos Kalinowski, Michael Felderer, Tayana Conte, Rodrigo Spínola, Rafael Prikladnicki, Dietmar Winkler, Daniel Méndez Fernández, Stefan Wagner

An Expert-Based Requirements Effort Estimation Model Using Bayesian Networks

[Motivation]: Numerous software companies worldwide split the software development life cycle into at least two separate projects: an initial project in which a requirements specification document is prepared, and a follow-up project in which the previously prepared requirements document is used as input for developing a software application. These follow-up projects can also be delegated to a third party, as occurs in numerous global software development scenarios. Effort estimation is one of the cornerstones of any type of project management; however, a systematic literature review on requirements effort estimation found hardly any empirical studies investigating this topic. [Objective]: The goal of this paper is to describe an industrial case study in which an expert-based requirements effort estimation model was built and validated for the Brazilian Navy. [Method]: A knowledge engineering of Bayesian networks process was employed to build the requirements effort estimation model. [Results]: The model was built with the participation of seven software requirements analysts and project managers, leading to 28 prediction factors and over 30 relationships. It was validated using real data from 11 large requirements specification projects and has been incorporated into the Brazilian Navy's quality assurance process, to be used by their software requirements analysts and managers. [Conclusion]: This paper details a case study in which an expert-based requirements effort estimation model, based solely on knowledge from requirements analysts and project managers, was successfully built to help the Brazilian Navy estimate the requirements effort for their projects.

Emilia Mendes, Veronica Taquete Vaz, Fernando Muradas
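The underlying Bayesian-network mechanics can be sketched in miniature. The Python example below marginalizes effort over a single hypothetical prediction factor; the real model had 28 factors and over 30 relationships, and every probability and effort band here is invented purely for illustration:

```python
# Prior over one hypothetical prediction factor: requirements complexity
p_complexity = {"low": 0.3, "medium": 0.5, "high": 0.2}

# Conditional distribution of effort (person-hour bands) given complexity
p_effort_given_c = {
    "low":    {"100": 0.7, "300": 0.2, "600": 0.1},
    "medium": {"100": 0.2, "300": 0.6, "600": 0.2},
    "high":   {"100": 0.1, "300": 0.3, "600": 0.6},
}

def effort_marginal(prior, conditional):
    """Marginalize the child node: P(effort) = sum_c P(effort | c) * P(c)."""
    marginal = {}
    for c, pc in prior.items():
        for band, p in conditional[c].items():
            marginal[band] = marginal.get(band, 0.0) + pc * p
    return marginal

def expected_effort(marginal):
    """Point estimate: expectation over the effort bands."""
    return sum(int(band) * p for band, p in marginal.items())

m = effort_marginal(p_complexity, p_effort_given_c)
print(round(expected_effort(m)))   # 309
```

In practice the prior would be replaced by evidence entered for a concrete project, and inference would run over the full network rather than a single parent node.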

Software Architecture


Experiences from Monitoring Effects of Architectural Changes

A common situation is that an initial architecture is sufficient in the early phases of a project, but must be changed when the size and complexity of the product increase. In this paper we present experiences from changing an architecture into independent units that provide basic reuse of main functionality, while giving higher priority to independence than to reuse. A further objective was to introduce metrics in order to monitor the architectural changes. The change was studied in a case study through weekly meetings with the team, collected metrics, and questionnaires. The new architecture was well received by the development team, who found it less fragile. Concerning the metrics for monitoring, we conclude that a high abstraction level was useful for this purpose.

Ulf Asklund, Martin Höst, Krzysztof Wnuk

Making the Case for Centralized Software Architecture Management

Architectural knowledge is an important artifact for many software development and quality control activities. Examples of quality control activities based on architectural information are architecture reviews, dependency analyses, and conformance analyses. Architecture is also an important artifact for understanding, reuse, evolution, and maintenance. Unfortunately, in many projects architectural knowledge often remains implicit, is not available to a particular system stakeholder, or is outdated when it is actually needed. To address this problem, we propose to manage semi-formal software architecture knowledge in a central repository, where it is accessible to all stakeholders and where it can be automatically and continuously updated and analyzed by different tools. In this paper we discuss important elements and use cases of such an approach, and present an example of an architecture knowledge and information repository in the context of an enterprise service-oriented architecture (SOA).

Georg Buchgeher, Rainer Weinreich, Thomas Kriechbaum

Software Estimation and Development


Preventing Composition Problems in Modular Java Applications

Java JAR files have become a common means of bringing reusability to Java programming. While they allow developers to easily share and use third-party libraries, they bring new challenges as well. This paper addresses the problem of Linkage Errors, which occur when a type incompatibility or a missing dependency between two libraries is detected at runtime. This is a direct consequence of composing applications from pre-existing binary libraries, as the problem would normally be detected during compilation. Such a problem may occur relatively easily, because the rules of source and binary compatibility in Java differ and can be counter-intuitive. To deal with this problem, we offer a set of tools that analyse existing binaries and discover problems that would normally lead to runtime errors. The tools work automatically, may complement existing testing, and find problems effectively from the start of development. Their additional features, such as the detection of redundant or duplicated libraries, are usually not provided by standard testing tools.

Kamil Jezek, Lukas Holy, Jakub Danek

Deriving Extract Method Refactoring Suggestions for Long Methods

The extract method is a common way to shorten long methods in software development. Before developers can use tools that support the extract method refactoring, they need to invest time in identifying a suitable refactoring candidate. This paper addresses the problem of finding the most appropriate refactoring candidate for long methods written in Java. Our approach determines valid refactoring candidates and ranks them using a scoring function that aims to improve readability and reduce code complexity. We use length and nesting reduction as complexity indicators; the number of parameters needed by the candidate also influences the score. To suggest candidates that are consistent with the structure of the code, information such as comments and blank lines is also considered by the scoring function. We evaluated our approach on three open source systems in a user study with ten experienced developers. Our results show that they would actually apply 86 % of the suggestions for an extract method refactoring.

Roman Haas, Benjamin Hummel
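The ranking idea can be illustrated with a toy scoring function. The weights and the exact formula below are our own assumptions, not those of the paper (which additionally considers comments and blank lines):

```python
# Illustrative scoring of extract-method candidates: reward the length and
# nesting reduction a candidate would achieve in the remaining method, and
# penalize the number of parameters the extracted method would need.

def score(candidate):
    """Higher is better; all weights are hypothetical."""
    return (1.0 * candidate["length_reduction"]
            + 2.0 * candidate["nesting_reduction"]
            - 0.5 * candidate["parameters"])

# Two hypothetical candidate statement ranges inside one long method
candidates = [
    {"name": "lines 10-25", "length_reduction": 16, "nesting_reduction": 0, "parameters": 4},
    {"name": "lines 30-52", "length_reduction": 23, "nesting_reduction": 2, "parameters": 1},
]

best = max(candidates, key=score)
print(best["name"])   # lines 30-52 -- deeper nesting removed, fewer parameters
```

A real implementation would first enumerate only *valid* candidates (single-entry, single-exit statement sequences) before scoring them.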

The Use of Precision of Software Development Effort Estimates to Communicate Uncertainty

The precision of estimates may be used to communicate the uncertainty of the required software development effort. The effort estimates 1000 and 975 work-hours, for example, communicate different levels of expected estimation accuracy. Through observational and experimental studies we found that software professionals (i) sometimes, but not in the majority of the examined projects, used estimate precision to convey effort uncertainty, and (ii) tended to interpret overly precise, inaccurate effort estimates as indicating low developer competence and low trustworthiness of the estimates, while overly narrow effort prediction intervals had the opposite effect. This difference remained even when the actual effort was known to be outside the narrow effort prediction interval. We identified several challenges related to using the precision of single-value estimates to communicate effort uncertainty, and recommend that software professionals use effort prediction intervals, and not the preciseness of single-value estimates, to communicate effort uncertainty.

Magne Jørgensen
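The recommendation to report prediction intervals rather than precise-looking single values can be sketched as follows; the error ratio and the way the interval is derived are hypothetical simplifications, not the study's method:

```python
# Turn a most-likely single-value estimate into an effort prediction
# interval, using the typical relative estimation error observed on past
# projects (the 20 % figure below is an invented example).

def prediction_interval(most_likely, historical_error_ratio):
    """A crude symmetric interval around the most-likely estimate."""
    low = most_likely * (1 - historical_error_ratio)
    high = most_likely * (1 + historical_error_ratio)
    return round(low), round(high)

# Instead of reporting "975 work-hours", report an interval:
print(prediction_interval(975, 0.2))   # (780, 1170)
```

The interval makes the uncertainty explicit, whereas "975 work-hours" invites readers to infer a (possibly unwarranted) accuracy of a few hours.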

Software Testing


Web Service Test Evolution

In order to remain useful, test scripts must evolve in parallel with the test objects they are intended to test. In the approach described here, the test objects are web services whose test script is derived from the web service interface definition. The test script structure, with tags and attributes, is generated automatically from the WSDL structure; the content, i.e. the test data, has to be inserted by hand. From this script, service requests are generated and service responses validated automatically. As with other generated software artifacts, once the structure of the interface or the logic of the targeted service changes, the content of the test script is no longer valid. It has to be altered and/or enhanced to fit the new interface structure and/or the altered service logic. In this paper the author proposes a semi-automated approach to solving this test maintenance problem and explains how it has been implemented in a web service testing tool by employing data reverse engineering techniques. The author also reports on his experience with the approach when maintaining a test in the field.

Harry M. Sneed

Integrating a Lightweight Risk Assessment Approach into an Industrial Development Process

Risk assessment depends on its application domain. Risk values consist of probability and impact factors, but there is no fixed, unique guideline for determining these two factors. For a precise risk-value calculation, an adequate collection of factors is crucial. In this paper, we show the evolution of a risk assessment approach, from its first phase to its application, at an international insurance company. In such a risk-aware field, relevant factors and their severity have to be determined systematically. The final results are consolidated into a calculation tool that is embedded in the company's development process and used as a decision support system. This paper presents the results and observations for the whole implementation process, achieved via action research.

Viktor Pekar, Michael Felderer, Ruth Breu, Friederike Nickl, Christian Roßik, Franz Schwarcz
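The basic risk-value calculation (probability times impact, with impact aggregated from several factors) can be sketched as follows; the factor names, ratings and weights are hypothetical, not those determined in the study:

```python
# Minimal risk-value sketch: risk = probability x impact, where impact is a
# weighted sum of domain-specific factors rated on a 0..5 scale.

def impact(factors, weights):
    """Weighted sum of impact-factor ratings."""
    return sum(weights[name] * value for name, value in factors.items())

def risk_value(probability, factors, weights):
    """Combine occurrence probability (0..1) with aggregated impact."""
    return probability * impact(factors, weights)

# Invented factors for an insurance-domain feature under test
weights = {"financial_loss": 0.5, "reputation": 0.3, "legal": 0.2}
factors = {"financial_loss": 4, "reputation": 2, "legal": 5}

print(round(risk_value(0.4, factors, weights), 2))   # 1.44
```

The hard part, as the paper stresses, is not this arithmetic but systematically choosing the factors and calibrating their weights for the domain.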

Fast Feedback from Automated Tests Executed with the Product Build

Continuous Integration (CI) is nowadays a very common practice with many advantages and is used in many software projects. For large software projects, checking out the source code, building the product, and testing the build via automated tests during CI can take a long time (e.g. many hours), so software developers do not get fast feedback about their changes. Often the test report contains the results of many changes from several developers, or the feedback is not accurate with respect to an individual developer's source code changes. This paper describes a novel approach to reduce the feedback time and to provide test results only for the changes a developer has committed.

Martin Eyl, Clemens Reichmann, Klaus Müller-Glaser

E-Government Applications


Approach of a Signature Based Single Sign on Proxy Solution

Many e-government applications require certificates, stored in the client's browser certificate store, to grant access to a service. This authentication methodology has two problems: first, a client has to store all certificates on every device from which e-government services are accessed; second, if the client loses such a device (e.g. a notebook), all involved certificates have to be exchanged, otherwise there is a danger of compromise. To address this problem and provide a secure single sign-on solution, we implemented a secure proxy with an integrated encrypted certificate store, where citizens can keep all the certificates they use for e-government services. Our solution, which we call the "proxy authenticator", allowed us to avoid any alteration of the existing protocol structure or amendment of the software architecture of Austrian e-government applications. This saved time, effort, and costs when connecting the existing e-delivery services in Austria to the myHelp portal through the proxy authenticator.

Klaus John, Stefan Taber

