About this book

The LNCS Transactions on Foundations for Mastering Change, FoMaC, aims to establish a forum for formal-methods-based research dealing with the nature of today’s agile system development, which is characterized by unclear premises, unforeseen change, and the need for fast reaction, in a context of hard-to-control boundary conditions such as third-party components, network problems, and attacks. Submissions are evaluated according to these goals.
This book, the first volume in the series, contains contributions by the members of the editorial board. These contributions indicate the envisioned style and range of topics to be covered by the transactions series. They cut across various traditional research directions and are characterized by a clear focus on change.

Table of Contents

Frontmatter

Introduction to the First Issue of FoMaC

We briefly introduce the envisioned style and scope of the LNCS Transactions on Foundations for Mastering Change (FoMaC) on the basis of the individual contributions of the initial issue, which mainly features invited papers co-authored by the founding members of the editorial board.
Bernhard Steffen

Knowledge Management for Inclusive System Evolution

When systems evolve in today’s complex, connected, and heterogeneous IT landscapes, waves of change ripple in every direction. Sometimes a change mandates other changes elsewhere; very often it is necessary and opportune to check that a change indeed has no effects, or only the announced effects, on other portions of the connected landscape; and impacts can often be assessed only, or also, by expert professionals other than IT professionals. In this paper, we discuss the current practice of software design and examine it from the point of view of the adequacy of knowledge management and change enactment in a co-creation environment, as advocated and practiced by modern agile and lean IT development approaches and in software ecosystems. True and functioning inclusion of non-IT stakeholders on equal terms, in our opinion, hinges on adequate, i.e., accessible and understandable, representation and management of knowledge about the system under development along the entire toolchain of design, development, and maintenance.
Tiziana Margaria

Archimedean Points: The Essence for Mastering Change

Explicit Archimedean Point-driven (software) system development aims at maintaining as much control as possible via ‘things’ that do not change, and it may radically alter the role of modeling and development tools. The idea is to incorporate as much knowledge as possible into the tools themselves. In this way they become domain-specific, problem-specific, or even specific to a particular new requirement for a system already in operation. Key to the practicality of this approach is a much increased ease of tool development: it must be economical to alter the modeling tool as part of specific development tasks. The Cinco framework aims at exactly this kind of ease: once the intended change is specified, generating a new tool is essentially a push-button activity. This philosophy and tool chain are illustrated through the stepwise construction of a BPMN tool via a chain of increasingly expressive Petri net tools. By construction, the resulting BPMN tool has a conceptually very clean semantic foundation, which enables tool features such as various consistency checks, type-controlled activity integration, and true full code generation.
Bernhard Steffen, Stefan Naujokat
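
As a rough, hedged illustration of the semantic foundation the abstract refers to (not of Cinco or its tool chain), the following Python sketch shows the token-game semantics of a simple Petri net on which a chain of increasingly expressive Petri net tools could build. The workflow names (`review`, `approve`) are invented for the example.

```python
# Minimal sketch of Petri-net token-game semantics: places hold tokens,
# and a transition may fire when every input place is marked.
from dataclasses import dataclass

@dataclass
class PetriNet:
    marking: dict       # place name -> number of tokens
    transitions: dict   # transition name -> (input places, output places)

    def enabled(self, t):
        ins, _ = self.transitions[t]
        return all(self.marking.get(p, 0) > 0 for p in ins)

    def fire(self, t):
        if not self.enabled(t):
            raise ValueError(f"transition {t} is not enabled")
        ins, outs = self.transitions[t]
        for p in ins:
            self.marking[p] -= 1
        for p in outs:
            self.marking[p] = self.marking.get(p, 0) + 1

# Hypothetical two-step workflow: start -> review -> approve.
net = PetriNet(
    marking={"start": 1},
    transitions={
        "review":  (["start"], ["reviewed"]),
        "approve": (["reviewed"], ["done"]),
    },
)
net.fire("review")
net.fire("approve")
print(net.marking)  # {'start': 0, 'reviewed': 0, 'done': 1}
```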

Model Patterns

The Quest for the Right Level of Abstraction
We know by now that evolution in software is inevitable. Given that this is so, we should not just allow for but actively accommodate change throughout the software lifecycle. The claim of this paper is that, in order to accommodate change effectively, we need a modelling discipline with a built-in notion of refinement, so that domain concepts can be defined and understood on their appropriate level of abstraction, and change can be captured on that same level. Refinement serves to connect levels of abstraction within the same model, enabling a simultaneous understanding of that same model on different levels. We propose the term model pattern for the central concept in such a modelling discipline.
Arend Rensink

Verified Change

We present the textual wide-spectrum modeling and programming language K, which has been designed for representing graphical SysML models, in order to provide semantics to SysML and pave the way for the analysis of SysML models. The current version is supported by the Z3 SMT solver, which makes it possible to prove the consistency of constraints. The language is intended to be used by engineers for designing space missions, in particular NASA’s proposed mission to Jupiter’s moon Europa. One of the challenges facing software development teams is the notion of change: the fact that code changes over time, and the subsequent problem of demonstrating that no harm has been done by a change. In this paper, K is applied to demonstrate how change can be perceived as a software verification problem and hence verified using more traditional software verification techniques.
Klaus Havelund, Rahul Kumar
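
The sketch below is not the K language or its tooling; it merely illustrates, with Z3’s Python API (package `z3-solver`), the kind of SMT queries behind “proving consistency of constraints” and treating a change as a verification problem. The constraints and variable names are hypothetical.

```python
from z3 import Ints, Solver, And, Implies, Not, sat, unsat

power, margin = Ints("power margin")

# Hypothetical constraints of the original model version.
old = And(power >= 10, power <= 100, margin == 100 - power)

# Hypothetical changed version: the power budget is tightened.
new = And(power >= 10, power <= 80, margin == 100 - power)

# 1) Consistency: the changed constraints admit at least one solution.
s = Solver()
s.add(new)
print("new model consistent:", s.check() == sat)

# 2) Change check: every assignment allowed by the new version was already
#    allowed by the old one, i.e. (new => old) is valid.
s = Solver()
s.add(Not(Implies(new, old)))
print("change is a refinement:", s.check() == unsat)
```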

Good Change and Bad Change: An Analysis Perspective on Software Evolution

Software does change, and should change. Traditional industrial software systems often evolve over long periods of time, with each new version forming a discrete milestone, while some new software systems involve constant adaptation to situations in the environment and therefore evolve continually. While necessary, software change can also be devastating, making the system difficult to change and maintain further. We believe that one promising way to manage and control change is to view an evolving system as a software product line in which each version of the software is a product. Key to any successful software product line approach is a software architecture that supports variability management. Tools that can identify commonalities and differences among various releases are essential in collecting and managing information on changed, added, and deleted components. Equally important are tools that allow the architect to analyse the current status of the product line as well as its products from various perspectives, and to detect and remove architectural violations that threaten the variability points and built-in flexibility. In this paper, we describe our current research on defining such a process and supporting tools for software evolution management based on product line concepts, and we apply it in a case study to a software testbed called TSAFE. We describe how we reverse engineer the actual architecture from the source code and how we develop new target architectures based on the reverse-engineered one and the expected changes. We then describe how we analyse the actual change across different implementations and visualize where the change occurred, and how we determine whether a particular implementation matches the target architecture. We have found that these two analysis techniques are particularly useful for analysing software evolution and that they complement each other.
Mikael Lindvall, Martin Becker, Vasil Tenev, Slawomir Duszynski, Mike Hinchey
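
As a small illustrative sketch (not the authors’ tooling), the following Python snippet shows the kind of commonality/difference analysis the abstract refers to: comparing module-dependency maps recovered from two releases and reporting what was added, deleted, or changed. The module names are hypothetical.

```python
def diff_architecture(old: dict, new: dict) -> dict:
    """Each argument maps a module name to the set of modules it depends on."""
    added = set(new) - set(old)
    deleted = set(old) - set(new)
    changed = {m for m in set(old) & set(new) if old[m] != new[m]}
    return {"added": added, "deleted": deleted, "changed": changed}

v1 = {"gui": {"core"}, "core": {"db"}, "db": set()}
v2 = {"gui": {"core", "logging"}, "core": {"db"}, "db": set(), "logging": set()}

print(diff_architecture(v1, v2))
# {'added': {'logging'}, 'deleted': set(), 'changed': {'gui'}}
```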

Compositional Model-Based System Design and Other Foundations for Mastering Change

The topic of this paper is the design of dynamical systems, such as embedded or cyber-physical systems. Is there a science of system design, and if so, what are its elements? We believe that it has three key elements: modeling, analysis, and implementation, and we discuss fundamental research challenges in each category. We then argue that compositionality is an essential cross-cutting concern and one of the foundations for mastering change. Beyond compositionality, we discuss other key problems when dealing with change, namely multi-view modeling and synthesis.
Stavros Tripakis

Proof Repositories for Compositional Verification of Evolving Software Systems

Managing Change When Proving Software Correct
We propose a new and systematic framework for proof reuse in the context of deductive software verification. The framework generalizes abstract contracts into incremental proof repositories. Abstract contracts enable a separation of concerns between called methods and their implementations, facilitating proof reuse. Proof repositories allow the systematic caching of partial proofs that can be adapted to different method implementations. The framework provides flexible support for compositional verification in the context of, e.g., partly developed programs, evolution of programs and contracts, and product variability.
Richard Bubel, Ferruccio Damiani, Reiner Hähnle, Einar Broch Johnsen, Olaf Owe, Ina Schaefer, Ingrid Chieh Yu

Statistical Model Checking with Change Detection

Statistical Model Checking (SMC) is a powerful and widely used approach that consists in estimating the probability for a system to satisfy a temporal property. This is done by monitoring a finite number of executions of the system and then extrapolating the result using statistics. The answer is correct up to some confidence level that can be parameterized by the user. SMC is known to mitigate the state-space explosion problem and to make it possible to approximate undecidable queries. The approach has been implemented in several toolsets, such as Plasma Lab, and successfully applied in a wide range of areas such as systems biology, robotics, and automotive systems. In this paper, we add two modest new contributions to the cathedral of results on SMC. The first contribution is an algorithm that can be used to monitor, at runtime, changes in the probability of satisfying a bounded-time property. Concretely, the algorithm constantly monitors the execution of the deployed system and raises a flag when it observes that the probability has changed significantly. This is done by extending the CUSUM algorithm from signal processing to the formal validation setting. Our second contribution is to show how the programming interface of Plasma Lab can be exploited to make SMC technology directly available in the toolsets used by designers. This integration is achieved by exploiting the simulation facilities of design tools. Our approach thus differs from the one adopted by other SMC/formal verification toolsets, which assume the existence of a formal semantics for the design language as well as a compilation chain into the rather academic language used by the validation tool. The concept results in the integration of Plasma Lab as a library of the Simulink toolset. The contributions are illustrated by using Plasma Lab to verify a Simulink case study modelling a pig shed temperature controller.
Axel Legay, Louis-Marie Traonouez
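
The following is a hedged sketch of CUSUM-style change detection over verdicts of a bounded-time property, in the spirit of the abstract; it is not the Plasma Lab implementation, and the probabilities, threshold, and change point are invented for illustration.

```python
import math, random

def cusum_monitor(runs, p0, p1, threshold):
    """Flag when the satisfaction probability drifts from p0 toward p1.

    runs: iterable of booleans, True if the corresponding execution
    satisfied the monitored bounded-time property."""
    s = 0.0
    for t, x in enumerate(runs):
        # Log-likelihood ratio of observing x under p1 versus p0.
        llr = math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
        s = max(0.0, s + llr)        # one-sided CUSUM statistic
        if s > threshold:
            return t                 # index at which the change is flagged
    return None

random.seed(0)
# Hypothetical monitored system: the satisfaction probability drops
# from 0.9 to 0.6 after run 200.
runs = [random.random() < 0.9 for _ in range(200)] + \
       [random.random() < 0.6 for _ in range(200)]
print("change flagged at run", cusum_monitor(runs, p0=0.9, p1=0.6, threshold=5.0))
```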

Collective Autonomic Systems: Towards Engineering Principles and Their Foundations

Collective autonomic systems (CAS) are adaptive, open-ended, highly parallel, interactive, and distributed software systems. They consist of many collaborative entities that manage their own knowledge and processes. CAS present many engineering challenges, such as maintaining awareness of the environmental situation, performing suitable and adequate adaptations in response to environmental changes, and preserving adaptations across system updates and modifications. Recent research has proposed initial solutions to some of these challenges, but many of the difficult questions remain unanswered and open up a rich field of future research.
In an attempt to initiate a discussion about the structure of this emerging research area, we present eight engineering principles that we consider promising candidates for relevant future research, and we briefly address their possible foundations. Our discussion is based on a development life cycle (EDLC) for autonomic systems. Going beyond the traditional iterative development process, the EDLC proposes three control loops: for system design, for runtime adaptation, and for feedback between design time and runtime. Some of our principles concern the whole development process, while others focus on a particular control loop.
Lenz Belzner, Matthias Hölzl, Nora Koch, Martin Wirsing

Continuous Collaboration for Changing Environments

Collective autonomic systems (CAS) are distributed collections of agents that collaborate to achieve the system’s goals while autonomously adapting their behavior. We present the teacher/student architecture for locally coordinated distributed learning and show that in certain scenarios the performance of a swarm using teacher/student learning can be significantly better than that of agents learning individually. Teacher/student learning serves as the foundation for the continuous collaboration (CC) development approach. We introduce CC, relate it to the EDLC, a life cycle model for CAS, and show that CC embodies many of the principles proposed for developing CAS.
Matthias Hölzl, Thomas Gabor
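
The snippet below is a highly simplified sketch of the teacher/student idea, not the authors’ architecture: agents learn value estimates individually (here, for a two-armed bandit), and periodically the locally best-performing agent acts as teacher, sharing its estimates with the others. All parameters, including the arm probabilities and sharing schedule, are invented for illustration.

```python
import random

random.seed(1)
ARM_PROBS = [0.3, 0.7]   # hidden reward probabilities of two arms

def make_agent():
    return {"q": [0.0, 0.0], "n": [0, 0], "reward": 0.0}

def step(agent, eps=0.1):
    # Epsilon-greedy arm choice, then incremental-mean value update.
    arm = random.randrange(2) if random.random() < eps \
        else max((0, 1), key=lambda a: agent["q"][a])
    r = 1.0 if random.random() < ARM_PROBS[arm] else 0.0
    agent["n"][arm] += 1
    agent["q"][arm] += (r - agent["q"][arm]) / agent["n"][arm]
    agent["reward"] += r

agents = [make_agent() for _ in range(5)]
for episode in range(20):
    for a in agents:
        for _ in range(10):
            step(a)
    teacher = max(agents, key=lambda a: a["reward"])   # locally best agent
    for a in agents:
        if a is not teacher:
            a["q"] = list(teacher["q"])                # students adopt its estimates

print("shared value estimates:", [round(q, 2) for q in agents[0]["q"]])
```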

Issues on Software Quality Models for Mastering Change

Software quality models are a promising cornerstone for mastering change and for continuously controlling software quality in the context of today’s dynamically evolving, complex software systems. These models provide an abstract and analyzable view of software artifacts with the objective of describing, assessing, and/or predicting quality. Although software quality models have a high potential to improve the effectiveness and efficiency of quality assurance in coping with software change, their acceptance and spread in the software industry are still rather low, as there are several unresolved issues that have to be addressed by upcoming research. This article discusses and exemplifies unresolved key issues of descriptive, generating, and predictive software quality models with regard to (1) the creation and maintenance of models, (2) support for extra-functional aspects, (3) traceability between quality models and unstructured artifacts, (4) the integration of software analytics and runtime information, (5) the balance between quality and risk, (6) process integration, and (7) justification by empirical evidence, and it relates these issues to challenges of mastering change in terms of the manifesto of the LNCS Transactions on Foundations for Mastering Change.
Michael Felderer

Traceability Types for Mastering Change in Collaborative Software Quality Management

Software is constantly evolving, and successfully comprehending and managing this evolutionary change is a challenging task that requires traceability support. In this paper we propose a novel approach to traceability as a cornerstone for successful impact analysis and change management in the context of collaborative software quality management. We first motivate the crucial role of traceability within the lifecycle management of the new generation of distributed, fragmented software services. Based on the model-based collaborative software quality management framework of Living Models, we then categorize software quality management services and identify novel types of traceability. This is followed by an overview and classification of sample software quality management services from the literature, enabled by their interrelation with the identified types of traceability. From this classification we derive the need for further research on traceability in collaborative software quality management.
Boban Celebic, Ruth Breu, Michael Felderer

Backmatter
