1 Introduction
- Software engineering as an engineering discipline, including its interaction with and impact on society and economics;
- Requirements engineering: capture, consistency, and change management of software requirements;
- Software architectures: description and analysis of the architecture of individual systems or classes of applications;
- Specification, design, and implementation of particular classes of systems: (self-)adaptive, collaborative, embedded, distributed, mobile, pervasive, cyber-physical or service-oriented applications;
- Software quality: (static or run-time) validation and verification of functional as well as non-functional software properties using theorem proving, model checking, testing, analysis, simulation, refinement methods, metrics or visualization techniques;
- Model-driven development and model transformation: meta-modeling, design and semantics of domain-specific languages, consistency and transformation of models, generative architectures;
- Software processes: support for iterative, agile, and open source development;
- Software evolution: refactoring, reverse engineering and re-engineering, configuration management, architectural change, and aspect orientation.
2 Contributions
- Incremental Execution of Rule-based Model Transformation by A. Boronat [2] presents the design of a model change propagation mechanism for executing change-driven model transformations, which has been implemented in YAMTL. Change propagation mechanisms form the foundation for maintaining consistency between large models. The novelty of the approach lies in its standardized representation of model changes. The work was evaluated using the VIATRA CPS benchmark.
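To make the idea of change propagation concrete, the following minimal sketch recomputes only the target elements derived from a changed source element instead of re-running the whole transformation. This is an illustration of the general principle only, not the YAMTL mechanism; the class and method names (`IncrementalTransformation`, `propagate`, the change-record tuples) are invented here.

```python
# Change-driven model transformation, minimally sketched: an initial batch
# run derives the target model, after which standardized change records
# ("set"/"delete") are propagated element-by-element.

class IncrementalTransformation:
    def __init__(self, rule):
        self.rule = rule          # maps one source element to one target element
        self.target = {}          # source id -> derived target element

    def run(self, source):
        """Initial batch execution over the whole source model."""
        for sid, elem in source.items():
            self.target[sid] = self.rule(elem)
        return self.target

    def propagate(self, change):
        """Apply one standardized change record instead of re-running everything."""
        kind, sid, value = change
        if kind == "set":         # element added or updated
            self.target[sid] = self.rule(value)
        elif kind == "delete":    # element removed
            self.target.pop(sid, None)
        return self.target
```

For example, after `run({"a": "x"})` with an uppercasing rule, propagating `("set", "b", "y")` touches only element `b` and leaves the rest of the target untouched.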
- Cooperative, Verifier-Based Testing with CoVeriTest by Beyer & Jakobs [1] presents a test-generation approach based on verification attempts: from a solved reachability problem, one can generate a test case. While this idea is not new, the innovation lies in how the approach is made feasible: the reachability analysis is configurable with different types of information and levels of abstraction. These ideas are implemented in the CoVeriTest tool.
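The core principle, that a solved reachability problem yields a test case, can be sketched in a toy form: explore a small transition system, and when the target location is reached, emit the input sequence along the witness path as a test. This illustrates the principle only, not CoVeriTest's configurable analyses; the program encoding below is invented for illustration.

```python
# Reachability-based test generation, minimally sketched: a breadth-first
# search over a transition system; the path to the target location is the
# generated test case.

from collections import deque

def generate_test(transitions, init, target):
    """transitions: dict location -> list of (input, next_location)."""
    queue = deque([(init, [])])
    seen = {init}
    while queue:
        loc, path = queue.popleft()
        if loc == target:
            return path                     # witness path = test case
        for inp, nxt in transitions.get(loc, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + [inp]))
    return None                             # target unreachable: no test exists

# Example: the error location is reached only via inputs 1 then 0.
prog = {
    "l0": [(0, "l0"), (1, "l1")],
    "l1": [(1, "l0"), (0, "err")],
}
```

Here `generate_test(prog, "l0", "err")` returns `[1, 0]`, the input sequence a test driver would feed to the system under test.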
- Avoiding Unnecessary Information Loss by Fritsche et al. [4] focuses on Triple Graph Grammars (TGGs) for model synchronization, i.e., the task of restoring consistency between two models after model changes. Model synchronization poses several challenges: for example, one should be able to synchronize changes without unnecessary loss of information while maintaining reasonable performance. The novel idea for addressing these challenges is to add shortcut rules, a special form of generalized TGG rules that allow one edit action to be taken back and an alternative one to be performed. The authors show that repair rules derived from shortcut rules enable incremental model synchronization with considerably decreased information loss and improved runtime compared to existing approaches.
- A Logic-Based Incremental Approach to Graph Repair Featuring Delta Preservation by Schneider et al. [5] tackles the problem of graph repair, i.e., finding a minimal set of changes that makes a given typed graph consistent. This is an essential capability of tools in model-based design, because graphs are frequently used as abstract representations of code. Numerous graph repair approaches are known; this one distinguishes itself by being incremental and founded on a formal logic as its semantics. The algorithms are implemented in the tool AutoGraph.
- Formal Testing of Timed Graph Transformation Systems using Metric Temporal Graph Logic by Schneider et al. [6] is about a model of embedded real-time systems called timed graph transformation systems (TGTS). During model-based development of such systems, it is essential that the model can be tested; however, while TGTS are adequate for modeling, they are too expressive to directly permit testing. The authors define a series of transformations that reduce testing of TGTS to satisfiability checking in a metric temporal graph logic. All of this is implemented in the AutoGraph system.
- PolyGraph: A Data Flow Model with Frequency Arithmetic by Dubrulle et al. [3] proposes an extension of static data flow paradigms that includes frequency constraints and adjustable communication rates. PolyGraph, the proposed data flow formalism, extends synchronous data flow (SDF) with synchronous firing semantics for the actors. The authors define a novel algorithm for checking the liveness of a given polygraph. The proposed framework was evaluated using both small (but realistic) examples and a large set of automatically generated models.
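For background on the SDF base that PolyGraph extends: a static data flow graph is rate-consistent, a prerequisite for liveness, exactly when the balance equations q[producer] * production_rate = q[consumer] * consumption_rate have a positive integer solution q, the repetition vector. The sketch below shows this classic SDF check only; PolyGraph's frequency arithmetic and its liveness algorithm go beyond it, and the function and edge encoding here are invented for illustration.

```python
# Classic SDF rate-consistency check: propagate relative firing rates over
# the graph, verify every balance equation, and scale to the smallest
# positive integer repetition vector.

from fractions import Fraction
from math import lcm

def repetition_vector(actors, edges):
    """edges: list of (producer, prod_rate, consumer, cons_rate) tuples.
    Returns the repetition vector as a dict, or None if inconsistent."""
    q = {actors[0]: Fraction(1)}
    changed = True
    while changed:                      # propagate rates across edges
        changed = False
        for p, pr, c, cr in edges:
            if p in q and c not in q:
                q[c] = q[p] * pr / cr
                changed = True
            elif c in q and p not in q:
                q[p] = q[c] * cr / pr
                changed = True
    for p, pr, c, cr in edges:          # verify every balance equation
        if q.get(p, 0) * pr != q.get(c, 0) * cr:
            return None                 # inconsistent rates: no valid schedule
    denom = lcm(*(f.denominator for f in q.values()))
    return {a: int(f * denom) for a, f in q.items()}

# Example: A produces 2 tokens per firing, B consumes 3, so A must fire
# 3 times for every 2 firings of B.
```

With `actors = ["A", "B"]` and `edges = [("A", 2, "B", 3)]` this yields `{"A": 3, "B": 2}`; an edge set whose rates cannot be balanced yields `None`.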