
About This Book

This book constitutes the refereed proceedings of six workshops and a symposium, held at the 35th International Conference on Conceptual Modeling, ER 2016, in Gifu, Japan.

The 19 revised full papers and 3 keynote papers were carefully reviewed and selected from 52 submissions to the following events: Conceptual Modeling for Ambient Assistance and Healthy Ageing, AHA 2016; Modeling and Management of Big Data, MoBiD 2016; Modeling and Reasoning for Business Intelligence, MORE-BI 2016; Conceptual Modeling in Requirements and Business Analysis, MREBA 2016; Quality of Models and Models of Quality, QMMQ 2016; the Symposium on Conceptual Modeling Education, SCME 2016; and Models and Modeling on Security and Privacy, WM2SP 2016.





A Capability-Driven Development Approach for Requirements and Business Process Modeling

Requirements modeling and business process modeling are two essential activities in the earliest steps of any sound software production process. A precise conceptual alignment between them is required in order to assess that requirements are “operationalized” through an adequate set of processes. Complementarily, the path from requirements to code should benefit from a precise model-driven development (MDD) connection, intended to characterize not only the involved conceptual models but also their corresponding model transformations. Selecting the most appropriate conceptual models for specifying the different system perspectives becomes a crucial task. This conceptual-modeling-based solution requires a holistic conceptual framework to determine the modeling elements to be taken into account. Surprisingly, the link with MDD approaches, needed to provide a rigorous connection with the software components of the final application, has not been analyzed in a clear and convincing way. Exploring the notion of capability, this keynote presents a capability-driven development approach, together with its associated meta-model, as the selected conceptual framework. Additionally, it shows how this framework facilitates the selection of the most appropriate method components in order to design an effective software process and to make a sound MDD connection feasible.
Oscar Pastor

Grounding for Ontological Architecture Quality: Metaphysical Choices

Information systems (IS) are getting larger and more complex, becoming ‘gargantuan’. IS practices have not evolved in step to handle the development and maintenance of these gargantuan systems, leading to a variety of quality issues.
Chris Partridge, Sergio de Cesare

Conceptual Modelling for Ambient Assistance and Healthy Ageing


A Model-Driven Engineering Approach for the Well-Being of Ageing People

Ambient Assisted Living has been widely perceived as a viable solution to mitigate the astronomical increase in the cost of health care. In the context of our Geras Project, we propose a Model-Driven Engineering framework for handling high-level specifications that capture the concerns of elderly people still living at home. These concerns are related to concrete living issues, like being notified of a ringing phone for a deaf person, or receiving adequate assistance after a fall. The framework explicitly models three aspects: agents’ goals, formally capturing users’ concerns; abstract solutions, defining a canvas for addressing the goals; and concrete solutions in terms of APIs or various combinations of APIs, for their operationalisation. We illustrate the usage of our framework on two simple scenarios.
Amanuel Alemayehu Koshima, Vincent Englebert, Moussa Amani, Abdelmounaim Debieche, Amanuel Wakjira

The Cultural Background and Support for Smart Web Information Systems

Applications and technical solutions change very quickly. Societies also change, to a certain extent. Users, however, do not change in the same manner: they want to use systems in the way they are used to. A system should therefore be smart in the sense that users may stay within their habits, their ways of working, their ways of accessing systems, and their circumstances. We introduce a culture-based approach to smart systems. Such smart systems may adapt to the current user independently of her/his age, abilities, habits, environment, and collaborators.
Bernhard Thalheim, Hannu Jaakkola

Modelling and Management of Big Data


Walmart Online Grocery Personalization: Behavioral Insights and Basket Recommendations

Food is so personal. Each individual has her own shopping characteristics. In this paper, we introduce personalization for Walmart online grocery. Our contribution is twofold. First, we study the shopping behaviors of Walmart online grocery customers. In contrast to traditional online shopping, grocery shopping demonstrates more repeated and frequent purchases with large orders. Second, we present a multi-level basket recommendation system. Unlike typical recommender systems, which usually concentrate on single-item or bundle recommendations, this system analyzes a customer’s shopping basket holistically to understand her shopping tasks. We then use multi-level co-bought models to recommend items for each of these purposes. At the stage of selecting particular items, we incorporate both the customer’s general and subtle preferences into the decisions. We finally recommend the customer a series of items at checkout. Offline experiments show our system can reach an 11% item hit rate, a 40% subcategory hit rate and a 70% category hit rate. Online tests show it can reach more than a 25% order hit rate.
Mindi Yuan, Yannis Pavlidis, Mukesh Jain, Kristy Caster
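The abstract does not detail the multi-level co-bought models; purely as an illustration, a minimal single-level co-bought recommender might count co-purchases across baskets and score candidate items against the current basket. All item names and data below are invented:

```python
from collections import Counter, defaultdict

def build_cobought(baskets):
    """Count how often each pair of items appears in the same basket."""
    co = defaultdict(Counter)
    for basket in baskets:
        items = set(basket)
        for a in items:
            for b in items:
                if a != b:
                    co[a][b] += 1
    return co

def recommend(co, basket, k=2):
    """Score candidate items by their total co-purchase counts
    with the items already in the basket."""
    scores = Counter()
    for item in basket:
        for other, cnt in co[item].items():
            if other not in basket:
                scores[other] += cnt
    return [item for item, _ in scores.most_common(k)]

# Hypothetical order history.
baskets = [
    ["milk", "bread", "eggs"],
    ["milk", "bread", "butter"],
    ["bread", "butter"],
    ["milk", "eggs"],
]
co = build_cobought(baskets)
print(recommend(co, ["milk"], k=2))
```

A production system would, per the abstract, work at multiple category levels and blend in per-customer preferences; this sketch shows only the co-bought counting idea.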

Searching for Optimal Configurations Within Large-Scale Models: A Cloud Computing Domain

Feature modeling is a widely accepted variability modeling technique for supporting decision-making scenarios by representing decisions as features. However, there are scenarios where domain concepts have multiple implementation alternatives that have to be analyzed from large-scale data sources. In such cases, manually selecting an optimal solution from the alternative space, or even completely representing the domain, is infeasible. To address this issue, we created a feature modeling metamodel and two specific processes to represent domain and implementation alternative models, and to search for optimal solutions whilst considering a set of optimization objectives. We applied this approach to a cloud computing case study and obtained an optimal provider configuration for deploying a JEE application.
Lina Ochoa, Oscar González-Rojas, Mauricio Verano, Harold Castro
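The paper's metamodel and search processes are not spelled out in the abstract; as a rough sketch of the underlying idea, one can enumerate the space of implementation alternatives and pick the configuration that best satisfies an optimization objective. The provider data and objective below are entirely hypothetical:

```python
from itertools import product

# Hypothetical implementation alternatives per domain concept:
# (name, monthly cost, capacity units). Not from the paper.
alternatives = {
    "compute": [("small_vm", 20, 2), ("large_vm", 80, 8)],
    "storage": [("hdd", 5, 1), ("ssd", 15, 4)],
}

def optimal_configuration(alternatives, min_capacity):
    """Exhaustively search the alternative space for the cheapest
    configuration that meets the capacity objective."""
    best = None
    for combo in product(*alternatives.values()):
        cost = sum(c for _, c, _ in combo)
        capacity = sum(cap for _, _, cap in combo)
        if capacity >= min_capacity and (best is None or cost < best[0]):
            best = (cost, [name for name, _, _ in combo])
    return best

print(optimal_configuration(alternatives, min_capacity=6))
```

At real scale a brute-force product is intractable, which is precisely why the paper proposes dedicated models and search processes; this sketch only states the optimization problem.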

A Link-Density-Based Algorithm for Finding Communities in Social Networks

Label propagation is a very popular, simple and fast algorithm for detecting communities in a graph such as a social network. However, it is known to be non-deterministic, unstable and not very accurate. These shortcomings have attracted much attention from the research community, and many improvements have been suggested. In this paper we propose a new approach for computing node preferences to stabilize label propagation. The idea is to exploit the structure of the graph under study and use the link density to determine the preference of nodes. Our approach does not require any input parameter aside from the input graph itself. The complexity of the propagation-based algorithm is slightly increased, but stability and determinism are almost reached. Furthermore, we also propose a fuzzy version of our approach that allows one to detect overlapping communities, as are common in social networks. We have tested our algorithms on various real-world social networks.
Vladivy Poaka, Sven Hartmann, Hui Ma, Dietrich Steinmetz
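The exact link-density preference measure is not given in the abstract; the sketch below assumes one plausible choice (the density of links among a node's neighbours) and uses it to weight neighbour votes during label propagation, so that ties are broken by structurally embedded nodes rather than at random:

```python
from collections import Counter

def link_density_preference(adj):
    """Preference of a node: density of links among its neighbours
    (an assumed stand-in for the paper's link-density measure)."""
    pref = {}
    for v, nbrs in adj.items():
        n = len(nbrs)
        if n < 2:
            pref[v] = 0.0
            continue
        links = sum(1 for a in nbrs for b in nbrs if a < b and b in adj[a])
        pref[v] = links / (n * (n - 1) / 2)
    return pref

def label_propagation(adj, rounds=10):
    """Each node repeatedly adopts the label with the highest
    preference-weighted support among its neighbours."""
    pref = link_density_preference(adj)
    labels = {v: v for v in adj}
    for _ in range(rounds):
        for v in sorted(adj):
            score = Counter()
            for u in adj[v]:
                score[labels[u]] += 1 + pref[u]
            labels[v] = score.most_common(1)[0][0]
    return labels

# Two triangles joined by a single bridge edge (toy example).
adj = {1: {2, 3}, 2: {1, 3}, 3: {1, 2, 4}, 4: {3, 5, 6}, 5: {4, 6}, 6: {4, 5}}
labels = label_propagation(adj)
print(labels)
```

On this toy graph the two triangles end up with two distinct labels, illustrating how the preference weighting steers propagation toward dense neighbourhoods.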

Modelling and Reasoning for Business Intelligence


Searching for Patterns in Sequential Data: Functionality and Performance Assessment of Commercial and Open-Source Systems

Ubiquitous devices and applications generate data that are naturally ordered by time. Thus elementary data items can form sequences. The most popular way of analyzing sequences is searching for patterns. To this end, sequential pattern discovery techniques were proposed in some research contributions and implemented in a few database systems, e.g., Oracle Database, Teradata Aster, Apache Hive. The goal of this work is to assess the functionality of the systems and to evaluate their performance with respect to pattern queries.
Witold Andrzejewski, Bartosz Bębel, Szymon Kłosowski, Bartosz Łukaszewski, Robert Wrembel, Gastón Bakkalian
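Row-pattern queries of the kind these systems support (e.g. Oracle's MATCH_RECOGNIZE) classify rows into symbols and match a regular expression over the symbol sequence. A toy Python analogue over an invented price series, searching for a "V" shape (one or more falls followed by one or more rises), can convey the idea:

```python
import re

# Toy event stream: daily closing prices (hypothetical data).
prices = [10, 9, 8, 11, 12, 9, 10, 13]

# Classify each step as U(p) or D(own), mimicking the DEFINE clause
# of SQL row-pattern matching.
steps = "".join("U" if b > a else "D" for a, b in zip(prices, prices[1:]))

# PATTERN (D+ U+): one or more falls followed by one or more rises.
matches = [(m.start(), m.group()) for m in re.finditer(r"D+U+", steps)]
print(steps)
print(matches)
```

Database engines additionally handle partitioning, ordering and measures over matched rows; the performance differences between such implementations are what the paper evaluates.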

Analysis of Natural and Technogenic Safety of the Krasnoyarsk Region Based on Data Mining Techniques

This paper presents a comprehensive analysis of natural and technogenic safety indicators of the Krasnoyarsk region in order to explore geographical variations and patterns in the occurrence of emergencies, by applying multidimensional analysis techniques – principal component analysis and cluster analysis – to data from the Territory Safety Passports. For data modelling, two principal components are selected and interpreted, taking into account the contribution of the data attributes to the principal components. The data distribution over the principal components is analysed at different levels of territorial detail: municipal areas and settlements. Two- and three-cluster structures are constructed in the multidimensional data space, and the main cluster features are analyzed. The results of this analysis have allowed us to identify the high-risk municipal areas and to rank the territories according to their degree of danger of occurrence of natural and technogenic emergencies. This gives a basis for decision making, and makes it possible for authorities to allocate forces and means for territory protection more efficiently and to develop a system of measures to prevent and mitigate the consequences of emergencies in this large region.
Tatiana Penkova
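As a generic illustration of the first technique named in the abstract (not the paper's actual data), principal components can be computed via SVD on centred indicator values; the five "territories" and three "indicators" below are invented:

```python
import numpy as np

def principal_components(X, k=2):
    """Project rows of X onto the first k principal components and
    report the variance share explained by each."""
    Xc = X - X.mean(axis=0)                  # centre each indicator
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    explained = S**2 / np.sum(S**2)          # variance share per component
    return Xc @ Vt[:k].T, explained[:k]

# Hypothetical safety indicators for five territories (rows).
X = np.array([
    [1.0, 2.0, 0.5],
    [1.2, 2.1, 0.4],
    [5.0, 8.0, 3.0],
    [5.2, 7.9, 3.1],
    [3.0, 5.0, 1.5],
])
scores, explained = principal_components(X, k=2)
print(np.round(explained, 3))
```

Cluster analysis (the second technique) would then operate on the projected scores; interpreting the components, as the paper does, relies on inspecting the loadings in `Vt`.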

From Design to Visualization of Spatial OLAP Applications: A First Prototyping Methodology

The design of Spatial OLAP (SOLAP) applications consists of (i) Spatial Data Warehouse (SDW) model design and (ii) SOLAP visualization definition, because a specific set of understandable and readable cartographic visualizations corresponds to each particular type of SOLAP query. Unfortunately, few works investigate geovisualization issues in SOLAP systems or propose new methodologies to visualize spatio-temporal data, and none investigates tools for readable SOLAP cartographic displays. Moreover, some works propose ad-hoc methodologies for DWs and SDWs based exclusively on data and user analysis requirements. Therefore, we present in this paper (i) a new geovisualization methodology for SOLAP queries that yields readable maps and (ii) a new prototyping design methodology for SOLAP applications that accounts for geovisualization requirements.
Sandro Bimonte, Ali Hassan, Philippe Beaune

Conceptual Modeling in Requirements and Business Analysis


Bridging User Story Sets with the Use Case Model

User Stories (US) are mostly used as the basis for representing requirements in agile development. Written in a direct manner, US fail to produce a visual representation of the main system-to-be functions. A Use-Case Diagram (UCD), on the other hand, intends to provide such a view. Approaches that map US sets to a UCD have been proposed; however, they consider every US as a Use Case (UC). Nevertheless, a valid UC should not be an atomic task or a sub-process, but should instead enclose an entire scenario of the system’s use. A unified model of US templates for tagging US sets was previously built. Within functional elements, it notably distinguishes granularity levels. In this paper, we propose to transform specific elements of a US set into a UCD using the granularity information obtained through tagging. In practice, such a transformation involves continuous round-tripping between the US and UC views; a CASE tool supports this.
Yves Wautelet, Samedi Heng, Diana Hintea, Manuel Kolp, Stephan Poelmans

A Study on Tangible Participative Enterprise Modelling

Enterprise modelling (EM) is concerned with discovering, structuring and representing domain knowledge pertaining to different aspects of an organization. Participative EM, in particular, is a useful approach to eliciting this knowledge from domain experts with different backgrounds. In related work, tangible modelling – modelling with physical objects – has been identified as beneficial for group modelling.
This study investigates effects of introducing tangible modelling as part of participative enterprise modelling sessions. Our findings suggest that tangible modelling facilitates participation. While this can make reaching an agreement more time-consuming, the resulting models tend to be of higher quality than those created using a computer. Also, tangible models are easier to use and promote learnability. We discuss possible explanations of and generalizations from these observations.
Dan Ionita, Julia Kaidalova, Alexandr Vasenev, Roel Wieringa

Bridging the Requirements Engineering and Business Analysis Toward a Unified Knowledge Framework

Several similar but distinct disciplines have evolved in the arena of requirements engineering and business analysis, including business analysis, BPM (Business Process Management), and business architecture. Each discipline is forming its own body of knowledge, and each has its raison d’être. However, such diversity of the bodies of knowledge causes confusion. This article reviews the bodies of knowledge and proposes a unified knowledge framework spanning them across the disciplines.
Mikio Aoyama

Quality of Models and Models of Quality


An Exploratory Analysis on the Comprehension of 3D and 4D Ontology-Driven Conceptual Models

In this paper, we perform an exploratory analysis to investigate the impact of adopting a 3D or a 4D foundational ontology on the quality of a conceptual model. More specifically, we determine the impact of the metaphysical characteristics of an ontology on the comprehension and understandability of ontology-driven models by their users. The contributions of this research are: (1) while much effort in ontology-driven conceptual modeling (ODCM) has been devoted to the syntactic and semantic aspects of models for improving their overall quality, this research focuses on the pragmatic aspect of a model; and (2) since little empirical research has yet been performed in this area, we formulated several hypotheses derived from the results and observations of our exploratory analysis. These hypotheses can then serve as a testing ground for future empirical research investigating the fundamental differences between 3D and 4D ontology-driven models.
Michaël Verdonck, Frederik Gailly

Data Quality Problems When Integrating Genomic Information

Due to the complexity of genomic information and the vast amount of data produced every day, the genomic information accessible on the web has become very difficult to integrate, which hinders the research process. Using knowledge from the Data Quality field, and after a specific study of a set of genomic databases, we have found problems related to six Data Quality dimensions. The aim of this paper is to highlight the problems that bioinformaticians have to face when they integrate information from different genomic databases. The contribution of this paper is to identify and characterize those problems in order to understand which ones hinder the research process and increase the time researchers waste on this task.
Ana León, José Reyes, Verónica Burriel, Francisco Valverde

The Design of a Core Value Ontology Using Ontology Patterns

The creation of value is an important concern in organizations. However, current Enterprise Modeling languages all interpret value differently, which has a negative impact on the semantic quality of the model instantiations. This issue needs to be solved to increase the relevance of these instantiations for business stakeholders. Therefore, the goal of this paper is the development of a sound Core Value Ontology. To this end, we employ a pattern-based ontology engineering approach that builds on the Unified Foundational Ontology.
Frederik Gailly, Ben Roelens, Giancarlo Guizzardi

Conceptual Modelling Education


YASQLT – Yet Another SQL Tutor

A Pragmatic Approach
The paper describes an ongoing project to create an automated assessment tool that helps novice students learn SQL in the frame of an introductory database course. Unlike other tools of this kind, the project has chosen a pragmatic approach of focusing on catching common semantic errors, leaving syntax control to a professional DBMS. Using agile system development, the project has successfully completed two iterations, both of which were tested in practice with satisfactory results. The students appreciated the tool and would like to have similar tools for other subjects, including Relational Algebra and Conceptual Modeling; the latter is planned for implementation in the near future. The tool is considered appropriate for Learning by Failure in the context of large classes and short courses.
Ilia Bider, David Rogers
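The abstract does not specify how YASQLT detects semantic errors; one plausible mechanism, sketched here with an invented fixture schema, is to execute the student's query and an instructor's reference query against the same data and compare result sets, while delegating syntax checking to the DBMS as the abstract describes:

```python
import sqlite3

def same_result(student_sql, reference_sql, setup_sql):
    """Run both queries against the same fixture database and compare
    result sets (ignoring row order) as a semantic correctness check."""
    conn = sqlite3.connect(":memory:")
    conn.executescript(setup_sql)
    try:
        student = sorted(conn.execute(student_sql).fetchall())
    except sqlite3.Error as e:
        # Syntax and runtime errors are left to the DBMS to report.
        return False, f"DBMS error: {e}"
    reference = sorted(conn.execute(reference_sql).fetchall())
    ok = student == reference
    return ok, "results match" if ok else "results differ"

# Invented fixture schema and data.
setup = """
CREATE TABLE emp (id INTEGER, name TEXT, dept TEXT);
INSERT INTO emp VALUES (1, 'Ann', 'IT'), (2, 'Bob', 'HR'), (3, 'Cy', 'IT');
"""
ok, msg = same_result(
    "SELECT name FROM emp WHERE dept = 'IT'",
    "SELECT name FROM emp WHERE dept = 'IT' ORDER BY id",
    setup,
)
print(ok, msg)
```

A single fixture cannot prove equivalence, only refute it, so a real tool would run several fixtures chosen to expose common semantic mistakes (missing joins, wrong predicates, duplicate rows).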

Human Factors in the Adoption of Model-Driven Engineering: An Educator’s Perspective

This paper complements previous empirical studies on teaching Model-Driven Engineering (MDE) by reporting on the authors’ attempt at introducing MDE to undergraduate students. This is important because: (1) today’s students are tomorrow’s professionals, and industrial adoption also depends on the availability of trained professionals; and (2) observing problems in the introduction of MDE in the more controlled environment of a classroom setting allows us to identify additional adoption factors, more at the individual level, to be taken into account later in industrial settings. As we report herein, this attempt was largely unsuccessful. We analyze what went wrong, what we learned from the process, and the implications this has for future endeavors to introduce MDE in both educational and professional environments, particularly regarding the human and socio-technical factors to be considered.
Jordi Cabot, Dimitrios S. Kolovos

Learning Pros and Cons of Model-Driven Development in a Practical Teaching Experience

Current teaching guides for Software Engineering degrees focus mainly on teaching programming languages from the first courses onwards. Conceptual modeling is a topic that is only taught in later courses, such as master’s courses. At that point, many students do not see the usefulness of conceptual modeling, and most of them have difficulty reaching the level of abstraction needed to work with conceptual models. In order to make learning conceptual modeling more attractive, we have conducted an experience in which students compare a traditional development with a development using conceptual models through a Model-Driven Development (MDD) method. This way, students can check on their own the pros and cons of working with MDD in a practical environment. The comparison has been done in terms of Accuracy, Effort, Productivity and Satisfaction. The contribution of this paper is twofold: the description of the teaching methodology used throughout the whole course, and the presentation of results and discussion of the comparison between MDD and a traditional development method. Results show that Accuracy, Effort and Productivity are better with MDD when the problem to solve is not easy. These results are shown to students to promote a discussion in the classroom about the use of MDD. According to this discussion, the most difficult part of using MDD is its learnability, and the best part is the automatic code generation.
Óscar Pastor, Sergio España, Jose Ignacio Panach

Models and Modelling on Security and Privacy


Towards Provable Security of Dynamic Source Routing Protocol and Its Applications

Routing control, such as Internet routing, is one of the most popular topics that has drawn many researchers’ attention in recent years. However, to the best of our knowledge, few works deal with its provable security, where the security can be mathematically proven under some reasonable assumptions. Although provable security has been discussed in the area of cryptography, we consider that such analysis should also be applied to conventional network systems in order to guarantee the security of their specifications. In this work, we aim to construct such a provable-security framework, and in particular discuss the formalization of the dynamic source routing (DSR) protocol, which is a kernel protocol for sensor networks. Our formalization can easily be extended to secure routing protocols with cryptographic schemes such as digital signatures.
Naoto Yanai
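For readers unfamiliar with DSR, its route discovery floods a route request while each forwarding node appends itself to a route record, and the first request reaching the destination yields a complete source route. The paper's formalization is mathematical, not code; the following is only a toy, non-secure sketch of that discovery idea over an invented topology:

```python
from collections import deque

def dsr_route_discovery(adj, src, dst):
    """Simulate DSR route discovery: flood a route request, with each
    forwarding node appended to the route record; return the first
    complete route record that reaches dst, or None."""
    queue = deque([[src]])   # each entry is a route record so far
    seen = {src}
    while queue:
        route = queue.popleft()
        node = route[-1]
        if node == dst:
            return route
        for nbr in adj[node]:
            if nbr not in seen:
                seen.add(nbr)
                queue.append(route + [nbr])
    return None

# Hypothetical four-node topology.
adj = {"A": ["B", "C"], "B": ["A", "D"], "C": ["A", "D"], "D": ["B", "C"]}
print(dsr_route_discovery(adj, "A", "D"))
```

Real DSR delivers the route record back via a route reply and caches it at the source; securing exactly these messages (e.g. with signatures over the route record) is what the paper's framework aims to reason about.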

Tool Demonstrations


A Tool for Analyzing Variability Based on Functional Requirements and Testing Artifacts

Analyzing differences among software artifacts is beneficial in a variety of scenarios, such as feasibility study, configuration management, and software product line engineering. Currently variability analysis is mainly done based on artifacts developed in a certain development phase (most notably, requirements engineering). We will demonstrate a tool that utilizes both functional requirements and test cases in order to analyze variability more comprehensively. The tool implements the ideas of SOVA R-TC method.
Michal Steinberger, Iris Reinhartz-Berger, Amir Tomer

