2016 | Book

Advanced Information Systems Engineering Workshops

CAiSE 2016 International Workshops, Ljubljana, Slovenia, June 13-17, 2016, Proceedings


About this book

This book constitutes the thoroughly refereed proceedings of five international workshops held in Ljubljana, Slovenia, in conjunction with the 28th International Conference on Advanced Information Systems Engineering, CAiSE 2016, in June 2016.

The 16 full and 9 short papers were carefully selected from 51 submissions.

The associated workshops were the Third International Workshop on Advances in Services DEsign based on the Notion of CApability (ASDENCA) co-arranged with the First International Workshop on Business Model Dynamics and Information Systems Engineering (BumDISE), the Fourth International Workshop on Cognitive Aspects of Information Systems Engineering (COGNISE), the First International Workshop on Energy-awareness and Big Data Management in Information Systems (EnBIS), the Second International Workshop on Enterprise Modeling (EM), and the Sixth International Workshop on Information Systems Security Engineering (WISSE).

Table of Contents

Frontmatter
Erratum to: File Type Identification for Digital Forensics
Konstantinos Karampidis, Giorgios Papadourakis

ASDENCA/BUMDISE 2016 – Capability-Based Development Methods

Frontmatter
Selection and Evolutionary Development of Software-Service Bundles: A Capability Based Method
Abstract
Software-service bundles are combinations of software products and services offered by their vendors to clients. The clients select the combination of software product and associated services best suited to their specific circumstances. The paper proposes an information-sharing-based method that helps clients select the most appropriate combination or configuration and also supports continuous improvement of the solution in response to changing circumstances. The method utilizes principles of Capability Driven Development to characterize performance objectives and contextual factors affecting delivery of a software-service bundle. Application of the method is demonstrated using an illustrative example of data processing.
Jānis Grabis, Kurt Sandkuhl
LightCDD: A Lightweight Capability-Driven Development Method for Start-Ups
Abstract
Novice innovators and entrepreneurs face the risk of designing naive business models. In fact, a lack of realism and failure to envision contextual constraints is one of the main threats to start-up success. Both the literature and the responses we gathered from experts in incubation confirm this problem. Capability Driven Development (CDD) is an integrated approach consisting of a method, tools, and best practices. It has proved successful when applied to mature enterprises that intend to become context-aware and adaptive. In this paper we report on the application of CDD to two start-up projects: although it was useful in making the entrepreneurs aware of dynamic business environments and constraints, a trade-off analysis showed that a simpler version of the method was necessary. We therefore present LightCDD, a context-aware enterprise modelling method tailored for business model generation. It reduces the set of modelling constructs and guidelines to facilitate its adoption by entrepreneurs, while keeping the method expressive enough for their purposes and, at the same time, compatible with the CDD methodology. We also discuss the implications of this simplification for the CDD tool environment.
Hasan Koç, Marcela Ruiz, Sergio España

ASDENCA/BUMDISE 2016 – Integration of Capability with Goals and Context

Frontmatter
Comparison of Tool Support for Goal Modelling in Capability Management
Abstract
Capability management is attracting more and more attention in different research and industry areas. A starting point for implementing capability management in an organization is often to identify desired and existing capabilities in relation to the organization's goals and to specify those capabilities. Due to the close relationship to organizational goals, tool support for capability modelling should also include capturing goal models or even complete enterprise models. The aim of the paper is to contribute to an understanding of what kind of tool support is suitable for capturing enterprise goals in capability management. The focus of this paper is on goal modelling as part of capability modelling tools; it compares modelling without computer support, modelling with tools that enforce a meta-model, and modelling with the support of a drawing tool. The comparison is based on experiments.
Claas Fastnacht, Hasan Koç, Dimitrijs Nesterenko, Kurt Sandkuhl
Extending Capabilities with Context Awareness
Abstract
Organizations need to continuously adjust their capabilities to changes in the business context. If existing IT systems and associated development methods do not support this adjustment, they need to be changed to do so. However, there exist specialized methods and tools that allow the design and run-time monitoring of context information. In this paper an approach is presented that allows existing systems to be extended with the management of context information. The approach allows organizations to analyze the potential effect and effort of combining existing systems and tools with specialized tools that handle context information. The purpose of providing means for the integrated use of existing systems and specialized tools is to leverage the strengths of both. The approach is grounded in and illustrated by a case of industrial symbiosis.
Martin Henkel, Christina Stratigaki, Janis Stirna, Pericles Loucopoulos, Yannis Zorgios, Antonis Migiakis
Design of Capability Delivery Adjustments
Abstract
Capabilities are designed to ensure that business services can be delivered to satisfy business performance objectives in different circumstances. Run-time adjustments are used to adapt capability delivery to these specific circumstances. The paper elaborates the concept of capability delivery adjustments on the basis of the capability meta-model proposed as part of the Capability Driven Development approach. The types of adjustments are identified and their specifications are provided. An example of adjustment modeling is developed.
Jānis Grabis, Jānis Kampars
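To make the idea of a run-time adjustment concrete, the sketch below shows, in plain Python, how a delivery parameter of a capability might be adapted to a measured context value. The context element, thresholds and batch-size parameter are illustrative assumptions, not constructs taken from the paper's meta-model.

from dataclasses import dataclass

# Hypothetical context element; the actual CDD meta-model defines richer
# constructs (capability, context set, adjustment, KPI).
@dataclass
class ContextElement:
    name: str
    value: float

def delivery_adjustment(load: ContextElement, batch_size: int) -> int:
    """Illustrative run-time adjustment: scale the batch size of a data
    processing capability according to the observed load."""
    if load.value > 0.8:          # high load measured at run time
        return max(1, batch_size // 2)
    if load.value < 0.2:          # spare capacity available
        return batch_size * 2
    return batch_size

if __name__ == "__main__":
    current_load = ContextElement("cpu_utilisation", 0.9)
    print(delivery_adjustment(current_load, batch_size=100))  # -> 50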

ASDENCA/BUMDISE 2016 – Business Modeling

Frontmatter
The Fast Fashion Business Model
Abstract
Fashion has a deep impact on the apparel industry and is used as a lever to gain a position of competitive advantage. The original idea of the fast fashion firm consists in offering fashionable products at cheap or very cheap prices. Fast fashion players try to capture trends rather than impose them and, at the same time, to reduce the risk of betting on the future by staying close to the “proof of the market” and remaining flexible enough to jump in as soon as signals become stronger and more reliable. They therefore avoid, or at least strongly reduce, the market risk of unmatched demand.
Gianni Lorenzoni
Cycles of Organizational Renewal: The Interplay of Strategy and Innovation at Bang & Olufsen
Abstract
Bang & Olufsen, a Danish high-end producer of consumer electronics, has experienced both failure and success in the process of strategic renewal, challenged by blurring industry boundaries and the rise of the digital technology paradigm. A study of the last decade shows how the company tried to renew itself by improving its new product development process from three angles: strategy, innovation and organizational design. Using ethnographic and grounded theory methods with a micro-foundational perspective, we illustrate how the dynamics of the renewal process accrue into a longitudinal pattern in which the key drivers of change cycle between strategy and innovation.
Giacomo Cattaneo, Fredrik Hacklin
Information Systems for Innovation: A Comparative Analysis of Maturity Models’ Characteristics
Abstract
Nowadays, virtually all industries are impacted by the digitalization of business enabled by information and communication technologies. Consequently, it is a major challenge for any business to increase its ability to innovate through information systems. However, the effort and investments of companies vary widely, and they do not have the same level of maturity with respect to their innovation strategy. While some highly mature organizations use effective approaches, others still act as novices or use inadequate practices. The question raised in this paper is how to evaluate the maturity of an organization with respect to information-systems-based innovation; a related question concerns the identification of the salient features of ICT-centred innovation maturity models. Taking these issues into account, the paper makes the following contributions: (i) a review of sixteen innovation maturity models collected from the research and practitioner communities, gathering facts about the models and their effectiveness; (ii) a comparative analysis of these models.
Abdelkader Achi, Camille Salinesi, Gianluigi Viscusi

COGNISE 2016

Frontmatter
Learning from Errors as a Pedagogic Approach for Reaching a Higher Conceptual Level in Database Modeling
Abstract
We apply a pedagogic approach named learning from errors (LFE) to the area of relational database modeling. Database modeling is a complex cognitive process characterized by a high level of element interactivity. Finding an appropriate pedagogy for teaching database modeling is a challenge for information systems educators. One of the challenges practitioners meet is the need to help database students shift between different levels of abstraction. We metaphorically treat the LFE approach as a bridge over the gulf of abstraction levels. Errors have a powerful potential in education to encourage students to process the course material at a deeper level. We use Rasmussen's three-level model of human performance to explain and demonstrate the promising potential of the LFE approach in database modeling.
Adi Katz, Ronit Shmallo
‘Mathematical’ Does Not Mean ‘Boring’: Integrating Software Assignments to Enhance Learning of Logico-Mathematical Concepts
Abstract
Insufficient mathematical skills of practitioners are hypothesized as one of the main hindering factors for the adoption of formal methods in industry. This problem is directly related to negative attitudes of future computing professionals to core mathematical disciplines, which are perceived as difficult, boring and not relevant to their future daily practices. This paper is a contribution to the ongoing debate on how to make courses in Logic and Formal Methods both relevant and engaging for future software practitioners. We propose to increase engagement and enhance learning by integrating ‘hands-on’ software engineering assignments based on cross-fertilization between software engineering and logic. As an example, we report on a pilot assignment given at a Logic and Formal Methods course for Information Systems students at the University of Haifa. We describe the design of the assignment, students’ feedback and discuss some lessons learnt from the pilot.
Anna Zamansky, Yoni Zohar
User Involvement in Applications of the PoN
Abstract
In a previous paper [12] we argued for more user-centric analysis of the visual notation quality of modeling languages. Here we present initial findings from a systematic literature review on the use of the Physics of Notations (PoN) to further that argument. Concretely, we show that while the PoN is widely applied, these applications rarely involve the intended users of a visual notation actively, either in setting the requirements for the notation or in evaluating its improvement or design according to the PoN's criteria. We discuss the potential reasons for this lack of user involvement, and what can be gained from increasing it.
Dirk van der Linden, Irit Hadar
A Visual Logical Language for System Modelling in Combinatorial Test Design
Abstract
This position paper addresses some weaknesses of the standard logical languages used for specification of system models in combinatorial test design. To overcome these weaknesses, we propose a new logical language which uses visual elements with the aim to lower the cognitive load of the modeller and thereby reduce the risk of modelling errors.
Maria Spichkova, Anna Zamansky, Eitan Farchi
Peel the Onion: Use of Collaborative and Gamified Tools to Enhance Software Engineering Education
Abstract
As software engineering and information systems projects become more and more collaborative in nature, project-based courses are an integral part of information systems (IS) and software engineering (SE) curricula. Several challenges stem from the nature of these courses, the most significant being equal participation of all students and the creation of projects of high quality and utility. Several mechanisms, such as gamification and collaborative tools, have been helpful in dealing with these challenges. This paper presents a teaching case in which several tools were used, based on the onion model for open source systems and on several motivation theories.
Naomi Unkelos-Shpigel

ENBIS 2016

Frontmatter
Energy Enhancement of Multi-application Monitoring Systems for Smart Buildings
Abstract
High energy consumption of sensor devices is a major problem in smart building systems, since it strongly impacts the system lifetime. However, existing approaches are often fitted to a single monitoring application and rely on static configurations of sensor devices: optimization of their acquisition and transmission frequencies against the actual requirements of multiple applications is not tackled. In this paper, we focus on energy-aware dynamic sensor device re-configuration to lower energy consumption while fulfilling real-time application requirements. We introduce Smart-Service Stream-oriented Sensor Management (3SoSM), an approach that binds together sensor configuration and management of sensor data streams. We present a multi-application monitoring system architecture that optimizes sensor device configurations to fulfil application requirements for data streams, and we report on experiments conducted on our experimental platform.
Ozgun Pinarer, Yann Gripay, Sylvie Servigne, Atay Ozgovde
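As a rough illustration of the kind of requirement merging such a system performs, the following sketch derives a per-sensor acquisition period from the requirements of several applications. The application names, sensor identifiers and period values are hypothetical; 3SoSM itself derives configurations from declarative data-stream requirements.

import math
from typing import Dict, List

def merge_acquisition_periods(requirements: List[Dict[str, float]]) -> Dict[str, float]:
    """For every sensor, keep the most demanding (smallest) acquisition period
    requested by any application, so all real-time requirements are still met."""
    merged: Dict[str, float] = {}
    for app in requirements:
        for sensor, period in app.items():
            merged[sensor] = min(merged.get(sensor, math.inf), period)
    return merged

if __name__ == "__main__":
    hvac_monitor = {"temp_42": 60.0, "co2_17": 120.0}   # periods in seconds
    comfort_app = {"temp_42": 300.0, "lux_03": 30.0}
    print(merge_acquisition_periods([hvac_monitor, comfort_app]))
    # {'temp_42': 60.0, 'co2_17': 120.0, 'lux_03': 30.0}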
Micro-accounting for Optimizing and Saving Energy in Smart Buildings
Abstract
Energy management, and in particular its optimization, is one of today's hot trends, both at the enterprise level (optimization of whole corporate/government buildings) and in single citizens' homes. The current trend is to provide knowledge about micro(scopic) energy consumption. This allows not only saving energy, but also optimizing the use of different energy sources (e.g., solar vs. traditional) in the case of a mixed architecture. In this work, after briefly introducing our platform for smart environments, which is able to micro-account the energy consumption of devices, we present two case studies of its utilization: energy saving in offices and smart switching among different energy sources.
Daniele Sora, Massimo Mecella, Francesco Leotta, Leonardo Querzoni, Roberto Baldoni, Giuseppe Bracone, Daniele Buonanno, Mario Caruso, Adriano Cerocchi, Mariano Leva
Modeling CO2 Emissions to Reduce the Environmental Impact of Cloud Applications
Abstract
Cloud computing has important impacts on the environment: data centers – the physical infrastructure where cloud resources run – are not always designed as green entities, as they consume large amounts of energy (in the form of electric power or fuel), often producing significant amounts of CO2 emissions. Such emissions depend on the energy sources used by the data centers and may vary over time and with the location in which the data center is operating. To decrease the carbon footprint of cloud computing, the selection of the site where to deploy an application, along with the decision of when to start its execution, should be based not only on the satisfaction of traditional QoS requirements but also on energy-related constraints and their dynamics over time.
The goal of this paper is to propose a CO2 emission model able to support emission forecasting, especially for data centers that draw electricity from the national grid. The proposed emission model can be used to improve decisions on where and when to deploy applications on data centers in order to minimize CO2 emissions.
Cinzia Cappiello, Paco Melià, Pierluigi Plebani
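The core intuition can be sketched in a few lines of Python: emissions are the product of the energy a job consumes and a site- and time-dependent grid emission factor, so deployment should pick the site and start time with the lowest product. The sites and emission factors below are made-up numbers, not data or formulas from the paper.

from typing import Dict, List, Tuple

def best_deployment(energy_kwh: float,
                    forecast: Dict[str, List[float]]) -> Tuple[str, int, float]:
    """Return (site, start_hour, kgCO2) minimizing emissions for a job that
    consumes energy_kwh and can start in any forecast hour."""
    best = None
    for site, factors in forecast.items():
        for hour, factor in enumerate(factors):   # factor = kgCO2 per kWh
            emissions = energy_kwh * factor
            if best is None or emissions < best[2]:
                best = (site, hour, emissions)
    return best

if __name__ == "__main__":
    forecast = {
        "dc-north": [0.12, 0.10, 0.18],   # hourly grid emission factors
        "dc-south": [0.30, 0.25, 0.09],
    }
    print(best_deployment(energy_kwh=5.0, forecast=forecast))
    # ('dc-south', 2, 0.45)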

EM 2016

Frontmatter
Modeling and Enacting Enterprise Decisions
Abstract
For years, the capturing of business decisions in enterprise models has not been treated as a separate concern. Rather, decisions were included in business process models or in knowledge models and ontologies. This leaves the overall view of a decision and its interplay with other decisions and data requirements dispersed and hard to maintain.
The recently introduced OMG Decision Model and Notation (DMN) standard deals with decisions as a separate concern and presents decision modeling as a sovereign part of enterprise modeling. Decisions are modeled at the logical level, and the model is executable.
This work links the logical decision model to various execution strategies and processes by formalizing decisions within a business context, and by examining execution mechanisms for different levels of availability of input data. These strategies can determine the preferred business process for handling the decision, or allow a decision model to be executed even when not all inputs are available up front.
Laurent Janssens, Johannes De Smedt, Jan Vanthienen
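As a minimal illustration of executable, logic-level decision models, the sketch below evaluates a small decision table under a unique hit policy in plain Python. The discount rules are a hypothetical example and the evaluation strategy is simplified; it is not the DMN execution machinery discussed in the paper.

from typing import Callable, List, Optional, Tuple

Rule = Tuple[Callable[[dict], bool], str]   # (condition, outcome)

def evaluate_unique(rules: List[Rule], inputs: dict) -> Optional[str]:
    """Unique hit policy: at most one rule may match the given inputs."""
    matches = [outcome for condition, outcome in rules if condition(inputs)]
    if len(matches) > 1:
        raise ValueError("Unique hit policy violated: %s" % matches)
    return matches[0] if matches else None

discount_rules: List[Rule] = [
    (lambda d: d["customer"] == "gold" and d["order"] >= 1000, "15%"),
    (lambda d: d["customer"] == "gold" and d["order"] < 1000, "10%"),
    (lambda d: d["customer"] == "regular", "0%"),
]

if __name__ == "__main__":
    print(evaluate_unique(discount_rules, {"customer": "gold", "order": 1200}))  # 15%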
A Modelling Environment for Business Process as a Service
Abstract
Business processes can benefit from cloud offerings, but bridging the gap between business requirements and technical solutions is still a big challenge. We propose Business Process as a Service (BPaaS) as a main concept for the alignment of business processes with IT in the cloud. The mechanisms described in this paper provide modelling facilities for both the business and IT levels: (a) a graphical modelling environment for processes, workflows and service requirements, (b) an extension of an enterprise ontology with cloud-specific concepts, (c) semantic lifting of graphical models and (d) SPARQL querying and inferencing for semantic alignment of business and cloud IT.
Knut Hinkelmann, Kyriakos Kritikos, Sabrina Kurjakovic, Benjamin Lammel, Robert Woitsch
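The following sketch, using the rdflib Python library, illustrates the kind of SPARQL querying over a lifted model that mechanism (d) relies on. The ontology terms (ex:BusinessProcess, ex:requiresService, ex:deployedOn) are hypothetical placeholders, not the actual BPaaS enterprise ontology.

from rdflib import Graph, Namespace, RDF

EX = Namespace("http://example.org/bpaas#")

# A tiny "lifted" model: one business process and the cloud service it needs.
g = Graph()
g.add((EX.InvoiceProcess, RDF.type, EX.BusinessProcess))
g.add((EX.InvoiceProcess, EX.requiresService, EX.PdfRendering))
g.add((EX.PdfRendering, RDF.type, EX.CloudService))
g.add((EX.PdfRendering, EX.deployedOn, EX.EuWestRegion))

query = """
PREFIX ex: <http://example.org/bpaas#>
SELECT ?process ?region WHERE {
    ?process a ex:BusinessProcess ;
             ex:requiresService ?service .
    ?service ex:deployedOn ?region .
}
"""

for process, region in g.query(query):
    print(process, "->", region)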
Understanding Production Chain Business Process Using Process Mining: A Case Study in the Manufacturing Scenario
Abstract
Due to continuous market change, enterprises need to react fast. To do that, a better understanding of the way they work is needed. Indeed, this was a real need of a manufacturing enterprise that produces coffee machines and sells them all over the world. In this paper, we present the experience gained in applying process mining techniques to a rich set of data that the enterprise collected over the last six years. We compare five mining algorithms: the α-algorithm, Heuristics Miner, Integer Linear Programming Miner, Inductive Miner and Evolutionary Tree Miner. We evaluate the algorithms according to specific quality criteria: fitness, precision, generalization and simplicity. Even though comparison studies are already available in the literature, we verify their findings in our working context. We conclude that the Inductive Miner algorithm is especially suited for discovering production chain processes in the context under study. The application of process mining gives the enterprise a comprehensive picture of its internal process organization. The resulting models were used by the company, with successful results, to motivate the discussion on the need to develop a flexible production chain.
Alessandro Bettacchi, Alberto Polzonetti, Barbara Re
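A minimal sketch of such a comparison, using the pm4py library's simplified interface, is shown below: a model is discovered with the Inductive Miner and scored for fitness and precision. The log file name is a placeholder and the exact API may differ between pm4py versions; the paper's own tooling and evaluation setup are not reproduced here.

import pm4py

# Event log exported from the production systems (placeholder file name).
log = pm4py.read_xes("production_chain.xes")

# Discover a Petri net with the Inductive Miner, the best-performing
# algorithm in the paper's context.
net, initial_marking, final_marking = pm4py.discover_petri_net_inductive(log)

# Score the discovered model on two of the four quality criteria.
fitness = pm4py.fitness_token_based_replay(log, net, initial_marking, final_marking)
precision = pm4py.precision_token_based_replay(log, net, initial_marking, final_marking)

print("log fitness:", fitness["log_fitness"])
print("precision:", precision)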

WISSE 2016

Frontmatter
Software Vulnerability Life Cycles and the Age of Software Products: An Empirical Assertion with Operating System Products
Abstract
This empirical paper examines whether the age of software products can explain the turnaround between the release of security advisories and the publication of vulnerability information. Building on the theoretical rationale of vulnerability life cycle modeling, this assertion is examined with an empirical sample that covers operating system releases from Microsoft and two Linux vendors. Estimation is carried out with a linear regression model. The results indicate that the age of the observed Microsoft products does not affect the turnaround times, and only feeble statistical relationships are present for the examined Linux releases. With this negative result, the paper contributes to vulnerability life cycle modeling research by presenting and rejecting one theoretically motivated and previously unexplored question. The rejection is also a positive result: there is no reason for users to fear that turnaround times will significantly lengthen as operating system releases age.
Jukka Ruohonen, Sami Hyrynsalmi, Ville Leppänen
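The estimation setup can be sketched as an ordinary least squares regression of turnaround time on product age. The data below is synthetic and generated so that age has no effect, merely to illustrate the kind of model fitted; the paper uses real advisories from Microsoft and two Linux vendors.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
product_age_days = rng.uniform(0, 2000, n)                 # age of the release at disclosure
turnaround_days = 30 + rng.normal(0, 10, n)                # synthetic: no real age effect

X = sm.add_constant(product_age_days)                      # intercept + age
model = sm.OLS(turnaround_days, X).fit()
print(model.summary())                                     # age coefficient ~ 0, mirroring a negative result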
Apparatus: Reasoning About Security Requirements in the Internet of Things
Abstract
The Internet of Things (IoT) can be seen as the main driver towards an era of ubiquitous computing. Given the scale of the IoT, the number of security issues that emerge is unprecedented; therefore, the need for new methodologies for reasoning about security in IoT systems is crucial, and this is recognised by academia and industry alike. In this work we present Apparatus, a conceptual model for reasoning about security in IoT systems through the lens of Security Requirements Engineering. Apparatus is architecture-oriented and describes an IoT system as a cluster of nodes that share network connections. The information of the system is documented in a textual manner, using the JavaScript Object Notation (JSON) format, in order to elicit security requirements. To demonstrate its usage, the security requirements of a temperature monitoring system are identified, and a first application of Apparatus is exhibited.
Orestis Mavropoulos, Haralambos Mouratidis, Andrew Fish, Emmanouil Panaousis, Christos Kalloniatis
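The sketch below illustrates, in Python, how an IoT system might be documented as nodes and network connections in JSON in the spirit of Apparatus. The field names and the temperature-monitor content are assumptions for illustration; the actual Apparatus notation defines its own schema.

import json

temperature_monitor = {
    "nodes": [
        {"id": "sensor-1", "type": "temperature_sensor", "layer": "device"},
        {"id": "gateway-1", "type": "gateway", "layer": "edge"},
        {"id": "cloud-api", "type": "web_service", "layer": "cloud"},
    ],
    "connections": [
        {"from": "sensor-1", "to": "gateway-1", "protocol": "zigbee"},
        {"from": "gateway-1", "to": "cloud-api", "protocol": "https"},
    ],
    "security_requirements": [
        {"node": "gateway-1", "requirement": "integrity of forwarded readings"},
        {"connection": ["gateway-1", "cloud-api"], "requirement": "confidentiality in transit"},
    ],
}

print(json.dumps(temperature_monitor, indent=2))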
Associating the Severity of Vulnerabilities with their Description
Abstract
Software vulnerabilities constitute a major problem for today's world, which relies more than ever on technological achievements. Characterizing the severity of vulnerabilities is of major importance for addressing them and for studying their impact on information systems extensively. That is why scoring systems have been developed for ranking the severity of vulnerabilities. However, the severity scores are based on technical information and are calculated by combining experts' assessments. The motivation for the study conducted in this paper was the question of whether the severity of vulnerabilities is directly related to their description. Hence, the associations of severity scores and individual characteristics with the terms of vulnerability descriptions were studied using text mining, principal component analysis and correlation analysis, applied to all vulnerabilities registered in the National Vulnerability Database. The results are promising for determining severity from the description, since significant correlations were found.
Dimitrios Toloudis, Georgios Spanos, Lefteris Angelis
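A toy version of this pipeline can be sketched with scikit-learn: vectorise descriptions, reduce them with a principal-components-style decomposition, and correlate the components with the severity score. The descriptions and scores below are invented; the paper analyses the full National Vulnerability Database.

import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

descriptions = [
    "buffer overflow allows remote code execution",
    "cross-site scripting in login form",
    "sql injection allows remote attackers to read database",
    "information disclosure via verbose error message",
]
severity = np.array([9.8, 6.1, 8.6, 5.3])                # illustrative severity scores

tfidf = TfidfVectorizer().fit_transform(descriptions)     # term-document matrix
components = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)

for i in range(components.shape[1]):
    r = np.corrcoef(components[:, i], severity)[0, 1]
    print(f"component {i}: correlation with severity = {r:.2f}")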
Discovering Potential Interaction Violations among Requirements
Abstract
To fully embrace the challenge of developing more robust software, potential defects must be considered at the earliest stages of software development. Studies have shown that this reduces the time, cost and effort required to integrate corrective features into software during development. In this paper we describe a technique for uncovering potential software vulnerabilities through an analysis of software requirements and illustrate its use with small, motivating examples.
Curtis Busby-Earle, Robert B. France
Extending HARM to make Test Cases for Penetration Testing
Abstract
[Context and motivation] Penetration testing is one key technique for discovering vulnerabilities, so that software can be made more secure. [Question/problem] Alignment between modeling techniques used earlier in a project and the development of penetration tests could enable a more systematic approach to such testing, and in some cases also enable creativity. [Principal ideas/results] This paper proposes an extension of HARM (Hacker Attack Representation Method) to achieve a systematic approach to penetration test development. [Contributions] The paper gives an outline of the approach, illustrated by an e-exam case study.
Aparna Vegendla, Thea Marie Søgaard, Guttorm Sindre
File Type Identification for Digital Forensics
Abstract
In the modern world, the use of digital devices (computers, tablets, smartphones, etc.) for leisure or professional reasons is growing quickly. Nevertheless, criminals try to fool authorities and hide evidence in a computer or any other digital device by changing the file type. File type detection is a very demanding task for a digital forensic examiner. In this paper a new methodology is proposed, from a digital forensics perspective, to identify altered file types with high accuracy by employing computational intelligence techniques. The proposed methodology is applied to the most common image file types (jpg, png and gif). A three-stage process involving feature extraction (Byte Frequency Distribution), feature selection (genetic algorithm) and classification (neural network) is proposed. Experiments were conducted on files altered from a digital forensics perspective, and the results are presented. The proposed model shows exceptionally high accuracy in file type identification.
Konstantinos Karampidis, Giorgios Papadourakis
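A minimal sketch of the described pipeline is given below: a Byte Frequency Distribution is extracted from each file and fed to a neural network classifier. The genetic-algorithm feature-selection stage is omitted, the file paths are placeholders, and the code is an illustration rather than the authors' implementation.

import numpy as np
from sklearn.neural_network import MLPClassifier

def byte_frequency_distribution(path: str) -> np.ndarray:
    """Return the normalised frequency of each of the 256 byte values."""
    data = np.fromfile(path, dtype=np.uint8)
    counts = np.bincount(data, minlength=256).astype(float)
    return counts / max(len(data), 1)

# Placeholder training data: file paths and their true (original) types.
training_files = [("a.jpg", "jpg"), ("b.png", "png"), ("c.gif", "gif")]

X = np.array([byte_frequency_distribution(p) for p, _ in training_files])
y = [label for _, label in training_files]

clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
clf.fit(X, y)

# A file whose extension was changed still carries the BFD of its real type.
print(clf.predict([byte_frequency_distribution("suspicious.pdf")]))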
Backmatter
Metadata
Title
Advanced Information Systems Engineering Workshops
Editors
John Krogstie
Haralambos Mouratidis
Jianwen Su
Copyright Year
2016
Electronic ISBN
978-3-319-39564-7
Print ISBN
978-3-319-39563-0
DOI
https://doi.org/10.1007/978-3-319-39564-7
