2023 | Book

Digital Transformation

Core Technologies and Emerging Topics from a Computer Science Perspective

Editors: Birgit Vogel-Heuser, Manuel Wimmer

Publisher: Springer Berlin Heidelberg


About this book

Digital Transformation in Industry 4.0/5.0 requires the effective and efficient application of digitalization technologies in the area of production systems. This book elaborates on concepts, techniques, and technologies from computer science in the context of Industry 4.0/5.0 and demonstrates their possible applications. The book thus serves not only as an orientation guide but also as a reference work for experts in the field of Industry 4.0/5.0 seeking to successfully advance digitalization in their companies.

Table of Contents


Digital Representation

Engineering Digital Twins and Digital Shadows as Key Enablers for Industry 4.0
Industry 4.0 opens up new potentials for the automation and improvement of production processes, but the associated digitization also increases the complexity of this development. Monitoring and maintenance activities in production processes still require high manual effort and are only partially automated due to immature data aggregation and analysis, resulting in expensive downtimes, inefficient use of machines, and excessive production of waste. To maintain control over the growing complexity and to provide insight into production, concepts such as Digital Twins, Digital Shadows, and model-based systems engineering for Industry 4.0 emerge. Digital Shadows consist of data traces of an observed Cyber-Physical Production System. Digital Twins operate on Digital Shadows to enable novel analysis, monitoring, and optimization. We present a general overview of the concepts of Digital Twins and Digital Shadows, their usage and realization in Data Lakes, their development based on engineering models, and corresponding engineering challenges. This provides a foundation for implementing Digital Twins, which constitute a main driver for future innovations in Industry 4.0 digitization.
Stefan Braun, Manuela Dalibor, Nico Jansen, Matthias Jarke, István Koren, Christoph Quix, Bernhard Rumpe, Manuel Wimmer, Andreas Wortmann
Designing Strongly-decoupled Industry 4.0 Applications Across the Stack: A Use Case
Loose coupling of system components on all levels of automated production systems enables vital systems-of-systems properties such as simplified composition, variability, testing, reuse, maintenance, and adaptation. All of these are crucial aspects needed to realize highly flexible and adaptable production systems. Based on traditional software architecture concepts, we describe in this chapter a use case of how message-based communication and appropriate architectural styles can help to realize these properties. A key building block are capabilities, which describe what production participants (machines, robots, humans, logistics) are able to do. Capabilities are applied at all levels in our use case: from describing the production process, machines, and transport logistics down to the capabilities of the various functional units within a machine or robot. Based on this use case, this chapter aims to show how such a system can be designed to achieve loose coupling and what example technologies and methodologies can be applied at the different levels.
Christoph Mayr-Dorn, Alois Zoitl, Georg Weichhart, Michael Mayrhofer, Alexander Egyed
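The message-based, capability-centric style described above can be sketched in a few lines. Everything here is an illustrative assumption: the in-process Bus class stands in for a real message broker, and the capability names are invented, not taken from the chapter's use case:

```python
# Minimal sketch of capability-based, message-driven dispatch.
# All names (Bus, "drill", "transport") are illustrative.

class Bus:
    """In-process stand-in for a message broker (e.g. MQTT or AMQP)."""
    def __init__(self):
        self.handlers = {}          # capability name -> list of handlers

    def advertise(self, capability, handler):
        # A participant announces what it is able to do.
        self.handlers.setdefault(capability, []).append(handler)

    def request(self, capability, payload):
        # The requester names only the capability, not a concrete machine;
        # this is what keeps producer and consumer loosely coupled.
        for handler in self.handlers.get(capability, []):
            return handler(payload)
        raise LookupError(f"no participant offers {capability!r}")

bus = Bus()
bus.advertise("drill", lambda p: f"drilled hole of {p['diameter']} mm")
bus.advertise("transport", lambda p: f"moved part to {p['target']}")

result = bus.request("drill", {"diameter": 6})
```

Swapping one drilling machine for another then only requires advertising the same capability again; no requester needs to change.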
Variability in Products and Production
Products and production are inherently variable. That is, the products themselves often need to be variable—as in a car plant producing many similar, albeit not identical cars. Such flexibility allows a product to be more easily customizable. We speak of variable products. At the same time, production systems typically need to be flexible in supporting the production of different products. Such flexibility allows for a broader use of production systems, supports lower production volumes while remaining economical, or optimizes production resources to avoid delays. We speak of variable production. This chapter explores variability in products and variability during production, where product variability needs to be understood together with its implications on production. Special considerations include products that are subsequently used during production and the issue of hardware/software variability, which is mostly handled separately today. We provide examples from an injection molding machine and also discuss open research challenges.
Alexander Egyed, Paul Grünbacher, Lukas Linsbauer, Herbert Prähofer, Ina Schaefer

Digital Infrastructures

Reference Architectures for Closing the IT/OT Gap
The Internet of Things (IoT) is an allegory for the concept of seamlessly connecting intelligent devices. Its application in the industrial domain envisions a next-generation manufacturing industry. Initiatives such as Industry 4.0 promise higher flexibility as well as improved quality and productivity. Nonetheless, these enhancements increase the complexity of a factory and its organisation, as they require seamless collaboration between all involved units, technological systems and individuals. One way of coping with this additional complexity is by utilising Architectural Reference Models (ARMs). State-of-the-art architectures combine different perspectives with a standard model to accommodate design choices, remove knowledge barriers and link the physical and virtual realm. This chapter introduces the basic concepts behind architectural designs and points out historical connections and differences between current ARMs. Moreover, it addresses the need to converge the historically separated Information Technology (IT) and Operational Technology (OT) and exemplifies in a use case how ARMs can assist in closing the gap. Finally, the chapter serves as a foundation for the following chapters, introducing architectural concepts such as cloud, fog and edge computing.
Patrick Denzler, Wolfgang Kastner
Edge Computing: Use Cases and Research Challenges
The continuous increase in connected devices and the rise of new emergent applications demanding fast response times, stronger privacy, and security push the horizon towards a new industrial revolution. As a result, the drive to optimize production and product transactions creates a pressing need for new concepts such as Industry 4.0. Combining traditional manufacturing and industrial practices with increasingly large-scale machine-to-machine and Internet of Things deployments provides manufacturers and consumers with better communication and monitoring, along with new levels of analysis, enabling a truly productive future. Edge computing represents an integral part of Industry 4.0, with the purpose of enabling computational resources closer to the edge of the network. In this chapter, we describe this paradigm in detail by looking at its advantages and disadvantages as well as some representative use cases. Finally, we present and discuss the research challenges found in edge computing, mostly focusing on resource management.
Cosmin Avasalcai, Schahram Dustdar
Dynamic Access Control in Industry 4.0 Systems
Industry 4.0 enacts ad-hoc cooperation between machines, humans, and organizations in supply and production chains. The cooperation goes beyond rigid hierarchical process structures and increases the levels of efficiency, customization, and individualisation of end-products. Efficient processing and cooperation require exploiting various sensor and process data and sharing them across various entities including computer systems, machines, mobile devices, humans, and organisations. Access control is a common security mechanism to control data sharing between involved parties. However, access control to virtual resources alone is not sufficient in Industry 4.0 because physical access has a considerable effect on the protection of information and systems. In addition, access control mechanisms have to become capable of handling dynamically changing situations arising from ad-hoc horizontal cooperation or changes in the environment of Industry 4.0 systems. Established access control mechanisms do not yet consider dynamic changes or the combination with physical access control. Approaches trying to address these shortcomings exist but often do not consider how to obtain information such as the sensitivity of the exchanged information. This chapter proposes a novel approach to control physical and virtual access tied to the dynamics of custom product engineering, hence establishing confidentiality in ad-hoc horizontal processes. The approach combines static design-time analyses to discover data properties with a dynamic runtime access control approach that evaluates policies protecting virtual and physical assets. The runtime part uses data properties derived from the static design-time analysis, as well as the environment or system status, to decide about access.
Robert Heinrich, Stephan Seifermann, Maximilian Walter, Sebastian Hahner, Ralf Reussner, Tomáš Bureš, Petr Hnětynka, Jan Pacovský
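The combination of design-time data properties and runtime context proposed above can be illustrated with a minimal policy decision sketch. The SENSITIVITY table, the roles, and the locations are hypothetical stand-ins; the chapter's actual policy language and analyses are far richer:

```python
# Hedged sketch: a policy decision point combining a statically derived
# data property (sensitivity) with runtime context. All names invented.

# Result of a hypothetical design-time data-flow analysis:
SENSITIVITY = {"recipe.xml": "confidential", "telemetry.csv": "public"}

def decide(subject, resource, context):
    """Return True if access is granted under the example policies."""
    level = SENSITIVITY.get(resource, "confidential")  # default: most restrictive
    if level == "public":
        return True
    # Confidential data: require a trusted role AND physical presence on
    # the shop floor, i.e. virtual and physical access evaluated together.
    return subject["role"] == "engineer" and context["location"] == "shop-floor"

grant = decide({"role": "engineer"}, "recipe.xml", {"location": "shop-floor"})
deny  = decide({"role": "engineer"}, "recipe.xml", {"location": "remote"})
```

The same engineer is granted or denied access depending purely on the runtime context, which is the dynamic behaviour the chapter argues established mechanisms lack.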
Challenges in OT Security and Their Impacts on Safety-Related Cyber-Physical Production Systems
In Cyber-Physical Production Systems (CPPS), integrity and availability of hardware and software components are necessary to ensure product quality and the safety of employees and customers, while the confidentiality of engineering artifacts and product details must be kept to hide company secrets. At the same time, an increasing number of Internet connected control systems causes the presence of new attack vectors. As a result, unauthorized hardware/software modifications of CPPS components through cyber attacks become more prevalent. This development raises the demand for proper protection measures significantly, not only to ensure product quality and security but also the safety of people working with the machinery. In this chapter, we describe vulnerable assets of Operational Technology (OT) and identify information security requirements for these assets. Based on this assessment, possible attack vectors and threat models are discussed. Furthermore, measures against the mentioned threats and security relevant differences between OT and Information Technology (IT) systems are outlined. To manage a CPPS and its related threats, risk management will be addressed in more detail. Although safety and security should no longer be viewed as isolated, there are several challenges of integrating safety and security, which can lead to struggles and trade-offs. For this reason, the “Safety and Security Lab in Industry” currently investigates different aspects of future integrated solutions covering both safety and security. Challenges of such integrated solutions are outlined at the end of the chapter.
Siegfried Hollerer, Bernhard Brenner, Pushparaj Rajaram Bhosale, Clara Fischer, Ali Mohammad Hosseini, Sofia Maragkou, Maximilian Papa, Sebastian Schlund, Thilo Sauter, Wolfgang Kastner
Runtime Monitoring for Systems of Systems
A Closer Look on Opportunities for Manufacturers in the Context of Industry 4.0
Software-intensive systems in general and Cyber-Physical Systems (CPS) in particular have drawn considerable attention from both industry and academia in recent years, with companies increasingly adopting Cyber-Physical Production Systems (CPPS). Regardless of the domain in which these systems are deployed, what they have in common is a shift from traditional software engineering principles towards a development process where software, hardware, and human actors controlling these systems are deeply interwoven and dependent upon each other. To ensure safe operation, it is crucial that a System of Systems (SoS) complies with its requirements. Engineers and maintenance personnel must monitor if and how the SoS meets its requirements at runtime, e.g., to correctly verify timing behavior, or measure performance and resource consumption. In this chapter, we provide a brief introduction to SoS and CPPS, and investigate runtime monitoring from two different angles. First, we discuss requirements and challenges from the machine vendor’s perspective. Second, we focus on the customer itself, i.e., typically a shop floor owner who has to combine a multitude of different components, both machinery and software systems, for her production system. Finally, we discuss potential applications and benefits of runtime monitoring and provide an outlook presenting current research lines in SoS monitoring.
Michael Vierhauser, Alexander Egyed
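A runtime monitor that verifies timing behavior, one of the monitoring goals named above, might look like the following minimal sketch. The event format and the 100 ms threshold are assumptions for illustration:

```python
# Sketch of a runtime monitor checking a timing requirement:
# "a response must follow each request within max_ms milliseconds".
# Event format (timestamp_ms, kind) is an illustrative assumption.

def check_latency(events, max_ms=100):
    """events: list of (timestamp_ms, kind); returns violations as
    (request_timestamp, observed_latency_ms) pairs."""
    violations = []
    pending = None
    for ts, kind in events:
        if kind == "request":
            pending = ts
        elif kind == "response" and pending is not None:
            latency = ts - pending
            if latency > max_ms:
                violations.append((pending, latency))
            pending = None
    return violations

trace = [(0, "request"), (40, "response"), (100, "request"), (260, "response")]
viol = check_latency(trace)   # the second request took 160 ms
```

In a deployed SoS such checks would run continuously over event streams rather than a finished list, but the requirement-to-check mapping is the same.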
Blockchain Technologies in the Design and Operation of Cyber-Physical Systems
A blockchain is an open, distributed ledger that can record transactions between two parties in an efficient, verifiable, and permanent way. Once recorded in a block, the transaction data cannot be altered retroactively. Moreover, smart contracts can be put in place to ensure that any new data added to the blockchain respects the terms of an agreement between the involved parties. As such, the blockchain becomes the single source of truth for all stakeholders in the system.
These characteristics make blockchain technology especially useful in the context of Industry 4.0, which is distributed in nature but has important requirements of trust and accountability among the large number of devices involved in the collaboration. In this chapter, we will see concrete scenarios where cyber-physical systems (CPSs) can benefit from blockchain technology, especially focusing on how blockchain works in practice and which design and architectural trade-offs we should keep in mind when adopting this technology for both the design and operation of CPSs.
Abel Gómez, Christophe Joubert, Jordi Cabot
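The tamper-evidence property described above (once recorded, a block cannot be altered retroactively without detection) can be demonstrated with a minimal hash-chain sketch; consensus protocols and smart contracts, which a real blockchain adds on top, are omitted here:

```python
import hashlib
import json

# Minimal sketch of the append-only, tamper-evident core of a blockchain:
# each block stores the hash of its predecessor, so a retroactive edit
# breaks the chain. Data contents are invented for the example.

def block_hash(block):
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append(chain, data):
    prev = block_hash(chain[-1]) if chain else "0" * 64   # genesis marker
    chain.append({"prev": prev, "data": data})
    return chain

def valid(chain):
    # Every block must reference the actual hash of its predecessor.
    return all(chain[i]["prev"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
append(chain, {"part": "A-17", "station": 3})
append(chain, {"part": "A-17", "station": 4})
ok_before = valid(chain)
chain[0]["data"]["station"] = 9     # retroactive tampering...
ok_after = valid(chain)             # ...is detected
```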

Data Management

Big Data Integration for Industry 4.0
The fourth industrial revolution promises a new quality of automation with smart manufacturing devices sharing enormous amounts of data. A crucial step in fulfilling this promise is developing advanced data integration methods that are able to consolidate and combine heterogeneous data from multiple sources. We outline the use of knowledge graphs for data integration and provide an overview of proposed approaches to create and update such knowledge graphs, in particular for schema and ontology matching, data lifting and especially for entity resolution. Furthermore, we present data integration use cases for Industry 4.0 and discuss open problems.
Daniel Obraczka, Alieh Saeedi, Victor Christen, Erhard Rahm
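Entity resolution, highlighted above as a key integration step, can be illustrated by a toy matcher that decides whether two records from different sources describe the same real-world asset using a token-overlap (Jaccard) similarity. The records and the 0.5 threshold are invented for the example; production systems combine many similarity measures and learned models:

```python
# Illustrative entity-resolution step using Jaccard token similarity.
# Records and threshold are assumptions, not from the chapter.

def jaccard(a, b):
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb)

def match(rec1, rec2, threshold=0.5):
    """True if the two records likely refer to the same entity."""
    return jaccard(rec1["name"], rec2["name"]) >= threshold

same = match({"name": "KUKA robot arm KR-6"}, {"name": "robot arm kr-6"})
diff = match({"name": "KUKA robot arm KR-6"}, {"name": "conveyor belt unit"})
```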
Massive Data Sets – Is Data Quality Still an Issue?
The term “big data” has become a buzzword in recent years, referring to the possibility of collecting and storing huge amounts of information, resulting in large databases and data repositories. This also holds for industrial applications: in a production process, for instance, it is possible to install many sensors and record data at a very high temporal resolution. The amount of information grows rapidly, but the insight into the production process does not necessarily grow with it. This is the point where machine learning or, say, statistics needs to enter, because sophisticated algorithms are now required to identify, for example, the relevant parameters which drive the quality of the product. However, is data quality still an issue? It is clear that with small amounts of data, single outliers or extreme values can affect the algorithms or statistical methods. Can “big data” overcome this problem? In this article we focus on some specific problems in the regression context and show that even if many parameters are measured, poor data quality can severely influence the prediction performance of the methods.
Peter Filzmoser, Alexandra Mazak-Huemer
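The chapter's central point, that a single bad value can distort a regression, can be reproduced in a few lines: one faulty reading pulls the least-squares slope far from the true value of 2, while a robust median-of-pairwise-slopes (Theil-Sen) estimate is unaffected. The data are synthetic:

```python
from statistics import median

# One gross outlier distorts ordinary least squares, while the
# Theil-Sen estimator (median of pairwise slopes) resists it.

def ols_slope(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
           sum((x - mx) ** 2 for x in xs)

def theil_sen_slope(xs, ys):
    slopes = [(ys[j] - ys[i]) / (xs[j] - xs[i])
              for i in range(len(xs)) for j in range(i + 1, len(xs))]
    return median(slopes)

xs = list(range(10))
ys = [2.0 * x for x in xs]   # true slope: 2
ys[9] = 100.0                # a single faulty sensor reading

ols = ols_slope(xs, ys)           # pulled far away from 2
robust = theil_sen_slope(xs, ys)  # stays at exactly 2
```

More data does not automatically fix this: as long as outliers are present, non-robust estimators remain vulnerable, which is the article's motivating question.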
Modelling the Top Floor: Internal and External Data Integration and Exchange
Digital representations of top floor entities are inherent in higher level software suites such as enterprise resource planning (ERP) systems or manufacturing execution systems (MESs). Typical implementations utilise proprietary conceptual models that lead to a plethora of both import and export filters between different systems. In this chapter we will highlight the modelling of top floor entities by adopting international standards and discussing arising interoperability issues. With the selected standards, we outline an approach for vertical integration between the ERP and MES levels as well as horizontal integration among organisations in a value-added network. We complement our structural, model-based and data-driven perspective with business process stencils that are to be customised to specific business case needs. With that, we establish a purely model-based perspective on the coupling of top floor internal and external data exchange matters.
Bernhard Wally, Christian Huemer, Birgit Vogel-Heuser

Data Analytics

Conceptualizing Analytics: An Overview of Business Intelligence and Analytics from a Conceptual-Modeling Perspective
Business intelligence and data analytics projects often involve low-level, ad hoc data wrangling and programming, which increases development effort and reduces usability of the resulting analytics solutions. Conceptual modeling makes it possible to move data analytics onto a higher level of abstraction, facilitating the implementation and use of analytics solutions. In this chapter, we provide an overview of the data analytics landscape and explain, along the (big) data analysis pipeline, how conceptual modeling methods may benefit the development and use of data analytics solutions. We review existing literature and illustrate common issues as well as solutions using examples from cooperative research projects in the domains of precision dairy farming and air traffic management. We target practitioners involved in the planning and implementation of business intelligence and analytics projects as well as researchers interested in the application of conceptual modeling to business intelligence and analytics.
Christoph G. Schuetz, Michael Schrefl
Discovering Actionable Knowledge for Industry 4.0: From Data Mining to Predictive and Prescriptive Analytics
Making sense of the vast amounts of data generated by modern production operations—and thus realizing the full potential of digitization—requires adequate means of data analysis. In this regard, data mining represents the employment of statistical methods to look for patterns in data. Predictive analytics then puts the thus gathered knowledge to good use by making predictions about future events, e.g., equipment failure in process industries and manufacturing or animal illness in farming operations. Finally, prescriptive analytics derives from the predicted events suggestions for action, e.g., optimized production plans or ideal animal feed composition. In this chapter, we provide an overview of common techniques for data mining as well as predictive and prescriptive analytics, with a specific focus on applications in production. In particular, we focus on association and correlation, classification, cluster analysis and outlier detection. We illustrate selected methods of data analysis using examples inspired from real-world settings in process industries, manufacturing, and precision farming.
Christoph G. Schuetz, Matt Selway, Stefan Thalmann, Michael Schrefl
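One of the techniques listed above, outlier detection, can be sketched with a simple z-score rule over sensor readings. The data and the threshold are invented; note that with few samples a gross outlier inflates the standard deviation itself, which is why a modest threshold is chosen here:

```python
from statistics import mean, stdev

# Illustrative outlier detection: flag readings whose z-score exceeds
# a threshold. Data and threshold are assumptions for the sketch.

def zscore_outliers(values, threshold=2.5):
    # The outlier itself inflates the sample stdev (masking), so a
    # threshold of 3 would miss it here; robust scales (e.g. MAD)
    # avoid this problem in practice.
    m, s = mean(values), stdev(values)
    return [v for v in values if abs(v - m) / s > threshold]

# Temperature readings with one faulty sensor value:
readings = [20.1, 20.3, 19.9, 20.0, 20.2, 20.1, 19.8, 20.0, 35.0]
flagged = zscore_outliers(readings)
```

Association, classification, and clustering follow the same pattern of turning raw production data into actionable signals, each with its own notion of "pattern".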
Process Mining—Discovery, Conformance, and Enhancement of Manufacturing Processes
Process-orientation has gained significant momentum in manufacturing as an enabler for the integration of machines, sensors, systems, and human workers across all levels of the automation pyramid. With process orientation comes the opportunity to collect manufacturing data in a contextualized and integrated way in the form of process event logs (no data silos) and, with that data in turn, the opportunity to exploit the full range of process mining techniques. Process mining techniques serve three tasks, i.e., (i) the discovery of process models based on process event logs, (ii) checking the conformance between a process model and process event logs, and (iii) enhancing process models. Recent studies show that (ii) and (iii) in particular have become increasingly important. Conformance checking during run-time can help to detect deviations and errors in manufacturing processes and related data (e.g., sensor data) when they actually happen. This facilitates an instant reaction to these deviations and errors, e.g., by adapting the processes accordingly (process enhancement), and can be taken as input for predicting deviations and errors for future process executions. This chapter discusses process mining in the context of manufacturing processes along the phases of an analysis project, i.e., preparation and analysis of manufacturing data during design and run-time and the visualization and interpretation of process mining results. In particular, this chapter features recommendations on how to employ which process mining technique for different analysis goals in manufacturing.
Stefanie Rinderle-Ma, Florian Stertz, Juergen Mangler, Florian Pauker
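Conformance checking, task (ii) above, can be illustrated by replaying traces of an event log against a toy process model. The sequential successor-relation model used here is an assumption, far simpler than the Petri-net models used in practice:

```python
# Sketch of conformance checking: replay each trace of an event log
# against an allowed-successor model and report whether it conforms.
# Model and activity names are invented for the example.

MODEL = {
    "start": {"cut"},
    "cut": {"drill"},
    "drill": {"polish"},
    "polish": {"end"},
}

def conforms(trace):
    """trace: list of activity names from one process execution."""
    path = ["start"] + trace + ["end"]
    # Every consecutive step must be an allowed successor in the model.
    return all(b in MODEL.get(a, set()) for a, b in zip(path, path[1:]))

ok = conforms(["cut", "drill", "polish"])   # follows the model
bad = conforms(["cut", "polish"])           # skipped the drilling step
```

Running such a check at run-time, as the chapter suggests, would flag the deviating trace the moment "polish" arrives without a preceding "drill".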
Symbolic Artificial Intelligence Methods for Prescriptive Analytics
Prescriptive analytics in supply chain management and manufacturing addresses the question of “what” should happen “when”, where good recommendations require the solving of decision and optimization problems in all stages of the product life cycle at all decision levels. Artificial intelligence (AI) provides general methods and tools for the automated solving of such problems.
We start our contribution with a discussion of the relation between AI and analytics techniques. As many decision and optimization problems are computationally complex, we present the challenges and approaches for solving such hard problems by AI methods and tools. As a running example for the introduction of general problem-solving frameworks, we employ production planning and scheduling.
First, we present the fundamental modeling and problem-solving concepts of constraint programming (CP), which has a long and successful history in solving practical planning and scheduling tasks. Second, we describe highly expressive methods for problem representation and solving based on answer set programming (ASP), which is a variant of logic programming. Finally, as the application of exact algorithms can be prohibitive for very large problem instances, we discuss some methods from the area of local search aiming at near-optimal solutions. Besides the introduction of basic principles, we point out available tools and practical showcases.
Gerhard Friedrich, Martin Gebser, Erich C. Teppan
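The basic constraint-solving idea behind CP-based scheduling can be sketched as plain backtracking search: assign each job a start slot on one machine so that no two jobs overlap. Real CP solvers add constraint propagation and search heuristics on top; the jobs and horizon here are invented:

```python
# Toy constraint-based scheduling: one machine, jobs with durations,
# no-overlap constraint, solved by backtracking over start slots.
# Job data and horizon are illustrative assumptions.

JOBS = {"A": 2, "B": 3, "C": 1}   # job -> duration in slots
HORIZON = 6                        # available slots 0..5

def overlaps(s1, d1, s2, d2):
    return s1 < s2 + d2 and s2 < s1 + d1

def schedule(assigned=None):
    """Return a job -> start-slot dict satisfying all constraints, or None."""
    assigned = assigned or {}
    if len(assigned) == len(JOBS):
        return assigned
    job = next(j for j in JOBS if j not in assigned)
    for start in range(HORIZON - JOBS[job] + 1):   # domain of the variable
        if all(not overlaps(start, JOBS[job], s, JOBS[j])
               for j, s in assigned.items()):       # check constraints
            result = schedule({**assigned, job: start})
            if result:
                return result
    return None                                     # backtrack

plan = schedule()
```

Declaring variables, domains, and constraints and leaving the search to a solver is exactly the CP modelling style; ASP expresses the same problem as logic rules, and local search trades optimality guarantees for scalability.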
Machine Learning for Cyber-Physical Systems
Machine Learning plays a crucial role in many innovations for Cyber-Physical Systems such as production systems. On the one hand, this is due to the availability of more and more data in ever better quality. On the other hand, the demands on the systems are also increasing: production systems have to support more and more product variants, saving resources is increasingly in focus, and international competition is forcing companies to innovate faster. Machine Learning leverages data to address these issues. The goal is to have self-learning systems which improve over time. This article gives an overview of the various algorithms and methods for this purpose and discusses special requirements of Cyber-Physical Systems for Machine Learning processes.
Oliver Niggemann, Bernd Zimmering, Henrik Steude, Jan Lukas Augustin, Alexander Windmann, Samim Multaheb
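A minimal example of a self-learning component is an online anomaly detector that keeps updating a running mean and variance (Welford's algorithm) and flags readings that deviate strongly from what it has learned so far. The warm-up length and z-limit are illustrative assumptions:

```python
# Sketch of a self-learning CPS component: an online anomaly detector
# that improves its model of "normal" with every reading it sees.
# Warm-up length and z_limit are illustrative choices.

class OnlineDetector:
    def __init__(self, z_limit=4.0, warmup=10):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0
        self.z_limit, self.warmup = z_limit, warmup

    def update(self, x):
        """Judge x against the model learned so far, then learn from it."""
        anomalous = False
        if self.n >= self.warmup:
            std = (self.m2 / (self.n - 1)) ** 0.5
            anomalous = std > 0 and abs(x - self.mean) / std > self.z_limit
        # Welford's incremental mean/variance update:
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return anomalous

det = OnlineDetector()
flags = [det.update(v) for v in [10.0, 10.2, 9.9, 10.1, 10.0, 9.8,
                                 10.1, 10.0, 10.2, 9.9, 10.0, 25.0]]
```

The detector needs no labelled failure data, which matters in production settings where faults are rare; only the final, clearly deviating reading is flagged.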
Visual Data Science for Industrial Applications
Advances in sensor and data acquisition technology and in methods of data analysis pose many research challenges but also open promising application opportunities in many domains. The need to cope with and leverage large sensor data streams is particularly urgent for industrial applications due to strong business competition and innovation pressure. In maintenance, for example, sensor readings of machinery or products may make it possible to predict at which point in time maintenance will be required and to schedule service operations accordingly. Another application is the discovery of how production input parameters influence the quality of the output products. Analysis of such industrial data typically cannot be done in an out-of-the-box manner but requires incorporating background knowledge from fields such as engineering, operations research, and business to be effective. Hence, approaches for interactive and visual data analysis can be particularly useful for analyzing complex industrial data, combining the advantages of modern automatic data analysis with the domain knowledge and hypothesis generation capabilities of domain experts.
In this chapter, we introduce some of the main principles of visual data analysis. We discuss how techniques for data visualization, data analysis, and user interaction can be combined to analyze data, generate and verify hypotheses about patterns in data, and present the findings. We discuss this in the light of important requirements and applications in the analysis of industrial data and based on current research in the area. We provide examples for visual data analysis approaches, including condition monitoring, quality control, and production planning.
Tobias Schreck, Belgin Mutlu, Marc Streit

Digital Transformation towards Industry 5.0

Self-Adaptive Digital Assistance Systems for Work 4.0
In the era of digital transformation, new technological foundations and possibilities for collaboration, production, and organization open up many opportunities to work differently in the future. The digitization of workflows results in new forms of working, denoted by the term Work 4.0. In the context of Work 4.0, digital assistance systems play an important role as they give users additional situation-specific information about a workflow or a product via displays, mobile devices such as tablets and smartphones, or data glasses.
Furthermore, such digital assistance systems can be used to provide instructions and technical support in the working process as well as for training purposes. However, existing digital assistance systems are mostly created following the “design for all” paradigm, neglecting the situation-specific tasks, skills, preferences, or environments of an individual human worker. To overcome this issue, we present a monitoring and adaptation framework for supporting self-adaptive digital assistance systems for Work 4.0. Our framework supports context monitoring as well as UI adaptation for augmented reality (AR)- and virtual reality (VR)-based digital assistance systems. The benefit of our framework is shown based on exemplary case studies from different domains, e.g., a context-aware maintenance application in AR or warehouse management training in VR.
Enes Yigitbas, Stefan Sauer, Gregor Engels
Digital Transformation—Towards Flexible Human-Centric Enterprises
Our society is progressing from an industrial society to a knowledge society, bringing about constant change of unprecedented extent and speed. This is driven by humanity's urge to improve quality of life by gaining knowledge and insights, and by the steadily increasing power of information technology. For enterprises, the changing environment constantly opens up new opportunities and poses existential risks, forcing them to adapt to their changing contexts in time. To survive and succeed, enterprises must therefore organize digital transformation as a process to steadily shape their future, and they must consider their context in a wider scope than usual. Entrepreneurs, too, are facing increasing challenges. With these insights, we propose a novel human-centric view on enterprises, their digital transformation, and their position in society. It combines technical and business levers with enterprise culture. We introduce a reference model-based approach for a continuous, holistic enterprise evolution and focus on the orchestrated solution provider (OSP) as the future enterprise model. It supports the entrepreneur and self-responsible teams in mastering digital transformation and sustaining the success of their enterprise in the knowledge society. In this sense, the OSP follows the vision of Industry 5.0 for a sustainable, human-centric and resilient European industry, while going far beyond it with its holistic view.
Burkhard Kehrbusch, Gregor Engels