
Open Access 05.03.2024

Holorailway: an augmented reality system to support assembly operations in the railway industry

Authors: Clara Garcia, Mario Ortega, Eugenio Ivorra, Manuel Contero, Pau Mora, Mariano L. Alcañiz

Published in: Advances in Manufacturing


Abstract

During the last two decades, industrial applications of augmented reality (AR) have been incorporated in sectors such as automotive or aeronautics in tasks including manufacturing, maintenance, and assembly. However, AR’s potential has yet to be demonstrated in the railway sector due to its complexity and the difficulty of automating its tasks. This work presents an AR system based on HoloLens 2 to assist the assembly of insulation panels in the railway sector, significantly decreasing the time required to perform the assembly. Along with the technical description of the system, an exhaustive validation process is provided in which assembly using the developed system is compared with the traditional assembly method as used by the company that facilitated the case study. The results obtained show that the presented system outperforms the traditional solution by 78% in the time spent on the localization subtask, which translates into a 47% decrease in the global assembly time. Additionally, it reduces the number of errors by 88%, yielding a more precise and almost error-free assembly process. Finally, it is also shown that using AR removes the dependence of assembly performance on the users’ prior experience with the task.

1 Introduction

Augmented reality (AR) has the potential to improve efficiency, reduce errors, and enhance the quality of work in manufacturing industries. It is an important component of the Industry 4.0 paradigm [1] and can be used in a range of applications to help businesses optimize the benefits of automation and data exchange [2]. In this context, the progressive adoption of the digital twin (DT) is significantly increasing interest in the development of industrial augmented reality (IAR). The concept of DT was introduced by Michael Grieves [3] as “the digital equivalent to a physical product”, in which three components can be distinguished [4]: physical entities in the physical world, virtual models in the virtual world, and the connections between the two worlds. DTs can be used to support a range of activities, including:
(i)
Simulation and visualization. DTs can be used to simulate and visualize the behavior of physical assets, processes, or systems, helping to identify and address issues before they occur in the real world.
 
(ii)
Analysis and optimization. DTs can be used to analyze and optimize the performance of physical assets, processes, or systems, helping to identify opportunities for improvement and optimize operations.
 
(iii)
Control and monitoring. DTs can be used to monitor the performance of physical assets, processes, or systems in real time and provide control inputs as needed to ensure optimal performance.
 
With these features in mind, it is clear that AR, defined as a real-time direct or indirect view of a physical, real-world environment that has been enhanced or augmented by adding virtual computer-generated information to it [5], is a key technology in the Industry 4.0 paradigm because it can merge the real and virtual worlds. Specifically, it augments our vision of the real world with computer-generated elements when viewed through a device.
The current major areas of AR application in Industry 4.0 are maintenance, assembly and repair, training, product quality control, human-robot collaboration, and building monitoring [6–8]. The technology is applied in numerous industries such as automotive [9], shipbuilding [10], aerospace [11], and the nuclear power sector [12], among others. Moreover, current implementations show successful results, with significant improvements in productivity, accuracy, and safety [13].
Although AR has been introduced in many industrial sectors, there is little research showing its potential in the railway sector, likely due to low-volume and highly customized design requests, which make automation complex and impractical and require manual assembly processes. One principal challenge is the stringent precision required in assembly: minor misalignments or errors can result in significant safety risks and operational inefficiencies. The inherent complexity of railway vehicles is evident in their construction, which involves a large number of diverse components such as hydraulic systems, electrical systems, engine blocks, and cooling systems. This vast array of components, coupled with their varying sizes, amplifies the complexity of the assembly process. Moreover, technicians must possess specialized skills to handle the wide variety of components and must frequently move along the train’s length during assembly, creating logistical challenges in an already complex process. Therefore, specialized and extended training is required for new workers, making it difficult to reduce operation times without expanding the workforce.
The advancements in AR technology have brought many benefits to the industry, but there are currently some technical, usability and functional limitations [14, 15]: a lack of robust, accurate and efficient tracking systems [16], a need for specialized AR hardware devices [17] that fulfill the task requirements, limited flexibility in complex product data management (PDM), and limited availability of DTs [18]. In addition to those drawbacks, it is difficult to find studies in which AR is tested in production environments rather than under laboratory conditions [17].
Specifically, it can be difficult to choose a proper tracking system for AR assembly in the industrial sector [16]. Thus far, the most commonly used tracking methods are marker-based and marker-less approaches, such as simultaneous localization and mapping (SLAM), each of which has technical advantages and disadvantages [14, 17, 19]. Nonetheless, the tracking system’s accuracy, robustness, and efficiency depend on the specific application and environmental factors. Therefore, this is still an open area of research.
The main requirements for the AR hardware are sufficient autonomy and computing capacity, the ability to understand the environment, ease of interaction, and lightweight design, among others [17]. For instance, an autonomous head-mounted display (HMD) such as HoloLens 2 is crucial for fulfilling the previously mentioned requirements [20].
Another drawback is the absence of a flexible, easy-to-use, and scalable procedure for creating AR work instructions and use cases [18]. This means that highly skilled developers with the necessary knowledge to access DTs in the company’s PDM system must create specific AR operational scenarios rather than using ones designed by end-users [14]. This limitation can be addressed by the development of a simple and adaptable solution that can be easily used by non-technical users.
Testing AR applications in industrial environments is challenging due to the specific characteristics of the setting [17]. This is one of the main reasons why most AR studies have not been validated in these types of production environments and are tested only in controlled laboratory environments. Consequently, the results of these studies were inconclusive regarding the effectiveness of AR systems as applied in industry [21].
Taking this into account, in this work we propose a system that can visually guide the operator in the assembly of insulation structures in a railway wagon using AR. This system aims to optimize the process by reducing assembly times and enhancing quality through the reduction of errors by means of AR and DT solutions. Concretely, this AR tool improves the efficiency and precision of the assembly task. Moreover, it also reduces the operator’s mental effort and increases effectiveness and safety, reducing costs and improving working conditions. Using AR also reduces the training required for new workers. Furthermore, incorporating the DT allows the AR tool to identify the components, carry out the guidance process, and offer an accurate 3D representation of the wagon’s components used in assembly tasks through the AR display.
This applied work aims to optimize the construction and assembly of railway vehicles by introducing a mechanism to guide operators during the assembly process. The proposed solution is novel and is supported by a use case to demonstrate its effectiveness. In summary, the main contributions of this solution are as follows.
(i)
An AR system for assembly operations in the railway sector; to our knowledge, the first in this sector.
 
(ii)
An AR system that can automatically align the virtual content (DT) of a wagon with the real wagon using computer vision techniques. Specifically, it uses a mechanism that makes the assembly process efficient, precise, and error-free by combining two tracking methods: SLAM techniques and a marker-based algorithm.
 
(iii)
A flexible and simple IAR system which could be applied to different models and use cases, such as wagons with different dimensions or materials.
 
(iv)
An exhaustive validation process under real production conditions with different worker profiles, which proves the system’s effectiveness and removes the need for an initial training phase.
 
(v)
An AR-based solution that improves the quality of the final product by preventing and reducing the number of errors while improving the sustainability of the process. Therefore, it works towards the concept of zero defect manufacturing in this field [22, 23].
 
The remainder of this paper is organized as follows. Section 2 briefly reviews existing AR methods applied in industry and describes AR solutions used in assembly, specifically in the railway sector. Section 3 then introduces the traditional use case in depth, analyzing its main limitations and extracting the requirements to be implemented in the final solution. Section 4 presents the proposed methods and materials needed to fulfill the assembly task requirements. Section 5 introduces the results obtained using different metrics, including quantitative and analytical results and other important factors. Finally, Section 6 includes a discussion of the obtained results, the overall conclusions and some suggestions for future work.

2 State of the art

Currently, AR technology is gaining widespread attention because of its potential to improve a wide range of industrial fields [13]. It is essential to examine how AR is being implemented in the industrial sector, with a focus on the assembly process, and more specifically, to analyze its use in the railway industry.

2.1 AR in industry

The IAR field began to grow at the end of the 1990s with initiatives such as the ARVIKA [24] and ARTESAS projects [10], though the term “industrial augmented reality” was not introduced until 2011 by Georgel [18] when trying to define AR’s use in industrial processes such as assembly [25], manufacturing [26], product design [27], and maintenance [28], among others. Moreover, this technology provides powerful tools that assist operators in their assigned tasks, especially when they are complex [29]. In the following years, AR’s presence increased to the point where its importance has been recognized and it has been classified by the European Union as a core technology that will lead to the development of smart factories [30]. However, its implementation in industry is still scarce compared to other technologies like robotics and automation [13, 17, 26, 31].
There are some industrial fields where AR is applied successfully, such as maintenance and repair [32], design [33], and training [10]. These applications and AR’s improvements to these fields are briefly explained in addition to some proposed approaches.
AR is applied in the maintenance and repair field to make tasks efficient and error-free by providing workers with augmented instructions and/or virtual content, such as a DT [34], aligned with the real scene or object of interest used to carry out the machinery maintenance or repair task. For example, welding tasks [32] can be improved by aligning the DT with the real scene to highlight spot-weld locations, improving accuracy by 52%. Fiorentino et al. [35] introduced a method that uses AR for maintenance tasks and achieved a significant reduction in errors (92%) and time (79%). During the last few years, remote assistance has been explored [36] in machine monitoring and repair. Using IAR could improve communication between workers and facilitate access to documentation (manuals, 3D models, etc.). Additionally, these applications allow an expert to visualize any problems, and any directions they give can be visualized by a worker on-site [28].
Another important field where AR is applied is training, where it can shorten the time required to train new employees and increase information retention, cost savings, safety, realism, scalability, and flexibility [10]. Augmented reality training can be more efficient, effective, interactive, and engaging than traditional training methods and can be accessed remotely and on demand. In addition, AR can also be used to simulate dangerous environments, allowing employees to practice safety procedures without risking injury. For example, Hořejší [37] proposed virtual training using an AR system that added virtual 3D model instructions into a real scene for workers who used machinery, and concluded that using AR reduced the skill requirements and training time. Moreover, Füchter et al. [38] presented an approach for creating a prototype smartphone app to train new pilots, enhancing safety and accelerating the training process.
Within the product development field, product design is a crucial task that can be enhanced through AR technology [33]. AR allows for realistic simulation of the final product while making design changes in real time, enabling faster iterations and more accurate designs. For instance, in the naval industry, Oh et al. [27] proposed a method based on mobile AR to visualize 3D models linked to design drawings. In the automotive industry, Fründ et al. [39] introduced an AR method that could evaluate different car interiors during initial development phases by overlaying 3D models on a real car.

2.2 AR assembly

Assembly is a vital aspect that is considered throughout the design and manufacturing process, but in some sectors it is still done using traditional techniques. In the traditional assembly process, different components or parts are manually put together to form a finished product. This process usually involves the use of hand tools and manual labor. The assembly process may include tasks such as fitting and fastening parts in complex systems with many components, and testing the finished product. Typically, these tasks require a heavy mental workload [40] and an experienced worker who understands the technical information describing the required tasks [41]. The process can be time-consuming and requires skilled workers who have been trained in specific assembly techniques. It can also be prone to errors and inconsistencies, as each worker may take a slightly different approach to the task.
The primary objective of using AR in assembly tasks is to guide the operators performing the tasks. Simplifying assembly instructions by using 3D models aligned with the real-world scene [42], videos, etc., makes assembly safer, more accurate, and more efficient. It reduces the cognitive workload, which is closely linked to task performance [43]. Industrial augmented reality assembly methods were also developed in different fields such as the naval industry [10], aerospace [42], automotive [9], and agricultural machinery [44], among others.
Within the automotive sector, Lima et al. [25] suggested a system to guide the worker to the 3D location in different assembly processes. This sector also uses assembly simulation and planning methods. For instance, Woll et al. [45] developed an AR-based simulation system for assembling a car power generator.
Another industrial sector where AR is applied to assembly tasks is the naval sector. For instance, Fraga-Lamas et al. [10] reviewed different use cases where IAR assembly systems could be applied in a shipyard.
The aerospace industry also applied AR in assembly tasks. Mizell [46] evaluated the effectiveness of using a HMD in wire assembly tasks for Boeing airplanes. Liu et al. [47] introduced an IAR assembly system to increase task efficiency and guide the worker by creating a DT that is enhanced with 3D information.
In addition, AR is applied to assembly tasks in manufacturing sensors or machinery. For instance, Lai et al. [48] introduced an AR assembly system to guide workers in producing CNC carving machines. Richardson et al. [49] experimented with three different methods to perform the same assembly task for sensors in the military sector. It was concluded that the AR method obtained better performance in terms of time and efficiency.
Although AR applications in assembly tasks are becoming mature, some technical and functional limitations still need to be solved [15].
First, no agreement exists on the tracking method to be used. Multiple studies propose fiducial marker-based tracking methods [16] like ArUco [50] due to their simplicity and low computational cost. However, these can be too sensitive to occlusions and uncontrolled lighting conditions, and in some cases it is not possible to place markers in the scene. Alternatively, marker-less tracking methods, such as SLAM, have been proposed [25, 47] in the industrial field to align the DT with the real scene. However, the proposed SLAM tracking approaches accumulate errors over time, are generally not recommended for dynamic scenes, and require a highly textured environment. As previously stated, each tracking method has its own set of benefits and limitations. It is crucial to choose and adapt the appropriate tracking approach based on the specific needs and characteristics of the industrial environment and application [48].
In terms of hardware devices, a clear division can be established between mobile devices and optical see-through glasses. On the one hand, many assembly tasks using AR can be performed using mobile devices such as smartphones or tablets. However, in industrial environments, many tasks are done by hand and require the operator to be hands-free, so holding a device is unfeasible [51]. On the other hand, optical see-through devices have experienced a significant evolution. Traditionally, devices like HMDs were too heavy and had short battery lives, limited fields of view, and too many wires [17]. In the last few years, HMDs have progressed rapidly, with advancements in technology leading to improved resolution, field of view, and overall user experience [52]. Microsoft HoloLens 2 and Magic Leap 2 represent the most recent evolution in this type of device, with many examples of successful industrial applications [20, 52–54].
Many AR systems are used as independent solutions for specific tasks, rather than as integrated components of a company’s workflow and IT infrastructure [36]. One of the main reasons for this specificity is the need to translate the CAD files describing the DT into the proper format to develop an AR application. Usually, this step requires manual adjustments. The development of AR resources is not simple, scalable or flexible, and programming knowledge is required [55, 56].
Although there are some examples that test the IAR system in a production environment (e.g., the Volkswagen factory) [25], most of them are validated only under laboratory-controlled situations [45, 49]. Specifically, they tend to simulate the production conditions [48].

2.3 AR railway

Despite the strategic importance of the railway sector in the industry, there are few AR applications in this field, and these are primarily focused on maintenance. For instance, Azpiazu [57] presented a remote support application, where a worker communicated with a specialist through glasses, and the specialist could add elements such as arrows or signs to the worker’s visualization to improve maintenance times and costs.
Furthermore, Kwon et al. [58] implemented an AR system for maintenance and training on the Axle-mounted Disc Brake System using a hand-held device (mobile or tablet). The results showed an improvement in efficiency of 34% and a reduction in errors on this task.
AR has been applied to inspect freight cars’ routes with out-of-gauge cargo [59], using AR-based interference detection to detect possible collisions between the train and the environment and select the optimal route. This system showed lower cost and high precision when compared with other methods.
After reviewing the literature, it can be concluded that, to the best of our knowledge, no research papers show the application of AR to the manufacturing and production of railway vehicles, mainly as a consequence of the many challenges of implementing AR in this sector, as detailed in Sect. 1. Thus, it could be beneficial [10] to develop a powerful IAR system such as the one proposed in the following sections. First, it should avoid discontinuities and gaps in the process because they can affect functionality. Second, the system should be smooth and easy to use for users who are unfamiliar with AR technology. For example, the interaction should be user-friendly, avoiding functional inconsistencies and integrating the end users in the development loop. All these aspects have been taken into account in the proposed Holorailway system, explained in detail in the following sections.

3 Real use case validation

Lining the inside of railway vehicles with insulating material is a necessary task that provides multiple benefits. Specifically, it reduces the transmission of heat and noise from the outside to the inside of the vehicle and vice versa. This way, energy dedicated to adjusting the temperature of the wagon is saved, either due to excess cold or heat.
In the railway industry, due to the number of elements present in the area where the material is installed and the customization of the vehicles, it is neither possible nor practical to manufacture the entire insulation layer in one piece. Therefore, a large number of sections of the material are required, making this task highly complicated and forcing it to be carried out without any automation. Traditionally, to execute this manual task efficiently, operators are expected to adhere to the sequence illustrated in Fig. 1, which requires the use of both hands to make the assembly process feasible.
It is important to note that there is a huge number of components, which can fluctuate between approximately 800 and 2 900 depending on the carriage dimensions. Moreover, the dimensions of the insulation components vary depending on the wagon’s dimensions, but their sizes are always between 7 cm × 7 cm × 2 cm and 120 cm × 120 cm × 4 cm. This high quantity and variability make the identification task complex and tedious. Specifically, to locate the position of a specific component, the operator must know its ID from an alphanumeric code (also encoded in a QR code), accompanied by a list of IDs on which to consult the number of elements (see Fig. 2) and the area where they are installed. Therefore, the process of transferring 2D information to the 3D space of the wagon becomes intricate and time consuming.
Describing the process in detail, first, the operator takes one of the parts from the box. The arrangement of components within the box could potentially lead to a considerable time loss, especially when operators need to go through a disorganized collection to locate the required parts for specific assembly tasks. In the conventional process, an external company is tasked with preparing and labeling these components. However, this company does not take into account the organization and repetition of components, resulting in a randomized arrangement inside the box. This practice is widespread in the railway industry, and transitioning to a more structured system would involve additional expenses. Once the operator takes the component from the box, the next step is to identify its part number on its label as depicted in Fig. 3 and similarly in Fig. 1 - first step.
Next, the operator looks at the parts list on the assembly drawing to identify the part item number and then looks for the corresponding balloon on the drawing to determine where to put the insulation component, as depicted in Fig. 1 - second step. Frequently, the operators need to spend extra time and effort understanding that information, extrapolating the 2D information to 3D space, and memorizing the position where the component must be located. This increases the mental workload and requires good spatial reasoning, which is difficult for many people, particularly new workers, and is complicated by similarities between different components and locations inside the wagon. Figure 2 illustrates an example of the drawings used, in which several different components are depicted. This is a difficult task to perform due to the amount of graphic information present in the drawing.
Next, the operator should find the 3D position of the selected component (see Fig. 1 - third step). For the placement to be considered accurate, there is a tolerance of 2 cm on each side. Finally, the area should be cleaned and the insulating material should be assembled (see Fig. 1 - fourth step).
In order to measure the time spent on the insulation, the company carried out a test on a wagon containing 159 distinct components, and it was determined that, on average, it took 2 min and 13 s to place each individual piece. The implementation of our system is expected to significantly reduce this extended time requirement caused by the nature of the manual process.
Once the traditional use case was explored, some problems were identified, specifically in the identification and location steps, which are complicated in terms of scalability, precision, and spatial awareness, and are neither automated nor optimized. For that reason, we decided to develop a system able to solve those limitations. To accomplish this, different simpler automated systems, such as monitors, were studied as candidates for the final solution. In this specific case, monitors do not solve two of the main limitations presented in the real use case: the operators still need to memorize the position where the component should be located and map 2D information into 3D space, increasing the operator’s mental workload. Therefore, the most appropriate solution was a more advanced system based on AR technology. This choice was driven by AR’s capacity to provide real-time guidance to operators, which not only enhances their mobility but also eliminates the need for them to memorize component locations. Additionally, the solution benefits from easy usability and simple interfaces, and achieves high precision by combining AR technology with the DT.

4 Materials and methods

In this section, attending to the main limitations presented in Sect. 3, we explain the materials required for the new solution while analyzing the operations needed to accomplish the AR task that was created to prove the efficiency and accuracy of Holorailway for the assembly process. It is organized into different subsections. First, Sect. 4.1 presents the hardware used to implement the AR system and the reasons for choosing it. Then, in Sect. 4.2, the proposed solution is explained in depth. Finally, the system is developed in two steps: pre-operational procedures (see Sect. 4.3) and in-operation procedures (see Sect. 4.4). The first explains the requirements and the steps followed to develop the application, while the second focuses on how the different processes are executed during the task. This is summarized in Table 1.
Table 1
Detailed system information

|  | HW and SW | More details |
|---|---|---|
| HW employed | HoloLens 2 | Sect. 4.1 |
| Graphic engine | Unity 3D | Sect. 4.3.1 |
| Input information | Configuration file (XML) and DT | Sect. 4.3.1 |
| DT conversion and optimization | JT models (Siemens NX) to FBX models (Autodesk 3ds Max) | Sect. 4.3.1 |
| Initial DT alignment with physical wagon | Image marker-based by Vuforia | Sect. 4.3 |
| Component identification | QR decoding by Mixed Reality Toolkit (MRTK) | Sects. 4.3.1 and 4.4 |
| Guiding to target location (tracking step) | Image marker-based (Vuforia) and SLAM-based (MRTK) | Sects. 4.3.1 and 4.4 |
| Human-computer interaction | Voice commands by MRTK | Sect. 4.1 |

4.1 Hardware

The device used to implement this system is Microsoft’s HoloLens 2, as depicted in Fig. 4: an autonomous HMD with see-through holographic lenses in a dual-waveguide configuration. The display has two layers, one for red and green and the other for green and blue.
The main reason for using HoloLens 2 is its proven effectiveness in implementing IAR applications in industry [54, 60, 61]. This efficacy can be attributed to the device’s autonomy, computing capacity, ease of interaction, and reduced weight, which improves ergonomics. It also allows users to view instructions hands-free while performing a task. Additionally, compared to its predecessor, HoloLens 2 features some meaningful upgrades: a larger field of view; a more ergonomic system with new and more natural gestures, including the ability to touch the holograms; eye tracking; voice instructions in various languages; and hardware improvements, including a co-processor designed expressly for AI. Moreover, HoloLens 2 offers the high level of precision required to carry out the assembly task. In this sense, to our knowledge, no study exists on the precision of locating components in the industrial sector using this device. However, in the medical field several studies exist; for instance, Unger et al. [62] established a tolerance of 2 cm that HoloLens 2 outperformed.
The device features 2k resolution, a time-of-flight depth sensor, and an IMU with an accelerometer, gyroscope, and magnetometer. HoloLens 2 has four visible-light cameras for head tracking and two IR cameras for eye tracking, which automatically calibrate according to the user’s inter-pupillary distance. Consequently, it reduces eye strain and can work as a retinal scanner to identify the user. Its total weight is 566 g, and it offers up to 3 h of active use, which can be extended to the duration of an entire work shift using an external battery.
Regarding its understanding of the environment, the HoloLens 2 is equipped with a 6DoF tracking system that allows the user to move around the virtual space, a spatial mapping system that reproduces a real-time environment mesh, which is essential for placing objects on real surfaces, and mixed reality capture, which provides photos and video of a mixed holographic and physical environment.
Furthermore, the device presents human understanding capabilities, such as two-handed, fully articulated hand tracking, real-time eye tracking, voice commands, and on-device controls.

4.2 AR solution

The process described above (see Sect. 3) is subject to improvement. With that aim, an IAR system, Holorailway, is proposed to enhance the assembly process with the use of a DT and augmented instructions. Figure 5 depicts the steps that the operator would follow when using the proposed AR system.
The first step is represented in Fig. 5 - initial alignment and is explained in depth in Sect. 4.3.2. After that, the other four steps are repeated. However, some significant changes to the traditional approach were implemented to make the task more efficient. Mainly, a QR code is used to identify the components, avoiding the alpha-numeric label: the QR code is scanned with HoloLens 2 (see Fig. 5 - second step), which quickly and automatically identifies the part in the database. Therefore, the operator picks up the component with the QR code (see Fig. 5 - first step), which is scanned and detected with HoloLens 2. Then, the ID of the QR code is compared to the ones in the database to determine whether it is a unique code or a repeated one. The database is a list of the IDs of the components that define the JT file; these IDs are the same as the QR codes. Our AR-based solution incorporates a register that stores the placed components and their respective 3D positions; thus, in the case of repeated QRs (i.e., identical components), the system guides the operator to the closest available location with respect to the operator’s position, taking into account the information in that register. The closest available position is computed using Eq. (1),
$$d = \left\| p_{\text{track}} - p_{\text{component}} \right\|_{2},$$
(1)
where $p_{\text{track}}$ is the 3D position of the operator, obtained by the tracking method (based on the SLAM technique incorporated in HoloLens 2), and $p_{\text{component}}$ is the 3D position of the component saved previously in the DT. It is important to note that, because the alignment is performed in the previous step using tracking techniques, the DT’s coordinate system coincides with the real-world coordinate system (see Fig. 5 - initial alignment). When the QR code is detected (see Fig. 5 - second step) and the model of the railway is aligned with the real one (explained in depth in Sect. 4.3.2), Holorailway starts to guide the operator with a 3D arrow (see Fig. 5 - third step) and visually indicates where the component should be installed. An example of how this is seen is depicted in Fig. 6. The GUI displays a red arrow, which guides the operator to where the component should be installed. Likewise, the blue part represents the DT of the wagon previously aligned with the real wagon. The yellow part marks the position where the component should be set up. A text box is also depicted to provide the operator with instructions, such as the component’s code or the option to use a voice command if the element needs to be duplicated. As can be observed, the application is quite intuitive due to its minimalist design.
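As an illustration of how this register-based selection could be implemented, the following C# sketch picks the closest unoccupied candidate position from the DT using the Euclidean distance of Eq. (1). It is a minimal sketch, not the actual Holorailway source: the type, property and method names are assumptions made for illustration only.

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Numerics;

// Minimal sketch of the "closest available location" rule of Eq. (1).
// CandidatePosition, placedRegister and SelectClosestAvailable are illustrative names,
// not part of the actual Holorailway code base.
public static class ComponentGuidance
{
    public sealed class CandidatePosition
    {
        public string ComponentId { get; init; } = "";
        public Vector3 Position { get; init; }    // p_component, stored in the DT (world) frame
    }

    // Returns the candidate closest to the operator that is not yet in the register of
    // already placed components, or null if every candidate location is occupied.
    public static CandidatePosition? SelectClosestAvailable(
        Vector3 operatorPosition,                     // p_track, from the SLAM-based tracking
        IEnumerable<CandidatePosition> candidates,    // all DT locations sharing the scanned QR ID
        ISet<Vector3> placedRegister)                 // positions already filled (the register)
    {
        return candidates
            .Where(c => !placedRegister.Contains(c.Position))
            .OrderBy(c => Vector3.Distance(operatorPosition, c.Position)) // d = ||p_track - p_component||_2
            .FirstOrDefault();
    }
}
```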
The operator should be able to clean the area and install the component without being obstructed by the hologram (see Fig. 5 - fourth step). For this reason, a transparency shader is applied to the virtual object that indicates the component’s position: the hologram visually guides the operator from a distance and fades out as the operator approaches, disappearing when the operator is close enough to the element. This can be represented mathematically by Eq. (2),
$$\alpha = \begin{cases} 1, & \text{if } d > 1, \\ \dfrac{d}{0.5} - 1, & \text{if } 0.5 \le d \le 1, \\ 0, & \text{if } d < 0.5, \end{cases}$$
(2)
where d is the distance (unit: m). In the particular case of duplicated components, the process ends when the operator uses the specific voice command “placed”. Regarding voice commands, it was verified that, despite the industrial setting, they were detected accurately thanks to the directional microphones on HoloLens 2, which isolate the operator wearing the headset. However, to make the system more efficient, it was decided to use voice commands only in required cases such as the one mentioned. Therefore, the system has been designed to minimize interaction, mainly for reasons of efficiency, effectiveness, and usability, and because interaction is not required in many situations. The headset guides the operator to the area where the pieces need to be assembled by indicating their exact position and orientation. This guidance is done accurately and quickly, making the process of locating the position and orientation more efficient.
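The fade rule of Eq. (2) is simple enough to state as code. The following is a minimal sketch (an illustrative helper, not the shader code actually used), assuming the distance d is supplied in metres by the tracking step.

```csharp
// Minimal sketch of Eq. (2): hologram opacity as a function of the distance d (in metres)
// between the operator and the target location. The hologram is fully opaque beyond 1 m,
// fades linearly between 1 m and 0.5 m, and is invisible below 0.5 m so that it does not
// obstruct cleaning and installation.
public static float PlacementAlpha(float d)
{
    if (d > 1.0f) return 1.0f;
    if (d < 0.5f) return 0.0f;
    return d / 0.5f - 1.0f;   // = 2d - 1: 0 at d = 0.5 m, 1 at d = 1 m
}
```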
In summary, the need for identification codes, bills of materials, and installation drawings is eliminated. The process now consists of scanning the component’s QR code, following the indications of the headset to the installation position, cleaning the area, and installing the part, reducing the total installation time of the insulation material significantly.

4.3 Pre-operational procedures

This section explains the pre-operational procedures, specifically how the system is created and the requirements used to make Holorailway run. Although the aforementioned hardware incorporates useful techniques, it should be integrated with other methods not only to complete the entire process but also to make the system more precise and reliable.

4.3.1 App creation and setup

Holorailway was developed using Unity3D (Version 2019.4) and the Microsoft Mixed Reality Toolkit (Version 2.8.1), which is used for QR detection, interaction with the GUI, and the HoloLens 2 itself; in other words, it is responsible for all the methods related to HoloLens 2. The tracking library Vuforia (Version 9.6.3) performed the initial DT alignment with the real railway and the correction of the SLAM-based position and orientation during the process. Vuforia is a widely used library in AR application development and can be deployed on most phones, tablets, and eyewear, such as the HoloLens 2. By combining the Vuforia tracking method with the tracking method incorporated in the HoloLens 2 (based on the SLAM technique), the final tracking approach used in this system during the assembly process was obtained. Additionally, the solution was developed using C# as the programming language.
Holorailway addresses the issues of inflexibility, difficulty, and limited scalability in the creation of AR work instructions, as mentioned in Sects. 1 and 2, by introducing a workflow that can address these drawbacks. It features an application that dynamically loads the content required to run the system without the need for programming expertise. Specifically, the approach only requires two inputs. One is a configuration file (XML format) in which the component, the location, and the ID of each AR marker are indicated. The other is the DT (3D model) with the components of the wagon and the information about where the virtual marker is positioned, represented in a JT file. A virtual marker is a 2D plane appended to the JT file that represents the physical markers used by Vuforia, as shown in Fig. 7a. In addition, the virtual marker coincides with the size, position and orientation of the physical marker placed in the train. JT is an openly published data format developed by Siemens Digital Industries Software that conforms to ISO 14306:2017; it is an industry-focused, high-performance, lightweight, flexible file format for capturing and repurposing 3D product definition data for visualization, enabling collaboration and validation throughout the extended enterprise.
Although the SDK in HoloLens 2 is not able to load JT files, Holorailway incorporates a method that automatically converts JT files into FBX files using Autodesk 3ds Max (Version 2023). Additionally, this method can reduce the polygon load if needed. To ensure the proper and optimal functioning of HoloLens 2, and according to Microsoft guidelines, the maximum number of polygons is set to 150 000; the models used to test the approach have between approximately 100 000 and 600 000 polygons. To perform the conversion and reduce the polygon load, a BAT file is created that opens Autodesk 3ds Max and then executes a script responsible for the conversion step. In this script, the ProOptimizer method is in charge of the polygonal reduction.
The information in the configuration file and the JT file is automatically and dynamically loaded into the application. Thus, new use cases and different railway models can be created quickly and easily without the need for programming knowledge.
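To make this dynamic set-up step concrete, the sketch below shows how such a configuration could be loaded. The XML element and attribute names are hypothetical (the text above only specifies that the file relates components, locations and AR-marker IDs), and the code is illustrative rather than the production loader.

```csharp
using System;
using System.Xml.Linq;

// Illustrative loader for a hypothetical Holorailway-style configuration file.
// Element and attribute names (marker, component, id, markerId, position) are assumptions.
public static class ConfigLoader
{
    private const string SampleConfig = @"
<holorailway>
  <marker id='marker_01' position='0.0 1.5 2.0' />
  <component id='INS-0153' markerId='marker_01' position='1.2 0.8 3.4' />
</holorailway>";

    public static void Load()
    {
        XDocument doc = XDocument.Parse(SampleConfig);

        foreach (XElement marker in doc.Root!.Elements("marker"))
        {
            Console.WriteLine(
                $"AR marker {marker.Attribute("id")!.Value} at {marker.Attribute("position")!.Value}");
        }

        foreach (XElement component in doc.Root!.Elements("component"))
        {
            Console.WriteLine(
                $"Component {component.Attribute("id")!.Value} " +
                $"anchored to {component.Attribute("markerId")!.Value} " +
                $"at {component.Attribute("position")!.Value}");
        }
    }
}
```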

4.3.2 Initial alignment

As mentioned previously, another essential step is the initial alignment of the DT with the real wagon (see Fig. 5 - initial alignment), which is done automatically. To perform this step, a marker-based system is used. The Vuforia library provides many methods, classified according to their purpose; they can be divided into environment tracking and object tracking, the latter of which is used in Holorailway. Specifically, Holorailway uses an image marker-based method relying on AR markers. It uses computer vision to identify the planar patches and estimate their pose in real time; consequently, the alignment between the DT and the real scene is obtained. Compared with traditional fiducial marker-based systems such as ArUco [50], image targets tolerate a certain degree of occlusion while being robust to light changes in the scene, which are improvements over traditional markers.
Furthermore, Vuforia requires the creation of a database composed of different AR markers such as the one depicted in Fig. 7b. Overall, the DT is projected (obtaining the relative position and orientation of the AR marker in the wagon in relation to the frontal camera of HoloLens 2) when the first AR marker is detected. The position and orientation can be corrected each time a new AR marker is encountered, ensuring that the projection is accurate while navigating the real environment, which could be large on long-distance railways.
To ensure detection in the railway wagon, the chosen AR markers are the same size as an A3 sheet of paper (29.7 cm × 42 cm). Additionally, the AR markers are images with a high amount of texture and planar patches, which helps with detection in the railway environment, which is characterized by its lack of texture. The markers should be placed within the railway wagon, and the number needed depends on the carriage’s size. For instance, a wagon of 2 m × 5 m × 2.5 m requires two AR markers (one on each side). It is important to note that installing the different AR markers along the wagon requires some time; however, this is considered negligible within the whole pipeline because they are easy to place.
Using the information provided by the markers and the tracking approaches, the initial alignment of the DT on the physical scene is performed accurately, automatically, robustly, and in real time. However, this alignment requires the DT, represented in the JT file, to be created in advance. It is important to note that, to avoid errors during tracking, it is also necessary to use the camera-eye calibration method integrated into the HoloLens 2.
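To illustrate the geometry of this step, the sketch below composes the pose of a detected physical marker (as estimated by the tracker) with the known pose of the corresponding virtual marker inside the DT to obtain the transform that overlays the DT on the wagon. It is a minimal sketch under the row-vector convention of System.Numerics; the matrix names are assumptions, and the real system obtains these poses from Vuforia and the JT file.

```csharp
using System.Numerics;

// Illustrative alignment step: given the pose of the physical AR marker estimated by the
// tracker (markerToWorld) and the pose of the matching virtual marker stored in the DT
// (markerToModel), compute the transform that places the DT on the real wagon.
public static class DigitalTwinAlignment
{
    public static Matrix4x4 ComputeModelToWorld(Matrix4x4 markerToWorld, Matrix4x4 markerToModel)
    {
        // Invert the virtual-marker pose to go from DT coordinates to marker coordinates.
        if (!Matrix4x4.Invert(markerToModel, out Matrix4x4 modelToMarker))
            throw new System.ArgumentException("Marker pose is not invertible.");

        // System.Numerics uses the row-vector convention, so the left factor is applied first:
        // p_world = p_model * modelToMarker * markerToWorld.
        return modelToMarker * markerToWorld;
    }

    // Example: map a component position stored in the DT into world coordinates,
    // e.g. to anchor the placement hologram.
    public static Vector3 ToWorld(Vector3 dtPosition, Matrix4x4 modelToWorld)
        => Vector3.Transform(dtPosition, modelToWorld);
}
```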

4.4 In-operation procedures

The previous sections explained how the application was created. The following will describe the processes that work in the background while the application is running and the requirements needed. This section describes how the tracking is done and how the QR detection is carried out.
In the tracking step, the system not only uses a marker-based technique through Vuforia but also takes advantage of the tracking method used natively in HoloLens 2, a markerless method known as SLAM. By applying both techniques, the system overcomes some of the limitations that neither kind of tracking method can solve independently, such as the error accumulated by markerless methods in dynamic scenes or occlusion problems when tracking is done using a marker-based algorithm.
As previously mentioned, the marker-based method is used to perform the initial alignment. Once it is finished, the camera pose is estimated during the process using the different sensors in HoloLens 2 via SLAM. This technique reconstructs the environment while estimating the position and orientation of the camera with respect to the environment. It also allows the system to know the user’s position in the projected virtual environment so that the user can naturally move around the DT. Additionally, the use of spatial mapping is essential for AR applications, as it provides a more immersive experience, making it possible to place objects on real surfaces, create familiar, real-world behaviors and interactions with the holograms, and facilitate navigation in the environment.
Both tracking techniques are applied to obtain high spatial accuracy; therefore, the alignment of holograms in 3D space should be precise. To evaluate the spatial accuracy generated by HoloLens 2, a 3D reconstruction of the wagon was obtained using the truncated signed distance function (TSDF) volume method [63] to prove the proper functioning of the SLAM camera pose estimation process and the absence of the error accumulated over time known as drift. Specifically, the TSDF volume approach can be used with HoloLens 2 to create a 3D color reconstruction of the physical space from depth frames, RGB frames and head poses. A visual example of the reconstructed model is presented in Fig. 8, which shows the 3D reconstruction of the wagon obtained with the TSDF volume method compared with its DT. Additionally, the DT follows a validation process by the engineering department of the company involved. Specifically, the JT file used in the assembly is directly exported from the Siemens NX (12.0.2) modeling software and is validated to ensure that it is a faithful representation of the engineering model. Then, the quality department of the company employs various methods for measuring errors, including visual and manual inspections, as well as specialized software such as PolyWorks (2021 IR11.5), to compare tolerances between the digital and the physical models. For instance, in the case of a wagon chassis with dimensions (depth, height, width) of 20 000 mm × 3 000 mm × 3 000 mm, the maximum error allowed is 15 mm in depth and 5 mm in height and width.
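For reference, the per-voxel fusion rule behind the TSDF volume method of Ref. [63], in its standard weighted-average form from the literature (a generic formulation, not necessarily the exact parameterization used in this work), is

$$F_{k}(v) = \frac{W_{k-1}(v)\,F_{k-1}(v) + w_{k}(v)\,f_{k}(v)}{W_{k-1}(v) + w_{k}(v)}, \qquad W_{k}(v) = W_{k-1}(v) + w_{k}(v),$$

where $f_k(v)$ is the truncated signed distance of voxel $v$ observed in depth frame $k$ (given its head pose), $w_k$ is its observation weight, and $F_k$, $W_k$ are the fused distance and accumulated weight; the reconstructed surface is the zero level set of $F$.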
The second step required during the pipeline is the detection of the QR codes previously appended to the JT file. To detect the QR codes, the method integrated into HoloLens 2 is used. According to current best practices in the Microsoft guidelines, the QR code detectors work well under good lighting conditions and with proper sizes and distances. However, they do not perform well if the codes are small, poorly visible, or too far from the headset. Specifically, in our case the QR code used is approximately 5 cm × 5 cm, and there is a margin around all sides of the code with no printed content. The distance between the QR code and the HMD is the length of the operator’s arm, between 30 cm and 50 cm, and the operator should view the QR code frontally to satisfy the manufacturer’s conditions.

5 Results

In analyzing the advantages and disadvantages of the proposed AR solution compared to the traditional approach, we can identify Holorailway’s improvement in efficiency and usability.
To measure the impact on time, error rates, usability, and mental workload, as well as additional human factors that might be related to stress or anxiety, the experiments were carried out in an industrial environment, as depicted in Fig. 9. In particular, the industrial environment involves multiple train wagons in different stages of assembly; industrial materials such as tools, boxes, ladders, and industrial machinery; and industrial noise and workers.
This approach aligns with the emerging paradigms of Industry 5.0 and human-centric manufacturing, where human well-being is at the forefront of technological innovation. The experiments used the same tools, materials, lighting conditions, and industrial noise as the regular industrial setting, as well as an actual railway vehicle under construction.
To perform the validation, a study on the 12 selected operators’ backgrounds was done. The information obtained from this study could be split into two groups depending on the kind of data depicted. Figure 10 represents demographic information on the users, such as age, gender, and education levels. Age ranges are identified, including 18–25, 26–35, and 36–50, which cover a large part of the working population, as shown in Fig. 10b. Figure 10a shows the users’ gender. Finally, Fig. 10c depicts the users’ education levels.
Figure 11 depicts the users’ level of experience with AR and their proficiency in performing the task. Regarding experience carrying out the task, we selected 8 users with almost no experience, considered beginners, and 4 who had performed the task on many occasions, treated as experts (see Fig. 11a). Another relevant factor in the sociodemographic survey was the users’ frequency of AR use (see Fig. 11b). Most used AR once or twice per week, while others used it less than once per month. Figure 11c shows how they use AR: some used it only at work, while others also used it in their free time, and one had no experience using AR technology.
In industry, the measurement of time must be highly accurate. Therefore, the assessment of the times needed to perform the tasks was done automatically. The Holorailway system estimates the time using timestamp information, whereas in the traditional method the time was measured with a chronometer. However, in both cases the measurement was supervised to avoid errors in time estimation if a task was disrupted.
Next, the locations of three different components must be found with each method: three components for the traditional method and three for the AR approach. Thus, six components are used to perform the full experiment. The selected components have similar features: sizes between 7.5 cm × 30 cm and 20 cm × 60 cm, distances from the box where the materials are initially stored between 4 m and 15 m, and comparable locations on the wagon, with different shapes (such as L-shaped and rectangular with and without cutouts or notches, similar to Fig. 3). This is important in order to make the comparison as fair as possible. Once the setup is ready, an operator starts the task following the steps explained in Sect. 4.2, from selecting the component to finding its location on the wagon and installing it. This is repeated six times for each operator, three times with the traditional method and three using Holorailway. In each of the six cases, a supervisor controlled the measurement of the task to avoid disruptions and obtain a precise time measurement.
To analyze the information obtained from the experiments, both pipelines were tested by 12 users with different attributes. Firstly, as previously mentioned, the users completed a sociodemographic survey. Two types of results were studied, as proposed by Marino et al. [64]: objective results [48] and subjective results, which are particularly crucial in the context of human-centric manufacturing [65]. To validate these results, different aspects were analyzed:
(i)
NASA task load index (TLX) test [66]. This technique focuses on the subjective mental workload and includes a measure of frustration level, which can be related to stress or anxiety. It is commonly used when evaluating the added value of AR in comparison to traditional methods in the industrial sector [29, 31, 67, 68]. To compute the score, the average is taken over six factors: mental demand, physical demand, temporal demand, performance, effort, and frustration level (a sketch of this raw score is given after this list).
 
(ii)
System usability scale (SUS) test [69]. This test focuses on the system’s usability. Similar to the TLX test, SUS is usually used to assess AR’s usability [67, 70]. It takes three factors into account: effectiveness, efficiency and satisfaction level.
 
(iii)
Task completion time. Each task is timed, and the time is recorded. Concretely, the time is computed for each component, followed by the global time per user. In addition, each process is saved to compare the time using the proposed system with the time using the traditional procedure. Task completion time is a common metric for evaluating IAR methods [9, 25, 29].
 
(iv)
Number of errors. In this step, the number of errors made during the assembly procedure is counted. Specifically, it is considered a mistake if the component is not located in the correct position and/or orientation; a supervisor verifies this. Similar to task completion times, the number of errors is another commonly used evaluation metric [9, 25, 29].
 
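As referenced in item (i), the raw (unweighted) TLX score is simply the mean of the six subscale ratings; assuming ratings $r_1, \ldots, r_6$ normalized to $[0, 1]$ (consistent with the values reported below), it reads

$$\text{TLX} = \frac{1}{6}\sum_{i=1}^{6} r_{i}.$$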
First, a quantitative analysis was done on the task completion times and the number of errors. For the completion times, the procedure was split into three main tasks. In the first stage, the component was identified and located (Task 1), which took approximately 60% of the global time. This step includes the time taken by HoloLens 2 to scan a QR code and suggest a position, which varies between 21 ms and 24 ms. Next, the surface was cleaned and the component was positioned (Task 2), which took 25% of the global time; finally, the component was installed (Task 3), which corresponded to 15% of the global time. It is worth noting that the time required to perform the last two tasks was not reduced by using AR because both were done by hand. However, it was proven that using the hardware in Holorailway did not increase the time needed to carry out these two tasks.
Two different comparisons using the times are noteworthy. Table 2 shows a comparison between the traditional procedure and the AR-supported one. To study the decrease in time, the time required to locate the component and the global time using both procedures should be analyzed; the time for the remaining tasks can be obtained by subtracting the location time from the global time. The cleaning and installation times were not included because they were similar in both cases and not modified in the AR solution.
Table 2
Results of times in seconds using both pipelines

| Procedure | Metric | Task 1/s | Global time/s |
|---|---|---|---|
| Traditional | Mean | 65.020 | 108.37 |
|  | St. Dev. | 16.23 |  |
| AR | Mean | 13.156 | 56.5 |
|  | St. Dev. | 2.56 |  |
Using the time information for each user in both cases, a statistical study was performed using Student’s t-test between the traditional approach (M = 65.021, SD = 16.23) and the AR method (M = 13.156, SD = 2.56), t(11) = 10.75, p < 0.001, where the null hypothesis asserts that the means of the two groups, with and without AR, are equal.
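For completeness, and assuming a paired (repeated-measures) design over the $n = 12$ operators, which is consistent with the reported 11 degrees of freedom, the test statistic takes the usual form

$$t = \frac{\bar{d}}{s_{d}/\sqrt{n}}, \qquad \text{d.f.} = n - 1 = 11,$$

where $\bar{d}$ and $s_d$ are the mean and standard deviation of the per-operator differences in Task 1 time between the traditional and AR conditions.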
The second noteworthy comparison was made between the users’ different levels of experience. Users were considered beginners if they had less than two years of experience, as described by Masood et al. [17]. Otherwise, they were considered experts. The results of this comparison are depicted in Table 3. It is important to note that the time is extracted only for the first task, which is the one that makes use of AR’s advantages.
Table 3
Results of times for the first task in seconds for different profiles

| Profile | Traditional/s | AR/s |
|---|---|---|
| Beginner | 72.89 | 13.27 |
| Expert | 49.29 | 12.94 |
The comparison of the number of errors is similar to those conducted previously. It is important to mention that we consider an error to occur when the orientation or the location of the component is not correct, e.g., if the component is upside-down or flipped, or if it is placed outside the 2 cm tolerance. It is important to remark that during the validation process the system itself did not produce any errors; the errors reported in Table 4 were attributed to human factors, particularly to the operators when placing the component. Additionally, the error rate is measured exclusively during the first attempt with an empty train without any guides, such as other already placed insulation components. With these considerations in mind, a comparison of error reduction when using the AR solution is depicted in Table 4. Regarding these results, it is important to point out that 13 out of the 17 errors made in the traditional approach were caused by the beginners, indicating that a training period is essential to perform the task with a reduced number of errors.
Table 4
Errors produced during Task 1 using both procedures

| Procedure | Number of errors | Mean number of errors |
|---|---|---|
| Traditional | 17 | 1.42 |
| AR | 2 | 0.17 |
Other important results were obtained from the NASA-TLX and SUS tests. Each user completed these tests to provide feedback on the workload and usability of the systems. The NASA-TLX test results and the evaluation of each factor are depicted in Fig. 13. The SUS test results are illustrated in Fig. 14. A Student’s t-test was performed for both tests (NASA-TLX: M_trad = 0.58, SD_trad = 0.098, M_AR = 0.39, SD_AR = 0.075, t(11) = 5.78 and p < 0.001; SUS: M_trad = 41.875, SD_trad = 18.92, M_AR = 84.375, SD_AR = 9.71, t(11) = 1.79 and p < 0.001), where the null hypothesis asserts that the mean scores of the two groups (traditional vs. AR) are equal.

6 Discussion

As mentioned in Sect. 5, different aspects were analyzed to carry out this in-depth analysis, comparing both methods and proving the advantages of the AR approach.
In terms of time, two aspects were analyzed: the mean time for both procedures, and the mean time for both pipelines depending on the users' experience. The times obtained in the identification and location task (Task 1) per user are represented in Fig. 12, which shows the times of the 12 users performing both methods. The difference of 51.87 s between the traditional method and the AR solution is calculated from the mean global time values for each method, shown in Table 2. These results are in line with Ref. [67], which reported a 30% time reduction in the understanding phase (our Task 1), lower than the reduction achieved here. Re et al. [68] obtained a time reduction of 38.27%, which is also consistent with the results obtained in this work. As mentioned above, the evaluation using Student's t-test confirms that the null hypothesis is rejected, which proves a large difference between the means of both methods, as shown in Fig. 12.
Table 3 compares the times obtained when the process is carried out by a beginner (less than two years of experience) and by an expert (more than two years of experience). This second study supports the hypothesis that beginners can perform the assembly task in a time similar to that of experts when assisted by AR. The comparison between profiles shows that experience is a key factor in the traditional approach but not a significant factor when using the AR solution. Therefore, extensive training of novice operators is not necessary, because AR helps a beginner complete the task in a time similar to that of an expert. As shown, the difference between profiles is less than 1 s with the AR method, whereas in the traditional approach the difference is more than 20 s. This is coherent with the results presented by Loizeau et al. [67], who reported an improvement of 25% for beginners and 15% for experts, the latter being lower than the beginners' reduction.
After testing with different models and users, the reduction in the first task was estimated at around 79%, which corresponds to a 47% reduction in the global time. Therefore, the AR approach reduces the assembly time, in line with the other IAR approaches mentioned previously [14, 35, 67].
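These percentages follow directly from the mean times in Table 2; the short computation below reproduces them (rounding explains why the results land near the reported 79% and 47%).

```python
# Mean times taken from Table 2 (seconds).
task1_traditional, task1_ar = 65.02, 13.156
global_traditional, global_ar = 108.37, 56.5

task1_reduction = (task1_traditional - task1_ar) / task1_traditional
global_reduction = (global_traditional - global_ar) / global_traditional

print(f"Task 1 reduction: {task1_reduction:.1%}")   # ~79.8%
print(f"Global reduction: {global_reduction:.1%}")  # ~47.9%
```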
Another factor studied was the number of errors. The AR method significantly reduced the number of errors [14], as shown in Table 4: they decreased by 88%, from 17 errors per project in the traditional use case to 2 errors per project when using Holorailway. The errors produced with the traditional method were mainly positioning errors, i.e., the operator placed the component in an erroneous position, in most cases far from the expected location; a few errors were caused by an incorrect orientation of the component. In contrast, the two errors produced with the AR system were orientation errors, and in those cases the components were almost symmetrical, so an extra effort is needed to orient them correctly. It can therefore be concluded that the DT's visual support clearly reduces the number of errors produced.
The results of the NASA-TLX and SUS tests (see Figs. 13 and 14, respectively) show that most users found that, with the traditional method, the complexity grows with the number of components to be placed. Furthermore, with the traditional solution the user has no way to be certain that the element is being hung in the correct position; this issue is solved with the AR method, where the DT of the component is clearly visible. From the NASA-TLX results (see Fig. 13), it can be determined that the AR approach improves all the factors studied in the test, reducing the users' mental, physical and temporal demands, as well as frustration and effort, while improving performance. This conclusion is supported by the results of the Student's t-test. These results are similar to those obtained by Re et al. [68], who report a significant difference between the means of the two methods (traditional and AR) and an overall reduction in mental workload of 27%.
Likewise, the SUS results shown in Fig. 14 demonstrate that the scores are higher for the Holorailway approach than for the traditional method. The Student's t-test revealed a significant difference between both methods, so the null hypothesis is also rejected. Therefore, it can be stated that the AR solution improves usability compared with the traditional approach, as already concluded in Ref. [67], where the improvement in usability is particularly visible in the beginners' results.
Based on the evaluation of these different aspects, we conclude that the presented solution outperforms the traditional method in efficiency, in the time devoted to carrying out the pipeline, in the number of errors when locating the precise position of the insulation components, and in usability. A limitation of this work concerns the validation process, specifically worker proficiency and learning curves: when workers repeatedly operate similar processes, they tend to become more proficient over time, resulting in shorter completion times and fewer errors. This improvement can bias the evaluation of a new system's effectiveness, such as the Holorailway system, when it is compared with traditional methods.
To conclude, Holorailway is a novel IAR system that successfully uses AR technologies to improve the assembly task in the railway sector. Using AR, the efficiency and precision of the process were enhanced and an almost error-free assembly method was obtained. The approach was analyzed in depth through an exhaustive statistical study under production conditions, showing excellent results in terms of time, number of errors, and usability compared with the traditional method. The presented AR system can significantly improve productivity and efficiency in this field while enhancing final quality. Holorailway automatically aligns virtual information on a wagon in real time and gives visual instructions to workers, making the assembly task simpler and more efficient. Additionally, new operators do not need previous knowledge to perform the task, which reduces the difference in task performance time between experts and beginners.
As future work, this system could be adapted to different use cases in the railway sector or in other industrial sectors. As mentioned previously, the only requirements are the creation of a configuration file and of the DT with AR markers, so it could be applied, for example, to the airplane or automotive industry. Within the railway sector, it is planned to be used for the assembly of other components, such as wire supports, threaded rivets, and pass partitions. In these applications, aluminum templates are often used to help operators during the assembly process, enhancing precision and minimizing the likelihood of errors but increasing environmental impact and production costs. With the Holorailway solution, these aluminum templates could be removed, significantly reducing environmental impact and production costs and improving the sustainability of the sector.
Regarding the aforementioned limitations, future work will include a more exhaustive validation process in which two groups of workers with similar characteristics operate in different sequences: one group starting with the traditional procedure in the real use case and the other starting with the AR-based solution.
Finally, following the zero-defect manufacturing (ZDM) guidelines, the Holorailway solution could be complemented with a system that prevents and corrects failures, helping to eliminate human errors in the final product.

Acknowledgments

The authors want to thank Víctor Gascó and Alejandro Juan for their technical support during the development phase and also express gratitude to Pablo Escribano for his help in creating the figures in this paper. We would also like to thank the following STADLER staff for their invaluable collaboration. To Juan María Lázaro Miguel-Sin for his initial valuable contribution. To Rodolfo Sierra and Manuel Sospedra, whose deep understanding of STADLER's industrial processes has made the implementation of AR incredibly effective. And to Fernando Martínez Blat, who from the INDUSTRIAL INNOVATION department has facilitated the collaboration between STADLER and H-TECH. The author P.M. is the beneficiary of a University Teacher Training scholarship granted by the Spanish Ministry of Universities. This work was supported by the European Commission, Project EXPERIENCE H2020-FETPROACT-EIC-07-2020-101017727.
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
References
1. Schwab K (2017) The fourth industrial revolution, Hardcover edn. Currency, Geneva, p 192
2. Lasi H, Fettke P, Kemper HG et al (2014) Industry 4.0. Bus Inf Syst Eng 6(4):239–242
3. Grieves M (2014) Digital twin: manufacturing excellence through virtual factory replication. White Pap 1:1–7
4. Qi Q, Tao F (2018) Digital twin and big data towards smart manufacturing and Industry 4.0: 360 degree comparison. IEEE Access 6:3585–3593
5. Carmigniani J, Furht B (2011) Augmented reality: an overview. In: Furht B (ed) Handbook of augmented reality. Springer, New York, pp 3–46
6. De Pace F, Manuri F, Sanna A (2018) Augmented reality in Industry 4.0. Am J Comput Sci Inf Technol 6(1):1–7
9. Langley A, Lawson G, Hermawati S et al (2016) Establishing the usability of a virtual training system for assembly operations within the automotive industry. Hum Factors Ergon Manuf Serv Ind 26(6):667–679
10. Fraga-Lamas P, Fernández-Caramés TM, Blanco-Novoa O et al (2018) A review on industrial augmented reality systems for the Industry 4.0 shipyard. IEEE Access 6:13358–13375
11. Regenbrecht H, Baratoff G, Wilke W (2005) Augmented reality projects in the automotive and aerospace industries. IEEE Comput Graph Appl 25(6):48–56
12. Popov O, Iatsyshyn A, Sokolov D et al (2021) Application of virtual and augmented reality at nuclear power plants. In: Zaporozhets A, Artemchuk V (eds) Systems, decision and control in energy II. Springer, Cham, pp 243–260
15. Dalle Mura M, Dini G (2021) Augmented reality in assembly systems: state of the art and future perspectives. In: International precision assembly seminar. Springer, pp 3–22
16. Ashwini KB, Patil PN et al (2020) Tracking methods in augmented reality: explore the usage of marker-based tracking. In: Proceedings of the 2nd international conference on IoT, social, mobile, analytics & cloud in computational vision & bio-engineering (ISMAC-CVB 2020)
17. Masood T, Egger J (2019) Augmented reality in support of Industry 4.0—implementation challenges and success factors. Robot Comput Integr Manuf 58:181–195
18. Georgel PF (2011) Is there a reality in industrial augmented reality? In: 2011 10th IEEE international symposium on mixed and augmented reality. IEEE, pp 201–210
20. Ungureanu D, Bogo F, Galliani S et al (2020) HoloLens 2 research mode as a tool for computer vision research. arXiv preprint arXiv:2008.11239
21. Vidal-Balea A, Blanco-Novoa O, Fraga-Lamas P et al (2020) Creating collaborative augmented reality experiences for Industry 4.0 training and assistance applications: performance evaluation in the shipyard of the future. Appl Sci 10(24):9073. https://doi.org/10.3390/app10249073
22. Psarommatis F, May G, Dreyfus PA et al (2020) Zero defect manufacturing: state-of-the-art review, shortcomings and future directions in research. Int J Prod Res 58(1):1–17
23. Psarommatis F, Sousa J, Mendonça JP et al (2022) Zero-defect manufacturing the approach for higher manufacturing sustainability in the era of Industry 4.0: a position paper. Int J Prod Res 60(1):73–91
24. Wohlgemuth W, Triebfürst G (2000) ARVIKA: augmented reality for development, production and service. In: Proceedings of DARE 2000 on designing augmented reality environments, pp 151–152
25. Lima JP, Roberto R, Simões F et al (2017) Markerless tracking system for augmented reality in the automotive industry. Expert Syst Appl 82:100–114
27. Oh YJ, Park KY, Kim EK (2014) Mobile augmented reality system for design drawing visualization. In: 16th international conference on advanced communication technology. IEEE, pp 1296–1300
28. Masoni R, Ferrise F, Bordegoni M et al (2017) Supporting remote maintenance in Industry 4.0 through augmented reality. Procedia Manuf 11:1296–1302
29. Funk M, Kosch T, Schmidt A (2016) Interactive worker assistance: comparing the effects of in-situ projection, head-mounted displays, tablet, and paper instructions. In: Proceedings of the 2016 ACM international joint conference on pervasive and ubiquitous computing, pp 934–939
32. Doshi A, Smith RT, Thomas BH et al (2017) Use of projector based augmented reality to improve manual spot-welding precision and accuracy for automotive manufacturing. Int J Adv Manuf Technol 89(5):1279–1293
33. Mourtzis D, Zogopoulos V, Vlachou E (2018) Augmented reality supported product design towards Industry 4.0: a teaching factory paradigm. Procedia Manuf 23:207–212
34. Alvarez H, Aguinaga I, Borro D (2011) Providing guidance for maintenance operations using automatic markerless augmented reality system. In: 2011 10th IEEE international symposium on mixed and augmented reality. IEEE, pp 181–190
35. Fiorentino M, Uva AE, Gattullo M et al (2014) Augmented reality on large screen for interactive maintenance instructions. Comput Ind 65(2):270–278
36. Mourtzis D, Zogopoulos V, Vlachou E (2017) Augmented reality application to support remote maintenance as a service in the robotics industry. Procedia CIRP 63:46–51
37. Hořejší P (2015) Augmented reality system for virtual training of parts assembly. Procedia Eng 100:699–706
38. Füchter SK, Schlichting MS, Salazar G (2021) Aeronautic pilot training and augmented reality. Acta Imeko 10(3):66–71
39. Fründ J, Gausemeier J, Matysczok C et al (2005) Using augmented reality technology to support the automobile development. In: Computer supported cooperative work in design I: 8th international conference, CSCWD 2004, Xiamen, China, 26–28 May 2004
40. Ceruti A, Marzocca P, Liverani A et al (2019) Maintenance in aeronautics in an Industry 4.0 context: the role of augmented reality and additive manufacturing. J Comput Des Eng 6(4):516–526
41. Mourtzis D, Zogopoulos V, Xanthi F (2019) Augmented reality application to support the assembly of highly customized products and to adapt to production re-scheduling. Int J Adv Manuf Technol 105(9):3899–3910
44. de Oliveira ME, Corrêa CG (2020) Virtual reality and augmented reality applications in agriculture: a literature review. In: 2020 22nd symposium on virtual and augmented reality (SVR). IEEE, pp 1–9
45. Woll R, Damerau T, Wrasse K et al (2011) Augmented reality in a serious game for manual assembly processes. In: 2011 IEEE international symposium on mixed and augmented reality-arts, media, and humanities. IEEE, pp 37–39
46. Barfield W, Caudell T (2001) Boeing's wire bundle assembly project. In: Fundamentals of wearable computers and augmented reality. CRC Press, pp 462–482
47. Liu Y, Li S, Wang J et al (2015) A computer vision-based assistant system for the assembly of narrow cabin products. Int J Adv Manuf Technol 76(1):281–293
48. Lai ZH, Tao W, Leu MC et al (2020) Smart augmented reality instructional system for mechanical assembly towards worker-centered intelligent manufacturing. J Manuf Syst 55:69–81
49. Richardson T, Gilbert S, Holub J et al (2014) Fusing self-reported and sensor data from mixed-reality training
51. Evans G, Miller J, Pena MI et al (2017) Evaluating the Microsoft HoloLens through an augmented reality assembly application. In: Degraded environments: sensing, processing, and display 2017, vol 10197. SPIE, pp 282–297
53. Vorraber W, Gasser J, Webb H et al (2020) Assessing augmented reality in production: remote-assisted maintenance with HoloLens. Procedia CIRP 88:139–144
54. Brunzini A, Mandolini M, Caragiuli M, Germani M et al (2021) HoloLens 2 for maxillofacial surgery: a preliminary study. In: International conference on design, simulation, manufacturing: the innovation exchange. Springer, pp 133–140
55. Leu MC, ElMaraghy HA, Nee AY et al (2013) CAD model based virtual assembly simulation, planning and training. CIRP Ann 62(2):799–822
56. Webel S, Becker M, Stricker D et al (2007) Identifying differences between CAD and physical mock-ups using AR. In: 2007 6th IEEE and ACM international symposium on mixed and augmented reality. IEEE, pp 281–282
57. Azpiazu J, Siltanen S, Multanen P et al (2011) Remote support for maintenance tasks by the use of augmented reality: the ManuVAR project. In: CARVI 2011: IX congress on virtual reality applications, Alava, Spain, 11–12 November 2011
60. Seeliger A, Netland T, Feuerriegel S (2022) Augmented reality for machine setups: task performance and usability evaluation in a field test. Procedia CIRP 107:570–575
61. Teruggi S, Fassi F (2022) HoloLens 2 spatial mapping capabilities in vast monumental heritage environments. Int Arch Photogramm Remote Sens Spatial Inf Sci XLVI-2/W1-2022:489–496
62. Unger M, Heinrich S, Rick M et al (2022) Hologram accuracy evaluation of HoloLens 2 for thermal imaging in medical applications. Curr Direct Biomed Eng 8(2):193–196
63. Curless B, Levoy M (1996) A volumetric method for building complex models from range images. In: Proceedings of the 23rd annual conference on computer graphics and interactive techniques, pp 303–312
65. Lu Y, Zheng H, Chand S et al (2022) Outlook on human-centric manufacturing towards Industry 5.0. J Manuf Syst 62:612–627
66. Cao A, Chintamani KK, Pandya AK et al (2009) NASA TLX: software for assessing subjective mental workload. Behav Res Methods 41(1):113–117
67. Loizeau Q, Danglade F, Ababsa F et al (2019) Evaluating added value of augmented reality to assist aeronautical maintenance workers—experimentation on on-field use case. In: International conference on virtual reality and augmented reality. Springer, pp 151–169
68. Re GM, Oliver J, Bordegoni M (2016) Impact of monitor-based augmented reality for on-site industrial manual operations. Cognit Technol Work 18(2):379–392
69. Jordan PW, Thomas B, McClelland IL et al (1996) Usability evaluation in industry. CRC Press, Boca Raton
70. Fischini A, Ababsa F, Grasser M (2018) Usability of augmented reality in aeronautic maintenance, repair and overhaul. In: International conference on artificial reality and telexistence and Eurographics symposium on virtual environments, Limassol, Cyprus, November 2018