
Open Access 2021 | OriginalPaper | Chapter

Augmented Reality in Assembly Systems: State of the Art and Future Perspectives

Authors: M. Dalle Mura, G. Dini

Published in: Smart Technologies for Precision Assembly

Publisher: Springer International Publishing


Abstract

Assembly represents a fundamental step in manufacturing: it is a time-consuming and costly process on which the final quality of the product largely depends. Augmented Reality (AR) may represent a key tool to assist workers during assembly, thanks to the possibility of providing the user with real-time instructions and information superimposed on the work environment. Many implementations have been developed by industries and academic institutions for both manual and collaborative assembly. Among the most remarkable examples of the last few years are applications in the guidance of complex tasks, the training of personnel, and quality control and inspection. This keynote paper aims to provide a useful survey by reviewing recent applications of AR in assembly systems, describing potential advantages as well as current limitations and future perspectives.

1 Introduction

Assembly represents a key element in the whole fabrication process. The total cost of a product, the time required to complete it and its quality strictly depend on how efficiently and accurately the different assembly steps are executed.
Very often, these operations can be complex and require fine adjustments in order to obtain an acceptable final result. The sequence can be lengthy, with many parts to be assembled together in a specific order, so as to ensure that the product functions as it should.
For these reasons, workers should be skilled and trained to do this within the cycle time imposed by the production rate. In many cases, the sequence can depend on the variant to be assembled and can require the consultation of paper manuals or reference tables, which can lead to time inefficiencies, distractions and, therefore, safety problems.
Typically, in manual assembly, the tasks are performed by human operators aided by tools or by semi-automated machines. Set-up and maintenance operations should be planned in advance for all equipment, so as to allow workers to use it properly and to avoid problems related to unexpected breakdowns, with the consequent inactivity of the production system.
Human error is another relevant problem in assembly. Obviously, errors should be prevented by using different methods such as training sessions, sensing devices or poka-yoke solutions. These approaches are often expensive and in many cases do not give complete assurance of avoiding such problems. Human errors in assembly may result in increased production waste or higher processing times and costs, and may also worsen the quality level of products because of manufacturing defects. To avoid possible damage to the entire production system, the probability of human error in assembly should be reduced.
That said, the need to integrate traditional manual assembly with a tool able to enhance the efficiency and effectiveness of the process becomes evident. The operator must be supported and guided in his/her actions throughout the accomplishment of assembly operations by non-invasive devices, which do not divert the user's attention from the process. The tool should also be connectable to the hardware employed in the system, such as sensors, and able to access and consult software databases, in order to help the operator execute the tasks as efficiently and effectively as possible.
Augmented Reality (AR) technology could be particularly suitable to address these issues. The possibility of replacing paper manuals with interactive instructions allows the worker to interact with both the real environment and the virtual information, also receiving feedback on the operations performed.

2 Basics on Augmented Reality

Augmented Reality (AR) can be defined as a hardware-software system able to overlay virtual images or objects on the real-world view, in order to give the observer information that he/she could not obtain through the interaction of his/her physical senses with the workplace alone.
The main objective of an AR system is therefore to create virtual elements in the optical path linking the user’s eyes with the real scene in front of him/her. To do that, the following steps have to be implemented:
  • real image acquisition, through a video camera that captures the images from the real world;
  • tracking, necessary to identify and record the position and orientation of objects in the environment, so as to properly align the virtual image. Vision-based tracking systems are widely employed, namely fiducial marker identification, which uses artificial markers to locate components in space, and markerless identification, which uses natural object features;
  • virtual image generation, through a computer that uses 3D CAD models to create digitized contents;
  • composite image generation, obtained by combining the real images captured by the camera with the virtual images generated by the computer, so as to produce the augmented content.
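To make these four steps more concrete, the following minimal sketch shows one possible realization of the pipeline in Python, assuming OpenCV (version 4.7 or later) and fiducial-marker tracking; the camera intrinsics, marker size and overlaid graphics are placeholder assumptions and do not correspond to any specific system described in this chapter.

```python
# Minimal sketch of the four AR pipeline steps (acquisition, tracking, virtual
# image generation, composite image generation) using OpenCV ArUco markers.
# Camera intrinsics, marker size and overlay content are illustrative assumptions.
import cv2
import numpy as np

K = np.array([[800.0, 0.0, 320.0],      # assumed camera intrinsics
              [0.0, 800.0, 240.0],      # (normally obtained by calibration)
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)                       # assume negligible lens distortion
MARKER_SIZE = 0.05                       # marker side length in metres (assumption)

detector = cv2.aruco.ArucoDetector(
    cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50))

cap = cv2.VideoCapture(0)                # 1) real image acquisition
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break

    # 2) tracking: detect fiducial markers and estimate their pose
    corners, ids, _ = detector.detectMarkers(frame)
    if ids is not None:
        half = MARKER_SIZE / 2
        obj = np.array([[-half,  half, 0], [ half,  half, 0],
                        [ half, -half, 0], [-half, -half, 0]], dtype=np.float32)
        for c in corners:
            found, rvec, tvec = cv2.solvePnP(obj, c.reshape(4, 2), K, dist)
            if found:
                # 3) virtual image generation: a coordinate axis stands in for
                #    CAD-derived instructions; 4) composite image generation:
                #    the axis is drawn into the camera frame at the tracked pose.
                cv2.drawFrameAxes(frame, K, dist, rvec, tvec, MARKER_SIZE)

    cv2.imshow("augmented view", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

In a real assembly application, the drawn axes would be replaced by 3D work instructions rendered from CAD models and displayed through one of the devices discussed in the following sections.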

2.1 AR Techniques

The composite image can be generated using three different approaches, which strongly characterize and influence the performance of a specific application:
  • optical see-through, virtual images are projected onto a transparent display positioned between the user's eyes and the real scene (Fig. 1a);
  • video see-through, real and virtual images are mixed and visualized on a monitor display positioned in front of the user (Fig. 1b);
  • image projection, virtual images are directly projected onto the surfaces of the objects in the real world (Fig. 1c).

2.2 AR Devices

An AR device is the hardware tool through which an operator visualizes the real scene in front of him/her enriched by the virtual elements. This device represents a very important element of the whole AR system, since it constitutes the means by which the human perceives an augmented workplace.
In order to fulfill this role, an AR device should be:
  • realistic: an AR system which operates interactively in real time and in three dimensions should give a “real” interpretation and sensation of the augmented environment;
  • immersive: virtual objects and information are integrated with the real environment, in order to give a 3D perception of the scene and therefore an immersive feeling of being surrounded by a mixed real-virtual environment;
  • comfortable: wearable devices must ensure portability and be non-invasive, so as not to distract attention from the process.
Many solutions have been proposed in the last decade by different manufacturers and tested by researchers and industries. Four categories of devices can be identified for use in a workplace, depending on their position with respect to the user and to the real environment:
  • head mounted devices (HMD): these visual displays are worn on the user's head. Advantages of using an HMD are that it is hands-free and gives an immersive perception to the user, since the display with the AR scene is positioned at eye level; a current drawback is that it may be uncomfortable and the user could suffer from eye strain, especially after prolonged use. Among the most well-known HMDs are AR glasses such as Microsoft HoloLens, Magic Leap One and Google Glass;
  • hand held devices (HHD): among these are smartphones and tablets. The main advantage with respect to an HMD is that the device can be held only when necessary. However, they are less immersive and not hands-free, which could limit the worker in his/her actions;
  • spatial devices (SD): these consist of displays of different sizes placed statically within the environment, and are therefore not suited for mobile solutions;
  • projection devices (PD): these solutions have the advantage of giving very good mobility to the user. However, they need to be calibrated, they require flat and wide surfaces in the work area for projecting images and they suffer from obstructions caused by the user.
Some of these types of devices are better suited than others to certain approaches for visualizing AR content, as presented in Table 1. In particular, HMDs can effectively operate in either video see-through or optical see-through mode.
Table 1. Capability of AR devices to be used in the three composite image generation approaches.

Device type           | Video see-through | Optical see-through | Image projection
Head mounted devices  | X                 | X                   | –
Hand held devices     | X                 | –                   | –
Spatial devices       | X                 | –                   | –
Projection devices    | –                 | –                   | X
Other manufacturers are proposing innovative and more powerful devices, currently under development, such as AR contact lenses (or smart lenses), to be used as an accessory to smartphones or as a separate device. Smart contact lens technology aims to project images into the wearer's eyes through a tiny display, camera and antenna embedded in the lens, with sensors controlled by the wearer's blinking [1]. Other research groups are working on the development of new AR hardware, such as head-worn projectors [2] and Virtual Retinal Displays (VRD), which create the augmented images by projecting laser light directly into the worker's eye.
These last solutions, even though they aim at a more immersive experience for the user, are at the current stage of development far from practical use, and the absence of risks for the worker still has to be proved.

2.3 AR Applications in Production Engineering

In recent years, AR techniques have been spreading into many fields, from retail to military, from healthcare to education, gaming, travel and tourism.
Production engineering, in particular, represents an area with a rapidly increasing interest in this kind of tool, which can be used in many activities, such as:
  • worker support in performing complex operations, using virtual instructions during particular tasks and preventing the user from performing wrong actions that may lead to errors and time-related costs;
  • guided training, accomplished through AR tools for teaching procedures and instructions conceived for and addressed to inexperienced personnel;
  • remote support and maintenance, thanks to the possibility of providing the composite image to the worker on the field and, remotely and simultaneously, to the expert maintenance technicians [3];
  • other potential applications, such as real-time factory diagnostics or colleague collaboration and communication.
As said before, assembly represents a typical area of production engineering where this kind of technique may be profitably used, due to the strong need for support and interactive tools when performing the complex and articulated tasks encountered in assembly procedures.
The aim of this survey is therefore to provide a comprehensive insight into the state of the art of AR applications in assembly systems, as well as into some current open issues that need to be overcome.

3 Applications in Manual Assembly Systems

Many applications have been tested in manual assembly systems, mainly to support operators in correctly performing an assembly sequence. Extensive surveys published in 2012 and 2016 [4, 5] demonstrate that, since the beginning of this century, augmented reality has received increasing attention from researchers working on assembly problems.
In this wide literature scenario, a first rough subdivision can be made between "real industrial" and "demonstrative" cases. The former concerns the implementation of an AR system for supporting a real industrial problem (e.g. the assembly of an engine or a car body) and represents the main topic of this survey paper; the latter concerns the experimental demonstration of procedures using simple parts or toys as benchmarks.
These latter contributions are often used to test specific hardware or software, such as in [6, 7], where a smartphone, a Microsoft HoloLens and Epson Moverio BT-200 smart glasses are compared with paper-based instructions. The studies demonstrate that, on the one hand, users assemble a Lego Duplo set fastest when using paper instructions but, on the other hand, make fewer errors with AR support (Fig. 2).
Other "non-industrial" cases often concern the assembly of furniture, which is very well suited as a demonstrator, such as the research described in [8], where a Microsoft HoloLens is used. The tests show that good results are achieved for assemblies of high complexity.

3.1 Guidance in Assembly Tasks – Optical and Video See-Through Approaches

Guidance in performing assembly tasks undoubtedly represents the typical objective of many AR applications. In most cases, optical or video see-through approaches are used to guide assembly operations.
An example of a system used to replace paper pick lists in assembly operations is given in [9] and illustrated in Fig. 3. The AR application is developed for Microsoft HoloLens hardware and the information is presented to the worker via a head-mounted display. The system shows a virtual "tunnel" indicating to the worker the bin where the part is placed, and demonstrated its time-efficiency for both single-kit and batch preparation.
Other applications concern the use of AR devices for guiding the correct execution of assembly operations. In [10], a prototype system is developed to provide different modalities of guidance based on the different cognitive phases of the user, i.e. perception, attention, memory and execution, in order to give appropriate information during assembly tasks. The prototype adopts an HMD and a 3D bare-hand interface, free from devices such as a mouse or keyboard. In experiments concerning a motorbike alternator assembly, users executed operations more quickly and accurately in comparison to digital documentation on an LCD.
The study presented in [11] focuses on an AR video see-through support that associates different visual features with assembly operations of different difficulty levels: the more complex the task to perform, the easier to understand is the instruction given by the visual feature (Fig. 4). The results show that the method outperforms paper instructions.
A voice-based interaction technique is integrated with an augmented reality system for car assembly and disassembly in the work proposed in [12]. The speech signal is used to interact with the augmented reality environment, composed of the virtual car components visualized in the real scene. The voice commands, converted into textual commands, leave both hands free, allowing the user to manipulate the various components.
In addition to hearing, AR has been combined with other human senses to overcome some drawbacks that may occur when using this technology alone. An example is presented in [13], where a real-time vibrotactile guidance method is developed to support assembly tasks. The system uses a web camera, located on top of the workstation, to recognize components and the user's hands, and a vibrotactile wristband to give the user feedback on the action performed in the form of vibration.

3.2 Guidance in Assembly Tasks – Image Projection Approaches

Image projection is often used in assembly workplaces for its intrinsic capability of leaving the worker free to move without carrying AR devices on his/her body. These systems, like the previous ones, eliminate the need for hard-copy or monitor-based work instructions and create a sort of digital "operating canvas" on a work surface.
A projector-based AR tool is in general simpler than a video see-through or an optical see-through system, and also more intuitive for the operator to use throughout every step of a manual process: all the operator has to do is follow the lights projected on the workbench or on the parts, as illustrated in Fig. 5 [14].
At Dassault Aviation, an image projection AR system by Diota [15] is used to guide operators in complex assembly tasks performed inside the panels of fuselage structures. Instructions related to the operations to be performed and the machine types to be used are displayed on the assembly. Figure 6 demonstrates how the system works, also showing the obstruction generated by the presence of the worker, a typical problem in these kinds of systems.
Another approach to image projection is shown in Fig. 7 [16]. The system uses a panel placed near the operator, onto which instructions are projected to avoid major drawbacks of an HMD, such as poor comfort, the need for special versions for users who wear eyeglasses, and a limited field of view. To step through the assembly sequence and review previous instructions, the worker uses foot pedals; in this way, the hands are kept free from the AR device and the worker does not interrupt the assembly process.

3.3 Guidance in Complex Assembly Tasks

Guidance in complex assembly tasks is essential for performing these kinds of operations and for avoiding errors or time-consuming procedures. Usually, these tasks are accomplished by using reference manuals, forcing the worker to read several pages of paper instructions.
Typical complex assembly tasks can be encountered in the aerospace industry. NASA uses HoloLens AR headsets [17] to speed up the building of a new spacecraft and the assembly of the Orion crew capsule, as shown in Fig. 8. Boeing has developed smart glasses and an AR application for wiring harness operations. The system enables assemblers to easily see where the electrical wiring goes in the aircraft fuselage, allowing a significant reduction in errors and production time [18].
A particular situation of complex assembly is represented by operations performed in areas not directly visible to the user. The work proposed in [19] uses a tracking method to acquire information on the position of target objects in real time, which is combined with the digitized model of the parts and of the human hands in the invisible area and superimposed on the assembly scene (Fig. 9). The method demonstrated improved efficiency and accuracy compared to assembly without augmented reality assistance.
Complex assembly tasks can also be encountered in mounting electrical components. A tutoring system for motherboard assembly has been developed in [20]; the prototype uses an HMD to teach novice users the assembly of hardware components on a computer motherboard.

3.4 Order Picking

Order picking, i.e. finding and retrieving the correct parts and products from a warehouse just before assembly, is a fundamental process in any production plant. Companies are continuously looking for new methods to improve this costly process, for which AR can represent an innovative solution.
In an application in the automotive industry, workers automatically receive all the information they need, such as storage locations or part numbers, directly in their field of vision by using smart glasses (Fig. 10). The camera mounted on this device can also be used as a barcode reader: when a worker selects the correct part, the barcode is shown in green; if not, it is illuminated in red [21].

3.5 Quality Control and Inspection

Quality control and inspection represent important steps in an assembly sequence. In many cases, they require instruments and a comparison with reference values. For these reasons, AR systems are very suitable for this kind of application and can efficiently support the operator in this critical phase.
An example is reported in Fig. 11, where an HHD using a video see-through approach is employed to determine the correct position of the rear light of a vehicle and its alignment with respect to the surrounding body panels [22].
Renault Trucks is also using AR for quality control on assembly lines [23]. Quality control operators wear Microsoft HoloLens smart glasses to improve inspection operations during engine assembly, as shown in Fig. 12.
In the aerospace industry, an AR tool for quality control was developed by Airbus Group for fuselage section inspection during assembly. The tool consists of an HHD that quickly detects errors, in order to optimize inspection time while reducing the risk of deviations in conformity [24].

3.6 Integration with Sensing Devices

Since AR alone cannot provide feedback signals for closed-loop control of the tasks executed by the operator, in manual assembly lines AR-based systems are strongly required to be integrated with sensors able to give the user appropriate physical values of the process [25]. Indeed, AR can be integrated with sensing devices to create a synergistic system in which the limits of one are compensated by the other. AR can thus be employed to guide the operator in carrying out the correct action and to provide support in recovering from any errors committed.
For this purpose, the contribution described in [26] proposes the configuration of a manual assembly workstation integrating a sensing device and augmented reality equipment, in order to guide the actions performed by the worker. Figure 13 shows the scheme of the system, obtained by the integration of a force sensor able to detect the subset of assembly errors that cannot be detected by the AR camera.
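As a purely schematic illustration of this closed-loop idea (not the implementation described in [26]), the sketch below gates the advance of AR instructions on a sensor reading: the next virtual instruction is shown only when the measured force confirms the current step, otherwise a recovery hint is displayed. The sensor interface, thresholds and messages are hypothetical.

```python
# Schematic sketch of sensor-gated AR instructions; all names and values are
# hypothetical and only illustrate the closed-loop principle discussed above.
from dataclasses import dataclass

@dataclass
class AssemblyStep:
    instruction: str            # content pushed to the AR device for this step
    expected_force_n: float     # nominal insertion/fastening force (assumed)
    tolerance_n: float          # acceptable deviation (assumed)

def run_station(steps, read_peak_force, show_on_ar):
    """Advance through the assembly steps only when the force sensor confirms them."""
    for step in steps:
        show_on_ar(step.instruction)
        while True:
            force = read_peak_force()                # reading from the force sensor
            error = abs(force - step.expected_force_n)
            if error <= step.tolerance_n:
                show_on_ar("Step confirmed")         # positive feedback to the worker
                break
            # error recovery: AR guides the worker instead of silently failing
            show_on_ar(f"Check part seating (force off by {error:.1f} N)")
```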

3.7 Training

Starting from the consideration that an AR system, in principle, can be used both for guiding expert workers and for training inexperienced users, the difference between these two applications lies in the level of detail of the instructions given by the system.
In an AR system used for training, the sequence of instructions and the virtual objects used to guide the operator take into account the fact that he/she does not have any experience of a given assembly procedure. In this case, even very simple assembly tasks are explained in detail, also showing elementary information about the use of a tool or about safety principles.
For example, the assembly line shown in Fig. 14, equipped with AR projectors, provides different assembly instructions according to the skill level of workers [27].
Other examples are described in [28], where the training platform has been tested on an actuator assembly, and in [22, 29] where an HMD-based training system is used at the BMW Group Production Academy on an engine assembly (Fig. 15).
Training can also be performed efficiently through virtual assembly. An example of an augmented reality system developed for training, which uses a low-invasiveness hand tracking device, is reported in [30]. The proposed methodology enables the user to interact with the virtual objects, acquiring the pose of all the fingers of both hands by markerless optical triangulation, as reported in Fig. 16.

4 Applications in Collaborative Assembly Systems

The recent introduction of collaborative robots in assembly lines has led to the development of new fields of study, mainly concerning the safety of the work environment shared between human and robot. In this regard too, AR may be beneficial, particularly in providing virtual information on the real workplace, in order to enhance situational awareness for the worker, to support spatial dialog between human and robot and even to enable remote collaboration.
The main topics currently treated in the field of AR applications for human-robot collaboration are the development of Human Robot Interfaces (HRI) and the enhancement of the worker's safety perception.
An application of AR to human-robot interfaces is given in [31]. The research focuses on the study of an AR solution for increasing human confidence and trust in collaborating with robots. The work describes the design and development of an interface system that shows a virtual animation of the robot movements to the operator in advance on the real workplace, in order to improve human safety.
A typical application concerning the worker's safety perception when working close to an industrial robot is given in [32]. The main information provided by the system is assembly-related instructions and the visualization of the robot workspace and trajectory. The system, as illustrated in Fig. 17, was tested on a real case study derived from the automotive industry and was shown to optimize assembly time and the worker's safety perception in an HRC environment. Other examples can be found in [33], where an AR system for collaborative cooperation between a robotic assembly unit and a human worker has been developed by EON Reality, and in [34], where virtual instructions are given to workers to perform simple assembly tasks in cooperation with a robot (Fig. 18).

5 Other Potential Areas of Application

AR techniques could be also useful in other areas concerning assembly process and assembly systems. Among them, the most promising and with the highest potential of development are:
  • assembly layout planning. AR devices could be used for: i) obtaining a graphical overlay of the new equipment onto the existing equipment; ii) performing visual checks for collisions or for the space available for transportation systems such as forklifts;
  • workplace ergonomics, such as the detection of the reachability of objects in a manual assembly station using a digital human model superimposed on the real scene, or the graphical overlay of different workspace zones containing all elements within grasping distance, with no need to bend or walk forward;
  • assembly process evaluation, in particular the evaluation of aspects such as assembly sequencing, assembly path visualization, assembly accessibility with collision detection among parts, the ergonomic use of tools and co-working, all made possible by the integration with sensing devices (e.g. haptic interfaces).

6 Open Issues and Future Perspectives

Despite the significant progress of AR technology and applications in recent years, some limitations still remain to be solved. The main limits of AR implementation in assembly systems concern the following aspects:
  • hardware and software performance,
  • tracking methods,
  • user’s acceptance,
  • authoring procedure.
These represent current limits but, at the same time, issues to be addressed by the research community, and therefore future perspectives for AR to assist assembly more effectively in the near future.

6.1 Hardware and Software Performance

Hardware performance is a key element in obtaining efficient solutions. Nowadays this aspect still presents some limits, even if it is constantly improving. Features such as comfort for the user, a wider field of view, comprehension of the environment and, most importantly, a more natural human-machine interface through NLP (Natural Language Processing), hand tracking and eye tracking are very important, and many manufacturers are working on improving them.
Another open issue is "latency", which refers to the misalignment of virtual objects resulting from the delay between the movement of the user and the moment in which the updated image is displayed with respect to the user's new position [11]. To address this problem, Waegel and Brooks [35] developed a low-latency inertial measurement unit (IMU) and proposed new 3D reconstruction algorithms.
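As a rough, back-of-the-envelope illustration (not a measurement from the cited works), the registration error caused by latency can be approximated as the head's angular velocity multiplied by the delay; the velocity, latency and display resolution below are assumed values chosen only to show the order of magnitude.

```python
# Illustrative estimate of latency-induced misalignment, assuming the error grows
# linearly with head angular velocity during the delay. All values are assumptions.
latency_s = 0.050            # assumed end-to-end system latency (50 ms)
head_rate_deg_s = 50.0       # assumed head rotation speed during a task
pixels_per_degree = 20.0     # assumed angular resolution of the display

angular_error_deg = head_rate_deg_s * latency_s
pixel_error = angular_error_deg * pixels_per_degree
print(f"misalignment ~ {angular_error_deg:.1f} deg ~ {pixel_error:.0f} px")
# -> misalignment ~ 2.5 deg ~ 50 px
```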

6.2 Tracking Methods

Among tracking methods, AR systems for assembly applications mainly use the marker-based approach, thanks to its stable and fast detection of markers in the entire image sequence. Valuable features are accuracy, flexibility, robustness and computational efficiency, as well as ease of use. However, significant drawbacks of this tracking method are incorrect registration in the field of view due to marker occlusion, and the impossibility of placing markers on many industrial components because of their small size.
Tracking should be enhanced to achieve a good alignment of virtual content with respect to the user's field of view. For example, Microsoft HoloLens 2 has implemented some tracking additions and improvements which are very beneficial to workers. Gesture recognition, eye tracking and hand tracking working together may provide workers with a more contextual interaction with virtual objects.
Markerless tracking is considered the natural successor of the marker-based approach. Among these techniques is Simultaneous Localization and Mapping (SLAM), which allows a map of an unprepared environment to be built while simultaneously keeping track of the current location, by extracting scene features.
The importance of this issue is also demonstrated by another contribution [36], where a tracking method combining point cloud and visual features is proposed for an AR-aided mechanical assembly system. Its accuracy is demonstrated through a pump assembly case study, and the results show that the tracking trajectory estimated by the method is very close to the real one and better than that of other methods, obtaining a good alignment of objects.
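As an illustrative sketch of the scene-feature step that underlies such markerless approaches, the code below estimates the relative camera motion between two frames from ORB feature matches using OpenCV; a full SLAM or point-cloud-based system such as the one in [36] adds mapping, filtering and optimization on top of this, and the parameters here are assumptions.

```python
# Sketch of feature-based relative pose estimation between two camera frames,
# the building block of markerless tracking. Parameters are illustrative.
import cv2
import numpy as np

def relative_pose(frame_a, frame_b, K):
    """Estimate relative rotation R and translation direction t from ORB matches."""
    orb = cv2.ORB_create(2000)
    kp_a, des_a = orb.detectAndCompute(frame_a, None)
    kp_b, des_b = orb.detectAndCompute(frame_b, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_a, des_b), key=lambda m: m.distance)

    pts_a = np.float32([kp_a[m.queryIdx].pt for m in matches])
    pts_b = np.float32([kp_b[m.trainIdx].pt for m in matches])

    # Essential matrix with RANSAC rejects outlier matches; recoverPose
    # decomposes it into the camera rotation and translation direction.
    E, mask = cv2.findEssentialMat(pts_a, pts_b, K, method=cv2.RANSAC)
    _, R, t, _ = cv2.recoverPose(E, pts_a, pts_b, K, mask=mask)
    return R, t
```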

6.3 User’s Acceptance

As far as user acceptance is concerned, research conducted in real work environments shows that operators may prefer AR over conventional methods of instruction [37]. Users generally give positive feedback when experiencing AR, especially regarding the possibility for non-experts to perform complex tasks and the reduced mental workload on the user. However, this applies to AR systems implemented with care and attention to detail in the interface and in the content presented to the user [37]. Moreover, for the user to feel that a task is worth the AR support, its complexity must be high enough, as discussed in [38].

6.4 Authoring Procedure

Another important open issue is represented by the complexity of authoring AR applications: an effective AR system requires a deep analysis of several aspects related to the assembly procedure (assembly sequence, instructions to be given to the operators, tools and fixtures, work environment, etc.) and of its implementation in the AR platform and software.
A proposal for standard guidelines has been given in [39], with advantages in terms of time saving, error reduction and accuracy improvement.
Another contribution in this field has been given in [37], with the creation of AR work instructions for assembly using a new form of authoring, based on recording an expert performing the assembly steps and automatically processing the recording to generate AR work instructions.
An authoring tool has been also developed by BMW Group in order to support and simplify the design of AR training programs [22].

7 Conclusions

This paper presents a comprehensive survey of AR research and development in assembly systems. Many successful implementations by industries and academic institutions, for both manual and collaborative assembly, are described, as well as current open issues to be overcome.
According to the analysis of the state of the art, it can be concluded that augmented reality, after an important and still ongoing experimental stage, is becoming increasingly utilized in industrial applications, and the sector is in constant expansion. The potential contribution of this technology to faster and more accurate operations encourages the market towards a continuous evolution of hardware and software systems, including aspects related to the portability and comfort of AR devices, for better user acceptance. Furthermore, the latest developments in tracking and authoring procedures aim at more robust and sophisticated systems.
Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.
The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.
Literature
2. Soomro, S.R., Urey, H.: Augmented reality 3D display using head-mounted projectors and transparent retro-reflective screen. In: Advances in Display Technologies VII, vol. 10126, p. 101260E. International Society for Optics and Photonics (2017)
3. Dini, G., Dalle Mura, M.: Application of augmented reality techniques in through-life engineering services. Procedia CIRP 38, 14–23 (2015)
4. Nee, A.Y.C., Ong, S.K., Chryssolouris, G., Mourtzis, D.: Augmented reality applications in design and manufacturing. CIRP Ann. Manuf. Technol. 61(2), 657–679 (2012)
5. Wang, X., Ong, S.K., Nee, A.Y.C.: A comprehensive survey of augmented reality assembly research. Adv. Manuf. 4(1), 1–22 (2016)
6. Blattgerste, J., Strenge, B., Renner, P., Pfeiffer, T., Essig, K.: Comparing conventional and augmented reality instructions for manual assembly tasks. In: Proceedings of the 10th International Conference on PErvasive Technologies Related to Assistive Environments, pp. 75–82 (2017)
7. Blattgerste, J., Renner, P., Strenge, B., Pfeiffer, T.: In-situ instructions exceed side-by-side instructions in augmented reality assisted assembly. In: Proceedings of the 11th PErvasive Technologies Related to Assistive Environments Conference, pp. 133–140 (2018)
8. Deshpande, A., Kim, I.: The effects of augmented reality on improving spatial problem solving for object assembly. Adv. Eng. Inform. 38, 760–775 (2018)
9. Hanson, R., Falkenström, W., Miettinen, M.: Augmented reality as a means of conveying picking information in kit preparation for mixed-model assembly. Comput. Ind. Eng. 113, 570–575 (2017)
10. Wang, X., Ong, S.K., Nee, A.Y.C.: Multi-modal augmented-reality assembly guidance based on bare-hand interface. Adv. Eng. Inform. 30(3), 406–421 (2016)
11. Radkowski, R., Herrrema, J., Oliver, J.: Augmented reality-based manual assembly support with visual features for different degrees of difficulty. Int. J. Hum.-Comput. Interact. 31(5), 337–349 (2015)
12. Aouam, D., Benbelkacem, S., Zenati, N., Zakaria, S., Meftah, Z.: Voice-based augmented reality interactive system for car's components assembly. In: 2018 3rd International Conference on Pattern Analysis and Intelligent Systems (PAIS), pp. 1–5 (2018)
13. Arbeláez, J.C., Viganò, R., Osorio-Gómez, G.: Haptic augmented reality (HapticAR) for assembly guidance. Int. J. Interact. Des. Manuf. (IJIDeM) 13, 673–687 (2019)
16. Funk, M., Kosch, T., Kettner, R., Korn, O., Schmidt, A.: An overview of 4 years of combining industrial assembly with augmented reality for industry 4.0. In: Proceedings of the 16th International Conference on Knowledge Technologies and Data-driven Business, p. 4 (2016)
19. Wang, Z., Zhang, S., Bai, X.: Augmented reality based product invisible area assembly assistance. In: 2018 3rd International Conference on Control, Automation and Artificial Intelligence (CAAI) (2018)
20. Westerfield, G., Mitrovic, A., Billinghurst, M.: Intelligent augmented reality training for motherboard assembly. Int. J. Artif. Intell. Educ. 25(1), 157–172 (2015)
25. Iliano, S., Chimienti, V., Dini, G.: Training by augmented reality in industrial environments: a case study. In: Proceedings of the 4th CIRP Conference on Assembly Technology and Systems, Ann Arbor (2012)
26. Dalle Mura, M., Dini, G., Failli, F.: An integrated environment based on augmented reality and sensing device for manual assembly workstations. Procedia CIRP 41, 340–345 (2015)
27. Funk, M., Bächler, A., Bächler, L., Kosch, T., Heidenreich, T., Schmidt, A.: Working with augmented reality: a long-term analysis of in-situ instructions at the assembly workplace. In: Proceedings of the 10th International Conference on PErvasive Technologies Related to Assistive Environments, pp. 222–229 (2017)
28. Gavish, N., et al.: Evaluating virtual reality and augmented reality training for industrial maintenance and assembly tasks. Interact. Learn. Environ. 23(6), 778–798 (2015)
29. Werrlich, S., Nitsche, K., Notni, G.: Demand analysis for an augmented reality based assembly training. In: Proceedings of the 10th International Conference on PErvasive Technologies Related to Assistive Environments, pp. 416–422 (2017)
30. Valentini, P.P.: Natural interface for interactive virtual assembly in augmented reality using leap motion controller. Int. J. Interact. Des. Manuf. (IJIDeM) 12(4), 1157–1165 (2018)
31. Palmarini, R., Del Amo, I.F., Bertolino, G., Dini, G., Erkoyuncu, J.A., Roy, R., Farnsworth, M.: Designing an AR interface to improve trust in human-robots collaboration. Procedia CIRP 70, 350–355 (2018)
32. Makris, S., Karagiannis, P., Koukas, S., Matthaiakis, A.S.: Augmented reality system for operator support in human–robot collaborative assembly. CIRP Ann. 65(1), 61–64 (2016)
34. Danielsson, O., Syberfeldt, A., Brewster, R., Wang, L.: Assessing instructions in augmented reality for human-robot collaborative assembly by using demonstrators. Procedia CIRP 63, 89–94 (2017)
35. Waegel, K., Brooks, F.P.: Filling the gaps: hybrid vision and inertial tracking. In: 2013 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). IEEE (2013)
36. Wang, Y., Zhang, S., Wan, B., He, W., Bai, X.: Point cloud and visual feature-based tracking method for an augmented reality-aided mechanical assembly system. Int. J. Adv. Manuf. Technol. 99(9–12), 2341–2352 (2018)
37. Bhattacharya, B., Winer, E.H.: Augmented reality via expert demonstration authoring (AREDA). Comput. Ind. 105, 61–79 (2019)
38. Syberfeldt, A., Danielsson, O., Holm, M., Wang, L.: Visual assembling guidance using augmented reality. Procedia Manuf. 1, 98–109 (2015)
39. Chimienti, V., Iliano, S., Dassisti, M., Dini, G., Failli, F.: Guidelines for implementing augmented reality procedures in assisting assembly operations. In: International Precision Assembly Seminar, pp. 174–179. Springer, Heidelberg (2010)
Metadata
Title
Augmented Reality in Assembly Systems: State of the Art and Future Perspectives
Authors
M. Dalle Mura
G. Dini
Copyright Year
2021
DOI
https://doi.org/10.1007/978-3-030-72632-4_1
