Published in: The International Journal of Advanced Manufacturing Technology 1-2/2021

10.05.2021 | ORIGINAL ARTICLE

SHARIDEAS: a smart collaborative assembly platform based on augmented reality supporting assembly intention recognition

by: Zhuo Wang, Yang Wang, Xiaoliang Bai, Xiangyu Huo, Weiping He, Shuo Feng, Jie Zhang, Yueqing Zhang, Jinzhao Zhou


Abstract

With the development of augmented reality (AR) supporting manual assembly collaboration, it is particularly important to explore the transformation from the traditional "human-machine" cooperation mode to a smart "human-machine" cooperation mode. Early studies have shown that user cues (i.e., head, gesture, eye) and scene cues (i.e., objects, tools, space) are intuitive and highly expressive in the traditional AR collaborative mode. However, how to integrate these cues into an assembly system, reasonably infer an operator's work intention, and then provide an appropriate rendering scheme remains an open problem in collaborative assembly. This paper describes an AR collaborative assembly platform, SHARIDEAS, which uses a generalized grey correlation method to integrate user cues and scene cues. The results of this data fusion can provide appropriate and intuitive assembly guidance to the local worker. A formal user study was conducted to explore the usability and feasibility of SHARIDEAS in a manual assembly task. The experimental data show that SHARIDEAS is more effective than the traditional mode at improving the efficiency of human-machine cooperation. Finally, some conclusions about SHARIDEAS are given and future research directions are outlined.
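The abstract names a generalized grey correlation method as the cue-fusion mechanism but does not spell out the formulation, and the paper's exact variant is not reproduced here. As a rough sketch under that caveat, the classic Deng grey relational grade can score how closely each candidate intention's expected cue profile matches the observed user and scene cues; the cue vectors, candidate profiles, and `rho` value below are all hypothetical illustrations, not values from the paper.

```python
import numpy as np

def grey_relational_grades(reference, candidates, rho=0.5):
    """Deng grey relational grade of each candidate sequence against a
    reference sequence. d_min/d_max are taken over all candidates jointly,
    as in standard grey relational analysis; rho is the distinguishing
    coefficient, conventionally 0.5."""
    ref = np.asarray(reference, dtype=float)
    deltas = {k: np.abs(ref - np.asarray(v, dtype=float))
              for k, v in candidates.items()}
    all_d = np.concatenate(list(deltas.values()))
    d_min, d_max = all_d.min(), all_d.max()
    if d_max == 0.0:
        # All candidates coincide with the reference exactly
        return {k: 1.0 for k in deltas}
    # Grey relational coefficient at each point, averaged into a grade
    return {k: float(((d_min + rho * d_max) / (d + rho * d_max)).mean())
            for k, d in deltas.items()}

# Hypothetical normalized cue scores: [head, gesture, eye, object, tool, space]
observed = [0.9, 0.7, 0.8, 0.6, 0.2, 0.4]
candidates = {
    "pick up wrench": [0.3, 0.8, 0.4, 0.2, 0.9, 0.3],
    "align flange":   [0.9, 0.6, 0.8, 0.7, 0.1, 0.4],
}
grades = grey_relational_grades(observed, candidates)
best = max(grades, key=grades.get)
print(best)  # → align flange: its profile tracks the observed cues most closely
```

The inferred intention is simply the candidate with the highest grade, which the system could then use to select a rendering scheme for the local worker.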


Metadata
Title
SHARIDEAS: a smart collaborative assembly platform based on augmented reality supporting assembly intention recognition
Authors
Zhuo Wang
Yang Wang
Xiaoliang Bai
Xiangyu Huo
Weiping He
Shuo Feng
Jie Zhang
Yueqing Zhang
Jinzhao Zhou
Publication date
10.05.2021
Publisher
Springer London
Published in
The International Journal of Advanced Manufacturing Technology / Issue 1-2/2021
Print ISSN: 0268-3768
Electronic ISSN: 1433-3015
DOI
https://doi.org/10.1007/s00170-021-07142-y
