Published in: International Journal of Computer Assisted Radiology and Surgery 2/2017

19.09.2016 | Review Article

Touchless interaction with software in interventional radiology and surgery: a systematic literature review

Authors: André Mewes, Bennet Hensen, Frank Wacker, Christian Hansen


Abstract

Purpose

In this article, we systematically examine the current state of research on systems for touchless human–computer interaction in operating rooms and interventional radiology suites. We further discuss the drawbacks of current solutions and highlight promising technologies for future development.

Methods

A systematic literature search was performed for scientific papers that deal with touchless control of medical software in the immediate environment of the operating room and interventional radiology suite. This includes methods for touchless gesture interaction, voice control and eye tracking.
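The abstract does not state the actual search query or inclusion criteria, so the following Python sketch only illustrates what a keyword-based title/abstract screening step of such a systematic search might look like; all inclusion and context terms below are assumptions derived from the modalities and settings named above.

```python
# Hypothetical screening sketch: terms are assumptions, not the review's query.
from dataclasses import dataclass

@dataclass
class Record:
    title: str
    abstract: str

# Assumed inclusion terms covering the three touchless modalities named above.
INCLUSION_TERMS = (
    "touchless", "gesture", "voice control",
    "speech recognition", "eye tracking", "gaze",
)
# Assumed clinical-context terms restricting results to the target environment.
CONTEXT_TERMS = ("operating room", "interventional radiology", "surgery")

def passes_screening(record: Record) -> bool:
    """Keep a record if it mentions a touchless modality AND a clinical context."""
    text = f"{record.title} {record.abstract}".lower()
    return (any(t in text for t in INCLUSION_TERMS)
            and any(t in text for t in CONTEXT_TERMS))

candidates = [
    Record("Touchless gesture control of a medical image viewer",
           "We evaluate hand gestures in the operating room."),
    Record("A new stent design", "Unrelated to software interaction."),
]
included = [r for r in candidates if passes_screening(r)]
print([r.title for r in included])  # -> only the first record survives
```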

Results

Fifty-five research papers were identified and analyzed in detail, including 33 journal publications. Most of the identified literature (62 %) deals with the control of medical image viewers. The others present interaction techniques for laparoscopic assistance (13 %), telerobotic assistance and operating room control (9 % each), as well as for robotic operating room assistance and intraoperative registration (3.5 % each). Only 8 systems (14.5 %) were tested in a real clinical environment, and 7 (12.7 %) were not evaluated at all.
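Since controlling a medical image viewer is the dominant use case in the reviewed literature, a minimal sketch may help illustrate the basic interaction pattern: recognized touchless gestures are dispatched to discrete viewer commands. The gesture names, commands, and viewer below are hypothetical stand-ins; real systems in this literature obtain gestures from sensors such as the Kinect or Leap Motion, which are stubbed out here.

```python
# Minimal sketch of gesture-to-command dispatch for an image viewer.
# Gesture names and commands are illustrative assumptions.
from typing import Callable, Dict

class ImageViewer:
    def __init__(self, num_slices: int) -> None:
        self.slice = 0
        self.zoom = 1.0
        self.num_slices = num_slices

    def next_slice(self) -> None:
        self.slice = min(self.slice + 1, self.num_slices - 1)

    def prev_slice(self) -> None:
        self.slice = max(self.slice - 1, 0)

    def zoom_in(self) -> None:
        self.zoom *= 1.25

viewer = ImageViewer(num_slices=120)

# Dispatch table: one recognized gesture triggers exactly one viewer command,
# which keeps the number of interaction steps small (a challenge noted below).
GESTURE_COMMANDS: Dict[str, Callable[[], None]] = {
    "swipe_right": viewer.next_slice,
    "swipe_left": viewer.prev_slice,
    "spread": viewer.zoom_in,
}

def on_gesture(name: str) -> None:
    command = GESTURE_COMMANDS.get(name)
    if command is not None:  # ignore unrecognized gestures rather than guessing
        command()

# A stubbed stream of recognized gestures stands in for the sensor pipeline.
for g in ("swipe_right", "swipe_right", "spread"):
    on_gesture(g)
print(viewer.slice, viewer.zoom)  # -> 2 1.25
```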

Conclusion

In the last 10 years, many advancements have led to robust touchless interaction approaches. However, only a few have been systematically evaluated in real operating room settings. Further research is required to overcome the current limitations of touchless software interfaces in clinical environments. The main challenges for future research are improving and evaluating the usability and intuitiveness of touchless human–computer interaction, fully integrating it into productive systems, reducing the number of necessary interaction steps, and further developing hands-free interaction.


Metadata
Title
Touchless interaction with software in interventional radiology and surgery: a systematic literature review
Authors
André Mewes
Bennet Hensen
Frank Wacker
Christian Hansen
Publication date
19.09.2016
Publisher
Springer International Publishing
Published in
International Journal of Computer Assisted Radiology and Surgery / Issue 2/2017
Print ISSN: 1861-6410
Electronic ISSN: 1861-6429
DOI
https://doi.org/10.1007/s11548-016-1480-6
