Published in: International Journal of Computer Assisted Radiology and Surgery 6/2016

01.06.2016 | Original Article

Device- and system-independent personal touchless user interface for operating rooms

One personal UI to control all displays in an operating room

Authors: Meng MA, Pascal Fallavollita, Séverine Habert, Simon Weidert, Nassir Navab



Abstract

Introduction

In the modern operating room, the surgeon performs surgeries with the support of different medical systems that display patient information, physiological data, and medical images. Numerous interactions must be performed by the surgical team to control each of these systems and retrieve the desired information. Joysticks and physical keys are still present in the operating room because of the drawbacks of conventional mice, and surgeons often have to relay instructions to the surgical team when they need information from a specific medical system. In this paper, a novel user interface is developed that allows the surgeon to personally perform touchless interaction with the various medical systems and to switch effortlessly among them, all without modifying the systems' software or hardware.

Methods

To achieve this, a wearable RGB-D sensor is mounted on the surgeon's head for inside-out tracking of the finger relative to any of the medical systems' displays. Android devices running a dedicated application are connected to the computers hosting the medical systems and emulate a standard USB mouse and keyboard. When the surgeon interacts via pointing gestures, the desired cursor position on the targeted display and the recognized gestures are translated into generic events and sent to the corresponding Android device. Finally, the application on that Android device generates the matching mouse or keyboard events for the targeted medical system.
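The pipeline above (gesture + cursor position → generic event → per-display mouse/keyboard action) can be sketched in a few lines. This is a minimal illustration only, not the authors' implementation: the event fields, gesture names, and the `to_hid_action` helper are all hypothetical.

```python
# Hypothetical sketch of the described event pipeline: a recognized pointing
# gesture plus a normalized fingertip position on the targeted display is
# packed into a generic event, which the Android relay would then turn into
# a concrete mouse/keyboard action. All names and codes are illustrative.
from dataclasses import dataclass

@dataclass
class GenericEvent:
    display_id: str   # which medical display the surgeon is pointing at
    x: float          # normalized cursor position in [0, 1]
    y: float
    gesture: str      # e.g. "move", "click", "scroll_up"

def to_hid_action(event: GenericEvent, width: int, height: int) -> dict:
    """Map a generic event to a concrete action for one display's resolution."""
    px, py = int(event.x * width), int(event.y * height)
    if event.gesture == "move":
        return {"type": "mouse_move", "x": px, "y": py}
    if event.gesture == "click":
        return {"type": "mouse_click", "button": "left", "x": px, "y": py}
    if event.gesture.startswith("scroll"):
        return {"type": "mouse_wheel",
                "delta": 1 if event.gesture.endswith("up") else -1}
    return {"type": "noop"}

# Example: a click gesture at the centre of a 1920x1080 fluoroscopy display
action = to_hid_action(GenericEvent("fluoro", 0.5, 0.5, "click"), 1920, 1080)
print(action)  # {'type': 'mouse_click', 'button': 'left', 'x': 960, 'y': 540}
```

Keeping the intermediate event generic is what makes the approach display-agnostic: only the final translation step needs to know the resolution and input conventions of each medical system.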

Results and conclusion

To simulate an operating room setting, our user interface was tested by seven medical participants who performed several interactions with visualizations of CT, MRI, and fluoroscopy images at varying distances from the displays. Results from the System Usability Scale and the NASA-TLX workload index indicated strong acceptance of the proposed user interface.


Appendices
Accessible only with authorization
Metadata
Title
Device- and system-independent personal touchless user interface for operating rooms
One personal UI to control all displays in an operating room
Authors
Meng MA
Pascal Fallavollita
Séverine Habert
Simon Weidert
Nassir Navab
Publication date
01.06.2016
Publisher
Springer Berlin Heidelberg
Published in
International Journal of Computer Assisted Radiology and Surgery / Issue 6/2016
Print ISSN: 1861-6410
Electronic ISSN: 1861-6429
DOI
https://doi.org/10.1007/s11548-016-1375-6
