ABSTRACT
Communicating spatial information by pointing is ubiquitous in human interaction. With the growing use of head-mounted cameras for collaborative purposes, it is important to assess how accurately viewers of the resulting egocentric videos can interpret pointing acts. We conducted an experiment comparing the interpretation accuracy of four pointing techniques: hand pointing, head pointing, gaze pointing, and hand+gaze pointing. Our results suggest that superimposing gaze information on the egocentric video enables viewers to determine pointing targets more accurately and more confidently. Hand pointing performed best when the pointing target was straight ahead, and head pointing was the least preferred in terms of ease of interpretation. Our results can inform the design of collaborative applications that make use of the egocentric view.