AmbiGaze: Direct Control of Ambient Devices by Gaze

ABSTRACT
Eye tracking offers many opportunities for direct device control in smart environments, but issues such as the need for calibration and the Midas touch problem make it impractical. In this paper, we propose AmbiGaze, a smart environment that uses animated targets to give users direct, gaze-only control of devices through smooth pursuit tracking. We propose a design space of ways to expose device functionality through movement and illustrate the concept with four prototypes. We evaluated the system in a user study and found that AmbiGaze enables robust gaze-only interaction with many devices, from multiple positions in the environment, in a spontaneous and comfortable manner.
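The core idea behind smooth pursuit interaction is that when a user visually follows a moving target, their raw gaze trajectory correlates strongly with the target's motion, so selection needs no calibration. The paper does not publish its implementation here; the following is a minimal illustrative sketch of Pursuits-style matching via Pearson correlation, where the function names, the sliding-window shape, and the 0.8 threshold are all assumptions for illustration.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    if sx == 0 or sy == 0:
        # A stationary signal (e.g. a fixating eye) matches nothing.
        return 0.0
    return cov / (sx * sy)

def match_target(gaze, targets, threshold=0.8):
    """Select the moving target the gaze is pursuing, if any.

    gaze:    [(x, y), ...] raw gaze samples over a time window
    targets: {name: [(x, y), ...]} target positions over the same window
    Returns the best-matching target name, or None below the threshold.
    """
    gx = [p[0] for p in gaze]
    gy = [p[1] for p in gaze]
    best_name, best_score = None, threshold
    for name, traj in targets.items():
        tx = [p[0] for p in traj]
        ty = [p[1] for p in traj]
        # Require both axes to correlate; the weaker axis is the score,
        # which rejects targets that match in only one dimension.
        score = min(pearson(gx, tx), pearson(gy, ty))
        if score > best_score:
            best_name, best_score = name, score
    return best_name
```

Because only the *correlation* between trajectories matters, the gaze signal can be uncalibrated and offset arbitrarily from screen coordinates, which is what makes this technique suited to walk-up interaction with ambient devices.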