
An assessment of eye-gaze potential within immersive virtual environments

Published: 12 December 2007

Abstract

In collaborative situations, eye gaze is a critical element of behavior that supports and fulfills many activities and roles. In current computer-supported collaboration systems, eye gaze is poorly supported. Even in a state-of-the-art video conferencing system such as the Access Grid, although one can see the face of the user, much of the communicative power of eye gaze is lost. This article gives an overview of preliminary work towards integrating eye gaze into an immersive collaborative virtual environment and assessing the impact this would have on interaction between the users of such a system. Three experiments were conducted to assess the efficacy of eye gaze within immersive virtual environments. In each experiment, subjects observed, on a large screen, the eye-gaze behavior of an avatar; that behavior had previously been recorded from a user wearing a head-mounted eye tracker. The first experiment assessed how well subjects could judge which objects an avatar was looking at when only head gaze was displayed, compared with when both head- and eye-gaze data were displayed. The results show that eye gaze is of vital importance to subjects' ability to correctly identify what a person is looking at in an immersive virtual environment. The second experiment examined whether a monocular or binocular eye tracker would be required, by testing subjects' ability to identify where an avatar was looking from eye direction alone, or from eye direction combined with convergence. This experiment showed that convergence had a significant impact on subjects' ability to identify where the avatar was looking. The final experiment examined the effect of stereo versus mono viewing of the scene, again with subjects asked to identify where the avatar was looking; it showed no difference in subjects' ability to detect where the avatar was gazing. The article concludes with a description of how the eye-tracking system has been integrated into an immersive collaborative virtual environment and some preliminary results from the use of such a system.
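To illustrate the convergence cue examined in the second experiment, the sketch below estimates a 3D fixation point as the point of closest approach of the two eye rays, which a binocular tracker can supply but a monocular tracker cannot. This sketch is not taken from the article; the ray-intersection formulation, function name, and sample numbers are illustrative assumptions only.

    # Illustrative sketch (not from the article): estimating a binocular
    # convergence point as the closest approach of the two gaze rays.
    import numpy as np

    def convergence_point(origin_l, dir_l, origin_r, dir_r, eps=1e-9):
        """Midpoint of the closest approach of the left and right gaze rays.

        origin_*: 3D eye positions; dir_*: gaze directions (need not be unit).
        With monocular tracking only one ray is known, so depth along the
        gaze direction cannot be recovered this way.
        """
        d1 = np.asarray(dir_l, dtype=float)
        d2 = np.asarray(dir_r, dtype=float)
        p1 = np.asarray(origin_l, dtype=float)
        p2 = np.asarray(origin_r, dtype=float)
        w0 = p1 - p2

        a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
        d, e = d1 @ w0, d2 @ w0
        denom = a * c - b * b
        if abs(denom) < eps:
            # Rays are (nearly) parallel: no usable convergence information.
            return None

        s = (b * e - c * d) / denom   # parameter along the left ray
        t = (a * e - b * d) / denom   # parameter along the right ray
        return 0.5 * ((p1 + s * d1) + (p2 + t * d2))

    # Hypothetical example: eyes 6.5 cm apart, both fixating a point ~1 m ahead.
    left_eye = np.array([-0.0325, 0.0, 0.0])
    right_eye = np.array([0.0325, 0.0, 0.0])
    target = np.array([0.2, 0.0, 1.0])
    print(convergence_point(left_eye, target - left_eye,
                            right_eye, target - right_eye))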


Published in

ACM Transactions on Multimedia Computing, Communications, and Applications, Volume 3, Issue 4 (December 2007), 147 pages.
ISSN: 1551-6857
EISSN: 1551-6865
DOI: 10.1145/1314303
Copyright © 2007 ACM

Publisher

Association for Computing Machinery, New York, NY, United States

Publication History

• Received: 1 August 2007
• Accepted: 1 August 2007
• Published: 12 December 2007
