
2013 | Book

Human Walking in Virtual Environments

Perception, Technology, and Applications

Edited by: Frank Steinicke, Yon Visell, Jennifer Campos, Anatole Lécuyer

Publisher: Springer New York


About this Book

This book presents a survey of past and recent developments on human walking in virtual environments with an emphasis on human self-motion perception, the multisensory nature of experiences of walking, conceptual design approaches, current technologies, and applications. The use of Virtual Reality and movement simulation systems is becoming increasingly popular and more accessible to a wide variety of research fields and applications. While, in the past, simulation technologies have focused on developing realistic, interactive visual environments, it is becoming increasingly obvious that our everyday interactions are highly multisensory. Therefore, investigators are beginning to understand the critical importance of developing and validating locomotor interfaces that can allow for realistic, natural behaviours. The book aims to present an overview of what is currently understood about human perception and performance when moving in virtual environments and to situate it relative to the broader scientific and engineering literature on human locomotion and locomotion interfaces. The contents include scientific background and recent empirical findings related to biomechanics, self-motion perception, and physical interactions. The book also discusses conceptual approaches to multimodal sensing, display systems, and interaction for walking in real and virtual environments. Finally, it presents current and emerging applications in areas such as gait and posture rehabilitation, gaming, sports, and architectural design.

Table of Contents

Frontmatter

Perception

Chapter 1. Sensory Contributions to Spatial Knowledge of Real and Virtual Environments
Abstract
Most sensory systems are able to inform people about the spatial structure of their environment, their place in that environment, and their movement through it. We discuss these various sources of sensory information by dividing them into three general categories: external (vision, audition, somatosensory), internal (vestibular, kinesthetic) and efferent (efference copy, attention). Research on the roles of these sensory systems in the creation of environmental knowledge has shown, with few exceptions, that information from a single sensory modality is often sufficient for acquiring at least rudimentary knowledge of one’s immediate environment and one’s movement through it. After briefly discussing the ways in which sources of sensory information commonly covary in everyday life, we examine the types and quality of sensory information available from contemporary virtual environments, including desktop, CAVE, and HMD-based systems. Because none of these computer-mediated systems is yet able to present a perfectly full and veridical sensory experience to its user, it is important for researchers and VE developers to understand the circumstances, tasks, and goals for which different sensory information sources are most critical. We review research on these topics, as well as research on how the omission, limitation, or distortion of different information sources may affect the perception and behavior of users. Finally, we discuss situations in which various types of virtual environment systems may be more or less useful.
David Waller, Eric Hodgson
Chapter 2. Perceptual and Cognitive Factors for Self-Motion Simulation in Virtual Environments: How Can Self-Motion Illusions (“Vection”) Be Utilized?
Abstract
How can we convincingly simulate observer locomotion through virtual environments without having to allow for full physical observer movement? That is, how can we best utilize multi-modal stimulation to provide the compelling illusion of moving through simulated worlds while reducing the overall simulation effort? This chapter provides a review on the contribution and interaction of visual, auditory, vibrational, and biomechanical cues (e.g., walking) for self-motion perception and simulation in VR. We propose an integrative framework and discuss potential synergistic effects of perceptual and cognitive influences on self-motion perception in VEs. Based on this perspective, we envision a lean-and-elegant approach that utilizes multi-modal self-motion illusions and perceptual-cognitive factors in a synergistic manner to improve perceptual and behavioral effectiveness and reduce the demand for physical (loco-)motion interfaces to a more affordable level.
Bernhard E. Riecke, Jörg Schulte-Pelkum
Chapter 3. Biomechanics of Walking in Real World: Naturalness we Wish to Reach in Virtual Reality
Abstract
In most virtual reality (VR) simulations the virtual world is larger than the real walking workspace. The workspace is often bounded by the tracking area or the display devices. Hence, many researchers have proposed technical solutions to make people walk through large virtual spaces using various types of metaphors and multisensory feedback. To achieve this goal it is necessary to understand how people walk in real life. This chapter reports biomechanical data describing human walking including kinematics, dynamics and energetics knowledge for straight line and nonlinear walking. Reference and normative values are provided for most of these variables, which could help developers and researchers improve the naturalness of walking in large virtual environments, or to propose evaluation metrics. For each section of this chapter, we will provide some potential applications in VR. On the one hand, this type of knowledge could be used to design more natural interaction devices such as omnidirectional treadmills, walk-in-place methods, or other facilities. A specific section is dedicated to comparisons between treadmill and ground walking as it is one of the most popular approaches in VR. On the other hand, this knowledge could also be useful to improve the quality of multisensory feedback when walking, such as adding sounds, vibrations, or more natural camera control.
Franck Multon, Anne-Hélène Olivier
Chapter 4. Affordance Perception and the Visual Control of Locomotion
Abstract
When people navigate through complex, dynamic environments, they select actions and guide locomotion in ways that take into account not only the environment but also their body dimensions and locomotor capabilities. For example, when stepping off a curb, a pedestrian may need to decide whether to go now ahead of an approaching vehicle or wait until it passes. Similarly, a child playing a game of tag may need to decide whether to go to the left or right around a stationary obstacle to intercept another player. In such situations, the possible actions (i.e., affordances) are partly determined by the person’s body dimensions and locomotor capabilities. From an ecological perspective, the ability to take these factors into account begins with the perception of affordances. The aim of this chapter is to review recent theoretical developments and empirical research on affordance perception and its role in the visual control of locomotion, including basic locomotor tasks such as avoiding stationary and moving obstacles, walking to targets, and selecting routes through complex scenes. The focus will be on studies conducted in virtual environments, which have created new and exciting opportunities to investigate how people perceive affordances, guide locomotion, and adapt to changes in body dimensions and locomotor capabilities.
Brett R. Fajen
Chapter 5. The Effect of Translational and Rotational Body-Based Information on Navigation
Abstract
Physical locomotion provides internal (body-based) sensory information about the translational and rotational components of movement. This chapter starts by summarizing the characteristics of model-, small- and large-scale VE applications, and attributes of ecological validity that are important for the application of navigation research. The type of navigation participants performed, the scale and spatial extent of the environment, and the richness of the visual scene are used to provide a framework for a review of research into the effect of body-based information on navigation. The review resolves contradictions between previous studies’ findings, identifies types of navigation interface that are suited to different applications, and highlights areas in which further research is needed. Applications that take place in small-scale environments, where maneuvering is the most demanding aspect of navigation, will benefit from full-walking interfaces. However, collision detection may not be needed because users avoid obstacles even when they are below eye-level. Applications that involve large-scale spaces (e.g., buildings or cities) just need to provide the translational component of body-based information, because it is only in unusual scenarios that the rotational component of body-based information produces any significant benefit. This opens up the opportunity of combining linear treadmill and walking-in-place interfaces with projection displays that provide a wide field of view.
Roy A. Ruddle
Chapter 6. Enabling Unconstrained Omnidirectional Walking Through Virtual Environments: An Overview of the CyberWalk Project
Abstract
The CyberWalk treadmill is the first truly omnidirectional treadmill of its size that allows for near natural walking through arbitrarily large Virtual Environments. The platform represents advances in treadmill and virtual reality technology and engineering, but it is also a major step towards having a single setup that allows the study of human locomotion and its many facets. This chapter focuses on the human behavioral research that was conducted to understand human locomotion from the perspective of specifying design criteria for the CyberWalk. The first part of this chapter describes research on the biomechanics of human walking, in particular, the nature of natural unconstrained walking and the effects of treadmill walking on characteristics of gait. The second part of this chapter describes the multisensory nature of walking, with a focus on the integration of vestibular and proprioceptive information during walking. The third part of this chapter describes research on large-scale human navigation and identifies possible causes for the human tendency to veer from a straight path, and even walk in circles when no external references are made available. The chapter concludes with a summary description of the features of the CyberWalk platform that were informed by this collection of research findings and briefly highlights the current and future scientific potential for this platform.
Ilja Frissen, Jennifer L. Campos, Manish Sreenivasa, Marc O. Ernst

Technologies

Frontmatter
Chapter 7. Displays and Interaction for Virtual Travel
Abstract
Virtual travel can be accomplished in many ways. In this chapter we review displays and interaction devices that can be utilized for virtual travel techniques. The types of display range from desktop to fully immersive and the types of interaction devices range from hand-held devices through to motion tracking systems. We give examples of different classes of device that are commonly used, as well as some more novel devices. We then give a general overview of travel tasks and explain how they can be realized through interaction devices.
Anthony Steed, Doug A. Bowman
Chapter 8. Sensing Human Walking: Algorithms and Techniques for Extracting and Modeling Locomotion
Abstract
This chapter reports the most popular methods used to evaluate the main properties of human walking. We will mainly focus on: global parameters (such as step length, frequency, gait asymmetry and regularity), kinematic parameters (such as joint angles as functions of time), dynamic values (such as the ground reaction force and the joint torques) and muscle activity (such as muscle tension). A large set of sensors has been introduced to analyze human walking in biomechanics and other connected domains such as robotics, human motion sciences, computer animation... Among all these sensors, we will focus on: mono-point sensors (such as accelerometers), multi-point sensors (such as flocks of sensors, opto-electronic systems and video analysis), and dynamic sensors (such as force plates or electromyographic sensors). For the most widely used systems, we will describe the methods and algorithms used to compute the parameters described above. Throughout the chapter we will explain how these algorithms could provide original methods for helping people to design natural navigation in VR.
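To make the idea of extracting global gait parameters from a mono-point sensor concrete, here is a minimal illustrative sketch (not from the chapter): step detection on a vertical accelerometer signal via simple peak picking, from which cadence is derived. The threshold and minimum inter-step interval are assumed example values.

```python
import math

def detect_steps(accel, sample_rate_hz, threshold=1.2, min_interval_s=0.3):
    """Return sample indices of detected steps (local peaks above threshold)."""
    min_gap = int(min_interval_s * sample_rate_hz)
    steps = []
    for i in range(1, len(accel) - 1):
        is_peak = (accel[i] > threshold
                   and accel[i] >= accel[i - 1]
                   and accel[i] > accel[i + 1])
        if is_peak and (not steps or i - steps[-1] >= min_gap):
            steps.append(i)
    return steps

def cadence_steps_per_min(steps, sample_rate_hz):
    """Mean cadence from inter-step intervals."""
    if len(steps) < 2:
        return 0.0
    intervals = [(b - a) / sample_rate_hz for a, b in zip(steps, steps[1:])]
    return 60.0 / (sum(intervals) / len(intervals))

# Synthetic 100 Hz signal: 1 g baseline plus a 2 Hz "stepping" oscillation.
fs = 100
signal = [1.0 + 0.5 * math.sin(2 * math.pi * 2.0 * t / fs) for t in range(3 * fs)]
steps = detect_steps(signal, fs)
print(len(steps), cadence_steps_per_min(steps, fs))
```

Real accelerometer data would need filtering and axis selection first; this only shows the shape of such an algorithm.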
Franck Multon
Chapter 9. Locomotion Interfaces
Abstract
A locomotion interface is a device that creates an artificial sensation of physical walking. It should ideally provide three functions: (1) creating a sense of walking while the user's true position is preserved, (2) allowing the walker to change bearing direction, and (3) simulating uneven walking surfaces. This chapter categorizes and describes four different methods for the design and implementation of such interfaces: sliding shoes, treadmills, foot-pads, and robotic tiles. It discusses related technical issues and potential applications.
Hiroo Iwata
Chapter 10. Implementing Walking in Virtual Environments
Abstract
The previous chapter described locomotion devices that prevent displacements in the real world while a user is walking. In this chapter we explain different strategies that allow users to actually move through the real world, while these physical displacements are mapped to motions of the camera in the virtual environment (VE) in order to support unlimited omnidirectional walking. Transferring a user’s head movements from a physical workspace to a virtual scene is an essential component of any immersive VE. This chapter describes the pipeline of transformations from tracked real-world coordinates to coordinates of the VE. The chapter starts with an overview of different approaches for virtual walking, and gives an introduction to tracking volumes, coordinate systems and transformations required to set up a workspace for implementing virtual walking. The chapter continues with the traditional isometric mapping found in most immersive VEs, with special emphasis on combining walking in a restricted interaction volume via reference coordinates with virtual traveling metaphors (e.g., flying). Advanced mappings are then introduced with user-centric coordinates, which provide a basis to guide users along paths in the physical workspace that differ from those they experience in the virtual world.
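The combination of an isometric mapping with a movable reference frame can be sketched as follows. This is a minimal 2D illustration under assumed names (the class and methods are not the chapter's API): the camera pose is the reference transform applied to the tracked head pose, and a traveling metaphor such as flying simply moves the reference frame.

```python
import math

class VirtualWalkingMapper:
    """Virtual camera pose = reference transform applied to tracked head pose."""

    def __init__(self):
        self.ref_pos = (0.0, 0.0)  # virtual origin of the physical workspace (x, z)
        self.ref_yaw = 0.0         # virtual heading of the workspace (radians)

    def to_virtual(self, head_pos, head_yaw):
        """Isometric mapping: rotate the tracked position by the reference yaw,
        then translate by the reference position."""
        c, s = math.cos(self.ref_yaw), math.sin(self.ref_yaw)
        x, z = head_pos
        vx = self.ref_pos[0] + c * x - s * z
        vz = self.ref_pos[1] + s * x + c * z
        return (vx, vz), self.ref_yaw + head_yaw

    def travel(self, dx, dz):
        """Virtual traveling metaphor (e.g., flying): move the reference frame
        while the user stays put in the physical workspace."""
        self.ref_pos = (self.ref_pos[0] + dx, self.ref_pos[1] + dz)

mapper = VirtualWalkingMapper()
pos, yaw = mapper.to_virtual((1.0, 2.0), 0.0)    # pure walking: identity mapping
mapper.travel(10.0, 0.0)                          # then "fly" 10 m along x
pos2, yaw2 = mapper.to_virtual((1.0, 2.0), 0.0)  # same physical spot, new virtual spot
```

A full implementation would use 3D poses and quaternions, but the layering of transforms is the same.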
Gerd Bruder, Frank Steinicke
Chapter 11. Stepping-Driven Locomotion Interfaces
Abstract
Walking-in-place and real-walking locomotion interfaces for virtual environment systems are interfaces that are driven by the user’s actual stepping motions and do not include treadmills or other mechanical devices. While both walking-in-place and real-walking interfaces compute the user’s speed and direction and convert those values into viewpoint movement between frames, they differ in how they enable the user to move to any distant location in very large virtual scenes. Walking-in-place constrains the user’s actual movement to a small area and translates stepping-in-place motions into viewpoint movement. Real-walking applies one of several techniques to transform the virtual scene so that the user’s physical path stays within the available laboratory space. This chapter discusses implementations of these two types of interfaces with particular regard to how walking-in-place interfaces generate smooth motion and how real-walking interfaces modify the user’s view of the scene so deviations from her real motion are less detectable.
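The conversion of stepping motions into smooth viewpoint movement can be illustrated with a tiny controller sketch. The step length and smoothing factor here are assumed example values, not parameters from the chapter: detected stepping cadence implies a target speed, and a per-frame low-pass blend avoids start/stop jerk.

```python
class WalkInPlaceController:
    """Turn an estimated stepping cadence into smoothed viewpoint speed."""

    def __init__(self, meters_per_step=0.7, smoothing=0.2):
        self.meters_per_step = meters_per_step  # assumed virtual step length (m)
        self.smoothing = smoothing              # low-pass blend factor per frame
        self.speed = 0.0                        # current viewpoint speed (m/s)

    def update(self, dt, step_frequency_hz):
        """Blend toward the target speed implied by the stepping cadence,
        returning this frame's forward viewpoint displacement."""
        target = step_frequency_hz * self.meters_per_step
        self.speed += self.smoothing * (target - self.speed)
        return self.speed * dt

ctrl = WalkInPlaceController()
# User steps in place at 2 Hz at 60 fps; speed ramps smoothly toward 1.4 m/s.
displacements = [ctrl.update(1 / 60, 2.0) for _ in range(120)]
```

Published walking-in-place systems estimate cadence and direction from tracker or sensor data per frame; this sketch shows only the smoothing stage.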
Mary C. Whitton, Tabitha C. Peck
Chapter 12. Multimodal Rendering of Walking Over Virtual Grounds
Abstract
The addition of multimodal feedback during navigation in a virtual environment is fundamental when aiming at fully immersive and realistic simulations. Several visual, acoustic, haptic or vibrotactile perceptual cues can be generated when walking over a ground surface. Such sensory feedback can provide crucial and varied information regarding either the ground material itself, the properties of the ground surface such as slope or elasticity, the surrounding environment, the specificities of the foot-floor interaction such as gait phase or forces, or even users’ emotions. This chapter addresses the multimodal rendering of walking over virtual ground surfaces, incorporating haptic, acoustic and graphic rendering to enable truly multimodal walking experiences.
Maud Marchal, Gabriel Cirio, Yon Visell, Federico Fontana, Stefania Serafin, Jeremy Cooperstock, Anatole Lécuyer

Applications

Frontmatter
Chapter 13. Displacements in Virtual Reality for Sports Performance Analysis
Abstract
In real situations, analyzing the contribution of different parameters on sports performance is a difficult task. In a duel for example, an athlete needs to anticipate his opponent’s actions to win. To evaluate the relationship between perception and action in such a duel, the parameters used to anticipate the opponent’s action must then be determined. Only a fully standardized and controllable environment such as virtual reality can allow this analysis. Nevertheless, movement is inherent in sports and only a system providing a complete freedom of movements of the immersed subject (including displacements) would allow the study of the link between visual information uptake and action, that is related to performance. Two case studies are described to illustrate such use of virtual reality to better understand sports performance. Finally, we discuss how the introduction of new displacement devices can extend the range of applications in sports.
Richard Kulpa, Benoit Bideau, Sébastien Brault
Chapter 14. Redirected Walking in Mixed Reality Training Applications
Abstract
To create effective immersive training experiences, it is important to provide intuitive interfaces that allow users to move around and interact with virtual content in a manner that replicates real world experiences. However, natural locomotion remains an implementation challenge because the dimensions of the physical tracking space restrict the size of the virtual environment that users can walk through. To relax these limitations, redirected walking techniques may be employed to enable walking through immersive virtual environments that are substantially larger than the physical tracking area. In this chapter, we present practical design considerations for employing redirected walking in immersive training applications and recent research evaluating the impact on spatial orientation. Additionally, we also describe an alternative implementation of redirection that is more appropriate for mixed reality environments. Finally, we discuss challenges and future directions for research in redirected walking with the goal of transitioning these techniques into practical training simulators.
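The core mechanism behind one family of redirected walking techniques, rotation gain, can be sketched in a few lines. This is a hypothetical illustration; the gain value is an assumed example, not a detection threshold from the chapter: the virtual scene turns slightly more (or less) than the user's real head rotation, steering the physical path while the virtual path is preserved.

```python
import math

def redirect_yaw(virtual_yaw, real_yaw_delta, rotation_gain=1.3):
    """Map a real head-yaw change onto a scaled virtual-yaw change (radians)."""
    return virtual_yaw + rotation_gain * real_yaw_delta

virtual_yaw = 0.0
# The user physically turns 90 degrees in 1-degree increments...
for _ in range(90):
    virtual_yaw = redirect_yaw(virtual_yaw, math.radians(1.0))
# ...but has turned ~117 degrees in the virtual environment, so the
# physical turn needed for a given virtual turn is reduced.
```

Practical systems keep gains within detection thresholds and combine them with curvature and translation gains plus a steering policy, none of which is shown here.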
Evan A. Suma, David M. Krum, Mark Bolas
Chapter 15. VR-Based Assessment and Rehabilitation of Functional Mobility
Abstract
The advent of virtual reality (VR) as a tool for real-world training dates back to the mid-twentieth century and the early years of driving and flight simulators. These simulation environments, while far below the quality of today’s visual displays, proved to be advantageous to the learner due to the safe training environments the simulations provided. More recently, these training environments have proven beneficial in the transfer of user-learned skills from the simulated environment to the real world [5, 31, 48, 51, 57]. Of course the VR technology of today has come a long way. Contemporary displays boast high-resolution, wide-angle fields of view and increased portability. This has led to the evolution of new VR research and training applications in many different arenas, several of which are covered in other chapters of this book. This is true of clinical assessment and rehabilitation as well, as the field has recognized the potential advantages of incorporating VR technologies into patient training for almost 20 years [7, 10, 18, 45, 78].
Adam W. Kiefer, Christopher K. Rhea, William H. Warren
Chapter 16. Full Body Locomotion with Video Game Motion Controllers
Abstract
Sensing technologies of increasing fidelity are dropping in cost to the point that full body sensing hardware is commonplace in people’s homes. This presents an opportunity for users to interact and move through environments with their body: not just walking or running, but jumping, dodging, looking, dancing and exploring. Three current-generation video game devices, the Nintendo Wii Remote, PlayStation Move and Microsoft Kinect, are discussed in terms of their sensors and data, in order to explore two questions. First, how do you deal with the data from the devices, including error, uncertainty and volume? Second, how do you use the devices to create an interface that allows the user to interact as they wish? While these devices will change in time, understanding the sensing methods and approach to interface design will act as a basis for further improvements to full body locomotion.
Brian Williamson, Chadwick Wingrave, Joseph J. LaViola Jr.
Chapter 17. Interacting with Augmented Floor Surfaces
Abstract
This chapter reviews techniques and technologies for interaction via the feet with touch-sensitive floor surfaces that are augmented with multimodal (visual, auditory, and/or haptic) feedback. We discuss aspects of human-computer interaction with such interfaces, including potential applications in virtual and augmented reality for floor-based user interfaces and immersive walking simulations. Several realizations of augmented floor surfaces are discussed, and we review one case example that has been extensively investigated by the authors, along with evaluations that have been reported in prior literature. Potential applications in the domains of human-computer interaction and virtual reality are also reviewed.
Yon Visell, Severin Smith, Jeremy R. Cooperstock
Backmatter
Metadata
Title
Human Walking in Virtual Environments
Edited by
Frank Steinicke
Yon Visell
Jennifer Campos
Anatole Lécuyer
Copyright year
2013
Publisher
Springer New York
Electronic ISBN
978-1-4419-8432-6
Print ISBN
978-1-4419-8431-9
DOI
https://doi.org/10.1007/978-1-4419-8432-6