
2016 | Book

Exploring the Use of Eye Gaze Controlled Interfaces in Automotive Environments


About this book

This book provides a concise study of eye gaze tracking as a direct controller of electronic displays and interfaces inside cars and other vehicles. The author explores the prospect of controlling a vehicle’s internal systems via the driver’s eye gaze, and of the vehicle analysing and responding to changes in the driver’s cognitive load.

New algorithms tackling micro-saccadic eye movements and the inaccuracy of eye gaze tracking for controlling on-screen pointers are presented and explored. Multimodal fusion algorithms involving eye gaze and finger tracking systems are presented and validated, and important results are reported on gaze controlled interfaces and visual responses when encountering oncoming road hazards. A set of user trials involving driving simulators to validate the algorithms is also presented by the author.

Exploring the Use of Eye Gaze Controlled Interfaces in Automotive Environments will be of great importance to researchers and designers alike, within the fields of automotive design and engineering, human-computer interaction (HCI) and intelligent interfaces.

Table of Contents

Frontmatter
Chapter 1. Introduction
Abstract
Modern infotainment systems in automobiles facilitate driving at the cost of adding secondary tasks to the primary task of driving. These secondary tasks have a considerable chance of distracting a driver from the primary driving task, thereby reducing safety or increasing cognitive workload. Thus, easing human-machine interaction (HMI) between drivers and infotainment systems can potentially improve safety and help to leverage the true potential of those systems.
Pradipta Biswas
Chapter 2. Preliminary Studies on Input Modalities
Abstract
This chapter reports a set of user studies on a pointing and selection task similar to the ISO 9241 pointing task. Users undertook pointing and selection trials using eye gaze and finger trackers for different target sizes and distances. We conducted trials in desktop and automotive environments, and results from these trials were later used to develop target prediction and multimodal fusion algorithms.
Pradipta Biswas
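
As a rough illustration of how ISO 9241-style pointing trials are typically scored (this sketch is not taken from the book; the target distance, width and movement time below are illustrative assumptions), the Fitts'-law index of difficulty and throughput of a single trial can be computed as follows:

```python
import math

def index_of_difficulty(distance_px: float, width_px: float) -> float:
    """Shannon formulation of the Fitts' index of difficulty, in bits."""
    return math.log2(distance_px / width_px + 1.0)

def throughput(distance_px: float, width_px: float, movement_time_s: float) -> float:
    """Throughput in bits/s for one pointing-and-selection trial."""
    return index_of_difficulty(distance_px, width_px) / movement_time_s

# Illustrative trial: 400 px target distance, 50 px target width, 1.2 s movement time.
if __name__ == "__main__":
    print(f"ID = {index_of_difficulty(400, 50):.2f} bits")   # ~3.17 bits
    print(f"TP = {throughput(400, 50, 1.2):.2f} bits/s")
```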
Chapter 3. Intelligent Multimodal Systems
Abstract
The previous chapter compared gaze controlled and finger tracking based pointing modalities, and we found that users had difficulty homing in on targets with the gaze controlled interface, and even with the finger tracking system in the automotive environment. In this chapter, we propose an algorithm that can activate a target even before the pointer reaches it. For a gaze controlled interface, the target is activated as soon as a saccade lands near it, reducing the fixation duration required to activate a target. The latter half of the chapter discusses different fusion strategies to combine eye gaze and finger tracking systems. The previous chapter already introduced a multimodal eye gaze tracking system combining eye gaze tracking with a joystick and a LeapMotion controller; this chapter takes the concept forward with more sophisticated fusion models.
Pradipta Biswas
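
The early-activation idea summarised above could be sketched minimally as follows (this is an illustration only, not the author's algorithm; the dashboard layout, target names and activation radius are assumptions): activate the nearest on-screen target whose activation zone contains the estimated saccade landing point.

```python
from dataclasses import dataclass
from typing import Optional, Sequence
import math

@dataclass
class Target:
    name: str
    x: float        # centre x in pixels
    y: float        # centre y in pixels
    radius: float   # activation radius in pixels (assumed, not from the book)

def predict_activated_target(gaze_x: float, gaze_y: float,
                             targets: Sequence[Target]) -> Optional[Target]:
    """Return the target whose activation zone contains the saccade landing
    point, preferring the nearest target when zones overlap."""
    best, best_dist = None, float("inf")
    for t in targets:
        dist = math.hypot(gaze_x - t.x, gaze_y - t.y)
        if dist <= t.radius and dist < best_dist:
            best, best_dist = t, dist
    return best

# Illustrative dashboard layout and saccade landing point.
buttons = [Target("radio", 200, 600, 60), Target("climate", 400, 600, 60)]
hit = predict_activated_target(215, 590, buttons)
print(hit.name if hit else "no target activated")   # -> "radio"
```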
Chapter 4. User Studies on Driving Simulator
Abstract
In this chapter, we investigate the use of eye gaze tracking for operating a dashboard in an automotive environment. In our previous studies, the primary task was pointing and selection in a graphical user interface. In the following two studies, we investigated pointing and selection tasks in a dual-task set-up where the primary task was operating a driving simulator. The first study involved two different secondary tasks to compare four different pointing modalities (touchscreen and the fusion strategies discussed in previous chapters). The second study investigated the effect of eye gaze tracking on driving performance in two different road conditions. In the following sections, we describe the studies in detail.
Pradipta Biswas
Chapter 5. Preliminary Study on Cognitive Load Detection
Abstract
Detection of drivers’ cognitive load is a well-investigated subject in automotive technology. Recent advancements in information technology have added a plethora of infotainment systems inside a car to help the driver. However, both proper and improper use of those systems often distracts the driver from driving and leads to accidents. Besides the infotainment system, changes in the mental state of drivers due to road or traffic conditions and the duration of driving can also increase the chances of an accident.
Pradipta Biswas
Chapter 6. User Studies on Saccadic Intrusion
Abstract
The previous chapter confirmed that saccadic intrusion may be a viable alternative for detecting higher cognitive load. This chapter presents a set of user studies in which participants were subjected to tasks demanding varying amounts of cognitive load. We measured and compared parameters involving saccadic intrusion in different task conditions.
Pradipta Biswas
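
As a hedged illustration of how saccade-like events might be counted from a gaze trace (the velocity threshold, sampling rate, units and synthetic trace below are assumptions and do not reproduce the book's parameters), a simple velocity-threshold detector could look like this:

```python
import numpy as np

def count_saccadic_events(gaze_deg: np.ndarray, sample_rate_hz: float,
                          velocity_threshold_deg_s: float = 30.0) -> int:
    """Count saccade-like events in a 1-D gaze-position trace (degrees).

    An 'event' is a contiguous run of samples whose angular velocity exceeds
    the threshold. Threshold and units are illustrative assumptions.
    """
    velocity = np.abs(np.diff(gaze_deg)) * sample_rate_hz   # deg/s
    above = velocity > velocity_threshold_deg_s
    # Count rising edges: places where velocity crosses the threshold upward.
    return int(np.count_nonzero(above[1:] & ~above[:-1]) + int(above[0]))

# Illustrative 60 Hz trace: steady fixation with two brief excursions; each
# excursion yields an outward and a return movement, so 4 events in total.
trace = np.zeros(120)
trace[30:33] += 1.5   # ~1.5 degree excursion
trace[80:83] += 1.2
print(count_saccadic_events(trace, sample_rate_hz=60.0))   # -> 4
```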
Chapter 7. Concluding Remarks
Abstract
The previous chapters demonstrate that it is possible to control on-board infotainment systems in a car using a gaze-controlled interface, and that an off-the-shelf eye gaze tracker can also be used to detect sudden changes in drivers’ cognitive load. So how far are we from installing a gaze-controlled interface in a real car? The following paragraphs point out limitations of existing technology and also point to future research directions.
Pradipta Biswas
Metadata
Title
Exploring the Use of Eye Gaze Controlled Interfaces in Automotive Environments
Author
Pradipta Biswas
Copyright Year
2016
Electronic ISBN
978-3-319-40709-8
Print ISBN
978-3-319-40708-1
DOI
https://doi.org/10.1007/978-3-319-40709-8
