
Clinical Neurophysiology

Volume 125, Issue 11, November 2014, Pages 2297-2304

Comparing tactile and visual gaze-independent brain–computer interfaces in patients with amyotrophic lateral sclerosis and healthy users

https://doi.org/10.1016/j.clinph.2014.03.005

Highlights

  • ALS patients with mild to moderate disabilities can control a visual gaze-independent BCI spelling system.

  • The visual Hex-o-Spell outperforms the tactile speller in both healthy participants and ALS patients.

  • Subjective assessment shows that attending to visual stimuli is easier than attending to tactile stimuli, even if the stimuli are in the peripheral visual field.

Abstract

Objective

Brain–computer interfaces (BCIs) tested in patients are often gaze-dependent, although these intended users may lose the ability to control their gaze. Therefore, a visual and a tactile gaze-independent spelling system were investigated.

Methods

Five patients with amyotrophic lateral sclerosis (ALS) tested a visual Hex-o-Spell and a tactile speller. Six healthy participants were also included, mainly to evaluate the tactile stimulators.

Results

A significant attentional modulation was seen in the P300 for the Hex-o-Spell and in the N2 for the tactile speller. Average online classification performance for selecting a step in the speller was above chance level (17%) for both spellers. However, average performance was higher for the Hex-o-Spell (88% and 85% for healthy participants and patients, respectively) than for the tactile speller (56% and 53%, respectively). Likewise, bitrates were higher for the Hex-o-Spell than for the tactile speller, and the subjective usability assessment showed a preference for the Hex-o-Spell.
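For context, the 17% chance level follows directly from the six-class step selection (1/6 ≈ 0.17). The sketch below converts the reported step accuracies into bits per selection using the common Wolpaw information-transfer formula; this formula is an assumption for illustration only, since the paper's exact bitrate computation is not reproduced in this excerpt, and the reported bitrates additionally depend on selection time.

    import math

    def wolpaw_bits_per_selection(p, n=6):
        """Bits transferred per n-class selection (Wolpaw definition),
        assuming accuracy p and errors spread evenly over the other classes."""
        if p <= 1.0 / n:
            return 0.0
        return (math.log2(n)
                + p * math.log2(p)
                + (1 - p) * math.log2((1 - p) / (n - 1)))

    # Step accuracies reported in the abstract (Hex-o-Spell vs. tactile speller)
    for label, p in [("Hex-o-Spell, healthy", 0.88), ("Hex-o-Spell, ALS", 0.85),
                     ("tactile, healthy", 0.56), ("tactile, ALS", 0.53)]:
        print(f"{label}: {wolpaw_bits_per_selection(p):.2f} bits/selection")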

Conclusions

The Hex-o-Spell outperformed the tactile speller in classification performance, bitrate and subjective usability.

Significance

This is the first study showing the possible use of tactile and visual gaze-independent BCI spelling systems by ALS patients with mild to moderate disabilities.

Introduction

The most investigated brain–computer interface (BCI) for communication is the visual speller, which uses flashing rows and columns (Farwell and Donchin, 1988). Participants pay attention to a symbol that they want to select. Every symbol is characterized by its own unique sequence of flashes. The flash of an attended symbol yields a different electroencephalographic (EEG) response compared with flashes of other symbols. By aggregating information over a sequence of flashes, the BCI can detect the desired symbol. To achieve the highest performance, participants need to direct their gaze to the desired symbol: a covert version of this speller, in which users are not allowed to look at the symbol but must keep their gaze fixated on the center of the display, has much lower performance (Brunner et al., 2010, Treder and Blankertz, 2010). This means that patients whose vision or gaze control is impaired will have significantly reduced performance, or cannot use this system at all. Patients with amyotrophic lateral sclerosis (ALS), a progressive neurodegenerative disease affecting the motor neurons, have a high risk of becoming locked-in. Furthermore, many of these patients may develop oculomotor control deficits. Therefore, they probably cannot use systems that depend on eye gaze. Recently, Riccio et al. (2012) systematically reviewed the available systems that could be used independently of eye gaze in the auditory, tactile and visual domain. Only two of the 34 included articles tested their system with end-users (ALS patients). Both were auditory BCIs (Sellers and Donchin, 2006, Kübler et al., 2009). To the best of our knowledge, no studies have been conducted with visual or tactile gaze-independent BCIs in ALS patients.
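As a concrete illustration of the aggregation step described above, the sketch below sums binary-classifier evidence per symbol over a sequence of flashes and selects the symbol with the largest total. This is a minimal sketch under assumed array shapes, not the implementation used in the cited studies; the single-flash classifier scores are taken as given.

    import numpy as np

    def select_symbol(flash_scores, flash_masks):
        """Aggregate single-flash classifier evidence into one symbol choice.

        flash_scores : (n_flashes,) classifier output per flash; higher means
                       'more target-like' (a larger P300-like response).
        flash_masks  : (n_flashes, n_symbols) boolean matrix; True where a
                       symbol was part of that flash (a row or column flash
                       intensifies several symbols at once).
        """
        totals = flash_masks.astype(float).T @ flash_scores  # evidence per symbol
        return int(np.argmax(totals))                         # most evidence wins

    # Toy usage for a 6 x 6 matrix speller with random scores and flash masks
    rng = np.random.default_rng(0)
    scores = rng.normal(size=120)               # 120 flashes in one trial
    masks = rng.random((120, 36)) < 6 / 36      # ~6 symbols lit per flash
    print(select_symbol(scores, masks))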

Several tactile BCI systems have been tested successfully in healthy subjects. A first BCI based on tactile stimulation focused on steady-state somatosensory evoked potentials (Müller-Putz et al., 2006). Later, transient ERP responses to tactile stimulation of the trunk (Brouwer and van Erp, 2010, Thurlings et al., 2011, Thurlings et al., 2012) and of the fingers (Severens et al., 2013) were used as well. Recently, we showed in healthy subjects that a tactile speller performed similarly to a gaze-independent visual speller (Van der Waal et al., 2012). In the tactile speller, stimulus events are short mechanical taps against the fingertips. Initially, each finger corresponds to a group of letters. After the selection of a subset of letters, these letters are redistributed over the fingers and a single letter can be selected. Thus, spelling a letter is a two-step procedure. The gaze-independent visual speller used in that comparison was the Hex-o-Spell. This speller was first introduced by Treder and Blankertz (2010) and was later evaluated in an online setting (Treder et al., 2011). In the Hex-o-Spell, the letters are divided over six circles on the screen, and covert attention is used to select the desired circle. Stimulus events are an intensification of a circle and the symbols it contains. Again, the selection of a letter occurs in two steps: first, a circle of letters is selected; second, the remaining letters are distributed over the circles and an individual letter can be selected. Both the tactile speller and the Hex-o-Spell are therefore promising for patients who cannot control their eye gaze.
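The two-step selection scheme shared by both spellers can be summarised in a few lines. The letter-to-group layout below and the assumption that one six-class BCI decision is available per step are illustrative only; the actual layouts used in the cited spellers differ in detail from this sketch.

    # Minimal sketch of two-step spelling: six groups of letters, then one
    # letter from the chosen group. GROUPS is an assumed layout.
    GROUPS = ["ABCDE", "FGHIJ", "KLMNO", "PQRST", "UVWXY", "Z_.,!"]

    def spell_letter(decode_step):
        """decode_step() stands for one six-class BCI decision (0..5), i.e.
        the attended finger in the tactile speller or the attended circle
        in the Hex-o-Spell."""
        group = GROUPS[decode_step()]       # step 1: select a group of letters
        letters = list(group)               # step 2: the group's letters are
        return letters[decode_step()]       #         redistributed; pick one

    # Toy usage: a decoder that always returns option 2 spells the letter 'M'
    print(spell_letter(lambda: 2))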

The number of studies in which end-users evaluate the BCI is limited. Although some studies that do include end-users report performance similar to that of able-bodied subjects (Sellers and Donchin, 2006, Nijboer et al., 2008b, Pires et al., 2012), others report lower performance in end-users (Piccione et al., 2006, Kübler et al., 2009, Ortner et al., 2011). Evaluations with end-users are necessary to assess not only the performance of BCI systems, but also their usability for these end-users.

In the present paper, the performance of the tactile speller and the Hex-o-Spell was compared. Participants included a group of healthy subjects and a group of ALS patients. Because this is the first study on gaze-independent BCIs using tactile and visual stimuli that includes patients, we selected patients with mild to moderate disabilities. The healthy group was included mainly to test the newly developed tactile stimulators. Furthermore, for both systems the subjective usability in terms of need for support, training and complexity was assessed and compared.

Section snippets

Participants

Eleven volunteers participated in this study: six healthy subjects (mean age 20 years (SD 0.4), 4 female) and five individuals with ALS. Inclusion criteria were a diagnosis of typical ALS and an illness duration of less than 3 years. Exclusion criteria were other neurological disorders and an inability to understand and carry out the test instructions. All participants gave written informed consent before the start of the experiment. The experiment was approved by the ethical committee of the

ERP responses

In the ERP difference waves, several responses could be seen. First, in the tactile speller a negative component around 200 ms (N2) was found, which was larger for targets than for non-targets (see Fig. 3a). This N2 was clearest at left temporo-parietal electrodes. The difference between target and non-target in this component was significant for only the tactile speller in the patient group. In the Hex-o-Spell, no N2 was found, and the difference between target and non-target is also not

Discussion

The aim of this paper was to assess the online performance and usability of two gaze-independent BCI spelling systems. Overall, the online performance of the Hex-o-Spell was higher than that of the tactile speller. This pattern was found in both the binary classification and the six-class decoding, and it also translated to higher usability scores for the Hex-o-Spell than for the tactile speller. For the tactile speller, the N2 was stronger for targets compared with non-targets

Acknowledgments

The authors gratefully acknowledge the support of the BrainGain Smart Mix Programme of the Netherlands Ministry of Economic Affairs and the Netherlands Ministry of Education, Culture and Science. In addition, we would like to thank Jos Wittebrood and Pascal de Water from the electronic research group for their technical support, and Philip van den Broek for software development and support.

References (30)

  • D.H. Brainard. The psychophysics toolbox. Spat Vis (1997)

  • A.M. Brouwer et al. A tactile P300 brain–computer interface. Front Neurosci (2010)

  • P. Brunner et al. Does the “P300” speller depend on eye gaze? J Neural Eng (2010)

  • J. Farquhar et al. Interactions between pre-processing and classification methods for event-related-potential classification: best-practice guidelines for brain–computer interfacing. Neuroinformatics (2012)

  • J.E. Huggins et al. What would brain–computer interface users want? Opinions and priorities of potential users with amyotrophic lateral sclerosis. Amyotroph Lateral Scler (2011)