
A novel human–machine interface based on recognition of multi-channel facial bioelectric signals

  • Scientific Paper
  • Published:
Australasian Physical & Engineering Sciences in Medicine

Abstract

This paper presents a novel human–machine interface that allows disabled people to interact with assistive systems for a better quality of life. It is based on multi-channel forehead bioelectric signals acquired by placing three pairs of electrodes (physical channels) on the Frontalis and Temporalis facial muscles. The acquired signals are passed through a parallel filter bank to explore three different sub-bands related to the facial electromyogram, electrooculogram and electroencephalogram. Root mean square features of the bioelectric signals are extracted within non-overlapping 256 ms analysis windows. Subtractive fuzzy c-means clustering (SFCM) is applied to segment the feature space and generate initial fuzzy Takagi–Sugeno rules. An adaptive neuro-fuzzy inference system is then exploited to tune the premise and consequent parameters of the extracted SFCM rules. The average classifier discrimination ratio for eight facial gestures (smiling, frowning, pulling up the left/right lip corner, and moving the eyes left/right/up/down) is between 93.04% and 96.99%, depending on the combination and fusion of logical features. Experimental results show that the proposed interface discriminates the eight fundamental facial gestures with a high degree of accuracy and robustness. Further potential applications of the approach in human–machine interfaces are also discussed.
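The front end described in the abstract, a parallel band-pass filter bank followed by RMS features over non-overlapping 256 ms windows, can be sketched as below. This is a minimal illustration, not the authors' implementation: the sampling rate, filter order, and the exact sub-band cut-off frequencies for the EEG-, EOG- and EMG-dominated bands are assumptions, since the abstract does not specify them.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 1000                 # Hz; assumed sampling rate (not given in the abstract)
WIN = int(0.256 * FS)     # non-overlapping 256 ms analysis windows

# Illustrative sub-bands (Hz) for EEG-, EOG- and EMG-dominated activity;
# the cut-off values are assumptions, not taken from the paper.
BANDS = {"eeg": (1.0, 30.0), "eog": (0.5, 10.0), "emg": (30.0, 450.0)}

def band_rms_features(signals, fs=FS):
    """signals: (n_channels, n_samples) forehead recordings.

    Returns an (n_windows, n_channels * n_bands) matrix where each column
    is the per-window RMS of one channel in one sub-band.
    """
    n_ch, n_samples = signals.shape
    n_win = n_samples // WIN
    feats = []
    for lo, hi in BANDS.values():
        # Zero-phase band-pass filtering for this sub-band
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        filtered = sosfiltfilt(sos, signals, axis=-1)
        # Split into non-overlapping windows and take the RMS of each
        w = filtered[:, : n_win * WIN].reshape(n_ch, n_win, WIN)
        feats.append(np.sqrt(np.mean(w ** 2, axis=-1)).T)  # (n_win, n_ch)
    return np.concatenate(feats, axis=1)

# 3 physical channels, 4 s of synthetic data -> 15 windows x 9 features
x = np.random.randn(3, 4 * FS)
F = band_rms_features(x)
print(F.shape)  # (15, 9)
```

The resulting feature matrix would then be fed to the clustering and neuro-fuzzy stages; each row is one 256 ms window, each column one channel/sub-band combination.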






Acknowledgments

We would like to thank Dr. Christian Jones from USC-Australia for sharing his expertise in the area of affective computing. The help of our eager volunteer participants is also appreciated.

Author information

Corresponding author

Correspondence to Iman Mohammad Rezazadeh.


About this article

Cite this article

Mohammad Rezazadeh, I., Firoozabadi, S.M., Hu, H. et al. A novel human–machine interface based on recognition of multi-channel facial bioelectric signals. Australas Phys Eng Sci Med 34, 497–513 (2011). https://doi.org/10.1007/s13246-011-0113-1

