Abstract
This paper presents a novel human–machine interface that enables disabled people to interact with assistive systems for a better quality of life. It is based on multi-channel forehead bioelectric signals acquired by placing three pairs of electrodes (physical channels) on the Frontalis and Temporalis facial muscles. The acquired signals are passed through a parallel filter bank to explore three different sub-bands related to the facial electromyogram, electrooculogram and electroencephalogram. Root mean square features of the bioelectric signals were extracted within non-overlapping 256 ms windows. The subtractive fuzzy c-means clustering method (SFCM) was applied to segment the feature space and generate initial fuzzy Takagi–Sugeno rules. An adaptive neuro-fuzzy inference system was then exploited to tune the premise and consequence parameters of the extracted SFCM rules. The average classifier discrimination ratio for eight different facial gestures (smiling, frowning, pulling up the left/right lip corner, and eye movement to the left/right/up/down) is between 93.04% and 96.99%, depending on the combination and fusion of logical features. Experimental results show that the proposed interface discriminates the eight fundamental facial gestures with a high degree of accuracy and robustness. Some potential further capabilities of our approach in human–machine interfaces are also discussed.
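The front end of the pipeline described above (a parallel filter bank followed by RMS features over non-overlapping 256 ms windows) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the sampling rate, filter order, and sub-band cutoff frequencies are assumptions chosen as typical values for EOG/EEG/EMG content, since the abstract does not specify them.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def rms_features(x, fs, win_ms=256):
    """RMS over non-overlapping windows of win_ms milliseconds."""
    n = int(fs * win_ms / 1000)           # samples per window
    x = x[: (len(x) // n) * n]            # drop the incomplete tail window
    return np.sqrt(np.mean(x.reshape(-1, n) ** 2, axis=1))

def bandpass(x, fs, low, high, order=4):
    """Zero-phase Butterworth band-pass filter (one branch of the bank)."""
    nyq = fs / 2.0
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    return filtfilt(b, a, x)

# Hypothetical sub-band edges in Hz -- the paper does not list its cutoffs here.
BANDS = {"EOG": (0.5, 10.0), "EEG": (8.0, 30.0), "EMG": (30.0, 450.0)}

fs = 1000.0                               # assumed sampling rate
x = np.random.randn(int(fs * 2))          # 2 s of synthetic single-channel data
features = {name: rms_features(bandpass(x, fs, lo, hi), fs)
            for name, (lo, hi) in BANDS.items()}
# 2 s at 1 kHz with 256-sample windows -> 7 feature values per sub-band
```

In the full system these per-band RMS vectors, computed for each of the three physical channels, would form the feature space that SFCM segments and the ANFIS stage classifies.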
Acknowledgments
We would like to thank Dr. Christian Jones of USC-Australia for sharing his expertise in the area of affective computing. The help of the eager volunteer participants is also appreciated.
Cite this article
Mohammad Rezazadeh, I., Firoozabadi, S.M., Hu, H. et al. A novel human–machine interface based on recognition of multi-channel facial bioelectric signals. Australas Phys Eng Sci Med 34, 497–513 (2011). https://doi.org/10.1007/s13246-011-0113-1