Published in: Cluster Computing 2/2023

03.11.2022

An AI-empowered affect recognition model for healthcare and emotional well-being using physiological signals

Authors: Zijian Zhou, Muhammad Adeel Asghar, Daniyal Nazir, Kamran Siddique, Mohammad Shorfuzzaman, Raja Majid Mehmood


Abstract

Affective computing is one of the central studies for achieving advanced human-computer interaction and a popular research direction in artificial intelligence for smart healthcare frameworks. In recent years, the use of electroencephalogram (EEG) signals to analyze human emotional states has become a hot spot in the field of emotion recognition. However, the EEG is a non-stationary, non-linear signal that is sensitive to interference from other physiological signals and external factors. Traditional emotion recognition methods are limited by complex algorithmic structures and low recognition accuracy. In this article, based on an in-depth analysis of EEG signals, we study emotion recognition methods in the following respects. First, using the DEAP dataset and the arousal model, the raw signal was filtered to suppress interfering components: the frequency band of interest was selected with a Butterworth filter, and the data were then scaled to a common range using min-max normalization. In addition, we performed hybrid experiments on sliding windows and overlaps to obtain an optimal combination for feature computation, and applied the Discrete Wavelet Transform (DWT) to extract features from the preprocessed EEG data. Finally, a k-Nearest Neighbor (kNN) machine learning model was used for recognition and classification, and different combinations of DWT and kNN parameters were tested and fitted. After 10-fold cross-validation, the accuracy reached 86.4%. Compared to state-of-the-art research, this method achieves higher recognition accuracy than conventional methods while maintaining a simple structure and fast operation.
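The pipeline described in the abstract (normalization, DWT feature extraction, kNN classification) can be sketched in a minimal, self-contained form. This is an illustrative assumption, not the paper's exact configuration: it uses a single-level Haar wavelet, mean/std sub-band features, and a from-scratch kNN; the Butterworth band-selection step and all function names are hypothetical simplifications.

```python
import numpy as np

def min_max_normalize(x):
    """Scale a 1-D signal to the [0, 1] range (min-max normalization)."""
    return (x - x.min()) / (x.max() - x.min())

def haar_dwt(x):
    """One level of the Haar discrete wavelet transform.
    Returns (approximation, detail) coefficient arrays."""
    x = x[: len(x) // 2 * 2]              # truncate to even length
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return approx, detail

def dwt_features(x):
    """Feature vector: mean and std of each DWT sub-band (an illustrative choice)."""
    approx, detail = haar_dwt(x)
    return np.array([approx.mean(), approx.std(), detail.mean(), detail.std()])

def knn_predict(train_X, train_y, query, k=3):
    """Classify `query` by majority vote among its k Euclidean-nearest neighbors."""
    dists = np.linalg.norm(train_X - query, axis=1)
    nearest_labels = train_y[np.argsort(dists)[:k]]
    labels, counts = np.unique(nearest_labels, return_counts=True)
    return labels[np.argmax(counts)]

# Toy demonstration on synthetic "trials" with different dominant frequencies,
# standing in for preprocessed EEG segments.
rng = np.random.default_rng(0)
t = np.arange(256) / 128.0                # 2 s at a 128 Hz sampling rate

def make_trial(freq):
    return min_max_normalize(np.sin(2 * np.pi * freq * t)
                             + 0.1 * rng.standard_normal(t.size))

X = np.array([dwt_features(make_trial(f)) for f in (4, 4, 4, 30, 30, 30)])
y = np.array([0, 0, 0, 1, 1, 1])
print(knn_predict(X, y, dwt_features(make_trial(4)), k=3))
```

In practice one would band-limit the raw EEG first (e.g. with a Butterworth filter), use a multi-level wavelet decomposition, and tune the window/overlap and k via cross-validation, as the abstract describes.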


Metadata
Title
An AI-empowered affect recognition model for healthcare and emotional well-being using physiological signals
Authors
Zijian Zhou
Muhammad Adeel Asghar
Daniyal Nazir
Kamran Siddique
Mohammad Shorfuzzaman
Raja Majid Mehmood
Publication date
03.11.2022
Publisher
Springer US
Published in
Cluster Computing / Issue 2/2023
Print ISSN: 1386-7857
Electronic ISSN: 1573-7543
DOI
https://doi.org/10.1007/s10586-022-03705-0
