Published in: International Journal of Machine Learning and Cybernetics 4/2024

11.09.2023 | Original Article

CoDF-Net: coordinated-representation decision fusion network for emotion recognition with EEG and eye movement signals

Authors: Xinrong Gong, Yihan Dong, Tong Zhang



Abstract

Physiological signals such as EEG and eye movements have emerged as promising research topics in emotion recognition because of their objectivity, high recognition accuracy, and cost-effectiveness. However, most existing methods fuse EEG and eye movement signals by concatenation or weighted summation, which can cause information loss and limits robustness to noise. To tackle this issue, this paper proposes a Coordinated-representation Decision Fusion Network (CoDF-Net) to efficiently fuse the representations of EEG and eye movement signals. Specifically, CoDF-Net first learns personalized information by maximizing the correlation between modalities. Next, the Decision-level Fusion Broad Learning System (DF-BLS) constructs multiple sub-systems and obtains the final emotional state through an effective decision-making mechanism. To evaluate the proposed method, subject-dependent and subject-independent experiments are designed on two public datasets. Extensive experiments demonstrate that the proposed method outperforms both traditional approaches and current state-of-the-art methods: CoDF-Net achieves 94.09% and 91.62% accuracy in the subject-dependent setting and 87.04% and 83.87% in the subject-independent setting on the SEED-CHN and SEED-GER datasets, respectively. Moreover, experiments that add Gaussian noise with different standard deviations show that the proposed method is more robust to noise than the compared approaches.

Metadata
Title
CoDF-Net: coordinated-representation decision fusion network for emotion recognition with EEG and eye movement signals
Authors
Xinrong Gong
Yihan Dong
Tong Zhang
Publication date
11.09.2023
Publisher
Springer Berlin Heidelberg
Published in
International Journal of Machine Learning and Cybernetics / Issue 4/2024
Print ISSN: 1868-8071
Electronic ISSN: 1868-808X
DOI
https://doi.org/10.1007/s13042-023-01964-w
