Published in: Artificial Intelligence Review 1/2023

13.04.2022

State of the art: a review of sentiment analysis based on sequential transfer learning

Authors: Jireh Yi-Le Chan, Khean Thye Bea, Steven Mun Hong Leow, Seuk Wai Phoong, Wai Khuen Cheng

Published in: Artificial Intelligence Review | Issue 1/2023


Abstract

Recently, sequential transfer learning has emerged as a modern technique that applies the "pretrain then fine-tune" paradigm, leveraging existing knowledge to improve performance on various downstream NLP tasks, sentiment analysis being no exception. Previous literature has mostly focused on reviewing the application of various deep learning models to sentiment analysis. However, supervised deep learning methods are known to be data-hungry, and insufficient training data in practice can make such applications impractical. To this end, sequential transfer learning offers a way to alleviate the training bottleneck caused by data scarcity and to facilitate sentiment analysis applications. This study discusses the background of sequential transfer learning, reviews the evolution of pretrained models, extends the literature with applications of sequential transfer learning to different sentiment analysis tasks (aspect-based sentiment analysis, multimodal sentiment analysis, sarcasm detection, cross-domain sentiment classification, multilingual sentiment analysis, and emotion detection), and suggests future research directions on model compression, effective knowledge adaptation techniques, neutrality detection, and ambivalence handling.
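As a concrete illustration of the "pretrain then fine-tune" paradigm described above, the following is a minimal sketch of fine-tuning a generically pretrained Transformer encoder for binary sentiment classification. It assumes the Hugging Face transformers and datasets libraries; the checkpoint (bert-base-uncased), the IMDB corpus, and all hyperparameters are illustrative assumptions rather than choices prescribed by the review.

# Minimal "pretrain then fine-tune" sketch for sentiment analysis.
# Assumptions: Hugging Face transformers/datasets are installed; the
# checkpoint, dataset, and hyperparameters below are illustrative only.
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

# Stage 1: load an encoder pretrained on large unlabeled corpora.
checkpoint = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

# Prepare a (small) labeled downstream sentiment dataset.
imdb = load_dataset("imdb")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=256)

encoded = imdb.map(tokenize, batched=True)

# Stage 2: fine-tune the pretrained weights on the downstream task.
args = TrainingArguments(output_dir="sentiment-finetune",
                         num_train_epochs=2,
                         per_device_train_batch_size=16)
trainer = Trainer(model=model,
                  args=args,
                  tokenizer=tokenizer,
                  train_dataset=encoded["train"].shuffle(seed=42).select(range(2000)),
                  eval_dataset=encoded["test"].shuffle(seed=42).select(range(1000)))
trainer.train()            # adapts general linguistic knowledge to sentiment labels
print(trainer.evaluate())  # loss on held-out reviews

The two stages correspond to the two phases of sequential transfer learning: generic pretraining supplies reusable linguistic knowledge, and fine-tuning adapts it to a specific sentiment task with comparatively little labeled data.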


Metadata
Title
State of the art: a review of sentiment analysis based on sequential transfer learning
Authors
Jireh Yi-Le Chan
Khean Thye Bea
Steven Mun Hong Leow
Seuk Wai Phoong
Wai Khuen Cheng
Publication date
13.04.2022
Publisher
Springer Netherlands
Published in
Artificial Intelligence Review / Issue 1/2023
Print ISSN: 0269-2821
Electronic ISSN: 1573-7462
DOI
https://doi.org/10.1007/s10462-022-10183-8
