
16-06-2023

Learning Sentiment-Enhanced Word Representations by Fusing External Hybrid Sentiment Knowledge

Authors: You Li, Zhizhou Lin, Yuming Lin, Jinhui Yin, Liang Chang

Published in: Cognitive Computation | Issue 6/2023


Abstract

Word representation learning is a fundamental technique in cognitive computation that plays a crucial role in enabling machines to understand and process human language. By representing words as vectors in a high-dimensional space, computers can perform complex natural language processing tasks such as sentiment analysis. However, most word representation learning models are trained on open-domain corpora, which results in suboptimal performance on domain-specific tasks. To address this problem, we propose a unified learning framework that leverages external hybrid sentiment knowledge to enhance the sentiment information of distributed word representations. Specifically, we automatically acquire domain- and target-dependent sentiment knowledge from multiple sources. To mitigate knowledge noise, we introduce knowledge expectation and knowledge context weights to filter the acquired knowledge items. Finally, we integrate the filtered sentiment knowledge into the distributed word representations via a learning framework to enrich their semantic information. Extensive experiments verify the effectiveness of enhancing sentiment information in word representations across different sentiment analysis tasks. The experimental results show that the proposed models significantly outperform state-of-the-art baselines. Our work demonstrates the advantages of sentiment-enhanced word representations in sentiment analysis tasks and provides insights into acquiring and fusing sentiment knowledge from different domains to generate word representations with richer semantics.
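The paper itself details the proposed framework; as a rough illustration of the general idea of fusing lexicon-derived sentiment knowledge into pre-trained word vectors, the following minimal Python sketch applies a retrofitting-style update in which each knowledge item carries a confidence weight, loosely echoing the knowledge-filtering step described above. All names here (refine_with_sentiment, alpha, the confidence scores) are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the paper's actual framework): refine pre-trained word
# vectors with lexicon-derived sentiment knowledge, weighting each knowledge
# item by a confidence score so that noisy entries contribute less.
import numpy as np

def refine_with_sentiment(vectors, sentiment_knowledge, alpha=0.1, epochs=10):
    """vectors: dict mapping word -> np.ndarray (pre-trained embedding).
    sentiment_knowledge: list of (word_a, word_b, confidence) triples, where
    the pair is assumed to share sentiment polarity and confidence in [0, 1]
    plays the role of a knowledge-filtering weight."""
    refined = {w: v.copy() for w, v in vectors.items()}
    for _ in range(epochs):
        for w_a, w_b, conf in sentiment_knowledge:
            if w_a not in refined or w_b not in refined:
                continue
            # Pull same-polarity words toward each other, scaled by the
            # confidence weight of the knowledge item.
            refined[w_a] += alpha * conf * (refined[w_b] - refined[w_a])
            # Anchor to the original vector to preserve the distributional
            # semantics learned from the corpus.
            refined[w_a] += alpha * (vectors[w_a] - refined[w_a])
    return refined

# Toy usage with random vectors and two hypothetical knowledge items.
rng = np.random.default_rng(0)
vecs = {w: rng.normal(size=50) for w in ["good", "great", "bad"]}
knowledge = [("good", "great", 0.9), ("good", "bad", 0.1)]
refined = refine_with_sentiment(vecs, knowledge)
```

The anchoring term is what keeps the refined vectors close to their corpus-trained originals while the confidence-weighted sentiment knowledge pulls same-polarity words together; the paper's framework pursues the same goal with a richer learning objective and explicit noise filtering.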


Footnotes
1
Target-opinion word pair extraction is an active research area in sentiment analysis, and many methods have been proposed; a full treatment is beyond the scope of this paper. Here we use only two heuristic rules to extract target-opinion word pairs.
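The paper does not spell out its two heuristic rules here, so the sketch below is only a hypothetical illustration of what such dependency-based rules can look like: an amod rule for attributive adjectives and an nsubj+acomp rule for predicative ones, applied to generic Stanford-style (head, relation, dependent) triples rather than any particular parser's API.

```python
# Hypothetical heuristic rules for target-opinion word pair extraction;
# the actual rules used in the paper may differ.
def extract_pairs(dependency_triples):
    """dependency_triples: iterable of (head, relation, dependent) tuples
    from any dependency parser. Returns a list of (target, opinion) pairs."""
    pairs = []
    subjects = {}  # predicate word -> its nominal subject
    for head, rel, dep in dependency_triples:
        if rel == "amod":
            # Rule 1: attributive adjective, e.g. "great battery".
            pairs.append((head, dep))
        elif rel == "nsubj":
            subjects[head] = dep
    for head, rel, dep in dependency_triples:
        if rel == "acomp" and head in subjects:
            # Rule 2: predicative adjective, e.g. "the screen looks bright".
            pairs.append((subjects[head], dep))
    return pairs

# Toy triples approximating a parse of
# "The phone has a great battery and the screen looks bright."
triples = [("battery", "amod", "great"),
           ("looks", "nsubj", "screen"),
           ("looks", "acomp", "bright")]
print(extract_pairs(triples))  # [('battery', 'great'), ('screen', 'bright')]
```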
 
2
This sentence was annotated with CoreNLP, which is available at https://corenlp.run/.
 
Metadata
Title
Learning Sentiment-Enhanced Word Representations by Fusing External Hybrid Sentiment Knowledge
Authors
You Li
Zhizhou Lin
Yuming Lin
Jinhui Yin
Liang Chang
Publication date
16-06-2023
Publisher
Springer US
Published in
Cognitive Computation / Issue 6/2023
Print ISSN: 1866-9956
Electronic ISSN: 1866-9964
DOI
https://doi.org/10.1007/s12559-023-10164-1
