Published in: The Journal of Supercomputing 17/2023

05-06-2023

Topic sentiment analysis based on deep neural network using document embedding technique

Authors: Azam Seilsepour, Reza Ravanmehr, Ramin Nassiri


Abstract

Sentiment Analysis (SA) is a domain- or topic-dependent task, since polarity terms convey different sentiments in different domains. Hence, machine learning models trained on one domain cannot be employed in other domains, and existing domain-independent lexicons cannot correctly recognize the polarity of domain-specific terms. Conventional approaches to Topic Sentiment Analysis (TSA) perform Topic Modeling (TM) and SA sequentially, classifying sentiments with models previously trained on unrelated datasets, which cannot provide acceptable accuracy. Alternatively, some researchers perform TM and SA simultaneously using topic-sentiment joint models, which require a list of seed words and their sentiments drawn from widely used domain-independent lexicons; as a result, these methods also cannot determine the polarity of domain-specific terms correctly. This paper proposes a novel supervised hybrid TSA approach, called Embedding Topic Sentiment Analysis using Deep Neural Networks (ETSANet), that extracts the semantic relationships between the hidden topics and the training dataset using a Semantically Topic-Related Documents Finder (STRDF). STRDF discovers the training documents that share a context with a topic, based on the semantic relationships between the training dataset and the Semantic Topic Vector, a newly introduced representation that encompasses the semantic aspects of a topic. A hybrid CNN–GRU model is then trained on these semantically topic-related documents. Moreover, a hybrid metaheuristic combining Grey Wolf Optimization (GWO) and the Whale Optimization Algorithm (WOA) is employed to fine-tune the hyperparameters of the CNN–GRU network. The evaluation results demonstrate that ETSANet improves the accuracy of state-of-the-art methods by 1.92%.
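
The pipeline outlined in the abstract (selecting topic-related training documents, then training a CNN–GRU classifier on them) can be illustrated with a short sketch. This is a minimal, hedged illustration, not the authors' implementation: the Semantic Topic Vector is stood in for by an arbitrary dense embedding, the cosine-similarity threshold is a hypothetical placeholder for the STRDF criterion, and all layer sizes are illustrative values in place of the hyperparameters the paper tunes with the GWO/WOA metaheuristic.

```python
# Minimal sketch (assumptions, not the authors' code): pick training documents
# whose embeddings are close to a topic's Semantic Topic Vector, then train a
# small hybrid CNN-GRU sentiment classifier on that subset.
import numpy as np
from numpy.linalg import norm
from tensorflow.keras import layers, models


def cosine_similarity(a, b):
    # Cosine similarity between two dense vectors.
    return float(np.dot(a, b) / (norm(a) * norm(b) + 1e-9))


def find_topic_related_docs(semantic_topic_vector, doc_vectors, threshold=0.5):
    # Return indices of documents considered "in the same context" as the topic.
    # `threshold` is a hypothetical placeholder for the paper's STRDF criterion.
    return [i for i, vec in enumerate(doc_vectors)
            if cosine_similarity(semantic_topic_vector, vec) >= threshold]


def build_cnn_gru(vocab_size, embed_dim=100):
    # Generic hybrid CNN-GRU text classifier; sizes are illustrative, not tuned values.
    model = models.Sequential([
        layers.Embedding(vocab_size, embed_dim),
        layers.Conv1D(64, kernel_size=5, activation="relu"),
        layers.MaxPooling1D(pool_size=2),
        layers.GRU(64),
        layers.Dense(1, activation="sigmoid"),  # binary polarity output
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model
```

In practice, the subset returned by find_topic_related_docs would be tokenized and padded before being fed to the classifier, and choices such as filter size, number of GRU units, and learning rate would be selected by the hybrid GWO/WOA search described in the abstract rather than fixed by hand.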


Metadata
Title: Topic sentiment analysis based on deep neural network using document embedding technique
Authors: Azam Seilsepour, Reza Ravanmehr, Ramin Nassiri
Publication date: 05-06-2023
Publisher: Springer US
Published in: The Journal of Supercomputing | Issue 17/2023
Print ISSN: 0920-8542
Electronic ISSN: 1573-0484
DOI: https://doi.org/10.1007/s11227-023-05423-9
