
2018 | OriginalPaper | Chapter

A Sentence-Level Sparse Gamma Topic Model for Sentiment Analysis

Authors: Tao Chen, Jeffrey Parsons

Published in: Advances in Artificial Intelligence

Publisher: Springer International Publishing

Abstract

Online consumer reviews have become an essential source of information for understanding markets and customer preferences. This research introduces a novel topic model that identifies product attributes and the sentiments toward them at the sentence level. The model defines each sentence's topic distribution recursively, which avoids the over-parameterization common in topic models. An inference network allows rich content features to drive sentiment identification, in contrast with other multi-aspect sentiment analysis models that rely on single words. The sentence topic model produces more coherent topics, and the sentence topic-sentiment model outperforms an existing model on the task of predicting product attribute ratings.
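The abstract only outlines the approach, so the following is a minimal, assumption-laden sketch rather than the authors' implementation. It illustrates how a sentence-level gamma topic model with an amortized inference network could be wired up in PyTorch: document-level topic intensities are combined recursively with sentence-level Gamma variables, and an encoder maps sentence bag-of-words features to the variational Gamma parameters. The class name, the bag-of-words encoder, the specific form of the recursive combination, the layer sizes, and the ELBO simplification are all hypothetical choices made for illustration.

```python
# Minimal sketch (assumptions, not the paper's model) of a sentence-level
# gamma topic model with an amortized inference network.
import torch
import torch.nn as nn
import torch.nn.functional as F

K, V, H = 10, 2000, 128   # topics, vocabulary size, hidden units (assumed)

class SentenceGammaTopicModel(nn.Module):
    def __init__(self, n_topics=K, vocab=V, hidden=H):
        super().__init__()
        # Topic-word weights; softmax rows give per-topic word distributions.
        self.topic_word = nn.Parameter(0.01 * torch.randn(n_topics, vocab))
        # Inference network: sentence bag-of-words -> Gamma shape and rate.
        self.encoder = nn.Sequential(nn.Linear(vocab, hidden), nn.ReLU())
        self.to_shape = nn.Linear(hidden, n_topics)
        self.to_rate = nn.Linear(hidden, n_topics)

    def forward(self, doc_intensity, sent_bow):
        """doc_intensity: (K,) document-level topic intensities.
        sent_bow: (S, V) bag-of-words counts for S sentences."""
        h = self.encoder(sent_bow)
        shape = F.softplus(self.to_shape(h)) + 1e-3
        rate = F.softplus(self.to_rate(h)) + 1e-3
        # Sample sentence-level Gamma variables; rsample keeps gradients
        # flowing back into the inference network.
        q = torch.distributions.Gamma(shape, rate)
        sent_gamma = q.rsample()                          # (S, K)
        # Assumed recursive form: sentence topic rates are the document
        # intensities rescaled by the sentence-level Gamma draws.
        sent_topics = F.normalize(doc_intensity * sent_gamma, p=1, dim=-1)
        # Per-sentence word distribution is a mixture of topic-word rows.
        word_probs = sent_topics @ F.softmax(self.topic_word, dim=-1)  # (S, V)
        # Reconstruction term of an ELBO (the KL term is omitted here).
        log_lik = (sent_bow * torch.log(word_probs + 1e-10)).sum()
        return sent_topics, log_lik

# Toy usage with synthetic data.
model = SentenceGammaTopicModel()
doc_theta = torch.distributions.Gamma(torch.ones(K), torch.ones(K)).sample()
sentences = torch.randint(0, 3, (4, V)).float()          # 4 toy sentences
topics, log_lik = model(doc_theta, sentences)
```

In this sketch the sentiment component described in the abstract is left out; a full topic-sentiment model would additionally condition a sentiment variable on the sentence features produced by the same inference network.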


Metadata
Title
A Sentence-Level Sparse Gamma Topic Model for Sentiment Analysis
Authors
Tao Chen
Jeffrey Parsons
Copyright Year
2018
DOI
https://doi.org/10.1007/978-3-319-89656-4_33