
2016 | OriginalPaper | Chapter

Chinese Sentiment Analysis Using Bidirectional LSTM with Word Embedding

Authors: Zheng Xiao, PiJun Liang

Published in: Cloud Computing and Security

Publisher: Springer International Publishing


Abstract

Long Short-Term Memory (LSTM) networks have been successfully applied to sequence modeling tasks and have achieved strong results. However, Chinese text contains rich syntactic and semantic information and exhibits strong intrinsic dependencies between words and phrases. In this paper, we propose a Bidirectional Long Short-Term Memory (BLSTM) network with word embeddings for Chinese sentiment analysis. The BLSTM can learn from both past and future context and thereby capture stronger dependency relationships. The word embeddings extract word-level features from raw character input and carry important syntactic and semantic information. Experimental results show that our model achieves 91.46% accuracy on the sentiment analysis task.
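
To make the architecture described in the abstract concrete, here is a minimal sketch (not the authors' implementation) of a bidirectional LSTM sentiment classifier over word embeddings, written in PyTorch. The class name, vocabulary handling, hyperparameters (embedding dimension, hidden size, dropout rate), and the two-class output layer are illustrative assumptions; the paper's actual configuration may differ.

```python
import torch
import torch.nn as nn

class BLSTMSentiment(nn.Module):
    """Hypothetical BLSTM-with-word-embedding classifier, for illustration only."""

    def __init__(self, vocab_size, embed_dim=128, hidden_dim=128, num_classes=2):
        super().__init__()
        # Word embedding layer: maps word indices to dense vectors.
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # Bidirectional LSTM reads the sentence forward and backward,
        # so each position is conditioned on both past and future context.
        self.blstm = nn.LSTM(embed_dim, hidden_dim,
                             batch_first=True, bidirectional=True)
        self.dropout = nn.Dropout(0.5)
        # Final hidden states of the two directions are concatenated.
        self.fc = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) word indices from a segmented Chinese sentence
        embedded = self.embedding(token_ids)                 # (batch, seq_len, embed_dim)
        _, (h_n, _) = self.blstm(embedded)                   # h_n: (2, batch, hidden_dim)
        sentence_repr = torch.cat([h_n[0], h_n[1]], dim=1)   # (batch, 2 * hidden_dim)
        return self.fc(self.dropout(sentence_repr))          # class logits
```

In this sketch the sentence representation is taken from the concatenated final hidden states of the forward and backward passes; pooling over all time steps is an equally plausible design choice that the abstract does not specify.
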

Metadata
Title
Chinese Sentiment Analysis Using Bidirectional LSTM with Word Embedding
Authors
Zheng Xiao
PiJun Liang
Copyright Year
2016
DOI
https://doi.org/10.1007/978-3-319-48674-1_53
