
2018 | OriginalPaper | Chapter

Build Chinese Language Model with Recurrent Neural Network

Authors: Li Lin, Jin Liu, Zhenkai Gu, Zelun Zhang, Haoliang Ren

Published in: Advances in Computer Science and Ubiquitous Computing

Publisher: Springer Singapore


Abstract

In recent years, the introduction of Deep Learning based machine learning methods has greatly enhanced the performance of Natural Language Processing (NLP). However, most Deep Learning based NLP studies in the literature target languages of the Latin family; research that takes Chinese language modeling as its objective remains scarce. In this paper, we use a Deep Learning method to build a language model for Chinese. In our model, the Fully-connected Neural Network, a popular structure in NLP, is replaced by a Recurrent Neural Network to build a better language model. In the experiments, we compare and summarize the differences between the results obtained with the original Deep Learning method and with our model, and the results demonstrate the effectiveness of the proposed model.
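The abstract's central idea is to replace the fully-connected hidden layer of a classic feed-forward neural language model with a recurrent layer, so that the hidden state carries context across positions in a Chinese sentence. The sketch below illustrates that idea; PyTorch, the layer sizes, the vocabulary size, and all identifiers are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a recurrent language model: a recurrent layer stands in
# for the fully-connected hidden layer of a feed-forward neural LM.
# All names and hyperparameters here are assumptions for illustration.
import torch
import torch.nn as nn

class RNNLanguageModel(nn.Module):
    def __init__(self, vocab_size: int, embed_dim: int = 128, hidden_dim: int = 256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)              # token id -> dense vector
        self.rnn = nn.RNN(embed_dim, hidden_dim, batch_first=True)   # recurrent hidden layer
        self.out = nn.Linear(hidden_dim, vocab_size)                  # hidden state -> next-token logits

    def forward(self, tokens, hidden=None):
        # tokens: (batch, seq_len) integer ids, e.g. indices of Chinese characters
        emb = self.embed(tokens)
        output, hidden = self.rnn(emb, hidden)
        return self.out(output), hidden

# Toy usage: predict the next token at every position of a batch of sequences.
vocab_size = 5000  # assumed size of a Chinese character vocabulary
model = RNNLanguageModel(vocab_size)
batch = torch.randint(0, vocab_size, (8, 20))   # (batch=8, seq_len=20) random ids
logits, _ = model(batch[:, :-1])                # predict tokens 1..19 from tokens 0..18
loss = nn.CrossEntropyLoss()(logits.reshape(-1, vocab_size),
                             batch[:, 1:].reshape(-1))
print(float(loss))  # exp(loss) is the usual language-model perplexity
```

Unlike the fully-connected baseline, which sees only a fixed window of previous tokens, the recurrent state here is, in principle, a summary of the entire preceding sequence, which is the advantage the paper exploits.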


Metadata
Title
Build Chinese Language Model with Recurrent Neural Network
Authors
Li Lin
Jin Liu
Zhenkai Gu
Zelun Zhang
Haoliang Ren
Copyright Year
2018
Publisher
Springer Singapore
DOI
https://doi.org/10.1007/978-981-10-7605-3_146