2021 | OriginalPaper | Chapter

Current Political News Translation Model Based on Attention Mechanism

Authors : Xixi Luo, Jiaqi Yan, Xinyu Chen, Yingjiang Wu, Ke Wu, Meili Lu, Liang Cai

Published in: Learning Technologies and Systems

Publisher: Springer International Publishing


Abstract

China’s foreign exchanges are becoming increasingly frequent, and current political news, as recent or ongoing factual reporting on national political life, plays a major role in information transmission. However, such news contains many proper nouns as well as long, complex sentences, so translations produced by traditional machine translation tend to have low accuracy and poor usability. To address this, this paper proposes an attention-based translation model for Chinese current political news. It uses the classic Long Short-Term Memory (LSTM) model and introduces an attention mechanism to improve the traditional encoder-decoder framework. Through training on parallel corpora, constraints are established for the proper nouns of current political news, thereby improving overall translation accuracy. Experiments show that the proposed model achieves higher accuracy than a baseline neural machine translation (NMT) system.
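The chapter's exact model is not reproduced here, but the core idea it builds on, an attention mechanism that weights encoder hidden states when producing each target word, can be sketched in a few lines. This is a minimal dot-product attention illustration in NumPy; the function name `attention_context` and the toy dimensions are illustrative assumptions, not the paper's implementation (which uses LSTM encoder/decoder states and a trained alignment model).

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_context(encoder_states, decoder_state):
    """Dot-product attention: score each source position against the
    current decoder state, normalize, and return the weighted context."""
    scores = encoder_states @ decoder_state      # (src_len,)
    weights = softmax(scores)                    # attention distribution
    context = weights @ encoder_states           # (hidden,) weighted sum
    return context, weights

# toy example: 4 source positions, hidden size 3
rng = np.random.default_rng(0)
H = rng.standard_normal((4, 3))   # stand-in for encoder hidden states
s = rng.standard_normal(3)        # stand-in for one decoder state
ctx, w = attention_context(H, s)
```

In a full encoder-decoder model, `ctx` would be concatenated with the decoder state to predict the next target word, and the weights `w` show which source positions (e.g., a proper noun in the news sentence) the model attends to at that step.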


Metadata
Title
Current Political News Translation Model Based on Attention Mechanism
Authors
Xixi Luo
Jiaqi Yan
Xinyu Chen
Yingjiang Wu
Ke Wu
Meili Lu
Liang Cai
Copyright Year
2021
DOI
https://doi.org/10.1007/978-3-030-66906-5_9
