
2019 | OriginalPaper | Book Chapter

Analysis of Back-Translation Methods for Low-Resource Neural Machine Translation

Authors: Nuo Xu, Yinqiao Li, Chen Xu, Yanyang Li, Bei Li, Tong Xiao, Jingbo Zhu

Published in: Natural Language Processing and Chinese Computing

Publisher: Springer International Publishing


Abstract

Back-translation uses machine translation to automatically translate target-language monolingual data into source-language data, and is a commonly used data augmentation method in machine translation. Previous work on back-translation has focused on resource-rich languages, largely ignoring low-resource languages, whose data vary in quality. In this paper, we compare monolingual data selection methods, models of different quality, ratios of pseudo-data to parallel data, and data generation methods with respect to how useful the resulting pseudo-data are for machine translation. Experiments on two low-resource languages, Lithuanian and Gujarati, show that increasing the coverage of low-frequency words and increasing data diversity are more effective for sufficiently trained models, while the opposite holds for insufficiently trained models. We apply different back-translation strategies to the two languages, compare them with common back-translation methods on the WMT news tasks for both languages, and verify the effectiveness of the strategies experimentally. We also find that combining back-translation strategies is more effective than simply increasing the amount of pseudo-data.
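
As a rough sketch of the process described above (not the authors' actual pipeline), the following Python example shows how target-language monolingual sentences can be turned into pseudo-parallel pairs with a reverse (target-to-source) translation model and then mixed with real parallel data at a chosen ratio. The translate_target_to_source callable, the toy data, and the mixing ratio are illustrative assumptions only.

    # Minimal back-translation sketch (illustrative only, not the authors' code).
    # Assumption: `translate_target_to_source` stands in for any trained
    # target->source NMT model; toy data and ratio are placeholders.
    import random
    from typing import Callable, List, Tuple


    def back_translate(monolingual_target: List[str],
                       translate_target_to_source: Callable[[str], str]) -> List[Tuple[str, str]]:
        """Translate each target-language sentence back into the source language
        and pair the pseudo source with the real target sentence."""
        return [(translate_target_to_source(tgt), tgt) for tgt in monolingual_target]


    def mix_corpora(parallel: List[Tuple[str, str]],
                    pseudo: List[Tuple[str, str]],
                    pseudo_to_parallel_ratio: float = 1.0,
                    seed: int = 0) -> List[Tuple[str, str]]:
        """Combine real and pseudo pairs at a chosen ratio (one of the factors
        compared in the paper) and shuffle before training."""
        rng = random.Random(seed)
        n_pseudo = min(len(pseudo), int(len(parallel) * pseudo_to_parallel_ratio))
        mixed = parallel + rng.sample(pseudo, n_pseudo)
        rng.shuffle(mixed)
        return mixed


    def toy_reverse_model(sentence: str) -> str:
        # Placeholder: a real setup would call a trained target->source model,
        # decoding with beam search or sampling to control pseudo-data diversity.
        return "<pseudo-src> " + sentence


    if __name__ == "__main__":
        mono_target = ["target sentence one .", "target sentence two ."]
        real_parallel = [("source sentence one .", "target sentence one .")]

        pseudo = back_translate(mono_target, toy_reverse_model)
        training_data = mix_corpora(real_parallel, pseudo, pseudo_to_parallel_ratio=1.0)
        print(training_data)

Swapping the decoding strategy inside the reverse model (e.g. beam search versus sampling) is one way to trade pseudo-data quality against diversity, one of the data generation choices the abstract compares.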


Metadata
Title
Analysis of Back-Translation Methods for Low-Resource Neural Machine Translation
Authors
Nuo Xu
Yinqiao Li
Chen Xu
Yanyang Li
Bei Li
Tong Xiao
Jingbo Zhu
Copyright year
2019
DOI
https://doi.org/10.1007/978-3-030-32236-6_42