ABSTRACT
Table-to-text generation is the task of describing a table, which has formal structure and valuable information, in natural language; open-domain table-to-text generation is the same task without restriction to a single domain. This paper introduces a seq2seq-based model for open-domain table-to-text generation. To address the out-of-vocabulary problem and to exploit both the internal correlations within a table and the relevance between the table and its text, this study adopts an improved encoder-decoder architecture together with a method for associating table and text. In addition, the paper improves the beam search method used for model inference. The model is evaluated on WIKITABLETEXT and improves the state-of-the-art BLEU-4 score from 38.23 to 38.71.
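The abstract mentions an improved beam search for inference but does not give its details. As background, the following is a minimal, generic beam-search sketch (not the paper's specific improvement); `step_fn`, the `<eos>` marker, and the toy scoring table are illustrative assumptions.

```python
import math

def beam_search(step_fn, start, beam_width, max_len):
    """Generic beam search over token sequences.

    step_fn(seq) returns a list of (token, log_prob) continuations.
    Hypotheses ending in "<eos>" are carried forward unchanged and
    compete with partial hypotheses on cumulative log-probability.
    """
    beams = [([start], 0.0)]  # (token sequence, cumulative log-prob)
    for _ in range(max_len):
        candidates = []
        extended = False
        for seq, score in beams:
            if seq[-1] == "<eos>":
                candidates.append((seq, score))  # finished: keep as-is
                continue
            extended = True
            for tok, logp in step_fn(seq):
                candidates.append((seq + [tok], score + logp))
        # keep only the top-k hypotheses by cumulative log-probability
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
        if not extended:
            break  # every surviving hypothesis has finished
    return max(beams, key=lambda c: c[1])[0]

def toy_step(seq):
    # Hypothetical next-token distribution for demonstration only.
    table = {
        "<bos>": [("a", math.log(0.6)), ("b", math.log(0.4))],
        "a": [("<eos>", math.log(0.9)), ("b", math.log(0.1))],
        "b": [("<eos>", math.log(1.0))],
    }
    return table[seq[-1]]

print(beam_search(toy_step, "<bos>", beam_width=2, max_len=5))
# → ['<bos>', 'a', '<eos>']  (probability 0.6 * 0.9 = 0.54 beats 0.4)
```

In a seq2seq decoder, `step_fn` would call the decoder RNN once per hypothesis and return the top-k entries of the softmax over the vocabulary; the beam width trades decoding cost against search quality.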
Index Terms
- Open-Domain Table-to-Text Generation based on Seq2seq