
2018 | OriginalPaper | Chapter

A Hierarchical Neural Extractive Summarizer for Academic Papers

Authors: Kazutaka Kinugawa, Yoshimasa Tsuruoka

Published in: New Frontiers in Artificial Intelligence

Publisher: Springer International Publishing


Abstract

Recent neural network-based models have proven successful in summarization tasks. However, previous studies have mostly focused on comparatively short texts, and it remains challenging for neural models to summarize long documents such as academic papers. Because of their length, academic papers pose two obstacles to summarization: it is hard for a recurrent neural network (RNN) to compress all the information in the source document into a single latent vector, and it is difficult to pinpoint a few correct sentences among a large number of candidates. In this paper, we present an extractive summarizer for academic papers. The key idea is to convert a paper into a tree structure whose nodes correspond to sections, paragraphs, and sentences. First, we build a hierarchical encoder-decoder model based on this tree. This design eases the load on the RNNs and lets us effectively obtain vectors that represent paragraphs and sections. Second, we propose a tree structure-based scoring method that steers the model toward correct sentences and helps it avoid selecting irrelevant ones. We collect academic papers available from PubMed Central and build training data suited to supervised machine learning-based extractive summarization. Our experimental results show that the proposed model outperforms several baselines and reduces high-impact errors.
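The hierarchical encoding described above can be sketched as follows. This is a minimal, dependency-free illustration, not the paper's model: the paper uses RNN (LSTM) encoders at each level, whereas this sketch substitutes mean pooling, and the toy embeddings, function names, and example document are all hypothetical.

```python
# Sketch of a hierarchical document encoder: sentences are encoded into
# vectors, each paragraph vector pools over its sentences, and each
# section vector pools over its paragraphs — mirroring the
# section/paragraph/sentence tree the paper builds. Mean pooling stands
# in for the RNN encoders to keep the example self-contained.

def encode_sentence(sentence, dim=4):
    # Toy word embedding: hash each token into a fixed-size count vector.
    vec = [0.0] * dim
    for tok in sentence.split():
        vec[hash(tok) % dim] += 1.0
    return vec

def mean_pool(vectors):
    dim = len(vectors[0])
    return [sum(v[i] for v in vectors) / len(vectors) for i in range(dim)]

def encode_document(sections):
    """sections: list of sections; a section is a list of paragraphs;
    a paragraph is a list of sentence strings.
    Returns (sentence_vectors, paragraph_vectors, section_vectors)."""
    sent_vecs, para_vecs, sec_vecs = [], [], []
    for section in sections:
        p_vecs = []
        for paragraph in section:
            s_vecs = [encode_sentence(s) for s in paragraph]
            sent_vecs.extend(s_vecs)
            p_vecs.append(mean_pool(s_vecs))   # paragraph node vector
        para_vecs.extend(p_vecs)
        sec_vecs.append(mean_pool(p_vecs))     # section node vector
    return sent_vecs, para_vecs, sec_vecs

# Hypothetical document: one section with two paragraphs.
doc = [
    [["we propose a model", "it works well"],
     ["results are strong"]],
]
sents, paras, secs = encode_document(doc)
```

Because each RNN in such a design only runs over one level of the tree (words in a sentence, sentences in a paragraph, paragraphs in a section), no single recurrence has to span the entire paper, which is the load-easing effect the abstract refers to.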


Footnotes
1
When we calculated ROUGE scores, positive sentences in group A were removed.
 
6
The ROUGE evaluation options are -m -n 2 -w 1.2.
 
Metadata
Title
A Hierarchical Neural Extractive Summarizer for Academic Papers
Authors
Kazutaka Kinugawa
Yoshimasa Tsuruoka
Copyright Year
2018
DOI
https://doi.org/10.1007/978-3-319-93794-6_25
