Published in: Neural Processing Letters 6/2022

10.05.2022

Simplified-Boosting Ensemble Convolutional Network for Text Classification

Authors: Fang Zeng, Niannian Chen, Dan Yang, Zhigang Meng

Abstract

Graph convolutional networks (GCNs) are strong at extracting global features but ignore word order, which weakens their performance on short-text classification. Convolutional neural networks (CNNs), in contrast, capture the local contextual information within a sentence. Few existing methods classify both long and short text effectively. We therefore propose an ensemble convolutional network that combines a GCN with a CNN: the GCN captures global information while the CNN extracts local features. In addition, we propose a simplified boosting algorithm in which the CNN re-learns the samples misclassified by the GCN, improving classification performance and reducing the network's training time. Results on four benchmark datasets show that our framework outperforms other state-of-the-art methods while using less memory.
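
To make the training flow concrete, here is a minimal, hypothetical PyTorch sketch of the simplified-boosting idea described above: a first learner is trained on all samples, a second learner is then trained on the subset the first one misclassified, and the two are combined at prediction time. The stand-in classifiers, toy data, learning rates, epoch counts, and the logit-averaging rule are all assumptions for illustration; the actual GCN and CNN architectures and their combination strategy are defined in the full paper.

```python
# Hypothetical sketch of the simplified-boosting ensemble described in the abstract.
# StubClassifier is a placeholder (a plain linear layer) standing in for the
# GCN and CNN branches; it is not the architecture used in the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

class StubClassifier(nn.Module):
    """Placeholder for either the global (GCN) or local (CNN) branch."""
    def __init__(self, in_dim, num_classes):
        super().__init__()
        self.fc = nn.Linear(in_dim, num_classes)

    def forward(self, x):
        return self.fc(x)

def train_epoch(model, x, y, optimizer):
    """One full-batch training step with cross-entropy loss."""
    model.train()
    optimizer.zero_grad()
    loss = F.cross_entropy(model(x), y)
    loss.backward()
    optimizer.step()

# Toy data standing in for document features and labels.
torch.manual_seed(0)
x = torch.randn(256, 300)        # e.g. averaged word embeddings per document
y = torch.randint(0, 4, (256,))  # four classes

gcn = StubClassifier(300, 4)     # global-feature branch (GCN in the paper)
cnn = StubClassifier(300, 4)     # local-feature branch (CNN in the paper)
opt_gcn = torch.optim.Adam(gcn.parameters(), lr=1e-2)
opt_cnn = torch.optim.Adam(cnn.parameters(), lr=1e-2)

# Step 1: train the first learner on all samples.
for _ in range(50):
    train_epoch(gcn, x, y, opt_gcn)

# Step 2 (simplified boosting): collect the samples the first learner
# misclassifies and train the second learner on that subset.
with torch.no_grad():
    wrong = gcn(x).argmax(dim=1) != y
x_hard, y_hard = x[wrong], y[wrong]
if wrong.any():
    for _ in range(50):
        train_epoch(cnn, x_hard, y_hard, opt_cnn)

# Step 3: ensemble prediction, here by averaging the two branches' logits
# (the actual combination rule in the paper may differ).
with torch.no_grad():
    logits = 0.5 * (gcn(x) + cnn(x))
    acc = (logits.argmax(dim=1) == y).float().mean()
print(f"toy ensemble accuracy: {acc.item():.3f}")
```

The key design point this sketch illustrates is that the second learner never re-fits the full training set; it only revisits the examples the first learner got wrong, which is what keeps the boosting step cheap.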

Metadata
Title
Simplified-Boosting Ensemble Convolutional Network for Text Classification
Authors
Fang Zeng
Niannian Chen
Dan Yang
Zhigang Meng
Publication date
10.05.2022
Publisher
Springer US
Published in
Neural Processing Letters / Issue 6/2022
Print ISSN: 1370-4621
Electronic ISSN: 1573-773X
DOI
https://doi.org/10.1007/s11063-022-10843-4
