Published in: Cognitive Computation 2/2024

26-12-2023

A Cognitively Inspired Multi-granularity Model Incorporating Label Information for Complex Long Text Classification

Authors: Li Gao, Yi Liu, Jianmin Zhu, Zhen Yu


Abstract

Because abstracts contain complex information and their labels carry no explicit category semantics, it is difficult for cognitive models to extract features comprehensive enough to match the corresponding labels. This paper proposes a cognitively inspired multi-granularity model incorporating label information (LIMG) to address these problems. First, the model uses the content of the abstracts to give the labels actual semantics, which improves the semantic representation of the word embeddings. Second, it combines a dual-channel pooling convolutional neural network (DCP-CNN) with timescale shrink gated recurrent units (TSGRU) to extract multi-granularity information from the abstracts: one DCP-CNN channel highlights key content, while the other feeds TSGRU, which extracts context-related features. Finally, TSGRU adds a timescale that retains long-term dependencies by recurring over past information, together with a soft-thresholding algorithm for noise reduction. Experiments were carried out on four benchmark datasets: the Arxiv Academic Paper Dataset (AAPD), Web of Science (WOS), Amazon Review, and Yahoo! Answers. Compared with the baseline models, accuracy improves by up to 3.36%. On the AAPD (54,840 abstracts) and WOS (46,985 abstracts) datasets, the micro-F1 score reaches 75.62% and 81.68%, respectively. The results show that acquiring label semantics from abstracts enhances text representations, and that multi-granularity feature extraction helps the cognitive system understand the complex information in abstracts.
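The abstract names two mechanisms: a dual-channel CNN in which one channel highlights key content while the other feeds a recurrent unit, and a TSGRU that applies soft thresholding to suppress noise. The minimal PyTorch sketch below illustrates those two ideas only; it is not the authors' implementation, and the layer sizes, the way the channels are fused, the placement of the thresholding step, and the soft_threshold/DualChannelSketch names are assumptions made for illustration.

```python
# Minimal sketch (assumed, not the authors' code) of a dual-channel text classifier
# with soft thresholding on the recurrent channel. Layer sizes, the fusion of the
# two channels, and the thresholding point are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


def soft_threshold(x, tau):
    # Soft thresholding: sign(x) * max(|x| - tau, 0) shrinks small (noisy) activations to zero.
    return torch.sign(x) * F.relu(torch.abs(x) - tau)


class DualChannelSketch(nn.Module):
    def __init__(self, emb_dim=300, hidden=128, n_classes=10, tau=0.1):
        super().__init__()
        # Shared convolution over word embeddings (local n-gram features).
        self.conv = nn.Conv1d(emb_dim, hidden, kernel_size=3, padding=1)
        # Channel 2: a GRU over the convolved sequence for context-related features.
        self.gru = nn.GRU(hidden, hidden, batch_first=True)
        self.tau = tau
        self.fc = nn.Linear(2 * hidden, n_classes)

    def forward(self, emb):                               # emb: (batch, seq_len, emb_dim)
        conv = F.relu(self.conv(emb.transpose(1, 2)))     # (batch, hidden, seq_len)
        key_feats = conv.max(dim=2).values                # channel 1: max-pooled key content
        ctx, _ = self.gru(conv.transpose(1, 2))           # channel 2: contextual sequence
        ctx_last = soft_threshold(ctx[:, -1, :], self.tau)  # denoise the final recurrent state
        return self.fc(torch.cat([key_feats, ctx_last], dim=1))


# Usage: logits = DualChannelSketch()(torch.randn(8, 200, 300))
```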


Metadata

Title: A Cognitively Inspired Multi-granularity Model Incorporating Label Information for Complex Long Text Classification
Authors: Li Gao, Yi Liu, Jianmin Zhu, Zhen Yu
Publication date: 26-12-2023
Publisher: Springer US
Published in: Cognitive Computation | Issue 2/2024
Print ISSN: 1866-9956
Electronic ISSN: 1866-9964
DOI: https://doi.org/10.1007/s12559-023-10237-1
