2023 | Original Paper | Book Chapter

Contextual Word Embeddings Clustering Through Multiway Analysis: A Comparative Study

Authors: Mira Ait-Saada, Mohamed Nadif

Published in: Advances in Intelligent Data Analysis XXI

Publisher: Springer Nature Switzerland


Abstract

Transformer-based contextual word embedding models are widely used to improve several NLP tasks such as text classification and question answering. Knowledge about these multi-layered models is growing in the literature, with several studies trying to understand what is learned by each of the layers. However, little is known about how to combine the information provided by these different layers in order to make the most of deep Transformer models, and even less is known about how to best use these models for unsupervised text mining tasks such as clustering. We address both questions in this paper and propose to study several multiway-based methods for simultaneously leveraging the word representations provided by all the layers. We show that some of them are capable of performing word clustering in an effective and interpretable way. We evaluate their performance across a wide variety of Transformer models, datasets, multiblock techniques, and tensor-decomposition methods commonly used to handle three-way data.
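
To make the multiway idea concrete, below is a minimal sketch (not the authors' exact pipeline) of one such tensor-based approach: the hidden states of all layers are stacked into a words x layers x dimensions tensor, a CP/PARAFAC decomposition (here via TensorLy) summarizes the layer mode, and the resulting word factors are clustered with k-means. The array name "embeddings", the rank and the number of clusters are illustrative assumptions.

# A minimal sketch (not the authors' exact pipeline): cluster words by
# decomposing a 3-way tensor of contextual embeddings (words x layers x dims)
# with a CP/PARAFAC decomposition, then running k-means on the word factors.
# Assumes `embeddings` is a precomputed NumPy array of shape
# (n_words, n_layers, hidden_dim), e.g. stacked hidden states of a
# Transformer encoder; rank and n_clusters are illustrative values.
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac
from sklearn.cluster import KMeans

def cluster_words(embeddings: np.ndarray, rank: int = 10, n_clusters: int = 6):
    """Return one cluster label per word from a (words, layers, dims) tensor."""
    tensor = tl.tensor(embeddings)

    # CP/PARAFAC factorization yields one factor matrix per mode;
    # factors[0] has shape (n_words, rank) and summarizes each word
    # across all layers simultaneously.
    weights, factors = parafac(tensor, rank=rank, init="random", random_state=0)
    word_factors = tl.to_numpy(factors[0])

    # Cluster the low-rank word representations.
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(word_factors)
    return labels

Other multiblock or tensor methods compared in the paper could be substituted for the PARAFAC step; the point of the sketch is that the word-mode factors combine information from all layers at once rather than from a single hand-picked layer.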

Metadata
Title
Contextual Word Embeddings Clustering Through Multiway Analysis: A Comparative Study
Authors
Mira Ait-Saada
Mohamed Nadif
Copyright Year
2023
DOI
https://doi.org/10.1007/978-3-031-30047-9_1
