
2021 | Original Paper | Book Chapter

Ontological Relation Classification Using WordNet, Word Embeddings and Deep Neural Networks

Authors: Ahlem Chérifa Khadir, Ahmed Guessoum, Hassina Aliane

Published in: Modelling and Implementation of Complex Systems

Publisher: Springer International Publishing


Abstract

Learning ontological relations is an important step towards the automatic development of ontologies. This paper introduces a novel way to combine WordNet [16], pre-trained word embeddings and deep neural networks for the task of ontological relation classification. The data from WordNet and the knowledge encapsulated in the pre-trained word vectors are combined into an enriched dataset, in which each pair of terms linked in WordNet through some ontological relation is represented by the terms' word embeddings. A deep neural network uses this dataset to learn to classify ontological relations from the word embeddings. The implementation of this approach has yielded encouraging results, which should help the ontology learning research community develop tools for ontological relation extraction.
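
For readers who want a concrete picture of the pipeline outlined in the abstract, the following Python sketch shows one possible way to assemble such an enriched dataset and train a classifier on it: term pairs and relation labels are drawn from WordNet via NLTK, each pair is represented by the concatenation of pre-trained word vectors loaded with gensim, and a small feed-forward Keras network learns the relation classes. The embedding file name, the relation subset and all layer sizes are illustrative assumptions, not the configuration used in the paper:

import numpy as np
from nltk.corpus import wordnet as wn          # requires: nltk.download('wordnet')
from gensim.models import KeyedVectors
from tensorflow.keras import layers, models

# Pre-trained word vectors in word2vec text format (the file name is a placeholder).
vectors = KeyedVectors.load_word2vec_format("embeddings.vec")
dim = vectors.vector_size

def wordnet_pairs():
    """Yield (term, term, relation) triples taken from WordNet noun synsets."""
    for synset in wn.all_synsets(pos="n"):
        head = synset.lemma_names()[0]
        for hyper in synset.hypernyms():
            yield head, hyper.lemma_names()[0], "hypernym"
        for part in synset.part_meronyms():
            yield head, part.lemma_names()[0], "meronym"

relations = ["hypernym", "meronym"]            # illustrative subset of the relation classes
X, y = [], []
for w1, w2, rel in wordnet_pairs():
    if w1 in vectors and w2 in vectors:        # keep only pairs with known embeddings
        X.append(np.concatenate([vectors[w1], vectors[w2]]))
        y.append(relations.index(rel))
X, y = np.array(X), np.array(y)

# Small feed-forward classifier over the concatenated pair embedding.
model = models.Sequential([
    layers.Input(shape=(2 * dim,)),
    layers.Dense(256, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(len(relations), activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=10, validation_split=0.1)

The concatenated-embedding input and the softmax output over relation classes mirror the setup described in the abstract; the hidden-layer width, dropout rate and number of epochs above are arbitrary choices for illustration only.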


Footnotes

1. The "words" are actually terms, which could be multi-word terms. For the sake of simplicity, we use "word" to refer to a term/lemma in this article.

7. For the hypernym class, we did not use all the available data in this experiment, in order to keep the dataset more balanced with respect to the other classes considered.
References

2. Dauphin, Y.N., de Vries, H., Bengio, Y.: Equilibrated adaptive learning rates for non-convex optimization. arXiv preprint arXiv:1502.04390 (2015)
3. Bojanowski, P., Grave, E., Joulin, A., Mikolov, T.: Enriching word vectors with subword information. Trans. Assoc. Comput. Linguist. 5, 135–146 (2017)
4. Bui, V.T., Nguyen, P.T., Pham, V.L., Ngo, T.Q.: A neural network model for efficient antonymy-synonymy classification by exploiting co-occurrence contexts and word-structure patterns. Int. J. Intell. Eng. Syst. 13(1), 156–166 (2020)
5. Collobert, R., Weston, J., Bottou, L., Karlen, M., Kavukcuoglu, K., Kuksa, P.: Natural language processing (almost) from scratch. J. Mach. Learn. Res. 12, 2493–2537 (2011)
6. Devlin, J., Chang, M.W., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. arXiv preprint arXiv:1810.04805 (2018)
7. Dunne, R.A., Campbell, N.A.: On the pairing of the softmax activation and cross-entropy penalty functions and the derivation of the softmax activation function. In: Proceedings of the 8th Australian Conference on the Neural Networks, Melbourne, vol. 181, p. 185. Citeseer (1997)
8. Gábor, K., Buscaldi, D., Schumann, A.K., QasemiZadeh, B., Zargayouna, H., Charnois, T.: SemEval-2018 task 7: semantic relation extraction and classification in scientific papers. In: Proceedings of the 12th International Workshop on Semantic Evaluation, pp. 679–688 (2018)
9. Gal, Y., Ghahramani, Z.: Dropout as a Bayesian approximation: representing model uncertainty in deep learning. In: International Conference on Machine Learning, pp. 1050–1059 (2016)
10. Hearst, M.A.: Automatic acquisition of hyponyms from large text corpora. In: Proceedings of the 14th Conference on Computational Linguistics, vol. 2, pp. 539–545. Association for Computational Linguistics (1992)
11. Hendrickx, I., et al.: SemEval-2010 task 8: multi-way classification of semantic relations between pairs of nominals. In: Proceedings of the 5th International Workshop on Semantic Evaluation, pp. 33–38. Association for Computational Linguistics, Uppsala (2010). https://www.aclweb.org/anthology/S10-1006
12. LeCun, Y., Bengio, Y., Hinton, G.: Deep learning. Nature 521(7553), 436–444 (2015)
14. Mikolov, T., Chen, K., Corrado, G., Dean, J.: Efficient estimation of word representations in vector space. arXiv preprint arXiv:1301.3781 (2013)
15. Mikolov, T., Sutskever, I., Chen, K., Corrado, G.S., Dean, J.: Distributed representations of words and phrases and their compositionality. In: Advances in Neural Information Processing Systems, pp. 3111–3119 (2013)
16. Miller, G.A.: WordNet: An Electronic Lexical Database. MIT Press, Cambridge (1998)
17. Navigli, R., Ponzetto, S.P.: BabelNet: the automatic construction, evaluation and application of a wide-coverage multilingual semantic network. Artif. Intell. 193, 217–250 (2012)
18. Qin, P., Xu, W., Guo, J.: An empirical convolutional neural network approach for semantic relation classification. Neurocomputing 190, 1–9 (2016)
20. Shijia, E., Jia, S., Xiang, Y.: Study on the Chinese word semantic relation classification with word embedding. In: National CCF Conference on Natural Language Processing and Chinese Computing, pp. 849–855. Springer, Cham (2017)
21. Sutton, R.S., Barto, A.G.: Reinforcement Learning: An Introduction. MIT Press, Cambridge (2018)
22. Wu, S., He, Y.: Enriching pre-trained language model with entity information for relation classification. In: Proceedings of the 28th ACM International Conference on Information and Knowledge Management, pp. 2361–2364 (2019)
23. Wu, Y., Zhang, M.: Overview of the NLPCC 2017 shared task: Chinese word semantic relation classification. In: National CCF Conference on Natural Language Processing and Chinese Computing, pp. 919–925. Springer, Cham (2017)
24. Zeng, D., Liu, K., Lai, S., Zhou, G., Zhao, J.: Relation classification via convolutional deep neural network. In: Proceedings of COLING 2014, the 25th International Conference on Computational Linguistics: Technical Papers, pp. 2335–2344. Dublin City University and Association for Computational Linguistics, Dublin, August 2014. https://www.aclweb.org/anthology/C14-1220
Metadata
Title
Ontological Relation Classification Using WordNet, Word Embeddings and Deep Neural Networks
Authors
Ahlem Chérifa Khadir
Ahmed Guessoum
Hassina Aliane
Copyright year
2021
DOI
https://doi.org/10.1007/978-3-030-58861-8_10
