
2022 | Original Paper | Book Chapter

Expressive Graph Informer Networks

Authors: Jaak Simm, Adam Arany, Edward De Brouwer, Yves Moreau

Published in: Machine Learning, Optimization, and Data Science

Publisher: Springer International Publishing


Abstract

Applying machine learning to molecules is challenging because of their natural representation as graphs rather than vectors. Several architectures have recently been proposed for deep learning from molecular graphs, but they suffer from information bottlenecks because they only pass information from a graph node to its direct neighbors. Here, we introduce a more expressive route-based multi-attention mechanism that incorporates features from routes between node pairs. We call the resulting method Graph Informer. A single network layer can therefore attend to nodes several steps away. We show empirically that the proposed method compares favorably against existing approaches in two prediction tasks: (1) predicting \(^{13}\)C Nuclear Magnetic Resonance (NMR) spectra, improving the state of the art with an MAE of 1.35 ppm, and (2) predicting drug bioactivity and toxicity. Additionally, we develop a variant called injective Graph Informer that is provably more powerful than the Weisfeiler-Lehman test for graph isomorphism. We demonstrate that the route information makes the method aware of the non-local topology of the graph and thus takes it beyond the capabilities of the Weisfeiler-Lehman test. Our code is available at github.com/jaak-s/graphinformer.
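To make the idea of route-based attention more concrete, the following is a minimal, self-contained sketch, not the authors' implementation: the module name, the feature shapes, and the way the route features enter the attention logits as an additive bias are illustrative assumptions. The point it shows is that when pairwise route descriptors bias the logits, a single attention layer can weight nodes several hops away.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class RouteInformedAttention(nn.Module):
        """Single-head attention over graph nodes whose logits are biased by
        pairwise route features, so one layer can reach beyond direct neighbors.
        Illustrative sketch only; names and shapes are assumptions."""

        def __init__(self, node_dim, route_dim, hidden_dim):
            super().__init__()
            self.q = nn.Linear(node_dim, hidden_dim)
            self.k = nn.Linear(node_dim, hidden_dim)
            self.v = nn.Linear(node_dim, hidden_dim)
            # Maps the route features of each node pair to a scalar logit bias.
            self.route_bias = nn.Linear(route_dim, 1)
            self.scale = hidden_dim ** -0.5

        def forward(self, x, route_feats):
            # x:           (n_nodes, node_dim)            node features
            # route_feats: (n_nodes, n_nodes, route_dim)  pairwise route descriptors
            logits = (self.q(x) @ self.k(x).T) * self.scale        # (n, n)
            logits = logits + self.route_bias(route_feats).squeeze(-1)
            attn = F.softmax(logits, dim=-1)                       # row-wise attention
            return attn @ self.v(x)                                # (n, hidden_dim)

    # Example: 5 nodes with 16-dim features and 4-dim route descriptors
    # (e.g. a one-hot encoding of shortest-path length between the pair).
    x = torch.randn(5, 16)
    routes = torch.randn(5, 5, 4)
    out = RouteInformedAttention(16, 4, 32)(x, routes)   # shape (5, 32)

A multi-head version of this sketch would give each head its own route bias, which is roughly how the abstract's "route-based multi-attention" reads; the paper's exact formulation may differ.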


Footnotes

1. We assume here that the output function is also injective.
2. The adjacency matrix here is assumed to include the self-connection (i.e., \(A_{ii} = 1\)).
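As a small illustration of the convention in footnote 2 (not code from the paper; a hedged sketch only), self-connections can be added to a plain adjacency matrix as follows:

    import numpy as np

    # Plain adjacency of a 3-node path graph (no self-loops).
    A = np.array([[0, 1, 0],
                  [1, 0, 1],
                  [0, 1, 0]], dtype=float)

    # Footnote 2's convention: include self-connections, so A[i, i] = 1.
    A_self = A + np.eye(A.shape[0])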
 
Metadata
Title
Expressive Graph Informer Networks
Authors
Jaak Simm
Adam Arany
Edward De Brouwer
Yves Moreau
Copyright year
2022
DOI
https://doi.org/10.1007/978-3-030-95470-3_15
