Published in: Peer-to-Peer Networking and Applications 3/2021

05.02.2021

ASFGNN: Automated separated-federated graph neural network

By: Longfei Zheng, Jun Zhou, Chaochao Chen, Bingzhe Wu, Li Wang, Benyu Zhang


Abstract

Graph Neural Networks (GNNs) have achieved remarkable performance by taking advantage of graph data. The success of GNN models always depends on rich features and adjacency relationships. In practice, however, such data are usually isolated by different data owners (clients) and are therefore likely to be Non-Independent and Identically Distributed (Non-IID). Meanwhile, given the limited network conditions of data owners, hyper-parameter optimization for collaborative learning approaches is time-consuming in data-isolation scenarios. To address these problems, we propose an Automated Separated-Federated Graph Neural Network (ASFGNN) learning paradigm. ASFGNN consists of two main components, i.e., the training of the GNN and the tuning of hyper-parameters. Specifically, to solve the data Non-IID problem, we first propose a separated-federated GNN learning model, which decouples the training of the GNN into two parts: the message-passing part, which is done by the clients separately, and the loss-computing part, which is learned by the clients federally. To handle the time-consuming parameter-tuning problem, we leverage the Bayesian optimization technique to automatically tune the hyper-parameters of all the clients. We conduct experiments on benchmark datasets, and the results demonstrate that ASFGNN significantly outperforms the naive federated GNN in terms of both accuracy and parameter-tuning efficiency.
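The separated-federated split described in the abstract can be sketched as follows. This is a minimal toy illustration, not the paper's implementation: the single propagation step, the weight names, and the shapes are all assumptions made for the sketch. Each client runs message passing on its own private graph, and only the downstream classifier weights are averaged across clients in FedAvg style.

```python
import numpy as np

def local_message_passing(adj, feats, w_local):
    # Each client runs one GNN propagation step on its own graph:
    # neighbourhood aggregation followed by a client-private linear map.
    agg = adj @ feats
    return np.tanh(agg @ w_local)

def federated_average(client_weights, sizes):
    # Only the shared loss-computation (classifier) weights are averaged,
    # weighted by each client's local sample count.
    total = sum(sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, sizes))

rng = np.random.default_rng(0)
# Two clients with isolated graphs: adjacency, node features, private GNN weights.
clients = []
for n in (4, 6):
    adj = (rng.random((n, n)) < 0.4).astype(float)
    feats = rng.normal(size=(n, 3))
    w_local = rng.normal(size=(3, 3))      # never leaves the client
    clients.append((adj, feats, w_local))

# Message passing is done separately per client ...
embeddings = [local_message_passing(a, x, w) for a, x, w in clients]
# ... while the classifier weights are learned federally.
w_clf = [rng.normal(size=(3, 2)) for _ in clients]
w_shared = federated_average(w_clf, sizes=[4, 6])
```

The key point of the decoupling is visible in the data flow: the graph structure, features, and `w_local` stay on each client, and only the small classifier matrices cross the trust boundary for aggregation.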
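The hyper-parameter tuning component can likewise be sketched as a Bayesian-optimization loop: a Gaussian-process surrogate is fit to the scores observed so far, and an acquisition function picks the next candidate. Everything below is a hedged toy version — the quadratic `score` stands in for validation accuracy from a round of federated training, and the UCB acquisition and search space are illustrative assumptions, not necessarily the paper's choices.

```python
import numpy as np

def rbf(a, b, ls=0.3):
    # Squared-exponential kernel between two 1-D point sets.
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

def gp_posterior(X, y, Xs, noise=1e-5):
    # Standard GP regression posterior mean and variance on the grid Xs.
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    Kinv = np.linalg.inv(K)
    mu = Ks.T @ Kinv @ y
    var = 1.0 - np.einsum("ij,ik,kj->j", Ks, Kinv, Ks)  # diag of posterior cov
    return mu, np.maximum(var, 1e-12)

def score(log_lr):
    # Stand-in for validation accuracy as a function of log10(learning rate);
    # in ASFGNN this would come from an actual federated training round.
    return -(log_lr + 2.0) ** 2

grid = np.linspace(-4.0, 0.0, 200)       # search space: log10(lr)
X = np.array([-4.0, 0.0])                # two initial evaluations
y = score(X)
for _ in range(8):
    mu, var = gp_posterior(X, y, grid)
    ucb = mu + 2.0 * np.sqrt(var)        # upper-confidence-bound acquisition
    x_next = grid[np.argmax(ucb)]
    X = np.append(X, x_next)
    y = np.append(y, score(x_next))

best_log_lr = X[np.argmax(y)]            # converges close to the optimum at -2
```

Compared with grid search, each surrogate-guided evaluation replaces many blind federated training rounds, which is why the abstract highlights parameter-tuning efficiency under limited network conditions.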

Metadata
Title
ASFGNN: Automated separated-federated graph neural network
Authors
Longfei Zheng
Jun Zhou
Chaochao Chen
Bingzhe Wu
Li Wang
Benyu Zhang
Publication date
05.02.2021
Publisher
Springer US
Published in
Peer-to-Peer Networking and Applications / Issue 3/2021
Print ISSN: 1936-6442
Electronic ISSN: 1936-6450
DOI
https://doi.org/10.1007/s12083-021-01074-w
