DOI: 10.1145/3652583.3658051

GSD-GNN: Generalizable and Scalable Algorithms for Decoupled Graph Neural Networks

Published: 07 June 2024

Abstract

Graph Neural Networks (GNNs) have achieved remarkable performance in applications such as social media analysis, computer vision, and natural language processing. Decoupled GNNs are a popular framework because of their high efficiency. However, existing decoupled GNNs suffer from several defects. (1) Studies of GNN feature propagation are isolated, with each emphasizing a single user-specified propagation matrix. (2) They still incur high computation costs to achieve provable performance on massive graphs with millions of nodes and billions of edges. (3) Their feature propagation steps are uniform across nodes, which makes it difficult for them to escape the over-smoothing dilemma. In this paper, we propose GSD-GNN, a Generalized and Scalable Decoupled GNN framework based on spectral graph theory, which offers the following advantages. First, with minor parameter adjustments it degenerates into most existing decoupled GNNs, such as APPNP, GDC, and SGC. Second, it efficiently computes an arbitrary propagation matrix with near-linear time complexity and theoretical guarantees. Third, it customizes an adaptive feature propagation mechanism for each node to mitigate the over-smoothing dilemma. Finally, extensive experiments on massive graphs demonstrate that GSD-GNN is indeed effective, scalable, and flexible.
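To make the "generalized propagation" idea concrete, here is a minimal sketch (not the paper's implementation) of decoupled propagation Z = Σ_k w_k T^k X over the symmetrically normalized adjacency T. The weight sequences shown are the well-known truncated choices under which such a scheme reduces to SGC, APPNP, and GDC; the function name `propagate`, the truncation depth `K`, and the dense-matrix form are illustrative assumptions (SGC in particular normalizes the self-loop-augmented adjacency, omitted here for brevity).

```python
import math
import numpy as np

def propagate(adj, X, weights):
    """Decoupled propagation Z = sum_k weights[k] * T^k @ X, where
    T = D^{-1/2} A D^{-1/2} is the symmetrically normalized adjacency.
    With a sparse A, each hop costs O(|E|), so a K-step truncation is
    near-linear in the graph size."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(deg, 1e-12))
    T = adj * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    Z = weights[0] * X        # k = 0 term
    P = X
    for w in weights[1:]:
        P = T @ P             # advance one propagation hop
        Z = Z + w * P
    return Z

# Truncated weight sequences that recover known decoupled GNNs:
K = 10
alpha, t = 0.1, 1.0
sgc_w = [0.0] * K + [1.0]                               # SGC: Z = T^K X
appnp_w = [alpha * (1 - alpha) ** k for k in range(K)]  # APPNP: PPR series
gdc_w = [math.exp(-t) * t ** k / math.factorial(k) for k in range(K)]  # GDC: heat kernel
```

Each method thus differs only in its weight sequence, which is what lets a single framework subsume them through parameter adjustments.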



Published In

ICMR '24: Proceedings of the 2024 International Conference on Multimedia Retrieval
May 2024, 1379 pages
ISBN: 9798400706196
DOI: 10.1145/3652583

Publisher

Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. decoupled graph neural networks
    2. graph neural networks
    3. large-scale
    4. spectral graph theory

    Qualifiers

    • Research-article

Conference

ICMR '24

Acceptance Rates

Overall Acceptance Rate: 254 of 830 submissions, 31%
