FreshGNN: Reducing Memory Access via Stable Historical Embeddings for Graph Neural Network Training
Publisher: VLDB Endowment
Qualifiers: Research-article