Oct 29, 2017 · In this paper, we develop control variate based algorithms which allow sampling an arbitrarily small neighbor size. Furthermore, we prove new ...
We briefly review graph convolutional networks (GCNs), stochastic training, neighbor sampling, and importance sampling in this section. 2.1. Graph ...
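The neighbor sampling mentioned in this snippet can be illustrated with a minimal sketch: instead of aggregating over all of a node's neighbors in a GCN layer, sample a fixed number of them and average. The function and variable names below are hypothetical illustrations, not the paper's code.

```python
import numpy as np

def sampled_aggregate(adj_list, features, node, k, rng):
    """Monte Carlo estimate of the mean-neighbor aggregation
    (1/|N(v)|) * sum_{u in N(v)} h_u used in one GCN layer.
    Sampling is with replacement so the estimator stays unbiased
    even when the node's degree is smaller than k."""
    nbrs = adj_list[node]
    idx = rng.integers(0, len(nbrs), size=k)
    return np.mean([features[nbrs[i]] for i in idx], axis=0)

# Toy graph: node 0 has neighbors 1, 2, 3.
rng = np.random.default_rng(0)
adj = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0]}
feats = {0: np.array([1.0]), 1: np.array([2.0]),
         2: np.array([4.0]), 3: np.array([6.0])}
est = sampled_aggregate(adj, feats, 0, k=2, rng=rng)
exact = np.mean([feats[u] for u in adj[0]], axis=0)  # 4.0
```

The estimate `est` is unbiased for `exact` but has sampling variance, which is the error the paper's control variates are designed to shrink.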
Code for the paper Stochastic Training for Graph Convolutional Networks. The implementation is based on Thomas Kipf's implementation for graph convolutional ...
In this paper, we develop a preprocessing strategy and two control variate based algorithms to further reduce the receptive field size. Our algorithms are ...
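A minimal sketch of the preprocessing idea referenced above, assuming it amounts to precomputing the fixed product of the normalized adjacency matrix and the input features for the first layer (this product does not depend on trainable weights, so it need not be re-sampled each step; names here are illustrative):

```python
import numpy as np

def precompute_first_layer(a_norm, x):
    """The first GCN layer computes A_norm @ X @ W.  Because A_norm @ X
    is independent of the trainable weights W, it can be computed once
    before training, removing one layer of neighbor sampling."""
    return a_norm @ x

# Toy example: 2-node graph with a symmetric normalized adjacency.
a_norm = np.array([[0.5, 0.5],
                   [0.5, 0.5]])
x = np.eye(2)  # one-hot input features
ax = precompute_first_layer(a_norm, x)
```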
Graph convolutional networks (GCNs) are powerful deep neural networks for graph-structured data. However, GCN computes each node's representation recursively from its neighbors.
A preprocessing strategy and two control variate based algorithms are developed to further reduce the receptive field size of graph convolutional networks and are ...
The following lemma bounds the approximation error of activations in a multi-layer GCN with CV. Intuitively, there is a sequence of slow-changing models ...
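The control variate (CV) estimator the lemma refers to can be sketched as follows: keep a stored historical activation for every node, sample only the *difference* between the current and historical activations, and add back the full (cheap, precomputed) mean of the historical activations. As the history catches up to the current activations, the sampled term and its variance vanish, which is the intuition behind the "slow-changing models" argument. This is a minimal sketch with illustrative names, not the paper's implementation.

```python
import numpy as np

def cv_aggregate(adj_list, h, h_hist, node, k, rng):
    """Control variate estimate of the mean-neighbor aggregation:
    sample only the deltas (h - h_hist) over k neighbors, then add
    the exact mean of the stored historical activations h_hist."""
    nbrs = adj_list[node]
    idx = rng.integers(0, len(nbrs), size=k)
    delta = np.mean([h[nbrs[i]] - h_hist[nbrs[i]] for i in idx], axis=0)
    hist_mean = np.mean([h_hist[u] for u in nbrs], axis=0)
    return delta + hist_mean

# Toy graph: node 0 has neighbors 1, 2, 3.
rng = np.random.default_rng(1)
adj = {0: [1, 2, 3]}
h = {1: np.array([2.0]), 2: np.array([4.0]), 3: np.array([6.0])}
h_hist = {u: v.copy() for u, v in h.items()}  # perfectly up-to-date history
est = cv_aggregate(adj, h, h_hist, 0, k=1, rng=rng)
# With h_hist == h the sampled delta is zero, so est equals the exact
# mean (4.0) no matter which single neighbor was drawn.
```

Compared with plain neighbor sampling, the estimator is still unbiased (the delta term has the right expectation), but its variance is driven by how stale `h_hist` is rather than by the activations themselves.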
Sep 7, 2023 · We eliminate the subgraph formation phase and propose Edge Convolutional Network (ECN), which is trained with independently sampled edges.