Oct 29, 2017 · In this paper, we develop control variate based algorithms which allow sampling an arbitrarily small neighbor size. Furthermore, we prove new ...
Nov 14, 2017 · A control variate based stochastic training algorithm for graph convolutional networks in which the receptive field can be reduced to only two neighbors per node.
We briefly review graph convolutional networks (GCNs), stochastic training, neighbor sampling, and importance sampling in this section. 2.1. Graph ...
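To make the reviewed neighbor-sampling (NS) estimator concrete, here is a minimal NumPy sketch, not the paper's implementation; the names `ns_aggregate`, `P`, `H`, and `neighbors` are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def ns_aggregate(v, P, H, neighbors, sample_size):
    """Neighbor-sampling (NS) estimate of sum_u P[v, u] * H[u].

    Uniformly samples `sample_size` neighbors of node v without
    replacement and rescales by |n(v)| / sample_size, which keeps
    the estimator unbiased while shrinking the receptive field.
    """
    nbrs = neighbors[v]
    k = min(sample_size, len(nbrs))
    sampled = rng.choice(nbrs, size=k, replace=False)
    return (len(nbrs) / k) * sum(P[v, u] * H[u] for u in sampled)
```

With a small `sample_size`, each node's estimate touches only a few neighbors per layer, at the cost of extra variance; importance sampling, also reviewed in that section, reweights the sampling distribution to reduce that variance.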
Code for the paper Stochastic Training for Graph Convolutional Networks. The implementation is based on Thomas Kipf's implementation for graph convolutional ...
In this paper, we develop a preprocessing strategy and two control variate based algorithms to further reduce the receptive field size. Our algorithms are ...
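As a rough illustration of the control variate (CV) idea (a sketch under my own naming, not the paper's code): keep stale "historical" activations `H_hist`, aggregate them exactly over all neighbors, and use sampling only for the small residual `H - H_hist`.

```python
import numpy as np

rng = np.random.default_rng(0)

def cv_aggregate(v, P, H, H_hist, neighbors, sample_size=2):
    """Control-variate (CV) estimate of sum_u P[v, u] * H[u].

    The exact term aggregates stale historical activations H_hist
    over all neighbors (cheap: no fresh computation is needed);
    only the residual H - H_hist is estimated from `sample_size`
    sampled neighbors, so its variance shrinks as H_hist catches up.
    """
    nbrs = neighbors[v]
    k = min(sample_size, len(nbrs))
    sampled = rng.choice(nbrs, size=k, replace=False)
    # Unbiased Monte Carlo estimate of sum_u P[v, u] * (H[u] - H_hist[u]).
    residual = (len(nbrs) / k) * sum(P[v, u] * (H[u] - H_hist[u]) for u in sampled)
    # Exact aggregation of the historical activations over all neighbors.
    exact = sum(P[v, u] * H_hist[u] for u in nbrs)
    return residual + exact
```

After each step the sampled nodes' fresh activations overwrite their entries in `H_hist`; as I understand it, the preprocessing strategy plays a complementary role, precomputing the aggregation of the fixed input features so the first layer needs no sampling at all.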
Graph convolutional networks (GCNs) are powerful deep neural networks for graph-structured data. However, a GCN computes each node's representation recursively from its neighbors' representations.
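The recursion behind this cost is the standard GCN layer of Kipf & Welling; writing it out makes the receptive-field growth explicit:

```latex
% Standard GCN layer; \tilde{A} = A + I adds self-loops.
H^{(l+1)} = \sigma\!\bigl(\hat{A}\, H^{(l)}\, W^{(l)}\bigr),
\qquad
\hat{A} = \tilde{D}^{-1/2}\,\tilde{A}\,\tilde{D}^{-1/2},
\quad \tilde{A} = A + I .
```

Unrolling the recursion, row $v$ of $H^{(L)}$ depends on every node within $L$ hops of $v$, so the exact receptive field grows with depth and can cover most of the graph.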
A preprocessing strategy and two control variate based algorithms are proposed to further reduce the receptive field size of graph convolutional networks, and are ...
The following lemma bounds the approximation error of activations in a multi-layer GCN with CV. Intuitively, there is a sequence of slowly changing model parameters ...
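For context, the CV estimator the lemma analyzes has roughly the following shape (my paraphrase, with $\bar{h}_u$ the historical activation of neighbor $u$ and $\hat{n}(v)$ the sampled neighborhood):

```latex
% Sketch of the control variate estimator for the aggregation at node v.
(\hat{A} H)_v \approx
\underbrace{\frac{|n(v)|}{|\hat{n}(v)|}
  \sum_{u \in \hat{n}(v)} \hat{A}_{vu}\,\bigl(h_u - \bar{h}_u\bigr)}_{\text{sampled residual}}
\; + \;
\underbrace{\sum_{u \in n(v)} \hat{A}_{vu}\,\bar{h}_u}_{\text{exact, no sampling}} .
```

The intuition is that because the parameters form a slowly changing sequence, $\bar{h}_u$ tracks $h_u$, the sampled residual shrinks, and the lemma can bound how far the CV activations drift from the exact ones.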
Sep 7, 2023 · We eliminate the subgraph formation phase and propose Edge Convolutional Network (ECN), which is trained with independently sampled edges.
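This snippet does not give ECN's exact formulation; the sketch below only illustrates the generic pattern of training on independently sampled edges, with hypothetical names (`sample_edge_batch`, `edge_list`).

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_edge_batch(edge_list, batch_size):
    """Draw a minibatch of edges independently and uniformly.

    No subgraph is formed: each sampled edge (u, v) is a standalone
    training example contributing one message/loss term, so the cost
    per step is O(batch_size) rather than the size of a multi-hop
    neighborhood.
    """
    idx = rng.integers(0, len(edge_list), size=batch_size)  # with replacement
    return [edge_list[i] for i in idx]
```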