Graph Convolutional Network (ConvGNN)
A Graph Convolutional Network (ConvGNN) is a Graph Neural Network that generalizes the convolution operation of a Convolutional Neural Network from grid-structured data (such as images) to graph-structured data.
- Context:
- It can range from being a Spectral Graph Convolutional Network (which defines convolution via the graph Fourier transform) to being a Spatial Graph Convolutional Network (which aggregates feature information from a node's local neighborhood); see the schematic sketch after this list.
- …
- Example(s):
- a Kipf-Welling Graph Convolutional Network (Kipf & Welling, 2017),
- a Learnable Graph Convolutional Network (Gao et al., 2018),
- an Adaptive Graph Convolutional Neural Network (Li et al., 2018),
- a Dual Graph Convolutional Network (Zhuang & Ma, 2018).
- Counter-Example(s):
- a Recurrent Graph Neural Network,
- a Graph Autoencoder.
- See: Recurrent Neural Network, Feedforward Neural Network, Attention Mechanism, Spectral Network.
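The spectral/spatial distinction above can be sketched schematically as follows (this is a summary in standard notation, not a quotation from the cited surveys): $L = U \Lambda U^{\top}$ is the eigendecomposition of the normalized graph Laplacian, $x$ a graph signal, $g_\theta$ a learnable filter, and $N(v)$ the neighborhood of node $v$.

```latex
% Spectral graph convolution: filter the signal x in the graph Fourier domain
% spanned by the eigenvectors U of the normalized graph Laplacian.
x \ast_G g_\theta = U \, g_\theta(\Lambda) \, U^{\top} x

% Spatial graph convolution: update each node by aggregating feature
% information from its local neighborhood N(v).
h_v = f\!\left( x_v,\ \mathrm{AGGREGATE}\big(\{\, x_u : u \in N(v) \,\}\big) \right)
```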
References
2020a
- (Wu et al., 2020) ⇒ Zonghan Wu, Shirui Pan, Fengwen Chen, Guodong Long, Chengqi Zhang, and Philip S. Yu (2020). "A Comprehensive Survey on Graph Neural Networks". In: IEEE Transactions on Neural Networks and Learning Systems, 32(1), 4-24.
- QUOTE: Recently, there is increasing interest in extending deep learning approaches for graph data. Motivated by CNNs, RNNs, and autoencoders from deep learning, new generalizations and definitions of important operations have been rapidly developed over the past few years to handle the complexity of graph data. For example, a graph convolution can be generalized from a 2D convolution. As illustrated in Figure 1, an image can be considered as a special case of graphs where pixels are connected by adjacent pixels. Similar to 2D convolution, one may perform graph convolutions by taking the weighted average of a node's neighborhood information.
Figure 1 caption: (a) 2D Convolution. Analogous to a graph, each pixel in an image is taken as a node where neighbors are determined by the filter size. The 2D convolution takes the weighted average of pixel values of the red node along with its neighbors. The neighbors of a node are ordered and have a fixed size. (b) Graph Convolution. To get a hidden representation of the red node, one simple solution of the graph convolutional operation is to take the average value of the node features of the red node along with its neighbors. Different from image data, the neighbors of a node are unordered and variable in size.
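The following is a minimal NumPy sketch of the simple averaging operation described in the quote above: each node's hidden representation is the (linearly transformed) average of its own features and its neighbors' features. The function name and the dense-adjacency representation are illustrative choices, not part of the cited survey.

```python
import numpy as np

def mean_neighborhood_conv(A: np.ndarray, X: np.ndarray, W: np.ndarray) -> np.ndarray:
    """One simple graph-convolution step: average the features of each node and
    its neighbors (as in panel (b) of the figure), then apply a linear transform.

    A: (n, n) binary adjacency matrix; X: (n, d) node features; W: (d, d_out) weights.
    """
    A_hat = A + np.eye(A.shape[0])           # add self-loops so a node keeps its own features
    deg = A_hat.sum(axis=1, keepdims=True)   # size of each node's closed neighborhood
    H = (A_hat @ X) / deg                    # unweighted average over node + neighbors
    return H @ W                             # linear transformation of the averaged features

# Tiny usage example: a 3-node path graph with 2-dimensional node features.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
X = np.random.rand(3, 2)
W = np.random.rand(2, 4)
print(mean_neighborhood_conv(A, X, W).shape)  # (3, 4)
```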
2020b
- (Zhou et al., 2020) ⇒ Jie Zhou, Ganqu Cui, Shengding Hu, Zhengyan Zhang, Cheng Yang, Zhiyuan Liu, Lifeng Wang, Changcheng Li, and Maosong Sun (2020). "Graph neural networks: A review of methods and applications". AI Open, 1, 57-81.
- QUOTE: Zhang et al. (2019a) propose another comprehensive overview of graph convolutional networks. However, they mainly focus on convolution operators defined on graphs while we investigate other computation modules in GNNs such as skip connections and pooling operators.
2019
- (Zhang et al., 2019) ⇒ Si Zhang, Hanghang Tong, Jiejun Xu, and Ross Maciejewski (2019). "Graph convolutional networks: a comprehensive review". In: Computational Social Networks, 6(1), 1-23.
2018a
- (Gao et al., 2018) ⇒ Hongyang Gao, Zhengyang Wang, and Shuiwang Ji (2018). "Large-Scale Learnable Graph Convolutional Networks". In: Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining (KDD 2018).
2018b
- (Li et al., 2018) ⇒ Ruoyu Li, Sheng Wang, Feiyun Zhu, and Junzhou Huang (2018). "Adaptive Graph Convolutional Neural Networks". In: Proceedings of The Thirty-Second AAAI Conference on Artificial Intelligence (AAAI-18).
2018c
- (Zhuang & Ma, 2018) ⇒ Chenyi Zhuang, and Qiang Ma (2018). "Dual Graph Convolutional Networks for Graph-Based Semi-Supervised Classification". In: Proceedings of the 2018 World Wide Web Conference.
2017
- (Kipf & Welling, 2017) ⇒ Thomas N. Kipf, and Max Welling (2017). "Semi-Supervised Classification with Graph Convolutional Networks". In: ICLR 2017.
- QUOTE: The overall model, a multi-layer GCN for semi-supervised learning, is schematically depicted in Figure 1.
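As a complement to the quote, each layer in Kipf & Welling's model applies the renormalized adjacency matrix D^{-1/2} (A + I) D^{-1/2} (with D the degree matrix of A + I) to the current node representations, followed by a learnable linear map and a nonlinearity. The NumPy sketch below implements that propagation rule under the assumption of a dense adjacency matrix and a two-layer model; the function names and shapes are illustrative.

```python
import numpy as np

def normalize_adjacency(A: np.ndarray) -> np.ndarray:
    """Renormalized adjacency D^{-1/2} (A + I) D^{-1/2}, with D the degree matrix of A + I."""
    A_tilde = A + np.eye(A.shape[0])                  # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_tilde.sum(axis=1))   # diagonal of D^{-1/2}
    return np.diag(d_inv_sqrt) @ A_tilde @ np.diag(d_inv_sqrt)

def gcn_layer(A_norm, H, W, activation=lambda z: np.maximum(z, 0.0)):
    """One GCN layer: H' = activation(A_norm @ H @ W), with ReLU by default."""
    return activation(A_norm @ H @ W)

def two_layer_gcn(A, X, W0, W1):
    """Two-layer GCN producing per-node class scores (softmax omitted for brevity)."""
    A_norm = normalize_adjacency(A)
    H1 = gcn_layer(A_norm, X, W0)                              # hidden layer with ReLU
    return gcn_layer(A_norm, H1, W1, activation=lambda z: z)   # linear output scores
```

In the semi-supervised setting described in the paper, these per-node scores would be passed through a row-wise softmax and trained with a cross-entropy loss computed only over the labeled nodes.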