Break the Ceiling: Stronger Multi-scale Deep Graph Convolutional Networks

Part of: Advances in Neural Information Processing Systems 32 (NIPS 2019) pre-proceedings

Conference Event Type: Poster

Abstract

Recently, neural-network-based approaches have achieved significant progress on large, complex, graph-structured problems. Nevertheless, the advantages of multi-scale information and deep architectures have not been sufficiently exploited. In this paper, we first analyze key factors constraining the expressive power of existing Graph Convolutional Networks (GCNs), including the activation function and shallow learning mechanisms. Then, we generalize spectral graph convolution and deep GCNs in block Krylov subspace forms, upon which we devise two architectures that are both scalable in depth but make use of multi-scale information differently. On several node classification tasks, the proposed architectures achieve state-of-the-art performance.
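To make the idea of a block Krylov subspace form concrete, the sketch below shows one way a multi-scale graph convolution layer could be built: the node-feature matrix X serves as the block, and the truncated basis [X, ÂX, Â²X, ..., Â^{m-1}X] for a normalized adjacency Â is concatenated before a learned linear map. This is a minimal illustration under these assumptions, not the authors' released implementation; names such as `KrylovGCNLayer` and `num_scales` are hypothetical, and tanh is used only as a stand-in activation.

```python
import torch
import torch.nn as nn


def normalized_adjacency(adj: torch.Tensor) -> torch.Tensor:
    """Symmetrically normalize an adjacency with self-loops:
    A_hat = D^{-1/2} (A + I) D^{-1/2}."""
    a = adj + torch.eye(adj.size(0))
    deg = a.sum(dim=1)
    d_inv_sqrt = torch.diag(deg.pow(-0.5))
    return d_inv_sqrt @ a @ d_inv_sqrt


class KrylovGCNLayer(nn.Module):
    """Mixes multi-scale neighborhood information by concatenating the
    blocks of a truncated Krylov subspace before the learned linear map.
    (Illustrative sketch; class and argument names are assumptions.)"""

    def __init__(self, in_dim: int, out_dim: int, num_scales: int):
        super().__init__()
        self.num_scales = num_scales
        self.linear = nn.Linear(in_dim * num_scales, out_dim)

    def forward(self, a_hat: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
        blocks = [x]
        for _ in range(self.num_scales - 1):
            blocks.append(a_hat @ blocks[-1])      # A_hat^k X
        z = torch.cat(blocks, dim=1)               # [X, A_hat X, ..., A_hat^{m-1} X]
        return torch.tanh(self.linear(z))          # tanh as a stand-in activation


# Toy usage: a path graph with 4 nodes and 3-dimensional features.
adj = torch.tensor([[0., 1., 0., 0.],
                    [1., 0., 1., 0.],
                    [0., 1., 0., 1.],
                    [0., 0., 1., 0.]])
x = torch.randn(4, 3)
layer = KrylovGCNLayer(in_dim=3, out_dim=8, num_scales=3)
out = layer(normalized_adjacency(adj), x)
print(out.shape)  # torch.Size([4, 8])
```

Concatenating the powers Â^k X, rather than using only the last one, is what lets a single layer see several neighborhood scales at once; stacking such layers is one way an architecture can be made deep while still exposing multi-scale information.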