NeurIPS 2020

Optimization and Generalization Analysis of Transduction through Gradient Boosting and Application to Multi-scale Graph Neural Networks


Meta Review

The paper considers multi-scale GNNs, which have been shown to mitigate the over-smoothing issues of standard GNNs, and establishes optimization and generalization guarantees from the perspective of gradient boosting. The paper also proposes GB-GNN with linear transformations and shows that the model can be competitive with the state of the art. Most reviewers felt that the work offers a unique perspective on the performance of multi-scale GNNs. There are some concerns regarding the work: the technical results follow largely from assumptions and existing results on transductive learning, so the core technical novelty is limited; the analysis covers linear transformations, whereas nonlinear transformations are typically used in practice; and it is unclear whether the assumptions needed for the analysis hold, or how to verify them.