Part of Advances in Neural Information Processing Systems 33 (NeurIPS 2020)
Naganand Yadati
The message passing neural network (MPNN) has recently emerged as a successful framework, achieving state-of-the-art performance on many graph-based learning tasks.
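As background (a minimal sketch following the standard MPNN formulation of Gilmer et al., rather than any construction specific to this work), one round of message passing on a graph updates each vertex $v$ as

$$
m_v^{(t+1)} = \sum_{u \in \mathcal{N}(v)} M_t\!\left(h_v^{(t)}, h_u^{(t)}, e_{uv}\right), \qquad
h_v^{(t+1)} = U_t\!\left(h_v^{(t)}, m_v^{(t+1)}\right),
$$

where $h_v^{(t)}$ is the representation of vertex $v$ at step $t$, $\mathcal{N}(v)$ its neighbourhood, $e_{uv}$ the features of edge $(u,v)$, and $M_t$, $U_t$ are learnable message and update functions.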
MPNN has also recently been extended to multi-relational graphs (where each edge is labelled) and to hypergraphs (where each edge can connect any number of vertices).
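To illustrate how message passing is typically lifted to hyperedges (a generic two-stage sketch for intuition, not the specific G-MPNN operator defined in this work), each hyperedge $e$ can first aggregate the representations of the vertices it contains, after which each vertex aggregates the messages of its incident hyperedges:

$$
m_e^{(t+1)} = \phi\bigl(\{\, h_v^{(t)} : v \in e \,\}\bigr), \qquad
h_v^{(t+1)} = \psi\bigl(h_v^{(t)}, \{\, m_e^{(t+1)} : e \ni v \,\}\bigr),
$$

where $\phi$ and $\psi$ are permutation-invariant aggregation and update functions; handling labelled, ordered, or recursive hyperedges requires going beyond this simple form.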
However, in real-world datasets involving text and knowledge, relationships are far more complex: hyperedges can be multi-relational, recursive, and ordered. Such structures pose unique challenges because it is not clear how to adapt MPNN to the variable-sized hyperedges they contain.
In this work, we first unify existing MPNNs on different structures into a single framework, G-MPNN (Generalised MPNN).
Motivated by real-world datasets, we then propose a novel extension of the framework, MPNN-R (MPNN-Recursive), to handle recursively-structured data.
Experimental results demonstrate the effectiveness of the proposed G-MPNN and MPNN-R.