NeurIPS 2019
Sun Dec 8th through Sat Dec 14th, 2019, at the Vancouver Convention Center
Paper ID: 1565
Title: Quaternion Knowledge Graph Embeddings


The paper attempts to learn better entity and relation embeddings for knowledge graphs. To this end, the authors employ quaternion algebra, with the Hamilton product serving as the scoring function for knowledge triplets. The Hamilton product is asymmetric, which is claimed to be beneficial for modeling directed edges in a knowledge graph. Furthermore, the proposed method outperforms many well-established methods, and the authors appear to have conducted an exhaustive set of experiments.

However, all the reviewers are in consensus that the motivation for using quaternions is not clear; in particular, the paper does a poor job of demonstrating how the additional degrees of freedom in rotation help learn better embeddings. This should be made clear in the camera-ready version. During the discussion, one possible direction of explanation came up that you might pursue further: QuatE learns relation representations that are maximally orthogonal to the head/tail entity representations. This is in contrast to most other methods, which learn relations that resemble a difference between head and tail representations. Such a hypothesis can also be verified empirically quite easily. Looking at connections between triple products and quaternion products might be a starting point. Adding such insights would greatly improve the paper's significance, so please put genuine effort into working out the motivation for the camera-ready version.

Also, in the empirical comparison with ComplEx, please include exact numbers with and without relation normalization in the final version. Thus, I am marginally recommending acceptance to NeurIPS.
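
As an aside, the asymmetry point is easy to check directly. Below is a minimal sketch of the Hamilton product (not taken from the paper's code; the (a, b, c, d) component layout and function name are illustrative), showing that swapping the operands generally changes the result, which is what makes the product suitable for scoring directed edges:

```python
import numpy as np

def hamilton_product(p, q):
    """Hamilton product of quaternions p, q stored as (a, b, c, d) = a + bi + cj + dk."""
    a1, b1, c1, d1 = p
    a2, b2, c2, d2 = q
    return np.array([
        a1*a2 - b1*b2 - c1*c2 - d1*d2,   # real part
        a1*b2 + b1*a2 + c1*d2 - d1*c2,   # i component
        a1*c2 - b1*d2 + c1*a2 + d1*b2,   # j component
        a1*d2 + b1*c2 - c1*b2 + d1*a2,   # k component
    ])

p = np.array([1.0, 2.0, 3.0, 4.0])
q = np.array([0.5, -1.0, 2.0, 0.0])
# Non-commutativity: the two products differ in general.
print(hamilton_product(p, q))
print(hamilton_product(q, p))
```

The orthogonality hypothesis mentioned above could be checked in a similar spirit, e.g. by measuring the cosine similarity between each learned relation embedding and the corresponding head/tail embeddings and comparing against a translational baseline.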