Part of Advances in Neural Information Processing Systems 31 (NeurIPS 2018)
Mahito Sugiyama, Hiroyuki Nakahara, Koji Tsuda
We present a novel nonnegative tensor decomposition method, called Legendre decomposition, which factorizes an input tensor into a multiplicative combination of parameters. Thanks to the well-developed theory of information geometry, the reconstructed tensor is unique and always minimizes the KL divergence from the input tensor. We empirically show that Legendre decomposition can reconstruct tensors more accurately than other nonnegative tensor decomposition methods.
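To give a concrete picture of the idea described in the abstract, the following is a minimal NumPy sketch, not the authors' reference implementation: a normalized nonnegative tensor is approximated by a log-linear (multiplicative) model over a chosen basis of index positions, and the parameters are fitted by plain gradient descent on the KL divergence from the input tensor. The function name `legendre_decompose`, the basis choice, and the optimizer are illustrative assumptions.

```python
# Sketch of the multiplicative-model idea behind Legendre decomposition.
# Not the authors' code; basis and optimizer are illustrative choices.
import itertools
import numpy as np

def legendre_decompose(P, basis, lr=0.1, n_iter=2000):
    """Approximate P (nonnegative, summing to 1) by Q(v) proportional to
    exp(sum of theta[b] over basis positions b <= v element-wise)."""
    shape = P.shape
    idx = list(itertools.product(*[range(s) for s in shape]))
    # zeta[i, j] = 1 if basis[j] <= idx[i] element-wise (incidence of the index poset)
    zeta = np.array([[all(b <= v for b, v in zip(bb, vv)) for bb in basis]
                     for vv in idx], dtype=float)
    p = np.array([P[v] for v in idx])
    eta_hat = zeta.T @ p                  # expectation parameters of the input
    theta = np.zeros(len(basis))
    for _ in range(n_iter):
        q = np.exp(zeta @ theta)
        q /= q.sum()                      # normalization term
        grad = zeta.T @ q - eta_hat       # gradient of KL(P || Q) w.r.t. theta
        theta -= lr * grad
    Q = np.zeros(shape)
    for i, v in enumerate(idx):
        Q[v] = q[i]
    return Q, theta

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    P = rng.random((3, 3, 3))
    P /= P.sum()
    # Example basis: positions with at most one nonzero index (an independence-like model).
    basis = [v for v in itertools.product(range(3), repeat=3)
             if sum(x > 0 for x in v) == 1]
    Q, theta = legendre_decompose(P, basis)
    print("KL(P || Q) =", float(np.sum(P * np.log(P / Q))))
```

Because the objective is convex in the parameters, this kind of fit converges to a unique reconstruction for a given basis, which mirrors the uniqueness property claimed in the abstract.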