Manifold Regularization for SIR with Rate Root-n Convergence

Part of Advances in Neural Information Processing Systems 22 (NIPS 2009)



Wei Bian, Dacheng Tao


In this paper, we study manifold regularization for Sliced Inverse Regression (SIR). Manifold regularization improves standard SIR in two aspects: 1) it encodes the local geometry of the data for SIR, and 2) it enables SIR to handle transductive and semi-supervised learning problems. We prove that the proposed graph-Laplacian-based regularization converges at rate root-n. The projection directions of the regularized SIR are optimized by a conjugate gradient method on the Grassmann manifold. Experimental results support our theory.
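To illustrate the general shape of the approach, the sketch below combines a standard SIR estimate of Cov(E[x|y]) with a k-nearest-neighbor graph Laplacian penalty. This is a simplified illustration, not the paper's method: the penalty form, graph construction, and scaling are assumptions, and instead of the paper's conjugate-gradient optimization on the Grassmann manifold, the directions here are recovered by solving a penalized generalized eigenproblem directly.

```python
import numpy as np

def regularized_sir(X, y, n_slices=5, n_dirs=2, beta=0.1, k=5):
    """Sketch: SIR with a graph-Laplacian penalty (hypothetical formulation)."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)

    # Standard SIR: slice the sorted response and use the weighted
    # covariance of slice means as an estimate of Cov(E[x|y]).
    M = np.zeros((p, p))
    for s in np.array_split(np.argsort(y), n_slices):
        m = Xc[s].mean(axis=0)
        M += (len(s) / n) * np.outer(m, m)
    Sigma = Xc.T @ Xc / n

    # k-NN adjacency and unnormalized graph Laplacian L = D - W
    # (the paper's exact graph construction may differ).
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.zeros((n, n))
    for i in range(n):
        W[i, np.argsort(D2[i])[1:k + 1]] = 1.0
    W = np.maximum(W, W.T)
    L = np.diag(W.sum(axis=1)) - W

    # Penalized generalized eigenproblem: M b = lam (Sigma + beta X^T L X) b,
    # solved via a Cholesky whitening of the regularized scatter R.
    R = Sigma + beta * (Xc.T @ L @ Xc) / n ** 2 + 1e-8 * np.eye(p)
    C = np.linalg.cholesky(R)
    A = np.linalg.solve(C, M)            # C^{-1} M
    S = np.linalg.solve(C, A.T).T        # C^{-1} M C^{-T}, symmetric
    _, V = np.linalg.eigh(S)
    # Back-transform the top eigenvectors to the original coordinates.
    return np.linalg.solve(C.T, V[:, ::-1][:, :n_dirs])
```

A usage sketch: with data whose response depends only on the first coordinate, the leading recovered direction should load mostly on that coordinate, with the Laplacian term shrinking directions that vary sharply across neighboring points.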