Non-parametric classification via expand-and-sparsify representation

Part of Advances in Neural Information Processing Systems 37 (NeurIPS 2024) Main Conference Track

Authors

Kaushik Sinha

Abstract

In the *expand-and-sparsify* (EaS) representation, a data point in $S^{d-1}$ is first randomly mapped to a higher dimension $\mathbb{R}^m$, where $m > d$, followed by a sparsification operation in which the informative $k \ll m$ of the $m$ coordinates are set to one and the rest are set to zero. We propose two algorithms for non-parametric classification using such EaS representations. For our first algorithm, we use the *winners-take-all* operation for the sparsification step and show that the proposed classifier admits the form of a locally weighted average classifier, establishing its consistency via Stone's theorem. Further, assuming that the conditional probability function $P(y=1|x)=\eta(x)$ is Hölder continuous, we show that for an optimal choice of $m$ the convergence rate of this classifier is minimax-optimal. For our second algorithm, we use the *empirical $k$-thresholding* operation for the sparsification step, and under the assumption that the data lie on a low-dimensional manifold of dimension $d_0 \ll d$, we show that the convergence rate of this classifier depends only on $d_0$ and is again minimax-optimal. Empirical evaluations performed on real-world datasets corroborate our theoretical results.
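To make the EaS pipeline concrete, below is a minimal sketch of the expand-and-sparsify map with winners-take-all sparsification, followed by a locally-weighted-average classifier built on the resulting binary codes. The function names (`eas_representation`, `eas_classifier`), the Gaussian choice of random expansion, and the overlap-based weighting are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def eas_representation(x, W, k):
    """Expand-and-sparsify: random expansion followed by winners-take-all.

    x : (d,) unit-norm input (a point on S^{d-1}).
    W : (m, d) random expansion matrix with m > d (assumed Gaussian here).
    k : number of active coordinates (k << m).
    Returns a binary (m,) vector with exactly k ones.
    """
    z = W @ x                              # expand: map to R^m
    top_k = np.argpartition(z, -k)[-k:]    # winners-take-all: k largest coordinates
    out = np.zeros_like(z)
    out[top_k] = 1.0                       # informative coordinates set to one
    return out

def eas_classifier(X_train, y_train, x_query, W, k):
    """Plug-in classifier on EaS codes, in locally-weighted-average form:
    each training label is weighted by the number of active coordinates its
    code shares with the query's code (a sketch of the general form only)."""
    codes = np.stack([eas_representation(x, W, k) for x in X_train])
    q = eas_representation(x_query, W, k)
    weights = codes @ q                    # overlap of binary codes
    if weights.sum() == 0:
        return 0                           # no overlap: fall back to a default label
    eta_hat = (weights * y_train).sum() / weights.sum()  # estimate of eta(x)
    return int(eta_hat > 0.5)

# Usage: d=10 inputs on the unit sphere, expanded to m=1000 dimensions, k=50.
rng = np.random.default_rng(0)
d, m, k, n = 10, 1000, 50, 200
W = rng.standard_normal((m, d))
X = rng.standard_normal((n, d))
X /= np.linalg.norm(X, axis=1, keepdims=True)
y = (X[:, 0] > 0).astype(float)            # toy labels for illustration
x_new = X[0]
print(eas_classifier(X, y, x_new, W, k))   # -> predicted label in {0, 1}
```

In this sketch the prediction is a weighted average of training labels, which is the locally-weighted-average structure the consistency argument relies on; the paper's second algorithm would replace the winners-take-all step with empirical $k$-thresholding.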