Asymptotic slowing down of the nearest-neighbor classifier

Part of Advances in Neural Information Processing Systems 3 (NIPS 1990)



Robert Snapp, Demetri Psaltis, Santosh Venkatesh


If patterns are drawn from an n-dimensional feature space according to a probability distribution that obeys a weak smoothness criterion, we show that the probability that a random input pattern is misclassified by a nearest-neighbor classifier using M random reference patterns asymptotically satisfies

$$P_M(\text{error}) \simeq P_\infty(\text{error}) + \frac{a}{M^{2/n}},$$


for sufficiently large values of M. Here, $P_\infty(\text{error})$ denotes the probability of error in the infinite sample limit, and is at most twice the error of a Bayes classifier. Although the value of the coefficient a depends upon the underlying probability distributions, the exponent of M is largely distribution-free. We thus obtain a concise relation between a classifier's ability to generalize from a finite reference sample and the dimensionality of the feature space, as well as an analytic validation of Bellman's well known "curse of dimensionality."
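The nearest-neighbor rule analyzed above can be sketched as a few lines of code. This is a minimal, illustrative implementation of the 1-NN decision rule only; the reference patterns below are hypothetical examples, not data from the paper, and the paper's asymptotic result concerns the expected error of this rule as the number of reference patterns M grows.

```python
import math

def nearest_neighbor_classify(x, references):
    """Classify x with the label of its nearest reference pattern.

    x: tuple of n floats (a point in the n-dimensional feature space).
    references: list of (pattern, label) pairs, the M reference patterns.
    Uses squared Euclidean distance, which preserves the nearest neighbor.
    """
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    _, label = min(references, key=lambda pl: dist2(x, pl[0]))
    return label

# Illustrative reference sample: M = 4 labeled patterns in n = 2 dimensions.
refs = [((0.0, 0.0), 0), ((0.1, 0.2), 0),
        ((1.0, 1.0), 1), ((0.9, 1.2), 1)]

print(nearest_neighbor_classify((0.2, 0.1), refs))  # nearest reference is labeled 0
print(nearest_neighbor_classify((1.1, 0.9), refs))  # nearest reference is labeled 1
```

Per the abstract, enlarging `refs` from M random patterns drives this rule's expected error down toward $P_\infty(\text{error})$ only at the rate $M^{-2/n}$, which degrades rapidly as the dimension n grows.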