This paper compares a family of methods for characterizing neural feature selectivity with natural stimuli in the framework of the linear-nonlinear model. In this model, the neural firing rate is a nonlinear function of a small number of relevant stimulus components. The relevant stimulus dimensions can be found by maximizing one of a family of objective functions, Rényi divergences of different orders [1, 2]. We show that maximizing one of them, the Rényi divergence of order 2, is equivalent to least-square fitting of the linear-nonlinear model to neural data. Next, we derive reconstruction errors in relevant dimensions found by maximizing Rényi divergences of arbitrary order in the asymptotic limit of large spike numbers. We find that the smallest errors are obtained with the Rényi divergence of order 1, also known as the Kullback-Leibler divergence. This corresponds to finding relevant dimensions by maximizing mutual information. We numerically test how these optimization schemes perform in the regime of low signal-to-noise ratio (small number of spikes and increasing neural noise) for model visual neurons. We find that optimization schemes based on either least-square fitting or information maximization perform well even when the number of spikes is small. Information maximization provides slightly, but significantly, better reconstructions than least-square fitting. This makes the problem of finding relevant dimensions, together with the problem of lossy compression, one of the examples where information-theoretic measures are no more data limited than those derived from least squares.
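
For concreteness, a minimal sketch of the setting described above, written in standard notation (the symbols $\mathbf{s}$, $\mathbf{v}_i$, $g$, and the particular normalization of the Rényi divergence below are illustrative and may differ from the exact forms used in [1, 2]): in the linear-nonlinear model the firing rate depends on the stimulus $\mathbf{s}$ only through its projections onto $K$ relevant dimensions,
\[
r(\mathbf{s}) \;=\; g\!\left(\mathbf{v}_1\cdot\mathbf{s},\,\dots,\,\mathbf{v}_K\cdot\mathbf{s}\right),
\]
and a candidate dimension $\mathbf{v}$ is scored by the order-$\alpha$ Rényi divergence between the spike-conditional and prior distributions of the projection $x=\mathbf{v}\cdot\mathbf{s}$,
\[
D_\alpha(\mathbf{v}) \;=\; \frac{1}{\alpha-1}\,
\log\!\int dx\; P_{\mathbf{v}}(x)\left[\frac{P_{\mathbf{v}}(x\mid\mathrm{spike})}{P_{\mathbf{v}}(x)}\right]^{\alpha},
\]
where $\alpha=2$ corresponds to the least-square objective and $\alpha\to 1$ recovers the Kullback-Leibler divergence, i.e., mutual-information maximization.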