The Probabilistic Neural Network (PNN) algorithm represents the likelihood function of a given class as the sum of identical, isotropic Gaussians. In practice, PNN is often an excellent pattern classifier, outperforming other classifiers including backpropagation. However, it is not robust with respect to affine transformations of feature space, and this can lead to poor performance on certain data. We have derived an extension of PNN called Weighted PNN (WPNN) which compensates for this flaw by allowing anisotropic Gaussians, i.e. Gaussians whose covariance is not a multiple of the identity matrix. The covariance is optimized using a genetic algorithm, some interesting features of which are its redundant, logarithmic encoding and large population size. Experimental results validate our claims.
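The core idea can be sketched as follows: a minimal, illustrative PNN class-likelihood (sum of identical isotropic Gaussians over the class's training exemplars) alongside a weighted variant using per-feature weights, i.e. a diagonal inverse covariance. The function names, the diagonal restriction, and the toy data are assumptions for illustration only; the paper's actual WPNN allows a general covariance and optimizes it with a genetic algorithm, which is not shown here.

```python
import numpy as np

def pnn_class_likelihood(x, exemplars, sigma=1.0):
    """Standard PNN: sum of identical isotropic Gaussians centered
    at the class's training exemplars (normalization constant omitted,
    since it is shared across classes)."""
    d2 = np.sum((exemplars - x) ** 2, axis=1)       # squared Euclidean distances
    return np.sum(np.exp(-d2 / (2.0 * sigma ** 2)))

def wpnn_class_likelihood(x, exemplars, w):
    """Illustrative weighted variant: anisotropic Gaussians via a
    per-feature weight vector w (a diagonal inverse covariance)."""
    d2 = np.sum(w * (exemplars - x) ** 2, axis=1)   # weighted squared distances
    return np.sum(np.exp(-0.5 * d2))

def classify(x, classes, likelihood, **kw):
    """Assign x to the class whose likelihood is highest."""
    names = list(classes)
    scores = [likelihood(x, classes[c], **kw) for c in names]
    return names[int(np.argmax(scores))]

# Toy example: classes separated along feature 0; feature 1 is noise.
a = np.array([[0.0, 0.0], [0.1, 5.0]])
b = np.array([[3.0, 0.0], [3.1, 5.0]])
x = np.array([0.2, 2.0])
print(classify(x, {"A": a, "B": b}, pnn_class_likelihood, sigma=1.0))  # prints "A"
```

A weighted call such as `classify(x, {"A": a, "B": b}, wpnn_class_likelihood, w=np.array([1.0, 0.01]))` downweights the noisy second feature, which is the kind of feature-space rescaling robustness that motivates WPNN.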