NeurIPS 2019
Sun Dec 8th through Sat the 14th, 2019 at Vancouver Convention Center
The paper presents a method for scaling up classifiers to tasks with an extremely large number of classes, with memory requirements scaling as O(log K) for K classes. The proposed model uses a count-min sketch to transform a very large classification problem into a small number of classification tasks, each with a fixed, small number of classes. Each of these models can be trained independently and in parallel. Experimental results on a number of multi-class and multi-label classification tasks show that the method either matches or outperforms other, more resource-demanding approaches. The methodological contribution is significant, and it could serve as a baseline for future studies.
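For concreteness, a minimal sketch of the count-min-sketch label-reduction idea as summarized above: R independent hash functions map each of the K original classes into B << K buckets, yielding R small B-way classification problems that can be trained independently; original-class scores are recovered by combining the small models' bucket scores. This is not the authors' code, and the names and scoring rule (R, B, hash seeds, taking the minimum over repetitions) are illustrative assumptions.

import numpy as np

K = 1_000_000   # original number of classes (assumed for illustration)
R = 4           # number of hash functions / small classifiers (assumed)
B = 1_000       # number of buckets per small classifier (assumed)

rng = np.random.default_rng(0)
# One random hash table per repetition: original class id -> bucket id.
hash_tables = [rng.integers(0, B, size=K) for _ in range(R)]

def reduced_labels(y):
    """Map original labels y (shape [n]) to R bucketed label vectors;
    each of the R small models is trained on its own column."""
    return np.stack([table[y] for table in hash_tables], axis=1)  # shape [n, R]

def decode_scores(bucket_probs):
    """Recover scores over the K original classes from the R small models'
    bucket probabilities (a list of R arrays of shape [B]) by taking the
    minimum over repetitions, mirroring the count-min estimate."""
    per_class = np.stack([probs[table]
                          for probs, table in zip(bucket_probs, hash_tables)])
    return per_class.min(axis=0)  # shape [K]

# Example: fake bucket probabilities from the R small classifiers for one input.
bucket_probs = [rng.dirichlet(np.ones(B)) for _ in range(R)]
print(decode_scores(bucket_probs).argmax())  # predicted original class id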