This paper gives a nearly-linear-time algorithm for estimating the covariance of an unknown Gaussian distribution in the presence of a small constant ε-fraction of outliers. This has been an active research direction in the past few years, and recent works used the idea of "quantum entropy scoring" to design a similarly fast algorithm for the easier task of mean estimation. The main contribution of this paper is extending that idea to the harder task of covariance estimation. One concern raised in the discussion was the relationship to the prior work on robust nearly-linear-time mean estimation; the reviewers found the authors' feedback on this point useful and clarifying. I recommend accepting this paper to the NeurIPS program.