NeurIPS 2019
This paper presents a fast algorithm for approximating the solution of a total least squares regression problem, with running time approximately linear in the number of non-zero entries of the input matrices. The method exploits sparse inputs and is analyzed theoretically, with guarantees on its approximation quality. It is demonstrated on both small- and large-scale problems, and a generalization to the regularized setting is also presented. Overall the work is good and the results are significant for the NeurIPS audience.
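For context on the problem being approximated (not the paper's algorithm), the sketch below shows the classical dense SVD-based solution of total least squares, which finds the smallest Frobenius-norm perturbation [dA, db] such that (A + dA)x = b + db. The function name `classical_tls` and the synthetic data are illustrative assumptions; the paper's contribution is a faster routine whose cost scales with nnz(A) rather than this dense O(nd^2) baseline.

```python
import numpy as np

def classical_tls(A: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Classical total least squares via the SVD of the augmented matrix [A | b].

    Illustrative baseline only; the reviewed paper proposes a faster
    approximation whose running time is roughly linear in nnz(A).
    """
    n, d = A.shape
    # Stack A and b and take the SVD of the augmented matrix.
    Z = np.hstack([A, b.reshape(-1, 1)])
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    v = Vt[-1]  # right singular vector for the smallest singular value
    if np.isclose(v[-1], 0.0):
        raise np.linalg.LinAlgError("TLS solution does not exist (last component is zero)")
    # Standard TLS formula: x = -v[:d] / v[d]
    return -v[:d] / v[-1]

# Small synthetic usage example (assumed data, for illustration only).
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 5))
x_true = rng.standard_normal(5)
b = A @ x_true + 0.01 * rng.standard_normal(100)
x_tls = classical_tls(A, b)
```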