NeurIPS 2020

What if Neural Networks had SVDs?

Meta Review

This paper carefully applies known linear-algebra results to represent the SVD of weight matrices in neural networks, enabling efficient forward and backward passes in parallel computing environments. The analysis is correct, and the experiments convincingly demonstrate the speedup of the proposed method. The authors have also open-sourced their efficient implementation, which can be quite useful for the community and may inspire and accelerate future research.
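To illustrate the general idea under review (this is a minimal NumPy sketch of an SVD-parameterized linear layer, not the paper's algorithm or implementation): keeping a weight matrix in factored form W = U diag(s) Vᵀ makes its singular values available at all times, and the forward pass reduces to three cheap products.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4

# Orthogonal factors built from QR decompositions of random matrices
# (a generic construction; the paper's method for maintaining
# orthogonality during training may differ).
U, _ = np.linalg.qr(rng.standard_normal((d, d)))
V, _ = np.linalg.qr(rng.standard_normal((d, d)))
s = np.array([3.0, 2.0, 1.0, 0.5])  # explicit singular values, descending

x = rng.standard_normal(d)
# Forward pass without materializing W: V.T @ x, scale by s, then U @ (...)
y = U @ (s * (V.T @ x))

# Sanity check: the factored pass matches the dense layer,
# and the spectrum of W is exactly s.
W = U @ np.diag(s) @ V.T
assert np.allclose(y, W @ x)
assert np.allclose(np.linalg.svd(W)[1], s)
```

Maintaining the factors explicitly is what allows spectral quantities (norms, condition numbers) to be read off for free during training, which is the practical appeal of the approach the paper makes efficient.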