Yatin Dandi submits Neural LoFi spectral theory paper
Yatin Dandi submitted the arXiv paper Deep Learning as Neural Low-Degree Filtering on 13 May 2026. The work introduces Neural LoFi, a framework that treats deep networks as iterated kernel spectral filters in order to explain hierarchical feature learning, the advantages of depth, the emergence of abstract concepts, and convolutional behavior. It also derives a backpropagation-free training algorithm. The paper appears under identifier 2605.13612; researchers including Florent Krzakala and Lenka Zdeborová noted the submission.
Very nice progress toward understanding deep learning here!
What if a theory of deep learning could be built from iterated kernel spectral methods? Feature learning, advantage of depth, emergence of concepts, convnets filters.... and a new backprop-free algorithm too! We have it all! Introducing Neural LoFi 🧵 https://arxiv.org/abs/2605.13612
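The "iterated kernel spectral filters" idea can be loosely illustrated with generic kernel methods. The sketch below is hypothetical and does not come from the paper: the RBF kernel, the top-k eigenmode truncation, and the two-layer iteration are all illustrative choices, showing only the general notion of filtering a signal through a kernel's low-degree (leading) spectral modes.

```python
import numpy as np

# Hypothetical sketch of kernel spectral filtering (NOT the Neural LoFi
# algorithm): project a noisy signal onto the leading eigenmodes of a
# kernel matrix, then iterate the filter as a stand-in for "depth".

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))                      # toy inputs
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=50)   # noisy target signal

# RBF kernel matrix on the sample (illustrative kernel choice).
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq / 2.0)

# Spectral filter: keep only the k leading eigenmodes ("low degrees").
eigvals, eigvecs = np.linalg.eigh(K)  # eigh returns eigenvalues ascending
k = 5
top = eigvecs[:, -k:]                 # k leading eigenvectors
P = top @ top.T                       # projector onto that subspace

# One "layer" applies the filter; a second "layer" iterates it.
h = P @ y
h2 = P @ h

# With a fixed projector, iteration is idempotent (h2 == h) — in an actual
# deep model the per-layer filters would differ, which is where depth and
# hierarchical features could enter.
print(np.allclose(h, h2))
```

The idempotence at the end is the point of the toy example: stacking identical projections adds nothing, so any theory of depth built on spectral filters has to make successive layers filter differently.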