François Chollet says symbolic learning replaces gradient descent, not coding agents
François Chollet, creator of the Keras framework and the ARC-AGI benchmark, posted that symbolic learning is not a replacement for coding agents but a low-level, general, and scalable replacement for gradient descent and neural networks. Research engineer Alex Nichol replied asking for proof. TeortaxesTex responded with an excerpt from a Yann LeCun paper that frames reasoning as energy minimization and notes the efficiency gains of gradient-based methods over symbolic approaches in both continuous and discrete spaces.
Symbolic learning is not a replacement for coding agents, it's a replacement for gradient descent & NNs: a low-level, completely general, extremely scalable new learning substrate.
Symbolic reasoning definitely works and exists, in the sense that we can and have designed such algorithms. But I think LeCun was very right when he wrote this in «A Path Towards Autonomous Machine Intelligence». Might be his most clear-headed take ever.
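The contrast the thread is gesturing at can be made concrete with a toy sketch (my illustration, not from the thread or LeCun's paper): treat "reasoning" as minimizing an energy function, and compare a gradient-based solver, which uses local slope information, against a symbolic-style exhaustive search over discrete candidates. The quadratic energy and all parameter values below are invented for illustration.

```python
# Toy illustration: reasoning as energy minimization.
# Energy E(z) = (z - 3)^2, minimized two ways:
#   1. gradient descent (continuous, uses dE/dz at each step)
#   2. exhaustive search over a discrete grid (symbolic-style enumeration)

def energy(z):
    return (z - 3.0) ** 2

def grad_energy(z):
    return 2.0 * (z - 3.0)

def gradient_descent(z0=0.0, lr=0.1, steps=100):
    z = z0
    for _ in range(steps):
        z -= lr * grad_energy(z)  # each step is one cheap gradient evaluation
    return z

def exhaustive_search(lo=-10.0, hi=10.0, n=10_000):
    # Discrete alternative: evaluate every candidate, keep the lowest energy.
    candidates = [lo + i * (hi - lo) / n for i in range(n + 1)]
    return min(candidates, key=energy)

z_gd = gradient_descent()    # ~3.0 after 100 gradient steps
z_ex = exhaustive_search()   # ~3.0 after 10,001 energy evaluations
print(z_gd, z_ex)
```

Both find the minimum, but the gradient method gets there with two orders of magnitude fewer evaluations, and the gap widens sharply as the search space grows, which is the efficiency point the LeCun excerpt makes.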

@fchollet Prove it?