
François Chollet says symbolic learning replaces gradient descent


François Chollet, creator of the Keras framework and the ARC-AGI benchmark, posted that symbolic learning is a replacement for gradient descent and neural networks rather than for coding agents, describing it as a low-level, completely general, and extremely scalable learning substrate. Research engineer Alex Nichol replied asking for proof. TeortaxesTex responded with an excerpt from Yann LeCun's paper «A Path Towards Autonomous Machine Intelligence», which frames reasoning as energy minimization and notes the efficiency gains of gradient-based methods over symbolic approaches in both continuous and discrete spaces.
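The efficiency contrast referenced above can be sketched with a toy example (illustrative only, not from the thread or LeCun's paper): treating "reasoning" as minimizing an energy function E(x), a gradient-based method exploits local slope information, while a blind enumerative search does not. All function names here are hypothetical.

```python
def energy(x: float) -> float:
    """A simple convex energy with its minimum at x = 3."""
    return (x - 3.0) ** 2

def gradient_descent(x0: float, lr: float = 0.1, steps: int = 100) -> tuple[float, int]:
    """Follow the analytic gradient dE/dx = 2(x - 3); cost scales with step count."""
    x = x0
    for _ in range(steps):
        grad = 2.0 * (x - 3.0)
        x -= lr * grad
    return x, steps

def grid_search(lo: float, hi: float, n: int = 10_000) -> tuple[float, int]:
    """A gradient-free enumerative search; cost scales with resolution n."""
    best_x, best_e = lo, energy(lo)
    for i in range(1, n + 1):
        x = lo + (hi - lo) * i / n
        e = energy(x)
        if e < best_e:
            best_x, best_e = x, e
    return best_x, n

x_gd, evals_gd = gradient_descent(10.0)
x_gs, evals_gs = grid_search(-100.0, 100.0)
print(f"gradient descent: x ~ {x_gd:.4f} after {evals_gd} steps")
print(f"grid search:      x ~ {x_gs:.4f} after {evals_gs} evaluations")
```

With 100 gradient steps the iterate converges to the minimum essentially exactly, while the enumerative search needs 10,000 evaluations to get within its grid resolution; this one-dimensional gap is the kind of efficiency argument the thread alludes to, and it widens sharply in high dimensions.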

Original post

Symbolic learning is not a replacement for coding agents, it's a replacement for gradient descent & NNs: a low-level, completely general, extremely scalable new learning substrate.

3:30 PM · May 12, 2026 · 57.9K Views

Replies

Alex Nichol

@fchollet Prove it?

3:42 PM · May 12, 2026 · 224 Views

TeortaxesTex

Symbolic reasoning definitely works and exists in the sense that we can and have designed such algorithms. But I think LeCun was very right when he wrote this in «A Path Towards Autonomous Machine Intelligence». Might be his most clear-headed take ever.

11:56 PM · May 12, 2026 · 4.7K Views