- Amid the rise of billion-parameter models, I argue that toy models, with just a few neurons, remain essential—and may be all neuroscience needs, writes @marcusghosh.bsky.social. #neuroskyence www.thetransmitter.org/theoretical-...
Dec 22, 2025 14:44
- I'd consider this model, with an infinite number of neurons, to be a useful toy model as well. Chaotic balanced state in a model of cortical circuits, van Vreeswijk & Sompolinsky www.gatsby.ucl.ac.uk/~pel/tnlectu...
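The balanced-state idea referenced above can be demonstrated in a few dozen lines. The sketch below is my own illustrative toy in the spirit of van Vreeswijk & Sompolinsky's binary-neuron network, not their exact model: all parameter values (population sizes, weight scale, inhibition strength, external drive) are assumptions chosen for a quick demo. The key ingredient is the 1/sqrt(K) weight scaling, under which strong excitation and inhibition cancel on average and activity settles at an intermediate, fluctuation-driven rate.

```python
import numpy as np

# Toy balanced excitatory-inhibitory network of binary neurons.
# Parameters are illustrative, not taken from the paper.
rng = np.random.default_rng(0)

N_E, N_I = 800, 200          # excitatory / inhibitory population sizes
N = N_E + N_I
K = 100                      # average number of inputs per neuron
J = 1.0 / np.sqrt(K)         # weight scale: O(1/sqrt(K)), the balanced scaling
g = 6.0                      # relative strength of inhibition (must dominate)
theta = 1.0                  # firing threshold
ext = 1.2 * np.sqrt(K) * J   # strong O(sqrt(K)) external drive

# Sparse random connectivity: each synapse present with probability K/N.
C = (rng.random((N, N)) < K / N).astype(float)
W = J * C
W[:, N_E:] *= -g             # inputs from inhibitory neurons are negative

s = (rng.random(N) < 0.1).astype(float)  # random initial binary states
rates = []
for t in range(200):
    h = W @ s + ext                       # net input: recurrent + external
    idx = rng.integers(0, N, size=N // 10)  # asynchronous-style partial update
    s[idx] = (h[idx] > theta).astype(float)
    rates.append(s.mean())

# Discard the transient; activity neither dies out nor saturates.
mean_rate = float(np.mean(rates[100:]))
print(f"mean activity fraction: {mean_rate:.2f}")
```

Because recurrent inhibition outweighs recurrent excitation, the network self-adjusts so the mean input hovers near threshold: raising the rate pushes inputs down, lowering it pushes them up. That cancellation of large opposing inputs, rather than any fine tuning, is what produces the irregular, intermediate activity the paper analyzes in the infinite-N limit.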
- The billion-parameter models are not being developed by neuroscientists in the first place: they are built by corporations for mass adoption, not (strictly speaking) by scientists for research. So the comparison is problematic. (Unless I am misinterpreting)...
- Secondly, while I agree that toy models with just a few neurons are useful and important, they are not "all that neuroscience needs". Why take an otherwise good point to such an extreme? Brains are fundamentally parallel and complex; what emerges as intelligence relies on a vast number of neurons.
- "Over the past 25 years, artificial neural networks have exploded in size, expanding from 60,000 parameters in 1998 (LeNet) to 70 billion in 2024 (Llama 3)." Scaling up-scaling down!? I hope that one day neuroscience can admit that the whole paradigm is wrong.