David G. Clark
Theoretical neuroscientist
Research fellow @ Kempner Institute, Harvard
dclark.io
- Reposted by David G. Clark: The cerebellum supports high-level language?? Now out in @cp-neuron.bsky.social, we systematically examined language-responsive areas of the cerebellum using precision fMRI and identified a *cerebellar satellite* of the neocortical language network! authors.elsevier.com/a/1mUU83BtfH... 1/n 🧵👇
- Reposted by David G. Clark: By the way, if you’re interested in working together on problems like this, I’m starting my lab at UCSF this summer. Get in touch if you’re interested in doing a postdoc! More info here: wj2.github.io/postdoc_ad (7/7)
- Excited to be giving the van Vreeswijk Theoretical Neuroscience Seminar this Wednesday, Jan 14, where I'll talk about "Computation Through Neuronal-Synaptic Dynamics"! www.wwtns.online
- Reposted by David G. Clark: 1/n: A new collaborative preprint from the lab to start the year: "A multi-ring shifter network computes head direction in zebrafish" together with Siyuan Mei, Martin Stemmler and Andreas Herz from the LMU, Munich.
- Excited and honored to be speaking at this ICLR workshop on associative memory in Rio!
- Official #ICLR2026 workshop on Associative Memory! The scope covers algorithms, architectures, hardware design, agentic workflows, and other ideas where memory plays a crucial computational role.
- Reposted by David G. Clark: Bichan Wu (@bichanw.bsky.social) & I wrote a tutorial paper on Reduced Rank Regression (RRR) — the statistical method underlying "communication subspaces" from Semedo et al. 2019 — aimed at neuroscientists. arxiv.org/abs/2512.12467
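For readers who want to try the method, here is a minimal sketch of reduced-rank regression (a standard construction written for illustration, not code from the tutorial): fit ordinary least squares, then project onto the top singular subspace of the fitted values.

```python
import numpy as np

def reduced_rank_regression(X, Y, rank):
    """Rank-constrained least squares: min ||Y - X @ B||_F s.t. rank(B) <= rank."""
    # Ordinary least-squares solution (pinv handles rank-deficient X).
    B_ols = np.linalg.pinv(X) @ Y
    # SVD of the fitted values; the top right-singular vectors span the
    # optimal rank-r output subspace.
    _, _, Vt = np.linalg.svd(X @ B_ols, full_matrices=False)
    V_r = Vt[:rank].T                       # (n_targets, rank)
    return B_ols @ V_r @ V_r.T              # project OLS solution onto that subspace

# Toy usage: predict target-area activity Y from source-area activity X.
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 30))                               # source neurons
W = rng.standard_normal((30, 3)) @ rng.standard_normal((3, 20))  # true rank-3 map
Y = X @ W + 0.1 * rng.standard_normal((500, 20))                 # target neurons
B_hat = reduced_rank_regression(X, Y, rank=3)
```

In the communication-subspace analysis, the rank at which cross-validated prediction stops improving estimates the dimensionality of the subspace linking the two areas.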
- Very excited about this new work from the omnipotent Owen, with me and Ashok Litwin-Kumar! Can we reconcile low- and high-dimensional activity in neural circuits by recognizing that these circuits ~multitask~? (Plausibly, yes 😊)
- 1/X Excited to present this preprint on multi-tasking, with @david-g-clark.bsky.social and Ashok Litwin-Kumar! Timely too, as “low-D manifold” has been trending again. (If you read thru the end, we escape Flatland and return to the glorious high-D world we deserve.) www.biorxiv.org/content/10.6...
- Reposted by David G. Clark: 📍Excited to share that our paper was selected as a Spotlight at #NeurIPS2025! arxiv.org/pdf/2410.03972 It started from a question I kept running into: When do RNNs trained on the same task converge/diverge in their solutions? 🧵⬇️
- Reposted by David G. Clark: Our work on training biophysical models with Jaxley is now out in @natmethods.nature.com. Led by @deismic.bsky.social, with @philipp.hertie.ai, @ppjgoncalves.bsky.social & @jakhmack.bsky.social et al. Paper: www.nature.com/articles/s41...
- 🆒
- Our next paper on comparing dynamical systems (with special interest in artificial and biological neural networks) is out!! Joint work with @annhuang42.bsky.social, as well as @satpreetsingh.bsky.social, @leokoz8.bsky.social, Ila Fiete, and @kanakarajanphd.bsky.social: arxiv.org/pdf/2510.25943
- Reposted by David G. Clark: The term “manifold” comes from “Mannigfaltigkeit,” which is German for “variety” or “multiplicity.” www.quantamagazine.org/what-is-a-ma...
- Reposted by David G. Clark: This was a lot of fun! From my side, it started with a technical Q: what's the relation between two-side cavity and path integrals? Turns out it's a fluctuation correction - and amazingly, this also enables the "O(N) rank" theory by @david-g-clark.bsky.social and @omarschall.bsky.social. 🤯
- Reposted by David G. Clark: Can confirm this was a fun project! My favorite takeaway is that the (low-but-extensive) rank of a network can be used as a knob for controlling dimensionality while leaving single-neuron properties unchanged.
- Reposted by David G. Clark: Simultaneous detection and estimation in olfactory sensing biorxiv.org/content/10.1101/202…
- Now in PRX: Theory linking connectivity structure to collective activity in nonlinear RNNs! For neuro fans: connectivity structure can be invisible in single neurons but shape population activity. For low-rank RNN fans: a theory of rank = O(N). For physics fans: fluctuations around the DMFT saddle ⇒ dimension of activity.
- For fans, like me, of working with @omarschall.bsky.social, @avm.bsky.social, and Ashok Litwin-Kumar: a great time!
- Reposted by David G. Clark: I did a Q&A with Quanta about interpretability and training dynamics! I got to talk about a bunch of research hobby horses and how I got into them.
- Reposted by David G. Clark: I’m super excited to finally put my recent work with @behrenstimb.bsky.social on bioRxiv, where we develop a new mechanistic theory of how PFC structures adaptive behaviour using attractor dynamics in space and time! www.biorxiv.org/content/10.1...
- Reposted by David G. Clark: 🎉 "High-dimensional neuronal activity from low-dimensional latent dynamics: a solvable model" will be presented as an oral at #NeurIPS2025 🎉 Feeling very grateful that reviewers and chairs appreciated concise mathematical explanations, in this age of big models. www.biorxiv.org/content/10.1... 1/2
- A great reading list for historical+recent RNN theory
- Just got back from a great summer school at Sapienza University sites.google.com/view/math-hi... where I gave a short course on Dynamics and Learning in RNNs. I compiled a (very biased) list of recommended readings on the subject, for anyone interested: aleingrosso.github.io/_pages/2025_...
- One for the books
- Motor cortex flexibly deploys a high-dimensional repertoire of subskills biorxiv.org/content/10.1101/202…
- Reposted by David G. Clark: I'm excited to share that my new postdoctoral position is going so well that I submitted a new paper at the end of my first week! www.biorxiv.org/content/10.1... A thread below
- (1/26) Excited to share a new preprint led by grad student Albert Wakhloo, with me and Larry Abbott: "Associative synaptic plasticity creates dynamic persistent activity." www.biorxiv.org/content/10.1...
- (2/26) Most neural network models treat synapses as static, but in actual neural circuits, neuronal and synaptic dynamics are tightly coupled. We show that this coupling enables a novel form of dynamic memory.
- (3/26) We study a nonlinear recurrent-network model, introduced by me and Abbott (PRX, 2024), where synaptic weights fluctuate around fixed random baselines via ongoing Hebbian plasticity. The static and plastic components of the weight matrix are denoted J and A(t), with J_ij ~ N(0, g²/N). (A toy simulation in this spirit is sketched after the thread.)
- (26/26) Finally, one more HUGE shoutout to Albert Wakhloo for conceiving, calculating, and charting our way through this fascinating project. (Link, again: www.biorxiv.org/content/10.1...)
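As a hedged illustration of the coupled neuronal-synaptic dynamics described in (3/26), here is a toy simulation; the equations, scalings, and parameters are my own illustrative choices, not the preprint's exact model.

```python
import numpy as np

rng = np.random.default_rng(1)
N, g = 200, 2.0                       # network size, coupling strength
dt, tau_x, tau_A, beta = 0.01, 1.0, 10.0, 1.0

J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))  # static baseline, J_ij ~ N(0, g^2/N)
A = np.zeros((N, N))                              # plastic component
x = rng.standard_normal(N)                        # neuronal state

for _ in range(5000):
    r = np.tanh(x)                                         # firing rates
    x += (dt / tau_x) * (-x + (J + A) @ r)                 # neuronal dynamics
    A += (dt / tau_A) * (-A + (beta / N) * np.outer(r, r))  # decaying Hebbian plasticity
```

The decay term keeps A(t) fluctuating around zero, so the full weight matrix J + A(t) hovers around its fixed random baseline, matching the setup described in the thread.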
- Wanted to share a new version (much cleaner!) of a preprint on how connectivity structure shapes collective dynamics in nonlinear RNNs. Neural circuits have highly non-i.i.d. connectivity (e.g., rapidly decaying singular values, structured singular-vector overlaps), unlike classical random RNN models.
- We introduce the "random-mode model," a random-matrix ensemble similar to an SVD that enables control of both the spectrum and mode overlaps. The key distinction from well-studied low-rank RNNs (Ostojic et al.) is that we use extensive rank scaling (the number of modes scales with network size). (A toy construction is sketched after the thread.)
- This extensive rank scaling agrees with recent connectome data, and it enables modeling of more sophisticated, higher-dimensional dynamics and computations, in line with recent large-scale neural recordings, than intensive-rank models, which produce correspondingly low-dimensional activity.
- Such high-dimensional, highly structured connectivity and dynamics (that, somehow, ultimately underlie task performance) are, I think, a main frontier of theoretical neuroscience research. This work constitutes a step in this direction. (Link, again: arxiv.org/abs/2409.01969)
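To make "extensive rank" concrete, here is a hedged toy construction of an SVD-like connectivity ensemble with a controlled spectrum and controlled mode overlaps; the names and scalings are my illustration, and the paper's random-mode model may differ in its details.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 400
R = N // 4                                    # extensive rank: mode count scales with N
s = 2.0 * np.arange(1, R + 1, dtype=float) ** -0.5  # rapidly decaying singular values

U = rng.standard_normal((N, R)) / np.sqrt(N)  # left modes
V = rng.standard_normal((N, R)) / np.sqrt(N)  # right modes
rho = 0.3                                     # knob for left/right mode overlaps
V = rho * U + np.sqrt(1 - rho**2) * V

J = U @ np.diag(s) @ V.T                      # connectivity with chosen spectrum and overlaps
```

Doubling N while holding R/N fixed preserves the shape of the spectrum, which is what distinguishes this regime from fixed (intensive) rank.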
- Reposted by David G. Clark: Very happy about my former mentor Sara Solla having received the Valentin Braitenberg Award for her lifelong contributions to computational neuroscience! Sara will be giving a lecture at the upcoming @bernsteinneuro.bsky.social meeting which you shouldn’t miss. bernstein-network.de/en/newsroom/...
- Reposted by David G. Clark: Coming March 17, 2026! Just got my advance copy of Emergence — a memoir about growing up in group homes and somehow ending up in neuroscience and AI. It’s personal, it’s scientific, and it’s been a wild thing to write. Grateful and excited to share it soon.
- Cool!
- Finally up on bioRxiv! A new study with @jdrugowitsch.bsky.social where we do a deep dive into the coding benefits of receptive field heterogeneity and come up with new ways to measure receptive fields! www.biorxiv.org/content/10.1... [1/n]
- Reposted by David G. Clark: #KempnerInstitute research fellow @andykeller.bsky.social and coauthors Yue Song, Max Welling and Nicu Sebe have a new book out that introduces a framework for developing equivariant #AI & #neuroscience models. Read more: kempnerinstitute.harvard.edu/news/kempner... #NeuroAI
- Reposted by David G. Clark: When neurons change, but behavior doesn’t: Excitability changes driving representational drift. New preprint of work with Christian Machens: www.biorxiv.org/content/10.1...
- Reposted by David G. Clark: The summer schools at Les Houches are a magnificent tradition. I was honored to lecture there in 2023, and my notes are now published as "Ambitions for theory in the physics of life." #physics #physicsoflife scipost.org/SciPostPhysL...
- Reposted by David G. Clark: Trying to train RNNs in a biologically plausible (local) way? Well, try our new method using predictive alignment. Paper just out in Nat. Commun. Toshitake Asabuki deserves all the credit! www.nature.com/articles/s41...
- Reposted by David G. Clark: New in the #DeeperLearningBlog: #KempnerInstitute research fellow @andykeller.bsky.social introduces the first flow equivariant neural networks, which reflect motion symmetries, greatly enhancing generalization and sequence modeling. bit.ly/451fQ48 #AI #NeuroAI
- Excited!
- Thrilled to announce the 2025 recipients of #KempnerInstitute Research Fellowships: Elom Amemastro, Ruojin Cai, David Clark, Alexandru Damian, William Dorrell, Mark Goldstein, Richard Hakim, Hadas Orgad, Gizem Ozdil, Gabriel Poesia, & Greta Tuckute! bit.ly/3IpzD5E
- 🆒
- Excited to share new work @icmlconf.bsky.social by Loek van Rossem exploring the development of computational algorithms in recurrent neural networks. Hear it live tomorrow, Oral 1D, Tues 15 Jul West Exhibition Hall C: icml.cc/virtual/2025... Paper: openreview.net/forum?id=3go... (1/11)