I’m super excited to finally put my recent work with
@behrenstimb.bsky.social on bioRxiv, where we develop a new mechanistic theory of how PFC structures adaptive behaviour using attractor dynamics in space and time!
www.biorxiv.org/content/10.1...
It is increasingly clear from recent work in mice and monkeys that prefrontal cortex solves sequence memory tasks by using different populations of neurons to represent different elements of the sequence.
2/8
We show that these representations can do much more than that. If you connect the different neural populations the right way, the resulting attractor network can infer the future! This allows the network to solve complex problems like planning, using representations that we know exist in PFC.
3/8
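To make the "infer the future" idea concrete, here is a minimal toy sketch (my illustration, not the model from the paper): sequences are stored as single attractor states in a Hopfield-style network in which each group of units codes one step of the sequence, so cueing the network with only the early steps lets pattern completion fill in the later ones. The sizes, example sequences, and Hebbian storage rule below are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
T, K = 5, 8                        # sequence length, possible states per step
N = T * K                          # one group of K units per time step

def encode(seq):
    """+/-1 code with a separate unit group per step: +1 marks that step's state."""
    v = -np.ones(N)
    for t, s in enumerate(seq):
        v[t * K + s] = 1.0
    return v

# Store a few example sequences with a Hebbian outer-product rule
sequences = [rng.integers(K, size=T) for _ in range(3)]
patterns = np.stack([encode(s) for s in sequences])
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)

# Cue the network with only the first two steps of sequence 0
state = np.zeros(N)
state[: 2 * K] = patterns[0, : 2 * K]

for _ in range(20):                # let the attractor dynamics settle
    state = np.where(W @ state >= 0, 1.0, -1.0)

decoded = [int(np.argmax(state[t * K:(t + 1) * K])) for t in range(T)]
print("stored :", [int(s) for s in sequences[0]])
print("decoded:", decoded)         # later steps are filled in by pattern completion
```

The point of the sketch is simply that a code with distinct populations for different sequence elements, plus the right recurrent connectivity between them, turns prediction of the future into pattern completion across time.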
It turns out the resulting 'spacetime attractor' (STA) network is particularly good at tasks where the environment changes on a fast timescale – and these are exactly the types of behaviour that we need PFC for!
4/8
RNNs trained to solve such 'PFC-like' tasks learn a solution that exactly mirrors the spacetime attractor in representation, connectivity, and dynamics. They also reveal an elegant mechanism for rapid adaptation of a 'world model' to changing environments, without the need for plasticity!
5/8
What is most exciting to us is that the STA solves these tasks using attractor dynamics that resemble how visual cortex infers 'missing edges' from partial inputs, how language cortex infers meaning even if we miss a word or two, and how navigation circuits infer orientation and location.
6/8
We think PFC structures adaptive behaviour using these same principles. If true, it could provide a path towards a unified mechanistic understanding of cortical computations from the sensory periphery to high-level cognition!
7/8
This has been a super fun project, and I’m very excited for the coming years, when we will test some of the ideas experimentally together with our many excellent colleagues at the
@sainsburywellcome.bsky.social and Oxford!
8/8
Finally, a big thanks to all of our co-authors: Peter Doohan,
@mathiassablemeyer.bsky.social,
@sandra-neuro.bsky.social,
@alonbaram.bsky.social, and Thomas Akam + everyone else who contributed through discussions, ideas, and feedback!