Sarthak Chandra
Interested in neuroscience, development and dynamical systems | Faculty member @ ICTS | Previously: @MIT, @UMD, @IITK
- 1/ Our paper appeared in @Nature today! www.nature.com/articles/s41... w/ Fiete Lab and @khonamikail.bsky.social. It explains the emergence of multiple grid cell modules, with an excellent match to data, via a novel mechanism that applies across systems from development to ecosystems. 🧵👇
- 2/ The work introduces “Peak Selection”: a general mechanism by which local interactions and smooth gradients give rise to global modules. We first focus on a classic example of modularity: grid cells in the brain.
- 3/ Various measured cellular and circuit properties vary smoothly across grid cells. Yet grid cells are organized into discrete modules with different spatial periods. How does discrete organization arise from smooth gradients?
- 4/ Two classic ideas for the emergence of structure in biology: •Positional hypothesis: genes read out a smooth gradient through discrete thresholds, but this presupposes discrete gene expression •Turing hypothesis: local interactions drive patterning, but at a single scale. Grid modules, however, are multiscale and arise from presumably continuous precursors.
- 5/ Grid modularity from Peak Selection! •Two forms of local interactions: one varying smoothly in scale across the sheet, the other held fixed. •Together these spontaneously generate local patterning and global modularity! (A toy sketch of this setup appears after the thread.)
- 5b/ (cont’d) Grid modularity from Peak Selection! •Discrete jumps in grid period without discrete precursors. •Novel period-ratio prediction: adjacent period ratios vary as ratios of successive integers (3/2, 4/3, 5/4, …). •Excellent agreement with data (R^2 ~0.99)! (Toy sketch below.)
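To make 5b/ concrete, here is a minimal sketch (mine, not the paper's analysis code) of the predicted ladder of module periods. Only the successive-integer ratio sequence comes from the thread; the 40 cm base period and the number of modules are placeholder assumptions.

```python
# Illustrative only: build the ladder of module periods implied by the
# prediction that adjacent period ratios are ratios of successive integers
# (3/2, 4/3, 5/4, ...). Base period and module count are made up.
from fractions import Fraction

base_period_cm = 40.0                                   # hypothetical module-1 period
ratios = [Fraction(k + 1, k) for k in range(2, 6)]      # 3/2, 4/3, 5/4, 6/5

periods = [base_period_cm]
for r in ratios:
    periods.append(periods[-1] * float(r))              # next module's period

for i, p in enumerate(periods, start=1):
    print(f"module {i}: {p:6.1f} cm")
for r in ratios:
    print(f"adjacent ratio: {r} = {float(r):.3f}")
```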
- 6/ Central results and predictions: •Nearly **any** interaction shape can produce grid cell patterning (Mexican-hat kernels not needed!) •Grid cells involve two scales of interactions, one spatially varying and one fixed. •Functional modularity can emerge without molecular modularity.
- 6b/ (cont’d) Central results and predictions: •Self-scaling with organism size •Topologically robust: insensitive to almost all parameter variations and activity perturbations; also robust to weight heterogeneity (no need for exactly symmetric interactions in CANs)!
- 7/ Peak Selection applies broadly for module emergence: the same mechanism can also explain: 🌱 Emergent ecological niches 🐠 Coral spawning synchrony 🤖 Modularity in optimization & learning
- 8/8 TL;DR: Peak Selection is a novel mechanism for the emergence of modularity in a variety of systems. Applied to grid cells, it makes testable predictions at the molecular, circuit, and functional levels, and matches observed period ratios better than any existing model!
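And the toy sketch promised in 5/: a 1-D sheet of rate neurons with purely local interactions built from one fixed scale (excitation) and one smoothly varying scale (inhibition), echoing the two-scale ingredient in 6/. Every parameter value here is my assumption; the code only sets up these ingredients and reports the spacings between activity bumps.

```python
# Toy caricature of the Peak Selection setup (not the paper's model): local
# interactions with one fixed scale (excitation) and one smoothly varying
# scale (inhibition) on a 1-D sheet; we then inspect the bump spacings.
import numpy as np

rng = np.random.default_rng(0)
n = 400
x = np.arange(n, dtype=float)
sigma = np.linspace(4.0, 12.0, n)          # smooth gradient in inhibition scale

W = np.zeros((n, n))
for j in range(n):
    d = np.abs(x - x[j])
    # fixed-scale excitation minus smoothly-varying-scale inhibition
    W[:, j] = np.exp(-d**2 / (2 * 2.0**2)) - 0.9 * np.exp(-d**2 / (2 * sigma[j]**2))

r = 0.1 * rng.random(n)                    # small random initial activity
for _ in range(2000):                      # Euler relaxation of rate dynamics
    r += 0.05 * (-r + np.clip(W @ r + 1.0, 0.0, 5.0))

peaks = [i for i in range(1, n - 1)
         if r[i] > r[i - 1] and r[i] >= r[i + 1] and r[i] > 0.1 * r.max()]
print("bump spacings along the gradient:", np.diff(peaks))
```

Whether the spacings lock into plateaus separated by integer-ratio jumps depends on parameters; establishing that is the paper's contribution, not this toy's.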
- 1/ Super-excited to share our new work, “Episodic and associative memory from spatial scaffolds in the hippocampus”, which just appeared in Nature! www.nature.com/articles/s41... Key insights and ideas 👇 #tweeprint
- 2/ Why are spatial & episodic memory co-localized in the hippocampus? How do memory palaces allow memorization of whole decks of cards? Our model, VectorHaSH, shows how the hippocampus, together with grid cells, integrates these roles to support memory storage, sequence recall, and memory palaces 🏰
- 3/ Key ideas 🔑 Hippocampal and grid cells create a fixed "scaffold" that serves as a robust, error-correcting memory foundation. External inputs are "hooked" onto the scaffold through heteroassociation. Low-dimensional transitions in grid space enable large sequence memory. (Toy sketch below.)
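A stripped-down sketch of the scaffold idea in 3/ (the module periods, layer sizes, and one-shot Hebbian rule are my illustrative choices, not necessarily the paper's): coprime grid modules define combinatorially many joint phase states, and sensory inputs are hooked onto those states by heteroassociation.

```python
# Illustrative sketch (my choices, not the paper's code): grid modules with
# coprime periods define prod(periods) joint phase states; sensory inputs
# are "hooked" onto these scaffold states by Hebbian heteroassociation.
import numpy as np

rng = np.random.default_rng(1)
periods = [3, 4, 5]                         # coprime modules -> 3*4*5 = 60 states
n_grid = sum(periods)                       # one one-hot phase vector per module

def scaffold(t):
    """Joint grid state for integer position t: one-hot phase in each module."""
    v, off = np.zeros(n_grid), 0
    for p in periods:
        v[off + (t % p)] = 1.0
        off += p
    return v

n_sense, n_items = 50, 20
sensory = rng.standard_normal((n_items, n_sense))       # inputs to memorize

# One-shot heteroassociation: scaffold state -> sensory pattern.
W = sum(np.outer(sensory[t], scaffold(t)) for t in range(n_items))

corr = [np.corrcoef(W @ scaffold(t), sensory[t])[0, 1] for t in range(n_items)]
print(f"{np.prod(periods)} scaffold states from only {n_grid} grid cells")
print("mean recall correlation:", round(float(np.mean(corr)), 2))
```

The joint-state count grows as the product of the module periods while the cell count grows only as their sum; that is the "exponentially many large-basin fixed points" claim in post 5/ below.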
- 4/ VectorHaSH supports: (1) Item memory, avoiding the memory cliffs of Hopfield nets (2) Spatial memory, learning landmark-location associations over many maps 🌍 & minimizing catastrophic forgetting
- 4a/ (cont’d) (3) Episodic memory, using low-dimensional transitions in grid space to support massive sequence capacity 🎞️ (4) Method of Loci, explaining the paradox of why adding to the memory task (associating items with spatial locations) boosts performance 🏰
- 5/ Memory without cliffs? Hopfield and other models crash 📉 after reaching capacity, completely losing all previous memories. VectorHaSH avoids this by first using grid cells to create a scaffold of exponentially many large-basin fixed points.
- 5a/ (cont’d) VectorHaSH then stores memories by heteroassociating inputs with these scaffold states, so that memory detail degrades gracefully with the number of stored memories, across a vast number of inputs.
- 6/ Spatial memory at scale? VectorHaSH links scaffold states to sensory cues via the hippocampus. This yields independent, non-interfering learned maps (landmark-location associations) across multiple rooms. The metric grid structure supports zero-shot inference along novel paths 🚶‍♀️
- 7/ How does VectorHaSH implement efficient episodic/sequence memory? Conventional models recall entire high-dimensional states ➡️ and fail quickly. VectorHaSH instead reduces the problem to recalling low-dimensional velocity vectors on the scaffold. Result: long sequences stored & recalled with precision! 🔥 (Toy sketch below.)
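A toy, dictionary-based rendering of the sequence idea in 7/ (the paper's network presumably does this with neural heteroassociation; the dict, the 1-D positions, and the velocity values are my stand-ins): store only the low-dimensional velocity at each scaffold state, then replay by path integration.

```python
# Toy rendering of sequence memory via low-dim velocities (a dict stands in
# for neural heteroassociation; positions and velocities are made up).
periods = [3, 4, 5]

def phases(t):
    """Joint grid phase for integer position t (the scaffold state label)."""
    return tuple(t % p for p in periods)

velocities = [1, 2, 1, 3, 1, 1, 2, 1]       # the sequence, stored step by step
store, t = {}, 0                            # scaffold state -> next velocity
for v in velocities:
    store[phases(t)] = v
    t += v

t, replay = 0, [0]                          # replay by integrating velocities
while phases(t) in store:
    t += store[phases(t)]
    replay.append(t)
print("replayed positions:", replay)        # recovers the stored trajectory
```

Each step stores one small integer instead of an entire high-dimensional pattern, which is the source of the capacity gain claimed in 7/.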
- 8/ 🏰 Memory palaces explained! Why does imagining a spatial walk supercharge memory? VectorHaSH shows how recall of familiar locations acts as a secondary scaffold. Result: even approximate recall of locations reliably supports one-shot, high-fidelity memory of arbitrary items. 💡
- 9/ Experimental alignment 🧠🔬 VectorHaSH mirrors entorhinal-hippocampal phenomena: grid cells show stable periodicity, rapid phase resets, and robust velocity integration 🌐 It also recreates the correlation statistics of grid cells and place cells 📊
- 9b/ (cont’d) Hippocampal cells remap by direction/context 📍➡️⬅️ The model also captures memory consolidation of multiple memory traces 📚 It thus bridges experiments and theory!
- 10/10 🚀 TL;DR: VectorHaSH provides a unifying framework for efficient spatial, episodic, and associative memory in the hippocampus. Curious? Read the paper: www.nature.com/articles/s41... #Memory #Neuroscience #Hippocampus
- (Reply to a reader:) Thanks Sreeparna!
- (Reply to a question about recency/primacy effects:) Thanks! Yes, in its current form it doesn't have recency or primacy effects. We have some thoughts on including recency with some weight decay, to reduce the importance of older memories. But how to build in primacy and other forms of memory salience in this model is something to think more about!
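The weight-decay idea floated in that reply is easy to caricature (my illustration, not anything from the paper): shrink the old heteroassociative weights by gamma < 1 before each new memory is written, so older traces fade and recall acquires a recency gradient.

```python
# Toy recency via weight decay (my illustration, not the paper's): old
# heteroassociative traces are shrunk by gamma before each new write.
import numpy as np

rng = np.random.default_rng(2)
gamma, n_sense, n_grid, n_mem = 0.9, 200, 40, 30
pairs = [(rng.standard_normal(n_sense), rng.standard_normal(n_grid))
         for _ in range(n_mem)]            # (sensory, scaffold-like) pairs

W = np.zeros((n_sense, n_grid))
for s, g in pairs:
    W = gamma * W + np.outer(s, g)         # older memories fade geometrically

def recall_corr(i):
    s, g = pairs[i]
    return np.corrcoef(W @ g, s)[0, 1]

print("oldest:", round(recall_corr(0), 2), "newest:", round(recall_corr(-1), 2))
```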