DurstewitzLab
Scientific AI/machine learning, dynamical systems (reconstruction), generative surrogate models of brains & behavior, applications in neuroscience & mental health
- Tomorrow Christoph will present DynaMix, the first foundation model for dynamical systems reconstruction, at #NeurIPS2025 Exhibit Hall C,D,E #2303
- Unlike current AI systems, animals can quickly and flexibly adapt to changing environments. This is the topic of our new perspective in Nature MI (rdcu.be/eSeif), where we relate dynamical and plasticity mechanisms in the brain to in-context and continual learning in AI. #NeuroAI
- Revised version of our #NeurIPS2025 paper with full code base in Julia & Python now online, see arxiv.org/abs/2505.13192
- Our #AI #DynamicalSystems #FoundationModel DynaMix was accepted to #NeurIPS2025 with outstanding reviews (6555) – first model which can *zero-shot*, w/o any fine-tuning, forecast the *long-term statistics* of time series provided a context. Test it on #HuggingFace: huggingface.co/spaces/Durst...
- Despite being extremely lightweight (only 0.1% of the params, and 0.6% of the training corpus size, of its closest competitor), DynaMix also outperforms major TS foundation models like Chronos variants on real-world TS forecasting, with minimal inference times (0.2%) ...
- We have openings for several fully-funded positions (PhD & PostDoc) at the intersection of AI/ML, dynamical systems, and neuroscience within a BMFTR-funded Neuro-AI consortium, at Heidelberg University & Central Institute of Mental Health: www.einzigartigwir.de/en/job-offer... More info below ...
- Relevant publications: www.nature.com/articles/s41... openreview.net/pdf?id=Vp2OA... proceedings.mlr.press/v235/brenner... www.nature.com/articles/s41...
- Got provisional approval for 2 major grants in Neuro-AI & Dynamical Systems Reconstruction, on learning & inference in non-stationary environments, out-of-domain generalization, and DS foundation models. To all AI/math/DS enthusiasts: Expect job announcements (PhD/PostDoc) soon! Feel free to get in touch.
- We wrote a little #NeuroAI piece about in-context learning & neural dynamics vs. continual learning & plasticity, both mechanisms to flexibly adapt to changing environments: arxiv.org/abs/2507.02103 We relate this to non-stationary rule learning tasks with rapid performance jumps. Feedback welcome!
- Happy to discuss our work on parsimonious & mathematically tractable RNNs for dynamical systems reconstruction next week at cns2025florence.sched.com/event/1z9Mt/...
- How do animals learn new rules? By systematically testing different behavioral strategies, guided by selective attention to rule-relevant cues: rdcu.be/etlRV Akin to in-context learning in AI, strategy selection depends on the animals' "training set" (prior experience), with similar representations in rats & humans.
- Fantastic work by Florian Bähner, Hazem Toutounji, Tzvetan Popov and many others - I'm just the person advertising!
- I’m really looking forward to this! In wonderful Pisa!
- Into population dynamics? Coming to #CNS2025 but not quite ready to head home? Come join us at the Symposium on "Neural Population Dynamics and Latent Representations"! 🧠 📆 July 10th 📍 Scuola Superiore Sant’Anna, Pisa (and online) 👉 Free registration: neurobridge-tne.github.io #compneuro
- Just heading back from a fantastic workshop on neural dynamics at Gatsby/ London, organized by Tatiana Engel, Bruno Averbeck, & Peter Latham. Enjoyed seeing so many old friends, Memming Park, Carlos Brody, Wulfram Gerstner, Nicolas Brunel & many others … Discussed our recent DS foundation models …
- Can time series (TS) #FoundationModels (FM) like Chronos zero-shot generalize to unseen #DynamicalSystems (DS)? No, they cannot! But *DynaMix* can, the first TS/DS FM based on principles of DS reconstruction, capturing the long-term evolution of out-of-domain DS: arxiv.org/pdf/2505.131... (1/6)
- Unlike TS FMs, DynaMix exhibits #ZeroShotLearning of long-term stats of unseen DS, incl. attractor geometry & power spectrum, w/o *any* re-training, just from a context signal. It does so with only 0.1% of the parameters of Chronos & 10x faster inference times than the closest competitor. (2/6)
- It often even outperforms TS FMs on forecasting diverse empirical time series, like weather, traffic, or medical data, typically used to train TS FMs. This is surprising, because DynaMix's training corpus consists *solely* of simulated limit cycles & chaotic systems, no empirical data at all! (3/6)
- We dive a bit into the reasons why current time series FMs not trained for DS reconstruction fail, and conclude that a DS perspective on time series forecasting & models may help to advance the #TimeSeriesAnalysis field. (6/6)
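For concreteness: by "long-term statistics" we mean properties such as the power spectrum or attractor geometry, as in post 2/6 above. Below is a minimal Python sketch of one such check, assuming two hypothetical long 1D trajectories `x_true` and `x_gen` (ground truth vs. model-generated); this is our illustration, not the paper's evaluation code.

```python
import numpy as np
from scipy.signal import welch

def spectrum_distance(x_true, x_gen, fs=1.0):
    """Hellinger distance between the normalized power spectra of two
    trajectories: a measure of whether a generated trajectory reproduces
    the temporal structure of the ground truth, independent of any
    pointwise alignment between the two."""
    f, p_true = welch(x_true, fs=fs, nperseg=1024)
    _, p_gen = welch(x_gen, fs=fs, nperseg=1024)
    p_true = p_true / p_true.sum()  # treat spectra as distributions
    p_gen = p_gen / p_gen.sum()
    return np.sqrt(0.5 * np.sum((np.sqrt(p_true) - np.sqrt(p_gen)) ** 2))
```

A pointwise MSE between chaotic trajectories can be huge even when this distance is essentially zero, which is exactly the regime DSR evaluation has to handle.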
- I'm presenting our lab's work on *learning generative dynamical systems models from multi-modal and multi-subject data* in the world-wide theoretical neuroscience seminar Wed 23rd, 11am ET: www.wwtns.online --> incl. recent work on building foundation models for #dynamical-systems reconstruction #AI 🧪
- Our revised #iclr2025 paper and codebase for an architecture for foundation models for dynamical systems reconstruction are now online: openreview.net/pdf?id=Vp2OA... ... includes additional examples of how this may be harvested for identifying drivers (control parameters) of non-stationary processes.
- Toward interpretable #AI foundation models for #DynamicalSystems reconstruction: Our paper on transfer & few-shot learning for dynamical systems just got accepted for #ICLR2025 ! Previous version: arxiv.org/pdf/2410.04814; strongly updated version will be available soon ... (1/4)
- We show applications like transfer & few-shot learning, but most interestingly perhaps, subject/system-specific features were often linearly related to control parameters of the underlying dynamical system the model was trained on … (2/4)
- This gives rise to an interpretable latent feature space, where datasets with similar dynamics cluster. Intriguingly, this clustering according to *dynamical systems features* led to much better separation of groups than could be achieved by more traditional time series features. (3/4)
- Spearheaded by Manuel Brenner & Elias Weber, together with Georgia Koppe. (4/4)
- That paper discusses an important issue for RNNs as used in neuroscience. But we would argue that many RNN approaches do not truly reconstruct DS, for which we also demand agreement in long-term statistics, attractor geometry, and generative performance (esp. in chaotic systems, MSE-type statistics can be misleading).
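To see why MSE can mislead in chaotic systems, a small self-contained illustration (ours, with standard Lorenz parameters): two trajectories started 1e-9 apart decorrelate quickly, so their pointwise MSE becomes large, yet they trace out the same attractor and thus share long-term statistics.

```python
import numpy as np
from scipy.integrate import solve_ivp

def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return [sigma * (y - x), x * (rho - z), x * y - beta * z]

t = np.linspace(0, 100, 20000)
a = solve_ivp(lorenz, (0, 100), [1.0, 1.0, 1.0], t_eval=t).y
b = solve_ivp(lorenz, (0, 100), [1.0, 1.0, 1.0 + 1e-9], t_eval=t).y

mse = np.mean((a - b) ** 2)  # large: chaos amplifies the 1e-9 offset

# crude geometry check: overlap of state-space occupation histograms
ha, edges = np.histogramdd(a.T, bins=20)
hb, _ = np.histogramdd(b.T, bins=edges)
overlap = np.minimum(ha / ha.sum(), hb / hb.sum()).sum()
print(f"MSE = {mse:.1f}, state-space overlap = {overlap:.2f}")  # overlap stays high
```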
- Honest question, not meant as inflammatory: what do you think of this? bsky.app/profile/bqia... I am very interested in using DSR with our data, but I’m not sure what to make of it
- Also important to keep in mind that different network architectures may be compatible with the same *dynamical mechanisms*, so one needs to be careful with the level at which one seeks interpretation & insight.
- Now may be a good time to introduce our group on bsky with some of our contributions to dynamical systems reconstruction (DSR) from the past year. By DSR we mean learning a *generative surrogate model* of a dynamical process from TS data which reproduces the full attractor & generalizes to new initial conditions (1/6)
- I start with *generalized teacher forcing (GTF)*, which overcomes the exploding/vanishing gradient problem *in training* for any RNN, enabling DSR on highly chaotic and complex real-world data: proceedings.mlr.press/v202/hess23a... Most other DSR work considers only simulated benchmarks & struggles with real data. (2/6)
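The core idea of GTF in a few lines, as a sketch only: `rnn` (any latent-to-latent module) and `z_data` (data-inferred latent states, e.g. from an encoder) are hypothetical stand-ins, and the paper derives a principled choice of alpha rather than hand-picking it.

```python
import torch

def gtf_rollout(rnn, z0, z_data, alpha):
    """Roll out an RNN with generalized teacher forcing (GTF): at each
    step the latent state is pulled toward a data-inferred state by a
    convex combination, which tames exploding gradients for suitable
    alpha. alpha = 1 recovers classical teacher forcing, alpha = 0 a
    fully free-running (generative) rollout."""
    z, out = z0, []
    for t in range(z_data.shape[0]):
        z = rnn(z)                                  # model's one-step prediction
        out.append(z)
        z = alpha * z_data[t] + (1.0 - alpha) * z   # GTF interpolation
    return torch.stack(out)

# Toy usage with hypothetical stand-ins for the model and encoder output:
rnn = torch.nn.Sequential(torch.nn.Linear(8, 8), torch.nn.Tanh())
z_data = torch.randn(100, 8)          # stand-in for data-inferred latents
traj = gtf_rollout(rnn, torch.zeros(8), z_data, alpha=0.2)
loss = ((traj - z_data) ** 2).mean()  # backprop through this as usual
loss.backward()
```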
- In multimodal teacher forcing we extended this idea to combinations of arbitrary data modalities, illustrating that chaotic attractors can even be learned from just a symbolic encoding, and providing a common dynamical embedding for different modalities: proceedings.mlr.press/v235/brenner... (3/6)
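What "just a symbolic encoding" might look like in the simplest case (a toy illustration of ours, not the paper's actual encoding): quantile-bin a continuous signal into a small alphabet and keep only the symbol sequence.

```python
import numpy as np

def symbolize(x, n_symbols=16):
    """Map a 1D continuous signal to integer symbols by quantile binning."""
    edges = np.quantile(x, np.linspace(0, 1, n_symbols + 1)[1:-1])
    return np.digitize(x, edges)  # one symbol in {0,...,n_symbols-1} per step

x = np.sin(np.linspace(0, 60, 3000)) + 0.1 * np.random.randn(3000)
s = symbolize(x)  # a purely symbolic view of the trajectory
```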
- In proceedings.neurips.cc/paper_files/... we provided a highly efficient (often linear-time) algorithm for precisely locating attractors in ReLU-based RNNs. We prove that besides the EVGP, bifurcations are a major obstacle in RNN training, but are provably alleviated by training techniques like GTF. (6/6)
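The closed-form trick underlying such fixed-point searches, in deliberately brute-force form (our sketch; the paper's algorithm avoids the exponential enumeration over regions): in a piecewise-linear RNN z_{t+1} = A z_t + W relu(z_t) + h, each ReLU on/off pattern defines a linear map whose fixed point is available analytically, and it counts only if it actually lies in its own region.

```python
import numpy as np
from itertools import product

def fixed_points(A, W, h):
    """Enumerate all fixed points of z_{t+1} = A z + W relu(z) + h by
    checking every ReLU on/off pattern D: within a region, relu(z) = D z,
    so a fixed point solves (I - A - W D) z = h in closed form."""
    M = len(h)
    fps = []
    for pattern in product([0.0, 1.0], repeat=M):  # 2^M linear regions
        D = np.diag(pattern)
        try:
            z = np.linalg.solve(np.eye(M) - A - W @ D, h)
        except np.linalg.LinAlgError:
            continue  # singular: no isolated fixed point in this region
        if np.allclose((z > 0).astype(float), pattern):  # region check
            fps.append(z)
    return fps

rng = np.random.default_rng(0)
M = 6
A = np.diag(rng.uniform(0.2, 0.9, M))  # stable diagonal part
W = 0.5 * rng.standard_normal((M, M))  # random coupling
h = rng.standard_normal(M)
print(f"found {len(fixed_points(A, W, h))} fixed point(s)")
```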
- Good targets for foundation/inference models/LLMs …
- Can really recommend this excellent talk by Christoph Bergmeir at neurips.cc/virtual/2024... yesterday … on the inability of recent Transformer- & LLM-based time series models to beat even simple baselines *if you do the stats and testing right*! A lesson in careful statistical evaluation.
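In that spirit, the kind of simple baseline and scaled error such talks have in mind, as a hedged sketch (function names are our own):

```python
import numpy as np

def seasonal_naive(history, horizon, season):
    """Forecast by repeating the last full season of observations."""
    reps = int(np.ceil(horizon / season))
    return np.tile(history[-season:], reps)[:horizon]

def mase(y_true, y_pred, history, season):
    """Mean absolute scaled error: forecast errors scaled by the
    in-sample seasonal-naive error, so MASE < 1 beats the baseline."""
    scale = np.mean(np.abs(history[season:] - history[:-season]))
    return np.mean(np.abs(y_true - y_pred)) / scale
```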
- Don't miss out on our 2 @neuripsconf.bsky.social papers on dynamical systems reconstruction today & tomorrow: 1) neurips.cc/virtual/2024...
- Happy our team is among this year's recipients of the Samsung Global Research Outreach Awards! www.sait.samsung.co.kr/saithome/abo... We will take #DynamicalSystems reconstruction to the next level, large-scale – looking forward to the collaboration with the Samsung team!