- Now may be a good time to introduce our group on bsky with some of our contributions to dynamical systems reconstruction (DSR) from the past year. By DSR we mean learning a *generative surrogate model* of a dynamical process from time series data which reproduces the full attractor & generalizes to new initial conditions (1/6)
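To make the setting concrete, here is a minimal sketch of the DSR problem setup (not any specific model from this thread): a trajectory of the standard Lorenz-63 system serves as training data, and a successful surrogate rolled out from an unseen initial condition should still reproduce the same attractor. All function names and parameter values below are illustrative assumptions.

```python
# Minimal sketch of the DSR setup: given a time series from an (unknown) dynamical
# system, learn a surrogate that can be rolled out freely, reproduces the attractor
# geometry, and does so even from initial conditions never seen during training.
import numpy as np

def lorenz_rhs(x, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Standard Lorenz-63 vector field, used here only as a toy data generator."""
    return np.array([
        sigma * (x[1] - x[0]),
        x[0] * (rho - x[2]) - x[1],
        x[0] * x[1] - beta * x[2],
    ])

def simulate(x0, dt=0.01, steps=10_000):
    """Simple RK4 integration to produce an observed training time series."""
    xs = np.empty((steps, 3))
    x = np.asarray(x0, dtype=float)
    for t in range(steps):
        k1 = lorenz_rhs(x)
        k2 = lorenz_rhs(x + 0.5 * dt * k1)
        k3 = lorenz_rhs(x + 0.5 * dt * k2)
        k4 = lorenz_rhs(x + dt * k3)
        x = x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
        xs[t] = x
    return xs

train_data = simulate([1.0, 1.0, 1.0])    # observed time series for training
test_init = np.array([-5.0, 7.0, 20.0])   # unseen initial condition: a good DSR model,
# rolled out from test_init, should still settle onto (a reconstruction of) the same
# butterfly attractor, not merely fit the training trajectory point by point.
```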
- I start with *generalized teacher forcing (GTF)*, which overcomes the exploding/vanishing gradient problem *in training* for any RNN, enabling DSR on highly chaotic and complex real-world data: proceedings.mlr.press/v202/hess23a... Most other DSR work considers only simulated benchmarks & struggles with real data. (2/6)
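Roughly, the core idea of GTF is to feed the RNN, at each training step, a convex combination of its own latent state and a state inferred from the data, controlled by a single parameter alpha. The sketch below only illustrates this interpolation; the model class, how the data-inferred states are obtained, and how alpha is chosen or annealed follow the paper and are assumed here.

```python
# Sketch of generalized teacher forcing (GTF): the latent state fed into the next
# RNN step is a convex combination of the model's own state z_t and a state d_t
# inferred from the data. alpha = 1 is full teacher forcing, alpha = 0 is free-running.
import torch
import torch.nn as nn

class SimpleRNN(nn.Module):
    def __init__(self, latent_dim, obs_dim):
        super().__init__()
        self.step = nn.Linear(latent_dim, latent_dim)   # latent map z_{t+1} = f(z_t)
        self.readout = nn.Linear(latent_dim, obs_dim)   # observation model x_t = g(z_t)

    def forward(self, z):
        return torch.tanh(self.step(z))

def gtf_rollout(model, data_states, alpha):
    """One training rollout with GTF.

    data_states: (T, latent_dim) latent states inferred from the observations
    (in the paper via the observation model; here simply assumed to be given).
    """
    z = data_states[0]
    preds = []
    for t in range(1, data_states.shape[0]):
        z = model(z)                                    # free-running prediction
        preds.append(model.readout(z))
        z = alpha * data_states[t] + (1 - alpha) * z    # GTF: pull the state toward the data
    return torch.stack(preds)
```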
- In multimodal TF we extended this idea to combinations of arbitrary data modalities, illustrating that chaotic attractors can even be learned from just a symbolic encoding, and providing a common dynamical embedding for different modalities: proceedings.mlr.press/v235/brenner... (3/6)
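A hedged sketch of the multimodal idea: one shared latent dynamical system with separate, modality-specific observation models, e.g. a Gaussian-style head for continuous channels and a categorical head for a symbolic encoding. The actual architecture and training scheme are those of the paper; the layer choices and loss terms below are illustrative assumptions.

```python
# One shared latent state, several modality-specific readouts; whichever modalities
# are observed at a time step contribute their negative log-likelihood to the loss.
import torch
import torch.nn as nn

class MultimodalReadout(nn.Module):
    def __init__(self, latent_dim, cont_dim, n_symbols):
        super().__init__()
        self.cont_head = nn.Linear(latent_dim, cont_dim)    # continuous observations
        self.symb_head = nn.Linear(latent_dim, n_symbols)   # symbolic observations (logits)

    def loss(self, z, x_cont=None, x_symb=None):
        total = 0.0
        if x_cont is not None:
            # MSE corresponds to a Gaussian NLL up to constants
            total = total + nn.functional.mse_loss(self.cont_head(z), x_cont)
        if x_symb is not None:
            # categorical NLL for the symbolic encoding
            total = total + nn.functional.cross_entropy(self.symb_head(z), x_symb)
        return total
```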
- Our Almost-Linear RNN (AL-RNN) openreview.net/pdf?id=sEpSx... shows that simplicity is king, reducing the number of required nonlinearities to a bare minimum, e.g. learning Lorenz chaos with just 2 ReLUs! The AL-RNN has a direct relation to symbolic dynamics and strongly facilitates mathematical analysis of trained models. (4/6)
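A minimal sketch of an almost-linear RNN step in this spirit: only the last few latent units pass through a ReLU, all others stay linear. The exact parameterization (e.g. the structure of the linear part and the initialization) is an assumption here, not the paper's model.

```python
# Almost-linear RNN step: z_{t+1} = A z_t + W phi(z_t) + h, where phi applies a ReLU
# to only the last n_relu latent units, so the number of nonlinearities can be
# reduced to a bare minimum (e.g. 2 for Lorenz-like dynamics).
import torch
import torch.nn as nn

class ALRNNCell(nn.Module):
    def __init__(self, latent_dim, n_relu):
        super().__init__()
        self.A = nn.Parameter(torch.eye(latent_dim) * 0.9)              # linear part (assumed init)
        self.W = nn.Parameter(torch.randn(latent_dim, latent_dim) * 0.1)
        self.h = nn.Parameter(torch.zeros(latent_dim))
        self.n_relu = n_relu

    def forward(self, z):
        lin, nonlin = z[..., :-self.n_relu], z[..., -self.n_relu:]
        phi = torch.cat([lin, torch.relu(nonlin)], dim=-1)   # ReLU on the last n_relu units only
        return z @ self.A.T + phi @ self.W.T + self.h
```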
- In proceedings.mlr.press/v235/goring2... we laid out a general theory of out-of-domain generalization (OODG) in DSR. We define OODG as the ability to predict dynamics in unseen dynamical regimes (basins of attraction) and prove that, in its most general form, this problem is intractable. (5/6)
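A toy illustration (not from the paper) of what an unseen dynamical regime means: the 1D system dx/dt = x - x^3 has two stable fixed points at +1 and -1 with the basin boundary at 0, so data from one basin says nothing, in general, about the dynamics in the other.

```python
# Bistable toy system: trajectories started at x > 0 converge to +1, those started
# at x < 0 converge to -1. A model trained only on the x > 0 basin has, in general,
# no information about the "unseen regime" on the other side of the basin boundary.
import numpy as np

def rollout(x0, dt=0.01, steps=2000):
    x = x0
    for _ in range(steps):
        x = x + dt * (x - x**3)
    return x

print(rollout(0.3))    # -> approx +1  (training regime)
print(rollout(-0.3))   # -> approx -1  (unseen regime / other basin)
```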
- In proceedings.neurips.cc/paper_files/... we provided a highly efficient (often linear-time) algorithm for precisely locating attractors in ReLU-based RNNs. We prove that, besides exploding/vanishing gradients, bifurcations are a major obstacle in RNN training, but that they are provably alleviated by training techniques like GTF. (6/6)
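The reason fixed points of ReLU-based RNNs are so accessible: within one linear region (a fixed on/off pattern of the ReLU units) the map is affine, so each candidate fixed point is a single linear solve plus a consistency check. The paper's algorithm searches these regions far more efficiently; the brute-force sketch below only shows the principle and is an assumption, not the paper's method.

```python
# For z_{t+1} = A z_t + W relu(z_t) + h, fix an on/off pattern D of the ReLU units:
# within that region the map is z_{t+1} = (A + W D) z_t + h, so a candidate fixed
# point is one linear solve. It is a true fixed point iff its own sign pattern
# reproduces D. (Enumerating all 2^n patterns is exponential; the paper's algorithm
# is far more efficient, often linear-time.)
import numpy as np
from itertools import product

def fixed_points_relu_rnn(A, W, h):
    n = len(h)
    fps = []
    for pattern in product([0.0, 1.0], repeat=n):        # one candidate region per ReLU pattern
        D = np.diag(pattern)
        M = np.eye(n) - (A + W @ D)
        if abs(np.linalg.det(M)) < 1e-12:                 # skip (near-)singular regions
            continue
        z = np.linalg.solve(M, h)                         # candidate fixed point in this region
        if np.allclose((z > 0).astype(float), pattern):   # consistency: pattern must match sign of z
            fps.append(z)
    return fps
```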