Alessandro Ingrosso
Theoretical neuroscience, machine learning and spin glasses.
Assistant professor at Donders Institute, Nijmegen, The Netherlands.
Website: aleingrosso.github.io
- Happy to share our new work arxiv.org/abs/2601.17427 with Santiago Acevedo and Cristopher Erazo from SISSA, where we show that the Binary Intrinsic Dimension (BID) [ nature.com/articles/s42... ] can detect phase transitions in the Hopfield model. 1/N
- We directly link the BID estimator for the intrinsic dimensionality of the spin dynamics to the overlap distribution in finite-size systems. 2/N
- We find that the BID scales sublinearly in the spin-glass phase, with changes in the scaling exponent occurring at the phase transitions. 3/N
- Finally, a call to the neuroscience community: the BID is the perfect tool to study low-dimensional dynamics directly at the level of spikes, in both spiking network models and data. Feel free to get in touch if you're interested in collaborating. N/N (A toy sketch of the overlap measurement follows below.)
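For anyone who wants to play with the quantities mentioned in the thread, here is a minimal, self-contained sketch (my own illustration, not the paper's code or the BID estimator itself): it samples a Hopfield network with Glauber dynamics and histograms the overlap q between independently sampled configurations, i.e. the finite-size overlap distribution that the thread links to the BID. All names and parameter values (N, P, beta, sweep counts) are arbitrary toy choices.

```python
# Illustrative sketch only: Hopfield model with Glauber dynamics and
# replica overlaps q between independently sampled configurations.
import numpy as np

rng = np.random.default_rng(0)

def hopfield_couplings(patterns):
    """Hebbian couplings J_ij = (1/N) sum_mu xi_i^mu xi_j^mu, zero diagonal."""
    N = patterns.shape[1]
    J = patterns.T @ patterns / N
    np.fill_diagonal(J, 0.0)
    return J

def glauber_sample(J, beta, n_sweeps, rng):
    """Run Glauber dynamics from a random start, return the final +-1 configuration."""
    N = J.shape[0]
    s = rng.choice([-1, 1], size=N)
    for _ in range(n_sweeps):
        for i in rng.permutation(N):
            h = J[i] @ s                              # local field on spin i
            p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * h))
            s[i] = 1 if rng.random() < p_up else -1
    return s

N, P, beta = 200, 10, 2.0                             # toy sizes and inverse temperature
patterns = rng.choice([-1, 1], size=(P, N))
J = hopfield_couplings(patterns)

# Overlap distribution between pairs of independent samples ("replicas").
samples = [glauber_sample(J, beta, n_sweeps=50, rng=rng) for _ in range(20)]
overlaps = [samples[a] @ samples[b] / N
            for a in range(len(samples)) for b in range(a + 1, len(samples))]
print("mean |q| =", np.mean(np.abs(overlaps)))
```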
- Reposted by Alessandro Ingrosso: Interested in doing a Ph.D. to work on building models of the brain/behavior? Consider applying to graduate schools at CU Anschutz: 1. Neuroscience www.cuanschutz.edu/graduate-pro... 2. Bioengineering engineering.ucdenver.edu/bioengineeri... You could work with several comp neuro PIs, including me.
- Please RT - Open PhD position in my group at the Donders Center for Neuroscience, Radboud University. We're looking for a PhD candidate interested in developing theories of learning in neural networks. Applications are open until October 20th. For more info: www.ru.nl/en/working-a...
- Just got back from a great summer school at Sapienza University sites.google.com/view/math-hi... where I gave a short course on Dynamics and Learning in RNNs. I compiled a (very biased) list of recommended readings on the subject, for anyone interested: aleingrosso.github.io/_pages/2025_...
- Reposted by Alessandro Ingrosso: With all the sad developments in the US, go study in the Netherlands: relatively low tuition (and an adequate job-search visa after graduating) for high-quality programs like this one in Neurophysics or Cognitive Neuroscience at the Donders neuroscience hub
- Our paper on the statistical mechanics of transfer learning is now published in PRL. Franz-Parisi meets Kernel Renormalization in this nice collaboration with friends in Bologna (F. Gerace) and Parma (P. Rotondo, R. Pacelli). journals.aps.org/prl/abstract...
- Reposted by Alessandro Ingrosso: 🇳🇱 For the next FRESK seminar, Alessandro Ingrosso (Radboud University, NL) will give a lecture on "Statistical mechanics of transfer learning in the proportional limit" @aingrosso.bsky.social More info on Qbio's website! ⤵️ qbio.ens.psl.eu/en/events/ex...
- Announcing our StatPhys29 Satellite Workshop "Molecular biophysics at the transition state: from statistical mechanics to AI" to be held in Trento, Italy, from July 7th to 11th, 2025: indico.ectstar.eu/event/252/. Co-organized with Raffaello Potestio and his lab in Trento.
- Reposted by Alessandro Ingrosso: New paper with @aingrosso.bsky.social @sebgoldt.bsky.social and Zhangyang Wang, "On How Iterative Magnitude Pruning Discovers Local Receptive Fields in Fully Connected Neural Networks", accepted at the Conference on Parsimony and Learning (CPAL) arxiv.org/abs/2412.06545 1/
- Our paper on the density of states in NNs is now published in TMLR. We show how the loss landscape in simple learning problems can be characterized by Wang-Landau sampling. A nice collaboration with the Potestio Lab in Trento, at the interface between ML and soft matter. openreview.net/forum?id=BLD...
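As a rough illustration of the flat-histogram idea behind that paper (my own toy sketch, not the published setup or code): the Wang-Landau recursion below estimates the log-density of states ln g(E), where E counts the training errors of a binary-weight perceptron on a small random dataset. The dataset sizes, flatness threshold, and stopping criterion for the modification factor are arbitrary choices for illustration.

```python
# Toy Wang-Landau sketch: estimate ln g(E) for E = number of training errors
# of a binary-weight perceptron on random +-1 data (assumptions mine).
import numpy as np

rng = np.random.default_rng(0)
N, P = 21, 40                                  # input dimension (odd, to avoid ties), examples
X = rng.choice([-1, 1], size=(P, N))
teacher = rng.choice([-1, 1], size=N)
y = np.sign(X @ teacher)

def energy(w):
    """Number of misclassified training examples for binary weights w."""
    return int(np.sum(np.sign(X @ w) != y))

log_g = np.zeros(P + 1)                        # running estimate of ln g(E)
hist = np.zeros(P + 1)                         # visit histogram over energies
ln_f = 1.0                                     # modification factor, halved on flat histograms

w = rng.choice([-1, 1], size=N)
E = energy(w)
while ln_f > 1e-3:
    for _ in range(20000):
        i = rng.integers(N)                    # propose a single weight flip
        w[i] *= -1
        E_new = energy(w)
        # Accept with prob min(1, g(E)/g(E_new)) so that all energies get visited.
        if np.log(rng.random()) < log_g[E] - log_g[E_new]:
            E = E_new
        else:
            w[i] *= -1                         # reject: undo the flip
        log_g[E] += ln_f
        hist[E] += 1
    visited = hist[hist > 0]
    if visited.min() > 0.8 * visited.mean():   # crude flatness check on visited bins
        hist[:] = 0
        ln_f /= 2.0

print("ln g(E) up to an additive constant:", log_g - log_g.min())
```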
- Reposted by Alessandro Ingrosso: 🤖 🧠 🧪 New #Preprint Alert! Imagine AI systems that can learn and adapt on-chip while displaying minimal energy usage. We've just made a step towards unlocking the final piece of the puzzle needed to deploy neuromorphic computing at scale using SpiNNaker2! (1/8) arxiv.org/abs/2412.15021
- Reposted by Alessandro Ingrosso: New paper with @leonlufkin.bsky.social and @eringrant.bsky.social! Why do we see localized receptive fields so often, even in models without sparsity regularization? We present a theory in the minimal setting from @aingrosso.bsky.social and @sebgoldt.bsky.social