Pengfei
I’m a postdoc @Imperial, advised by @danakarca.bsky.social and @neural-reckoning.org.
I’m passionate about brain‑inspired neural networks, focusing on delay learning in RNNs (spiking/rate) and lightweight attention mechanisms.
- Reposted by Pengfei: Toy models, just in time for Christmas! Excited to share my first article for @thetransmitter.bsky.social #neuroskyence
- Amid the rise of billion-parameter models, I argue that toy models, with just a few neurons, remain essential—and may be all neuroscience needs, writes @marcusghosh.bsky.social. #neuroskyence www.thetransmitter.org/theoretical-...
- Reposted by Pengfei: Brains have many pathways / subnetworks, but which principles underlie their formation? In our #NeurIPS paper led by Jack Cook, we identify biologically relevant inductive biases that create pathways in brain-like Mixture-of-Experts models🧵 #neuroskyence #compneuro #neuroAI arxiv.org/abs/2506.02813
- With my great advisors and colleagues, @achterbrain.bsky.social @zhe @danakarca.bsky.social @neural-reckoning.org, we show that if heterogeneous axonal delays, even imprecise ones, capture the essential temporal structure of a task, spiking networks do not need precise synaptic weights to perform well.
- Psst - neuromorphic folks. Did you know that you can solve the SHD dataset with 90% accuracy using only 22 kb of parameter memory by quantising weights and delays? Check out our preprint with @pengfei-sun.bsky.social and @danakarca.bsky.social, or read the TLDR below. 👇🤖🧠🧪 arxiv.org/abs/2510.27434
- This suggests that a large fraction of the computational power may live in the temporal parameters, not just in the weights.
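Not code from the preprint, but a rough sketch of the idea for anyone curious: quantise both the synaptic weights and the per-synapse delays down to a few bits. Everything below (layer and function names, bit-widths, sizes) is an illustrative assumption, not the paper's actual setup.

```python
# Illustrative sketch only, not the preprint's implementation: a dense layer
# whose synapses carry both a weight and an integer delay, each stored with
# only a few bits. Bit-widths, sizes and names are assumptions.
import torch
import torch.nn as nn


def quantise(x, n_bits, x_max):
    """Uniform quantisation to 2**n_bits - 1 levels in [-x_max, x_max],
    with a straight-through estimator so gradients pass as if unquantised."""
    step = 2 * x_max / (2 ** n_bits - 1)
    x_q = torch.clamp(torch.round(x / step) * step, -x_max, x_max)
    return x + (x_q - x).detach()


class QuantDelayLayer(nn.Module):
    def __init__(self, n_in, n_out, max_delay=25, w_bits=2, d_bits=3):
        super().__init__()
        self.weight = nn.Parameter(0.1 * torch.randn(n_out, n_in))
        self.delay = nn.Parameter(torch.rand(n_out, n_in) * max_delay)
        self.max_delay, self.w_bits, self.d_bits = max_delay, w_bits, d_bits

    def forward(self, spikes):  # spikes: (batch, time, n_in) 0/1 float tensor
        w = quantise(self.weight, self.w_bits, 1.0)
        # snap each delay to one of 2**d_bits integer values in [0, max_delay]
        n_levels = 2 ** self.d_bits - 1
        frac = self.delay.detach().clamp(0, self.max_delay) / self.max_delay
        d_int = (torch.round(frac * n_levels) / n_levels * self.max_delay).round().long()
        batch, T, _ = spikes.shape
        out = spikes.new_zeros(batch, T, w.shape[0])
        # shift each synapse's input in time by its delay (slow but explicit;
        # gradients reach only the weights here - differentiable delay
        # learning needs a smoother mechanism, see the preprint for details)
        for j in range(w.shape[0]):
            for i in range(w.shape[1]):
                dt = min(int(d_int[j, i]), T)
                out[:, dt:, j] += w[j, i] * spikes[:, : T - dt, i]
        return out  # feed this delayed, weighted drive into e.g. a LIF layer
```

Stored this way, each synapse costs only w_bits + d_bits bits, which is where the tiny parameter-memory footprint comes from.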
- Reposted by Pengfei: Talks from #SNUFA 2025 are now available on YouTube: youtube.com/playlist?lis... 🤖🧠🧪
- Reposted by Pengfei: Interested in #neuroscience + #AI and looking for a PhD position? I can support your application @imperialcollegeldn.bsky.social ✅ Check your eligibility (below) ✅ Contact me (DM or email) UK nationals: www.imperial.ac.uk/life-science... Otherwise: www.imperial.ac.uk/study/fees-a...
- Reposted by Pengfei: Spiking NN fans - the #SNUFA workshop (Nov 5-6) agenda is finalised and online now. Make sure to register (free) soon. (Note you can register for either day and come to both.) Agenda: snufa.net/2025/ Registration: www.eventbrite.co.uk/e/snufa-2025... Thanks to all who voted on abstracts! 🤖🧠🧪
- Reposted by Pengfei: Message for participants of the #SNUFA 2025 spiking neural network workshop. We got almost 60 awesome abstract submissions, and we'd now like your help to select which ones should be offered talks. Follow the "abstract voting" link at snufa.net/2025/ to take part. It should take <15m. Thanks! ❤️
- Reposted by Pengfei: New preprint! What happens if you add neuromodulation to spiking neural networks and let them go wild with it? TLDR: it can improve performance especially in challenging sensory processing tasks. Explainer thread below. 🤖🧠🧪 www.biorxiv.org/content/10.1...
- Reposted by Pengfei: In our latest “This paper changed my life,” @neural-reckoning.org explains how @fzenke.bsky.social's 2019 paper, and its related coding tutorial SpyTorch, made it possible to apply modern machine learning to spiking neural networks. #neuroskyence www.thetransmitter.org/this-paper-c...
- Reposted by Pengfei: Submissions (short!) due for SNUFA spiking neural networks conference in <2 weeks! 🤖🧠🧪 forms.cloud.microsoft/e/XkZLavhaJe More info at snufa.net/2025/ Note that we normally get around 700 participants and recordings go on YouTube and get 100s-1000s views. Please repost.
- Reposted by Pengfei: Is anarchist science possible? As an experiment, we got together a large group of computational neuroscientists from around the world to work on a single project without top down direction. Read on to find out what happened. 🤖🧠🧪
- Reposted by Pengfei: Spiking neural networks people, this message is for you! The annual SNUFA workshop is now open for abstract submission (deadline Sept 26) and (free) registration. This year's speakers include Elisabetta Chicca, Jason Eshraghian, Tomoki Fukai, Chengcheng Huang, and... you? snufa.net/2025/ 🤖🧠🧪
- Reposted by Pengfei: How does the structure of a neural circuit shape its function? @neuralreckoning.bsky.social & I explore this in our new preprint: doi.org/10.1101/2025... 🤖🧠🧪 🧵1/9
- Together with my supervisor and the Neural Reckoning team neuralreckoning.bsky.social, we've developed a new SHD/SSC variant that strips out rate information while preserving spike-timing information, designed to really challenge SNNs. Your feedback is welcome: let us know how it performs in your models!
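As a toy illustration of what "stripping out rate information" can mean (not necessarily how our variant is actually constructed; the function name and n_keep value below are made up): subsample every input channel to the same number of spikes, so spike counts no longer separate the classes and only the timing of the surviving spikes does.

```python
# Toy illustration only, not necessarily how the SHD/SSC variant is built:
# equalise spike counts per input channel so that only spike timing, not
# firing rate, can distinguish the classes. The n_keep value is arbitrary.
import numpy as np


def equalise_rates(spike_times, n_keep=8, seed=0):
    """spike_times: list over input channels, each an array of spike times.
    Randomly subsamples every channel down to n_keep spikes (channels that
    already have fewer are left as-is, so the removal is only approximate),
    preserving the timing of the surviving spikes."""
    rng = np.random.default_rng(seed)
    out = []
    for t in spike_times:
        t = np.asarray(t, dtype=float)
        if len(t) > n_keep:
            t = np.sort(rng.choice(t, size=n_keep, replace=False))
        out.append(t)
    return out
```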
- New preprint for #neuromorphic and #SpikingNeuralNetwork folk (with @pengfei-sun.bsky.social). arxiv.org/abs/2507.16043 Surrogate gradients are popular for training SNNs, but some worry whether they really learn complex temporal spike codes. TLDR: we tested this, and yes they can! 🧵👇 🤖🧠🧪
- 🔗 Try it now: github.com/neural-recko...
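For anyone who hasn't met the trick being stress-tested in the surrogate-gradient preprint above: the forward pass keeps the hard spike threshold, while the backward pass swaps its zero-almost-everywhere derivative for a smooth stand-in. A minimal SpyTorch-style fast-sigmoid version looks roughly like this (the scale value is an illustrative choice, not taken from the paper):

```python
# Minimal SpyTorch-style surrogate-gradient spike function. Forward pass is a
# hard threshold; backward pass uses a fast-sigmoid surrogate derivative.
import torch


class SurrGradSpike(torch.autograd.Function):
    scale = 10.0  # surrogate sharpness; an illustrative choice, tune per task

    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0).float()  # spike whenever the membrane variable exceeds 0

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # d(spike)/dv approximated by 1 / (scale * |v| + 1)^2
        return grad_output / (SurrGradSpike.scale * v.abs() + 1.0) ** 2


spike_fn = SurrGradSpike.apply  # drop-in for the threshold inside an LIF loop
```

Inside a LIF simulation loop you would call spike_fn(v - threshold) wherever you would otherwise apply a non-differentiable hard threshold.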