SueYeon Chung
comp neuro, neural manifolds, neuroAI, physics of learning
assistant professor @ harvard (physics, center for brain science, kempner institute) + @ Flatiron Institute
https://www.sychung.org
- Reposted by SueYeon Chung: New preprint from our group (collaboration with @sueyeonchung.bsky.social) showing that discriminating odor components within a complex mixture is constrained by neural sensitivity rather than background interference, likely due to sparse representations at the front end.
- Reposted by SueYeon Chung: In @thetransmitter.bsky.social’s Rising Stars of Neuroscience 2025, we recognize 25 early-career researchers who have made outstanding scientific contributions and demonstrated a commitment to mentoring and community-building in neuroscience. #neuroskyence #StateOfNeuroscience bit.ly/4rnFnyQ
- Thank you @thetransmitter.bsky.social for recognizing our group's work! www.thetransmitter.org/early-career...
- Reposted by SueYeon Chung: #Cosyne2026 deadline is just around the corner: 🧠📜 16 October 2025! 📜🧠 See below for more on key dates and abstract submission: www.cosyne.org/abstracts-su...
- Also those interested in comp neuro and deep learning/AI are encouraged to apply 👇🏻👇🏻
- Ever wondered what gives rise to efficient neural population geometry? Our lab’s new work, led by Sonica Saraf (w/ Tony Movshon), shows how diversity in single-neuron tuning shapes population-level representation geometries to improve perceptual efficiency. Congrats @sonicasaraf.bsky.social!
- In many brain areas, neuronal tuning is heterogeneous. But how does this diversity help behavior? We show how tuning diversity shapes representational geometry and boosts coding efficiency for perception in our new preprint: www.biorxiv.org/content/10.1... (w/ @sueyeonchung.bsky.social&Tony Movshon)
- Enjoyed speaking at the Frontiers in NeuroAI Symposium at @harvard.edu's @kempnerinstitute.bsky.social. Key papers from the talk: *Manifold capacity theory: - original: doi.org/10.1103/Phys... - correlated: doi.org/10.1103/Phys... - data-driven (latest theory): pmc.ncbi.nlm.nih.gov/articles/PMC... 1/3
- Reposted by SueYeon Chung: Folks at Princeton have put together an aggregator to highlight examples of how federally funded research has directly impacted people's lives. They are looking for more studies to highlight, so send along suggestions and help make the benefits of science more visible! publicusaresearchbenefits.com
- Reposted by SueYeon Chung: 30 days to the launch of Elusive Cures! I learned so much writing it, and I want to share it. For the next 30 days, I'll post brain & mind research breakthroughs on odd days, and highlight unmet needs on even ones. #ElusiveCures30 First breakthrough: /1 press.princeton.edu/books/hardco...
- Reposted by SueYeon Chung: We're crowd-sourcing a searchable repository of tangible benefits stemming from federally funded research. Come enjoy the great stories; or send in an idea; or volunteer to join the team. publicusaresearchbenefits.com Please share and re-share so we get more great stories in there!
- Reposted by SueYeon Chung: Teddy's project is one of the only clear examples we know of (so far) where modifying the *objective function* used for training a deep neural network systematically improves neural prediction of IT responses. Come chat at his poster this afternoon at #cosyne2025! (Poster 2-036)
- If you are at #cosyne2025 come check out my poster this afternoon! We demonstrate a case where objective function design can systematically improve neural predictivity in deep networks. [2-036] Contrastive-Equivariant Self-Supervised Learning Improves Alignment with Primate Visual Area IT
- Reposted by SueYeon Chung: We are presenting two exciting projects tonight at #Cosyne2025! 🧠🧑🔬 [1-032] Changes in tuning curves, not neural population covariance, improve category separability in the primate ventral visual pathway [1-112] Comparing image representations in terms of sensitivities to local distortions
- Heading to Anaheim for #APSSummit25! I'll be giving a talk at GSNP's MAR-G54 session. summit.aps.org/events/MAR-G54 Looking forward to connecting with old and new colleagues #APS2025 @apsphysics.bsky.social
- Reposted by SueYeon Chung: Horace Barlow, by all accounts, was thinking about 30 years ahead of his time. Here is a transcript of the opening keynote I gave at a symposium honoring Horace's scientific legacy: markusmeister.com/2024/12/14/h....
- Reposted by SueYeon Chung: TL;DR: Encouraging transformation equivariance in an otherwise standard self-supervised learning setup systematically improves your ability to predict neural responses in area IT. Paper: openreview.net/pdf?id=AiMs8... Poster: Poster Session 3 #2204, East Exhibit Hall
- If you are at #NeurIPS2024, check out Teddy’s poster today! Contrastive-equivariant SSL improves alignment with primate IT 🤖🐵
- Excited to present work with @jfeather.bsky.social @eerosim.bsky.social and @sueyeonchung.bsky.social today at Neurips! May do a proper thread later on, but come by or shoot me a message if you are in Vancouver and want to chat :) Brief details in post below
- Reposted by SueYeon Chung: The Undergrad Travel Grant Program (deadline: Dec 6th) provides an opportunity for undergrads to learn more about comp systems neuro. It’s especially suited for students considering neuro graduate study! Apply here: docs.google.com/forms/d/e/1F... More info: www.cosyne.org/travel-grants
- Excited to share our results on Efficient Coding of Natural Images using Maximum Manifold Capacity Representations, a collaboration with Teddy Yerxa, Yilun Kuang, Eero Simoncelli to be presented at #NeurIPS2023 🧵1/n @tedyerxa.bsky.social @flatironinstitute.org @simonsfoundation.org
- We introduce a new self-supervised learning method called MMCR (maximum manifold capacity representation), inspired by methods from statistical physics that use geometry to characterize the coding capacity of a learned representation. Let's dig in! 2/n
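The MMCR idea described above, maximizing the manifold capacity of a learned representation by spreading out the centroids of each image's augmentation manifold, can be sketched roughly as follows. This is a minimal NumPy illustration of the objective, not the authors' implementation (which trains a deep network with a framework like PyTorch); the function name and array shapes here are illustrative assumptions.

```python
import numpy as np

def mmcr_loss(embeddings):
    """Sketch of a maximum-manifold-capacity-style objective.

    embeddings: array of shape (n_images, n_augmentations, dim),
    holding the network's embedding of each augmented view.
    Returns the negative nuclear norm of the centroid matrix, so
    minimizing this loss spreads the per-image manifolds apart.
    """
    # Project each embedding onto the unit sphere.
    z = embeddings / np.linalg.norm(embeddings, axis=-1, keepdims=True)
    # Centroid of each image's augmentation manifold.
    centroids = z.mean(axis=1)  # shape (n_images, dim)
    # Nuclear norm = sum of singular values of the centroid matrix;
    # larger values correspond to higher manifold capacity.
    nuclear_norm = np.linalg.svd(centroids, compute_uv=False).sum()
    return -nuclear_norm
```

In the geometric picture from the thread, each image and its augmentations form a small manifold in embedding space; maximizing the nuclear norm of the centroids encourages those manifolds to be compact and well separated, which is what capacity theory measures.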
- Reposted by SueYeon Chung: At #NeurIPS2023? Interested in brains, neural networks, and geometry? Come by our **Spotlight Poster** Tuesday @ 5:15PM (#1914) on A Spectral Theory of Neural Prediction and Alignment. paper: openreview.net/forum?id=5B1... w/ Abdul Canatar, Albert Wakhloo & SueYeon Chung @sueyeonchung.bsky.social