Guy Moss
PhD student at @mackelab.bsky.social - machine learning & geoscience.
- Reposted by Guy Moss: On my way from Munich to Grenoble 🚞 to co-lead a 3-day SBI tutorial + hackathon together with @danielged.bsky.social, organised by Pedro Rodriguez and @ugrenoblealpes.bsky.social. Excited to meet researchers from across France, many bringing their own simulators 🚀
- Reposted by Guy Moss: 1/ 🌀 New paper alert! We introduce Dingo-T1, a flexible transformer-based deep learning model for gravitational-wave (GW) data analysis. It adapts to different detector & frequency settings, improving inference efficiency and flexibility 🚀 #AI #MachineLearning #Physics #Astronomy #AcademicSky
- Reposted by Guy Moss: Finally got the job ad: looking for 2 PhD students to start spring next year: www.gao-unit.com/join-us/ If comp neuro, ML, and AI4Neuro is your thing, or you just nerd out over brain recordings, apply! I'm at NeurIPS. DM me here / on the conference app or email if you want to meet 🏖️🌮
- Reposted by Guy Moss: I'm at NeurIPS in San Diego this week to present cool work on foundation models for SBI! Most importantly, I'll be around to meet people and discuss science. 👨‍🔬
- Second, come by to check out NPE-PF: We leverage the power of tabular foundation models for training-free and simulation-efficient SBI. SBI has never been so effortless! By @vetterj.bsky.social, Manuel Gloeckler, @danielged.bsky.social, @jakhmack.bsky.social 4/11
- Reposted by Guy Moss: Our group is at NeurIPS and EurIPS this year with four papers and one workshop poster. If you are curious about SBI with AutoML, with foundation models, or on function spaces, or about differentiable simulators with Jaxley, have a look below 👇 1/11
- I'm super excited to present our new work at #EurIPS2025 and #NeurIPS2025! We developed FNOPE: a new simulation-based inference (SBI) method which excels at inferring function-valued parameters! Paper: openreview.net/forum?id=yB5... Code: github.com/mackelab/fnope (1/9)
- In SBI we train generative models for posterior estimation using model simulations. However, when the parameters of interest are function-valued, we end up with very high-dimensional parameter spaces, requiring huge numbers of training simulations. (2/9)
- Our method uses Fourier Neural Operators (FNOs) for Posterior Estimation (FNOPE). By training flow matching models with an FNO backbone, we can take into account the inductive biases of continuous parameters, forming a natural way to represent distributions over smooth functions! (3/9)
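For intuition, here is a minimal conditional flow-matching loss in PyTorch: the network is regressed onto the velocity of a straight-line path from noise to the parameters. The FNO backbone of FNOPE is replaced by a plain MLP placeholder, and the dimension and layer sizes are assumptions for illustration only.

```python
import torch
import torch.nn as nn

D = 4  # parameter dimension (illustrative; FNOPE targets function-valued theta)

# Placeholder velocity network; the paper uses an FNO backbone instead.
net = nn.Sequential(nn.Linear(2 * D + 1, 64), nn.ReLU(), nn.Linear(64, D))

def flow_matching_loss(theta, x):
    """Conditional flow matching with a linear interpolation path."""
    t = torch.rand(theta.shape[0], 1)
    noise = torch.randn_like(theta)
    theta_t = (1 - t) * noise + t * theta   # point on the noise-to-data path
    target = theta - noise                  # constant velocity of that path
    pred = net(torch.cat([theta_t, x, t], dim=1))
    return ((pred - target) ** 2).mean()

loss = flow_matching_loss(torch.randn(8, D), torch.randn(8, D))
```

Swapping the MLP for an FNO is what lets the model operate on functions sampled at arbitrary resolution.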
- I'm very grateful to my colleagues @leahsmuhle.bsky.social, @coschroeder.bsky.social, Reinhard Drews and @jakhmack.bsky.social for making this happen! Come find me at #EurIPS2025 or reach out to learn more! (9/9)
- Reposted by Guy Moss: MackeLab has grown! 🎉 Warm welcome to 5(!) brilliant and fun new PhD students / research scientists who joined our lab in the past year. We can't wait to do great science together and are already having a good time! 🤖🧠 Meet them in the thread 👇 1/7
- Reposted by Guy Moss: Simulation-based inference (SBI) has transformed parameter inference across a wide range of domains. To help practitioners get started and make the most of these methods, we joined forces with researchers from many institutions and wrote a practical guide to SBI. 📄 Paper: arxiv.org/abs/2508.12939
- Reposted by Guy Moss: 🎉 sbi participated in GSoC 2025 through @numfocus.bsky.social and it was a great success: our two students contributed major new features and substantial internal improvements: 🧵 👇
- Reposted by Guy Moss: Congrats to Dr Michael Deistler @deismic.bsky.social, who defended his PhD! Michael worked on "Machine Learning for Inference in Biophysical Neuroscience Simulations", focusing on simulation-based inference and differentiable simulation. We wish him all the best for the next chapter! 👏🎓
- Reposted by Guy Moss: The Macke lab is well represented at the @bernsteinneuro.bsky.social conference in Frankfurt this year! We have lots of exciting new work to present with 7 posters (details 👇) 1/9
- Reposted by Guy Moss: I've been waiting some years to make this joke and now it's real: I conned somebody into giving me a faculty job! I'm starting as a W1 Tenure-Track Professor at Goethe University Frankfurt in a week (lol), in the Faculty of CS and Math, and I'm recruiting PhD students 🤗

- Reposted by Guy Moss: From hackathon to release: sbi v0.25 is here! 🎉 What happens when dozens of SBI researchers and practitioners collaborate for a week? New inference methods, new documentation, lots of new embedding networks, a bridge to Pyro, and a bridge between flow matching and score-based methods 🤯 1/7 🧵
- Reposted by Guy Moss: Looky Looky! 😍🥳👏 arxiv.org/abs/2508.12939 Super fun project, I ❤️ed coauthoring w/ @sbi-devs.bsky.social. Great lead by @deismic.bsky.social & @janboelts.bsky.social. Contribs by many talented people @jakhmack.bsky.social. 🙏 to #BenjaminKurtMiller for the kickstart! @helmholtzai.bsky.social
- Reposted by Guy Moss: New preprint: SBI with foundation models! Tired of training or tuning your inference network, or waiting for your simulations to finish? Our method NPE-PF can help: it provides training-free simulation-based inference, achieving competitive performance with orders of magnitude fewer simulations! ⚡️
- Reposted by Guy Moss: I have been genuinely amazed how well TabPFN works as a density estimator, and how helpful this is for SBI ... Great work by @vetterj.bsky.social, Manuel and @danielged.bsky.social!!
- Reposted by Guy Moss: My first paper on simulation-based inference (SBI) as part of @mackelab.bsky.social! Exciting work on adapting state-of-the-art foundation models for posterior estimation. Almost plug-and-play, and surprisingly effective. Paper/code in thread below 🧵
- Reposted by Guy Moss: New paper in Geophysical Research Letters led by Vjeran Višnjević mapping out ice-shelf areas which are maintained by local precipitation only doi.org/10.1029/2024...
- Have I been to Antarctica? No. But my colleagues have, and we can learn a lot from the data they collected! Really happy to share that our work is now published!
- Thrilled to share that our paper on using simulation-based inference for inferring ice accumulation and melting rates for Antarctic ice shelves is now published in Journal of Glaciology! www.cambridge.org/core/journal...
- Reposted by Guy Moss: More great news from the SBI community! 🎉 Two projects have been accepted for Google Summer of Code under the NumFOCUS umbrella, bringing new methods and general improvements to sbi. Big thanks to @numfocus.bsky.social, GSoC and our future contributors!
- Reposted by Guy Moss: Great news! Our March SBI hackathon in Tübingen was a huge success, with 40+ participants (30 onsite!). Expect significant updates soon: awesome new features & a revamped documentation you'll love! Huge thanks to our amazing SBI community! Release details coming soon. 🥁 🎉
- Reposted by Guy Moss: 🎓 Hiring now! 🧠 Join us at the exciting intersection of ML and Neuroscience! #AI4science We're looking for PhDs, Postdocs and Scientific Programmers who want to use deep learning to build, optimize and study mechanistic models of neural computations. Full details: www.mackelab.org/jobs/ 1/5
- Reposted by Guy Moss: Excited to present our work on compositional SBI for time series at #ICLR2025 tomorrow! If you're interested in simulation-based inference for time series, come chat with Manuel Gloeckler or Shoji Toyota at Poster #420, Saturday 10:00–12:00 in Hall 3. 📰: arxiv.org/abs/2411.02728
- Reposted by Guy Moss: 🥳 Great news, our JOSS paper "sbi reloaded" has been accepted! 🎉 This community, led by the fine folks of @sbi-devs.bsky.social, is very welcoming and super fun to work with! I learn with every discussion I have. Paper: joss.theoj.org/papers/10.21... Review: github.com/openjournals...
- Reposted by Guy Moss: It's been a blast, thanks to @sbi-devs.bsky.social! This week's hackathon was phenomenal! 🙏 😍 The sbi hackathon welcomed about 25 people in Tübingen with contributions spanning the globe, e.g. 🇺🇸🇯🇵🇧🇪🇩🇪. Wanna see what we did? Check out the PRs 👇 github.com/sbi-dev/sbi/...
- Reposted by Guy Moss: 🙏 Please help us improve the SBI toolbox! 🙏 In preparation for the upcoming SBI Hackathon, we're running a user study to learn what you like, what we can improve, and how we can grow. 👉 Please share your thoughts here: forms.gle/foHK7myV2oaK... Your input will make a big difference. Thank you! 🙌
- Reposted by Guy Moss: 🚀 Join the 4th SBI Hackathon! 🚀 The last SBI hackathon was a fantastic milestone in forming a collaborative open-source community around SBI. Be part of it this year as we build on that momentum! 📅 March 17–21, 2025 📍 Tübingen, Germany or remote 👉 Details: github.com/sbi-dev/sbi/... More info: 🧵👇
- Reposted by Guy Moss: 🎉 Just in time for the end of the year, we've released a new version of sbi! 📦 v0.23.3 comes packed with exciting features, bug fixes, and docs updates to make sbi smoother and more robust. Check it out! 👇 🔗 Full changelog: github.com/sbi-dev/sbi/...
- Reposted by Guy Moss: The slides of my NeurIPS lecture "From Diffusion Models to Schrödinger Bridges - Generative Modeling meets Optimal Transport" can be found here: drive.google.com/file/d/1eLa3...
- Reposted by Guy Moss: 1) With our @neuripsconf.bsky.social poster happening tomorrow, it's about time to introduce our Spotlight paper 🔦, co-led with @jkapoor.bsky.social: Latent Diffusion for Neural Spiking data (LDNS), a latent variable model (LVM) which addresses 3 goals simultaneously:
- Reposted by Guy Moss: How do you find all fixed points in piecewise-linear recurrent neural networks (RNNs)? A short thread 🧵 In RNNs with N units and ReLU(x-b) activations, the phase space is partitioned into 2^N regions by hyperplanes at x=b. 1/7
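The partition suggests a simple enumeration algorithm: within each of the 2^N regions the dynamics are linear, so each candidate fixed point solves a linear system and is kept only if it actually lies in its own region. A sketch in NumPy, assuming the discrete-time update x_{t+1} = W·ReLU(x_t - b) (the thread's exact dynamics may differ):

```python
import itertools
import numpy as np

def fixed_points(W, b):
    """Enumerate all 2^N activation patterns of x_{t+1} = W @ relu(x_t - b),
    solve the linear fixed-point equation in each region, and keep only
    solutions consistent with that region's activation pattern."""
    N = len(b)
    fps = []
    for s in itertools.product([0, 1], repeat=N):
        D = np.diag(s).astype(float)          # which units are active
        # In this region: x = W D (x - b)  =>  (I - W D) x = -W D b
        A = np.eye(N) - W @ D
        if abs(np.linalg.det(A)) < 1e-12:
            continue  # degenerate region, no isolated fixed point
        x = np.linalg.solve(A, -W @ D @ b)
        # Consistency check: unit i must be active exactly when x_i > b_i.
        if all((x[i] > b[i]) == bool(s[i]) for i in range(N)):
            fps.append(x)
    return fps

# Example: W = 0.5 I, b = -1 gives a single fixed point at x = (1, 1).
fps = fixed_points(0.5 * np.eye(2), -np.ones(2))
```

The loop is exponential in N by construction, which is exactly why the 2^N partition matters for scaling.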
- @vetterj.bsky.social and I are excited to present our work at #NeurIPS2024! We introduce Sourcerer: a maximum-entropy, sample-based solution to source distribution estimation. Paper: openreview.net/forum?id=0cg... Code: github.com/mackelab/sou... (1/8)
- Given a model and a dataset of observations, the goal is to estimate a distribution over the model parameters that reproduces the dataset when passed through the model. You may also know this problem as Empirical Bayes, statistical inverse modeling, population inference… (2/8)
- The problem? The source distribution is not unique! So, which source distribution should we target? We propose to target the maximum entropy source distribution. This guarantees uniqueness, and also ensures that we do not miss any feasible model parameters! (3/8)
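A toy discrete version makes the idea concrete. The parameter grid, the model y = theta^2, and the single moment constraint below are invented for illustration; the paper's estimator is sample-based and far more general. Because the model is non-injective, many weightings reproduce the data, and maximizing entropy picks out the symmetric one:

```python
import numpy as np
from scipy.optimize import minimize

# Toy source estimation: a 3-point parameter grid pushed through y = theta^2,
# constrained to match an "observed" mean of y (all values are illustrative).
theta = np.array([-1.0, 0.0, 1.0])
y = theta**2                # pushforward values: [1, 0, 1] (non-injective!)
target_mean = 0.5           # observed mean of y under the unknown source

def neg_entropy(w):
    w = np.clip(w, 1e-12, None)
    return np.sum(w * np.log(w))    # minimizing this maximizes entropy

constraints = [
    {"type": "eq", "fun": lambda w: w.sum() - 1.0},        # valid distribution
    {"type": "eq", "fun": lambda w: w @ y - target_mean},  # reproduce the data
]
res = minimize(neg_entropy, x0=np.ones(3) / 3,
               bounds=[(0, 1)] * 3, constraints=constraints)
w = res.x   # analytically, the max-entropy source is [0.25, 0.5, 0.25]
```

Any w with w[0] + w[2] = 0.5 matches the data equally well; the entropy objective is what resolves the tie, splitting the mass symmetrically over the two parameters the model cannot distinguish.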
- Interested in learning more? Come visit our poster at #NeurIPS2024, or simply get in touch! Huge thanks again to @vetterj.bsky.social, Cornelius Schröder, @rdgao.bsky.social, and @jakhmack.bsky.social (8/8)