COSMO Lab
Computational Science and Modelling of materials and molecules at the atomic-scale, with machine learning.
- PET continues its victory lap of benchmarks and challenges 🥇🥉. And this one has a (bit far-fetched) end goal that would also make it useful! Congrats to Filippo and Cesare (and @marceldotsci.bsky.social, who got an honorable mention and will keep pushing his LOREM model further)🚀 dtu.dk/english/news...
- Congratulations to 🧑🚀 Sergey Pozdnyakov, who very deservedly won the @materials-epfl.bsky.social doctoral distinction award. A good time to check out his papers, if you haven't read them already!
- Release candidate 3 of chemiscope 1.0 is out, with class- and range-based highlighting of points. Try it, break it, report it on github.com/lab-cosmo/ch...
- Fantastic news from the @snf-fns.ch, who despite the budget cuts managed to fund six new NCCRs. Looking forward to doing some cool simulations to advance separation science! actu.epfl.ch/news/a-new-n...
- If the PET-OAM results from a week ago piqued your curiosity, you can learn more at arxiv.org/abs/2601.16195, including some general considerations on how to safely train and use an unconstrained ML potential.
- If you're scared by the 700M parameters (you shouldn't be) there's a whole set of models from 🐁 to 🦣. You can find them all on github.com/lab-cosmo/upet !
- Not going to make a big deal out of a benchmark table, but PET just got the top spot on matbench-discovery.materialsproject.org. And don't be fooled by the huge parameter count: it's faster and can handle larger structures than eSEN-30M 🚀. Kudos to 🧑🚀 Filippo, Arslan and Paolo!
- You can fetch the model from github.com/lab-cosmo/upet: it's as easy as `pip install upet` and then, for the ASE interface, `from upet.calculator import UPETCalculator; calculator = UPETCalculator(model="pet-oam-xl", version="1.0.0", device="cuda")`. Have fun and go break it!
- 📢 chemiscope.org 1.0.0rc1 just dropped on pypi! We are making (a few) breaking changes to the interfaces, fixing a ton of bugs and introducing some exciting features (you can finally load datasets with > 100k points!). We'd be grateful if you test, break and report 🐛 github.com/lab-cosmo/ch...
- Hope y'all got off to a great start in 2026. Here we're taking some time to add the 2025 winter card to the archives www.epfl.ch/labs/cosmo/i... 🎅=🧑🚀
- 📢 New chemiscope.org release just landed! To make it even easier to integrate ⚗️🔭 into your workflow, we added a @streamlit.bsky.social component, so you can run analyses and show your atomistic data in a web app by writing just a few lines of Python! Try it, break it, report it!
- Congrats to 🧑🚀 Sergey Pozdnyakov who received a distinction (best 8% of theses at @materials-epfl.bsky.social) for his PhD thesis "Advancing understanding and practical performance of machine learning interatomic potentials". Поїхали (let's go!) 🚀! infoscience.epfl.ch/entities/pub...
- No day goes by without a new universal #ML potential. But how different are they, really? Sanggyu and Sofiia tried to give a quantitative answer by comparing the reconstruction errors between their latent-space features. If you are curious, check out the #preprint arxiv.org/html/2512.05...
- 📢 PET-MAD is here! 📢 It has been for a while for those who read the #arXiv, but now you get it preciously 💸 typeset by @natcomms.nature.com. Take home: an unconstrained architecture + good training-set choices give you a fast, accurate and stable universal MLIP that just works™️ www.nature.com/articles/s41...
- 📢 Let us (re)introduce you to our Massive Atomic Diversity dataset for universal MLIPs. MAD includes molecules, clusters, surfaces and plenty of bulk configs; we cover a lot of ground with fewer than 100k structures, using highly consistent DFT settings. Read more 📑 www.nature.com/articles/s41...
- Reposted by COSMO Lab: In this blog post, Filippo Bigi, Marcel Langer (@labcosmo.bsky.social) and @micheleceriotti.bsky.social write about the need to balance speed and physical laws when using ML for atomic-scale simulations aihub.org/2025/10/10/m...
- A primer on non-conservative (& rotationally unconstrained) MLIPs, and how to use them safely. Thanks @aihub.org for the space! aihub.org/2025/10/10/m...
- Reposted by COSMO Lab: Looks like @ox.ac.uk forbids its researchers from doing any kind of literature search, though it seems that thankfully they can still submit to the arxiv arxiv.org/abs/2510.00027 🤷
- 📝 We have been told (& been telling) that ML potentials are linked quite directly to the expansion of the atomic energy into pairs, triples, and so on. But is this actually true 🤔? Go read the latest from the 🧑🚀 team (w/QM help from Joonho's team at Harvard) to find out more arxiv.org/html/2509.14...
- TL;DR: not really. ML potentials learn whatever they want, as long as it gives them good accuracy on the training set. We note in particular that MACE is strongly preconditioned to learn a fast-decaying body-order expansion, whether or not the true expansion actually decays fast.
- Bragging time - ⚡ FlashMD⚡ was accepted as a spotlight paper at #NeurIPS25. If you still haven't checked it out, it's already on the #arxiv arxiv.org/abs/2505.19350, the code is at flashmd.org and the 🧑🍳📖 is here atomistic-cookbook.org/examples/fla.... Congrats to Filippo, Sanggyu and Augustinus!
- Anticipating 🧑🚀 Wei Bin's talk at #psik2025 (noon@roomA), 📢 a new #preprint using PET and the MAD dataset to train a universal #ml model for the density of states, giving band gaps for solids, clusters, surfaces and molecules with MAE ~200meV. Go to the talk, or check out arxiv.org/html/2508.17...!
- 📢 Now out on @physrevx.bsky.social energy, journals.aps.org/prxenergy/ab... from 🧑🚀 @dtisi.bsky.social and Hanna Türk, our #PET -powered study of the dynamic reconstruction of LPS surfaces, and how it affects their structure, stability and reactivity.
- Reconstructed surfaces become lower in energy, and the surface energy less orientation-dependent, so the Wulff shape of particles becomes more spherical.
- If you are at #psik2025 and want to know more about the #metatensor ecosystem, don't miss @luthaf.bsky.social's talk tomorrow morning at 9:45 in room 1
- 🚨 #machinelearning for #compchem goodies from our 🧑🚀 team incoming! After years of work it's time to share. Go check arxiv.org/abs/2508.15704 and/or metatensor.org to learn about #metatensor and #metatomic. What they are, what they do, why you should use them for all of your atomistic ML projects 🔍.
- TL;DR: this is a cross-platform, model-agnostic library for atomistic data (including geometry and property derivatives such as forces and stresses) that lets you package your model into a portable TorchScript file.
- Go metatensor.org!
- If you are excited about 30x longer time steps in molecular dynamics using FlashMD, but are worried about it not being symplectic, Filippo has something new cooking that should make you even more excited. Head to the #arxiv for a preview arxiv.org/html/2508.01...
- We can get long-stride, geometry-conserving integration by learning the Hamilton-Jacobi action. This fixes the instability of direct MD prediction for good, rather than just patching it up, although it's not as fast. And it also works for serious simulations, like glassy relaxation in deeply supercooled GeTe!
- Two new recipes landed in the #atomistic-cookbook 🧑🍳📖: one explaining how to fine-tune the #PET-MAD universal model on a system-specific dataset, and one on training a model with conservative fine-tuning. Check them out on atomistic-cookbook.org/examples/pet... and atomistic-cookbook.org/examples/pet...
- Thanks to the 🧑🚀🧑🚀🧑🚀 who put this together, Sofiia in particular, and thanks to the #metatrain team as this would not be so easy without their work metatensor.github.io/metatrain/la...
- New 🧑🍳📖 #recipe landed, doubling up as a @plumed.org tutorial 🐦 atomistic-cookbook.org/examples/met..., and explaining how to use the #metatomic interface in #plumed to define custom collective variables with all the flexibility and speed of torch.
- Diverse data is good data! Took a while to polish it, but we have finally released the small-but-smart MAD dataset we used to train PET-MAD. You can find more on the #preprint arxiv.org/html/2506.19... or just head to the #materialscloud to fetch MAD archive.materialscloud.org/records/xdsb...