@vetterj.bsky.social and I are excited to present our work at
#NeurIPS2024! Introducing Sourcerer: a maximum-entropy, sample-based approach to source distribution estimation.
Paper:
openreview.net/forum?id=0cg...
Code:
github.com/mackelab/sou...
(1/8)

Given a model and a dataset of observations, the goal is to estimate a distribution over the model parameters that reproduces the dataset when passed through the model (sketched below). You may also know this problem as empirical Bayes, statistical inverse modeling, population inference…
(2/8)
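As an aside, in notation we introduce here for the thread (not from the posts): writing p(x | θ) for the simulator, p_o(x) for the data distribution, and q(θ) for the source, the estimation target is

```latex
% Notation introduced for this thread: p(x | \theta) is the simulator,
% p_o(x) the distribution of the observed dataset, q(\theta) the source.
\[
  \int p(x \mid \theta)\, q(\theta)\, \mathrm{d}\theta \;=\; p_o(x)
  \qquad \text{for all observations } x .
\]
```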
The problem? The source distribution is not unique! So which source distribution should we target? We propose to target the maximum entropy source distribution (see the objective sketched below). This guarantees uniqueness and ensures that we do not miss any feasible model parameters!
(3/8)
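In the same (our own) notation as above, the maximum-entropy choice among all feasible sources reads schematically:

```latex
% Maximum-entropy selection among all sources satisfying the push-forward constraint.
\[
  q^\ast \;=\; \operatorname*{arg\,max}_{q}\; \mathcal{H}(q)
         \;=\; \operatorname*{arg\,max}_{q}\; -\!\int q(\theta)\,\log q(\theta)\,\mathrm{d}\theta
  \quad \text{s.t.} \quad
  \int p(x \mid \theta)\, q(\theta)\,\mathrm{d}\theta = p_o(x).
\]
```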
We define a sample-based loss with an entropy-regularization term. Because the loss is sample-based, we place no constraints on our variational distribution and can optimize directly from simulations: we do not need to know or estimate the model likelihood. A minimal sketch of one training step follows below.
(4/8)
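For intuition only, here is a minimal PyTorch-style sketch of what one such training step could look like. The neural sampler, the toy simulator, the Sliced-Wasserstein discrepancy, the nearest-neighbor entropy estimate, and the weight `lam` are illustrative choices made for this sketch, not the reference implementation; see the paper and the repository for the actual loss and estimators.

```python
# Illustrative sketch: one step of entropy-regularized, sample-based source estimation.
import torch
import torch.nn as nn

torch.manual_seed(0)
dim_theta, dim_x, n = 2, 3, 256

# Variational source: a neural sampler pushing Gaussian noise to parameters theta.
sampler = nn.Sequential(nn.Linear(dim_theta, 64), nn.ReLU(), nn.Linear(64, dim_theta))

# Toy differentiable simulator standing in for the real model p(x | theta).
def simulator(theta):
    return torch.cat([theta, (theta ** 2).sum(dim=1, keepdim=True)], dim=1)

def sliced_wasserstein(x, y, n_proj=64):
    """Monte Carlo Sliced-Wasserstein-2 distance between two sample sets."""
    proj = torch.randn(x.shape[1], n_proj)
    proj = proj / proj.norm(dim=0, keepdim=True)
    xs, _ = torch.sort(x @ proj, dim=0)               # sorted 1D projections
    ys, _ = torch.sort(y @ proj, dim=0)
    return ((xs - ys) ** 2).mean()

def knn_entropy(theta, k=3):
    """Crude nearest-neighbor (Kozachenko-Leonenko-style) entropy estimate."""
    d = torch.cdist(theta, theta)                     # pairwise distances (incl. self)
    knn_dist = d.topk(k + 1, largest=False).values[:, k]  # distance to k-th neighbor
    return theta.shape[1] * torch.log(knn_dist + 1e-12).mean()

x_obs = torch.randn(n, dim_x)                         # placeholder for the dataset
opt = torch.optim.Adam(sampler.parameters(), lr=1e-3)
lam = 0.1                                             # entropy-regularization weight

theta = sampler(torch.randn(n, dim_theta))            # sample from the source
x_sim = simulator(theta)                              # push through the simulator
loss = sliced_wasserstein(x_sim, x_obs) - lam * knn_entropy(theta)
opt.zero_grad()
loss.backward()                                       # gradients flow through the simulator
opt.step()
```

The point of the sketch: every term in the loss is computed from samples alone, so no likelihood evaluations are needed anywhere.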
Sourcerer consistently finds source distributions that reproduce the dataset with high fidelity on a collection of benchmark tasks. When we also regularize for high entropy, Sourcerer finds higher-entropy source distributions at no cost to simulation fidelity!
(5/8)
With the likelihood-free loss, Sourcerer can exploit differentiable simulators to quickly estimate source distributions, even when the data is high-dimensional, such as time series (toy example below).
(6/8)
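As a toy illustration (ours, not one of the benchmark tasks): any simulator written in an autodiff framework, like the small damped-oscillator time-series model below, can stand in for `simulator` in the sketch above (with `x_obs` replaced by the observed time series), and gradients flow from the high-dimensional output back to the source sampler.

```python
# Toy differentiable time-series simulator (illustrative only): a damped oscillator
# whose frequency and damping are the source parameters.
import torch

def oscillator_simulator(theta, n_steps=100, dt=0.1):
    """theta[:, 0] = angular frequency, theta[:, 1] = damping; returns (n, n_steps)."""
    t = torch.arange(n_steps, dtype=torch.float32) * dt            # time grid
    freq = theta[:, :1]                                            # (n, 1)
    damping = torch.nn.functional.softplus(theta[:, 1:2])          # keep damping positive
    return torch.exp(-damping * t) * torch.cos(freq * t)           # broadcast to (n, n_steps)

x = oscillator_simulator(torch.randn(8, 2))
print(x.shape)  # torch.Size([8, 100])
```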
We apply Sourcerer to a real dataset of single-neuron recordings and the Hodgkin-Huxley model. This model is misspecified and highly nonlinear. Still, Sourcerer estimates source distributions that accurately reproduce the dataset, again achieving higher entropy “for free”!
(7/8)
Interested in learning more? Come visit our poster at
#NeurIPS2024, or simply get in touch! Huge thanks again to
@vetterj.bsky.social, Cornelius Schröder,
@rdgao.bsky.social, and
@jakhmack.bsky.social
(8/8)
Dec 10, 2024 02:33