New preprint on unifying the zoo of multivariate higher-order information measures into a common form.
May be of interest to anyone interested in higher-order interactions, complex systems, emergence, or complexity.
1/N
arxiv.org/abs/2601.08030
The many faces of multivariate information
There are many different information-based measures of multipartite interactions in complex systems, which are generally thought to represent different "types" of information-sharing.
In this paper, I focus on 4:
1. Total correlation
2. Dual total correlation
3. S-information
4. O-information
2/N
In a previous paper with
@popeme.bsky.social, we showed that the dual total correlation can be written as a linear combination of the joint and leave-one-out marginal total correlations. 3/N
pubmed.ncbi.nlm.nih.gov/37095282/
Multivariate information theory uncovers synergistic subsystems of the human cerebral cortex
You can push this further, and find that any three of the measures can be written in terms of the fourth - and that they all share a common form, parameterized by a single variable k.
I call this function \Delta^{k}, since it's the difference between the "whole" and "sum of the parts."
4/N
A minor, mostly aesthetic, takeaway is that the original definition of the O-information is "backwards". Had it been defined as D(X) - T(X), instead of the other way around, it would fit neatly into this hierarchy of synergies.
But that's not the most interesting thing about \Delta^k.
5/N
What about values of k > 2? Are they meaningful?
Yes. It turns out that if you have a system composed purely of k-order synergies, then \Delta^k = 0.
So \Delta^3 = 0 for an XOR gate, \Delta^4 = 0 for a 4-element parity gate and so on. 6/N
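To make this concrete, here's a minimal Python sketch (my own, not code from the preprint) that checks these values numerically. It assumes the form \Delta^k(X) = (N-k)T(X) - \sum_i T(X^{-i}), where T is the total correlation - that form reproduces every number quoted in this thread, but see the paper for the exact definition.

```python
from itertools import product
from math import log2

def entropy(dist):
    """Shannon entropy (bits) of a dict {state_tuple: probability}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def marginal(dist, idxs):
    """Marginal distribution over the variables at positions idxs."""
    out = {}
    for state, p in dist.items():
        key = tuple(state[i] for i in idxs)
        out[key] = out.get(key, 0.0) + p
    return out

def total_correlation(dist):
    """T(X) = sum_i H(X_i) - H(X)."""
    n = len(next(iter(dist)))
    return sum(entropy(marginal(dist, [i])) for i in range(n)) - entropy(dist)

def delta(dist, k):
    """Assumed form: Delta^k(X) = (N - k) * T(X) - sum_i T(X^{-i})."""
    n = len(next(iter(dist)))
    loo = sum(total_correlation(marginal(dist, [j for j in range(n) if j != i]))
              for i in range(n))
    return (n - k) * total_correlation(dist) - loo

def parity(n):
    """Uniform distribution over n bits with even parity (an n-element parity gate)."""
    states = [s for s in product([0, 1], repeat=n) if sum(s) % 2 == 0]
    return {s: 1.0 / len(states) for s in states}

xor3 = parity(3)            # the classic XOR gate: (a, b, a XOR b)
print(delta(xor3, 3))       # 0.0 -> pure 3-order synergy
print(delta(parity(4), 4))  # 0.0 -> pure 4-order synergy
```

Under the same assumed form, delta(xor3, 2) gives +1 and delta(xor3, 4) gives -1, matching the XOR values quoted later in the thread.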
Why does this work? Imagine a system X of N elements, composed entirely of (potentially multiple) k-dimensional parity gates (which are pure-synergy).
In that case, the term (N-k)T(X) is exactly the sum of the leave-one-out marginal total correlations, \sum_i T(X^{-i}).
7/N
So the (N-k)T(X) term acts as your "null" or "prior", and \Delta^k asks: how much faster (or slower) does the structure decay under single-element failures than we would expect if the system were composed entirely of k-order synergies?
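Here's a quick numerical sanity check of that "null" identity (again my own sketch, under the assumed form \Delta^k = (N-k)T(X) - \sum_i T(X^{-i})): take a system of two independent XOR triads, so N = 6 and k = 3, and verify that the six leave-one-out total correlations sum to exactly (N-k)T(X).

```python
from itertools import product
from math import log2

def entropy(dist):
    """Shannon entropy (bits) of a dict {state_tuple: probability}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def marginal(dist, idxs):
    """Marginal distribution over the variables at positions idxs."""
    out = {}
    for state, p in dist.items():
        key = tuple(state[i] for i in idxs)
        out[key] = out.get(key, 0.0) + p
    return out

def total_correlation(dist):
    """T(X) = sum_i H(X_i) - H(X)."""
    n = len(next(iter(dist)))
    return sum(entropy(marginal(dist, [i])) for i in range(n)) - entropy(dist)

# Two independent XOR triads: states (a, b, a^b, c, d, c^d), each with prob 1/16.
dist = {(a, b, a ^ b, c, d, c ^ d): 1 / 16
        for a, b, c, d in product([0, 1], repeat=4)}

n, k = 6, 3
loo_sum = sum(total_correlation(marginal(dist, [j for j in range(n) if j != i]))
              for i in range(n))
print(loo_sum)                            # 6.0
print((n - k) * total_correlation(dist))  # 6.0 -> the two sides match, so Delta^3 = 0
```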
8/N
If the structure of X decays faster, then \Delta^k > 0, because the system is dominated by synergies of order greater than k. If \Delta^k < 0, the system is dominated by synergies of order lower than k.
E.g. for an XOR gate, \Delta^2 = +1, \Delta^3 = 0, and \Delta^4 = -1.
9/N
Using the entropic conjugation framework introduced by
@frosas.bsky.social and colleagues, we can also derive the conjugate measure, which I called \Gamma^k (arbitrarily).
This one is less obviously interpretable to me, but it induces the correct hierarchy on k. 10/N
\Gamma^k has the mirror-image property of \Delta^k, but for pure redundancies rather than pure synergies: for a k-dimensional Giant Bit distribution, \Gamma^k = 0. For a 3-dimensional Giant Bit, \Gamma^2 = 1, \Gamma^3 = 0, and \Gamma^4 = -1.
11/N
I'm excited about these results for a few reasons. One is that this lets us go from talking about "global redundancy/synergy dominance" to asking specifically about the order of the interaction. 12/N
It's also interesting to me that the \Delta^k function doesn't *have* to use the total correlation. Any function meeting certain criteria would work just as well. This could be a path forward to a more general theory of higher-order synergies beyond information theory. 13/N
Finally, I have NOT submitted this anywhere. I feel like the story is still not totally complete, but also like I've pushed as far as I can go solo.
So this is an open call: if anyone is inspired by this and wants to collaborate (or make their own independent contribution), DM me or email me. 14/N
I'm sure there's a LOT of stuff I've missed, so in the spirit of open science and collaboration, I'm putting my cards on the table and inviting anyone to work with me. I'd love to see a bigger, multi-author paper come out of this.
(If no one reaches out, I'll submit somewhere in Feb). 15/N
Finally, thanks to
@doctorjosh.bsky.social for reading it, and letting me pursue this little side-project when I should have been doing the job I'm actually paid to do. FIN
16/16