We establish methods that quantify the structure of statistical interactions within a given data set, using the characterization of information theory in cohomology by finite methods, and express them in terms of statistical physics and machine learning.

In the first part, we review the formalism of Information Cohomology, obtained with Daniel Bennequin and refined by Juan Pablo Vigneaux with an extension to Tsallis entropies [1,2]. It considers random variables as partitions of atomic probabilities, together with the poset given by their lattice. The basic cohomology is defined by the Hochschild coboundary, with a left action corresponding to information conditioning. The first-degree cocycle is the entropy chain rule, which allows one to derive the functional equation of information and hence to characterize entropy uniquely as the first cohomology group. (Minus) odd multivariate mutual informations (MI, I2k+1) appear as even degrees...
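As a concrete numerical illustration of the quantities involved, the sketch below computes, on a toy joint distribution over two binary variables, the Shannon entropies and checks the chain rule H(X,Y) = H(X) + H(Y|X) (the degree-1 cocycle condition), along with the pairwise mutual information I2 = H(X) + H(Y) - H(X,Y). The distribution and function names are illustrative, not part of the cited formalism.

```python
from collections import Counter
from math import log2

def entropy(joint, axes):
    """Shannon entropy (in bits) of the marginal of `joint` on the given axes.

    `joint` maps outcome tuples to probabilities; `axes` selects coordinates.
    """
    marginal = Counter()
    for outcome, p in joint.items():
        marginal[tuple(outcome[i] for i in axes)] += p
    return -sum(p * log2(p) for p in marginal.values() if p > 0)

# Toy joint distribution p(x, y) over two binary variables (illustrative).
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

H_X  = entropy(joint, [0])      # H(X)
H_Y  = entropy(joint, [1])      # H(Y)
H_XY = entropy(joint, [0, 1])   # H(X,Y)

# Chain rule: H(X,Y) = H(X) + H(Y|X), i.e. H(Y|X) = H(X,Y) - H(X).
H_Y_given_X = H_XY - H_X

# Pairwise mutual information I2(X;Y) = H(X) + H(Y) - H(X,Y).
I2 = H_X + H_Y - H_XY
```

Higher multivariate informations Ik follow the same alternating inclusion-exclusion pattern over the marginal entropies of the sub-tuples of variables.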