Quantifying synergistic information using intermediate stochastic variables

2016-02-03
Quantifying synergy among stochastic variables is an important open problem in information theory. Information synergy occurs when multiple sources together predict an outcome variable better than the sum of single-source predictions. Synergy is an essential phenomenon in biology, for instance in neuronal networks and cellular regulatory processes, where different information flows integrate to produce a single response, and it also arises in social cooperation processes and in statistical inference tasks in machine learning. Here we propose a measure of synergistic entropy and synergistic information from first principles. The proposed measure relies on so-called synergistic random variables (SRVs), which are constructed to have zero mutual information with each individual source variable but non-zero mutual information with the complete set of source variables. We prove several basic and desired properties of our measure, including bounds and additivity properties. In addition, we prove several important consequences of our measure, including the fact that different types of synergistic information may co-exist between the same sets of variables. A numerical implementation is provided, which we use to demonstrate that synergy is associated with resilience to noise. Our measure may be a marked step forward in the study of multivariate information theory and its numerous applications.
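The defining property of an SRV can be checked on the canonical example of synergy: for two independent fair bits, their XOR carries zero mutual information about either bit alone but one full bit about the pair. The sketch below (an illustrative example, not the paper's numerical implementation) computes these three mutual-information values directly from the joint distributions:

```python
import itertools
from math import log2

def mutual_information(joint):
    """I(A;B) in bits, given a dict mapping (a, b) -> probability."""
    pa, pb = {}, {}
    for (a, b), p in joint.items():
        pa[a] = pa.get(a, 0.0) + p
        pb[b] = pb.get(b, 0.0) + p
    return sum(p * log2(p / (pa[a] * pb[b]))
               for (a, b), p in joint.items() if p > 0)

# Two independent fair bits X1, X2; S = X1 XOR X2 acts as a synergistic
# random variable: I(S;X1) = I(S;X2) = 0, but I(S;(X1,X2)) = 1 bit.
joint_x1_s, joint_x2_s, joint_x12_s = {}, {}, {}
for x1, x2 in itertools.product([0, 1], repeat=2):
    s, p = x1 ^ x2, 0.25
    joint_x1_s[(x1, s)] = joint_x1_s.get((x1, s), 0.0) + p
    joint_x2_s[(x2, s)] = joint_x2_s.get((x2, s), 0.0) + p
    joint_x12_s[((x1, x2), s)] = p

print(mutual_information(joint_x1_s))   # 0.0: S reveals nothing about X1 alone
print(mutual_information(joint_x2_s))   # 0.0: likewise for X2
print(mutual_information(joint_x12_s))  # 1.0: one full bit about the pair
```

The paper's construction generalizes this idea: rather than a fixed function like XOR, SRVs are searched for numerically subject to the zero-individual-information constraint.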