### Quantifying synergistic information using intermediate stochastic variables

**2016-02-03**

1602.01265 | cs.IT

Quantifying synergy among stochastic variables is an important open problem
in information theory. Information synergy occurs when multiple sources
together predict an outcome variable better than the sum of single-source
predictions. Synergy is an essential phenomenon in biology, for instance in
neuronal networks and cellular regulatory processes, where different information
flows integrate to produce a single response, but it also appears in social
cooperation processes and in statistical inference tasks in machine learning. Here
we propose a metric of synergistic entropy and synergistic information from
first principles. The proposed measure relies on so-called synergistic random
variables (SRVs) which are constructed to have zero mutual information about
individual source variables but non-zero mutual information about the complete
set of source variables. We prove several basic and desired properties of our
measure, including bounds and additivity properties. In addition, we prove
several important consequences of our measure, including the fact that
different types of synergistic information may co-exist between the same sets
of variables. A numerical implementation is provided, which we use to
demonstrate that synergy is associated with resilience to noise. Our measure
may be a marked step forward in the study of multivariate information theory
and its numerous applications.
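The defining property of an SRV — zero mutual information about each individual source but non-zero mutual information about the joint set — is illustrated by the classic XOR construction. The sketch below (not the paper's numerical implementation; just a minimal illustration of the SRV property) verifies that S = X1 XOR X2, for two fair independent bits, carries no information about either source alone but one full bit about the pair:

```python
import math

def mutual_information(joint):
    """Mutual information (in bits) from a dict mapping (x, y) -> probability."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Two fair, independent source bits X1, X2; S = X1 XOR X2 is a
# synergistic random variable (SRV) for the pair (X1, X2).
joint_s_x1, joint_s_x2, joint_s_pair = {}, {}, {}
for x1 in (0, 1):
    for x2 in (0, 1):
        s, p = x1 ^ x2, 0.25
        joint_s_x1[(s, x1)] = joint_s_x1.get((s, x1), 0.0) + p
        joint_s_x2[(s, x2)] = joint_s_x2.get((s, x2), 0.0) + p
        joint_s_pair[(s, (x1, x2))] = joint_s_pair.get((s, (x1, x2)), 0.0) + p

print(mutual_information(joint_s_x1))    # I(S; X1)     = 0.0 bits
print(mutual_information(joint_s_x2))    # I(S; X2)     = 0.0 bits
print(mutual_information(joint_s_pair))  # I(S; X1,X2)  = 1.0 bit
```

The XOR variable is the extreme case; in general the paper's measure searches over all such intermediate variables orthogonal to the individual sources.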


# Related Articles

**2015-07-08**

1507.02284 | stat.ML

We introduce a new framework for unsupervised learning of representations
based on a novel hierarchi…

**2019-04-21**

1904.09563 | cs.IT

The task of manipulating correlated random variables in a distributed setting
has received attention…

**2018-11-14**

1811.06122 | cs.IT

We introduce a variant of the Rényi entropy definition that aligns it with
the well-known Hölder…

**2019-03-21**

1903.10693 | cs.IT

In many real-world systems, information can be transmitted in two
qualitatively different ways: by …

**2019-02-06**

1902.02292 | cs.IT

We develop a theoretical framework for defining and identifying flows of
information in computationa…

**2018-04-18**

1804.06537 | cs.LG

The matrix-based Rényi's α-entropy functional and its multivariate
extension were recently deve…

**2018-07-13**

1807.05152 | math-ph

While Shannon entropy is related to the growth rate of multinomial
coefficients, we show that the qu…

**2017-09-22**

1709.07807 | cs.IT

D. Bennequin and P. Baudot introduced a cohomological construction adapted to
information theory, ca…

**2018-11-01**

1811.01745 | cs.IT

The partial information decomposition (PID) is a promising framework for
decomposing a joint random …

**2019-02-28**

1902.11239 | cs.IT

This article introduces a model-agnostic approach to study statistical
synergy, a form of emergence …

**2017-09-19**

1709.06653 | cond-mat.stat-mech

The partial information decomposition (PID) is perhaps the leading proposal
for resolving informatio…

**2018-07-13**

1807.05103 | cs.IT

Given two channels that convey information about the same random variable, we
introduce two measures…

**2018-10-26**

1810.11551 | cs.IT

Information theoretic quantities play an important role in various settings
in machine learning, inc…

**2018-11-25**

1811.10071 | cs.IT

Typically, real-world stochastic processes are not easy to analyze. In this
work we study the repres…

**2019-01-23**

1901.08007 | cs.IT

The unique information ($UI$) is an information measure that quantifies a
deviation from the Blackwe…

**2017-10-14**

1710.05233 | cs.LG

We study learning algorithms that are restricted to using a small amount of
information from their i…

**2018-02-16**

1802.05968 | cs.IT

Shannon's mathematical theory of communication defines fundamental limits on
how much information ca…