# Negentropy

The negative entropy, negentropy, or syntropy of a living system is the entropy that it exports to keep its own entropy low (see entropy and life). The concept and phrase "negative entropy" were introduced by Erwin Schrödinger in his 1944 popular-science book *What is Life?* Later, Léon Brillouin shortened the phrase to negentropy, to express it in a more "positive" way: a living system imports negentropy and stores it. In 1974, Albert Szent-Györgyi proposed replacing the term negentropy with syntropy. That term may have originated in the 1940s with the Italian mathematician Luigi Fantappiè, who tried to construct a unified theory of biology and physics. (This attempt has not gained renown or borne great fruit.) Buckminster Fuller tried to popularize this usage, but negentropy remains common.

In a note to *What is Life?*, Schrödinger explained his use of this phrase:

> [...] if I had been catering for them [physicists] alone I should have let the discussion turn on free energy instead. It is the more familiar notion in this context. But this highly technical term seemed linguistically too near to energy for making the average reader alive to the contrast between the two things.

## Information theory

In information theory and statistics, negentropy is used as a measure of distance to normality. Consider a signal with a certain distribution; if the signal is Gaussian, it is said to have a normal distribution. Negentropy is always nonnegative, is invariant under any invertible linear change of coordinates, and vanishes if and only if the signal is Gaussian.

Negentropy is defined as

$J(p_x) = S(\varphi_x) - S(p_x),$

where $S(\varphi_x)$ stands for the differential entropy of the Gaussian density with the same mean and variance as $p_x$, and $S(p_x)$ is the differential entropy of $p_x$:

$S(p_x) = -\int p_x(u) \log p_x(u)\, du.$

Negentropy is used in statistics and signal processing. It is related to network entropy, which is used in independent component analysis. Negentropy can be understood intuitively as the information that can be saved when representing $p_x$ in an efficient way: a Gaussian random variable with the same mean and variance as $p_x$ would need the maximum length of data to be represented, even in the most efficient way. Since $p_x$ is less random, something about it is known beforehand; it contains less unknown information and needs less data to be represented efficiently.
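As a hypothetical illustration (not part of the original text), the negentropy of a uniform distribution can be computed in closed form, since both its differential entropy and that of the Gaussian with matching variance are known analytically. The function names here are illustrative:

```python
import math

def gaussian_entropy(variance):
    # Differential entropy (in nats) of a Gaussian with the given variance:
    # S = (1/2) ln(2 * pi * e * variance)
    return 0.5 * math.log(2 * math.pi * math.e * variance)

def uniform_negentropy(a, b):
    # Uniform distribution on [a, b]:
    #   differential entropy S = ln(b - a), variance = (b - a)^2 / 12.
    # Negentropy J = S(gaussian with same variance) - S(uniform).
    width = b - a
    s_uniform = math.log(width)
    s_gauss = gaussian_entropy(width ** 2 / 12)
    return s_gauss - s_uniform

# The result is positive and independent of the interval width:
# J = (1/2) ln(pi * e / 6), roughly 0.176 nats.
J = uniform_negentropy(0.0, 1.0)
```

Note that the width cancels out: negentropy is invariant under the linear change of coordinates that rescales the interval, consistent with the invariance property stated above.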

## Correlation between statistical negentropy and Gibbs' free energy

There is a physical quantity closely linked to free energy (free enthalpy), with units of entropy and isomorphic to the negentropy known in statistics and information theory. In 1873, Willard Gibbs created a diagram illustrating the concept of free energy corresponding to free enthalpy. On the diagram one can see a quantity called the capacity for entropy: the amount by which the entropy of the system may be increased without changing its internal energy or increasing its volume. In other words, it is the difference between the maximum possible entropy, under the assumed conditions, and the actual entropy. It corresponds exactly to the definition of negentropy adopted in statistics and information theory. A similar physical quantity was introduced in 1869 by Massieu for the isothermal process (the two quantities differ only in sign) and later by Planck for the isothermal-isobaric process. More recently, the Massieu-Planck thermodynamic potential, known also as free entropy, has been shown to play a great role in the so-called entropic formulation of statistical mechanics, applied among others in molecular biology and in non-equilibrium thermodynamic processes:

$J = S_{\max} - S = -\Phi = -k \ln Z,$

where:
$J$ - negentropy (Gibbs' "capacity for entropy")
$\Phi$ - Massieu potential
$Z$ - partition function
$k$ - the Boltzmann constant
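For context, a standard statistical-mechanics identity (not stated in the original text) relates the Massieu potential to the Helmholtz free energy $F = U - TS$:

$\Phi = k \ln Z = -\frac{F}{T} = S - \frac{U}{T},$

so that, with the sign convention used above, $J = -\Phi = F/T$: at fixed temperature, this negentropy is proportional to the free energy, which is the correlation this section describes.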

## Organization theory

In 1988, on the basis of Shannon's definition of statistical entropy, Mario Ludovico gave a formal definition of syntropy as a measure of the degree of organization internal to any system formed by interacting components. According to that definition, syntropy is a quantity complementary to entropy: the sum of the two quantities is a constant, specific to the system, which identifies the system's transformation potential. Using these definitions, the theory develops equations capable of describing and simulating any possible evolution of the system, either toward higher or lower levels of "internal organization" (i.e., syntropy) or toward the system's collapse.
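Ludovico's exact formalism is not reproduced here, but the complementarity can be sketched with a hypothetical definition in which syntropy is the gap between a system's Shannon entropy and its maximum possible entropy; the function names and normalization are illustrative assumptions, not Ludovico's own notation:

```python
import math

def shannon_entropy(p):
    # Shannon entropy in nats; zero-probability terms contribute nothing.
    return -sum(q * math.log(q) for q in p if q > 0)

def syntropy(p):
    # Hypothetical complementary quantity: H_max - H, so that
    # entropy + syntropy = H_max, a constant for a fixed number of states.
    h_max = math.log(len(p))
    return h_max - shannon_entropy(p)

uniform = [0.25] * 4             # maximally disordered: syntropy is 0
ordered = [1.0, 0.0, 0.0, 0.0]   # fully organized: syntropy is ln(4)
```

Under this sketch, any distribution over four states has entropy plus syntropy equal to $\ln 4$, mirroring the constant "transformation potential" described above.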

In risk management, negentropy is the force that seeks to achieve effective organizational behavior and leads to a steady, predictable state.