In a note to What is Life? Schrödinger explained his use of this phrase.
[...] if I had been catering for them [physicists] alone I should have let the discussion turn on free energy instead. It is the more familiar notion in this context. But this highly technical term seemed linguistically too near to energy for making the average reader alive to the contrast between the two things.
Negentropy is defined as

$$J(x) = S(\varphi) - S(x),$$

where $S(\varphi)$ is the differential entropy of a Gaussian density $\varphi$ with the same mean and variance as $x$, and $S(x)$ is the differential entropy of $x$.
Negentropy is used in statistics and signal processing. It is related to network entropy, which is used in independent component analysis. Negentropy can be understood intuitively as the information that can be saved when representing $x$ in an efficient way: if $x$ were a random variable with a Gaussian distribution of the same mean and variance, it would need the maximum length of data to be represented, even in the most efficient way. Since $x$ is less random, something about it is known beforehand; it contains less unknown information and needs a shorter length of data to be represented efficiently.
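A minimal numerical sketch of this definition in Python may make it concrete. The histogram-based entropy estimator, the bin count, and the function names here are illustrative choices, not a standard implementation; the Gaussian term uses the closed form $S = \tfrac{1}{2}\ln(2\pi e \sigma^2)$.

```python
import numpy as np

def gaussian_entropy(var):
    # Differential entropy of a 1-D Gaussian: S = 0.5 * ln(2*pi*e*var)
    return 0.5 * np.log(2.0 * np.pi * np.e * var)

def histogram_entropy(x, bins=64):
    # Crude plug-in estimate of the differential entropy S(x)
    density, edges = np.histogram(x, bins=bins, density=True)
    p = density * np.diff(edges)       # probability mass in each bin
    nz = p > 0
    return -np.sum(p[nz] * np.log(density[nz]))

def negentropy(x, bins=64):
    # J(x) = S(phi) - S(x), where phi is Gaussian with the same variance.
    # Non-negative up to estimation error; zero iff x is Gaussian.
    return gaussian_entropy(np.var(x)) - histogram_entropy(x, bins)

rng = np.random.default_rng(0)
print(negentropy(rng.normal(size=100_000)))   # ~ 0 for Gaussian data
print(negentropy(rng.uniform(size=100_000)))  # > 0 for non-Gaussian data
```

Because the Gaussian maximizes differential entropy for a given variance, the estimate is near zero for normal samples and strictly positive for the uniform ones, matching the intuition above.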
There is a physical quantity closely linked to free energy (free enthalpy) that has units of entropy and is isomorphic to the negentropy known in statistics and information theory. In 1873, Willard Gibbs created a diagram illustrating the concept of free energy corresponding to free enthalpy. On the diagram one can see a quantity called the capacity for entropy: the amount by which the entropy of a body may be increased without changing its internal energy or increasing its volume. In other words, it is the difference between the maximum possible entropy, under the assumed conditions, and the actual entropy. It corresponds exactly to the definition of negentropy adopted in statistics and information theory. A similar physical quantity was introduced in 1869 by Massieu for the isothermal process (the two quantities differ only in sign) and later by Planck for the isothermal-isobaric process. More recently, the Massieu-Planck thermodynamic potential, known also as free entropy, has been shown to play a significant role in the so-called entropic formulation of statistical mechanics, applied among other fields in molecular biology and thermodynamic non-equilibrium processes.
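For reference, the two potentials can be written in the standard sign conventions, where $F$ is the Helmholtz free energy, $G$ the Gibbs free energy (free enthalpy), $U$ the internal energy, $S$ the entropy, $T$ the temperature, $p$ the pressure, and $V$ the volume:

$$\Phi = -\frac{F}{T} = S - \frac{U}{T} \qquad \text{(Massieu, isothermal)}$$

$$\Xi = -\frac{G}{T} = S - \frac{U + pV}{T} \qquad \text{(Planck, isothermal-isobaric)}$$

Both are entropies reduced by an energy term divided by temperature, which is why they carry units of entropy while differing from the corresponding free energies only by a factor of $-1/T$.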
In 1988, on the basis of Shannon's definition of statistical entropy, Mario Ludovico gave a formal definition of syntropy as a measure of the degree of organization internal to any system formed by interacting components. According to that definition, syntropy is a quantity complementary to entropy: the sum of the two quantities is a constant value specific to the system, and that constant identifies the system's transformation potential. Using these definitions, the theory develops equations suited to describing and simulating any possible evolution of the system, whether toward higher or lower levels of "internal organization" (i.e., syntropy) or toward the system's collapse.
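A minimal sketch of this complementary relation, assuming syntropy is taken as the complement of Shannon entropy with respect to its maximum for a system with a fixed number of states; the normalization and function names below are illustrative assumptions, not Ludovico's exact formalism:

```python
import numpy as np

def shannon_entropy(p):
    # H(p) = -sum p_i * ln(p_i), ignoring zero-probability states
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz]))

def syntropy(p):
    # Complement of entropy: Y = H_max - H, with H_max = ln(n),
    # so that H + Y = ln(n), a constant fixed by the system's size.
    return np.log(len(p)) - shannon_entropy(p)

uniform = [0.25, 0.25, 0.25, 0.25]   # maximally disordered: Y = 0
ordered = [0.97, 0.01, 0.01, 0.01]   # highly organized: Y near ln(4)
for p in (uniform, ordered):
    print(shannon_entropy(p) + syntropy(p))  # always ln(4) ~ 1.386
```

In this toy setting the constant sum plays the role of the transformation potential: any gain in organization (syntropy) is exactly offset by a loss of entropy, and vice versa.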
In risk management, negentropy is the force that seeks to achieve effective organizational behavior and leads to a steady, predictable state.