Boltzmann, Ludwig, 1844-1906, Austrian physicist, b. Vienna, educated at Univ. of Vienna. He began teaching (1869) at Graz Univ. In 1873 he became mathematics professor at Vienna and then physics professor at Graz (1876), Munich (1890), Vienna (1895), and Leipzig (1900). Boltzmann made important contributions to the kinetic theory of gases and to statistical mechanics—the Boltzmann constant, which relates the mean total energy of a molecule to its absolute temperature, is used widely in statistics and is named for him. Working independently, he demonstrated a law on black body radiation that had been stated by the Austrian physicist Josef Stefan; hence the law is sometimes known as the Stefan-Boltzmann law.

The Columbia Electronic Encyclopedia Copyright © 2004.

Licensed from Columbia University Press


In statistical thermodynamics, Boltzmann's equation is a probability equation relating the entropy S of an ideal gas to the quantity W, which is the number of microstates corresponding to a given macrostate:

- $S = k \log W$ (1)

where k is Boltzmann's constant, equal to 1.38062 × 10^{-23} joule/kelvin, and W is the number of microstates consistent with the given macrostate. In short, the Boltzmann formula shows the relationship between entropy and the number of ways the atoms or molecules of a thermodynamic system can be arranged. In 1934, the Swiss physical chemist Werner Kuhn successfully derived a thermal equation of state for rubber molecules using Boltzmann's formula, which has since come to be known as the entropy model of rubber.

## History

The equation was originally formulated by Ludwig Boltzmann between 1872 and 1875, but was later put into its current form by Max Planck in about 1900. To quote Planck, "the logarithmic connection between entropy and probability was first stated by L. Boltzmann in his kinetic theory of gases."

The value of $W$, specifically, is the Wahrscheinlichkeit, or number of possible microstates corresponding to the macroscopic state of a system: the number of (unobservable) "ways" the (observable) thermodynamic state of a system can be realized by assigning different positions and momenta to the various molecules. Boltzmann's paradigm was an ideal gas of $N$ identical particles, of which $N_i$ are in the $i$-th microscopic condition (range) of position and momentum. $W$ can be counted using the formula for permutations

- $W = N!\,/\,\prod_i N_i!$ (2)

where i ranges over all possible molecular conditions and $!$ denotes factorial. The "correction" in the denominator is due to the fact that identical particles in the same condition are indistinguishable. $W$ is sometimes called the "thermodynamic probability" since it is an integer greater than one, while mathematical probabilities are always numbers between zero and one.
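As an illustrative sketch (not part of the original article), the counting formula (2) can be evaluated directly with exact integer arithmetic; the occupation numbers below are made up for the example:

```python
import math

def count_microstates(occupations):
    """W = N! / prod_i N_i!  -- number of distinct arrangements of N
    identical particles with N_i particles in the i-th condition."""
    N = sum(occupations)
    W = math.factorial(N)
    for n_i in occupations:
        W //= math.factorial(n_i)  # exact: each prefix product divides N!
    return W

# Example: 4 particles split 2/1/1 over three conditions
print(count_microstates([2, 1, 1]))   # 12
# All particles in the same condition: only one arrangement
print(count_microstates([4]))         # 1
```

The integer division is exact at every step because each partial multinomial coefficient is itself an integer, so no floating-point rounding enters the count.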

## Generalization

Boltzmann's formula applies to microstates of the universe as a whole, each possible microstate of which is presumed to be equally probable.

But in thermodynamics it is important to be able to make the approximation of dividing the universe into a system of interest plus its surroundings, and then to be able to identify the entropy of the system with the system entropy of classical thermodynamics. The microstates of such a thermodynamic system are not equally probable: for example, high-energy microstates are less probable than low-energy microstates for a system kept at a fixed temperature by contact with a heat bath.
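A standard way to make the heat-bath remark concrete (the canonical-ensemble Boltzmann factor $p_i \propto e^{-E_i/kT}$, which the article itself does not derive, and with made-up energy values) is the following sketch:

```python
import math

k = 1.38062e-23  # Boltzmann's constant (joule/kelvin), value quoted above

def canonical_probabilities(energies, T):
    """Assign each microstate i a probability proportional to
    exp(-E_i / (k*T)), normalized by the partition function Z."""
    weights = [math.exp(-E / (k * T)) for E in energies]
    Z = sum(weights)
    return [w / Z for w in weights]

# Three microstates with increasing energies (joules), bath at T = 300 K
probs = canonical_probabilities([0.0, 1e-21, 2e-21], 300)
print(probs)  # probabilities decrease as microstate energy increases
```

Running this shows exactly the claimed inequality: the higher-energy microstates receive strictly smaller probabilities, so the microstates are not equally probable.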

For thermodynamic systems where microstates of the system may not have equal probabilities, the appropriate generalization, called the Gibbs entropy, is:

- $S = -k \sum_i p_i \log p_i$ (3)

This reduces to equation (1) if the probabilities $p_i$ are all equal: with $p_i = 1/W$ for each of the $W$ microstates, $S = -k \sum_i (1/W) \log(1/W) = k \log W$.
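The reduction can be checked numerically; this is a sketch, assuming the natural logarithm and the constant value quoted earlier, with $W = 8$ chosen arbitrarily:

```python
import math

k = 1.38062e-23  # Boltzmann's constant (joule/kelvin)

def gibbs_entropy(probs):
    """S = -k sum_i p_i ln p_i  (equation 3); p_i = 0 terms contribute 0."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

W = 8
uniform = [1.0 / W] * W
# Equal probabilities: Gibbs entropy equals the Boltzmann value k ln W
print(gibbs_entropy(uniform))
print(k * math.log(W))
# A non-uniform distribution over the same 8 microstates has lower entropy
print(gibbs_entropy([0.5] + [0.5 / 7] * 7))
```

The last line illustrates the general point of this section: once the microstates are not equally probable, equation (3) and equation (1) no longer agree, and (3) is the one that applies.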

Boltzmann used a $\rho \log \rho$ formula as early as 1866. He interpreted $\rho$ as a density in phase space, without mentioning probability, but since this satisfies the axiomatic definition of a probability measure we can retrospectively interpret it as a probability anyway. Gibbs gave an explicitly probabilistic interpretation in 1878.

Boltzmann himself used an expression equivalent to (3) in his later work and recognized it as more general than equation (1). That is, equation (1) is a corollary of equation (3), not vice versa: in every situation where equation (1) is valid, equation (3) is valid also, but not conversely.

## Boltzmann entropy excludes statistical dependencies

The term Boltzmann entropy is also sometimes used to indicate entropies calculated based on the approximation that the overall probability can be factored into an identical separate term for each particle, i.e., assuming each particle has an identical, independent probability distribution, ignoring interactions and correlations between the particles. This is exact for an ideal gas of identical particles, and may or may not be a good approximation for other systems.
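A toy illustration of what the factorized approximation misses (the two-particle distribution below is hypothetical, and entropy is computed in units where $k = 1$): for two perfectly correlated particles, the entropy of the joint distribution is half what the per-particle (factorized) estimate gives.

```python
import math

def entropy(probs):
    """-sum p ln p over a probability distribution (units of k set to 1)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Toy joint distribution for two particles, each in state 0 or 1,
# perfectly correlated: both particles are always in the same state.
joint = {(0, 0): 0.5, (1, 1): 0.5}

# Marginal distribution of each particle taken alone: 50/50
marginal = [0.5, 0.5]

S_joint = entropy(joint.values())     # ln 2, the true entropy
S_factorized = 2 * entropy(marginal)  # 2 ln 2, ignores the correlation
print(S_joint, S_factorized)
```

Because ignoring correlations can only discard information, the factorized estimate is never smaller than the true joint entropy, and it matches exactly only when the particles really are independent, as in an ideal gas.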

Wikipedia, the free encyclopedia © 2001-2006 Wikipedia contributors

This article is licensed under the GNU Free Documentation License.

Last updated on Tuesday September 02, 2008 at 12:13:00 PDT (GMT -0700)


Copyright © 2015 Dictionary.com, LLC. All rights reserved.