The function H is understood to be a real-valued function on the space of states (x_1, x_2, …), while β is a real-valued free parameter (conventionally, the inverse temperature). The sum over the x_i is understood to be a sum over all possible values that each random variable x_i may take. When the x_i are continuous rather than discrete, the sum is to be replaced by an integral, and one writes

Z(β) = ∫ exp(−β H(x_1, x_2, …)) dx_1 dx_2 ⋯

for the case of continuously-varying x_i.
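As a minimal sketch of the discrete case, the sum over all configurations can be carried out explicitly for a small system. The toy system here, a one-dimensional chain of three binary spins with a nearest-neighbor Hamiltonian, is an illustrative choice, not anything fixed by the text:

```python
import itertools
import math

# Toy system: a 1-D chain of binary spins s_i in {-1, +1}.
# The Hamiltonian (nearest-neighbor couplings) is an illustrative choice.
def hamiltonian(s):
    # H(s) = -sum over nearest-neighbor pairs of s_i * s_{i+1}
    return -sum(s[i] * s[i + 1] for i in range(len(s) - 1))

def partition_function(beta, n_spins=3):
    # Z(beta) = sum over all configurations of exp(-beta * H)
    return sum(math.exp(-beta * hamiltonian(s))
               for s in itertools.product([-1, 1], repeat=n_spins))
```

For a free-ended chain the sum factors bond by bond, so the brute-force result can be checked against the closed form Z = 2 (2 cosh β)^(n−1).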
Such is the case for the partition function in quantum field theory.
A common, useful modification to the partition function is to introduce auxiliary functions. This allows, for example, the partition function to be used as a generating function for correlation functions. This is discussed in greater detail below.
The role or meaning of the parameter β is best understood by examining the derivation of the partition function with maximum entropy methods. Here, the parameter β appears as a Lagrange multiplier; the multiplier is used to guarantee that the expectation value of some quantity is preserved by the distribution of probabilities. Thus, in physics problems, the use of just one parameter β reflects the fact that there is only one expectation value that must be held constant: this is the energy. For the grand canonical ensemble, there are two Lagrange multipliers: one to hold the energy constant, and another (the fugacity) to hold the particle count constant. In the general case, there is a set of parameters (β_1, β_2, …) taking the place of β, one for each constraint enforced by a multiplier, together with a corresponding set of functions (H_1, H_2, …). Thus, for the general case, one has

Z(β_1, β_2, …) = Σ_{x_i} exp(−Σ_k β_k H_k(x_1, x_2, …))

The corresponding Gibbs measure then provides a probability distribution such that the expectation value of each H_k is a fixed value.
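The multi-parameter case can be sketched directly. The two observables H1 and H2 below are illustrative choices (a magnetization-like quantity and an end-to-end interaction), each paired with its own multiplier:

```python
import itertools
import math

# Two illustrative observables on three binary spins, each with
# its own Lagrange multiplier beta_k.
def H1(s):
    return -sum(s)           # magnetization-like quantity (illustrative)

def H2(s):
    return -s[0] * s[-1]     # end-to-end interaction (illustrative)

def generalized_Z(beta1, beta2, n=3):
    # Z(beta1, beta2) = sum over configurations of
    # exp(-(beta1 * H1 + beta2 * H2))
    return sum(math.exp(-(beta1 * H1(s) + beta2 * H2(s)))
               for s in itertools.product([-1, 1], repeat=n))
```

Dividing each Boltzmann weight by Z(β_1, β_2) gives the Gibbs probabilities under which the averages of H1 and H2 take fixed values determined by β_1 and β_2.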
Although the value of β is commonly taken to be real, it need not be, in general; this is discussed in the section Normalization below.
The potential function (the Hamiltonian) itself commonly takes the form of a sum

H(x_1, x_2, …) = Σ_s V(s)

where the sum over s is a sum over some subset of the power set P(X) of the set X = {x_1, x_2, …}. For example, in statistical mechanics, such as the Ising model, the sum is over pairs of nearest neighbors. In probability theory, such as Markov networks, the sum might be over the cliques of a graph; so, for the Ising model and other lattice models, the maximal cliques are edges.
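As a sketch of a Hamiltonian written as a sum over the edges (the maximal cliques) of a graph, in the style of the Ising model, with the 4-cycle graph and the coupling J as illustrative choices:

```python
# Ising-style energy: a sum over the edges of a graph.
# The graph (a 4-cycle) and coupling J are illustrative choices.
def ising_energy(spins, edges, J=1.0):
    # H(s) = -J * sum over edges (i, j) of s_i * s_j
    return -J * sum(spins[i] * spins[j] for i, j in edges)

square = [(0, 1), (1, 2), (2, 3), (3, 0)]  # nearest neighbors on a 4-cycle
energy = ising_energy([1, 1, 1, 1], square)
```

Each term in the sum involves only the variables on one edge, which is exactly the clique-wise decomposition used in Markov networks.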
The fact that the potential function can be written as a sum usually reflects the fact that it is invariant under the action of a group symmetry, such as translational invariance. Such symmetries can be discrete or continuous; they materialize in the correlation functions for the random variables (discussed below). Thus a symmetry in the Hamiltonian becomes a symmetry of the correlation function (and vice-versa).
This symmetry has a critically important interpretation in probability theory: it implies that the Gibbs measure has the Markov property; that is, it is independent of the random variables in a certain way, or, equivalently, the measure is identical on the equivalence classes of the symmetry. This leads to the widespread appearance of the partition function in problems with the Markov property, such as Hopfield networks.
The value of the expression exp(−β H(x_1, x_2, …)) can be interpreted as a likelihood that a specific configuration of values (x_1, x_2, …) occurs in the system. Thus, given a specific configuration (x_1, x_2, …),

P(x_1, x_2, …) = (1/Z(β)) exp(−β H(x_1, x_2, …))

is the probability of the configuration occurring in the system, which is now properly normalized so that 0 ≤ P(x_1, x_2, …) ≤ 1, and such that the sum over all configurations totals to one. As such, the partition function can be understood to provide a measure on the space of states; it is sometimes called the Gibbs measure. More narrowly, it is called the canonical ensemble in statistical mechanics.
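The normalization can be verified directly on a small system. The two-spin Hamiltonian and the value of β below are illustrative choices:

```python
import itertools
import math

# Gibbs probabilities P(x) = exp(-beta * H(x)) / Z(beta) on a toy
# two-spin system; H and beta are illustrative choices.
def H(s):
    return -s[0] * s[1]

beta = 0.7
states = list(itertools.product([-1, 1], repeat=2))
weights = [math.exp(-beta * H(s)) for s in states]
Z = sum(weights)
probs = [w / Z for w in weights]
```

Dividing by Z guarantees every probability lies in [0, 1] and that the probabilities sum to one.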
There exists at least one configuration (x_1, x_2, …) for which the probability is maximized; this configuration is conventionally called the ground state. If the configuration is unique, the ground state is said to be non-degenerate, and the system is said to be ergodic; otherwise the ground state is degenerate. The ground state may or may not commute with the generators of the symmetry; if it commutes, it is said to be an invariant measure. When it does not commute, the symmetry is said to be spontaneously broken.
Conditions under which a ground state exists and is unique are given by the Karush–Kuhn–Tucker conditions; these conditions are commonly used to justify the use of the Gibbs measure in maximum-entropy problems.
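Finding the ground state(s) amounts to finding the configurations that maximize the Gibbs probability, i.e. that minimize H. The two-spin ferromagnetic Hamiltonian below is an illustrative choice whose ground state happens to be degenerate:

```python
import itertools

# Ground states: configurations minimizing H (maximizing the Gibbs
# probability). The two-spin H below is an illustrative choice with
# a doubly degenerate ground state.
def H(s):
    return -s[0] * s[1]

states = list(itertools.product([-1, 1], repeat=2))
E_min = min(H(s) for s in states)
ground_states = [s for s in states if H(s) == E_min]
```

Here both aligned configurations (+1, +1) and (−1, −1) attain the minimum energy, so the ground state is degenerate; it maps to itself under the spin-flip symmetry only as a set, not configuration by configuration.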
Taking the derivative of the logarithm of the partition function with respect to β,

⟨H⟩ = −∂ log Z(β) / ∂β

gives the average (expectation value) of H. In physics, this would be called the average energy of the system.
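This identity can be checked numerically by comparing a finite-difference derivative of log Z with the Gibbs average of H. The two-spin system and β value are illustrative choices:

```python
import itertools
import math

# Check that -d(log Z)/d(beta) equals the Gibbs average of H, using a
# central finite difference on a toy two-spin system (illustrative).
def H(s):
    return -s[0] * s[1]

states = list(itertools.product([-1, 1], repeat=2))

def log_Z(beta):
    return math.log(sum(math.exp(-beta * H(s)) for s in states))

def avg_H(beta):
    Z = math.exp(log_Z(beta))
    return sum(H(s) * math.exp(-beta * H(s)) for s in states) / Z

beta, h = 0.9, 1e-6
derivative = -(log_Z(beta + h) - log_Z(beta - h)) / (2 * h)
```

For this system Z(β) = 4 cosh β, so both quantities should equal −tanh β.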
The entropy is given by

S = −Σ_x P(x) log P(x) = β⟨H⟩ + log Z(β)

where P(x) is the Gibbs probability of configuration x. The Gibbs measure is the unique statistical distribution that maximizes the entropy for a fixed expectation value of the energy; this underlies its use in maximum entropy methods.
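The identity S = β⟨H⟩ + log Z(β) follows by substituting P(x) = exp(−βH(x))/Z(β) into −Σ P log P, and can be verified numerically. The two-spin system and β are illustrative choices:

```python
import itertools
import math

# Entropy of the Gibbs distribution: S = -sum of P * log(P), which
# should equal beta * <H> + log Z. Toy two-spin system (illustrative).
def H(s):
    return -s[0] * s[1]

beta = 1.2
states = list(itertools.product([-1, 1], repeat=2))
weights = [math.exp(-beta * H(s)) for s in states]
Z = sum(weights)
probs = [w / Z for w in weights]

S = -sum(p * math.log(p) for p in probs)
avg_H = sum(H(s) * p for s, p in zip(states, probs))
```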
By introducing artificial auxiliary functions J_k into the partition function, it can then be used to obtain the expectation value of the random variables. Thus, for example, by writing

Z(β, J) = Z(β, J_1, J_2, …) = Σ_{x_i} exp(−β H(x_1, x_2, …) + Σ_k J_k x_k)

one then has

⟨x_k⟩ = E[x_k] = ∂ log Z(β, J) / ∂J_k |_{J=0}

as the expectation value of x_k.
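The generating-function mechanism can be checked numerically: differentiating log Z(β, J) with respect to J_k at J = 0 (here by a central finite difference) should recover the Gibbs expectation of x_k. The three-spin Hamiltonian and β are illustrative choices:

```python
import itertools
import math

# Source-field check: d(log Z(beta, J))/dJ_0 at J = 0 should equal <x_0>.
# Toy three-spin chain (illustrative).
def H(s):
    return -s[0] * s[1] - s[1] * s[2]

states = list(itertools.product([-1, 1], repeat=3))
beta = 0.8

def log_Z(J):
    # Z(beta, J) = sum of exp(-beta * H(x) + sum of J_k * x_k)
    return math.log(sum(
        math.exp(-beta * H(s) + sum(Jk * sk for Jk, sk in zip(J, s)))
        for s in states))

def avg_x(k):
    Z = math.exp(log_Z([0.0, 0.0, 0.0]))
    return sum(s[k] * math.exp(-beta * H(s)) for s in states) / Z

h = 1e-6
J_plus = [h, 0.0, 0.0]
J_minus = [-h, 0.0, 0.0]
dlogZ_dJ0 = (log_Z(J_plus) - log_Z(J_minus)) / (2 * h)
```

For this spin-flip-symmetric system ⟨x_0⟩ vanishes, and the finite-difference derivative of log Z agrees.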