
In probability theory, de Finetti's theorem explains why exchangeable observations are conditionally independent given some (usually) unobservable quantity to which an epistemic probability distribution would then be assigned. It is named in honor of Bruno de Finetti.


One of the differences between Bayesian and frequentist methods in statistical inference is that frequentists often treat observations as independent that Bayesians treat as exchangeable. A Bayesian statistician will often seek the conditional probability distribution of that unobservable quantity given the observable data. The concept of exchangeability (see below) was introduced by de Finetti. De Finetti's theorem explains the mathematical relationship between independence and exchangeability.

## Statement of the theorem

An infinite sequence

- $X_1, X_2, X_3, \dots$

of random variables is said to be exchangeable if, for any finite cardinal number n and any two finite sequences i_{1}, ..., i_{n} and j_{1}, ..., j_{n} (each sequence consisting of distinct indices), the two sequences

- $X_{i_1},\dots,X_{i_n} \text{ and } X_{j_1},\dots,X_{j_n}$

both have the same probability distribution. The condition of exchangeability is stronger than the assumption of identical distribution of the individual random variables in the sequence, and weaker than the assumption that they are independent and identically distributed.
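This ordering of the three conditions can be illustrated concretely. The sketch below (the two-point mixture with values 1/4 and 3/4 is an arbitrary choice for illustration, not taken from the article) computes joint probabilities under a mixture of i.i.d. Bernoulli sequences: every permutation of a sequence receives the same probability, so the sequence is exchangeable, yet the variables fail to be independent.

```python
from itertools import permutations

# Hypothetical mixing distribution: p = 1/4 or p = 3/4, each with weight 1/2.
weights = {0.25: 0.5, 0.75: 0.5}

def joint_prob(xs):
    """P(X_1 = xs[0], ..., X_n = xs[-1]) under the mixture: average the
    i.i.d. Bernoulli(p) likelihood over the mixing weights."""
    total = 0.0
    for p, w in weights.items():
        like = 1.0
        for x in xs:
            like *= p if x == 1 else 1 - p
        total += w * like
    return total

# Exchangeability: the joint probability is invariant under permutations.
seq = (1, 1, 0, 1)
probs = {joint_prob(perm) for perm in permutations(seq)}
assert max(probs) - min(probs) < 1e-12

# But the variables are not independent:
p1 = joint_prob((1,))        # P(X_1 = 1) = 0.5
p11 = joint_prob((1, 1))     # P(X_1 = 1, X_2 = 1) = 0.3125 != 0.25
assert abs(p11 - p1 * p1) > 1e-3
```

The same computation with any non-degenerate mixing distribution gives the same qualitative picture: identically distributed and exchangeable, but dependent.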

A random variable X has a Bernoulli distribution if $\Pr(X=1) = p$ and $\Pr(X=0) = 1-p$, for some p ∈ (0, 1).

De Finetti's theorem states that the probability distribution of any infinite exchangeable sequence of Bernoulli random variables is a "mixture" of the probability distributions of independent and identically distributed sequences of Bernoulli random variables. "Mixture", in this sense, means a weighted average, but this need not mean a finite or countably infinite (i.e., discrete) weighted average: it can be an integral rather than a sum.
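Concretely, if k of the values x_1, ..., x_n equal 1, the mixture says P(X_1 = x_1, ..., X_n = x_n) is the average of p^k (1 − p)^(n−k) over the mixing distribution m. A short sketch (the particular mixtures are illustrative choices): a discrete m gives a plain weighted sum, and a continuous m (here uniform on [0, 1], approximated by a fine grid) gives an integral with a known closed form.

```python
from math import comb

def mixture_prob(xs, mixing):
    """P(X_1 = x_1, ..., X_n = x_n) as a weighted average of i.i.d.
    Bernoulli(p) likelihoods: sum of m(p) * p^k * (1-p)^(n-k) over p,
    where k is the number of ones among the x_i."""
    n, k = len(xs), sum(xs)
    return sum(w * p**k * (1 - p)**(n - k) for p, w in mixing.items())

xs = (1, 0, 1, 1)
n, k = len(xs), sum(xs)

# A discrete (countable) mixture is a finite weighted sum:
discrete = mixture_prob(xs, {0.25: 0.5, 0.75: 0.5})

# A continuous mixing distribution turns the sum into an integral; here the
# uniform distribution on [0, 1], approximated by a midpoint grid:
N = 100_000
grid = {(i + 0.5) / N: 1.0 / N for i in range(N)}
approx = mixture_prob(xs, grid)

# For uniform m the integral has a closed form:
#   integral of p^k (1-p)^(n-k) dp = k!(n-k)!/(n+1)! = 1/((n+1) * C(n, k))
exact = 1 / ((n + 1) * comb(n, k))
assert abs(approx - exact) < 1e-6
```

The grid approximation stands in for the integral that replaces the sum when m is not discrete.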

More precisely, suppose X_{1}, X_{2}, X_{3}, ... is an infinite exchangeable sequence of Bernoulli-distributed random variables. Then there is some probability distribution m on the interval [0, 1] and some random variable Y such that

- The probability distribution of Y is m, and
- The conditional probability distribution of the whole sequence X_{1}, X_{2}, X_{3}, ... given the value of Y is described by saying that
  - X_{1}, X_{2}, X_{3}, ... are conditionally independent given Y, and
  - For any i ∈ {1, 2, 3, ...}, the conditional probability that X_{i} = 1, given the value of Y, is Y.
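The two bullet points describe a two-stage sampling scheme: draw Y from m, then draw the X_i independently as Bernoulli(Y). A minimal simulation (the two-point mixing distribution is an arbitrary illustrative choice) checks the conditional independence empirically: given Y = 3/4, the pair (X_1, X_2) behaves like two independent Bernoulli(3/4) draws.

```python
import random

random.seed(0)

# Hypothetical mixing distribution m: Y is 1/4 or 3/4, each with weight 1/2.
mixing = {0.25: 0.5, 0.75: 0.5}

def sample_sequence(n, mixing):
    """Two-stage sampling in the theorem's form: first draw Y ~ m,
    then draw X_1, ..., X_n i.i.d. Bernoulli(Y) given Y."""
    y = random.choices(list(mixing), weights=list(mixing.values()))[0]
    xs = [1 if random.random() < y else 0 for _ in range(n)]
    return y, xs

# Conditionally on Y = 3/4, P(X_1 = 1, X_2 = 1 | Y) should be near (3/4)^2.
trials = [sample_sequence(2, mixing) for _ in range(200_000)]
given = [xs for y, xs in trials if y == 0.75]
p11 = sum(x[0] * x[1] for x in given) / len(given)
# p11 is close to 0.5625, confirming conditional independence given Y
```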

### Another way of stating the theorem

Suppose X_{1}, X_{2}, X_{3}, ... is an infinite exchangeable sequence of Bernoulli-distributed random variables. Then X_{1}, X_{2}, X_{3}, ... are conditionally independent given the tail sigma-field.

## Example

Suppose p = 2/3 with probability 1/2 and p = 9/10 with probability 1/2. Suppose the conditional distribution of the sequence

- $X_1, X_2, X_3, \dots$

given the event that p = 2/3, is described by saying that they are independent and identically distributed and X_{1} = 1 with probability 2/3 and X_{1} = 0 with probability 1 − (2/3). Further, the conditional distribution of the same sequence given the event that p = 9/10, is described by saying that they are independent and identically distributed and X_{1} = 1 with probability 9/10 and X_{1} = 0 with probability 1 − (9/10). The independence asserted here is conditional independence, i.e., the Bernoulli random variables in the sequence are conditionally independent given the event that p = 2/3, and are conditionally independent given the event that p = 9/10. But they are not unconditionally independent; they are positively correlated. In view of the strong law of large numbers, we can say that
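The positive correlation can be computed exactly: for i ≠ j, conditional independence gives E[X_i] = E[p] and E[X_i X_j] = E[p²], so Cov(X_1, X_2) = Var(p), which is positive whenever p is not constant. A short exact computation with the example's two-point distribution:

```python
from fractions import Fraction

half = Fraction(1, 2)
mixing = {Fraction(2, 3): half, Fraction(9, 10): half}

# For i != j, conditional independence given p yields:
#   E[X_i]     = E[p]
#   E[X_i X_j] = E[p^2]
Ep = sum(w * p for p, w in mixing.items())
Ep2 = sum(w * p * p for p, w in mixing.items())

cov = Ep2 - Ep * Ep   # Cov(X_1, X_2) = Var(p), positive unless p is constant
print(Ep, cov)        # prints 47/60 49/3600
```

So P(X_i = 1) = 47/60 for each i, while Cov(X_1, X_2) = 49/3600 > 0.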

- $\lim_{n\rightarrow\infty} (X_1+\cdots+X_n)/n = \begin{cases} 2/3 & \text{with probability } 1/2, \\ 9/10 & \text{with probability } 1/2. \end{cases}$

Rather than concentrating probability 1/2 at each of two points between 0 and 1, the "mixing distribution" can be any probability distribution supported on the interval from 0 to 1; which one it is depends on the joint distribution of the infinite sequence of Bernoulli random variables.
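The same convergence holds for a continuous mixing distribution. A small simulation (the choice Y ~ Beta(2, 5) is purely illustrative): the running average of the conditionally i.i.d. Bernoulli draws settles on the sampled value of Y itself, not on its mean E[Y] = 2/7.

```python
import random

random.seed(1)

# Assume, for illustration, a continuous mixing distribution: Y ~ Beta(2, 5).
y = random.betavariate(2, 5)

# Given Y, draw a long i.i.d. Bernoulli(Y) sequence; by the strong law of
# large numbers the running average converges to Y, not to E[Y] = 2/7.
n = 200_000
mean = sum(1 if random.random() < y else 0 for _ in range(n)) / n
# mean is within sampling error of y
```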

The conclusion of the first version of the theorem above makes sense if the sequence of exchangeable Bernoulli random variables is finite, but the theorem is not generally true in that case. It is true if the sequence can be extended to an exchangeable sequence that is infinitely long. The simplest example of an exchangeable sequence of Bernoulli random variables that cannot be so extended is the one in which X_{1} = 1 − X_{2} and X_{1} is either 0 or 1, each with probability 1/2. This sequence is exchangeable, but cannot be extended to an exchangeable sequence of length 3, let alone an infinitely long one.
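The failure in the finite case can be checked directly. For the pair (X_1, X_2) with X_2 = 1 − X_1, any mixture representation would force E[p] = 1/2 (to match P(X_1 = 1) = 1/2) and hence, by Jensen's inequality, P(X_1 = 1, X_2 = 1) = E[p²] ≥ (E[p])² = 1/4, whereas the true value is 0. A sketch verifying both the exchangeability and the obstruction (the grid of two-point mixtures is an illustrative spot-check, not a full proof):

```python
# Joint law of (X_1, X_2) with X_2 = 1 - X_1 and X_1 ~ Bernoulli(1/2):
joint = {(0, 1): 0.5, (1, 0): 0.5, (0, 0): 0.0, (1, 1): 0.0}

# Exchangeable: swapping the two coordinates leaves the law unchanged.
assert all(joint[(a, b)] == joint[(b, a)] for (a, b) in joint)

# Obstruction: any mixing distribution with mean 1/2 gives
#   P(X_1 = 1, X_2 = 1) = E[p^2] >= (E[p])^2 = 1/4 > 0,
# but the true value is 0. Spot-check over symmetric two-point mixtures
# at p and 1 - p (each with weight 1/2, so the mean is always 1/2):
for i in range(1, 100):
    p = i / 100
    ep2 = 0.5 * p**2 + 0.5 * (1 - p)**2
    assert ep2 >= 0.25          # never reaches the required value 0
```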

Wikipedia, the free encyclopedia © 2001-2006 Wikipedia contributors (Disclaimer)

This article is licensed under the GNU Free Documentation License.

Last updated on Tuesday June 26, 2007 at 17:10:12 PDT (GMT -0700)

