White noise

White noise is a random signal (or process) with a flat power spectral density. In other words, the signal contains equal power within a fixed bandwidth at any center frequency. White noise draws its name from white light in which the power spectral density of the light is distributed over the visible band in such a way that the eye's three color receptors (cones) are rather equally stimulated.

An infinite-bandwidth, white noise signal is purely a theoretical construction. By having power at all frequencies, the total power of such a signal is infinite and therefore impossible to generate. In practice, however, a signal can be "white" with a flat spectrum over a defined frequency band.


Being uncorrelated in time does not restrict the values a signal can take: any distribution of values is possible (although it must have zero DC component). For example, on Linux white noise can be generated by feeding the output of the kernel random number generator (uniformly distributed integers between 0 and 255) into the digital signal processor. Even a binary signal that can take only the values 1 or 0 will be white if the sequence of zeros and ones is statistically uncorrelated. Noise with a continuous distribution, such as a normal distribution, can of course also be white.
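As an illustrative sketch of this point (not from the original article), the following Python snippet generates sequences with three very different amplitude distributions and checks that each is essentially uncorrelated from sample to sample, which is the defining property of whiteness:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Three candidate white-noise sequences with very different amplitude
# distributions: uniform bytes, a binary 0/1 sequence, and Gaussian samples.
# Each is centred so that the DC component is zero.
uniform = rng.integers(0, 256, n).astype(float) - 127.5
binary = rng.integers(0, 2, n).astype(float) - 0.5
gauss = rng.standard_normal(n)

def lag1_correlation(x):
    """Sample autocorrelation at lag 1, normalised by the variance."""
    x = x - x.mean()
    return float(np.dot(x[:-1], x[1:]) / np.dot(x, x))

for signal in (uniform, binary, gauss):
    # Independent draws have no correlation between adjacent samples.
    assert abs(lag1_correlation(signal)) < 0.02
```

All three sequences pass the same lag-correlation check, even though only one of them is Gaussian, which anticipates the distinction drawn below between "white" and "Gaussian".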

It is often incorrectly assumed that Gaussian noise (i.e., noise with a Gaussian amplitude distribution — see normal distribution) is necessarily white noise, yet neither property implies the other. Gaussianity refers to the probability distribution of the signal's values, i.e. the probability that the signal takes a certain given value, while the term 'white' refers to the way the signal's power is distributed over time or among frequencies.

We can therefore find Gaussian white noise, but also Poisson, Cauchy, etc. white noises. Thus, the two words "Gaussian" and "white" are often both specified in mathematical models of systems. Gaussian white noise is a good approximation of many real-world situations and generates mathematically tractable models. These models are used so frequently that the term additive white Gaussian noise has a standard abbreviation: AWGN. Gaussian white noise has the useful statistical property that its values are independent (see Statistical independence).
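As a small illustration of the AWGN model (the sine signal, noise level, and SNR figure below are assumptions for this sketch, not from the article), white Gaussian noise is simply added sample-by-sample to a clean signal:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical clean signal: one cycle of a unit-amplitude sine.
t = np.linspace(0.0, 1.0, 1000, endpoint=False)
clean = np.sin(2 * np.pi * t)

# AWGN channel: add independent zero-mean Gaussian samples of power sigma^2.
sigma = 0.1
noisy = clean + sigma * rng.standard_normal(t.size)

# The noise power sets the SNR: the sine has average power 1/2, so the
# SNR is roughly 0.5 / sigma^2 = 50 (about 17 dB).
snr = np.mean(clean**2) / np.mean((noisy - clean) ** 2)
```

Because each noise sample is drawn independently, the noise is both Gaussian (amplitude distribution) and white (no correlation across time), matching the standard AWGN assumption.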

White noise is the generalized mean-square derivative of the Wiener process or Brownian motion.

White noise is commonly used in the production of electronic music, usually either directly or as an input to a filter to create other types of noise signal. It is used extensively in audio synthesis, typically to recreate percussive instruments such as cymbals, which have high noise content in their frequency spectrum.

It is also used to generate impulse responses. To set up the EQ for a concert or other performance in a venue, a short burst of white or pink noise is sent through the PA system and monitored from various points in the venue so that the engineer can tell if the acoustics of the building naturally boost or cut any frequencies. The engineer can then adjust the overall EQ to ensure a balanced mix.

White noise can be used for frequency response testing of amplifiers and electronic filters. It is sometimes used with a flat response microphone and an automatic equalizer. The idea is that the system will generate white noise and the microphone will pick up the white noise produced by the speakers. It will then automatically equalize each frequency band to get a flat response. That system is used in professional level equipment, some high-end home stereo and some high-end car radios.
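A rough discrete-time sketch of this measurement idea in Python (the 5-tap moving-average "device under test" is a made-up example): drive the unknown system with white noise, average the output and input power spectra over many segments, and read off the magnitude response as their ratio:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical "device under test": a 5-tap moving-average filter.
taps = np.ones(5) / 5.0
seg_len, n_seg = 1024, 400

# Drive the system with white noise and accumulate output/input power spectra.
num = np.zeros(seg_len // 2 + 1)
den = np.zeros(seg_len // 2 + 1)
for _ in range(n_seg):
    x = rng.standard_normal(seg_len)
    y = np.convolve(x, taps, mode="same")  # the unknown system's response
    num += np.abs(np.fft.rfft(y)) ** 2
    den += np.abs(np.fft.rfft(x)) ** 2

# Estimated magnitude response, compared with the analytic one.
h_est = np.sqrt(num / den)
h_true = np.abs(np.fft.rfft(taps, seg_len))
```

Because the excitation has equal power at every frequency, a single broadband measurement characterizes the whole response at once; averaging over segments suppresses the randomness of any individual noise burst.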

White noise is used as the basis of some random number generators.

White noise can be used to disorient individuals prior to interrogation and may be used as part of sensory deprivation techniques. White noise machines are sold as privacy enhancers and sleep aids and to mask tinnitus. White noise CDs, when used with headphones, can aid concentration by blocking out irritating or distracting noises in a person's environment. In open-plan offices, large corporations such as ExxonMobil apply white noise to reduce the reach of speech, preventing office staff from being distracted by background conversations and thus safeguarding productivity.

A random vector $\mathbf{w}$ is a white random vector if and only if its mean vector and autocorrelation matrix satisfy:

- $\mu_w = \mathbb{E}\{\mathbf{w}\} = 0$

- $R_{ww} = \mathbb{E}\{\mathbf{w}\mathbf{w}^T\} = \sigma^2 \mathbf{I}.$

That is, it is a zero mean random vector, and its autocorrelation matrix is a multiple of the identity matrix. When the autocorrelation matrix is a multiple of the identity, we say that it has spherical correlation.

A continuous-time random process $w(t)$, $t \in \mathbb{R}$, is a white noise process if and only if its mean function and autocorrelation function satisfy:

- $\mu_w(t) = \mathbb{E}\{w(t)\} = 0$

- $R_{ww}(t_1, t_2) = \mathbb{E}\{w(t_1) w(t_2)\} = (N_0/2)\,\delta(t_1 - t_2).$

That is, it is a zero-mean process for all time and has infinite power at zero time shift, since its autocorrelation function is the Dirac delta function.

The above autocorrelation function implies the following power spectral density.

- $S_{ww}(\omega) = N_0/2$

since the Fourier transform of the delta function is equal to 1. Because this power spectral density is the same at all frequencies, the noise is called white, by analogy with the frequency spectrum of white light.

Two applications of a white random vector are the simulation and the whitening of an arbitrary random vector. Both are crucial in applications such as channel estimation and channel equalization in communications and audio, and the same concepts are used in data compression.

Suppose a random vector $\mathbf{x}$ has covariance matrix $K_{xx}$. Since this matrix is Hermitian symmetric and positive semidefinite, by the spectral theorem it can be diagonalized as

- $K_{xx} = E \Lambda E^T$

where $E$ is the orthogonal matrix of eigenvectors and $\Lambda$ is the diagonal matrix of eigenvalues.

We can simulate the 1st and 2nd moment properties of this random vector $\mathbf{x}$ with mean $\mu$ and covariance matrix $K_{xx}$ via the following transformation of a white vector $\mathbf{w}$:

- $\mathbf{x} = H \, \mathbf{w} + \mu$

where

- $H = E \Lambda^{1/2}$

Thus, the output of this transformation has expectation

- $\mathbb{E}\{\mathbf{x}\} = H \, \mathbb{E}\{\mathbf{w}\} + \mu = \mu$

and covariance matrix

- $\mathbb{E}\{(\mathbf{x} - \mu)(\mathbf{x} - \mu)^T\} = H \, \mathbb{E}\{\mathbf{w}\mathbf{w}^T\} \, H^T = H \, H^T = E \Lambda^{1/2} \Lambda^{1/2} E^T = K_{xx}$
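This coloring transformation is easy to check numerically. The sketch below (the 3-dimensional mean and covariance are assumed values for illustration) builds $H$ from the eigendecomposition of $K_{xx}$, transforms a large batch of white Gaussian vectors, and confirms that the sample moments match the targets:

```python
import numpy as np

rng = np.random.default_rng(3)

# Assumed target mean and covariance of a 3-dimensional random vector.
mu = np.array([1.0, -2.0, 0.5])
K = np.array([[2.0, 0.6, 0.2],
              [0.6, 1.0, 0.3],
              [0.2, 0.3, 0.5]])

# Factor K = E Lambda E^T and form H = E Lambda^{1/2}.
lam, E = np.linalg.eigh(K)
H = E @ np.diag(np.sqrt(lam))

# Transform many white vectors w ~ N(0, I): x = H w + mu.
n = 200_000
w = rng.standard_normal((3, n))
x = H @ w + mu[:, None]

sample_mu = x.mean(axis=1)   # should approximate mu
sample_K = np.cov(x)         # should approximate K
```

Only the first and second moments are reproduced; if $\mathbf{w}$ were white but non-Gaussian, the output would have the right mean and covariance without being Gaussian.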

Conversely, to whiten a random vector $\mathbf{x}$ with mean $\mu$ and covariance matrix $K_{xx}$, we apply the inverse transformation

- $\mathbf{w} = \Lambda^{-1/2} \, E^T \, (\mathbf{x} - \mu)$

Thus, the output of this transformation has expectation

- $\mathbb{E}\{\mathbf{w}\} = \Lambda^{-1/2} \, E^T \, (\mathbb{E}\{\mathbf{x}\} - \mu) = \Lambda^{-1/2} \, E^T \, (\mu - \mu) = 0$

and covariance matrix

- $\mathbb{E}\{\mathbf{w}\mathbf{w}^T\} = \mathbb{E}\{\Lambda^{-1/2} \, E^T \, (\mathbf{x} - \mu)(\mathbf{x} - \mu)^T \, E \, \Lambda^{-1/2}\}$

- $= \Lambda^{-1/2} \, E^T \, \mathbb{E}\{(\mathbf{x} - \mu)(\mathbf{x} - \mu)^T\} \, E \, \Lambda^{-1/2}$

- $= \Lambda^{-1/2} \, E^T \, K_{xx} \, E \, \Lambda^{-1/2}$

By diagonalizing $K_{xx} = E \Lambda E^T$, we get the following:

- $\Lambda^{-1/2} \, E^T \, E \Lambda E^T E \, \Lambda^{-1/2} = \Lambda^{-1/2} \, \Lambda \, \Lambda^{-1/2} = I$

Thus, with the above transformation, we can whiten the random vector to have zero mean and the identity covariance matrix.
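The whitening step can likewise be verified numerically. In this sketch (the 2-dimensional mean and covariance are assumed example values), correlated Gaussian data are whitened with $\Lambda^{-1/2} E^T (\mathbf{x} - \mu)$ and the result is checked against zero mean and identity covariance:

```python
import numpy as np

rng = np.random.default_rng(4)

# Correlated Gaussian data with an assumed covariance, for illustration.
mu = np.array([3.0, -1.0])
K = np.array([[4.0, 1.2],
              [1.2, 1.0]])
n = 200_000
x = rng.multivariate_normal(mu, K, size=n).T  # shape (2, n)

# Diagonalize K = E Lambda E^T and apply w = Lambda^{-1/2} E^T (x - mu).
lam, E = np.linalg.eigh(K)
W = np.diag(1.0 / np.sqrt(lam)) @ E.T
w = W @ (x - mu[:, None])

sample_mean = w.mean(axis=1)  # should be near 0
sample_cov = np.cov(w)        # should be near the 2x2 identity
```

This is exactly the inverse of the coloring transformation above: applying one after the other recovers the original vector up to numerical precision.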

We can simulate any wide-sense stationary, continuous-time random process $x(t),\ t \in \mathbb{R}$, with constant mean $\mu$, covariance function

- $K_x(\tau) = \mathbb{E}\left\{ (x(t_1) - \mu)(x(t_2) - \mu)^{*} \right\} \mbox{ where } \tau = t_1 - t_2$

and power spectral density

- $S_x(\omega) = \int_{-\infty}^{\infty} K_x(\tau) \, e^{-j\omega\tau} \, d\tau$

We can simulate this signal using frequency domain techniques.

Because $K_x(\tau)$ is Hermitian symmetric and positive semidefinite, it follows that $S_x(\omega)$ is real and can be factored as

- $S_x(\omega) = |H(\omega)|^2 = H(\omega) \, H^{*}(\omega)$

if and only if $S_x(\omega)$ satisfies the Paley-Wiener criterion

- $\int_{-\infty}^{\infty} \frac{\log(S_x(\omega))}{1 + \omega^2} \, d\omega < \infty$

If $S_x(\omega)$ is a rational function, we can then factor it into pole-zero form as

- $S_x(\omega) = \frac{\prod_{k=1}^{N} (c_k - j\omega)(c^{*}_k + j\omega)}{\prod_{k=1}^{D} (d_k - j\omega)(d^{*}_k + j\omega)}$

Choosing a minimum-phase $H(\omega)$, so that its poles and zeros lie inside the left half s-plane, we can then simulate $x(t)$ with $H(\omega)$ as the transfer function of a linear, time-invariant filter driven by white noise:

- $\hat{x}(t) = \mathcal{F}^{-1}\left\{ H(\omega) \right\} * w(t) + \mu$

where $w(t)$ is a continuous-time white noise signal with the following 1st and 2nd moment properties:

- $\mathbb{E}\{w(t)\} = 0$

- $\mathbb{E}\{w(t_1) w^{*}(t_2)\} = K_w(t_1, t_2) = \delta(t_1 - t_2)$

Thus, the resultant signal $\hat{x}(t)$ has the same 2nd moment properties as the desired signal $x(t)$.
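A discrete-time analogue of this construction can be sketched in Python (the article works in continuous time, and the AR(1) target below is an assumed example): the target covariance $K_x(\tau) = a^{|\tau|}$ has a minimum-phase spectral factor $H(z) = \sqrt{1-a^2}/(1 - az^{-1})$, so filtering white noise through it reproduces the target second moments:

```python
import numpy as np

rng = np.random.default_rng(5)

# Assumed target: a zero-mean WSS process with covariance K_x(tau) = a^|tau|.
# Its PSD factors into the minimum-phase filter
# H(z) = sqrt(1 - a^2) / (1 - a z^{-1}).
a = 0.8
g = np.sqrt(1.0 - a * a)
n = 200_000

# Drive the filter with unit-variance white noise.
w = rng.standard_normal(n)
x = np.empty(n)
x[0] = w[0]  # stationary start: var(x[0]) = 1
for k in range(1, n):
    x[k] = a * x[k - 1] + g * w[k]

# Sample covariance at lags 0..3 should approximate a^|tau|.
lags = [float(np.dot(x[: n - t], x[t:]) / (n - t)) for t in range(4)]
```

Only the first and second moments of $x(t)$ are matched, exactly as the text states; the filter output happens to be Gaussian here only because the driving white noise is Gaussian.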

Suppose we have a wide-sense stationary, continuous-time random process $x(t),\ t \in \mathbb{R}$, defined with the same mean $\mu$, covariance function $K_x(\tau)$, and power spectral density $S_x(\omega)$ as above.

We can whiten this signal using frequency domain techniques. We factor the power spectral density $S\_x(omega)$ as described above.

Choosing the minimum-phase $H(\omega)$, so that its poles and zeros lie inside the left half s-plane, we can then whiten $x(t)$ with the following inverse filter:

- $H_{inv}(\omega) = \frac{1}{H(\omega)}$

We choose the minimum-phase filter so that the resulting inverse filter is stable. Additionally, we must be sure that $H(\omega)$ is strictly positive for all $\omega \in \mathbb{R}$ so that $H_{inv}(\omega)$ does not have any singularities.

The final form of the whitening procedure is as follows:

- $w(t) = \mathcal{F}^{-1}\left\{ H_{inv}(\omega) \right\} * (x(t) - \mu)$

so that $w(t)$ is a white noise random process with zero mean and constant, unit power spectral density

- $S_w(\omega) = \mathcal{F}\left\{ \mathbb{E}\{w(t_1) w(t_2)\} \right\} = H_{inv}(\omega) \, S_x(\omega) \, H^{*}_{inv}(\omega) = \frac{S_x(\omega)}{S_x(\omega)} = 1$

Note that this power spectral density corresponds to a delta function for the covariance function of $w(t)$:

- $K_w(\tau) = \delta(\tau)$
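The whitening filter can also be sketched in discrete time (again an assumed AR(1) example, mirroring the simulation sketch above): the inverse of the minimum-phase spectral factor of $K_x(\tau) = a^{|\tau|}$ is $H_{inv}(z) = (1 - az^{-1})/\sqrt{1-a^2}$, and applying it to the colored process leaves uncorrelated, unit-variance samples:

```python
import numpy as np

rng = np.random.default_rng(6)

# Assumed colored process: AR(1) with covariance K_x(tau) = a^|tau|.
a = 0.8
g = np.sqrt(1.0 - a * a)
n = 200_000

x = np.empty(n)
x[0] = rng.standard_normal()
for k in range(1, n):
    x[k] = a * x[k - 1] + g * rng.standard_normal()

# Inverse filter H_inv(z) = (1 - a z^{-1}) / sqrt(1 - a^2).
w = (x[1:] - a * x[:-1]) / g

lag0 = float(np.dot(w, w) / w.size)                  # should be near 1
lag1 = float(np.dot(w[:-1], w[1:]) / (w.size - 1))   # should be near 0
```

Here the inverse filter is stable precisely because the forward filter was chosen minimum-phase, matching the stability remark above.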


Wikipedia, the free encyclopedia © 2001-2006 Wikipedia contributors (Disclaimer)

This article is licensed under the GNU Free Documentation License.

Last updated on Wednesday October 08, 2008 at 03:22:36 PDT (GMT -0700)
