
In mathematics, the Chebyshev polynomials, named after Pafnuty Chebyshev, are a sequence of orthogonal polynomials which are related to de Moivre's formula and which are easily defined recursively, like the Fibonacci or Lucas numbers. One usually distinguishes between Chebyshev polynomials of the first kind, which are denoted T_{n}, and Chebyshev polynomials of the second kind, which are denoted U_{n}. The letter T is used because of the alternative transliterations of the name Chebyshev as Tchebyshef or Tschebyscheff.

## Definition

### Trigonometric definition

### Pell equation definition

The Chebyshev polynomials can also be defined as the solutions to a Pell equation in a polynomial ring.

### Relation between Chebyshev polynomials of the first and second kind

## Explicit formulas

## Properties

### Orthogonality

### Minimal ∞-norm

### Differentiation and integration

### Roots and extrema

A Chebyshev polynomial of either kind with degree n has n different simple roots, called Chebyshev roots, in the interval [−1,1]. The roots are sometimes called Chebyshev nodes because they are used as nodes in polynomial interpolation. Using the trigonometric definition and the fact that the cosine vanishes at odd multiples of π/2, the roots can be written in closed form.

### Other properties

## Examples

## As a basis set

### Partial sums

### Polynomial in Chebyshev form

An arbitrary polynomial of degree N can be written in terms of the Chebyshev polynomials of the first kind. Such a polynomial p(x) is of the form $p(x) = \sum_{n=0}^{N} a_n T_n(x)$.

## Spread polynomials

## See also

## References

## Notes

## External links

The Chebyshev polynomials T_{n} or U_{n} are polynomials of degree n and the sequence of Chebyshev polynomials of either kind composes a polynomial sequence.

Chebyshev polynomials are important in approximation theory because the roots of the Chebyshev polynomials of the first kind, which are also called Chebyshev nodes, are used as nodes in polynomial interpolation. The resulting interpolation polynomial minimizes the problem of Runge's phenomenon and provides an approximation that is close to the polynomial of best approximation to a continuous function under the maximum norm. This approximation leads directly to the method of Clenshaw–Curtis quadrature.

In the study of differential equations they arise as the solution to the Chebyshev differential equations

- $(1-x^2)\,y'' - x\,y' + n^2\,y = 0$

- $(1-x^2)\,y'' - 3x\,y' + n(n+2)\,y = 0$

for the polynomials of the first and second kind, respectively. These equations are special cases of the Sturm–Liouville differential equation.

The Chebyshev polynomials of the first kind are defined by the recurrence relation

- $T_0(x) = 1$

- $T_1(x) = x$

- $T_{n+1}(x) = 2xT_n(x) - T_{n-1}(x).$
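As a numerical illustration, the recurrence above translates directly into code. This is a minimal sketch (the helper name `cheb_t` is ours, not standard notation); the final assertion checks the recurrence against the trigonometric identity T_n(cos t) = cos(nt) given later in the article.

```python
import math

def cheb_t(n, x):
    """T_n(x) from the recurrence T_0 = 1, T_1 = x, T_{n+1} = 2x T_n - T_{n-1}."""
    t_prev, t_curr = 1.0, x
    if n == 0:
        return t_prev
    for _ in range(n - 1):
        t_prev, t_curr = t_curr, 2 * x * t_curr - t_prev
    return t_curr

# Consistency check against T_n(cos t) = cos(n t):
theta = 0.7
assert abs(cheb_t(5, math.cos(theta)) - math.cos(5 * theta)) < 1e-12
```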

One example of a generating function for T_{n} is

- $\sum_{n=0}^{\infty} T_n(x)\,t^n = \frac{1-tx}{1-2tx+t^2}.$

The Chebyshev polynomials of the second kind are defined by the recurrence relation

- $U_0(x) = 1$

- $U_1(x) = 2x$

- $U_{n+1}(x) = 2xU_n(x) - U_{n-1}(x).$

One example of a generating function for U_{n} is

- $\sum_{n=0}^{\infty} U_n(x)\,t^n = \frac{1}{1-2tx+t^2}.$
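The second-kind recurrence and its generating function can be checked the same way. A minimal sketch (the name `cheb_u` is ours): for small |t| the partial sums of the series converge quickly to the closed form.

```python
def cheb_u(n, x):
    """U_n(x) from the recurrence U_0 = 1, U_1 = 2x, U_{n+1} = 2x U_n - U_{n-1}."""
    u_prev, u_curr = 1.0, 2 * x
    if n == 0:
        return u_prev
    for _ in range(n - 1):
        u_prev, u_curr = u_curr, 2 * x * u_curr - u_prev
    return u_curr

# Partial sums of the generating function versus 1 / (1 - 2tx + t^2):
x, t = 0.4, 0.1
partial = sum(cheb_u(n, x) * t ** n for n in range(50))
assert abs(partial - 1.0 / (1.0 - 2 * t * x + t * t)) < 1e-12
```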

The Chebyshev polynomials of the first kind can be defined by the trigonometric identity:

- $T_n(x) = \cos(n \arccos x) = \cosh(n\,\mathrm{arccosh}\,x)$

whence:

- $T_n(\cos(\vartheta)) = \cos(n\vartheta)$

for n = 0, 1, 2, 3, ..., while the polynomials of the second kind satisfy:

- $U_n(\cos(\vartheta)) = \frac{\sin((n+1)\vartheta)}{\sin\vartheta}$

which is structurally quite similar to the Dirichlet kernel.

That cos(nx) is an nth-degree polynomial in cos(x) can be seen by observing that cos(nx) is the real part of one side of de Moivre's formula, and the real part of the other side is a polynomial in cos(x) and sin(x), in which all powers of sin(x) are even and thus replaceable via the identity cos^{2}(x) + sin^{2}(x) = 1.

This identity is extremely useful in conjunction with the recursive generating formula inasmuch as it enables one to calculate the cosine of any integral multiple of an angle solely in terms of the cosine of the base angle. Evaluating the first two Chebyshev polynomials:

- $T_0(\cos(x)) = \cos(0\,x) = 1$

and:

- $T_1(\cos(x)) = \cos(x)$

one can straightforwardly determine that:

- $T_2(\cos(x)) = \cos(2x) = 2\cos^2(x) - 1$

- $T_3(\cos(x)) = \cos(3x) = 4\cos^3(x) - 3\cos(x)$

and so forth. To check quickly that the results seem reasonable, sum the coefficients on both sides of the equals sign (that is, set $x$ equal to zero, for which the cosine is unity): one sees that 1 = 2 − 1 in the former expression and 1 = 4 − 3 in the latter.

An immediate corollary is the composition identity (or the "nesting property")

- $T_n(T_m(x)) = T_{n\cdot m}(x).$
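The nesting property is easy to verify numerically. A minimal sketch (the helper `cheb_t`, restated here so the snippet stands alone, is our name for the first-kind recurrence):

```python
def cheb_t(n, x):
    """First-kind recurrence, repeated so this snippet is self-contained."""
    t_prev, t_curr = 1.0, x
    if n == 0:
        return t_prev
    for _ in range(n - 1):
        t_prev, t_curr = t_curr, 2 * x * t_curr - t_prev
    return t_curr

# T_3(T_4(x)) should coincide with T_12(x) at every sample point:
for x in (-0.9, -0.3, 0.2, 0.8):
    assert abs(cheb_t(3, cheb_t(4, x)) - cheb_t(12, x)) < 1e-10
```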

- $T_i^2 - (x^2-1)\,U_{i-1}^2 = 1$

in a ring R[x]. Thus, they can be generated by the standard technique for Pell equations of taking powers of a fundamental solution:

- $T_i + U_{i-1}\sqrt{x^2-1} = \left(x + \sqrt{x^2-1}\right)^i.$
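For real x > 1 both the power identity and the Pell relation can be checked in floating point. A minimal sketch under that assumption (helper names are ours; both recurrences are restated so the snippet stands alone):

```python
import math

def cheb_t(n, x):
    t_prev, t_curr = 1.0, x
    if n == 0:
        return t_prev
    for _ in range(n - 1):
        t_prev, t_curr = t_curr, 2 * x * t_curr - t_prev
    return t_curr

def cheb_u(n, x):
    u_prev, u_curr = 1.0, 2 * x
    if n == 0:
        return u_prev
    for _ in range(n - 1):
        u_prev, u_curr = u_curr, 2 * x * u_curr - u_prev
    return u_curr

x = 1.5                      # any x > 1 keeps sqrt(x^2 - 1) real
s = math.sqrt(x * x - 1)
for i in range(1, 8):
    # Powers of the fundamental solution generate the pair (T_i, U_{i-1})...
    assert abs(cheb_t(i, x) + cheb_u(i - 1, x) * s - (x + s) ** i) < 1e-8
    # ...which therefore satisfies the Pell relation:
    assert abs(cheb_t(i, x) ** 2 - (x * x - 1) * cheb_u(i - 1, x) ** 2 - 1) < 1e-8
```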

The Chebyshev polynomials of the first and second kind are closely related by the following equations

- $\frac{d}{dx}\,T_n(x) = n\,U_{n-1}(x), \quad n=1,\ldots$

- $T_n(x) = \frac{1}{2}\left(U_n(x) - U_{n-2}(x)\right).$

- $T_{n+1}(x) = xT_n(x) - (1 - x^2)U_{n-1}(x)$

- $T_n(x) = U_n(x) - x\,U_{n-1}(x).$

A recurrence relation for the derivative of the Chebyshev polynomials can be derived from these relations:

- $2\,T_n(x) = \frac{1}{n+1}\,\frac{d}{dx}T_{n+1}(x) - \frac{1}{n-1}\,\frac{d}{dx}T_{n-1}(x), \quad n=2,\ldots$

This relationship is used in the Chebyshev spectral method of solving differential equations.
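The derivative relation dT_n/dx = n U_{n−1} can be sanity-checked numerically. A minimal sketch (helper names are ours; a central difference stands in for the exact derivative, so the tolerance is loose):

```python
def cheb_t(n, x):
    t_prev, t_curr = 1.0, x
    if n == 0:
        return t_prev
    for _ in range(n - 1):
        t_prev, t_curr = t_curr, 2 * x * t_curr - t_prev
    return t_curr

def cheb_u(n, x):
    u_prev, u_curr = 1.0, 2 * x
    if n == 0:
        return u_prev
    for _ in range(n - 1):
        u_prev, u_curr = u_curr, 2 * x * u_curr - u_prev
    return u_curr

# Central-difference derivative of T_6 versus 6 * U_5 at an interior point:
n, x, h = 6, 0.3, 1e-6
numeric_deriv = (cheb_t(n, x + h) - cheb_t(n, x - h)) / (2 * h)
assert abs(numeric_deriv - n * cheb_u(n - 1, x)) < 1e-4
```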

Equivalently, the two sequences can also be defined from a pair of mutual recurrence equations:

- $T_0(x) = 1$

- $U_{-1}(x) = 0$

- $T_{n+1}(x) = xT_n(x) - (1 - x^2)U_{n-1}(x)$

- $U_n(x) = xU_{n-1}(x) + T_n(x).$

These can be derived from the trigonometric formulae; for example, if $x = \cos\vartheta$, then

- $\begin{align} T_{n+1}(x) &= T_{n+1}(\cos\vartheta) \\ &= \cos((n+1)\vartheta) \\ &= \cos(n\vartheta)\cos(\vartheta) - \sin(n\vartheta)\sin(\vartheta) \\ &= T_n(\cos(\vartheta))\cos(\vartheta) - U_{n-1}(\cos(\vartheta))\sin^2(\vartheta) \\ &= xT_n(x) - (1 - x^2)U_{n-1}(x). \end{align}$

Note that both these equations and the trigonometric equations take a simpler form if we, like some works, follow the alternate convention of denoting our U_{n} (the polynomial of degree n) with U_{n+1} instead.

Different approaches to defining Chebyshev polynomials lead to different explicit formulas such as:

- $T_n(x) = \begin{cases} \cos(n \arccos x), & |x| \le 1 \\ \cosh(n\,\mathrm{arccosh}\,x), & x \ge 1 \\ (-1)^n \cosh(n\,\mathrm{arccosh}(-x)), & x \le -1 \end{cases}$

- $T_n(x) = \frac{\left(x+\sqrt{x^2-1}\right)^n + \left(x-\sqrt{x^2-1}\right)^n}{2} = \sum_{k=0}^{\lfloor n/2\rfloor} \binom{n}{2k} (x^2-1)^k x^{n-2k}$

- $U_n(x) = \frac{\left(x+\sqrt{x^2-1}\right)^{n+1} - \left(x-\sqrt{x^2-1}\right)^{n+1}}{2\sqrt{x^2-1}} = \sum_{k=0}^{\lfloor n/2\rfloor} \binom{n+1}{2k+1} (x^2-1)^k x^{n-2k}$

- $T_n(x) = 1 + n^2(x-1)\,\prod_{k=1}^{n-1} \left(1 + \frac{x-1}{2\sin^2\left(\frac{k\pi}{n}\right)}\right)$ (due to M. Hovdan)
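The binomial-sum formula can be checked against the recurrence at a handful of points. A minimal sketch (both helper names are ours; `math.comb` supplies the binomial coefficients):

```python
import math

def cheb_t_recur(n, x):
    t_prev, t_curr = 1.0, x
    if n == 0:
        return t_prev
    for _ in range(n - 1):
        t_prev, t_curr = t_curr, 2 * x * t_curr - t_prev
    return t_curr

def cheb_t_explicit(n, x):
    """Binomial-sum formula: T_n(x) = sum_k C(n, 2k) (x^2-1)^k x^(n-2k)."""
    return sum(math.comb(n, 2 * k) * (x * x - 1) ** k * x ** (n - 2 * k)
               for k in range(n // 2 + 1))

# The two definitions agree inside and outside [-1, 1]:
for n in range(8):
    for x in (-0.7, 0.1, 0.9, 1.3):
        assert abs(cheb_t_explicit(n, x) - cheb_t_recur(n, x)) < 1e-9
```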

Both the T_{n} and the U_{n} form a sequence of orthogonal polynomials. The polynomials of the first kind are orthogonal with respect to the weight

- $\frac{1}{\sqrt{1-x^2}}$

on the interval [−1,1], i.e. we have:

- $\int_{-1}^1 T_n(x)T_m(x)\,\frac{dx}{\sqrt{1-x^2}} = \begin{cases} 0, & n \ne m \\ \pi, & n = m = 0 \\ \pi/2, & n = m \ne 0 \end{cases}$

This can be proven by letting $x = \cos(\vartheta)$ and using the identity $T_n(\cos(\vartheta)) = \cos(n\vartheta)$. Similarly, the polynomials of the second kind are orthogonal with respect to the weight

- $\sqrt{1-x^2}$

on the interval [−1,1], i.e. we have:

- $\int_{-1}^1 U_n(x)U_m(x)\sqrt{1-x^2}\,dx = \frac{\pi}{2}\,\delta_{n,m}.$

(Note that the weight $\sqrt{1-x^2}$ is, to within a normalizing constant, the density of the Wigner semicircle distribution.)
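The first-kind orthogonality relation can be verified with Gauss–Chebyshev quadrature, which integrates f(x)/√(1−x²) over [−1, 1] exactly for polynomial f of low enough degree. A minimal sketch (helper names are ours):

```python
import math

def cheb_t(n, x):
    t_prev, t_curr = 1.0, x
    if n == 0:
        return t_prev
    for _ in range(n - 1):
        t_prev, t_curr = t_curr, 2 * x * t_curr - t_prev
    return t_curr

def weighted_inner(n, m, num_nodes=32):
    """Gauss-Chebyshev quadrature for the weighted inner product of T_n and T_m;
    exact when the integrand polynomial has degree < 2 * num_nodes."""
    nodes = [math.cos((2 * i - 1) * math.pi / (2 * num_nodes))
             for i in range(1, num_nodes + 1)]
    return math.pi / num_nodes * sum(cheb_t(n, x) * cheb_t(m, x) for x in nodes)

assert abs(weighted_inner(3, 5)) < 1e-12                  # n != m: zero
assert abs(weighted_inner(0, 0) - math.pi) < 1e-12        # n = m = 0: pi
assert abs(weighted_inner(4, 4) - math.pi / 2) < 1e-12    # n = m != 0: pi/2
```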

For any given n ≥ 1, among the polynomials of degree n with leading coefficient 1,

- $f(x) = \frac{1}{2^{n-1}}\,T_n(x)$

is the one of which the maximal absolute value on the interval [−1, 1] is minimal.

This maximal absolute value is

- $\frac{1}{2^{n-1}}$

and |ƒ(x)| reaches this maximum exactly n + 1 times: at

- $x = \cos\frac{k\pi}{n}\text{ for }0 \le k \le n.$
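The minimax property can be illustrated on a dense grid: the scaled polynomial T_n/2^{n−1} is monic, its sup-norm on [−1, 1] is 2^{1−n}, and a competing monic polynomial such as x^n does far worse. A minimal sketch (helper name is ours; a grid maximum stands in for the true sup-norm):

```python
def cheb_t(n, x):
    t_prev, t_curr = 1.0, x
    if n == 0:
        return t_prev
    for _ in range(n - 1):
        t_prev, t_curr = t_curr, 2 * x * t_curr - t_prev
    return t_curr

n = 5
grid = [i / 5000.0 for i in range(-5000, 5001)]   # dense grid on [-1, 1]
monic_cheb = max(abs(cheb_t(n, x)) / 2 ** (n - 1) for x in grid)
assert abs(monic_cheb - 1 / 2 ** (n - 1)) < 1e-10
# The monic competitor x^5 has sup-norm 1 on the same interval:
assert max(abs(x ** n) for x in grid) > 10 * monic_cheb
```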

The derivatives of the polynomials can be less than straightforward. By differentiating the polynomials in their trigonometric forms, it's easy to show that:

- $\frac{dT_n}{dx} = n\,U_{n-1}$

- $\frac{dU_n}{dx} = \frac{(n+1)T_{n+1} - x\,U_n}{x^2 - 1}$

- $\frac{d^2T_n}{dx^2} = n\,\frac{n\,T_n - x\,U_{n-1}}{x^2 - 1} = n\,\frac{(n+1)T_n - U_n}{x^2 - 1}.$

The last two formulas can be numerically troublesome due to the division by zero (0/0 indeterminate form, specifically) at x = 1 and x = −1. It can be shown that:

- $\frac{d^2T_n}{dx^2}\Bigg|_{x=1} = \frac{n^4 - n^2}{3},$

- $\frac{d^2T_n}{dx^2}\Bigg|_{x=-1} = (-1)^n\,\frac{n^4 - n^2}{3}.$

The second derivative of the Chebyshev polynomial of the first kind is

- $\frac{d^2T_n}{dx^2} = n\,\frac{n\,T_n - x\,U_{n-1}}{x^2 - 1}$

which, if evaluated as shown above, poses a problem because it is indeterminate at x = ±1. Since the function is a polynomial, (all of) the derivatives must exist for all real numbers, so taking the limit of the expression above should yield the desired value:

- $\frac{d^2T_n}{dx^2}\Bigg|_{x=1} = \lim_{x \to 1} n\,\frac{n\,T_n - x\,U_{n-1}}{x^2 - 1},$

where only $x = 1$ is considered for now. Factoring the denominator:

- $\frac{d^2T_n}{dx^2}\Bigg|_{x=1} = \lim_{x \to 1} n\,\frac{n\,T_n - x\,U_{n-1}}{(x + 1)(x - 1)}.$

Since the limit as a whole must exist, the limit of the numerator and denominator must independently exist, and

- $\frac{d^2T_n}{dx^2}\Bigg|_{x=1} = n\,\frac{\lim_{x \to 1} \frac{n\,T_n - x\,U_{n-1}}{x - 1}}{\lim_{x \to 1} (x + 1)} = \frac{n}{2}\,\lim_{x \to 1} \frac{n\,T_n - x\,U_{n-1}}{x - 1}.$

The denominator still limits to zero, which implies that the numerator must limit to zero as well, i.e. $U_{n-1}(1) = n\,T_n(1) = n$, which will be useful later on. Since the numerator and denominator both limit to zero, L'Hôpital's rule applies:

- $\frac{d^2T_n}{dx^2}\Bigg|_{x=1} = \frac{n}{2}\,\lim_{x \to 1} \left(n\,\frac{dT_n}{dx} - U_{n-1} - x\,\frac{dU_{n-1}}{dx}\right) = \frac{n^4 - n^2}{3}.$

The proof for $x = -1$ is similar, with the fact that $T_n(-1) = (-1)^n$ being important.

Indeed, the following, more general formula holds:

- $\frac{d^p T_n}{dx^p}\Bigg|_{x = \pm 1} = (\pm 1)^{n+p} \prod_{k=0}^{p-1} \frac{n^2 - k^2}{2k + 1}.$

This latter result is of great use in the numerical solution of eigenvalue problems.

Concerning integration, the first derivative of the T_{n} implies that

- $\int U_n\,dx = \frac{T_{n+1}}{n+1}$

and the recurrence relation for the first kind polynomials involving derivatives establishes that

- $\int T_n\,dx = \frac{1}{2}\left(\frac{T_{n+1}}{n+1} - \frac{T_{n-1}}{n-1}\right) = \frac{n\,T_{n+1}}{n^2 - 1} - \frac{x\,T_n}{n - 1}.$

- $\cos\left(\frac{\pi}{2}\,(2k+1)\right) = 0$

one can easily prove that the roots of T_{n} are

- $x_k = \cos\left(\frac{\pi}{2}\,\frac{2k-1}{n}\right), \quad k = 1, \ldots, n.$

Similarly, the roots of U_{n} are

- $x_k = \cos\left(\frac{k}{n+1}\,\pi\right), \quad k = 1, \ldots, n.$
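Both root formulas can be verified directly: evaluating each polynomial at its claimed roots should give (numerically) zero, and every root should lie strictly inside (−1, 1). A minimal sketch (helper names are ours):

```python
import math

def cheb_t(n, x):
    t_prev, t_curr = 1.0, x
    if n == 0:
        return t_prev
    for _ in range(n - 1):
        t_prev, t_curr = t_curr, 2 * x * t_curr - t_prev
    return t_curr

def cheb_u(n, x):
    u_prev, u_curr = 1.0, 2 * x
    if n == 0:
        return u_prev
    for _ in range(n - 1):
        u_prev, u_curr = u_curr, 2 * x * u_curr - u_prev
    return u_curr

n = 7
t_roots = [math.cos(math.pi / 2 * (2 * k - 1) / n) for k in range(1, n + 1)]
u_roots = [math.cos(k * math.pi / (n + 1)) for k in range(1, n + 1)]
assert all(abs(cheb_t(n, r)) < 1e-12 for r in t_roots)
assert all(abs(cheb_u(n, r)) < 1e-12 for r in u_roots)
assert all(-1 < r < 1 for r in t_roots + u_roots)
```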

One unique property of the Chebyshev polynomials of the first kind is that on the interval −1 ≤ x ≤ 1 all of the extrema have values that are either −1 or 1. Thus these polynomials have only two finite critical values, the defining property of Shabat polynomials. Both the first and second kinds of Chebyshev polynomial have extrema at the endpoints, given by:

- $T_n(1) = 1$

- $T_n(-1) = (-1)^n$

- $U_n(1) = n + 1$

- $U_n(-1) = (n + 1)(-1)^n.$

The Chebyshev polynomials are a special case of the ultraspherical or Gegenbauer polynomials, which themselves are a special case of the Jacobi polynomials.

For every nonnegative integer n, T_{n}(x) and U_{n}(x) are both polynomials of degree n.
They are even or odd functions of x according as n is even or odd, so when written as polynomials in x they contain only even-degree or only odd-degree terms, respectively.

The leading coefficient of T_{n} is 2^{n − 1} if 1 ≤ n, and 1 if n = 0.

The T_{n} are a special case of Lissajous curves with frequency ratio equal to n.

The first few Chebyshev polynomials of the first kind are

- $T_0(x) = 1$

- $T_1(x) = x$

- $T_2(x) = 2x^2 - 1$

- $T_3(x) = 4x^3 - 3x$

- $T_4(x) = 8x^4 - 8x^2 + 1$

- $T_5(x) = 16x^5 - 20x^3 + 5x$

- $T_6(x) = 32x^6 - 48x^4 + 18x^2 - 1$

- $T_7(x) = 64x^7 - 112x^5 + 56x^3 - 7x$

- $T_8(x) = 128x^8 - 256x^6 + 160x^4 - 32x^2 + 1$

- $T_9(x) = 256x^9 - 576x^7 + 432x^5 - 120x^3 + 9x.$

The first few Chebyshev polynomials of the second kind are

- $U_0(x) = 1$

- $U_1(x) = 2x$

- $U_2(x) = 4x^2 - 1$

- $U_3(x) = 8x^3 - 4x$

- $U_4(x) = 16x^4 - 12x^2 + 1$

- $U_5(x) = 32x^5 - 32x^3 + 6x$

- $U_6(x) = 64x^6 - 80x^4 + 24x^2 - 1$

- $U_7(x) = 128x^7 - 192x^5 + 80x^3 - 8x$

- $U_8(x) = 256x^8 - 448x^6 + 240x^4 - 40x^2 + 1$

- $U_9(x) = 512x^9 - 1024x^7 + 672x^5 - 160x^3 + 10x.$

In the appropriate Sobolev space, the set of Chebyshev polynomials forms a complete basis set, so that a function in the same space can, on −1 ≤ x ≤ 1, be expressed via the expansion:

- $f(x) = \sum_{n=0}^{\infty} a_n T_n(x).$

Furthermore, as mentioned previously, the Chebyshev polynomials form an orthogonal basis which (among other things) implies that the coefficients a_{n} can be determined easily through the application of an inner product. This sum is called a Chebyshev series or a Chebyshev expansion.

Since a Chebyshev series is related to a Fourier cosine series through a change of variables, all of the theorems, identities, etc. that apply to Fourier series have a Chebyshev counterpart. These attributes include:

- The Chebyshev polynomials form a complete orthogonal system.
- The Chebyshev series converges to ƒ(x) if the function is piecewise smooth and continuous. The smoothness requirement can be relaxed in most cases — as long as there are a finite number of discontinuities in ƒ(x) and its derivatives.
- At a discontinuity, the series will converge to the average of the right and left limits.

The abundance of theorems and identities inherited from Fourier series makes the Chebyshev polynomials important tools in numerical analysis; for example, they are the most popular general-purpose basis functions used in the spectral method, often preferred to trigonometric series because of generally faster convergence for continuous functions (Gibbs' phenomenon is still a problem).

The partial sums of

- $f(x) = \sum_{n=0}^{\infty} a_n T_n(x)$

are very useful in the approximation of various functions and in the solution of differential equations (see spectral method). Two common methods for determining the coefficients a_{n} are through the use of the inner product as in Galerkin's method and through the use of collocation which is related to interpolation.

As an interpolant, the N coefficients of the (N − 1)^{th} partial sum are usually obtained on the Chebyshev-Gauss-Lobatto points (or Lobatto grid), which results in minimum error and avoids Runge's phenomenon associated with a uniform grid. This collection of points corresponds to the extrema of the highest order polynomial in the sum, plus the endpoints and is given by:

- $x_i = -\cos\left(\frac{i\pi}{N - 1}\right); \qquad i = 0, 1, \dots, N - 1.$
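A truncated Chebyshev expansion is easy to build in code. The sketch below is ours and uses discrete orthogonality on the Chebyshev–Gauss nodes to estimate the coefficients a_n; the Lobatto grid described above differs only in node placement and endpoint weights, and would serve the same purpose. It approximates exp on [−1, 1] with a degree-10 partial sum:

```python
import math

def cheb_t(n, x):
    t_prev, t_curr = 1.0, x
    if n == 0:
        return t_prev
    for _ in range(n - 1):
        t_prev, t_curr = t_curr, 2 * x * t_curr - t_prev
    return t_curr

def cheb_coeffs(f, deg, num_nodes=64):
    """Estimate coefficients a_0..a_deg of the Chebyshev expansion of f on
    [-1, 1] by discrete orthogonality on the Chebyshev-Gauss nodes."""
    nodes = [math.cos((2 * i - 1) * math.pi / (2 * num_nodes))
             for i in range(1, num_nodes + 1)]
    coeffs = []
    for n in range(deg + 1):
        c = 2.0 / num_nodes * sum(f(x) * cheb_t(n, x) for x in nodes)
        coeffs.append(c / 2.0 if n == 0 else c)   # a_0 carries weight 1/2
    return coeffs

a = cheb_coeffs(math.exp, 10)

def approx(x):
    return sum(a_n * cheb_t(n, x) for n, a_n in enumerate(a))

# The coefficients of exp decay so fast that degree 10 is already very accurate:
assert all(abs(approx(x) - math.exp(x)) < 1e-8 for x in (-0.8, 0.0, 0.5, 0.95))
```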

- $p(x) = \sum_{n=0}^{N} a_n T_n(x).$

Polynomials in Chebyshev form can be evaluated using the Clenshaw algorithm.
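Clenshaw's recurrence evaluates a polynomial in Chebyshev form without computing each T_n explicitly. A minimal sketch (the function name is ours):

```python
def clenshaw(coeffs, x):
    """Evaluate p(x) = sum_n coeffs[n] * T_n(x) by Clenshaw's recurrence:
    b_k = a_k + 2 x b_{k+1} - b_{k+2}, then p(x) = a_0 + x b_1 - b_2."""
    b1 = b2 = 0.0
    for a in reversed(coeffs[1:]):
        b1, b2 = a + 2.0 * x * b1 - b2, b1
    return coeffs[0] + x * b1 - b2

# Direct check: for p = T_0 + 2 T_1 + 3 T_2 at x = 0.5 we have
# T_0 = 1, T_1 = 0.5, T_2 = -0.5, so p(0.5) = 1 + 1 - 1.5 = 0.5.
assert abs(clenshaw([1.0, 2.0, 3.0], 0.5) - 0.5) < 1e-12
```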

The spread polynomials are in a sense equivalent to the Chebyshev polynomials of the first kind, but enable one to avoid square roots and conventional trigonometric functions in certain contexts, notably in rational trigonometry.

- Chebyshev nodes
- Chebyshev filter
- Chebyshev cube root
- Dickson polynomials
- Legendre polynomials
- Hermite polynomials
- Chebyshev rational functions
- Clenshaw–Curtis quadrature
- Approximation theory

- Module for Chebyshev Polynomials by John H. Mathews
- Chebyshev Interpolation: An Interactive Tour, includes illustrative Java applet.
- chebfun project, representing functions by automatic Chebyshev polynomial interpolation in MATLAB.

Wikipedia, the free encyclopedia © 2001-2006 Wikipedia contributors (Disclaimer)

This article is licensed under the GNU Free Documentation License.

Last updated on Monday October 06, 2008 at 00:32:57 PDT (GMT -0700)
