The smoothing spline is a method of smoothing, i.e., of fitting a smooth curve to a set of noisy observations.
## Definition

Let $(x_i, Y_i),\; i = 1, \dots, n$, be a sequence of observations, modeled by the relation $E(Y_i) = \mu(x_i)$. The smoothing spline estimate $\hat\mu$ of the function $\mu$ is defined to be the minimizer (over the class of twice differentiable functions) of

$$
\sum_{i=1}^n \{Y_i - \hat\mu(x_i)\}^2 + \lambda \int \hat\mu''(x)^2 \, dx.
$$

Remarks:

- $\lambda \ge 0$ is a smoothing parameter, controlling the trade-off between fidelity to the data and roughness of the function estimate.
- The integral is evaluated over the range of the $x_i$.
- As $\lambda \to 0$ (no smoothing), the smoothing spline converges to the interpolating spline.
- As $\lambda \to \infty$ (infinite smoothing), the roughness penalty becomes paramount and the estimate converges to the linear least-squares fit.
- The roughness penalty based on the second derivative is the most common in the modern statistics literature, although the method can easily be adapted to penalties based on other derivatives.
- In the early literature, with equally spaced $x_i$, second- or third-order differences were used in the penalty rather than derivatives.
- When the sum-of-squares term is replaced by a log-likelihood, the resulting estimate is termed a penalized likelihood estimate. The smoothing spline is the special case of penalized likelihood resulting from a Gaussian likelihood.
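As a concrete illustration of the criterion above, SciPy's `make_smoothing_spline` (available in SciPy 1.10 and later; the data and the value of the penalty weight below are illustrative choices, not part of the original text) accepts the smoothing parameter directly:

```python
import numpy as np
from scipy.interpolate import make_smoothing_spline

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50)
Y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(50)

# `lam` plays the role of lambda in the criterion above;
# passing lam=None would instead pick it by generalized cross-validation.
spline = make_smoothing_spline(x, Y, lam=1e-4)
fitted = spline(x)
```

With a smaller `lam` the fitted curve tracks the data more closely, consistent with the limits listed above.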

## Derivation of the smoothing spline

It is useful to think of fitting a smoothing spline in two steps:

- First, derive the values $\hat\mu(x_i),\; i = 1, \ldots, n$.
- From these values, derive $\hat\mu(x)$ for all $x$.

Now, treat the second step first.

Given the vector $\hat{m} = (\hat\mu(x_1), \ldots, \hat\mu(x_n))^T$ of fitted values, the sum-of-squares part of the spline criterion is fixed. It remains only to minimize $\int \hat\mu''(x)^2 \, dx$, and the minimizer is a natural cubic spline that interpolates the points $(x_i, \hat\mu(x_i))$. This interpolating spline is a linear operator, and can be written in the form

$$
\hat\mu(x) = \sum_{i=1}^n \hat\mu(x_i) f_i(x),
$$

where the $f_i(x)$ are a set of spline basis functions. As a result, the roughness penalty has the form

$$
\int \hat\mu''(x)^2 \, dx = \hat{m}^T A \hat{m},
$$

where the elements of the matrix $A$ are $\int f_i''(x) f_j''(x) \, dx$. The basis functions, and hence the matrix $A$, depend on the configuration of the predictor variables $x_i$, but not on the responses $Y_i$.
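The second step can be carried out by any natural-cubic-spline interpolation routine. The following pure-NumPy sketch (the function name and all implementation details are illustrative, not from the original text) solves the standard tridiagonal system for the knot second derivatives, with the natural boundary conditions setting them to zero at the endpoints:

```python
import numpy as np

def natural_cubic_spline(x, y):
    """Return a callable natural cubic spline interpolating (x_i, y_i)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    h = np.diff(x)  # knot spacings

    # Tridiagonal system for the interior second derivatives M_1..M_{n-2}
    A = np.zeros((n - 2, n - 2))
    rhs = 6 * (np.diff(y[1:]) / h[1:] - np.diff(y[:-1]) / h[:-1])
    for i in range(n - 2):
        A[i, i] = 2 * (h[i] + h[i + 1])
        if i > 0:
            A[i, i - 1] = h[i]
        if i < n - 3:
            A[i, i + 1] = h[i + 1]
    M = np.zeros(n)
    M[1:-1] = np.linalg.solve(A, rhs)  # natural BC: M[0] = M[-1] = 0

    def spline(t):
        t = np.atleast_1d(np.asarray(t, float))
        j = np.clip(np.searchsorted(x, t) - 1, 0, n - 2)  # segment index
        d = t - x[j]
        hj = h[j]
        a = (M[j + 1] - M[j]) / (6 * hj)
        b = M[j] / 2
        c = (y[j + 1] - y[j]) / hj - hj * (2 * M[j] + M[j + 1]) / 6
        return ((a * d + b) * d + c) * d + y[j]

    return spline
```

For data lying on a straight line, all second derivatives vanish and the spline reproduces the line, as the zero-penalty intuition suggests.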

Now return to the first step. The penalized sum-of-squares can be written as

$$
\|Y - \hat{m}\|^2 + \lambda \hat{m}^T A \hat{m},
$$

where $Y = (Y_1, \ldots, Y_n)^T$. Minimizing over $\hat{m}$ gives

$$
\hat{m} = (I + \lambda A)^{-1} Y.
$$
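This closed form can be sketched numerically. The example below (all names and the simulated data are illustrative) uses the discrete second-difference penalty mentioned in the remarks above as a stand-in for the integral, so $A = D^T D$ with $D$ the second-difference operator:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
x = np.linspace(0, 1, n)
Y = np.sin(2 * np.pi * x) + 0.2 * rng.standard_normal(n)

# D maps m to its second differences, so m^T A m with A = D^T D is a
# discrete analogue of the integrated squared second derivative.
D = np.diff(np.eye(n), n=2, axis=0)
A = D.T @ D

def fit(lam):
    """Minimize ||Y - m||^2 + lam * m^T A m, i.e. m = (I + lam*A)^{-1} Y."""
    return np.linalg.solve(np.eye(n) + lam * A, Y)

m_rough = fit(1e-8)   # lambda -> 0: essentially interpolates the data
m_smooth = fit(1e4)   # large lambda: the roughness penalty dominates
```

Increasing `lam` drives the second differences of the fit toward zero, matching the limiting behavior described in the Definition section.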

## Related methods

Smoothing splines are related to, but distinct from:

- Regression splines. In this method, the data are fitted to a set of spline basis functions with a reduced set of knots, typically by least squares. No roughness penalty is used.
- Penalized splines. These combine the reduced knots of regression splines with the roughness penalty of smoothing splines.
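The distinction can be made concrete with a truncated-power basis (the knot placement, basis choice, and names below are illustrative, not prescribed by the text): a regression spline fits the basis by plain least squares, while a penalized spline adds a ridge-type penalty on the knot coefficients:

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0, 1, 100)
Y = np.cos(3 * x) + 0.1 * rng.standard_normal(100)

# Cubic truncated-power basis with a reduced set of 8 interior knots
knots = np.linspace(0.1, 0.9, 8)
B = np.column_stack([x ** p for p in range(4)] +
                    [np.clip(x - k, 0.0, None) ** 3 for k in knots])

# Regression spline: ordinary least squares, no roughness penalty
beta_reg, *_ = np.linalg.lstsq(B, Y, rcond=None)

# Penalized spline: same basis, plus a ridge penalty on the knot terms only
lam = 1.0
P = np.diag([0.0] * 4 + [1.0] * len(knots))
beta_pen = np.linalg.solve(B.T @ B + lam * P, B.T @ Y)
```

Penalizing only the truncated-power coefficients shrinks the knot contributions while leaving the global cubic trend unpenalized, which is the usual penalized-spline construction.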

## Further reading

- Wahba, G. (1990). Spline Models for Observational Data. SIAM, Philadelphia.
- Green, P. J. and Silverman, B. W. (1994). Nonparametric Regression and Generalized Linear Models. CRC Press.

Wikipedia, the free encyclopedia © 2001-2006 Wikipedia contributors (Disclaimer)

This article is licensed under the GNU Free Documentation License.

Last updated on Wednesday September 03, 2008 at 15:33:10 PDT (GMT -0700)

