
# Symmetry of second derivatives

In mathematics, the symmetry of second derivatives refers to the possibility of interchanging the order of taking partial derivatives of a function

f(x1, x2, ..., xn)

of n variables. If the partial derivative with respect to xi is denoted with a subscript i, then the symmetry is the assertion that the second-order partial derivatives fij satisfy the identity

fij = fji

so that they form an n × n symmetric matrix. This is sometimes known as Young's theorem.

## The Hessian matrix is typically symmetric

This matrix of second-order partial derivatives of f is called the Hessian matrix of f. The entries in it off the main diagonal are the mixed derivatives; that is, successive partial derivatives with respect to different variables.

In most normal circumstances the Hessian matrix is indeed a symmetric matrix; but from the point of view of mathematical analysis this is not a safe statement without some hypothesis on f that goes further than simply stating the existence of the second derivatives at a particular point. Clairaut's theorem gives a sufficient condition on f for this to occur.
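For a concrete illustration, here is a short symbolic sketch (assuming the sympy library is available; the example function is arbitrary) checking that the Hessian of a smooth function equals its transpose:

```python
import sympy as sp

x, y = sp.symbols('x y')
f = sp.exp(x * y) + x**3 * sp.sin(y)   # an arbitrary smooth function

# Build the 2x2 matrix of second partial derivatives.
H = sp.hessian(f, (x, y))

# The mixed (off-diagonal) entries agree, so H equals its transpose.
print(H == H.T)  # True
```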

## Formal expressions of symmetry

In symbols, the symmetry says that, for example,

$\frac{\partial}{\partial x} \left( \frac{\partial f}{\partial y} \right) = \frac{\partial}{\partial y} \left( \frac{\partial f}{\partial x} \right)$

This equality can also be written as

$\partial_{xy} f = \partial_{yx} f.$

Alternatively, the symmetry can be written as an algebraic statement involving the differential operator Di which takes the partial derivative with respect to xi:

$D_i \cdot D_j = D_j \cdot D_i.$

From this relation it follows that the ring of differential operators with constant coefficients, generated by the Di, is commutative. But one should naturally specify some domain for these operators. It is easy to check the symmetry as applied to monomials, so that one can take polynomials in the xi as a domain. In fact, smooth functions also form a valid domain.
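To make the polynomial domain concrete, here is a minimal sketch of the operators acting on polynomials in two variables (the dict-of-exponents representation is purely illustrative):

```python
def Dx(p):
    # partial derivative in x: x^i y^j -> i * x^(i-1) y^j
    return {(i - 1, j): c * i for (i, j), c in p.items() if i > 0}

def Dy(p):
    # partial derivative in y: x^i y^j -> j * x^i y^(j-1)
    return {(i, j - 1): c * j for (i, j), c in p.items() if j > 0}

# p represents 3*x^2*y + 5*x*y^3 + 7
p = {(2, 1): 3, (1, 3): 5, (0, 0): 7}

# Applying the operators in either order yields the same polynomial,
# so Dx Dy = Dy Dx on polynomials.
print(Dx(Dy(p)) == Dy(Dx(p)))  # True
```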

## Clairaut's theorem

In mathematical analysis, Clairaut's theorem or Schwarz's theorem, named after Alexis Clairaut and Hermann Schwarz, states that if

$f \colon \mathbb{R}^n \to \mathbb{R}$

has continuous second partial derivatives at any given point in $\mathbb{R}^n$, say $(a_1, \dots, a_n)$, then for $1 \leq i, j \leq n$,

$\frac{\partial^2 f}{\partial x_i\, \partial x_j}(a_1, \dots, a_n) = \frac{\partial^2 f}{\partial x_j\, \partial x_i}(a_1, \dots, a_n).$

In words, the partial derivatives of this function commute at that point.

### Clairaut's constant

A byproduct of this theorem is Clairaut's constant (alternatively known as "Clairaut's formula" or "Clairaut's parameter"), which relates the latitude, $\phi$, and the (here, spherical) azimuth, $\widehat{\alpha}$, of points on a sphere's great circle. A particular great circle is identified by its azimuth at the equator, or arc path, $\widehat{A}$:
$\sin(\widehat{A}) = \Big|\cos(\phi_q)\sin(\widehat{\alpha}_q)\Big|.$
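A numerical sketch of this relation, assuming a unit sphere and azimuth measured from north (the helper `point_and_azimuth` is hypothetical): along a great circle inclined at angle i to the equator, the product cos(latitude) · sin(azimuth) stays fixed at the equator-crossing value:

```python
import math

def point_and_azimuth(t, inc):
    # Hypothetical helper: the point at parameter t along a great circle
    # through (1, 0, 0) on the unit sphere, inclined at angle inc to the
    # equator, together with the sine of the path's azimuth there.
    x = math.cos(t)
    y = math.sin(t) * math.cos(inc)
    z = math.sin(t) * math.sin(inc)
    lat = math.asin(z)                  # latitude phi of the point
    coslat = math.sqrt(x * x + y * y)
    # unit tangent of the path dotted with the local east direction
    tx, ty = -math.sin(t), math.cos(t) * math.cos(inc)
    sin_az = (x * ty - y * tx) / coslat
    return lat, sin_az

inc = math.radians(35.0)
sin_A = math.cos(inc)   # sine of the azimuth where the circle crosses the equator
for t in (0.3, 1.1, 2.0):
    lat, sin_az = point_and_azimuth(t, inc)
    # Clairaut's relation: cos(phi) * sin(azimuth) is constant along the circle
    print(abs(math.cos(lat) * sin_az - sin_A) < 1e-12)  # True
```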

## Distribution theory formulation

One can also apply the theory of distributions to get around any analytic problems with the symmetry. Firstly, the derivative of any function can be defined as a distribution, provided it is integrable. Secondly, the use of integration by parts throws the symmetry question back onto the test functions, which are smooth and certainly satisfy the symmetry. One concludes that, in the sense of distributions, the symmetry always holds. (Another approach, where the Fourier transform of a function is defined, is to note that under the transform the partial derivatives become multiplication operators, which commute much more obviously.)

## Pathology is possible

The fact remains that in the worst case there are counterexamples. In the case of two variables, near (0, 0) one can consider two limiting processes on

f(h, k) − f(h, 0) − f(0, k) + f(0, 0)

corresponding to making h → 0 first, and to making k → 0 first. These processes need not commute (see interchange of limiting operations): it can matter, looking at the first-order terms, which is applied first. This leads to the construction of pathological examples in which the symmetry of second derivatives is not true. Given that the derivatives as Schwartz distributions are symmetric, this kind of example belongs in the 'fine' theory of real analysis.

## Counterexample

Consider the function

$f(x,y) = \begin{cases} \frac{xy(x^2 - y^2)}{x^2+y^2} & \mbox{for } (x, y) \ne (0, 0) \\ 0 & \mbox{for } (x, y) = (0, 0). \end{cases}$

Then the mixed partial derivatives of f exist, and are continuous everywhere except at $(0,0)$. Moreover

$\frac{\partial}{\partial x} \left( \frac{\partial f}{\partial y} \right) \ne \frac{\partial}{\partial y} \left( \frac{\partial f}{\partial x} \right)$

at $(0,0)$.
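A quick numerical check of this counterexample (a sketch; the step sizes are chosen ad hoc) estimates the two mixed partials at the origin by nested central differences and gets different values, approximately −1 and +1:

```python
def f(x, y):
    # the counterexample: f(x, y) = x*y*(x^2 - y^2)/(x^2 + y^2), f(0, 0) = 0
    if x == 0.0 and y == 0.0:
        return 0.0
    return x * y * (x * x - y * y) / (x * x + y * y)

def fx(x, y, h=1e-9):
    # first partial in x by central difference
    return (f(x + h, y) - f(x - h, y)) / (2 * h)

def fy(x, y, h=1e-9):
    # first partial in y by central difference
    return (f(x, y + h) - f(x, y - h)) / (2 * h)

# differentiate f_x in y, and f_y in x, at the origin
k = 1e-4
fxy = (fx(0.0, k) - fx(0.0, -k)) / (2 * k)
fyx = (fy(k, 0.0) - fy(-k, 0.0)) / (2 * k)
print(fxy, fyx)  # approximately -1.0 and 1.0
```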

## In Lie theory

A more sophisticated argument is this: consider the first-order differential operators Di to be infinitesimal operators on Euclidean space. That is, Di in a sense generates the one-parameter group of translations parallel to the xi-axis. These groups certainly all commute with each other, and therefore we expect that the infinitesimal generators do also; the Lie bracket

$[D_i, D_j] = 0$

is the way this fact is reflected. When one comes to ask whether this is true as applied to spaces of functions on Euclidean space, the question becomes an elementary part of the theory of differentiable vectors in representation theory; that is, of finding a suitable subspace of a function space on which the Lie algebra representation deriving from a representation of a Lie group has desirable and transparent properties.
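As a toy illustration of the commuting one-parameter translation groups (a sketch; the function and offsets are arbitrary):

```python
# translation operators along the two coordinate axes, acting on functions
def shift_x(g, a):
    return lambda x, y: g(x + a, y)

def shift_y(g, b):
    return lambda x, y: g(x, y + b)

g = lambda x, y: x**2 * y + 3.0 * y   # an arbitrary function
p = (1.5, -2.0)

# the two translation groups commute, mirroring [Di, Dj] = 0
left = shift_x(shift_y(g, 0.7), 0.4)(*p)
right = shift_y(shift_x(g, 0.4), 0.7)(*p)
print(left == right)  # True
```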
