of n variables. If the partial derivative with respect to xi is denoted with a subscript i, then the symmetry is the assertion that the second-order partial derivatives fij satisfy the identity

fij = fji,

so that they form an n × n symmetric matrix. This is sometimes known as Young's theorem.
This matrix of second-order partial derivatives of f is called the Hessian matrix of f. Its entries off the main diagonal are the mixed derivatives; that is, successive partial derivatives taken with respect to different variables.
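As an illustrative sketch (using sympy; the function below is an arbitrary smooth example, not one taken from the text), one can compute a Hessian symbolically and confirm that it is symmetric:

```python
import sympy as sp

x, y, z = sp.symbols("x y z")
# An arbitrary smooth function of three variables, for illustration.
f = sp.exp(x * y) + sp.sin(y * z) + x**2 * z

# Hessian matrix: the n x n matrix of second-order partial derivatives.
H = sp.hessian(f, (x, y, z))

# The off-diagonal entries are the mixed derivatives, and the matrix
# equals its own transpose, i.e. H[i, j] == H[j, i] for all i, j.
assert H == H.T
```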
In most normal circumstances the Hessian matrix is indeed a symmetric matrix; but from the point of view of mathematical analysis this is not a safe statement without some hypothesis on f that goes beyond simply asserting the existence of the second derivatives at a particular point. Clairaut's theorem gives a sufficient condition on f for this to occur.
In symbols, the symmetry says that, for example,

∂/∂x (∂f/∂y) = ∂/∂y (∂f/∂x).

This equality can also be written as

∂²f/∂x∂y = ∂²f/∂y∂x.
Alternatively, the symmetry can be written as an algebraic statement involving the differential operator Di which takes the partial derivative with respect to xi:

Di ∘ Dj = Dj ∘ Di.
From this relation it follows that the ring of differential operators with constant coefficients, generated by the Di, is commutative. But one should naturally specify some domain for these operators. It is easy to check the symmetry as applied to monomials, so that one can take polynomials in the xi as a domain. In fact, the space of smooth functions is another possible domain.
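The commutativity on polynomials is easy to verify symbolically; a minimal sketch with sympy, using an arbitrary polynomial in three variables:

```python
import sympy as sp

x1, x2, x3 = sp.symbols("x1 x2 x3")
# An arbitrary polynomial in the xi: the natural first domain
# on which to check that the operators Di commute.
p = x1**3 * x2 + 5 * x2**2 * x3 - x1 * x3**4

# Apply Di then Dj, and Dj then Di, for every pair (i, j).
for xi in (x1, x2, x3):
    for xj in (x1, x2, x3):
        assert sp.diff(p, xi, xj) == sp.diff(p, xj, xi)
```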
Clairaut's theorem states that if the second partial derivatives of a function are continuous at a given point, then the mixed partial derivatives there are equal; in words, the partial derivatives of the function commute at that point.
One can also apply the theory of distributions to get round any analytic problems with the symmetry. Firstly, the derivative of any function can be defined as a distribution, provided the function is integrable. Secondly, integration by parts throws the symmetry question back onto the test functions, which are smooth and certainly satisfy the symmetry. One concludes that, in the sense of distributions, the symmetry always holds. (Another approach, where the Fourier transform of a function is defined, is to note that under the transform the partial derivatives become multiplication operators, which commute much more obviously.)
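A discrete sketch of the Fourier-transform remark, assuming a smooth periodic function sampled on a grid: after an FFT, each partial derivative acts as multiplication by i·k along its axis, and the two multiplications commute exactly:

```python
import numpy as np

# Sample a smooth periodic function on an n x n grid.
n = 64
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
X, Y = np.meshgrid(x, x, indexing="ij")
f = np.sin(X) * np.cos(2 * Y)

# Integer angular wavenumbers along each axis.
k = np.fft.fftfreq(n, d=2 * np.pi / n) * 2 * np.pi
KX = k[:, None]
KY = k[None, :]

F = np.fft.fft2(f)
# Under the transform, d/dx and d/dy become multiplication by
# i*kx and i*ky, which obviously commute.
fxy = np.fft.ifft2(1j * KY * (1j * KX * F)).real
fyx = np.fft.ifft2(1j * KX * (1j * KY * F)).real

assert np.allclose(fxy, fyx)
# Both agree with the analytic mixed derivative of sin(x)cos(2y).
assert np.allclose(fxy, -2 * np.cos(X) * np.sin(2 * Y))
```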
The fact remains that in the worst case there are counterexamples. In the case of two variables, near (0, 0) one can consider two limiting processes on

f(h, k) − f(h, 0) − f(0, k) + f(0, 0)
corresponding to making h → 0 first, and to making k → 0 first. These processes need not commute (see interchange of limiting operations): it can matter, looking at the first-order terms, which is applied first. This leads to the construction of pathological examples in which the symmetry of second derivatives is not true. Given that the derivatives as Schwartz distributions are symmetric, this kind of example belongs in the 'fine' theory of real analysis.
Consider the function

f(x, y) = xy(x² − y²)/(x² + y²) for (x, y) ≠ (0, 0), with f(0, 0) = 0.

Then the mixed partial derivatives of f exist, and are continuous everywhere except at (0, 0). Moreover

fxy(0, 0) = −1 ≠ 1 = fyx(0, 0),

where fxy denotes ∂/∂y(∂f/∂x).
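A numerical sketch of this failure, taking f to be the standard counterexample f(x, y) = xy(x² − y²)/(x² + y²), extended by f(0, 0) = 0 (an assumption about which function the text intends), and approximating the mixed partials at the origin by central differences:

```python
# Standard counterexample for the symmetry of second derivatives.
def f(x, y):
    if x == 0.0 and y == 0.0:
        return 0.0
    return x * y * (x * x - y * y) / (x * x + y * y)

def fx(x, y, h=1e-9):
    # Central-difference approximation to df/dx.
    return (f(x + h, y) - f(x - h, y)) / (2 * h)

def fy(x, y, h=1e-9):
    # Central-difference approximation to df/dy.
    return (f(x, y + h) - f(x, y - h)) / (2 * h)

h = 1e-5
fxy0 = (fx(0, h) - fx(0, -h)) / (2 * h)  # d/dy of fx at the origin
fyx0 = (fy(h, 0) - fy(-h, 0)) / (2 * h)  # d/dx of fy at the origin

# The two mixed partials at the origin disagree: -1 versus +1.
assert abs(fxy0 + 1.0) < 1e-3
assert abs(fyx0 - 1.0) < 1e-3
```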
A more sophisticated argument is this: consider the first-order differential operators Di to be infinitesimal operators on Euclidean space. That is, Di in a sense generates the one-parameter group of translations parallel to the xi-axis. These groups certainly all commute with each other, and therefore we expect that the infinitesimal generators do also; the Lie bracket

[Di, Dj] = 0
is the way that is reflected. When one comes to ask whether this is true as applied to spaces of functions on Euclidean space, this question is an elementary part of the theory of differentiable vectors in representation theory; that is, a suitable subspace of a function space on which the Lie algebra representation deriving from a representation of a Lie group has desirable and transparent properties.
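A discrete analogue of the commuting translation groups (a sketch only, with periodic grid shifts standing in for the continuous translations generated by the Di):

```python
import numpy as np

rng = np.random.default_rng(0)
# Samples of an arbitrary function on a periodic grid.
F = rng.standard_normal((16, 16))

# Discrete analogues of the one-parameter translation groups
# generated by D1 and D2: periodic shifts parallel to each axis.
def translate_x(A, t=3):
    return np.roll(A, t, axis=0)

def translate_y(A, t=5):
    return np.roll(A, t, axis=1)

# The two translation groups commute exactly.
assert np.array_equal(translate_x(translate_y(F)),
                      translate_y(translate_x(F)))
```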