
# Positive-definite matrix

In linear algebra, a positive-definite matrix is a (Hermitian) matrix which in many ways is analogous to a positive real number. The notion is closely related to a positive-definite symmetric bilinear form (or a sesquilinear form in the complex case).

## Definition

An n × n real symmetric matrix $M$ is positive definite if $z^T M z > 0$ for all non-zero vectors $z$ with real entries (i.e. $z \in \mathbb{R}^n$), where $z^T$ denotes the transpose of $z$.

For complex matrices, this definition becomes: a Hermitian matrix $M$ is positive definite if $z^* M z > 0$ for all non-zero complex vectors $z$, where $z^*$ denotes the conjugate transpose of $z$. The quantity $z^* M z$ is always real because $M$ is a Hermitian matrix. For this reason, positive-definite matrices are often defined to be Hermitian matrices satisfying $z^* M z > 0$. The section Non-Hermitian matrices discusses the consequences of dropping the requirement that $M$ be Hermitian.
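As a quick numerical illustration of this definition (the matrix below is an illustrative choice, not one from the text), one can verify in NumPy that $z^* M z$ is real for a Hermitian $M$ and that positive definiteness can be tested through the eigenvalues:

```python
import numpy as np

# A 2x2 Hermitian matrix (equal to its conjugate transpose); the values
# here are an illustrative choice, not taken from the text.
M = np.array([[2, 1j],
              [-1j, 2]])

# For Hermitian M, the quantity z* M z is real for every complex z.
rng = np.random.default_rng(0)
z = rng.standard_normal(2) + 1j * rng.standard_normal(2)
q = z.conj() @ M @ z
print(np.isclose(q.imag, 0.0))               # True

# Positive definiteness is equivalent to all eigenvalues being positive;
# eigvalsh exploits the Hermitian structure and returns real eigenvalues.
print(np.all(np.linalg.eigvalsh(M) > 0))     # True (eigenvalues are 1 and 3)
```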

## Characterizations

Let M be an n × n Hermitian matrix. The following properties are equivalent to M being positive definite:
1. All eigenvalues $\lambda_i$ of $M$ are positive. Recall that any Hermitian $M$, by the spectral theorem, may be regarded as a real diagonal matrix $D$ that has been re-expressed in some new coordinate system (i.e., $M = P^{-1}DP$ for some unitary matrix $P$ whose rows are orthonormal eigenvectors of $M$, forming a basis). So this characterization means that $M$ is positive definite if and only if the diagonal elements of $D$ (the eigenvalues) are all positive. In other words, in the basis consisting of the eigenvectors of $M$, the action of $M$ is component-wise multiplication with a (fixed) element of $\mathbb{C}^n$ with positive entries.
2. The sesquilinear form
$\langle \mathbf{x},\mathbf{y}\rangle = \mathbf{x}^{*} M \mathbf{y}$
defines an inner product on $\mathbb{C}^n$. (In fact, every inner product on $\mathbb{C}^n$ arises in this fashion from a Hermitian positive definite matrix.)
3. $M$ is the Gram matrix of some collection of linearly independent vectors
$\mathbf{x}_1,\ldots,\mathbf{x}_n \in \mathbb{C}^k$

for some $k$. That is, $M$ satisfies:

$M_{ij} = \langle \mathbf{x}_i, \mathbf{x}_j\rangle = \mathbf{x}_i^{*} \mathbf{x}_j.$

The vectors $\mathbf{x}_i$ may optionally be restricted to lie in $\mathbb{C}^n$. In other words, $M$ is of the form $A^*A$ where $A$ is not necessarily square but must be injective.

4. All the following matrices have a positive determinant (the Sylvester criterion):
• the upper left 1-by-1 corner of $M$
• the upper left 2-by-2 corner of $M$
• the upper left 3-by-3 corner of $M$
• ...
• $M$ itself

In other words, all of the leading principal minors are positive. For positive semidefinite matrices, all principal minors have to be non-negative. The leading principal minors alone do not imply positive semidefiniteness, as can be seen from the example

$\begin{bmatrix} 1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 0 \end{bmatrix}.$
5. There exists a unique lower triangular matrix $L$, with strictly positive diagonal elements, that allows the factorization of $M$ into
$M = L L^*,$
where $L^*$ is the conjugate transpose of $L$. This factorization is called the Cholesky decomposition.

For real symmetric matrices, these properties can be simplified by replacing $\mathbb{C}^n$ with $\mathbb{R}^n$, and "conjugate transpose" with "transpose."
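The equivalence of these characterizations can be checked numerically. The sketch below (the first matrix is an illustrative example of mine; the second is the counterexample from the text) tests characterizations 1, 4, and 5 for a real symmetric matrix with NumPy:

```python
import numpy as np

# An illustrative real symmetric matrix (values are my own, not from the text).
M = np.array([[4.0, 2.0, 0.0],
              [2.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# Characterization 1: all eigenvalues are positive.
eigs_positive = bool(np.all(np.linalg.eigvalsh(M) > 0))

# Characterization 4 (Sylvester's criterion): all leading principal minors positive.
minors = [np.linalg.det(M[:k, :k]) for k in range(1, 4)]
sylvester = all(m > 0 for m in minors)

# Characterization 5: the Cholesky factorization M = L L^T exists with a
# strictly positive diagonal (np.linalg.cholesky raises LinAlgError otherwise).
L = np.linalg.cholesky(M)
cholesky_ok = np.allclose(L @ L.T, M) and bool(np.all(np.diag(L) > 0))

print(eigs_positive, sylvester, cholesky_ok)  # True True True

# The counterexample from the text: its leading principal minors 1, 0, 0 are
# all non-negative, yet the matrix is not positive semidefinite.
C = np.array([[1.0, 1.0, 1.0],
              [1.0, 1.0, 1.0],
              [1.0, 1.0, 0.0]])
x = np.array([0.0, 1.0, -1.0])
print(x @ C @ x)                              # -1.0
```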

Echoing condition 2 above, one can also formulate positive-definiteness in terms of quadratic forms. Let $K$ be the field $\mathbb{R}$ or $\mathbb{C}$, and $V$ be a vector space over $K$. A Hermitian form

$B : V \times V \rightarrow K$

is a sesquilinear map (bilinear when $K = \mathbb{R}$) such that $B(x, y)$ is always the complex conjugate of $B(y, x)$. Such a function $B$ is called positive definite if $B(x, x) > 0$ for every nonzero $x$ in $V$.

## Negative-definite, semidefinite and indefinite matrices

The $n \times n$ Hermitian matrix $M$ is said to be negative-definite if

$x^{*} M x < 0$

for all non-zero $x \in \mathbb{R}^n$ (or, equivalently, all non-zero $x \in \mathbb{C}^n$).

It is called positive-semidefinite if

$x^{*} M x \geq 0$

for all $x \in \mathbb{R}^n$ (or $\mathbb{C}^n$).

It is called negative-semidefinite if

$x^{*} M x \leq 0$

for all $x \in \mathbb{R}^n$ (or $\mathbb{C}^n$).

A matrix M is positive-semidefinite if and only if it arises as the Gram matrix of some set of vectors. In contrast to the positive-definite case, these vectors need not be linearly independent.

For any matrix $A$, the matrix $A^*A$ is positive semidefinite, and $\operatorname{rank}(A) = \operatorname{rank}(A^*A)$. Conversely, any positive semidefinite matrix $M$ can be written as $M = A^*A$; one such factorization is provided by the Cholesky decomposition $M = LL^*$, with $A = L^*$.
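A short NumPy sketch of this fact (the matrix $A$ below is random, chosen only for illustration): $A^*A$ has non-negative eigenvalues and the same rank as $A$.

```python
import numpy as np

# Any matrix A, square or not; random entries chosen for illustration.
rng = np.random.default_rng(1)
A = rng.standard_normal((5, 3))
M = A.T @ A        # the real-matrix case of A* A

# A* A is positive semidefinite: its eigenvalues are >= 0 up to round-off ...
eigs = np.linalg.eigvalsh(M)
print(bool(np.all(eigs >= -1e-12)))           # True

# ... and rank(A) = rank(A* A).
print(np.linalg.matrix_rank(A) == np.linalg.matrix_rank(M))  # True
```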

A Hermitian matrix which is neither positive- nor negative-semidefinite is called indefinite.

A Hermitian matrix is negative definite if and only if its $k$th-order leading principal minors are negative for odd $k$ and positive for even $k$.
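This alternating-sign test can be illustrated numerically (the example matrix is mine: the negative of a positive definite matrix, hence negative definite):

```python
import numpy as np

# A negative definite example: the negative of a positive definite matrix.
M = -np.array([[4.0, 2.0],
               [2.0, 3.0]])

# Leading principal minors alternate in sign: negative for k = 1, positive for k = 2.
minors = [np.linalg.det(M[:k, :k]) for k in (1, 2)]
signs_ok = minors[0] < 0 and minors[1] > 0
print(signs_ok)                                 # True
print(bool(np.all(np.linalg.eigvalsh(M) < 0)))  # True: all eigenvalues negative
```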

## Further properties

If $M$ is positive semi-definite, one sometimes writes $M \geq 0$, and if $M$ is positive-definite one writes $M > 0$. The notion comes from functional analysis, where positive definite matrices define positive operators.

For Hermitian matrices $M, N$ we write $M \geq N$ if $M - N \geq 0$, i.e. $M - N$ is positive semi-definite. This defines a partial ordering (the Loewner order) on the set of all Hermitian matrices. One can similarly define a strict partial ordering $M > N$.

1. Every positive definite matrix is invertible and its inverse is also positive definite. If $M \geq N > 0$ then $N^{-1} \geq M^{-1} > 0$.
2. If $M$ is positive definite and $r > 0$ is a real number, then $r M$ is positive definite. If $M$ and $N$ are positive definite, then the sum $M + N$ and the products $MNM$ and $NMN$ are also positive definite. If $M N = N M$, then $M N$ is also positive definite.

3. If $M = (m_{ij}) > 0$ then the diagonal entries $m_{ii}$ are real and positive. As a consequence, $\operatorname{tr}(M) > 0$. Furthermore
$|m_{ij}| \leq \sqrt{m_{ii} m_{jj}} \leq \frac{m_{ii}+m_{jj}}{2}.$

4. A matrix $M$ is positive definite if and only if there is a positive definite matrix $B > 0$, called the square root of $M$, with $B^2 = M$. One writes $B = M^{1/2}$. This matrix $B$ is unique (but only under the assumption $B > 0$). If $M > N > 0$ then $M^{1/2} > N^{1/2} > 0$.
5. If $M, N > 0$ then $M \otimes N > 0$. (Here $\otimes$ denotes the Kronecker product.)
6. For matrices $M = (m_{ij})$, $N = (n_{ij})$, write $M \circ N$ for the entry-wise product of $M$ and $N$, i.e. the matrix whose $(i,j)$ entry is $m_{ij} n_{ij}$. Then $M \circ N$ is the Hadamard product of $M$ and $N$. If $M, N > 0$ then $M \circ N > 0$, and if $M, N$ are real matrices, the following inequality, due to Oppenheim, holds: $\det(M \circ N) \geq (\det N) \prod_{i} m_{ii}.$

7. Let $M > 0$ and $N$ be Hermitian. If $MN + NM \geq 0$ (respectively, $MN + NM > 0$) then $N \geq 0$ (respectively, $N > 0$).
8. If $M, N \geq 0$ are real matrices then $\operatorname{tr}(MN) \geq 0$.
9. If $M > 0$ is real, then there is a $\delta > 0$ such that $M \geq \delta I$, where $I$ is the identity matrix.
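Property 4 above (the unique positive definite square root) can be sketched via the spectral decomposition, taking the positive square roots of the eigenvalues. The matrix below is an illustrative choice of mine:

```python
import numpy as np

# Property 4: every positive definite M has a unique positive definite
# square root. Example matrix chosen for illustration.
M = np.array([[4.0, 2.0],
              [2.0, 3.0]])

# Build B = M^{1/2} from the spectral decomposition M = Q diag(w) Q^T
# by taking the positive square roots of the (positive) eigenvalues.
w, Q = np.linalg.eigh(M)
B = Q @ np.diag(np.sqrt(w)) @ Q.T

print(np.allclose(B @ B, M))                    # True: B^2 = M
print(bool(np.all(np.linalg.eigvalsh(B) > 0)))  # True: B itself is positive definite
```

(`scipy.linalg.sqrtm` computes the same matrix for positive definite input, but the eigendecomposition route shown here makes the construction explicit.)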

## Non-Hermitian matrices

A real matrix $M$ may have the property that $x^T M x > 0$ for all nonzero real vectors $x$ without being symmetric. The matrix

$\begin{bmatrix} 1 & 1 \\ -1 & 1 \end{bmatrix}$

satisfies this property, because for all real vectors $x = (x_1, x_2)^T$ such that $x \neq 0$,

$\begin{bmatrix} x_1 & x_2 \end{bmatrix} \begin{bmatrix} 1 & 1 \\ -1 & 1 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = x_1^2 + x_2^2 > 0.$

In general, we have $x^T M x > 0$ for all real nonzero vectors $x$ if and only if the symmetric part, $(M + M^T)/2$, is positive definite.
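A minimal NumPy check of this criterion, applied to the non-symmetric matrix from the text (whose symmetric part is the identity):

```python
import numpy as np

# The non-symmetric matrix from the text.
M = np.array([[1.0, 1.0],
              [-1.0, 1.0]])

# x^T M x > 0 for all nonzero real x iff the symmetric part (M + M^T)/2
# is positive definite; here the symmetric part is the identity matrix.
S = (M + M.T) / 2
print(np.allclose(S, np.eye(2)))                # True
print(bool(np.all(np.linalg.eigvalsh(S) > 0)))  # True
```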

The situation for complex matrices may be different, depending on how one generalizes the inequality $z^* M z > 0$. If $z^* M z$ is real for all complex vectors $z$, then the matrix $M$ is necessarily Hermitian. So, if we require that $z^* M z$ be real and positive, then $M$ is automatically Hermitian. On the other hand, we have $\operatorname{Re}(z^* M z) > 0$ for all complex nonzero vectors $z$ if and only if the Hermitian part, $(M + M^*)/2$, is positive definite.

In summary, the distinguishing feature between the real and complex case is that a bounded positive operator on a complex Hilbert space is necessarily Hermitian, or self-adjoint. The general claim can be argued using the polarization identity. That is no longer true in the real case.

There is no agreement in the literature on the proper definition of positive-definite for non-Hermitian matrices.