An orthogonal matrix is a square matrix with real entries whose columns and rows are orthonormal vectors (orthogonal unit vectors). Equivalently, a matrix Q is orthogonal if its transpose equals its inverse.
An orthogonal matrix Q is necessarily square and invertible, with inverse Q^−1 = Q^T. As a linear transformation, an orthogonal matrix preserves the dot product of vectors and therefore acts as an isometry of Euclidean space. In other words, it is a real unitary transformation.
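The two properties above, Q^T Q = I and preservation of the dot product, can be checked numerically. The following is a minimal sketch in pure Python; the helper names (rotation, transpose, matmul, matvec, dot) are illustrative choices, not standard library functions.

```python
import math

def rotation(theta):
    # 2x2 rotation matrix, a standard example of an orthogonal matrix
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s], [s, c]]

def transpose(M):
    return [list(row) for row in zip(*M)]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def matvec(M, v):
    return [sum(m, 0) if False else sum(m * x for m, x in zip(row, v)) for row in M]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

Q = rotation(0.7)

# Q^T Q should be the identity matrix (up to floating-point rounding)
QtQ = matmul(transpose(Q), Q)
assert all(abs(QtQ[i][j] - (1.0 if i == j else 0.0)) < 1e-12
           for i in range(2) for j in range(2))

# The dot product is preserved: (Qu) . (Qv) = u . v
u, v = [1.0, 2.0], [3.0, -1.0]
assert abs(dot(matvec(Q, u), matvec(Q, v)) - dot(u, v)) < 1e-12
```

Because Q^T Q = I, the inverse of an orthogonal matrix is available for free as its transpose, with no linear solve required.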
The set of n × n orthogonal matrices forms a group O(n), known as the orthogonal group. The subgroup SO(n), consisting of orthogonal matrices with determinant +1, is called the special orthogonal group, and each of its elements is a special orthogonal matrix. As a linear transformation, every special orthogonal matrix acts as a rotation.
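The determinant distinguishes the two kinds of orthogonal matrices: every orthogonal matrix has determinant +1 or −1, and SO(n) consists of those with determinant +1. A small sketch in pure Python (the 2×2 case, with an illustrative det2 helper):

```python
import math

def det2(M):
    # Determinant of a 2x2 matrix
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

c, s = math.cos(0.3), math.sin(0.3)

# A rotation: orthogonal with determinant +1, so an element of SO(2)
rot = [[c, -s], [s, c]]

# A reflection: also orthogonal, but with determinant -1, so outside SO(2)
refl = [[c, s], [s, -c]]

assert abs(det2(rot) - 1.0) < 1e-12
assert abs(det2(refl) + 1.0) < 1e-12
```

The reflection's columns are still orthonormal, so it lies in O(2) but not in the special orthogonal subgroup.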
The most common examples of orthogonal matrices are rotations and reflections, both of which are important in developing numerical methods. Because orthogonal matrices do not change the lengths of vectors, they do not significantly amplify rounding errors, which makes them highly desirable in numerical applications; they are therefore favored for stable algorithms. They are particularly important in eigenvalue methods for symmetric matrices, because they produce similarity transformations that preserve symmetry.
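The symmetry-preserving property can be seen directly: if A is symmetric and Q is orthogonal, then B = Q^T A Q satisfies B^T = Q^T A^T Q = B. A minimal sketch in pure Python (the matrix A and angle here are arbitrary illustrative values):

```python
import math

def transpose(M):
    return [list(row) for row in zip(*M)]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

# A symmetric matrix and an orthogonal (rotation) matrix Q
A = [[2.0, 1.0], [1.0, 3.0]]
c, s = math.cos(0.5), math.sin(0.5)
Q = [[c, -s], [s, c]]

# The orthogonal similarity transform B = Q^T A Q
B = matmul(matmul(transpose(Q), A), Q)

# B is still symmetric (up to rounding), and shares A's eigenvalues
assert abs(B[0][1] - B[1][0]) < 1e-12
```

This is the mechanism behind methods such as Jacobi rotations and the symmetric QR iteration: each step applies an orthogonal similarity transform, so the working matrix stays symmetric throughout.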