
Theorems and definitions in linear algebra

This article collects the main theorems and definitions in linear algebra.

Vector spaces

A vector space (or linear space) V over a field F consists of a set on which two operations (called addition and scalar multiplication, respectively) are defined so that for each pair of elements x, y in V there is a unique element x + y in V, and for each element a in F and each element x in V there is a unique element ax in V, such that the following conditions hold.

• (VS 1) For all $x, y$ in V, $x+y=y+x$ (commutativity of addition).
• (VS 2) For all $x, y, z$ in V, $\left(x+y\right)+z=x+\left(y+z\right)$ (associativity of addition).
• (VS 3) There exists an element in V denoted by $0$ such that $x+0=x$ for each $x$ in V.
• (VS 4) For each element $x$ in V there exists an element $y$ in V such that $x+y=0$.
• (VS 5) For each element $x$ in V, $1x=x$.
• (VS 6) For each pair of elements $a,b$ in F and each element $x$ in V, $\left(ab\right)x=a\left(bx\right)$.
• (VS 7) For each element $a$ in F and each pair of elements $x,y$ in V, $a\left(x+y\right)=ax+ay$.
• (VS 8) For each pair of elements $a,b$ in F and each element $x$ in V, $\left(a+b\right)x=ax+bx$.
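As a quick numerical illustration (a NumPy sketch with randomly chosen vectors; a spot-check, not a proof), a few of these axioms can be verified for the familiar example V = R³ over F = R:

```python
import numpy as np

rng = np.random.default_rng(1)
x, y = rng.standard_normal(3), rng.standard_normal(3)  # arbitrary vectors in R^3
a, b = rng.standard_normal(2)                          # arbitrary scalars in R
assert np.allclose(x + y, y + x)                       # VS 1
assert np.allclose(a * (x + y), a * x + a * y)         # VS 7
assert np.allclose((a + b) * x, a * x + b * x)         # VS 8
```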

Linear transformations and matrices

Subtopics: linear transformations, null spaces, ranges, the matrix representation of a linear transformation, composition of linear transformations, matrix multiplication, invertibility, isomorphisms, the change-of-coordinates matrix.

P.S. Related notions: coefficients of a differential equation, differentiability of complex functions, vector spaces of functions, differential operators, the auxiliary polynomial, powers of a complex number, the exponential function.

${\color{Blue}~2.1}$ N(T) and R(T) are subspaces

Let V and W be vector spaces and T: V→W be linear. Then N(T) and R(T) are subspaces of V and W, respectively.

${\color{Blue}~2.2}$ R(T) = span of T(basis in V)

Let V and W be vector spaces, and let T: V→W be linear. If $\beta=\{v_1,v_2,...,v_n\}$ is a basis for V, then
$\mbox{R}(T)=\mbox{span}(T(\beta))=\mbox{span}(\{T(v_1),T(v_2),...,T(v_n)\})$.

${\color{Blue}~2.3}$ Dimension Theorem

Let V and W be vector spaces, and let T: V→W be linear. If V is finite-dimensional, then
$\mbox{nullity}(T)+\mbox{rank}(T)=\dim(V).$
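The dimension theorem can be checked numerically. The sketch below (NumPy, with an arbitrary illustrative matrix) treats A as a linear map T: F⁵→F³ and verifies nullity(T) + rank(T) = dim(V):

```python
import numpy as np

A = np.array([[1., 2., 0., 1., 3.],
              [0., 1., 1., 2., 0.],
              [1., 3., 1., 3., 3.]])   # third row = row1 + row2, so rank(A) = 2

rank = np.linalg.matrix_rank(A)          # dim R(T)
nullity = A.shape[1] - rank              # dim N(T)
assert nullity + rank == A.shape[1]      # nullity(T) + rank(T) = dim(V) = 5

# Cross-check: the trailing right-singular vectors span N(T).
_, s, Vt = np.linalg.svd(A)
null_basis = Vt[rank:]
assert np.allclose(A @ null_basis.T, 0)  # every null-space basis vector maps to 0
```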

${\color{Blue}~2.4}$ one-to-one ⇔ N(T)={0}

Let V and W be vector spaces, and let T: V→W be linear. Then T is one-to-one if and only if N(T)={0}.

${\color{Blue}~2.5}$ one-to-one ⇔ onto ⇔ rank(T)=dim(V)

Let V and W be vector spaces of equal (finite) dimension, and let T:V→W be linear. Then the following are equivalent.
(a) T is one-to-one.
(b) T is onto.
(c) rank(T)=dim(V).

${\color{Blue}~2.6}$ for any $\{w_1,w_2,...,w_n\}$ there is exactly one T with $T(v_i)=w_i$

Let V and W be vector spaces over F, and suppose that $\{v_1, v_2,...,v_n\}$ is a basis for V. For $w_1, w_2,...,w_n$ in W, there exists exactly one linear transformation T: V→W such that $T(v_i)=w_i$ for $i=1,2,...,n.$
Corollary. Let V and W be vector spaces, and suppose that V has a finite basis $\left\{v_1,v_2,...,v_n\right\}$. If U, T: V→W are linear and $U\left(v_i\right)=T\left(v_i\right)$ for $i=1,2,...,n,$ then U=T.

${\color{Blue}~2.7}$ $\mathcal{L}$(V,W) is a vector space

Let V and W be vector spaces over a field F, and let T, U: V→W be linear.
(a) For all $a\in$ F, $aT+U$ is linear.
(b) Using the operations of addition and scalar multiplication in the preceding definition, the collection of all linear transformations from V to W is a vector space over F.

${\color{Blue}~2.8}$ linearity of the matrix representation of a linear transformation

Let V and W be finite-dimensional vector spaces with ordered bases β and γ, respectively, and let T, U: V→W be linear transformations. Then
(a) $[T+U]_\beta^\gamma=[T]_\beta^\gamma+[U]_\beta^\gamma$ and
(b) $[aT]_\beta^\gamma=a[T]_\beta^\gamma$ for all scalars $a$.

${\color{Blue}~2.9}$ composition of linear transformations is linear

Let V, W, and Z be vector spaces over the same field F, and let T:V→W and U:W→Z be linear. Then UT:V→Z is linear.

${\color{Blue}~2.10}$ laws of linear operators

Let V be a vector space, and let T, $U_1$, $U_2\in\mathcal{L}$(V). Then
(a) $T(U_1+U_2)=TU_1+TU_2$ and $(U_1+U_2)T=U_1T+U_2T$
(b) $T(U_1U_2)=(TU_1)U_2$
(c) $TI=IT=T$
(d) $a(U_1U_2)=(aU_1)U_2=U_1(aU_2)$ for all scalars $a$.

${\color{Blue}~2.11}$ $[UT]_\alpha^\gamma=[U]_\beta^\gamma[T]_\alpha^\beta$

Let V, W, and Z be finite-dimensional vector spaces with ordered bases α, β, and γ, respectively. Let T: V→W and U: W→Z be linear transformations. Then
$[UT]_\alpha^\gamma=[U]_\beta^\gamma[T]_\alpha^\beta$.

Corollary. Let V be a finite-dimensional vector space with an ordered basis β. Let T,U∈$\mathcal{L}$(V). Then $[UT]_\beta=[U]_\beta[T]_\beta$.

${\color{Blue}~2.12}$ laws of matrix multiplication

Let A be an m×n matrix, B and C be n×p matrices, and D and E be q×m matrices. Then
(a) A(B+C)=AB+AC and (D+E)A=DA+EA.
(b) $a$(AB)=($a$A)B=A($a$B) for any scalar $a$.
(c) $I_mA=A=AI_n$.
(d) If V is an n-dimensional vector space with an ordered basis β, then $[I_V]_\beta=I_n$.

Corollary. Let A be an m×n matrix, $B_1,B_2,...,B_k$ be n×p matrices, $C_1,C_2,...,C_k$ be q×m matrices, and $a_1,a_2,...,a_k$ be scalars. Then

$A\Bigg(\sum_{i=1}^k a_iB_i\Bigg)=\sum_{i=1}^k a_iAB_i$
and
$\Bigg(\sum_{i=1}^k a_iC_i\Bigg)A=\sum_{i=1}^k a_iC_iA$.

${\color{Blue}~2.13}$ law of column multiplication

Let A be an m×n matrix and B be an n×p matrix. For each $j$ $(1\le j\le p)$ let $u_j$ and $v_j$ denote the jth columns of AB and B, respectively. Then
(a) $u_j=Av_j$
(b) $v_j=Be_j$, where $e_j$ is the jth standard vector of $F^p$.
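Both identities are easy to confirm numerically; the NumPy sketch below uses arbitrarily chosen random matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))   # m x n
B = rng.standard_normal((4, 5))   # n x p
AB = A @ B
for j in range(B.shape[1]):
    e_j = np.zeros(B.shape[1])
    e_j[j] = 1.0                               # j-th standard vector of F^p
    assert np.allclose(AB[:, j], A @ B[:, j])  # (a) u_j = A v_j
    assert np.allclose(B[:, j], B @ e_j)       # (b) v_j = B e_j
```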

${\color{Blue}~2.14}$ $[T(u)]_\gamma=[T]_\beta^\gamma[u]_\beta$

Let V and W be finite-dimensional vector spaces having ordered bases β and γ, respectively, and let T: V→W be linear. Then, for each u ∈ V, we have
$[T(u)]_\gamma=[T]_\beta^\gamma[u]_\beta$.

${\color{Blue}~2.15}$ properties of the left-multiplication transformation

Let A be an m×n matrix with entries from F. Then the left-multiplication transformation $L_A$: $F^n$→$F^m$ is linear. Furthermore, if B is any other m×n matrix (with entries from F) and β and γ are the standard ordered bases for $F^n$ and $F^m$, respectively, then we have the following properties.
(a) $[L_A]_\beta^\gamma=A$.
(b) $L_A=L_B$ if and only if A=B.
(c) $L_{A+B}=L_A+L_B$ and $L_{aA}=aL_A$ for all $a$∈F.
(d) If T:$F^n$→$F^m$ is linear, then there exists a unique m×n matrix C such that $T=L_C$. In fact, $C=[T]_\beta^\gamma$.
(e) If E is an n×p matrix, then $L_{AE}=L_AL_E$.
(f) If m=n, then $L_{I_n}=I_{F^n}$.

${\color{Blue}~2.16}$ A(BC)=(AB)C

Let A,B, and C be matrices such that A(BC) is defined. Then A(BC)=(AB)C; that is, matrix multiplication is associative.

${\color{Blue}~2.17}$ $T^{-1}$ is linear

Let V and W be vector spaces, and let T:V→W be linear and invertible. Then $T^{-1}$: W→V is linear.

${\color{Blue}~2.18}$ $[T^{-1}]_\gamma^\beta=([T]_\beta^\gamma)^{-1}$

Let V and W be finite-dimensional vector spaces with ordered bases β and γ, respectively. Let T:V→W be linear. Then T is invertible if and only if $[T]_\beta^\gamma$ is invertible. Furthermore, $[T^{-1}]_\gamma^\beta=([T]_\beta^\gamma)^{-1}$.

Lemma. Let T be an invertible linear transformation from V to W. Then V is finite-dimensional if and only if W is finite-dimensional. In this case, dim(V)=dim(W).

Corollary 1. Let V be a finite-dimensional vector space with an ordered basis β, and let T:V→V be linear. Then T is invertible if and only if $[T]_\beta$ is invertible. Furthermore, $[T^{-1}]_\beta=([T]_\beta)^{-1}$.

Corollary 2. Let A be an n×n matrix. Then A is invertible if and only if $L_A$ is invertible. Furthermore, $(L_A)^{-1}=L_{A^{-1}}$.

${\color{Blue}~2.19}$ V is isomorphic to W ⇔ dim(V)=dim(W)

Let V and W be finite-dimensional vector spaces (over the same field). Then V is isomorphic to W if and only if dim(V)=dim(W).

Corollary. Let V be a vector space over F. Then V is isomorphic to $F^n$ if and only if dim(V)=n.

${\color{Blue}~2.20}$ $\mathcal{L}$(V,W) is isomorphic to $M_{m\times n}$(F)

Let V and W be finite-dimensional vector spaces over F of dimensions n and m, respectively, and let β and γ be ordered bases for V and W, respectively. Then the function $\Phi$: $\mathcal{L}$(V,W)→$M_{m\times n}$(F), defined by $\Phi(T)=[T]_\beta^\gamma$ for T∈$\mathcal{L}$(V,W), is an isomorphism.

Corollary. Let V and W be finite-dimensional vector spaces of dimensions n and m, respectively. Then $\mathcal{L}$(V,W) is finite-dimensional of dimension mn.

${\color{Blue}~2.21}$ $\Phi_\beta$ is an isomorphism

For any finite-dimensional vector space V with ordered basis β, $\Phi_\beta$ is an isomorphism.

${\color{Blue}~2.22}$ properties of the change-of-coordinates matrix

Let β and β' be two ordered bases for a finite-dimensional vector space V, and let $Q=[I_V]_{\beta'}^\beta$. Then
(a) $Q$ is invertible.
(b) For any $v\in$ V, $[v]_\beta=Q[v]_{\beta'}$.

${\color{Blue}~2.23}$ $[T]_{\beta'}=Q^{-1}[T]_\beta Q$

Let T be a linear operator on a finite-dimensional vector space V, and let β and β' be two ordered bases for V. Suppose that Q is the change of coordinate matrix that changes β'-coordinates into β-coordinates. Then
$[T]_{\beta'}=Q^{-1}[T]_\beta Q$.

Corollary. Let A∈$M_{n\times n}$(F), and let γ be an ordered basis for $F^n$. Then $[L_A]_\gamma=Q^{-1}AQ$, where Q is the n×n matrix whose jth column is the jth vector of γ.
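A numeric sketch of the corollary (NumPy; the matrix A and the basis γ, given as the columns of Q, are arbitrary illustrative choices):

```python
import numpy as np

A = np.array([[2., 1.],
              [0., 3.]])
Q = np.array([[1., 1.],
              [0., 1.]])                 # columns of Q form the ordered basis gamma
LA_gamma = np.linalg.inv(Q) @ A @ Q      # [L_A]_gamma = Q^{-1} A Q

# Consistency: applying A and then converting to gamma-coordinates agrees with
# converting first and then applying [L_A]_gamma.
v = np.array([1., 2.])
v_gamma = np.linalg.solve(Q, v)          # [v]_gamma
assert np.allclose(np.linalg.solve(Q, A @ v), LA_gamma @ v_gamma)
```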

${\color{Blue}~2.27}$ $p(D)x=0$ ⇒ $x^{(k)}$ exists for every k∈N

Any solution to a homogeneous linear differential equation with constant coefficients has derivatives of all orders; that is, if $x$ is a solution to such an equation, then $x^\left\{\left(k\right)\right\}$ exists for every positive integer k.

${\color{Blue}~2.28}$ {solutions} = N(p(D))

The set of all solutions to a homogeneous linear differential equation with constant coefficients coincides with the null space of p(D), where p(t) is the auxiliary polynomial of the equation.

Corollary. The set of all solutions to a homogeneous linear differential equation with constant coefficients is a subspace of $C^\infty$.

${\color{Blue}~2.29}$ derivative of the exponential function

For any exponential function $f(t)=e^{ct}$, $f'(t)=ce^{ct}$.

${\color{Blue}~2.30}$ $\{e^{-a_0t}\}$ is a basis of N(D+$a_0$I)

The solution space for the differential equation
$y'+a_0y=0$
is of dimension 1 and has $\{e^{-a_0t}\}$ as a basis.

Corollary. For any complex number c, the null space of the differential operator D−cI has $\{e^{ct}\}$ as a basis.
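As a numeric sanity check (not part of the original statement), one can approximate D by a central difference and confirm that (D−cI)e^{ct} ≈ 0 for an arbitrary complex c:

```python
import numpy as np

c = 1.5 - 0.7j                           # arbitrary complex number
f = lambda t: np.exp(c * t)              # candidate basis vector of N(D - cI)
t, h = 0.3, 1e-6
Df = (f(t + h) - f(t - h)) / (2 * h)     # central-difference approximation of D
assert abs(Df - c * f(t)) < 1e-6         # (D - cI) e^{ct} is approximately 0
```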

${\color{Blue}~2.31}$ $e^{ct}$ is a solution

Let p(t) be the auxiliary polynomial for a homogeneous linear differential equation with constant coefficients. For any complex number c, if c is a zero of p(t), then $e^{ct}$ is a solution to the differential equation.

${\color{Blue}~2.32}$ dim(N(p(D)))=n

For any differential operator p(D) of order n, the null space of p(D) is an n-dimensional subspace of $C^\infty$.

Lemma 1. The differential operator D−cI: $C^\infty$→$C^\infty$ is onto for any complex number c.

Lemma 2. Let V be a vector space, and suppose that T and U are linear operators on V such that U is onto and the null spaces of T and U are finite-dimensional. Then the null space of TU is finite-dimensional, and

dim(N(TU))=dim(N(T))+dim(N(U)).

Corollary. The solution space of any nth-order homogeneous linear differential equation with constant coefficients is an n-dimensional subspace of $C^\infty$.

${\color{Blue}~2.33}$ the $e^{c_it}$ are linearly independent (for distinct $c_i$)

Given n distinct complex numbers $c_1, c_2,...,c_n$, the set of exponential functions $\left\{e^\left\{c_1t\right\},e^\left\{c_2t\right\},...,e^\left\{c_nt\right\}\right\}$ is linearly independent.

Corollary. For any nth-order homogeneous linear differential equation with constant coefficients, if the auxiliary polynomial has n distinct zeros $c_1, c_2, ..., c_n$, then $\left\{e^\left\{c_1t\right\},e^\left\{c_2t\right\},...,e^\left\{c_nt\right\}\right\}$ is a basis for the solution space of the differential equation.

Lemma. For a given complex number c and positive integer n, suppose that $(t-c)^n$ is the auxiliary polynomial of a homogeneous linear differential equation with constant coefficients. Then the set

$\beta=\{e^{ct},te^{ct},...,t^{n-1}e^{ct}\}$
is a basis for the solution space of the equation.

${\color{Blue}~2.34}$ general solution of a homogeneous linear differential equation

Given a homogeneous linear differential equation with constant coefficients and auxiliary polynomial
$(t-c_1)^{n_1}(t-c_2)^{n_2}\cdots(t-c_k)^{n_k},$
where $n_1, n_2,...,n_k$ are positive integers and $c_1, c_2, ..., c_k$ are distinct complex numbers, the following set is a basis for the solution space of the equation:
$\{e^{c_1t}, te^{c_1t},...,t^{n_1-1}e^{c_1t},...,e^{c_kt},te^{c_kt},...,t^{n_k-1}e^{c_kt}\}$.

Determinants

If
$A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$
is a 2×2 matrix with entries from a field F, then we define the determinant of A, denoted det(A) or |A|, to be the scalar $ad-bc$.

• Theorem 1: linear function of each row.
• Theorem 2: nonzero determinant ⇔ invertible matrix

Theorem 1: The function det: $M_{2\times2}$(F) → F is a linear function of each row of a 2×2 matrix when the other row is held fixed. That is, if $u,v,$ and $w$ are in F² and $k$ is a scalar, then

$\det\begin{pmatrix} u + kv \\ w \end{pmatrix} = \det\begin{pmatrix} u \\ w \end{pmatrix} + k\det\begin{pmatrix} v \\ w \end{pmatrix}$

and

$\det\begin{pmatrix} w \\ u + kv \end{pmatrix} = \det\begin{pmatrix} w \\ u \end{pmatrix} + k\det\begin{pmatrix} w \\ v \end{pmatrix}$

Theorem 2: Let A $\in$ $M_{2\times2}$(F). Then the determinant of A is nonzero if and only if A is invertible. Moreover, if A is invertible, then

$A^{-1}=\frac{1}{\det(A)}\begin{pmatrix} A_{22} & -A_{12} \\ -A_{21} & A_{11} \end{pmatrix}$
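Both 2×2 formulas can be verified directly; the NumPy sketch below uses an arbitrary invertible example:

```python
import numpy as np

A = np.array([[3., 5.],
              [1., 2.]])                       # arbitrary invertible 2x2 example
det = A[0, 0]*A[1, 1] - A[0, 1]*A[1, 0]        # ad - bc
assert np.isclose(det, np.linalg.det(A))       # matches the library determinant
A_inv = (1/det) * np.array([[ A[1, 1], -A[0, 1]],
                            [-A[1, 0],  A[0, 0]]])
assert np.allclose(A @ A_inv, np.eye(2))       # the formula really inverts A
```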

Diagonalization

Characteristic polynomial of a linear operator/matrix

${\color{Blue}~5.1}$ diagonalizable ⇔ basis of eigenvectors

A linear operator T on a finite-dimensional vector space V is diagonalizable if and only if there exists an ordered basis β for V consisting of eigenvectors of T. Furthermore, if T is diagonalizable, $\beta=\{v_1,v_2,...,v_n\}$ is an ordered basis of eigenvectors of T, and $D=[T]_\beta$, then D is a diagonal matrix and $D_{jj}$ is the eigenvalue corresponding to $v_j$ for $1\le j\le n$.
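A numeric illustration of this theorem (NumPy; the matrix is an arbitrary diagonalizable example): the columns of P form an eigenvector basis, and P⁻¹AP is diagonal with the matching eigenvalues.

```python
import numpy as np

A = np.array([[4., 1.],
              [2., 3.]])                   # arbitrary diagonalizable example
eigvals, P = np.linalg.eig(A)              # columns of P: an eigenvector basis
D = np.linalg.inv(P) @ A @ P               # [T]_beta in the eigenvector basis
assert np.allclose(D, np.diag(eigvals))    # D is diagonal with D_jj = lambda_j
```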

${\color{Blue}~5.2}$ eigenvalue ⇔ det(A−λ$I_n$)=0

Let A∈$M_{n\times n}$(F). Then a scalar λ is an eigenvalue of A if and only if det(A−λ$I_n$)=0.

${\color{Blue}~5.3}$ characteristic polynomial

Let A∈$M_{n\times n}$(F).
(a) The characteristic polynomial of A is a polynomial of degree n with leading coefficient $(-1)^n$.
(b) A has at most n distinct eigenvalues.

${\color{Blue}~5.4}$ υ corresponds to λ ⇔ υ∈N(T−λI)

Let T be a linear operator on a vector space V, and let λ be an eigenvalue of T.
A vector υ∈V is an eigenvector of T corresponding to λ if and only if υ≠0 and υ∈N(T-λI).

${\color{Blue}~5.5}$ eigenvectors to distinct eigenvalues are linearly independent

Let T be a linear operator on a vector space V, and let $\lambda_1,\lambda_2,...,\lambda_k$ be distinct eigenvalues of T. If $v_1,v_2,...,v_k$ are eigenvectors of T such that $\lambda_i$ corresponds to $v_i$ ($1\le i\le k$), then $\{v_1,v_2,...,v_k\}$ is linearly independent.

${\color{Blue}~5.6}$ characteristic polynomial splits

The characteristic polynomial of any diagonalizable linear operator splits.

${\color{Blue}~5.7}$ 1 ≤ dim($E_\lambda$) ≤ m

Let T be a linear operator on a finite-dimensional vector space V, and let λ be an eigenvalue of T having multiplicity $m$. Then $1\le\dim(E_{\lambda})\le m$.

${\color{Blue}~5.8}$ S=$S_1$∪$S_2$∪...∪$S_k$ is linearly independent

Let T be a linear operator on a vector space V, and let $\lambda_1,\lambda_2,...,\lambda_k$ be distinct eigenvalues of T. For each $i=1,2,...,k,$ let $S_i$ be a finite linearly independent subset of the eigenspace $E_{\lambda_i}$. Then $S=S_1\cup S_2\cup...\cup S_k$ is a linearly independent subset of V.

${\color{Blue}~5.9}$ criteria for T to be diagonalizable

Let T be a linear operator on a finite-dimensional vector space V such that the characteristic polynomial of T splits. Let $\lambda_1,\lambda_2,...,\lambda_k$ be the distinct eigenvalues of T. Then
(a) T is diagonalizable if and only if the multiplicity of $\lambda_i$ is equal to $\dim(E_{\lambda_i})$ for all $i$.
(b) If T is diagonalizable and $\beta_i$ is an ordered basis for $E_{\lambda_i}$ for each $i$, then $\beta=\beta_1\cup\beta_2\cup...\cup\beta_k$ is an ordered basis for V consisting of eigenvectors of T.

Test for diagonalization

Inner Product Spaces

Inner product, standard inner product on $F^n$, conjugate transpose, adjoint, Frobenius inner product, complex/real inner product space, norm, length, conjugate linear, orthogonal, perpendicular, unit vector, orthonormal, normalizing.

${\color{Blue}~6.1}$ properties of inner products

Let V be an inner product space. Then for $x,y,z\in$ V and $c\in$ F, the following statements are true.
(a) $\langle x,y+z\rangle=\langle x,y\rangle+\langle x,z\rangle.$
(b) $\langle x,cy\rangle=\bar{c}\langle x,y\rangle.$
(c) $\langle x,\mathit{0}\rangle=\langle\mathit{0},x\rangle=0.$
(d) $\langle x,x\rangle=0$ if and only if $x=\mathit{0}.$
(e) If $\langle x,y\rangle=\langle x,z\rangle$ for all $x\in$ V, then $y=z$.

${\color{Blue}~6.2}$ laws of the norm

Let V be an inner product space over F. Then for all $x,y\in$ V and $c\in$ F, the following statements are true.
(a) $\|cx\|=|c|\cdot\|x\|$.
(b) $\|x\|=0$ if and only if $x=0$. In any case, $\|x\|\ge0$.
(c) (Cauchy–Schwarz Inequality) $|\langle x,y\rangle|\le\|x\|\cdot\|y\|$.
(d) (Triangle Inequality) $\|x+y\|\le\|x\|+\|y\|$.

${\color{Blue}~6.3}$ span of an orthogonal subset

Let V be an inner product space and S=$\{v_1,v_2,...,v_k\}$ be an orthogonal subset of V consisting of nonzero vectors. If $y$∈span(S), then
$y=\sum_{i=1}^k\frac{\langle y,v_i\rangle}{\|v_i\|^2}v_i$

${\color{Blue}~6.4}$ Gram–Schmidt process

Let V be an inner product space and S=$\{w_1,w_2,...,w_n\}$ be a linearly independent subset of V. Define S′=$\{v_1,v_2,...,v_n\}$, where $v_1=w_1$ and
$v_k=w_k-\sum_{j=1}^{k-1}\frac{\langle w_k,v_j\rangle}{\|v_j\|^2}v_j$ for $2\le k\le n$.
Then S′ is an orthogonal set of nonzero vectors such that span(S′)=span(S).
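The process translates directly into code. The sketch below implements it for the standard dot product on Rⁿ (the general inner product case is analogous), with an arbitrary independent set as input:

```python
import numpy as np

def gram_schmidt(vectors):
    """Gram-Schmidt as stated above, for the standard dot product:
    returns an orthogonal list with the same span as the input."""
    out = []
    for w in vectors:
        # subtract the projections of w onto the already-built vectors
        v = w - sum(((w @ v_j) / (v_j @ v_j)) * v_j for v_j in out)
        out.append(v)
    return out

S = [np.array([1., 1., 0.]),
     np.array([1., 0., 1.]),
     np.array([0., 1., 1.])]
S_prime = gram_schmidt(S)
for i in range(len(S_prime)):
    for j in range(i):
        assert abs(S_prime[i] @ S_prime[j]) < 1e-12   # pairwise orthogonal
```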

${\color{Blue}~6.5}$ orthonormal basis

Let V be a nonzero finite-dimensional inner product space. Then V has an orthonormal basis β. Furthermore, if β=$\{v_1,v_2,...,v_n\}$ and x∈V, then
$x=\sum_{i=1}^n\langle x,v_i\rangle v_i$.

Corollary. Let V be a finite-dimensional inner product space with an orthonormal basis β=$\{v_1,v_2,...,v_n\}$. Let T be a linear operator on V, and let A=$[T]_\beta$. Then for any $i$ and $j$, $A_{ij}=\langle T(v_j),v_i\rangle$.

${\color{Blue}~6.6}$ orthogonal projection onto W via an orthonormal basis

Let W be a finite-dimensional subspace of an inner product space V, and let $y$∈V. Then there exist unique vectors $u$∈W and $z$∈W$^\perp$ such that $y=u+z$. Furthermore, if $\{v_1,v_2,...,v_k\}$ is an orthonormal basis for W, then
$u=\sum_{i=1}^k\langle y,v_i\rangle v_i$.

Corollary. In the notation of Theorem 6.6, the vector $u$ is the unique vector in W that is "closest" to $y$; that is, for any $x$∈W, $\|y-x\|\ge\|y-u\|$, and this inequality is an equality if and only if $x=u$.
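A small NumPy illustration of the decomposition y = u + z and the closest-vector property (vectors chosen arbitrarily; W is the xy-plane in R³):

```python
import numpy as np

v1 = np.array([1., 0., 0.])
v2 = np.array([0., 1., 0.])           # orthonormal basis of W (the xy-plane)
y  = np.array([3., 4., 5.])
u = (y @ v1) * v1 + (y @ v2) * v2     # u = sum <y, v_i> v_i
z = y - u                             # the component of y in W-perp
assert abs(z @ v1) < 1e-12 and abs(z @ v2) < 1e-12   # z is orthogonal to W
# u is at least as close to y as any other x in W, for example:
x = 2.0 * v1 - 1.0 * v2
assert np.linalg.norm(y - x) >= np.linalg.norm(y - u)
```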

${\color{Blue}~6.7}$ properties of orthonormal sets

Suppose that $S=\{v_1,v_2,...,v_k\}$ is an orthonormal set in an $n$-dimensional inner product space V. Then
(a) S can be extended to an orthonormal basis $\{v_1, v_2,...,v_k,v_{k+1},...,v_n\}$ for V.
(b) If W=span(S), then $S_1=\{v_{k+1},v_{k+2},...,v_n\}$ is an orthonormal basis for W$^\perp$ (using the preceding notation).
(c) If W is any subspace of V, then dim(V)=dim(W)+dim(W$^\perp$).

${\color{Blue}~6.8}$ linear functionals are represented by the inner product

Let V be a finite-dimensional inner product space over F, and let $g$:V→F be a linear transformation. Then there exists a unique vector $y$∈V such that $g(x)=\langle x,y\rangle$ for all $x$∈V.

${\color{Blue}~6.9}$ definition of T*

Let V be a finite-dimensional inner product space, and let T be a linear operator on V. Then there exists a unique function T*:V→V such that $\langle T(x),y\rangle=\langle x,T^*(y)\rangle$ for all $x,y$∈V. Furthermore, T* is linear.

${\color{Blue}~6.10}$ $[T^*]_\beta=[T]_\beta^*$

Let V be a finite-dimensional inner product space, and let β be an orthonormal basis for V. If T is a linear operator on V, then
$[T^*]_\beta=[T]^*_\beta$.

${\color{Blue}~6.11}$ properties of T*

Let V be an inner product space, and let T and U be linear operators on V. Then
(a) (T+U)*=T*+U*;
(b) $(cT)^*=\bar{c}T^*$ for any $c$∈F;
(c) (TU)*=U*T*;
(d) T**=T;
(e) I*=I.

Corollary. Let A and B be n×n matrices. Then
(a) (A+B)*=A*+B*;
(b) $(cA)^*=\bar{c}A^*$ for any $c$∈F;
(c) (AB)*=B*A*;
(d) A**=A;
(e) I*=I.

${\color{Blue}~6.12}$ least squares approximation

Let A∈$M_{m\times n}$(F) and $y$∈$F^m$. Then there exists $x_0$∈$F^n$ such that $(A^*A)x_0=A^*y$ and $\|Ax_0-y\|\le\|Ax-y\|$ for all $x$∈$F^n$.

Lemma 1. Let A∈$M_{m\times n}$(F), $x$∈$F^n$, and $y$∈$F^m$. Then

$\langle Ax,y\rangle_m=\langle x,A^*y\rangle_n$

Lemma 2. Let A∈$M_{m\times n}$(F). Then rank($A^*A$)=rank(A).

Corollary (of Lemma 2). If A is an m×n matrix such that rank(A)=n, then $A^*A$ is invertible.
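The normal-equations route suggested by Theorem 6.12 and this corollary can be sketched as follows (NumPy; an arbitrary overdetermined example with rank(A) = n, so A*A is invertible):

```python
import numpy as np

A = np.array([[1., 1.],
              [1., 2.],
              [1., 3.]])                        # rank(A) = n = 2
y = np.array([1., 2., 2.])
x0 = np.linalg.solve(A.T @ A, A.T @ y)          # solve (A*A) x0 = A* y
x_ref, *_ = np.linalg.lstsq(A, y, rcond=None)   # reference least-squares solver
assert np.allclose(x0, x_ref)                   # both give the same minimizer
```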

${\color{Blue}~6.13}$ minimal solutions to systems of linear equations

Let A∈$M_{m\times n}$(F) and $b$∈$F^m$. Suppose that $Ax=b$ is consistent. Then the following statements are true.
(a) There exists exactly one minimal solution $s$ of $Ax=b$, and $s$∈R($L_{A^*}$).
(b) The vector $s$ is the only solution to $Ax=b$ that lies in R($L_{A^*}$); that is, if $u$ satisfies $(AA^*)u=b$, then $s=A^*u$.
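A sketch of part (b) (NumPy; an arbitrary consistent underdetermined example): solve (AA*)u = b and take s = A*u, which matches the minimal-norm (pseudoinverse) solution.

```python
import numpy as np

A = np.array([[1., 1., 0.],
              [0., 1., 1.]])                  # arbitrary full-row-rank example
b = np.array([2., 3.])                        # Ax = b is consistent
u = np.linalg.solve(A @ A.T, b)               # solve (A A*) u = b
s = A.T @ u                                   # s = A* u, the minimal solution
assert np.allclose(A @ s, b)                  # s really solves the system
assert np.allclose(s, np.linalg.pinv(A) @ b)  # agrees with minimal-norm solution
```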

References

• Friedberg, Stephen H.; Insel, Arnold J.; Spence, Lawrence E. Linear Algebra, 4th edition. ISBN 7040167336.
• Lang, Serge. Linear Algebra, 3rd edition (Undergraduate Texts in Mathematics). ISBN 0387964126.