
This article collects the main theorems and definitions in linear algebra.
## Vector spaces

A vector space (or linear space) V over a field F consists of a set on which two operations (called addition and scalar multiplication, respectively) are defined so that for each pair of elements x, y in V there is a unique element x + y in V, and for each element a in F and each element x in V there is a unique element ax in V, such that the eight conditions (VS 1)–(VS 8) listed later in this article hold.

### Vector spaces

### Subspaces

### Linear combinations

### Systems of linear equations

### Linear dependence

### Linear independence

### Bases

### Dimension

## Linear transformations and matrices

### Linear transformations

### Null spaces

### Ranges

### The matrix representation of a linear transformation

### Composition of linear transformations

### Matrix multiplication

### Invertibility

### Isomorphisms

### The change-of-coordinates matrix

### $\color{Blue}{2.1}$ N(T) and R(T) are subspaces

Let V and W be vector spaces and T: V→W be linear. Then N(T) and R(T) are subspaces of V and W, respectively.
### $\color{Blue}{2.2}$ R(T) = span(T(β))

Let V and W be vector spaces, and let T: V→W be linear. If $\beta=\{v_1,v_2,\ldots,v_n\}$ is a basis for V, then

- $\mathrm{R}(T)=\mathrm{span}(T(\beta))=\mathrm{span}(\{T(v_1),T(v_2),\ldots,T(v_n)\})$.

### $\color{Blue}{2.3}$ Dimension Theorem

Let V and W be vector spaces, and let T: V→W be linear. If V is finite-dimensional, then

- $\mathrm{nullity}(T)+\mathrm{rank}(T)=\dim(V).$
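The Dimension Theorem can be spot-checked numerically for the left-multiplication map T = L_A. The sketch below uses a made-up 3×4 matrix (an assumption for illustration) and computes rank by Gaussian elimination in pure Python.

```python
# Numeric check of the Dimension Theorem (rank-nullity) for T = L_A,
# with a hypothetical 3x4 matrix A; rank is found by row reduction.

def rank(matrix):
    """Rank of a matrix (list of rows) via Gaussian elimination."""
    m = [row[:] for row in matrix]
    rows, cols = len(m), len(m[0])
    r = 0
    for c in range(cols):
        # find a pivot in column c at or below row r
        pivot = next((i for i in range(r, rows) if abs(m[i][c]) > 1e-12), None)
        if pivot is None:
            continue
        m[r], m[pivot] = m[pivot], m[r]
        for i in range(rows):
            if i != r and abs(m[i][c]) > 1e-12:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

A = [[1, 2, 0, 1],
     [0, 1, 1, 0],
     [1, 3, 1, 1]]       # third row = first + second, so rank(A) = 2

n = len(A[0])            # dim(V) = 4 for T = L_A : F^4 -> F^3
rk = rank(A)
nullity = n - rk         # Dimension Theorem: nullity(T) + rank(T) = dim(V)
print(rk, nullity)       # 2 2
```

Here nullity(T) is computed from the theorem itself; a direct null-space computation would give the same value.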

### $\color{Blue}{2.4}$ one-to-one ⇔ N(T) = {0}

Let V and W be vector spaces, and let T: V→W be linear. Then T is one-to-one if and only if N(T)={0}.
### $\color{Blue}{2.5}$ one-to-one ⇔ onto ⇔ rank(T) = dim(V)

Let V and W be vector spaces of equal (finite) dimension, and let T: V→W be linear. Then the following are equivalent.

- (a) T is one-to-one.
- (b) T is onto.
- (c) rank(T) = dim(V).
### $\color{Blue}{2.6}$ a linear transformation is determined by its values on a basis

Let V and W be vector spaces over F, and suppose that $\{v_1,v_2,\ldots,v_n\}$ is a basis for V. For $w_1,w_2,\ldots,w_n$ in W, there exists exactly one linear transformation T: V→W such that $T(v_i)=w_i$ for $i=1,2,\ldots,n$.

Corollary. Let V and W be vector spaces, and suppose that V has a finite basis $\{v_1,v_2,\ldots,v_n\}$. If U, T: V→W are linear and $U(v_i)=T(v_i)$ for $i=1,2,\ldots,n$, then U=T.

### $\color{Blue}{2.7}$ $\mathcal{L}$(V, W) is a vector space

Let V and W be vector spaces over a field F, and let T, U: V→W be linear.

- (a) For all $a\in F$, $aT+U$ is linear.
- (b) Using the operations of addition and scalar multiplication in the preceding definition, the collection of all linear transformations from V to W is a vector space over F.

### $\color{Blue}{2.8}$ linearity of the matrix representation of a linear transformation

Let V and W be finite-dimensional vector spaces with ordered bases β and γ, respectively, and let T, U: V→W be linear transformations. Then

- (a) $[T+U]_\beta^\gamma=[T]_\beta^\gamma+[U]_\beta^\gamma$ and
- (b) $[aT]_\beta^\gamma=a[T]_\beta^\gamma$ for all scalars $a$.

### $\color{Blue}{2.9}$ composition of linear transformations is linear

Let V, W, and Z be vector spaces over the same field F, and let T: V→W and U: W→Z be linear. Then UT: V→Z is linear.
### $\color{Blue}{2.10}$ properties of composition of linear operators

Let V be a vector space. Let $T,U_1,U_2\in\mathcal{L}(V)$. Then

(a) $T(U_1+U_2)=TU_1+TU_2$ and $(U_1+U_2)T=U_1T+U_2T$;

(b) $T(U_1U_2)=(TU_1)U_2$;

(c) $TI=IT=T$;

(d) $a(U_1U_2)=(aU_1)U_2=U_1(aU_2)$ for all scalars $a$.
### $\color{Blue}{2.11}$ $[UT]_\alpha^\gamma=[U]_\beta^\gamma[T]_\alpha^\beta$

Let V, W, and Z be finite-dimensional vector spaces with ordered bases α, β, γ, respectively. Let T: V→W and U: W→Z be linear transformations. Then

- $[UT]_\alpha^\gamma=[U]_\beta^\gamma[T]_\alpha^\beta$.

### $\color{Blue}{2.12}$ laws of matrix algebra

Let A be an m×n matrix, B and C be n×p matrices, and D and E be q×m matrices. Then

- (a) $A(B+C)=AB+AC$ and $(D+E)A=DA+EA$.
- (b) $a(AB)=(aA)B=A(aB)$ for any scalar $a$.
- (c) $I_mA=A=AI_n$.
- (d) If V is an $n$-dimensional vector space with an ordered basis β, then $[I_V]_\beta=I_n$.

### $\color{Blue}{2.13}$ columns of a matrix product

Let A be an m×n matrix and B be an n×p matrix. For each $j$ $(1\le j\le p)$ let $u_j$ and $v_j$ denote the $j$th columns of AB and B, respectively. Then

(a) $u_j=Av_j$;

(b) $v_j=Be_j$, where $e_j$ is the $j$th standard vector of $F^p$.
### $\color{Blue}{2.14}$ $[T(u)]_\gamma=[T]_\beta^\gamma[u]_\beta$

Let V and W be finite-dimensional vector spaces having ordered bases β and γ, respectively, and let T: V→W be linear. Then, for each $u\in V$, we have

- $[T(u)]_\gamma=[T]_\beta^\gamma[u]_\beta$.

### $\color{Blue}{2.15}$ properties of $L_A$

Let A be an m×n matrix with entries from F. Then the left-multiplication transformation $L_A$: F^{n}→F^{m} is linear. Furthermore, if B is any other m×n matrix (with entries from F) and β and γ are the standard ordered bases for F^{n} and F^{m}, respectively, then we have the following properties.

(a) $[L_A]_\beta^\gamma=A$.

(b) $L_A=L_B$ if and only if A=B.

(c) $L_{A+B}=L_A+L_B$ and $L_{aA}=aL_A$ for all $a\in F$.

(d) If T: F^{n}→F^{m} is linear, then there exists a unique m×n matrix C such that $T=L_C$. In fact, $C=[T]_\beta^\gamma$.

(e) If E is an n×p matrix, then $L_{AE}=L_AL_E$.

(f) If m=n, then $L_{I_n}=I_{F^n}$.

### $\color{Blue}{2.16}$ A(BC) = (AB)C

Let A, B, and C be matrices such that A(BC) is defined. Then A(BC)=(AB)C; that is, matrix multiplication is associative.
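Associativity is easy to spot-check on concrete matrices. The sketch below uses made-up 2×2 integer matrices (illustration only) and a minimal pure-Python matrix product.

```python
# A quick numeric check that matrix multiplication is associative,
# A(BC) = (AB)C, on small concrete matrices.

def matmul(A, B):
    """Product of two matrices given as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 1]]
C = [[2, 0], [1, 3]]

left = matmul(A, matmul(B, C))   # A(BC)
right = matmul(matmul(A, B), C)  # (AB)C
assert left == right
print(left)                      # [[7, 9], [15, 21]]
```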
### $\color{Blue}{2.17}$ $T^{-1}$ is linear

Let V and W be vector spaces, and let T: V→W be linear and invertible. Then $T^{-1}$: W→V is linear.
### $\color{Blue}{2.18}$ $[T^{-1}]_\gamma^\beta=([T]_\beta^\gamma)^{-1}$

Let V and W be finite-dimensional vector spaces with ordered bases β and γ, respectively. Let T: V→W be linear. Then T is invertible if and only if $[T]_\beta^\gamma$ is invertible. Furthermore, $[T^{-1}]_\gamma^\beta=([T]_\beta^\gamma)^{-1}$.

### $\color{Blue}{2.19}$ V is isomorphic to W ⇔ dim(V) = dim(W)

Let V and W be finite-dimensional vector spaces (over the same field). Then V is isomorphic to W if and only if dim(V)=dim(W).

### $\color{Blue}{2.20}$ $\mathcal{L}$(V, W) is isomorphic to $M_{m\times n}$(F)

Let V and W be finite-dimensional vector spaces over F of dimensions n and m, respectively, and let β and γ be ordered bases for V and W, respectively. Then the function $\Phi$: $\mathcal{L}$(V,W)→$M_{m\times n}$(F), defined by $\Phi(T)=[T]_\beta^\gamma$ for $T\in\mathcal{L}(V,W)$, is an isomorphism.

### $\color{Blue}{2.21}$ $\Phi_\beta$ is an isomorphism

For any finite-dimensional vector space V with ordered basis β, Φ_{β} is an isomorphism.
### $\color{Blue}{2.22}$ the change of coordinate matrix

Let β and β' be two ordered bases for a finite-dimensional vector space V, and let $Q=[I_V]_{\beta'}^\beta$. Then

(a) $Q$ is invertible.

(b) For any $v\in V$, $[v]_\beta=Q[v]_{\beta'}$.

### $\color{Blue}{2.23}$ $[T]_{\beta'}=Q^{-1}[T]_\beta Q$

Let T be a linear operator on a finite-dimensional vector space V, and let β and β' be two ordered bases for V. Suppose that Q is the change of coordinate matrix that changes β'-coordinates into β-coordinates. Then

- $[T]_{\beta'}=Q^{-1}[T]_\beta Q$.
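A concrete instance of this similarity relation in R², with a made-up diagonal operator and the basis β' = {(1, 1), (1, −1)} (both are assumptions for illustration); β is the standard basis.

```python
# Illustrative check of [T]_{β'} = Q^{-1} [T]_β Q in R^2.

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def inv2(M):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

T_beta = [[2, 0], [0, 3]]          # [T]_β (diagonal, for simplicity)
Q = [[1, 1], [1, -1]]              # columns: β'-vectors in β-coordinates

T_beta_prime = matmul(inv2(Q), matmul(T_beta, Q))
print(T_beta_prime)                # [[2.5, -0.5], [-0.5, 2.5]]
```

The first column (2.5, −0.5) agrees with expanding T(1, 1) = (2, 3) in the basis β', as the theorem predicts.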

### $\color{Blue}{2.24}$

### $\color{Blue}{2.25}$

### $\color{Blue}{2.26}$

### $\color{Blue}{2.27}$ p(D)x = 0 ⇒ $x^{(k)}$ exists for every k

Any solution to a homogeneous linear differential equation with constant coefficients has derivatives of all orders; that is, if $x$ is a solution to such an equation, then $x^{(k)}$ exists for every positive integer $k$.
### $\color{Blue}{2.28}$ {solutions} = N(p(D))

The set of all solutions to a homogeneous linear differential equation with constant coefficients coincides with the null space of p(D), where p(t) is the auxiliary polynomial of the equation.

### $\color{Blue}{2.29}$ derivative of the exponential function

For any exponential function $f(t)=e^{ct}$, $f'(t)=ce^{ct}$.

### $\color{Blue}{2.30}$ $\{e^{-a_0t}\}$ is a basis of the solution space of $y'+a_0y=0$

The solution space for the differential equation

- $y'+a_0y=0$

is of dimension one and has $\{e^{-a_0t}\}$ as a basis.

### $\color{Blue}{2.31}$ $e^{ct}$ is a solution

Let p(t) be the auxiliary polynomial for a homogeneous linear differential equation with constant coefficients. For any complex number c, if c is a zero of p(t), then $e^{ct}$ is a solution to the differential equation.
### $\color{Blue}{2.32}$ dim(N(p(D))) = n

For any differential operator p(D) of order n, the null space of p(D) is an $n$-dimensional subspace of C^{∞}.

### $\color{Blue}{2.33}$ $e^{c_it}$ are linearly independent for distinct $c_i$

Given n distinct complex numbers $c_1,c_2,\ldots,c_n$, the set of exponential functions $\{e^{c_1t},e^{c_2t},\ldots,e^{c_nt}\}$ is linearly independent.

### $\color{Blue}{2.34}$ general solution of a homogeneous linear differential equation

Given a homogeneous linear differential equation with constant coefficients and auxiliary polynomial

- $(t-c_1)^{n_1}(t-c_2)^{n_2}\cdots(t-c_k)^{n_k},$

where $c_1,c_2,\ldots,c_k$ are distinct complex numbers, the set

- $\{e^{c_1t},te^{c_1t},\ldots,t^{n_1-1}e^{c_1t},\ldots,e^{c_kt},te^{c_kt},\ldots,t^{n_k-1}e^{c_kt}\}$

is a basis for the solution space.

## Elementary matrix operations and systems of linear equations

### Elementary matrix operations

### Elementary matrix

### Rank of a matrix

### Matrix inverses

### System of linear equations

## Determinants

## Diagonalization

Characteristic polynomial of a linear operator/matrix

### $\color{Blue}{5.1}$ diagonalizable ⇔ basis of eigenvectors

A linear operator T on a finite-dimensional vector space V is diagonalizable if and only if there exists an ordered basis β for V consisting of eigenvectors of T. Furthermore, if T is diagonalizable, $\beta=\{v_1,v_2,\ldots,v_n\}$ is an ordered basis of eigenvectors of T, and $D=[T]_\beta$, then D is a diagonal matrix and $D_{jj}$ is the eigenvalue corresponding to $v_j$ for $1\le j\le n$.
### $\color{Blue}{5.2}$ eigenvalue ⇔ det(A − λI_n) = 0

Let $A\in M_{n\times n}(F)$. Then a scalar λ is an eigenvalue of A if and only if $\det(A-\lambda I_n)=0$.
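For a 2×2 matrix, det(A − λI) = λ² − tr(A)λ + det(A), so the eigenvalues are the roots of this quadratic. A sketch with a made-up symmetric matrix (an assumption for illustration):

```python
# Eigenvalues of a 2x2 matrix from det(A - λI) = λ² - tr(A)λ + det(A).
import math

A = [[2, 1], [1, 2]]               # symmetric example matrix
tr = A[0][0] + A[1][1]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]

# Roots of λ² - tr·λ + det = 0 via the quadratic formula
disc = math.sqrt(tr * tr - 4 * det)
eigs = sorted([(tr - disc) / 2, (tr + disc) / 2])
print(eigs)                        # [1.0, 3.0]

# Check det(A - λI) = 0 for each eigenvalue, as the theorem states
for lam in eigs:
    d = (A[0][0] - lam) * (A[1][1] - lam) - A[0][1] * A[1][0]
    assert abs(d) < 1e-12
```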
### $\color{Blue}{5.3}$ characteristic polynomial

Let $A\in M_{n\times n}(F)$.

(a) The characteristic polynomial of A is a polynomial of degree n with leading coefficient $(-1)^n$.

(b) A has at most n distinct eigenvalues.

### $\color{Blue}{5.4}$ $v$ is an eigenvector for λ ⇔ $v\in$ N(T − λI)

Let T be a linear operator on a vector space V, and let λ be an eigenvalue of T.

A vector $v\in V$ is an eigenvector of T corresponding to λ if and only if $v\ne0$ and $v\in\mathrm{N}(T-\lambda I)$.

### $\color{Blue}{5.5}$ eigenvectors for distinct eigenvalues are linearly independent

Let T be a linear operator on a vector space V, and let $\lambda_1,\lambda_2,\ldots,\lambda_k$ be distinct eigenvalues of T. If $v_1,v_2,\ldots,v_k$ are eigenvectors of T such that $\lambda_i$ corresponds to $v_i$ ($1\le i\le k$), then $\{v_1,v_2,\ldots,v_k\}$ is linearly independent.
### $\color{Blue}{5.6}$ characteristic polynomial splits

The characteristic polynomial of any diagonalizable linear operator splits.
### $\color{Blue}{5.7}$ 1 ≤ dim(E_λ) ≤ m

Let T be a linear operator on a finite-dimensional vector space V, and let λ be an eigenvalue of T having multiplicity $m$. Then $1\le\dim(E_\lambda)\le m$.
### $\color{Blue}{5.8}$ S = S₁ ∪ S₂ ∪ … ∪ S_k is linearly independent

Let T be a linear operator on a vector space V, and let $\lambda_1,\lambda_2,\ldots,\lambda_k$ be distinct eigenvalues of T. For each $i=1,2,\ldots,k,$ let $S_i$ be a finite linearly independent subset of the eigenspace $E_{\lambda_i}$. Then $S=S_1\cup S_2\cup\cdots\cup S_k$ is a linearly independent subset of V.
### $\color{Blue}{5.9}$ ⇔ T is diagonalizable

Let T be a linear operator on a finite-dimensional vector space V such that the characteristic polynomial of T splits. Let $\lambda_1,\lambda_2,\ldots,\lambda_k$ be the distinct eigenvalues of T. Then

(a) T is diagonalizable if and only if the multiplicity of $\lambda_i$ is equal to $\dim(E_{\lambda_i})$ for all $i$.

(b) If T is diagonalizable and $\beta_i$ is an ordered basis for $E_{\lambda_i}$ for each $i$, then $\beta=\beta_1\cup\beta_2\cup\cdots\cup\beta_k$ is an ordered basis for V consisting of eigenvectors of T.

## Inner Product Spaces

Inner product, standard inner product on F^{n}, conjugate transpose, adjoint, Frobenius inner product, complex/real inner product space, norm, length, conjugate linear, orthogonal, perpendicular, unit vector, orthonormal, normalizing.

### $\color{Blue}{6.1}$ properties of the inner product

Let V be an inner product space. Then for $x,y,z\in V$ and $c\in F$, the following statements are true.

(a) $\langle x,y+z\rangle=\langle x,y\rangle+\langle x,z\rangle.$

(b) $\langle x,cy\rangle=\bar{c}\langle x,y\rangle.$

(c) $\langle x,\mathit{0}\rangle=\langle\mathit{0},x\rangle=0.$

(d) $\langle x,x\rangle=0$ if and only if $x=\mathit{0}.$

(e) If $\langle x,y\rangle=\langle x,z\rangle$ for all $x\in V$, then $y=z$.

### $\color{Blue}{6.2}$ properties of the norm

Let V be an inner product space over F. Then for all $x,y\in V$ and $c\in F$, the following statements are true.

(a) $\|cx\|=|c|\cdot\|x\|$.

(b) $\|x\|=0$ if and only if $x=0$. In any case, $\|x\|\ge0$.

(c) (Cauchy–Schwarz Inequality) $|\langle x,y\rangle|\le\|x\|\cdot\|y\|$.

(d) (Triangle Inequality) $\|x+y\|\le\|x\|+\|y\|$.

### $\color{Blue}{6.3}$ span of an orthogonal subset

Let V be an inner product space and $S=\{v_1,v_2,\ldots,v_k\}$ be an orthogonal subset of V consisting of nonzero vectors. If $y\in\mathrm{span}(S)$, then

- $y=\sum_{i=1}^k\frac{\langle y,v_i\rangle}{\|v_i\|^2}v_i$

### $\color{Blue}{6.4}$ Gram–Schmidt process

Let V be an inner product space and $S=\{w_1,w_2,\ldots,w_n\}$ be a linearly independent subset of V. Define $S'=\{v_1,v_2,\ldots,v_n\}$, where $v_1=w_1$ and

- $v_k=w_k-\sum_{j=1}^{k-1}\frac{\langle w_k,v_j\rangle}{\|v_j\|^2}v_j$ for $2\le k\le n$.

Then S' is an orthogonal set of nonzero vectors such that $\mathrm{span}(S')=\mathrm{span}(S)$.
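The process translates directly into code. The sketch below applies it to two made-up vectors in R³ (illustration data) using the standard dot product, and checks that the result is orthogonal.

```python
# A sketch of the Gram-Schmidt process on lists of numbers, following
# v_k = w_k - Σ_{j<k} (<w_k, v_j>/||v_j||²) v_j.

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def gram_schmidt(ws):
    vs = []
    for w in ws:
        v = list(w)
        for u in vs:
            coeff = dot(w, u) / dot(u, u)   # <w_k, v_j> / ||v_j||^2
            v = [a - coeff * b for a, b in zip(v, u)]
        vs.append(v)
    return vs

S = [[1, 1, 0], [1, 0, 1]]
v1, v2 = gram_schmidt(S)
print(v1, v2)            # [1, 1, 0] [0.5, -0.5, 1.0]
assert dot(v1, v2) == 0  # the resulting set is orthogonal
```

Dividing each $v_k$ by its norm afterwards (normalizing) would give an orthonormal set.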

### $\color{Blue}{6.5}$ orthonormal basis

Let V be a nonzero finite-dimensional inner product space. Then V has an orthonormal basis β. Furthermore, if $\beta=\{v_1,v_2,\ldots,v_n\}$ and $x\in V$, then

- $x=\sum_{i=1}^n\langle x,v_i\rangle v_i$.

### $\color{Blue}{6.6}$ orthogonal decomposition y = u + z

Let W be a finite-dimensional subspace of an inner product space V, and let $y\in V$. Then there exist unique vectors $u\in W$ and $z\in W^\perp$ such that $y=u+z$. Furthermore, if $\{v_1,v_2,\ldots,v_k\}$ is an orthonormal basis for W, then

- $u=\sum_{i=1}^k\langle y,v_i\rangle v_i$.

### $\color{Blue}{6.7}$ properties of orthonormal sets

Suppose that $S=\{v_1,v_2,\ldots,v_k\}$ is an orthonormal set in an $n$-dimensional inner product space V. Then

(a) S can be extended to an orthonormal basis $\{v_1,v_2,\ldots,v_k,v_{k+1},\ldots,v_n\}$ for V.

(b) If W = span(S), then $S_1=\{v_{k+1},v_{k+2},\ldots,v_n\}$ is an orthonormal basis for $W^\perp$ (using the preceding notation).

(c) If W is any subspace of V, then $\dim(V)=\dim(W)+\dim(W^\perp)$.

### $\color{Blue}{6.8}$ representation of a linear functional by an inner product

Let V be a finite-dimensional inner product space over F, and let $g$: V→F be a linear transformation. Then there exists a unique vector $y\in V$ such that $g(x)=\langle x,y\rangle$ for all $x\in V$.
### $\color{Blue}{6.9}$ definition of T*

Let V be a finite-dimensional inner product space, and let T be a linear operator on V. Then there exists a unique function T*: V→V such that $\langle T(x),y\rangle=\langle x,T^*(y)\rangle$ for all $x,y\in V$. Furthermore, T* is linear.

### $\color{Blue}{6.10}$ $[T^*]_\beta=[T]_\beta^*$

Let V be a finite-dimensional inner product space, and let β be an orthonormal basis for V. If T is a linear operator on V, then

- $[T^*]_\beta=[T]_\beta^*$.

### $\color{Blue}{6.11}$ properties of T*

Let V be an inner product space, and let T and U be linear operators on V. Then

(a) $(T+U)^*=T^*+U^*$;

(b) $(cT)^*=\bar{c}T^*$ for any $c\in F$;

(c) $(TU)^*=U^*T^*$;

(d) $T^{**}=T$;

(e) $I^*=I$.

### $\color{Blue}{6.12}$ least squares approximation

Let $A\in M_{m\times n}(F)$ and $y\in F^m$. Then there exists $x_0\in F^n$ such that $(A^*A)x_0=A^*y$ and $\|Ax_0-y\|\le\|Ax-y\|$ for all $x\in F^n$.
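For real matrices A* = Aᵀ, so $x_0$ can be found by solving the normal equations $(A^\mathsf{T}A)x_0=A^\mathsf{T}y$. A sketch on made-up 3×2 fitting data (the matrix and vector are assumptions for illustration):

```python
# Least-squares sketch for a real matrix: solve (AᵀA)x₀ = Aᵀy.

def matvec(A, x):
    return [sum(a * b for a, b in zip(row, x)) for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def solve2(M, b):
    """Solve a 2x2 system M x = b by Cramer's rule."""
    (p, q), (r, s) = M
    det = p * s - q * r
    return [(b[0] * s - q * b[1]) / det, (p * b[1] - r * b[0]) / det]

A = [[1, 0], [1, 1], [1, 2]]
y = [0, 1, 1]

At = transpose(A)
x0 = solve2(matmul(At, A), matvec(At, y))   # (AᵀA)x₀ = Aᵀy

def resid(A, x, y):
    """Squared norm of Ax - y."""
    return sum((p - q) ** 2 for p, q in zip(matvec(A, x), y))

# x₀ minimizes ||Ax - y||: compare with another candidate x.
assert resid(A, x0, y) <= resid(A, [0.0, 0.5], y)
print(x0)
```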

### $\color{Blue}{6.13}$ minimal solutions to systems of linear equations

Let $A\in M_{m\times n}(F)$ and $b\in F^m$. Suppose that $Ax=b$ is consistent. Then the following statements are true.

(a) There exists exactly one minimal solution $s$ of $Ax=b$, and $s\in\mathrm{R}(L_{A^*})$.

(b) The vector $s$ is the only solution to $Ax=b$ that lies in $\mathrm{R}(L_{A^*})$; that is, if $u$ satisfies $(AA^*)u=b$, then $s=A^*u$.
## Canonical forms

## References

- (VS 1) For all $x, y$ in V, $x+y=y+x$ (commutativity of addition).
- (VS 2) For all $x, y, z$ in V, $(x+y)+z=x+(y+z)$ (associativity of addition).
- (VS 3) There exists an element in V denoted by $0$ such that $x+0=x$ for each $x$ in V.
- (VS 4) For each element $x$ in V there exists an element $y$ in V such that $x+y=0$.
- (VS 5) For each element $x$ in V, $1x=x$.
- (VS 6) For each pair of elements $a, b$ in F and each element $x$ in V, $(ab)x=a(bx)$.
- (VS 7) For each element $a$ in F and each pair of elements $x, y$ in V, $a(x+y)=ax+ay$.
- (VS 8) For each pair of elements $a, b$ in F and each element $x$ in V, $(a+b)x=ax+bx$.

Change of coordinate matrix

Clique

Coordinate vector relative to a basis

Dimension theorem

Dominance relation

Identity matrix

Identity transformation

Incidence matrix

Inverse of a linear transformation

Inverse of a matrix

Invertible linear transformation

Isomorphic vector spaces

Isomorphism

Kronecker delta

Left-multiplication transformation

Linear operator

Linear transformation

Matrix representing a linear transformation

Nullity of a linear transformation

Null space

Ordered basis

Product of matrices

Projection on a subspace

Projection on the x-axis

Range

Rank of a linear transformation

Reflection about the x-axis

Rotation

Similar matrices

Standard ordered basis for $F^n$

Standard representation of a vector space with respect to a basis

Zero transformation

P.S. Coefficients of a differential equation, differentiability of complex functions, vector space of functions, differential operator, auxiliary polynomial, powers of a complex number, exponential function.

- $\mathrm{R}(T)=\mathrm{span}(T(\beta))=\mathrm{span}(\{T(v_1),T(v_2),\ldots,T(v_n)\})$.

- $\mathrm{nullity}(T)+\mathrm{rank}(T)=\dim(V).$

- (a) T is one-to-one.

- (b) T is onto.

- (c) rank(T)=dim(V).

Corollary. Let V and W be vector spaces, and suppose that V has a finite basis $\{v\_1,v\_2,...,v\_n\}$. If U, T: V→W are linear and $U(v\_i)=T(v\_i)$ for $i=1,2,...,n,$ then U=T.

- (a) For all $a\in F$, $aT+U$ is linear.

- (b) Using the operations of addition and scalar multiplication in the preceding definition, the collection of all linear transformations from V to W is a vector space over F.

- (a) $[T+U]_\beta^\gamma=[T]_\beta^\gamma+[U]_\beta^\gamma$ and

- (b) $[aT]_\beta^\gamma=a[T]_\beta^\gamma$ for all scalars $a$.


- $[UT]_\alpha^\gamma=[U]_\beta^\gamma[T]_\alpha^\beta$.

Corollary. Let V be a finite-dimensional vector space with an ordered basis β. Let $T,U\in\mathcal{L}(V)$. Then $[UT]_\beta=[U]_\beta[T]_\beta$.

- (a) $A(B+C)=AB+AC$ and $(D+E)A=DA+EA$.

- (b) $a(AB)=(aA)B=A(aB)$ for any scalar $a$.

- (c) $I_mA=A=AI_n$.

- (d) If V is an $n$-dimensional vector space with an ordered basis β, then $[I_V]_\beta=I_n$.

Corollary. Let A be an m×n matrix, $B_1,B_2,\ldots,B_k$ be n×p matrices, $C_1,C_2,\ldots,C_k$ be q×m matrices, and $a_1,a_2,\ldots,a_k$ be scalars. Then

- $A\left(\sum_{i=1}^k a_iB_i\right)=\sum_{i=1}^k a_iAB_i$ and

- $\left(\sum_{i=1}^k a_iC_i\right)A=\sum_{i=1}^k a_iC_iA$.


- $[T(u)]_\gamma=[T]_\beta^\gamma[u]_\beta$.


Lemma. Let T be an invertible linear transformation from V to W. Then V is finite-dimensional if and only if W is finite-dimensional. In this case, dim(V)=dim(W).

Corollary 1. Let V be a finite-dimensional vector space with an ordered basis β, and let T:V→V be linear. Then T is invertible if and only if [T]_{β} is invertible. Furthermore, [T^{-1}]_{β}=([T]_{β})^{-1}.

Corollary 2. Let A be an n×n matrix. Then A is invertible if and only if L_{A} is invertible. Furthermore, (L_{A})^{-1}=L_{A-1}.

Corollary. Let V be a vector space over F. Then V is isomorphic to F^{n} if and only if dim(V)=n.

Corollary. Let V and W be finite-dimensional vector spaces of dimensions n and m, respectively. Then $\mathcal{L}(V,W)$ is finite-dimensional of dimension mn.


- $[T]_{\beta'}=Q^{-1}[T]_\beta Q$.

Corollary. Let $A\in M_{n\times n}(F)$, and let γ be an ordered basis for F^{n}. Then $[L_A]_\gamma=Q^{-1}AQ$, where Q is the n×n matrix whose $j$th column is the $j$th vector of γ.

Corollary. The set of all solutions to a homogeneous linear differential equation with constant coefficients is a subspace of $\mathrm{C}^\infty$.

- $y'+a_0y=0$

Corollary. For any complex number c, the null space of the differential operator D-cI has {$e^\{ct\}$} as a basis.

Lemma 1. The differential operator D − cI: C^{∞}→C^{∞} is onto for any complex number c.

Lemma 2. Let V be a vector space, and suppose that T and U are linear operators on V such that U is onto and the null spaces of T and U are finite-dimensional. Then the null space of TU is finite-dimensional, and

- $\dim(\mathrm{N}(TU))=\dim(\mathrm{N}(T))+\dim(\mathrm{N}(U))$.

Corollary. The solution space of any nth-order homogeneous linear differential equation with constant coefficients is an n-dimensional subspace of C^{∞}.

Corollary. For any nth-order homogeneous linear differential equation with constant coefficients, if the auxiliary polynomial has n distinct zeros $c\_1,\; c\_2,\; ...,\; c\_n$, then $\{e^\{c\_1t\},e^\{c\_2t\},...,e^\{c\_nt\}\}$ is a basis for the solution space of the differential equation.

Lemma. For a given complex number c and positive integer n, suppose that $(t-c)^n$ is the auxiliary polynomial of a homogeneous linear differential equation with constant coefficients. Then the set

- $\beta=\{e^{ct},te^{ct},\ldots,t^{n-1}e^{ct}\}$

is a basis for the solution space of the equation. More generally, if the auxiliary polynomial factors as

- $(t-c_1)^{n_1}(t-c_2)^{n_2}\cdots(t-c_k)^{n_k},$

with $c_1,c_2,\ldots,c_k$ distinct, then

- $\{e^{c_1t},te^{c_1t},\ldots,t^{n_1-1}e^{c_1t},\ldots,e^{c_kt},te^{c_kt},\ldots,t^{n_k-1}e^{c_kt}\}$

is a basis for the solution space.

If

- $A=\begin{pmatrix} a & b \\ c & d \end{pmatrix}$,

we define $\det(A)=ad-bc$.

＊Theorem 1: linear function of a single row.

＊Theorem 2: nonzero determinant ⇔ invertible matrix

Theorem 1:
The function det: M_{2×2}(F) → F is a linear function of each row of a 2×2 matrix when the other row is held fixed. That is, if $u, v,$ and $w$ are in F² and $k$ is a scalar, then

- $\det\begin{pmatrix} u+kv \\ w \end{pmatrix}=\det\begin{pmatrix} u \\ w \end{pmatrix}+k\det\begin{pmatrix} v \\ w \end{pmatrix}$

and

- $\det\begin{pmatrix} w \\ u+kv \end{pmatrix}=\det\begin{pmatrix} w \\ u \end{pmatrix}+k\det\begin{pmatrix} w \\ v \end{pmatrix}$.

Theorem 2:
Let $A\in M_{2\times 2}(F)$. Then the determinant of A is nonzero if and only if A is invertible. Moreover, if A is invertible, then

- $A^{-1}=\frac{1}{\det(A)}\begin{pmatrix} A_{22} & -A_{12} \\ -A_{21} & A_{11} \end{pmatrix}$.
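Theorem 2 can be checked numerically. The sketch below applies the adjugate formula to a made-up integer matrix with det(A) = 1 (illustration data only).

```python
# 2x2 determinant and inverse: A⁻¹ = (1/det A) [[d, -b], [-c, a]].

def det2(M):
    (a, b), (c, d) = M
    return a * d - b * c

def inv2(M):
    (a, b), (c, d) = M
    dt = det2(M)
    if dt == 0:
        raise ValueError("matrix is not invertible")   # Theorem 2
    return [[d / dt, -b / dt], [-c / dt, a / dt]]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

A = [[2, 1], [5, 3]]          # det = 1, so A is invertible
print(inv2(A))                # [[3.0, -1.0], [-5.0, 2.0]]
assert matmul(A, inv2(A)) == [[1.0, 0.0], [0.0, 1.0]]
```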

(a) The characteristic polynomial of A is a polynomial of degree n with leading coefficient(-1)n.

(b) A has at most n distinct eigenvalues.

A vector υ∈V is an eigenvector of T corresponding to λ if and only if υ≠0 and υ∈N(T-λI).

(a) T is diagonalizable if and only if the multiplicity of $lambda\_i$ is equal to $dim(E\_\{lambda\_i\})$ for all $i$.

(b) If T is diagonalizable and $beta\_i$ is an ordered basis for $E\_\{lambda\_i\}$ for each $i$, then $beta=beta\_1cup\; beta\_2cup\; cupbeta\_k$ is an ordered $basis^2$ for V consisting of eigenvectors of T.

Test for diagonalization


Orthonormal basis, Gram–Schmidt process, Fourier coefficients, orthogonal complement, orthogonal projection.

- $y=\sum_{i=1}^k\frac{\langle y,v_i\rangle}{\|v_i\|^2}v_i$

- $v_k=w_k-\sum_{j=1}^{k-1}\frac{\langle w_k,v_j\rangle}{\|v_j\|^2}v_j$

- $x=\sum_{i=1}^n\langle x,v_i\rangle v_i$.

Corollary. Let V be a finite-dimensional inner product space with an orthonormal basis $\beta=\{v_1,v_2,\ldots,v_n\}$. Let T be a linear operator on V, and let $A=[T]_\beta$. Then for any $i$ and $j$, $A_{ij}=\langle T(v_j),v_i\rangle$.

- $u=\sum_{i=1}^k\langle y,v_i\rangle v_i$.


Least squares approximation, minimal solutions to systems of linear equations.

- $[T^*]_\beta=[T]_\beta^*$.


Corollary. Let A and B be n×n matrices. Then

(a) $(A+B)^*=A^*+B^*$;

(b) $(cA)^*=\bar{c}A^*$ for any $c\in F$;

(c) $(AB)^*=B^*A^*$;

(d) $A^{**}=A$;

(e) $I^*=I$.

Lemma 1. Let $A\in M_{m\times n}(F)$, $x\in F^n$, and $y\in F^m$. Then

- $\langle Ax,y\rangle_m=\langle x,A^*y\rangle_n$

Lemma 2. Let A ∈ M_{m×n}(F). Then rank(A*A)=rank(A).

Corollary (of Lemma 2). If A is an m×n matrix such that rank(A)=n, then A*A is invertible.


- Friedberg, Stephen H.; Insel, Arnold J.; Spence, Lawrence E. *Linear Algebra*, 4th edition. ISBN 7040167336.
- Lang, Serge. *Linear Algebra*, 3rd edition, Undergraduate Texts in Mathematics. ISBN 0387964126.

Wikipedia, the free encyclopedia © 2001-2006 Wikipedia contributors (Disclaimer)

This article is licensed under the GNU Free Documentation License.

Last updated on Monday June 16, 2008 at 23:24:47 PDT (GMT -0700)

