# Outer product

In linear algebra, the outer product typically refers to the tensor product of two vectors. The result of applying the outer product to a pair of vectors is a matrix. The name contrasts with the inner product, which takes as input a pair of vectors and produces a scalar.

The outer product of vectors can also be regarded as a special case of the Kronecker product of matrices.

Some authors use the expression "outer product of tensors" as a synonym of "tensor product". The outer product is also a higher-order function in some computer programming languages such as APL and Mathematica.

## Definition

Given a vector $\mathbf{u} = (u_1, u_2, \dots, u_m)$ with $m$ elements and a vector $\mathbf{v} = (v_1, v_2, \dots, v_n)$ with $n$ elements, their outer product $\mathbf{u} \otimes \mathbf{v}$ is defined as the $m \times n$ matrix $\mathbf{A}$ obtained by multiplying each element of $\mathbf{u}$ by each element of $\mathbf{v}$:

$\mathbf{u} \otimes \mathbf{v} = \mathbf{A} = \begin{bmatrix} u_1 v_1 & u_1 v_2 & \dots & u_1 v_n \\ u_2 v_1 & u_2 v_2 & \dots & u_2 v_n \\ \vdots & \vdots & \ddots & \vdots \\ u_m v_1 & u_m v_2 & \dots & u_m v_n \end{bmatrix}.$

For complex vectors, it is customary to use the complex conjugate of $\mathbf{v}$ (denoted $\bar{\mathbf{v}}$). Namely, matrix $\mathbf{A}$ is obtained by multiplying each element of $\mathbf{u}$ by the complex conjugate of each element of $\mathbf{v}$.
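The element-wise definition can be checked with NumPy's `np.outer` (the vectors below are made up for illustration). Note that `np.outer` performs no conjugation, so the complex-vector convention above has to be applied by hand:

```python
import numpy as np

# Hypothetical vectors with m = 3 and n = 2 elements.
u = np.array([1, 2, 3])
v = np.array([4, 5])

# Outer product: A[i, j] = u[i] * v[j], giving an m-by-n matrix.
A = np.outer(u, v)
print(A)
# [[ 4  5]
#  [ 8 10]
#  [12 15]]
```

For complex inputs, following the convention in the text means writing `np.outer(u, np.conj(v))` explicitly.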

## Definition (matrix multiplication)

The outer product $\mathbf{u} \otimes \mathbf{v}$ as defined above is equivalent to the matrix multiplication $\mathbf{u} \mathbf{v}^T$, provided that $\mathbf{u}$ is represented as an $m \times 1$ column vector and $\mathbf{v}$ as an $n \times 1$ column vector. For instance, if $m = 4$ and $n = 3$,

$\mathbf{u} \otimes \mathbf{v} = \mathbf{u} \mathbf{v}^T = \begin{bmatrix} u_1 \\ u_2 \\ u_3 \\ u_4 \end{bmatrix} \begin{bmatrix} v_1 & v_2 & v_3 \end{bmatrix} = \begin{bmatrix} u_1 v_1 & u_1 v_2 & u_1 v_3 \\ u_2 v_1 & u_2 v_2 & u_2 v_3 \\ u_3 v_1 & u_3 v_2 & u_3 v_3 \\ u_4 v_1 & u_4 v_2 & u_4 v_3 \end{bmatrix}.$

For complex vectors, it is customary to use the complex conjugate of $\mathbf{v}^T$ (denoted $\mathbf{v}^H$):

$\mathbf{u} \otimes \mathbf{v} = \mathbf{u} \mathbf{v}^H$
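The column-times-conjugated-row form can be sketched in NumPy with illustrative complex vectors:

```python
import numpy as np

# Hypothetical complex vectors.
u = np.array([1 + 2j, 3 - 1j])
v = np.array([2 - 1j, 1j, 4])

# u v^H: an (m, 1) column times a conjugated (1, n) row gives an m-by-n matrix.
A = u.reshape(-1, 1) @ v.conj().reshape(1, -1)

# The same matrix from np.outer, conjugating v by hand.
B = np.outer(u, v.conj())
print(np.allclose(A, B))  # True
```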

### Contrast with inner product

If $m = n$, then one can take the matrix product the other way, yielding a scalar (or $1 \times 1$ matrix):
$\left\langle \mathbf{u}, \mathbf{v} \right\rangle = \mathbf{v}^H \mathbf{u}$
which is the standard inner product for Euclidean vector spaces, better known as the dot product.
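In code the contrast is just the order of the factors. NumPy's `np.vdot` conjugates its *first* argument, which matches the $\mathbf{v}^H \mathbf{u}$ convention (illustrative vectors):

```python
import numpy as np

u = np.array([1 + 1j, 2 + 0j])
v = np.array([3 + 0j, 4 - 2j])

# v^H u: conjugated row times column yields a single scalar.
inner = v.conj() @ u

# np.vdot conjugates its first argument, so np.vdot(v, u) computes v^H u.
print(np.isclose(inner, np.vdot(v, u)))  # True
```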

## Definition (abstract)

Let V and W be two vector spaces, and let W* be the dual space of W. Given a vector x ∈ V and y* ∈ W*, then the tensor product y* ⊗ x corresponds to the map A : W → V given by

$w \mapsto y^*(w)\,x$

Here y*(w) denotes the value of the linear functional y* (which is an element of the dual space of W) when evaluated at the element w ∈ W. This scalar in turn is multiplied by x to give as the final result an element of the space V.

If V and W are finite-dimensional, then the space of all linear transformations from W to V, denoted Hom(W,V), is generated by such outer products. In this case Hom(W,V) is isomorphic to W* ⊗ V.
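A minimal coordinate sketch of the abstract definition (names like `rank_one_map` are illustrative, not from the text): a functional $y^*$ and a vector $x$ determine the map $w \mapsto y^*(w)x$, and in coordinates this map's matrix is an outer product.

```python
import numpy as np

def rank_one_map(y_star, x):
    """The linear map W -> V given by w |-> y_star(w) * x."""
    return lambda w: y_star(w) * x

# In coordinates, a functional on W = R^2 is "dot with a fixed vector y".
y = np.array([1.0, 2.0])
x = np.array([3.0, 4.0, 5.0])      # x lives in V = R^3
A = rank_one_map(lambda w: y @ w, x)

# As a matrix, the same map is the outer product x y^T.
w = np.array([10.0, -1.0])
print(np.allclose(A(w), np.outer(x, y) @ w))  # True
```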

### Contrast with inner product

If $W = V$, then one can also pair the covector $w^* \in V^*$ with the vector $v \in V$ via $(w^*, v) \mapsto w^*(v)$, which is the duality pairing between $V$ and its dual, sometimes called the inner product.

## Definition (tensor multiplication)

The outer product on tensors is typically referred to as the tensor product. Given a tensor $\mathbf{a}$ with rank $q$ and dimensions $(i_1, \dots, i_q)$, and a tensor $\mathbf{b}$ with rank $r$ and dimensions $(j_1, \dots, j_r)$, their outer product $\mathbf{c}$ has rank $q + r$ and dimensions $(k_1, \dots, k_{q+r})$, which are the $i$ dimensions followed by the $j$ dimensions. For example, if $\mathbf{A}$ has rank 3 and dimensions $(3, 5, 7)$ and $\mathbf{B}$ has rank 2 and dimensions $(10, 100)$, their outer product $\mathbf{C}$ has rank 5 and dimensions $(3, 5, 7, 10, 100)$. If $A[2, 2, 4] = 11$ and $B[8, 88] = 13$, then $C[2, 2, 4, 8, 88] = 11 \times 13 = 143$.
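The rank-5 example above can be reproduced with NumPy's `np.tensordot`, where `axes=0` means "contract nothing", i.e. form the tensor (outer) product:

```python
import numpy as np

# Tensors with the ranks and dimensions used in the example above.
A = np.zeros((3, 5, 7))
B = np.zeros((10, 100))
A[2, 2, 4] = 11
B[8, 88] = 13

# axes=0 contracts no indices: the result is the outer product.
C = np.tensordot(A, B, axes=0)
print(C.shape)            # (3, 5, 7, 10, 100)
print(C[2, 2, 4, 8, 88])  # 143.0
```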

To understand the matrix definition of outer product in terms of the definition of tensor product:

1. The vector $\mathbf{u}$ can be interpreted as a rank-1 tensor with dimension $(m)$, and the vector $\mathbf{v}$ as a rank-1 tensor with dimension $(n)$. The result of the outer product is a rank-2 tensor with dimension $(m, n)$.
2. The rank of the result of an inner product between two tensors of rank q and r is the greater of q+r-2 and 0. Thus, the inner product of two matrices has the same rank as the outer product (or tensor product) of two vectors.
3. It is possible to add arbitrarily many leading or trailing 1 dimensions to a tensor without fundamentally altering its structure. These 1 dimensions would alter the character of operations on these tensors, so any resulting equivalences should be expressed explicitly.
4. The inner product of two matrices $\mathbf{V}$ with dimensions $(d, e)$ and $\mathbf{U}$ with dimensions $(e, f)$ is $\sum_{j=1}^{e} V_{i,j} U_{j,k}$, where $i \in \{1, \dots, d\}$ and $k \in \{1, \dots, f\}$. For the case where $e = 1$, the summation is trivial (involving only a single term).
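Point 4 with $e = 1$ can be checked directly: multiplying a $(d, 1)$ matrix by a $(1, f)$ matrix reproduces the outer product of the corresponding vectors (illustrative values):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0])

V = u.reshape(3, 1)   # shape (d, e) with e = 1
U = v.reshape(1, 2)   # shape (e, f)

# With e = 1 the sum over j has a single term, so V @ U is the outer product.
print(np.allclose(V @ U, np.outer(u, v)))  # True
```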

It should be emphasized that the term "rank" is being used in its tensor sense, and should not be interpreted as matrix rank.

## Applications

The outer product is useful in computing physical quantities (e.g. the tensor of inertia), and performing transform operations in digital signal processing and digital image processing. It is also useful in statistical analysis for computing the covariance and auto-covariance matrices for two random variables.
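As a sketch of the statistical use just mentioned, a sample covariance matrix can be written as an average of outer products of mean-centered samples (synthetic data; in practice `np.cov` is the standard tool):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))      # 100 samples of a 3-dimensional variable

# Center the samples, then average the outer products x x^T over all samples.
Xc = X - X.mean(axis=0)
cov = sum(np.outer(x, x) for x in Xc) / (len(X) - 1)

print(np.allclose(cov, np.cov(X, rowvar=False)))  # True
```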