Direct linear transformation

Direct linear transformation (DLT) is an algorithm which solves a set of variables from a set of similarity relations:

$$\mathbf{x}_{k} \propto \mathbf{A}\,\mathbf{y}_{k} \qquad \text{for } k = 1, \ldots, N$$

where $\mathbf{x}_{k}$ and $\mathbf{y}_{k}$ are known vectors, $\propto$ denotes equality up to an unknown scalar multiplication, and $\mathbf{A}$ is a matrix (or linear transformation) which contains the unknowns to be solved.

This type of relation appears frequently in projective geometry. Practical examples include the relation between 3D points in a scene and their projection onto the image plane of a pinhole camera, and homographies.

Introduction

An ordinary linear equation

$$\mathbf{x}_{k} = \mathbf{A}\,\mathbf{y}_{k} \qquad \text{for } k = 1, \ldots, N$$

can be solved, for example, by rewriting it as a matrix equation $\mathbf{X} = \mathbf{A}\,\mathbf{Y}$, where the matrices $\mathbf{X}$ and $\mathbf{Y}$ contain the vectors $\mathbf{x}_{k}$ and $\mathbf{y}_{k}$ in their respective columns. If a unique solution exists, it is given by

$$\mathbf{A} = \mathbf{X}\,\mathbf{Y}^{T}\,(\mathbf{Y}\,\mathbf{Y}^{T})^{-1}.$$

Solutions can also be described in the case that the equations are over- or under-determined.
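
For concreteness, here is a minimal sketch of this standard case, assuming NumPy; the matrix sizes and the random test data are illustrative only:

```python
import numpy as np

# Standard (non-projective) case: X = A Y, with the vectors x_k and y_k
# stacked as the columns of X (2 x N) and Y (3 x N).
rng = np.random.default_rng(0)
A_true = rng.normal(size=(2, 3))
Y = rng.normal(size=(3, 10))
X = A_true @ Y

# Closed-form solution A = X Y^T (Y Y^T)^{-1}, valid when Y Y^T is invertible.
A_est = X @ Y.T @ np.linalg.inv(Y @ Y.T)

# Numerically preferable alternative: least-squares solve of Y^T A^T = X^T.
A_lstsq = np.linalg.lstsq(Y.T, X.T, rcond=None)[0].T

assert np.allclose(A_est, A_true) and np.allclose(A_lstsq, A_true)
```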

What makes the direct linear transformation problem distinct from the above standard case is the fact that the left and right sides of the defining equation can differ by an unknown multiplicative factor which is dependent on $k$. As a consequence, $\mathbf{A}$ cannot be computed as in the standard case. Instead, the similarity relations are rewritten as proper linear homogeneous equations which then can be solved by a standard method. The combination of rewriting the similarity equations as homogeneous linear equations and solving them by standard methods is referred to as a direct linear transformation algorithm or DLT algorithm.

Example

Let $\mathbf{x}_{k} \in \mathbb{R}^{2}$ and $\mathbf{y}_{k} \in \mathbb{R}^{3}$ be two sets of known vectors, and let the problem be to find a $2 \times 3$ matrix $\mathbf{A}$ such that

$$\alpha_{k}\,\mathbf{x}_{k} = \mathbf{A}\,\mathbf{y}_{k} \qquad \text{for } k = 1, \ldots, N$$

where $\alpha_{k} \neq 0$ is the unknown scalar factor related to equation $k$.

To get rid of the unknown scalars and obtain homogeneous equations, define the anti-symmetric matrix

$$\mathbf{H} = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$$

and multiply both sides of the equation with $\mathbf{x}_{k}^{T}\,\mathbf{H}$ from the left:

$$\alpha_{k}\,\mathbf{x}_{k}^{T}\,\mathbf{H}\,\mathbf{x}_{k} = \mathbf{x}_{k}^{T}\,\mathbf{H}\,\mathbf{A}\,\mathbf{y}_{k} \qquad \text{for } k = 1, \ldots, N.$$

Since $\mathbf{x}_{k}^{T}\,\mathbf{H}\,\mathbf{x}_{k} = 0$, the following homogeneous equations, which no longer contain the unknown scalars, are at hand:

$$0 = \mathbf{x}_{k}^{T}\,\mathbf{H}\,\mathbf{A}\,\mathbf{y}_{k} \qquad \text{for } k = 1, \ldots, N.$$
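
Writing out this quadratic form for a single $k$ shows why it vanishes identically, and the same cancellation occurs for any anti-symmetric matrix:

$$\mathbf{x}_{k}^{T}\,\mathbf{H}\,\mathbf{x}_{k} = \begin{pmatrix} x_{1k} & x_{2k} \end{pmatrix} \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} x_{1k} \\ x_{2k} \end{pmatrix} = x_{2k}\,x_{1k} - x_{1k}\,x_{2k} = 0.$$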

In order to solve $\mathbf{A}$ from this set of equations, consider the elements of the vectors $\mathbf{x}_{k}$ and $\mathbf{y}_{k}$ and matrix $\mathbf{A}$:

$$\mathbf{x}_{k} = \begin{pmatrix} x_{1k} \\ x_{2k} \end{pmatrix}, \qquad \mathbf{y}_{k} = \begin{pmatrix} y_{1k} \\ y_{2k} \\ y_{3k} \end{pmatrix}, \qquad \text{and} \qquad \mathbf{A} = \begin{pmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \end{pmatrix}$$

and the above homogeneous equation becomes

$$0 = a_{11}\,x_{2k}\,y_{1k} - a_{21}\,x_{1k}\,y_{1k} + a_{12}\,x_{2k}\,y_{2k} - a_{22}\,x_{1k}\,y_{2k} + a_{13}\,x_{2k}\,y_{3k} - a_{23}\,x_{1k}\,y_{3k} \qquad \text{for } k = 1, \ldots, N.$$

This can also be written

$$0 = \mathbf{b}_{k}^{T}\,\mathbf{a} \qquad \text{for } k = 1, \ldots, N$$

where $\mathbf{b}_{k}$ and $\mathbf{a}$ both are 6-dimensional vectors defined as

$$\mathbf{b}_{k} = \begin{pmatrix} x_{2k}\,y_{1k} \\ -x_{1k}\,y_{1k} \\ x_{2k}\,y_{2k} \\ -x_{1k}\,y_{2k} \\ x_{2k}\,y_{3k} \\ -x_{1k}\,y_{3k} \end{pmatrix} \qquad \text{and} \qquad \mathbf{a} = \begin{pmatrix} a_{11} \\ a_{21} \\ a_{12} \\ a_{22} \\ a_{13} \\ a_{23} \end{pmatrix}.$$
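
As a small sketch, assuming NumPy (the helper name is purely illustrative), $\mathbf{b}_{k}$ can be assembled directly from one correspondence:

```python
import numpy as np

def b_vector(x, y):
    """Build the 6-vector b_k from one correspondence x in R^2, y in R^3,
    using the element ordering a = (a11, a21, a12, a22, a13, a23)."""
    x1, x2 = x
    y1, y2, y3 = y
    return np.array([x2 * y1, -x1 * y1,
                     x2 * y2, -x1 * y2,
                     x2 * y3, -x1 * y3])
```

Equivalently, $\mathbf{b}_{k}$ is the Kronecker product of $\mathbf{y}_{k}$ with $\mathbf{x}_{k}^{T}\,\mathbf{H} = (x_{2k},\, -x_{1k})$, a form that is convenient for the generalizations below.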

This set of homogeneous equations can also be written in matrix form

$$\mathbf{0} = \mathbf{B}\,\mathbf{a}$$

where $\mathbf{B}$ is an $N \times 6$ matrix which holds the vectors $\mathbf{b}_{k}$ in its rows. This means that $\mathbf{a}$ lies in the null space of $\mathbf{B}$ and can be determined, for example, by a singular value decomposition of $\mathbf{B}$; $\mathbf{a}$ is a right singular vector of $\mathbf{B}$ corresponding to a singular value that equals zero. Once $\mathbf{a}$ has been determined, the elements of $\mathbf{A}$ can be found by a simple rearrangement from a 6-dimensional vector to a $2 \times 3$ matrix. Notice that the scaling of $\mathbf{a}$ or $\mathbf{A}$ is not important (except that it must be non-zero) since the defining equations already allow for unknown scaling.

In practice the vectors $\mathbf{x}_{k}$ and $\mathbf{y}_{k}$ may contain noise which means that the similarity equations are only approximately valid. As a consequence, there may not be a vector $\mathbf{a}$ which solves the homogeneous equation $\mathbf{0} = \mathbf{B}\,\mathbf{a}$ exactly. In these cases, a total least squares solution can be used by choosing $\mathbf{a}$ as a right singular vector corresponding to the smallest singular value of $\mathbf{B}$.
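
Putting the pieces together, here is a sketch of the full DLT for this example, assuming NumPy; the function name and the assumption of at least five generic correspondences are choices of this sketch rather than part of the derivation above:

```python
import numpy as np

def dlt_2x3(X, Y):
    """DLT for x_k ~ A y_k with a 2 x 3 matrix A, determined up to scale.
    X: (N, 2) array of the x_k, Y: (N, 3) array of the y_k."""
    # Each row of B is b_k^T as defined above; B has shape (N, 6).
    B = np.array([[x[1] * y[0], -x[0] * y[0],
                   x[1] * y[1], -x[0] * y[1],
                   x[1] * y[2], -x[0] * y[2]] for x, y in zip(X, Y)])
    # a is the right singular vector of B belonging to the smallest singular
    # value: the exact null vector without noise, and the total least squares
    # solution when the data are noisy.
    _, _, Vt = np.linalg.svd(B)
    a = Vt[-1]
    # Rearrange (a11, a21, a12, a22, a13, a23) into the 2 x 3 matrix A.
    return a.reshape(3, 2).T
```

`np.linalg.svd` returns the singular values in decreasing order, so the last row of `Vt` corresponds to the smallest singular value; this covers both the exact and the noisy case in one step.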

More general cases

The above example has $\mathbf{x}_{k} \in \mathbb{R}^{2}$ and $\mathbf{y}_{k} \in \mathbb{R}^{3}$, but the general strategy for rewriting the similarity relations into homogeneous linear equations can be generalized to arbitrary dimensions for both $\mathbf{x}_{k}$ and $\mathbf{y}_{k}$.

If $\mathbf{x}_{k} \in \mathbb{R}^{2}$ and $\mathbf{y}_{k} \in \mathbb{R}^{q}$, the previous expressions still lead to an equation

$$0 = \mathbf{x}_{k}^{T}\,\mathbf{H}\,\mathbf{A}\,\mathbf{y}_{k} \qquad \text{for } k = 1, \ldots, N$$

where $\mathbf{A}$ now is $2 \times q$. Each $k$ provides one equation in the $2q$ unknown elements of $\mathbf{A}$ and together these equations can be written $\mathbf{B}\,\mathbf{a} = \mathbf{0}$ for the known $N \times 2q$ matrix $\mathbf{B}$ and unknown $2q$-dimensional vector $\mathbf{a}$. This vector can be found in a similar way as before.
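
A sketch of the corresponding computation for general $q$, assuming NumPy and reusing the Kronecker-product form of $\mathbf{b}_{k}$ mentioned above (the function name is illustrative):

```python
import numpy as np

def dlt_2xq(X, Y):
    """DLT for x_k ~ A y_k with a 2 x q matrix A, determined up to scale.
    X: (N, 2) array of the x_k, Y: (N, q) array of the y_k."""
    # Each row of B is b_k^T = kron(y_k, x_k^T H) = kron(y_k, (x2k, -x1k)).
    B = np.stack([np.kron(y, np.array([x[1], -x[0]])) for x, y in zip(X, Y)])
    _, _, Vt = np.linalg.svd(B)
    a = Vt[-1]
    # a stacks the columns of A: (a11, a21, a12, a22, ..., a1q, a2q).
    q = Y.shape[1]
    return a.reshape(q, 2).T
```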

In the most general case $\mathbf{x}_{k} \in \mathbb{R}^{p}$ and $\mathbf{y}_{k} \in \mathbb{R}^{q}$. The main difference compared to previously is that the matrix $\mathbf{H}$ now is $p \times p$ and anti-symmetric. For $p > 2$ the space of such matrices is no longer one-dimensional; it is of dimension

$$M = \frac{p\,(p-1)}{2}.$$

This means that each value of $k$ provides $M$ homogeneous equations of the type

$$0 = \mathbf{x}_{k}^{T}\,\mathbf{H}_{m}\,\mathbf{A}\,\mathbf{y}_{k} \qquad \text{for } m = 1, \ldots, M \quad \text{and for } k = 1, \ldots, N$$

where the matrices $\mathbf{H}_{m}$, $m = 1, \ldots, M$, form a basis of the $M$-dimensional space of $p \times p$ anti-symmetric matrices.
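
One concrete way to generate such a basis, as a sketch assuming NumPy (the ordering of the basis matrices is an arbitrary choice):

```python
import numpy as np

def antisymmetric_basis(p):
    """Return the M = p(p-1)/2 basis matrices H_m of the space of
    p x p anti-symmetric matrices, one non-zero entry pair each."""
    basis = []
    for i in range(p):
        for j in range(i + 1, p):
            H = np.zeros((p, p))
            H[i, j] = 1.0
            H[j, i] = -1.0
            basis.append(H)
    return basis
```

For $p = 2$ this yields a single matrix equal, up to sign, to the $\mathbf{H}$ used in the example above; for $p = 3$ it spans the same space as the matrices $\mathbf{H}_{1}, \mathbf{H}_{2}, \mathbf{H}_{3}$ given below.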

Example p = 3

In the case that $p = 3$ the following three matrices $\mathbf{H}_{m}$ can be chosen:

$$\mathbf{H}_{1} = \begin{pmatrix} 0 & 0 & 0 \\ 0 & 0 & -1 \\ 0 & 1 & 0 \end{pmatrix}, \qquad \mathbf{H}_{2} = \begin{pmatrix} 0 & 0 & 1 \\ 0 & 0 & 0 \\ -1 & 0 & 0 \end{pmatrix}, \qquad \mathbf{H}_{3} = \begin{pmatrix} 0 & -1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}.$$
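
With this choice of basis, $\mathbf{x}_{k}^{T}\,\mathbf{H}_{m}$ equals, up to sign, the $m$-th row of the cross-product matrix $[\mathbf{x}_{k}]_{\times}$; for example,

$$\mathbf{x}_{k}^{T}\,\mathbf{H}_{1} = \begin{pmatrix} 0 & x_{3k} & -x_{2k} \end{pmatrix}.$$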

In this particular case, the homogeneous linear equations can be written as

$$\mathbf{0} = [\mathbf{x}_{k}]_{\times}\,\mathbf{A}\,\mathbf{y}_{k} \qquad \text{for } k = 1, \ldots, N$$

where $[\mathbf{x}_{k}]_{\times}$ is the matrix representation of the vector cross product. Notice that this last equation is vector valued; the left-hand side is the zero element in $\mathbb{R}^{3}$.

Each value of $k$ provides three homogeneous linear equations in the unknown elements of $\mathbf{A}$. However, since $[\mathbf{x}_{k}]_{\times}$ has rank 2, at most two of them are linearly independent. In practice, therefore, it is common to use only two of the three matrices $\mathbf{H}_{m}$, for example, for $m = 1, 2$. However, which equations are linearly dependent varies with $\mathbf{x}_{k}$, which means that in unlucky cases it would have been better to choose, for example, $m = 2, 3$. As a consequence, if the number of equations is not a concern, it may be better to use all three equations when the matrix $\mathbf{B}$ is constructed.

The linear dependence between the resulting homogeneous linear equations is a general concern for the case $p > 2$ and has to be dealt with either by reducing the set of anti-symmetric matrices $\mathbf{H}_{m}$ or by allowing $\mathbf{B}$ to become larger than necessary for determining $\mathbf{a}$.
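
As a sketch of the $p = 3$ construction, assuming NumPy and using the $3 \times 3$ case (as in homography estimation) purely for illustration, with function names that are not standard:

```python
import numpy as np

def skew(x):
    """Cross-product (anti-symmetric) matrix [x]_x of a 3-vector x."""
    return np.array([[0.0,  -x[2],  x[1]],
                     [x[2],   0.0, -x[0]],
                     [-x[1],  x[0],  0.0]])

def dlt_3x3(X, Y):
    """DLT for x_k ~ A y_k with a 3 x 3 matrix A, determined up to scale,
    keeping all three rows of [x_k]_x per correspondence.
    X, Y: (N, 3) arrays; N >= 4 correspondences in general position."""
    # [x]_x A y = (y^T kron [x]_x) vec(A), with vec(A) stacking the columns
    # of A, so each correspondence contributes a 3 x 9 block of rows to B.
    B = np.vstack([np.kron(y, skew(x)) for x, y in zip(X, Y)])
    _, _, Vt = np.linalg.svd(B)
    a = Vt[-1]
    return a.reshape(3, 3, order="F")   # undo the column-wise stacking
```

Keeping all three rows per correspondence makes $\mathbf{B}$ larger than strictly necessary, as discussed above; dropping one row per point gives the more common $2N \times 9$ system without changing the SVD step.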
