Tensor calculus was developed around 1890 by Gregorio Ricci-Curbastro under the title absolute differential calculus, and was made accessible to many mathematicians by the publication of Ricci and Tullio Levi-Civita's 1900 classic text of the same name (in French; translations followed). In the 20th century, the subject came to be known as tensor analysis, and achieved broader acceptance with the introduction of Einstein's theory of general relativity, around 1915.
General relativity is formulated completely in the language of tensors. Einstein had learned about them, with great difficulty, from the geometer Marcel Grossmann, or perhaps from Levi-Civita himself. Tensors are also used in other fields such as continuum mechanics.
In mathematics, a tensor is (in an informal sense) a generalized linear 'quantity' or 'geometrical entity' that can be expressed as a multi-dimensional array relative to a choice of basis of the particular space on which it is defined. The intuition underlying the tensor concept is inherently geometrical: as an object in and of itself, a tensor is independent of any chosen frame of reference. However, in the modern treatment, tensor theory is best regarded as a topic in multilinear algebra. Engineering applications do not usually require the full, general theory, but theoretical physics now does.
For example, the Euclidean inner product (dot product)—a real-valued function of two vectors that is linear in each—is a mathematical tensor. Similarly, on a smooth curved surface such as a torus, the metric tensor (field) essentially defines a different inner product of tangent vectors at each point of the surface. Just as a linear transformation can be represented as a matrix of numbers with respect to given vector bases, so a tensor can be written as an organized collection of numbers. In physics, the numbers may be obtained as physical quantities that depend on a basis, and the collection is determined to be a tensor if the quantities transform appropriately under change of basis.
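The basis-dependence just described can be made concrete. The following sketch (using NumPy; the vectors and the change-of-basis matrix are arbitrary illustrative choices) represents the dot product by its component matrix and checks that the number it produces is unchanged under a change of basis:

```python
import numpy as np

# The standard dot product on R^2 is a bilinear form; with respect to a
# basis its components form a matrix g, and g(u, v) = u^T g v.
g = np.eye(2)  # components of the dot product in the standard basis

# Two vectors, written in the standard basis (arbitrary choices).
u = np.array([1.0, 2.0])
v = np.array([3.0, 1.0])
value = u @ g @ v  # ordinary dot product: 1*3 + 2*1 = 5

# Change of basis: the columns of A are the new basis vectors.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# The components of a twice-covariant tensor transform with two copies of A:
g_new = A.T @ g @ A

# The same geometric vectors, re-expressed in the new basis.
A_inv = np.linalg.inv(A)
u_new = A_inv @ u
v_new = A_inv @ v

# The number produced by the tensor is independent of the chosen basis.
assert np.isclose(u_new @ g_new @ v_new, value)
```

This is exactly the "transform appropriately under change of basis" criterion: the component matrix changes, but the scalar the tensor computes does not.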
Many mathematical structures informally called 'tensors' are actually tensor fields: a tensor-valued function defined on a geometric or topological space. This use of the term is analogous to that of vector fields such as the electromagnetic field, but with the tensor defined so that it is invariant under a change of coordinates. Differential equations posed in terms of tensor quantities are basic to modern mathematical physics, so tensor fields are usually defined on differentiable manifolds.
In mathematics, the term rank of a tensor may mean either of two things, and it is not always clear from the context which.
In the first definition, the rank of a tensor T is the number of indices required to write down the components of T. Under this definition a scalar is a tensor of rank 0, a vector is a tensor of rank 1, and a matrix is a tensor of rank 2. This is the sum of the number of covariant and contravariant indices. Expressed by means of the tensor product of multilinear algebra, this is the number of factors of the tensor product needed to express T.
In the second definition, the rank of a tensor is defined in a way that extends the definition of the rank of a matrix given in linear algebra. A tensor of rank 1 (also called a simple tensor) is a tensor that can be written as a tensor product of the form a ⊗ b ⊗ ⋯ ⊗ d, where a, b, …, d are vectors.
For example, a matrix is a tensor with two indices, and so has rank 2 in the first definition. The rank of the same tensor in the second definition is just the rank of the matrix. When the array of components is two-dimensional, this latter meaning is often the intended one.
To avoid this ambiguity, it is now preferred to use the term tensor order for the number of indices, and tensor rank for the number of simple tensors needed to decompose a tensor. The definition of rank is then consistent with linear algebra.
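The distinction between the two definitions can be seen in a short NumPy sketch (the vectors below are arbitrary illustrative choices): the outer product of two vectors has order 2 but rank 1:

```python
import numpy as np

# Order vs. rank: the outer (tensor) product of two vectors.
a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0])
T = np.outer(a, b)  # a simple tensor, a ⊗ b

# Order (first definition): the number of indices of the component array.
order = T.ndim  # 2 (T is a matrix)

# Rank (second definition): the smallest number of simple tensors summing
# to T. For an order-2 tensor this is just the matrix rank.
rank = np.linalg.matrix_rank(T)  # 1 (T is itself a single simple tensor)
```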
In physical applications, array indices are distinguished as contravariant (superscripts) or covariant (subscripts), according to their transformation properties. The valence of a particular tensor is the number and type of its array indices; tensors with the same order but different valence are not, in general, identical. However, any given covariant index can be transformed into a contravariant one, and vice versa, by applying the metric tensor. This operation is generally known as raising or lowering indices.
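As an illustration, the following NumPy sketch lowers and then raises an index by contracting with a metric and its inverse (the Minkowski metric of special relativity is used as an example, and the vector components are arbitrary):

```python
import numpy as np

# Lowering an index with a metric tensor. As an example we use the
# Minkowski metric with signature (+, -, -, -).
eta = np.diag([1.0, -1.0, -1.0, -1.0])

# A contravariant vector (one upper index); components chosen arbitrarily.
v_upper = np.array([2.0, 1.0, 0.0, 0.0])

# Lowering: v_mu = eta_{mu nu} v^nu  (contraction over the shared index).
v_lower = eta @ v_upper  # [2, -1, 0, 0]

# Raising with the inverse metric recovers the original components.
eta_inv = np.linalg.inv(eta)
assert np.allclose(eta_inv @ v_lower, v_upper)
```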
For example, a second-order tensor quantifying stress in a three-dimensional solid has components that can be conveniently represented as a 3×3 array. Each of the three Cartesian faces of a cube-shaped infinitesimal volume element of the solid is subject to some force, and each force vector has three components (being in three-space). Thus 3 × 3 = 9 components are required to describe the stress at this infinitesimal element (which may be treated as a point). Throughout the solid the stress varies, each point requiring nine quantities to describe it; hence the need for a second-order tensor.
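A minimal NumPy sketch (with made-up, illustrative stress values) shows how such a 3×3 array acts: contracting the stress tensor with a surface normal gives the force per unit area on that face:

```python
import numpy as np

# A (symmetric) stress tensor for a solid, written as a 3x3 array in
# Cartesian coordinates. The numbers are illustrative, not from any
# particular material.
sigma = np.array([[10.0,  2.0,  0.0],
                  [ 2.0,  5.0,  1.0],
                  [ 0.0,  1.0,  3.0]])

# The traction (force per unit area) on a surface with unit normal n is
# the contraction t_i = sigma_ij n_j: a linear map from normals to forces.
n = np.array([1.0, 0.0, 0.0])  # face perpendicular to the x-axis
t = sigma @ n  # [10, 2, 0]: not parallel to n in general
```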
While tensors can be represented by multi-dimensional arrays of components, the point of having a tensor theory is to explain further implications of saying that a quantity is a tensor, beyond specifying that it requires a number of indexed components. In particular, tensors behave in specific ways under coordinate transformations. The abstract theory of tensors is a branch of linear algebra, now called multilinear algebra.
Physicists and engineers were among the first to recognise that vectors and tensors have a physical significance as entities, one that goes beyond the (often arbitrary) coordinate system in which their components are enumerated. Conversely, mathematicians find that some tensor relations are more conveniently derived in coordinate notation.
As a simple example, consider a ship in the water. We want to describe its response to an applied force. Force is a vector, and the ship responds with an acceleration, which is also a vector. In classical mechanics the relationship between force and acceleration is linear. Such a relationship is described by a rank-two tensor of type (1,1): it transforms one vector into another such vector. The tensor can be represented as a matrix which, when multiplied by a vector, yields another vector. Just as the numbers that represent a vector change if one changes the coordinate system, so the numbers in the matrix representing the tensor change when the coordinate system is changed.
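This behaviour under a change of coordinates can be sketched in NumPy (the tensor entries and the basis-change matrix below are arbitrary illustrative choices): the matrix of a type (1,1) tensor transforms by conjugation, and the force/acceleration relationship is preserved:

```python
import numpy as np

# A type (1,1) tensor maps vectors to vectors; in a basis it is a matrix.
# Toy "response" tensor relating an applied force to an acceleration
# (the numbers are illustrative).
T = np.array([[0.5, 0.1 ],
              [0.1, 0.25]])

f = np.array([10.0, 0.0])   # force components in the original basis
a = T @ f                   # resulting acceleration in that basis

# Under a change of basis with matrix A (columns = new basis vectors),
# vector components change by A^{-1}, and the matrix of a (1,1) tensor
# transforms by conjugation: T' = A^{-1} T A.
A = np.array([[2.0, 0.0],
              [1.0, 1.0]])
A_inv = np.linalg.inv(A)
T_new = A_inv @ T @ A

# The relationship survives the change of coordinates: the new components
# of a equal T_new applied to the new components of f.
assert np.allclose(A_inv @ a, T_new @ (A_inv @ f))
```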
In engineering, the stresses inside a solid body or fluid are also described by a tensor; the word "tensor" is Latin for something that stretches, i.e., causes tension. If a particular surface element inside the material is singled out, the material on one side of the surface will apply a force on the other side. In general, this force will not be orthogonal to the surface, but it will depend on the orientation of the surface in a linear manner. This is described by a tensor of type (2,0), in linear elasticity, or more precisely by a tensor field of type (2,0) since the stresses may change from point to point.
Formally speaking, a tensor has a particular type according to the construction with tensor products that gives rise to it. For computational purposes, it may be expressed as the sequence of values represented by a function with a tuple-valued domain and a scalar-valued range. Domain values are tuples of counting numbers, and these numbers are called indices. For example, an order-3 tensor might have dimensions 2, 5, and 7. Here the indices range from «1, 1, 1» through «2, 5, 7»; the tensor has one value at «1, 1, 1», another at «1, 1, 2», and so on, for a total of 70 values. As a special case, (finite-dimensional) vectors may be expressed as a sequence of values represented by a function with a scalar-valued domain and a scalar-valued range; the number of distinct indices is the dimension of the vector. Using this approach, the order-3 tensor of dimensions (2, 5, 7) can be represented as a 3-dimensional array of size 2 × 5 × 7. In this usage, the number of "dimensions" comprising the array is equivalent to the "rank" (order) of the tensor, and the dimensions of the tensor are equivalent to the "size" of each array dimension.
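In NumPy terms (where indices start at 0 rather than 1), the 2 × 5 × 7 example looks like this:

```python
import numpy as np

# An order-3 tensor with dimensions 2, 5 and 7, viewed as a function from
# index tuples to scalar values.
T = np.zeros((2, 5, 7))

order = T.ndim     # 3: the number of indices
sizes = T.shape    # (2, 5, 7): the size of each array dimension
n_values = T.size  # 70: one stored value per index tuple

# NumPy indices start at 0, so the tuple «1, 1, 1» in the text corresponds
# to the array index (0, 0, 0), and «2, 5, 7» to (1, 4, 6).
T[0, 0, 0] = 3.14  # one of the 70 stored values
```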
A tensor field associates a tensor value with every point on a manifold. Thus, instead of simply having the 70 values of the example above, a rank 3 tensor field with dimensions «2, 5, 7» would associate 70 values with every point in the space. In other words, a tensor field means there is some tensor-valued function which has, for example, Euclidean space as its domain.
In the end the same computational content is expressed. See glossary of tensor theory for a listing of technical terms.