I've repeatedly read things that reference tensors, and despite reading the wiki page and other answers here on Stack Exchange I still don't know what a tensor is. I'm fine with hearing things in the framework of linear algebra, abstract algebra, and physics, but the wiki just says it's a geometric object, and in physics my professors always say it builds spaces out of other spaces. Can somebody clarify things?

    Try this video, which I found helpful: http://www.youtube.com/watch?v=f5liqUk0ZTw – neofoxmulder Jan 30 '14 at 17:57
    "A tensor is something that *transforms* like a tensor" --A. Zee – user76568 Feb 06 '14 at 18:59
    http://math.stackexchange.com/questions/10282/an-introduction-to-tensors http://math.stackexchange.com/questions/405108/tensors-as-mutlilinear-maps http://math.stackexchange.com/questions/424168/the-definition-of-tensor – mle Apr 05 '14 at 16:38

6 Answers


At the lowest level of understanding a tensor $T$ of rank $r$ is an $r$-dimensional array (think of a spreadsheet) whose "side-lengths" are all equal to a given $n\geq1$. Therefore $T$ has $n^r$ entries, which we assume to be real numbers in the following.

When we are setting up such a tensor we have some application in mind, say in geometry or physics. That's where the difficulties come in. The tensor is meant to be "applied" to one or several (variable) vectors, and the result will be a number or a vector of interest in the context at hand. E.g., the value $T(x,y)$ could be the scalar product of $x$ and $y$, or the area of the parallelogram spanned by $x$ and $y$, or the image of $x$ under $T$ when $T$ is considered as a linear map, or the retaliatory force felt when moving in direction $x$, and on and on. For the computation of actual values we need the coordinates of $x$ and $y$. Now these depend on the choice of basis in the ground space ${\mathbb R}^n$, and when we change the basis the coordinate values of the points $x$ change. But the scalar product or some induced force, being "well defined" geometric or physical quantities, should not change. This in turn implies that the entries in our tensor (spreadsheet) $T$ will have to change, albeit in a characteristic way, called "contravariant" or "covariant", depending on the case at hand.
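The spreadsheet picture above can be sketched in a few lines of NumPy. The particular entries of $T$ and the change-of-basis matrix $B$ below are made up purely for illustration; the point is that the components change under a basis change while the value $T(x,y)$ does not:

```python
import numpy as np

# A rank-2 tensor on R^3 stored as a 3x3 array: n^r = 3^2 = 9 entries.
T = np.array([[2.0, 0.0, 1.0],
              [0.0, 3.0, 0.0],
              [1.0, 0.0, 4.0]])

x = np.array([1.0, 2.0, 0.0])
y = np.array([0.0, 1.0, 5.0])

# Applying the tensor to two vectors yields a number: T(x, y) = x^T T y.
value = x @ T @ y

# Change of basis: the columns of B are the new basis vectors.
B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])
Binv = np.linalg.inv(B)

# Coordinates of x and y change one way ("contravariant")...
x_new = Binv @ x
y_new = Binv @ y

# ...while the entries of this (covariant) tensor change the other way.
T_new = B.T @ T @ B

# The spreadsheet entries changed, but the well-defined value did not.
value_new = x_new @ T_new @ y_new
print(np.isclose(value, value_new))  # True
```

This is the "hidden robust identity" in computational form: the pair (components, basis) changes, the underlying quantity stays fixed.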

(As an aside: When a certain Excel-spreadsheet is meant to be a price list for various fabrics, then its entries will change as well in a characteristic way when the currency or the units of measurement are changed.)

But we have a definite feeling that there is some hidden "robust identity" incorporated in $T$ that is independent of the more or less accidental values appearing in the spreadsheet. It is only in the second half of the last century that mathematics has found a universal (and abstract!) way to express and to deal with this "hidden identity" of $T$. The field of mathematics concerned with this is called multilinear algebra. Only in this realm does it then make sense to talk about the tensor product. But I won't go into this here.

Christian Blatter

A $k$-tensor is a multilinear function from $V\times V\times\dots\times V$ to the reals, where $V$ is a vector space and $k$ is the number of the $V$'s in the above Cartesian product. (Calculus on Manifolds, Michael Spivak, 1965, page 75).

This is the best definition I can find. I am with you, and thank you for asking the question, because I hate definitions that lack a noun. So and so is a __ (noun), please!

I don't know if a k-tensor is the most general type of tensor or not.
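Spivak's definition can be checked directly in code. Below, a 2-tensor on $V=\mathbb{R}^2$ is written as a function of two vectors, built from an arbitrarily chosen coefficient array, and multilinearity (linearity in each slot separately) is verified numerically:

```python
import numpy as np

# A 2-tensor on V = R^2: a multilinear function V x V -> R.
# The coefficient array is chosen arbitrarily for illustration.
coeffs = np.array([[1.0, 2.0],
                   [0.0, 3.0]])

def T(x, y):
    return float(x @ coeffs @ y)

x, xp = np.array([1.0, -1.0]), np.array([2.0, 0.5])
y = np.array([3.0, 4.0])
a, b = 2.0, -1.5

# Linearity in the first slot; the same holds in the second slot.
lhs = T(a * x + b * xp, y)
rhs = a * T(x, y) + b * T(xp, y)
print(np.isclose(lhs, rhs))  # True
```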

    Welcome to MSE! For me as well (all those years ago), Spivak's definition of a tensor was the first definition that made sense. Not sure why you got downvoted.... – Andrew D. Hwang Apr 05 '14 at 16:56
  • Thank you! As I am new, one thing I don't get is that there is another question on MSE almost identical (what is a tensor, really?) Can these be combined or linked easily? – user140651 Apr 05 '14 at 23:00
  • The site does try to avoid duplicate questions (by linking to answered versions and "closing" the newer duplicate), but it tends to happen more for sharply-focused or homework-type questions than for "philosophical" questions such as this one. As for how to link, clicking "help" (below the "add comment" button) pops up a micro-reference on Markdown. See also the site [help] page. – Andrew D. Hwang Apr 05 '14 at 23:15

The simplest case is a tensor product of two vector spaces. If $V$ is a vector space with basis $\{v_i\}$ and $W$ is a vector space with basis $\{w_j\}$ then $V \otimes W$ is a vector space with basis $\{v_i \otimes w_j\}$.

There is more theory behind it than that. I'm sure you've read stuff about it being universal with respect to bilinear maps and such. But in terms of "building spaces out of other spaces" it's not that complicated. A tensor product of an $n$-dimensional vector space with an $m$-dimensional vector space is just an $nm$-dimensional vector space.
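In coordinates, this dimension count is easy to see: expanding $v \otimes w$ in the basis $\{v_i \otimes w_j\}$ gives the outer product of the two coordinate vectors, flattened to length $nm$. The sketch below (with made-up vectors) also illustrates the point from the comments that not every element of $V \otimes W$ is a single $v \otimes w$:

```python
import numpy as np

n, m = 3, 4  # dim V = 3, dim W = 4

v = np.array([1.0, 2.0, 3.0])        # a vector in V
w = np.array([0.0, 1.0, 0.0, 2.0])   # a vector in W

# v ⊗ w, written out in the basis {v_i ⊗ w_j}, is the outer product
# of the coordinate vectors, flattened to length n*m = 12.
vw = np.outer(v, w).ravel()
print(vw.shape)  # (12,)

# Not every element of V ⊗ W has the form v ⊗ w: the combination
# e1⊗f1 + e2⊗f2 has outer-product rank 2, so it is not a single
# outer product of two vectors.
e1, e2 = np.eye(n)[0], np.eye(n)[1]
f1, f2 = np.eye(m)[0], np.eye(m)[1]
t = np.outer(e1, f1) + np.outer(e2, f2)
print(np.linalg.matrix_rank(t))  # 2
```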

  • But what is that space filled with? The vectors of V and W, or some hybridization? –  Jan 31 '14 at 00:09
  • Any pair of vectors $v \in V$ and $w \in W$ gives a vector $v \otimes w \in V \otimes W$, but unfortunately there are more vectors in $V \otimes W$ than just what you can get from such pairs. So there is no easy description of the elements of $V \otimes W$ in terms of elements of $V$ and elements of $W$. The elements of $V \otimes W$ are just linear combinations of the symbols $v_i \otimes w_j$. – Jim Jan 31 '14 at 00:21
  • How do you read that symbol between v and w? –  Jan 31 '14 at 00:22
  • $V \otimes W$ is read as "V tensor W" – Jim Jan 31 '14 at 00:23
  • And building a tensor product of two vectors: does this just give a plane? Are you saying then that this plane can only be described via pairs of these vectors? –  Jan 31 '14 at 00:23
  • Same with the elements of V and W? v and w? –  Jan 31 '14 at 00:24
  • There are no planes here. The tensor product of two vectors $v \otimes w$ is a single vector in $V \otimes W$. – Jim Jan 31 '14 at 02:16

Every tensor is associated with a linear map that produces a scalar.

For instance, a vector can be identified with a map that takes in another vector (in the presence of an inner product) and produces a scalar. If I have a vector $v$ and some input vector $a$, then I define the map $\underline v(a) \equiv v \cdot a$.

A matrix is just a representation of a map that takes in two vectors. Usually we say matrices take in vectors and produce vectors, $T : a \mapsto a'$ for instance. But you can instead use the inner product and say there is a map $\underline T(a, b)$ which produces a scalar by $\underline T(a, b) = T(a) \cdot b$.

Tensors obey certain transformation laws. A change of basis for a matrix is a similarity transformation; for tensors the rule is just a generalization of this idea. This gives a way to compute the components of a tensor in a new basis, but the underlying map can be thought of as unchanging, the same way a vector expressed in a new basis is geometrically considered the same as it was before.
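Both points above (the scalar-valued form of a matrix, and the similarity transformation under a change of basis) can be sketched numerically. The rotation matrix and basis below are chosen only for illustration:

```python
import numpy as np

# A rank-2 tensor as a linear map: here a 90-degree rotation of R^2.
T = np.array([[0.0, -1.0],
              [1.0,  0.0]])

a = np.array([2.0, 1.0])
b = np.array([0.5, 3.0])

# The scalar-valued version of the same tensor: T(a, b) = (T a) . b
scalar = (T @ a) @ b

# Change of basis B (columns are the new basis vectors): the
# components undergo a similarity transformation.
B = np.array([[2.0, 1.0],
              [1.0, 1.0]])
T_new = np.linalg.inv(B) @ T @ B

# The underlying map is unchanged: express a in the new basis, apply
# the new components, and map the result back to the old basis.
a_new = np.linalg.inv(B) @ a
print(np.allclose(B @ (T_new @ a_new), T @ a))  # True
```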

Some tensors correspond to geometric objects or primitives. As I said, vectors can be thought of as very simple tensors. Some other tensors correspond to planes, volumes, and so on, formed directly from 2, 3, or more vectors. Clifford algebra, a quotient of the tensor algebra, deals directly with such geometrically significant objects. Not all tensors are so easy to visualize or imagine, though; the rest can only be thought of abstractly as maps.


The description below presents an aspect of tensors that may help in understanding them intuitively. For a formal definition and other explanations, please do look at the other answers.

Tensors in physics and mathematics have two different but related interpretations: as physical entities and as transformation mappings. The questioner mentions "geometric object", and seems to be interested at least in part in the physical interpretation, which I'll tackle below.

From a physical-entity point of view, a tensor can be interpreted as something that brings together different components of the same entity without adding them in a scalar or vector sense of addition. E.g.

  1. If I have 2 gm of calcium and 3 gm of calcium together, I immediately have 5 gm of calcium - this is scalar addition, and we can perceive the resulting substance.
  2. If I am moving at 5i m/s and 6j m/s at the same time, I'm moving at (5i+6j) m/s. This is vector addition, and once again, we can make sense of the resulting entity.
  3. If I have monochromatic pixels embedded in a cube that emit light at different angles, we can define pixels per unit area ($\chi$) in the cube as $\begin{bmatrix} \chi_x&\chi_y&\chi_z \end{bmatrix}$ where $\chi_x$ is the number of pixels emitting light perpendicular to the area in yz plane, and so on.
    This entity, $\chi$, has three components, and by writing $\chi$, we are writing the three components together. Apart from that, the three components cannot be added like a scalar or vector, and we cannot visualize $\chi$ as a single entity.

$\chi$ above is an example of a tensor. Though we may not be able to see $\chi$ as a single perceivable thing, it can be used to fetch or understand perfectly comprehensible entities, e.g. for a given area $\vec{s}$, we can get the total number of pixels emitting light perpendicular to it by the equation: $ \begin{bmatrix}\chi_x&\chi_y&\chi_z \end{bmatrix}\cdot \begin{bmatrix}s_x\\s_y\\s_z \end{bmatrix}$

Change the monochromatic pixels in this example to RGB ones, and we get something very similar to the stress tensor (a tensor of rank 2), from which we can get the traction vector (force per unit area on a surface with unit normal n) by the equation:

$\textbf{T}^{(\textbf{n})} = \begin{bmatrix} T_x\\T_y\\T_z \end{bmatrix}^{(n)} = \textbf{n} \cdot \boldsymbol{\sigma} = \begin{bmatrix}\sigma_{xx}&\sigma_{xy}&\sigma_{xz}\\ \sigma_{yx}&\sigma_{yy}&\sigma_{yz}\\ \sigma_{zx}&\sigma_{zy}&\sigma_{zz}\\ \end{bmatrix} \begin{bmatrix}n_x\\n_y\\n_z \end{bmatrix} $

Though it's difficult to visualize the stress tensor in its totality, each of its components tells us something very concrete, e.g. $\sigma_{xx}$ tells us how much force in the x-direction is being experienced by a unit surface area that is perpendicular to the x-direction (at a given point in a solid). The complete stress tensor, $\sigma$, tells us the total force a surface with unit area facing any direction will experience. Once we fix the direction, we get the traction vector from the stress tensor, or, not to be taken literally, the stress tensor "collapses" to the traction vector.
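The traction-vector equation above is just a matrix-vector product. A minimal sketch, with an illustrative (symmetric, made-up) stress tensor:

```python
import numpy as np

# An illustrative symmetric stress tensor (components in Pa):
sigma = np.array([[10.0,  2.0,  0.0],
                  [ 2.0,  5.0,  1.0],
                  [ 0.0,  1.0,  8.0]])

# Unit normal of the surface of interest: here the x-direction.
n = np.array([1.0, 0.0, 0.0])

# Traction vector T^(n) = sigma . n -- for this n it is just the
# first column of sigma, i.e. the force per unit area on the face
# perpendicular to x.
traction = sigma @ n
print(traction)  # [10.  2.  0.]
```

Fixing a different normal $n$ picks out a different traction vector, which is exactly the "collapse" described above.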

Note that some tensors *can* be interpreted as a single physical entity or visualized directly. E.g., vectors are tensors, and we can visualize most of them (e.g. velocity, the electromagnetic field).

  • One cannot add 5i m/s and 6j m/s either. That's why you wrote (5i + 6j) m/s instead of 11 m/s or something. Vectors are rank-1 tensors, aren't they? – goteguru Dec 14 '21 at 14:34
  • (5i+6j), though written like that algebraically, can be added geometrically and perceived as a single entity intuitively (i.e. something having one magnitude and one direction). Yes, vectors are rank-1 tensors *by definition*. This means that we may, for all practical purposes, add and use the three components of the example in 3. above as if they were part of a vector, and the resulting vector can interact with other vectors. That resulting vector does not have any physical interpretation as a single entity, but you are right - it can be, and rightfully is, taken as a vector. – manisar Dec 14 '21 at 14:54
  • I can indeed draw a vector on paper, but I still cannot describe it with a single scalar value. There is no way to do that, therefore I need at least two different "things" to "compose" the 2D vector. Maybe the word "vector" in the above answer is restricted to 2D or 3D "arrows"? In programming and in linear algebra vectors may have any dimension and need not have any physical interpretation. Still, vectors are not usually called tensors. Maybe n-dimensional vectors used as *values* are called tensors? – goteguru Dec 15 '21 at 12:52

Think of a tensor as a matrix transforming one vector into another (or one one-form into another one-form). For example, in linear algebra we learned that given a vector $\nu$ and the matrix $A$, we can get a new vector by multiplying the matrix by the vector $\nu$, i.e. $$ A\nu = \omega $$ In linear algebra $A$ is just a matrix; in tensor analysis $A$ would be a dyad tensor (tensor of order two). In linear algebra we have $A_{i,j}$; in tensor analysis we have $A_{ij}$. In this sense, a tensor of order two can be represented as a matrix.
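The index notation $A_{ij}$ makes the matrix-vector product explicit as a sum over a repeated index, $\omega_i = A_{ij}\nu_j$. A small sketch (with arbitrary entries) showing that the index-notation sum agrees with the usual matrix product:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
nu = np.array([1.0, 1.0])

# Index notation: omega_i = A_ij nu_j, summed over the repeated index j.
omega = np.einsum('ij,j->i', A, nu)

print(np.allclose(omega, A @ nu))  # True
```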

Stefan Hansen