Wikipedia says that a linear transformation is a $(1,1)$ tensor. Is this restricting it to transformations from $V$ to $V$ or is a transformation from $V$ to $W$ also a $(1,1)$ tensor? (where $V$ and $W$ are both vector spaces). I think it must be the first case since it also states that a linear functional is a $(0,1)$ tensor and this is a transformation from $V$ to $R$. If it is the second case, could you please explain why linear transformations are $(1,1)$ tensors.

It would be helpful if you told us what definition exactly you have of a (1,1)-tensor. – Mariano Suárez-Álvarez Jan 18 '15 at 10:15

@MarianoSuárezAlvarez I would define it as an element of the vector space $V \otimes V^*$ where $V$ is a vector space and $V^*$ its dual – Quantum spaghettification Jan 18 '15 at 10:48

Note that I've added a note to that Wikipedia statement now... – Fizz Jan 19 '15 at 12:25

After tangling a bit on the Wikipedia article's talk page I've discovered that the regulars there have the following agenda/beliefs: (1) tensors are never defined over infinite-dimensional vector spaces and (2) tensors are never defined over different vector spaces (even if finite). As I don't have the time/stamina to battle people with a weird agenda and who are willing to spend most of their waking hours pushing it over there (despite the literature)... my note might be gone soon enough. Which is why reading Wikipedia on any topic is fraught with hazards... even on something as tame as math. – Fizz Jan 19 '15 at 15:07
3 Answers
It's very common in tensor analysis to identify endomorphisms of a vector space with (1,1) tensors, because there exists an isomorphism between the two spaces.
Define $E(V)$ to be the set of endomorphisms on $V$.
Let $A\in E(V)$ and define the map $\Theta:E(V)\rightarrow T^1_1(V)$ by \begin{align*} (\Theta A)(\omega,X)&=\omega(AX). \end{align*} We show that $\Theta$ is an isomorphism of vector spaces. Let $\{e_i\}$ be a basis for $V$ and let $\{\varepsilon^i\}$ be the corresponding dual basis. First, we note $\Theta$ is linear by the linearity of $\omega$.

To show injectivity, suppose $\Theta A = \Theta B$ for some $A,B\in E(V)$ and let $X\in V$, $\omega \in V^*$ be arbitrary. Then \begin{align*} (\Theta A)(\omega,X)=(\Theta B)(\omega,X) \iff \omega(AX-BX)=0. \end{align*} Since $X$ and $\omega$ were arbitrary, it follows that $AX=BX$ for all $X$, i.e. $A=B$.

To show surjectivity, suppose $f\in T^1_1(V)$ has coordinate representation $f^j_i\, \varepsilon^i \otimes e_j$. We wish to find $A\in E(V)$ such that $\Theta A = f$. We simply choose $A\in E(V)$ such that $A$ has the matrix representation $(f^j_i)$. If we write the representation of our vector $X$ and covector $\omega$ as \begin{align*} X&=X^i e_i\\ \omega&=\omega_i \varepsilon^i, \end{align*} we have \begin{align*} (\Theta A)(\omega, X)&=\omega(AX)\\ &=\omega_k \varepsilon^k(f^j_i X^i e_j)\\ &=f^j_i X^i \omega_k \varepsilon^k (e_j)\\ &=f^j_i X^i \omega_k \delta^k_j\\ &=f^k_i X^i \omega_k. \end{align*} On the other hand, \begin{align*} f(\omega,X)&=f(\omega_k\varepsilon^k,X^ie_i)\\ &=\omega_k X^i f(\varepsilon^k,e_i)\\ &=f^k_i X^i \omega_k. \end{align*} Since $X$ and $\omega$ were arbitrary, it follows that $\Theta A = f$. Thus, $\Theta$ is linear and bijective, hence an isomorphism.
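As a concrete sanity check of the proof above, here is a minimal numerical sketch (assuming $V=\mathbb{R}^3$ with the standard basis and dual basis; the names are illustrative): the components $f^j_i = (\Theta A)(\varepsilon^j, e_i)$ of the $(1,1)$ tensor are exactly the matrix entries of the endomorphism $A$.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))          # an endomorphism of R^3, as a matrix

def theta_A(omega, X):
    """The (1,1) tensor associated to A: (omega, X) -> omega(A X)."""
    return omega @ (A @ X)

# Components in the standard basis {e_i} and dual basis {eps^j}:
# f^j_i = (Theta A)(eps^j, e_i)
f = np.array([[theta_A(np.eye(3)[j], np.eye(3)[i]) for i in range(3)]
              for j in range(3)])
assert np.allclose(f, A)                 # f^j_i equals A^j_i

# The full contraction f^k_i X^i omega_k reproduces omega(A X):
X, omega = rng.standard_normal(3), rng.standard_normal(3)
assert np.isclose(np.einsum('ki,i,k->', f, X, omega), omega @ (A @ X))
```

This is just the coordinate computation from the surjectivity part of the proof, run numerically.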

N.B.: This proof is addressing the 2nd part of the OP's question. I'm curious which one he is going to accept given that he [asked two questions in one](http://meta.stackexchange.com/questions/246328/dealingwithbundleomnibuslistofquestionsquestionthatconsistsofratherd)... – Fizz Jan 19 '15 at 00:35

Also, this result can be stated more generally for a tensor product of two different vector spaces (i.e., not just for a tensor space), as the existence of an isomorphism between $\mathrm{Hom}(V, W)$ and $V^*\otimes W$. – Fizz Jan 19 '15 at 00:58

Also your proof (of surjectivity) does not hold for an infinite-dimensional vector space $V$ because the Kronecker delta formula for the covector basis is only valid for finite-dimensional [co]vector spaces. – Fizz Jan 19 '15 at 09:51

Let $T : V \to W$ be linear. Then define $\tau: V \times W^* \to K$ such that, for $a \in V$ and $\alpha \in W^*$, we have
$$\tau(a, \alpha) = (\alpha \circ T)(a)$$
Note that it's typical to define *tensor* to mean a multilinear map whose vector arguments all come from the same vector space, or whose covector arguments come from the associated dual space, or some combination of the two. So we can identify a linear operator $T: V \to V$ with a $(1,1)$ tensor $\tau: V \times V^* \to K$, but in the case that $V$ and $W$ are distinct vector spaces, $\tau$ is just a multilinear map, not a tensor in this sense.
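The construction $\tau(a,\alpha) = (\alpha\circ T)(a)$ can be sketched numerically (a minimal illustration, assuming $V=\mathbb{R}^2$, $W=\mathbb{R}^3$, $K=\mathbb{R}$; the specific matrix and names are hypothetical):

```python
import numpy as np

T = np.array([[1.0, 0.0],
              [2.0, 1.0],
              [0.0, 3.0]])               # a linear map T : R^2 -> R^3

def tau(a, alpha):
    """The induced bilinear map: tau(a, alpha) = (alpha o T)(a)."""
    return alpha @ (T @ a)

a = np.array([1.0, 2.0])                 # a in V = R^2
alpha = np.array([1.0, -1.0, 0.5])       # alpha in W* = (R^3)*
# T a = [1, 4, 6], so tau(a, alpha) = 1 - 4 + 3 = 0
assert np.isclose(tau(a, alpha), 0.0)

# Bilinearity in the first slot:
b = np.array([0.5, -1.0])
assert np.isclose(tau(2*a + b, alpha), 2*tau(a, alpha) + tau(b, alpha))
```

Bilinearity in $a$ comes from linearity of $T$ and of $\alpha$; bilinearity in $\alpha$ is immediate.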

Bilinear (and multilinear) maps can be linearized. This is in fact how [Yokonuma's textbook](https://books.google.com/books?id=bDkf3W65GwC) (pp. 4–7) introduces the *tensor product*, so you can define a tensor product for $V$ and $W$ different. This approach is then extended (p. 12) to multilinear maps where all the vector spaces are different. On the other hand, when Yokonuma introduces the *tensor space* (p. 33) he defines it as a product of copies of the same vector space $V$ and its dual $V^*$. It doesn't make sense to speak of a type-$(i,j)$ tensor (space) unless it is over a single vector space. – Fizz Jan 19 '15 at 00:33

And one more terminology issue worth mentioning here is that the tensor product of two different vector spaces is sometimes called a [*tensor product space*](https://www.google.com/search?q=%22tensor+product+space%22&btnG=Search+Books&tbm=bks), but this is usually not what people refer to when they use just *tensor space*. – Fizz Jan 19 '15 at 01:15

Just to prove that there's an exception to every rule, the Handbook of Linear Algebra (2nd ed., [p. "157"](https://books.google.com/books?id=Er7MBQAAQBAJ&pg=SA15PA7)) does define an "order-*d* tensor" as an element of a tensor product of $d$ different vector spaces. Ha! – Fizz Jan 19 '15 at 01:29

And there are physics books that adopt this terminology too. E.g. *Geometry of the Fundamental Interactions* ([p. 48](https://books.google.com/books?id=wEWw_vGBDW8C&pg=PA48)) calls the objects of the tensor product space $U\otimes V$ "second-order tensors". – Fizz Jan 19 '15 at 01:44

Can the argument go both ways without assuming that $W$ is reflexive? I.e., starting with a multilinear map $\tau:V \times W^* \rightarrow K$, constructing a linear operator $T:V \rightarrow W$. Without this, we have constructed an injective linear map (an embedding), but not an isomorphism. It seems to me the natural linear map associated with a given $\tau$ would be $T:V \to (W^*)^*$. Or maybe I'm missing something..? – Nick Alger Sep 27 '16 at 22:18
To summarize as an answer what I wrote in various comments above: first, beware that authors differ in their definition of *tensor*, even when using the same approach (here, via the tensor product).
For some authors a tensor is defined only as ...
$$ T\in \underbrace{V \otimes\dots\otimes V}_{n \text{ copies}} \otimes \underbrace{V^* \otimes\dots\otimes V^*}_{m \text{ copies}}$$
With this definition, it makes sense to speak of a type-$(n,m)$ tensor.
For others, a tensor is any...
$$T\in V_1 \otimes\dots\otimes V_d$$
where $V_1, \dots, V_d$ can be different vector spaces, though all must be over the same scalar field. With this latter definition one speaks of an order-$d$ tensor. A type-$(n,m)$ tensor [in the former sense] is a tensor of order $d=n+m$ in the latter sense, but the second definition is broader, for it does not restrict us to a single vector space. In particular, a second-order tensor is an element of $V \otimes W$ where $V$ and $W$ may be two different vector spaces. Type-(1,1) tensors are tensors of second order, but the converse need not hold; indeed, the type notation doesn't even apply when the factor spaces differ. (N.B.: I've updated Wikipedia to reflect these different definitions.)
As for your 2nd question, endomorphisms (linear maps from a vector space to itself) are (isomorphic with) type-(1,1) tensors (detailed proof given here by beedge89), but if you consider homomorphisms (linear maps) between different vector spaces $V$ and $W$, i.e. $\mathrm{Hom}(V,W)$, these are isomorphic with only a certain class of order-2 tensors, namely with $V^* \otimes W$. If we let $(\phi, w)\in V^* \times W$, then the correspondence is given by $\phi \otimes w \leftrightarrow F_{\phi, w}$, where the latter is a (linear) map defined as $F_{\phi, w}(v) = \phi(v)\,w$. (Remember that covectors are themselves maps from vectors to scalars, so the formula for $F$ makes sense: it's the product of the scalar $\phi(v)$ with the vector $w$.) A detailed proof of the fact that this is an isomorphism is given in Yokonuma (pp. 18–19). Apologies for not including it here.
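The correspondence $\phi \otimes w \leftrightarrow F_{\phi, w}$ can be sketched numerically (a minimal illustration, assuming $V=\mathbb{R}^2$, $W=\mathbb{R}^3$; the specific vectors are hypothetical): in coordinates, $\phi \otimes w$ becomes the rank-1 outer-product matrix, which is exactly the matrix of $F_{\phi, w}$.

```python
import numpy as np

phi = np.array([2.0, -1.0])              # phi in V* = (R^2)*
w = np.array([1.0, 0.0, 3.0])            # w in W = R^3

def F(v):
    """F_{phi,w}: v -> phi(v) * w, a rank-1 linear map V -> W."""
    return (phi @ v) * w

M = np.outer(w, phi)                     # matrix of F in the standard bases
v = np.array([1.0, 4.0])
assert np.allclose(F(v), M @ v)          # same map, two descriptions
assert np.linalg.matrix_rank(M) == 1     # a pure tensor phi (x) w has rank 1

# A general element of V* (x) W is a sum of such rank-1 terms; in finite
# dimensions these sums fill out all of Hom(V, W).
```

This also makes the finite-rank caveat below visible: pure tensors give rank-1 maps, and finite sums of them give finite-rank maps.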
As you may expect, the result for type-(1,1) tensors also follows as a corollary of this, i.e. $\mathrm{Hom}(V,V)$ is isomorphic with $V^* \otimes V$ (and with $V \otimes V^*$ by commutativity of the tensor product, which is also understood in the sense of an isomorphism between $V \otimes W$ and $W \otimes V$ for any vector spaces $V$ and $W$).
And one important caveat here: this is an isomorphism only for finite-dimensional vector spaces. (The introduction of Yokonuma's book actually says to assume all vector spaces in the book are finite-dimensional unless stated otherwise.) If both $V$ and $W$ are infinite-dimensional, then it turns out $V^*\otimes W$ is only a proper subspace of $\mathrm{Hom}(V,W)$, namely the subspace of linear transformations of finite rank.
And to tie this in with bilinear (and in general with multilinear) maps: there's also a one-to-one correspondence between bilinear maps $f : V\times W \to U$ and *linear* maps $g : V\otimes W \to U$. (For a proof see for instance http://www.landsburg.com/algebra.pdf) That's why second-order tensors are basically said to be just bilinear maps, and in general why order-$d$ tensors are said to be just multilinear maps.
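The linearization can also be sketched numerically (a minimal illustration, assuming $V=\mathbb{R}^2$, $W=\mathbb{R}^3$, $U=\mathbb{R}$, with $v\otimes w$ realized as the Kronecker product; the specific matrix is hypothetical): a bilinear $f$ on $V\times W$ factors through a linear $g$ on $V\otimes W \cong \mathbb{R}^6$.

```python
import numpy as np

B = np.arange(6.0).reshape(2, 3)         # bilinear form: f(v, w) = v^T B w

def f(v, w):
    """A bilinear map R^2 x R^3 -> R."""
    return v @ B @ w

# The corresponding *linear* functional g on R^2 (x) R^3 ~ R^6:
g = B.reshape(-1)

v = np.array([1.0, 2.0])
w = np.array([0.0, 1.0, -1.0])
# f(v, w) = g(v (x) w), where v (x) w is the Kronecker product:
assert np.isclose(f(v, w), g @ np.kron(v, w))
```

The point is that $f$ itself is not linear on the product $V\times W$, but it becomes linear once the arguments are packaged into the single tensor $v\otimes w$.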

Why does Wikipedia say that a tensor T is a multilinear map from V ⊗ V... ⊗ V* ⊗ V*... to k (the field)? Your definition seems to be that of a tensor product space. – Yan King Yin Apr 21 '20 at 06:36