What is an eigenspace? No video or anything out there really explains what an eigenspace is. From what I have understood, it is just a direction. But why do we need it?

The following questions have been bugging me for quite a while, and I can't find a real straightforward answer to them. Hopefully one of you can help me.

  1. What is an eigenspace?
  2. Why are the eigenvectors calculated via the diagonal form of a matrix?
  3. What is the practical use of an eigenspace? What does it do, or what is it used for, other than diagonalizing a matrix?
  4. Why is it important to calculate the diagonal form of a matrix?

I want to know mainly because I just passed a linear algebra course and I have no idea what an eigenspace is. That is embarrassing for me and for my professor, because he didn't explain what they were; he just basically said: "This is how you calculate it, and if you want to know more, read about it in the book."

  • Have you looked at https://en.wikipedia.org/wiki/Eigenvalues_and_eigenvectors and http://math.stackexchange.com/q/243533/856 ? –  Mar 23 '14 at 20:12
  • 'How you calculate' is what an eigenspace is. – Git Gud Mar 23 '14 at 20:12

5 Answers

  1. The eigenspace is the space generated by the eigenvectors corresponding to the same eigenvalue - that is, the space of all vectors that can be written as a linear combination of those eigenvectors.

  2. The diagonal form makes the eigenvalues easy to recognize: they are the numbers on the diagonal. And in that basis the eigenvectors take a comfortable standard-basis form, with one coordinate equal to one and all the others zero.

  3. All of quantum mechanics is full of this. Namely, particles can be seen as vectors (things are far more complicated, but that's essentially how it works), and vectors belonging to the same eigenspace have the same energy - an eigenspace of the "energy matrix", loosely speaking. This is quite approximate, but it gives the idea.

  4. Because, for instance, it often matters to compute the exponential of a matrix, which you do via its power series. Now, a power of a diagonal matrix is trivially computed (just take powers of the diagonal entries), while a power of a non-diagonal matrix is, frankly, hell. Furthermore, as I said before, the diagonal form makes the eigenvalues and their multiplicities immediately visible.
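Point 4 can be sketched numerically. A minimal numpy sketch (the matrix is a made-up example): diagonalize $A=PDP^{-1}$, exponentiate only the diagonal, and compare with a truncated power series.

```python
import math

import numpy as np

# Hypothetical symmetric (hence diagonalizable) matrix.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Eigendecomposition A = P D P^{-1}; columns of P are eigenvectors.
eigvals, P = np.linalg.eig(A)

# exp(A) = P exp(D) P^{-1}, and exp(D) just exponentiates the diagonal.
expA = P @ np.diag(np.exp(eigvals)) @ np.linalg.inv(P)

# Sanity check against the truncated power series sum_{n=0}^{20} A^n / n!.
series = sum(np.linalg.matrix_power(A, n) / math.factorial(n)
             for n in range(21))
assert np.allclose(expA, series)
```

The point is that the power series, which needs many matrix products, collapses to a scalar exponential on each diagonal entry once the eigenbasis is known.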


Consider $\lambda$ to be an eigenvalue of a linear transformation $T:V\to V$; then the eigenspace corresponding to $\lambda$ is $E=\{x\in V \mid T(x)=\lambda x\}$. First of all, it is a subspace. Secondly, it is a special type of subspace: it consists exactly of those vectors that $T$ simply rescales by the factor $\lambda$.
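Concretely, $E$ is the null space of $T-\lambda I$, so its dimension drops out of a rank computation. A minimal numpy sketch (the matrix is a hypothetical example):

```python
import numpy as np

# Hypothetical transformation with eigenvalue 3 of multiplicity 2.
T = np.array([[3.0, 0.0],
              [0.0, 3.0]])
lam = 3.0

# E = ker(T - lam*I); dim E = n - rank(T - lam*I).
M = T - lam * np.eye(2)
dim_E = 2 - np.linalg.matrix_rank(M)
assert dim_E == 2  # every nonzero vector is an eigenvector here

# For a number that is not an eigenvalue, the kernel is trivial.
assert 2 - np.linalg.matrix_rank(T - 1.0 * np.eye(2)) == 0
```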

Eigenvectors appear in diagonalization because, when we have an eigenbasis for $V$ and represent its vectors as linear combinations of the standard basis, the change-of-basis matrix has the eigenvectors as its columns.

Suppose you have a matrix $A$ and need to calculate $A^{k}$. It's not always easy to do this directly, but if $A\sim D$ where $D$ is a diagonal matrix, then $A=PDP^{-1}$ and $A^{k}=PD^{k}P^{-1}$, which is very easy to calculate.
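The $A^{k}=PD^{k}P^{-1}$ trick can be checked in a few lines of numpy (the matrix is a made-up example):

```python
import numpy as np

# Hypothetical diagonalizable matrix (eigenvalues 5 and 2).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, P = np.linalg.eig(A)

k = 10
# D^k is computed entrywise on the diagonal, so A^k = P D^k P^{-1}
# costs one eigendecomposition instead of k-1 matrix products.
Ak = P @ np.diag(eigvals ** k) @ np.linalg.inv(P)

assert np.allclose(Ak, np.linalg.matrix_power(A, k))
```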

Nagabhushan S N

To make a long story short: Given a linear map $L:\>V\to V$ and any complex number $\lambda$, one considers the set $E_\lambda\subset V$ of all vectors $x\in V$ that are just stretched by the factor $\lambda$ under $L$: $$E_\lambda:=\{x\in V\>|\> Lx=\lambda x\}\ .$$ This set is obviously a subspace, but (in a finite-dimensional setting) it is interesting, that is to say of positive dimension, only for a handful of $\lambda$-values. These special $\lambda$s are called eigenvalues of the given $L$ and together form the set ${\rm spec} (L)\subset{\mathbb C}$.

It turns out that (a) the nontrivial $E_\lambda$ are all-important in the understanding of the map $L:\>V\to V$ and that (b) working with vectors $e_\lambda\in E_\lambda$, $\>\lambda\in{\rm spec} (L)$, as basis vectors greatly simplifies computations involving $L$.

An example: When $L$ is the differentiation operator ${d\over dt}$ on functions $f:\>{\mathbb R}\to{\mathbb C}$ it pays to use the functions $e_\lambda:\ t\mapsto e^{\lambda t}$ as "basis functions", because ${d\over dt}e_\lambda =\lambda\>e_\lambda$.
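The eigenfunction property ${d\over dt}e_\lambda=\lambda\,e_\lambda$ can be spot-checked numerically with a central finite difference; the values of $\lambda$ and $t$ below are arbitrary:

```python
import numpy as np

# Check d/dt e^{lam*t} = lam * e^{lam*t} at one sample point.
lam = 0.7
f = lambda t: np.exp(lam * t)

t, h = 1.3, 1e-6
deriv = (f(t + h) - f(t - h)) / (2 * h)  # central difference

assert abs(deriv - lam * f(t)) < 1e-6
```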

Christian Blatter

Why eigenspaces/vectors?
It is not just about ease of calculations. If you want the solutions of a system (in physics, say), finding the eigenspaces not only shows you what every solution looks like, it also reveals the simple nature and form of what appears to be a "black box". You gain insight, and as a bonus you calculate with ease.

To make a short story long, you can consider this classic: http://www.math24.net/mass-spring-system.html


Let $V$ be a vector space and let $L:V\to V$ be a linear map. If $v\in V$ and $\lambda\in\mathbb{R}$ are such that $L(v)=\lambda v$, then we call $\lambda$ an eigenvalue of $L$ and $v$ an eigenvector of $L$ (corresponding to $\lambda$).

So let $\lambda$ be an eigenvalue of $L$. Note that in general there are many eigenvectors corresponding to $\lambda$; for example, if $v$ is an eigenvector corresponding to $\lambda$ then so is $\alpha v$ for any nonzero $\alpha\in\mathbb{R}$. Moreover, define \begin{align*} L_\lambda&=\{w\in V:\text{$w$ is an eigenvector of $L$ corresponding to $\lambda$}\}\\ &=\{w\in V:L(w)=\lambda w\}. \end{align*} It is rather straightforward to check from the definitions that $L_\lambda$ is a vector subspace of $V$. We call it the eigenspace of $\lambda$.

Now let $n$ be the dimension of $V$, and let $\langle\cdot,\cdot\rangle$ be an inner product on $V$. An important theorem (called the spectral theorem) states that if $L$ is a symmetric map with respect to $\langle\cdot,\cdot\rangle$ then there is an orthonormal basis $v_1,\ldots,v_n$ of $V$, with each $v_1,\ldots,v_n$ an eigenvector of $L$. Now if you write the matrix of $L$ in terms of this basis, the fact that they are all eigenvectors is exactly the statement that this matrix is diagonal.
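The spectral theorem can be seen in action with numpy's `eigh`, which handles symmetric matrices; the matrix below is a made-up example:

```python
import numpy as np

# Hypothetical symmetric matrix.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# For symmetric input, eigh returns an orthonormal eigenbasis.
eigvals, Q = np.linalg.eigh(A)

# The basis is orthonormal: Q^T Q = I ...
assert np.allclose(Q.T @ Q, np.eye(2))

# ... and in that basis the map is diagonal: Q^T A Q = diag(eigvals).
assert np.allclose(Q.T @ A @ Q, np.diag(eigvals))
```

The second assertion is exactly the statement in the answer: writing $L$ in the eigenbasis $v_1,\ldots,v_n$ yields a diagonal matrix.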
