Wikipedia defines an eigenvector like this:

An eigenvector of a square matrix is a non-zero vector that, when multiplied by the matrix, yields a vector that differs from the original vector at most by a multiplicative scalar.

So basically, in layman's terms: an eigenvector is a vector that, when you multiply it by a square matrix, gives you back the same vector, possibly multiplied by a scalar.

There are a lot of terms which are related to this like eigenspaces and eigenvalues and eigenbases and such, which I don't quite understand, in fact, I don't understand at all.

Can someone give an explanation connecting these terms, so that it is clear what they are and why they are related?

Rodrigo de Azevedo
  • Are you confused by their definitions, or do you want to see what their applications are? – Herng Yi Feb 11 '13 at 10:36
  • @HerngYi Their applications are also very interesting, but I think I need to grasp their definitions first. I find that wikipedia obfuscates their definitions a lot which makes it unclear to me. So the former – Eigeneverything Feb 11 '13 at 10:39
  • Salman Khan should do a screencast on eigen vectors. – Chloe Feb 12 '13 at 01:29
  • Few links: http://blog.stata.com/2011/03/09/understanding-matrices-intuitively-part-2/ http://mathoverflow.net/questions/31838/intuitions-connections-examples-for-eigen And there is a really interesting gif on that wiki site, if you are a visual type. – ante.ceperic Feb 13 '13 at 09:53
  • @Eigeneverything You made me laugh when I read what you wrote: "There are a lot of terms which are related to this like **eigenspaces** and **eigenvalues** and **eigenbases** and such, which I don't quite understand, in fact, I don't understand at all" and looking at your username I found another term *Eigeneverything*. Thank you and I hope all is well now with these eigen.... :) –  Jul 25 '13 at 14:22

7 Answers


Eigenvectors are those vectors that exhibit especially simple behaviour under a linear transformation: Loosely speaking, they don't bend and rotate, they simply grow (or shrink) in length (though a different interpretation of growth/shrinkage may apply if the ground field is not $\mathbb R$). If it is possible to express any other vector as a linear combination of eigenvectors (preferably if you can in fact find a whole basis made of eigenvectors) then applying the - otherwise complicated - linear transformation suddenly becomes easy because with respect to a basis of eigenvectors the linear transformation is given simply by a diagonal matrix.

Especially when one wants to investigate higher powers of a linear transformation, this is practically only possible for eigenvectors: If $Av=\lambda v$, then $A^nv=\lambda^nv$, and even exponentials become easy for eigenvectors: $\exp(A)v:=\sum\frac1{n!}A^n v=e^\lambda v$. By the way, the exponential functions $x\mapsto e^{cx}$ are eigenvectors of a famous linear transformation: differentiation, i.e. mapping a function $f$ to its derivative $f'$. That's precisely why exponentials play an important role as base solutions for linear differential equations (or even their discrete counterpart, linear recurrences like the Fibonacci numbers).
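To see the power identity in action, here is a small pure-Python sketch; the matrix $A$ and eigenvector $v$ below are my own toy example, not from the answer. Since $Av = 3v$, applying $A$ five times should give $3^5 v = 243\,v$:

```python
# Toy example: A = [[2, 1], [1, 2]] has eigenvector v = (1, 1)
# with eigenvalue 3, so A^n v should equal 3^n v.

def matvec(A, v):
    """Multiply a 2x2 matrix by a 2-vector."""
    return [A[0][0]*v[0] + A[0][1]*v[1],
            A[1][0]*v[0] + A[1][1]*v[1]]

A = [[2, 1], [1, 2]]
v = [1, 1]          # eigenvector with eigenvalue 3

w = v
for _ in range(5):  # apply A five times: w = A^5 v
    w = matvec(A, w)

print(w)            # [243, 243] == 3**5 * v
```

Running the same loop on a vector that is *not* an eigenvector mixes directions at every step, which is exactly why eigenvectors make powers easy.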

All other terminology is based on this notion: A (nonzero) eigenvector $v$ such that $Av$ is a multiple of $v$ determines its eigenvalue $\lambda$ as the scalar factor such that $Av=\lambda v$. Given an eigenvalue $\lambda$, the set of eigenvectors with that eigenvalue is in fact a subspace (i.e. sums and multiples of eigenvectors with the same(!) eigenvalue are again eigen), called the eigenspace for $\lambda$. If we find a basis consisting of eigenvectors, then we may obviously call it an eigenbasis. If the vectors of our vector space are not mere number tuples (such as in $\mathbb R^3$) but are functions and our linear transformation is an operator (such as differentiation), it is often convenient to call the eigenvectors eigenfunctions instead; for example, $x\mapsto e^{3x}$ is an eigenfunction of the differentiation operator with eigenvalue $3$ (because its derivative is $x\mapsto 3e^{3x}$).
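One can even check the eigenfunction claim numerically; this little sketch (my own illustration, not part of the answer) approximates the derivative of $x\mapsto e^{3x}$ with a central difference and confirms that the ratio $f'(x)/f(x)$ is $3$ at every sample point:

```python
import math

# Numerically check that f(x) = e^{3x} behaves as an "eigenfunction" of
# differentiation with eigenvalue 3: f'(x) ≈ 3 * f(x) at every x.

def f(x):
    return math.exp(3 * x)

def derivative(g, x, h=1e-6):
    # central finite difference, an approximation to g'(x)
    return (g(x + h) - g(x - h)) / (2 * h)

for x in [0.0, 0.5, 1.0]:
    ratio = derivative(f, x) / f(x)
    print(round(ratio, 3))   # 3.0 at every sample point
```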

Hagen von Eitzen
  • +1 If I'd realised you were writing this, I wouldn't have bothered with my own answer! – Tara B Feb 11 '13 at 11:02
  • Thank you, this is what I needed. I can see now how eigen-things can be extremely important pretty much everywhere – Eigeneverything Feb 11 '13 at 11:03
  • Just a possible typo, but did you leave out the word "behaviour" in the first sentence? "Eigenvectors are those vectors that exhibit especially simple [behaviour] under..." – Ricardo Altamirano Feb 11 '13 at 14:13
  • @Hagen, "The don't bend and and rotate, they simply grow (or shrink) in length" - if you meant to write that they don't bend or rotate that's incorrect: the eigenspace may be complex in which case there is a rotation involved. – alancalvitti Feb 11 '13 at 15:11
  • This is the best introduction into eigen-things I have ever read. – Alexander Feb 11 '13 at 16:09
  • It's now probably worth updating Wikipedia with this information, and a link to this source. – Yann Feb 11 '13 at 19:52
  • @alancalvitti: Surely rotation is a notion we only tend to imagine properly in real vector spaces...for example multiplication of a complex number by $i$ is rotation by $90$ degrees...but only when viewed in $\mathbb{R}^2$ as a REAL vector space. – fretty Feb 13 '13 at 16:25
  • @fretty, $\mathbb{R}^2$ ~ $\mathbb{C}$. Rotation by 90 degrees equals the action $e^{i \pi / 2} : \mathbb{C} \to \mathbb{C}$ – alancalvitti Feb 13 '13 at 17:47
  • Yes, but the elements of these two spaces are different things...rotations here are thought of as transformations from one point in $\mathbb{R}^2$ to another. We have no intuition for rotating numbers! – fretty Feb 13 '13 at 17:49
  • It matters also which field you are working over, as I say intuitively speaking rotations are things on real vector spaces...yes by isomorphisms we can view some of them as transformations on complex spaces but clearly here Hagen is viewing them as on real spaces. – fretty Feb 13 '13 at 17:51
  • What do you mean by "two different things"? They are isomorphic with respect to such transformations. – alancalvitti Feb 13 '13 at 17:51
  • It's the *same* thing: in the real vector space setting, one vector is mapped to another at right angles, and in the complex number field, one number is mapped to another at right angles. What's the difference? – alancalvitti Feb 13 '13 at 17:52
  • Isomorphic doesn't mean "equal", it means "the same upto relabelling". Elements of $\mathbb{R}^2$ are points in the plane, on which we can visualise rotations. Elements of $\mathbb{C}$ are complex numbers. Yes we may define the corresponding "rotation of a complex number" by what happens to the corresponding point on $\mathbb{R}^2$ but intuitively we cannot visualise rotation of a number without having a geometrical picture! – fretty Feb 13 '13 at 17:54
  • Rotations are geometric things. The $\mathbb{R}$-vector space $\mathbb{C}$ is not a geometric thing...not until you view it as $\mathbb{R}^2$, but that then requires the isomorphism to exist! – fretty Feb 13 '13 at 17:58
  • Why do you think they call it the complex *plane*? There's plenty of material on the geometry of complex numbers, eg: Needham's http://www.amazon.com/Visual-Complex-Analysis-Tristan-Needham/dp/0198534469, http://www.ams.org/bookstore/pspdf/amstext-12-prev.pdf, http://www.ias.ac.in/resonance/January2008/p35-53.pdf – alancalvitti Feb 13 '13 at 17:58
  • Yes, it is the complex plane, but in order to view it as a plane you need to choose an isomorphism with $\mathbb{R}^2$... – fretty Feb 13 '13 at 18:00
  • My point is that clearly Hagen was talking about nice places – fretty Feb 13 '13 at 18:01
  • It's understood that there are many such isomorphisms, of course (So by Barry Mazur's definition of equality, they are not equal) but the action, and spectrum - of linear transformations is natural - it doesn't matter what the isomorphism is (see Roman's *Lattices and Ordered Sets*) – alancalvitti Feb 13 '13 at 18:01
  • @fretty, bottom line: a real-valued matrix $\mathbb{R}^n \to \mathbb{R}^n$, even $\mathbb{R}^2 \to \mathbb{R}^2$ may have a complex eigenstructure. – alancalvitti Feb 13 '13 at 18:03
  • Yes of course...but the explanation given above was not meant to be completely rigorous! My point is that the intuitive notion of "rotation" is an action on points in a real vector space. So Hagen is technically correct in what was being said. Think of it this way, if you didnt know that $\mathbb{C}$ was isomorphic to $\mathbb{R}^2$ how could you make sense of $1+i$ rotated by $40$ degrees say? It only makes sense intuitively once you view the complex numbers AS a plane! Anyway, this is getting out of hand. – fretty Feb 13 '13 at 18:05
  • Also: what is the polynomial $31+25x$ rotated by $60$ degrees? You can only really make sense out of this after using an isomorphism of the space of linear polynomials over $\mathbb{R}$ with $\mathbb{R}^2$...after all polynomials never WERE things that were meant to be rotated eh? – fretty Feb 13 '13 at 18:09

As far as I understand it, the 'eigen' in words like eigenvalue, eigenvector etc. means something like 'own', or a better translation in English would perhaps be 'characteristic'.

Each square matrix has some special scalars and vectors associated with it. The eigenvectors are the vectors which the matrix preserves (up to scalar multiplication). As you probably know, an $n\times n$ matrix acts as a linear transformation on an $n$-dimensional space, say $F^n$. A vector and its scalar multiples form a line through the origin in $F^n$, and so you can think of the eigenvectors as indicating lines through the origin preserved by the linear transformation corresponding to the matrix.

Defn Let $A$ be an $n\times n$ matrix over a field $F$. A nonzero vector $v\in F^n$ is an eigenvector of $A$ if $Av = \lambda v$ for some $\lambda\in F$. A scalar $\lambda\in F$ is an eigenvalue of $A$ if $Av = \lambda v$ for some nonzero $v\in F^n$.

The eigenvalues are then the factors by which these special lines through the origin are either stretched or contracted.
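For concreteness, here is how those stretch factors can be computed for a $2\times 2$ matrix: the eigenvalues are the roots of the characteristic polynomial $\lambda^2 - \operatorname{tr}(A)\,\lambda + \det(A) = 0$. The helper and the sample matrices below are just my own illustration, not from the answer:

```python
import math

# Eigenvalues of a 2x2 matrix via the quadratic formula applied to
# λ² - tr(A)·λ + det(A) = 0.  Real eigenvalues only, for simplicity.

def eigenvalues_2x2(A):
    tr = A[0][0] + A[1][1]                      # trace
    det = A[0][0]*A[1][1] - A[0][1]*A[1][0]     # determinant
    disc = tr*tr - 4*det                        # discriminant
    if disc < 0:
        raise ValueError("complex eigenvalues; the field F matters!")
    r = math.sqrt(disc)
    return ((tr + r) / 2, (tr - r) / 2)

print(eigenvalues_2x2([[2, 0], [0, 3]]))   # (3.0, 2.0)
print(eigenvalues_2x2([[0, 1], [1, 0]]))   # (1.0, -1.0)
```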

Tara B

The term "eigen" comes from German and means "proper". It means that the eigenvectors and eigenvalues have a special status with respect to the operator/matrix you are studying. In this case, it is that they are essentially invariant with respect to applications of the operator/matrix. After application you get back the same eigenvector, multiplied by a factor; that factor is the eigenvalue.

Since some people doubt that "proper" is an appropriate translation in this context, I'll have you know that "proper vector" is sometimes used as a synonym for "eigenvector". Also see this and this.

So, I think the meaning here is really that of having a special or peculiar status.

  • Actually, "proper" rather means "eigentlich", I think, whereas "eigen" is more or less "own". At least one should remember the trivia that there is no mathematician named Eigen (as opposed to the *Killing form*, which *is* named after a W. Killing and not because it kills vectors) – Hagen von Eitzen Feb 11 '13 at 10:44
  • @HagenvonEitzen Indeed, the dutch word is also eigen. – Eigeneverything Feb 11 '13 at 10:45
  • 'proper' is listed as one of the possible meanings of 'eigen' in one dictionary I looked at, but I don't think that is the appropriate meaning in this context. – Tara B Feb 11 '13 at 11:03
  • Proper vector is actually an official synonym for eigenvector. – Raskolnikov Feb 11 '13 at 11:21
  • 'eigen' means 'proper' only insofar as 'proper' means 'for oneself', as in 'proprietary' or French *propre*. Mostly *eigen* means self-oriented. – AlexChaffee Feb 11 '13 at 19:10
  • FWIW, I've seen a number of old British linear algebra papers interchangeably use "characteristic value/vector" and "proper value/vector". – J. M. ain't a mathematician Apr 12 '13 at 11:41

Matrices can be viewed as linear maps on vector spaces...in fact if working over a nice field such as the field of real numbers, matrices give geometrical transformations.

For example the reflection in the line $y=x$ in $\mathbb{R}^2$ can be simulated by multiplication by the matrix:

$\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}$

Now it is clear that geometrically there are certain symmetries here. For example if you choose any point on the line $y=x$ it gets sent to itself and if you choose any point on the line $y=-x$ then it gets sent to a point in the complete opposite direction from the origin.

This information is essentially what the eigenvalues and eigenvectors of the above matrix capture. The eigenvectors are the vectors on those two lines, and the eigenvalue of each is the scalar you multiply it by after applying the reflection.

Things on the line $y=x$ got sent to themselves, i.e. $Av = v$, a scalar multiple of $1$.

Things on the line $y=-x$ got sent to the negative of themselves, i.e. $Av = -v$, a scalar multiple of $-1$.

Thus we expect two eigenvalues $\pm 1$ and two "eigenspaces", $V_1, V_{-1}$, consisting of all vectors with eigenvalues $1$ and $-1$ respectively.

These spaces are exactly the vectors lying on the lines $y=x$ and $y=-x$ respectively.

Of course there are ways to work out these things using only the matrix but hopefully you can see a bit of significance to them now. They come in useful in many areas of maths.
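You can confirm the reflection example directly; this short pure-Python check (my own addition) applies the matrix above to a point on each of the two special lines:

```python
# A = [[0, 1], [1, 0]] is the reflection in the line y = x.  Points on
# y = x are sent to themselves (eigenvalue 1); points on y = -x are
# sent to their negatives (eigenvalue -1).

def matvec(A, v):
    """Multiply a 2x2 matrix by a 2-vector."""
    return [A[0][0]*v[0] + A[0][1]*v[1],
            A[1][0]*v[0] + A[1][1]*v[1]]

A = [[0, 1], [1, 0]]

print(matvec(A, [2, 2]))    # [2, 2]   -> eigenvalue  1
print(matvec(A, [2, -2]))   # [-2, 2]  -> eigenvalue -1
```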


Think of the eigenvalue $\lambda$ as a magnification factor: If $|\lambda | < 1,$ you have a contraction. If $|\lambda | > 1,$ you have a dilation. If $|\lambda | = 1,$ then the transformation preserves the length of the vector, possibly reversing its direction.
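A tiny numeric sketch of that (my own illustration): along an eigenvector, $n$ applications of the matrix scale the length by $|\lambda|^n$:

```python
def length_after(lam, n, start=1.0):
    """Length of a vector along an eigenvector with eigenvalue lam
    after n applications of the matrix."""
    length = start
    for _ in range(n):
        length *= abs(lam)
    return length

print(length_after(0.5, 4))    # 0.0625 -- |λ| < 1: contraction
print(length_after(-2.0, 4))   # 16.0   -- |λ| > 1: dilation (direction also flips each step)
print(length_after(1.0, 4))    # 1.0    -- |λ| = 1: length preserved
```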


Given a square matrix $A$, an eigenvalue of $A$ is any scalar $\lambda$ such that $Av = \lambda v$ for some nonzero eigenvector $v$.

The eigenspace of an eigenvalue is the vector space spanned by all eigenvectors for which this particular scalar occurs.
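As a quick sanity check of the "spanned" part (the matrix and vectors are my own toy example, not from the answer): sums of eigenvectors with the same eigenvalue are again eigenvectors.

```python
def matvec(A, v):
    """Multiply a 2x2 matrix by a 2-vector."""
    return [A[0][0]*v[0] + A[0][1]*v[1],
            A[1][0]*v[0] + A[1][1]*v[1]]

A = [[4, 0], [0, 7]]
u = [1, 0]
w = [2, 0]                       # both eigenvectors of A for λ = 4
s = [u[0] + w[0], u[1] + w[1]]   # their sum, s = (3, 0)

print(matvec(A, s))              # [12, 0] == 4 * s: still eigen, same eigenvalue
```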

Julian Kuelshammer

If you are interested in a "physicist's" way of thinking about it, then Quantum Mechanics (QM) is probably the best way to understand the meaning of eigenvalues.

As you probably know, in QM we work with operators, e.g. the momentum operator, the energy operator, etc. Those operators can be expressed in terms of matrices. Now, if we want to find, say, all the possible energies that our system admits, we have to find the eigenvalues of its energy matrix! Furthermore, those operators are self-adjoint, which means that their eigenvalues are always real. So the physical point of view makes sense.

For example, the energy levels of a simple harmonic oscillator are quantized. One can see that by looking at the eigenvalues of its energy matrix (called the Hamiltonian of the system), which are discrete; the corresponding energies are found to be

$E_{n}=\hbar\omega \left(n+\frac{1}{2}\right)$
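Evaluating that formula for the first few levels (in units where $\hbar\omega = 1$; this is nothing beyond the formula above) shows the equal spacing of the spectrum:

```python
# Harmonic oscillator energies E_n = ħω(n + 1/2), with ħω = 1.
hbar_omega = 1.0
energies = [hbar_omega * (n + 0.5) for n in range(4)]
print(energies)   # [0.5, 1.5, 2.5, 3.5] -- evenly spaced by ħω
```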

Hope this helps!
