Questions tagged [eigenvalues-eigenvectors]

Eigenvalues are numbers associated with a linear operator from a vector space $V$ to itself: $\lambda$ is an eigenvalue of $T\colon V\to V$ if the map $x\mapsto \lambda x-Tx$ is not injective. An eigenvector corresponding to $\lambda$ is a non-trivial solution of $\lambda x - Tx = 0$.

A linear operator from a vector space $V$ to itself need not have any eigenvalues. That is the case, for instance, when $V=\mathbb{R}^2$ and $T(x,y)=(-y,x)$. However, if $V$ is a finite-dimensional complex vector space, then every linear map from $V$ into itself has at least one eigenvalue.
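
(A quick check of that example: in the standard basis, $T(x,y)=(-y,x)$ has matrix $\begin{pmatrix}0 & -1\\ 1 & 0\end{pmatrix}$, whose characteristic polynomial $\lambda^{2}+1$ has no real roots; over $\mathbb{C}$ the same matrix has eigenvalues $\pm i$.)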

The eigenvalues of a linear map $T$ from a finite-dimensional vector space into itself are the roots of the characteristic polynomial of $T$.

If $V$ is a vector space and $T\colon V\to V$ is a linear map, then $T$ is diagonalizable if and only if there is a basis of $V$ consisting of eigenvectors of $T$.
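
As a small worked example of the above: for
$$A=\begin{pmatrix}2 & 1\\ 1 & 2\end{pmatrix},\qquad \det(A-\lambda I)=(2-\lambda)^{2}-1=(\lambda-1)(\lambda-3),$$
so the eigenvalues are $1$ and $3$, with eigenvectors $(1,-1)$ and $(1,1)$ respectively, and $A$ is diagonalizable: $A=PDP^{-1}$ with $P=\begin{pmatrix}1 & 1\\ -1 & 1\end{pmatrix}$ and $D=\operatorname{diag}(1,3)$.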

13014 questions
338 votes · 11 answers

What is the importance of eigenvalues/eigenvectors?

Ryan
230 votes · 6 answers

Eigenvectors of real symmetric matrices are orthogonal

Can someone point me to a paper, or show here, why symmetric matrices have orthogonal eigenvectors? In particular, I'd like to see a proof that for a symmetric matrix $A$ there exists a decomposition $A = Q\Lambda Q^{-1} = Q\Lambda Q^{T}$ where…
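
(For distinct eigenvalues there is a short argument: if $Ax=\lambda x$ and $Ay=\mu y$ with $A=A^{T}$ real and $\lambda\neq\mu$, then $\lambda\langle x,y\rangle=\langle Ax,y\rangle=\langle x,Ay\rangle=\mu\langle x,y\rangle$, so $\langle x,y\rangle=0$. The full decomposition $A=Q\Lambda Q^{T}$ is the spectral theorem and takes a little more work when eigenvalues repeat.)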
187 votes · 8 answers

Proof that the trace of a matrix is the sum of its eigenvalues

I have looked extensively for a proof on the internet but all of them were too obscure. I would appreciate if someone could lay out a simple proof for this important result. Thank you.
JohnK
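
(One standard sketch, for an $n\times n$ complex matrix $A$: $\det(\lambda I-A)=\prod_i(\lambda-\lambda_i)=\lambda^{n}-\big(\sum_i\lambda_i\big)\lambda^{n-1}+\cdots$, while expanding the determinant directly gives $\lambda^{n}-(\operatorname{tr}A)\lambda^{n-1}+\cdots$; comparing the $\lambda^{n-1}$ coefficients yields $\operatorname{tr}A=\sum_i\lambda_i$, with eigenvalues counted with algebraic multiplicity.)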
187 votes · 7 answers

How to intuitively understand eigenvalue and eigenvector?

I'm learning multivariate analysis, and I studied linear algebra for two semesters as a freshman. Eigenvalues and eigenvectors are easy to calculate, and the concept is not difficult to understand. I found that there are many applications of…
164 votes · 8 answers

What is the difference between "singular value" and "eigenvalue"?

I am trying to prove some statements about singular value decomposition, but I am not sure what the difference between singular value and eigenvalue is. Is "singular value" just another name for eigenvalue?
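
(In brief: the singular values of $A$ are the nonnegative square roots of the eigenvalues of $A^{*}A$; they exist for any rectangular matrix and are always real and $\ge 0$, whereas eigenvalues are defined only for square matrices and may be negative or complex. The two coincide, for instance, when $A$ is symmetric positive semidefinite.)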
161 votes · 7 answers

Intuitively, what is the difference between Eigendecomposition and Singular Value Decomposition?

I'm trying to intuitively understand the difference between SVD and eigendecomposition. From my understanding, eigendecomposition seeks to describe a linear transformation as a sequence of three basic operations ($P^{-1}DP$) on a vector: Rotation…
user541686
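
(A small numerical illustration of the difference, using NumPy; the matrix below is just an arbitrary example:)

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [0.0, 2.0]])   # arbitrary non-symmetric example

# Eigendecomposition: A = V @ diag(w) @ inv(V). V need not be orthogonal,
# w may be complex or repeated, and the factorization only exists for
# diagonalizable square matrices.
w, V = np.linalg.eig(A)

# SVD: A = U @ diag(s) @ Vt with U, Vt orthogonal and s real, nonnegative,
# sorted in decreasing order. It exists for every matrix, square or not.
U, s, Vt = np.linalg.svd(A)

print("eigenvalues:    ", w)   # here 3 and 2
print("singular values:", s)   # generally different from the eigenvalues
print(np.allclose(A, V @ np.diag(w) @ np.linalg.inv(V)),
      np.allclose(A, U @ np.diag(s) @ Vt))
```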
141 votes · 3 answers

Why is the eigenvector of a covariance matrix equal to a principal component?

If I have a covariance matrix for a data set and I multiply it by one of its eigenvectors, say the eigenvector with the highest eigenvalue, the result is the eigenvector or a scaled version of the eigenvector. What does this really…
Ryan
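
(A quick NumPy sketch of that relationship; the data here are random and purely illustrative:)

```python
import numpy as np

rng = np.random.default_rng(0)
# 500 samples of correlated 2-d data (illustrative only)
X = rng.normal(size=(500, 2)) @ np.array([[2.0, 0.0],
                                          [1.2, 0.5]])

C = np.cov(X, rowvar=False)      # 2x2 covariance matrix
w, V = np.linalg.eigh(C)         # eigh: real eigenvalues, ascending order
top = V[:, -1]                   # eigenvector with the largest eigenvalue

# Multiplying C by its top eigenvector just rescales it by the eigenvalue...
print(np.allclose(C @ top, w[-1] * top))

# ...and that direction carries the maximal variance of the centered data,
# i.e. it is the first principal component direction.
Xc = X - X.mean(axis=0)
print((Xc @ top).var(ddof=1), "vs eigenvalue", w[-1])
```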
135 votes · 8 answers

Show that the determinant of $A$ is equal to the product of its eigenvalues

Show that the determinant of a matrix $A$ is equal to the product of its eigenvalues $\lambda_i$. So I'm having a tough time figuring this one out. I know that I have to work with the characteristic polynomial of the matrix $\det(A-\lambda I)$.…
onimoni
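
(One common route: over $\mathbb{C}$ the characteristic polynomial factors as $\det(A-\lambda I)=(\lambda_1-\lambda)\cdots(\lambda_n-\lambda)$, so setting $\lambda=0$ gives $\det(A)=\lambda_1\cdots\lambda_n$, with eigenvalues counted with algebraic multiplicity.)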
107 votes · 8 answers

How to prove that eigenvectors from different eigenvalues are linearly independent

How can I prove that if I have $n$ eigenvectors from different eigenvalues, they are all linearly independent?
Corey L.
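
(One common argument: a single eigenvector is nonzero, so a shortest nontrivial relation $c_1v_1+\cdots+c_kv_k=0$ must involve at least two nonzero coefficients, say $c_1\neq0$. Applying $T$ and subtracting $\lambda_1$ times the relation gives $\sum_{i\ge2}c_i(\lambda_i-\lambda_1)v_i=0$, which is still nontrivial because the $\lambda_i$ are distinct, yet strictly shorter: a contradiction. An induction on the number of vectors works just as well.)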
81 votes · 4 answers

Are the eigenvalues of $AB$ equal to the eigenvalues of $BA$?

First of all, am I being crazy in thinking that if $\lambda$ is an eigenvalue of $AB$, where $A$ and $B$ are both $N \times N$ matrices (not necessarily invertible), then $\lambda$ is also an eigenvalue of $BA$? If it's not true, then under what…
dantswain
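
(A standard observation: if $ABx=\lambda x$ with $\lambda\neq0$, then $y=Bx\neq0$ and $BAy=\lambda y$, so every nonzero eigenvalue of $AB$ is an eigenvalue of $BA$ and vice versa; for $\lambda=0$, $\det(AB)=\det(A)\det(B)=\det(BA)$ shows $AB$ is singular exactly when $BA$ is. In fact $AB$ and $BA$ have the same characteristic polynomial.)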
79 votes · 7 answers

What exactly are eigen-things?

Wikipedia defines an eigenvector like this: An eigenvector of a square matrix is a non-zero vector that, when multiplied by the matrix, yields a vector that differs from the original vector at most by a multiplicative scalar. So basically in layman…
77 votes · 4 answers

A simple explanation of eigenvectors and eigenvalues with 'big picture' ideas of why on earth they matter

A number of areas I'm studying in my degree (not a maths degree) involve eigenvalues and eigenvectors, which have never been properly explained to me. I find it very difficult to understand the explanations given in textbooks and lectures. Does…
robintw
74 votes · 6 answers

Geometric interpretation for complex eigenvectors of a 2×2 rotation matrix

The rotation matrix $$\pmatrix{ \cos \theta & \sin \theta \\ -\sin \theta & \cos \theta}$$ has complex eigenvalues $\{e^{\pm i\theta}\}$ corresponding to eigenvectors $\pmatrix{1 \\i}$ and $\pmatrix{1 \\ -i}$. The real eigenvector of a 3d rotation…
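
(The claim is quick to verify: with $M=\pmatrix{ \cos \theta & \sin \theta \\ -\sin \theta & \cos \theta}$, $$M\pmatrix{1 \\ i}=\pmatrix{\cos\theta+i\sin\theta \\ -\sin\theta+i\cos\theta}=e^{i\theta}\pmatrix{1 \\ i},$$ and taking complex conjugates gives the other eigenpair.)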
74 votes · 1 answer

What is the relation between rank of a matrix, its eigenvalues and eigenvectors

I am quite confused about this. I know that a zero eigenvalue means the null space has nonzero dimension, and that the rank of the matrix is then less than the dimension of the whole space. But is the number of distinct eigenvalues (and thus of independent eigenvectors) the rank of…
Shifu
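
(A quick observation relevant here: for a diagonalizable matrix the rank equals the number of nonzero eigenvalues counted with multiplicity, but not in general. For example $\pmatrix{0 & 1 \\ 0 & 0}$ has rank $1$ while its only eigenvalue is $0$, and the rank says nothing about how many eigenvalues are distinct.)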
67 votes · 11 answers

What is the largest eigenvalue of the following matrix?

Find the largest eigenvalue of the following matrix $$\begin{bmatrix} 1 & 4 & 16\\ 4 & 16 & 1\\ 16 & 1 & 4 \end{bmatrix}$$ This matrix is symmetric and, thus, the eigenvalues are real. I solved for the possible eigenvalues and,…
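
(A numerical sanity check with NumPy, in case it helps; note that the row sums of this matrix are all $21$, so $(1,1,1)$ is an eigenvector with eigenvalue $21$:)

```python
import numpy as np

A = np.array([[ 1,  4, 16],
              [ 4, 16,  1],
              [16,  1,  4]], dtype=float)

# The matrix is symmetric, so eigvalsh applies and returns the
# (real) eigenvalues in ascending order.
print(np.linalg.eigvalsh(A))

# Constant row sums: (1, 1, 1) is an eigenvector with eigenvalue 21.
print(A @ np.ones(3))
```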