A Hermitian matrix always has real eigenvalues and admits an orthogonal basis of eigenvectors, which may have complex entries. A real symmetric matrix is a special case of a Hermitian matrix, so it too has real eigenvalues and orthogonal eigenvectors, but could it ever have complex eigenvectors?

My intuition is that the eigenvectors are always real, but I can't quite nail it down.

Rodrigo de Azevedo
Phil H
    For a real symmetric matrix, you can find a basis of orthogonal real eigenvectors. But you can nonetheless find complex eigenvectors (by taking complex linear combinations). – Joel Cohen Jun 26 '11 at 11:04
    @Joel, I do not believe that linear combinations of eigenvectors are eigenvectors, as they span the entire space. Even if you combine two eigenvectors $\mathbf v_1$ and $\mathbf v_2$ with corresponding eigenvalues $\lambda_1$ and $\lambda_2$ as $\mathbf v_c = \mathbf v_1 + i\mathbf v_2$, $\mathbf A \mathbf v_c$ yields $\lambda_1\mathbf v_1 + i\lambda_2\mathbf v_2$, which is clearly not an eigenvector unless $\lambda_1 = \lambda_2$. – Tpofofn Jun 26 '11 at 11:42
    @Tpofofn : You're right, I should have written "linear combination of eigenvectors for the **same eigenvalue**" or "linear combination of eigenvectors **that are still eigenvectors**". Since any vector is a linear combination of eigenvectors (because you can find a basis of eigenvectors), I couldn't possibly mean all linear combinations but only some of them (e.g. $\lambda v$ where $v$ is an eigenvector). – Joel Cohen Jun 26 '11 at 12:01
  • Related: https://math.stackexchange.com/q/354115 – Rodrigo de Azevedo Aug 08 '20 at 13:39

5 Answers


Always try out examples, starting with the simplest possible ones (it may take some thought as to which examples are the simplest). Does, for instance, the identity matrix have complex eigenvectors? This is pretty easy to answer, right?

Now for the general case: if $A$ is any real matrix with real eigenvalue $\lambda$, then we have a choice of looking for real eigenvectors or complex eigenvectors. The theorem here is that the $\mathbb{R}$-dimension of the space of real eigenvectors for $\lambda$ is equal to the $\mathbb{C}$-dimension of the space of complex eigenvectors for $\lambda$. It follows that (i) we will always have non-real eigenvectors (this is easy: if $v$ is a real eigenvector, then $iv$ is a non-real eigenvector) and (ii) there will always be a $\mathbb{C}$-basis for the space of complex eigenvectors consisting entirely of real eigenvectors.

As for the proof: the $\lambda$-eigenspace is the kernel of the (linear transformation given by the) matrix $\lambda I_n - A$. By the rank-nullity theorem, the dimension of this kernel is equal to $n$ minus the rank of the matrix. Since the rank of a real matrix doesn't change when we view it as a complex matrix (e.g. the reduced row echelon form is unique so must stay the same upon passage from $\mathbb{R}$ to $\mathbb{C}$), the dimension of the kernel doesn't change either. Moreover, if $v_1,\ldots,v_k$ are a set of real vectors which are linearly independent over $\mathbb{R}$, then they are also linearly independent over $\mathbb{C}$ (to see this, just write out a linear dependence relation over $\mathbb{C}$ and decompose it into real and imaginary parts), so any given $\mathbb{R}$-basis for the eigenspace over $\mathbb{R}$ is also a $\mathbb{C}$-basis for the eigenspace over $\mathbb{C}$.
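The claims above can be checked numerically. The sketch below, using a small symmetric matrix chosen purely for illustration, verifies that `numpy.linalg.eigh` returns a real eigenbasis, that multiplying a real eigenvector by $i$ yields a non-real eigenvector, and that the rank of $\lambda I_n - A$ (hence the eigenspace dimension) is the same over $\mathbb{R}$ and over $\mathbb{C}$:

```python
import numpy as np

# A real symmetric matrix (illustrative example with eigenvalues 1 and 3).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh handles symmetric/Hermitian matrices: real eigenvalues,
# real orthonormal eigenbasis.
evals, evecs = np.linalg.eigh(A)
assert np.isrealobj(evecs)

# The lambda-eigenspace is ker(lambda*I - A); its dimension (via rank-nullity)
# is unchanged when the real matrix is viewed as a complex one.
lam = evals[0]
M_real = lam * np.eye(2) - A
M_cplx = M_real.astype(complex)
assert np.linalg.matrix_rank(M_real) == np.linalg.matrix_rank(M_cplx)

# A real eigenvector v gives the non-real eigenvector i*v for the same lambda.
v = evecs[:, 0]
assert np.allclose(A @ (1j * v), lam * (1j * v))
```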

Pete L. Clark

If $A$ is a symmetric $n\times n$ matrix with real entries, then viewed as an element of $M_n(\mathbb{C})$, its eigenvectors always include vectors with non-real entries: if $v$ is any eigenvector then at least one of $v$ and $iv$ has a non-real entry.

On the other hand, if $v$ is any eigenvector then at least one of $\Re v$ and $\Im v$ (take the real or imaginary parts entrywise) is non-zero and will be an eigenvector of $A$ with the same eigenvalue. So you can always pass to eigenvectors with real entries.
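A quick numerical sketch of this "pass to real entries" step, with an illustrative symmetric matrix: build a genuinely complex eigenvector by scaling a real one, then check that its entrywise real and imaginary parts are themselves real eigenvectors.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # symmetric; (1, 1) is an eigenvector for 3

# A genuinely complex eigenvector for eigenvalue 3.
v = (1 + 2j) * np.array([1.0, 1.0])
assert np.allclose(A @ v, 3 * v)

# Entrywise real and imaginary parts are real eigenvectors (here both nonzero).
re, im = v.real, v.imag
assert np.allclose(A @ re, 3 * re)
assert np.allclose(A @ im, 3 * im)
```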

  • Sorry, that's gone slightly over my head... what is Mn(C)? Are you saying that complex vectors can be eigenvectors of A, but that they are just a phase rotation of real eigenvectors, i.e. that the system is underdefined? – Phil H Jun 26 '11 at 11:13
  • @Phil $M_n(\mathbb{C})$ is the set (or vector space, etc, if you prefer) of n x n matrices with entries in $\mathbb{C}.$ – barf Jun 26 '11 at 11:44
    If $A$ is a matrix with real entries, then "the eigenvectors of $A$" is ambiguous. For example, it could mean "the vectors in $\mathbb{R}^n$ which are eigenvectors of $A$", or it could mean "the vectors in $\mathbb{C}^n$ which are eigenvectors of $A$". For this question to make sense, we want to think about the second version, which is what I was trying to get at by saying we should think of $A$ as being in $M_n(\mathbb{C})$. – mac Jun 26 '11 at 11:52

If $x$ is an eigenvector corresponding to $\lambda$, then for $\alpha\neq0$, $\alpha x$ is also an eigenvector corresponding to $\lambda$. If $\alpha$ is a complex number, then clearly you have a complex eigenvector. But if $A$ is a real, symmetric matrix ($A=A^{t}$), then its eigenvalues are real and you can always pick the corresponding eigenvectors with real entries. Indeed, suppose $v=a+bi$ is an eigenvector with eigenvalue $\lambda$, so $Av=\lambda v$ and $v\neq 0$. Then $A(a+ib)=\lambda(a+ib)$, and comparing real and imaginary parts (both $A$ and $\lambda$ are real) gives $Aa=\lambda a$ and $Ab=\lambda b$. Since $v\neq 0$ implies that either $a\neq 0$ or $b\neq 0$, at least one of $a$, $b$ is a real eigenvector, and you just have to choose it.

L. Lima

I also had this doubt once.

The eigenvectors are usually assumed (implicitly) to be real, but they could also be chosen as complex, it does not matter.

Specifically: for a symmetric matrix $A$ and a given eigenvalue $\lambda$, we know that $\lambda$ must be real, and this readily implies that we can always find a real $\mathbf{p}$ such that

$$\mathbf{A} \mathbf{p} = \lambda \mathbf{p}$$

But recall that the eigenvectors of a matrix are not uniquely determined; we have considerable freedom in choosing them: in particular, if $\mathbf{p}$ is an eigenvector of $\mathbf{A}$, then so is $\mathbf{q} = \alpha \, \mathbf{p}$, where $\alpha \ne 0$ is any scalar, real or complex.
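This scaling freedom is easy to verify numerically; the matrix below is an illustrative choice, not from the question:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
p = np.array([1.0, 1.0])          # real eigenvector: A p = 3 p
assert np.allclose(A @ p, 3 * p)

# Any nonzero scalar alpha, real or complex, gives another eigenvector
# for the same eigenvalue.
for alpha in [2.0, -1.0, 1j, 3 - 4j]:
    q = alpha * p
    assert np.allclose(A @ q, 3 * q)
```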

    The eigenvectors certainly are "determined": they are determined by the definition. Probably you mean that finding a basis of each eigenspace involves a choice. – Pete L. Clark Jun 26 '11 at 15:40

I have a shorter argument, one that does not even use that the matrix $A\in\mathbf{R}^{n\times n}$ is symmetric, only that its eigenvalue $\lambda$ is real. Consider a complex eigenvector $z=u+ v\cdot i$ with $u,v\in \mathbf{R}^n$. Since $A-\lambda I_n$ is real, $(A-\lambda I_n)(u+v\cdot i)=\mathbf{0}$ implies $(A-\lambda I_n)u=(A-\lambda I_n)v=\mathbf{0}$, i.e., the real and the imaginary parts of the product are both zero. Thus $u$ and $v$ both lie in the real $\lambda$-eigenspace, and at least one of them is nonzero because $z\neq\mathbf{0}$. So the complex eigenvector $z$ is merely a combination of real eigenvectors; conversely, one can always multiply real eigenvectors by complex numbers and combine them to obtain complex eigenvectors like $z$.
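Since this argument does not need symmetry, a sketch can use a non-symmetric real matrix with a real eigenvalue (an illustrative example, not from the answer):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 2.0]])        # not symmetric; eigenvalues 1 and 2
lam = 2.0
u = np.array([1.0, 1.0])          # real eigenvector for lambda = 2
assert np.allclose(A @ u, lam * u)

# A complex eigenvector z = u + i*v built from real eigenvectors.
z = u + 1j * (2 * u)
M = A - lam * np.eye(2)
# (A - lam*I) z = 0 forces both the real and imaginary parts to vanish.
assert np.allclose(M @ z.real, 0)
assert np.allclose(M @ z.imag, 0)
```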

Daniel Porumbel