First of all, am I being crazy in thinking that if $\lambda$ is an eigenvalue of $AB$, where $A$ and $B$ are both $N \times N$ matrices (not necessarily invertible), then $\lambda$ is also an eigenvalue of $BA$?

If it's not true, then under what conditions is it true or not true?

If it is true, can anyone point me to a citation? I couldn't find it in a quick perusal of Horn & Johnson. I have seen a couple proofs that the characteristic polynomial of $AB$ is equal to the characteristic polynomial of $BA$, but none with any citations.

A trivial proof would be OK, but a citation is better.


4 Answers


If $v$ is an eigenvector of $AB$ for some nonzero $\lambda$, then $Bv\ne0$ (otherwise $\lambda v=ABv=0$ would force $v=0$), and $$\lambda Bv=B(ABv)=(BA)Bv,$$ so $Bv$ is an eigenvector of $BA$ with the same eigenvalue. If $0$ is an eigenvalue of $AB$, then $0=\det(AB)=\det(A)\det(B)=\det(BA)$, so $0$ is also an eigenvalue of $BA$.

More generally, Jacobson's lemma in operator theory states that for any two bounded operators $A$ and $B$ acting on a Hilbert space $H$ (or more generally, for any two elements of a Banach algebra), the non-zero points of the spectrum of $AB$ coincide with those of the spectrum of $BA$.
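Not part of the original answer, but the finite-dimensional claim is easy to sanity-check numerically. A minimal NumPy sketch (with $B$ deliberately made singular so that $0$ shows up in both spectra):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 5
A = rng.standard_normal((N, N))
B = rng.standard_normal((N, N))
B[:, 0] = 0.0  # force B (hence AB and BA) to be singular

# Sort both spectra the same way (by real part, then imaginary part)
eig_AB = np.sort_complex(np.linalg.eigvals(A @ B))
eig_BA = np.sort_complex(np.linalg.eigvals(B @ A))

# The two spectra agree up to floating-point error, 0 included
assert np.allclose(eig_AB, eig_BA)
assert np.min(np.abs(eig_AB)) < 1e-8  # 0 is an eigenvalue of AB
```

This is evidence for one random instance, of course, not a proof.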

Bob Pego
  • Bob, I am a little confused about your proof. How did you know to use the trick $Bv \ne 0$ to prove this? – diimension Nov 24 '12 at 01:54
  • @diimension You just take an eigenvector of $AB$ and do the only calculation you can. Then it turns out that $Bv$ fulfills the eigenvector equation for $BA$, so you hope that it is not 0 and check this in the end. There's no need to know it at the start of the calculation. – Phira Dec 21 '12 at 17:27
  • @Phira thank you very much! – diimension Dec 26 '12 at 08:21
  • I know this is an ancient thread but hopefully you're still lurking out there somewhere. How come you need to hope that $\lambda\ne 0$? Doesn't the argument still work just fine in the case where $\lambda=0$? – crf Apr 22 '13 at 12:32
  • @crf No -- the trouble is that when $Bv=0$, maybe $v$ is not in the range of $A$. The argument given for $\lambda\ne0$ works in Hilbert space, for example, but for $\lambda=0$ the result is not generally true there: On the infinite sequence space $\ell^2$, let $A$ be the right shift ($A(x_1,x_2,\ldots)=(0,x_1,x_2,\ldots)$) and $B$ the left shift. Then $AB(1,0,\ldots)=0$, but $BA$ is the identity. – Bob Pego May 07 '13 at 15:31
  • More specifically: in the case that $Bv=0$, the equations $\lambda B v = B(AB v) = (BA)Bv$ from Bob’s answer still hold (because all those expressions vanish), but it is no longer the case that these expressions imply that $Bv$ is an eigenvector of $BA$. (For *any* matrix $M$, we of course have $Mw=0$ when $w=0$, but we don’t say $w$ is an eigenvector of $M$ with eigenvalue $0$.) But this counterexample relies on having an infinite dimensional vector space, and strictly speaking the original question presupposed finite-dimensional matrices. – Jess Riedel Jan 11 '21 at 17:28

It is true that the eigenvalues (counting multiplicity) of $AB$ are the same as those of $BA$.

This is a corollary of Theorem 1.3.22 in the second edition of "Matrix Analysis" by Horn and Johnson, which is Theorem 1.3.20 in the first edition.

Paraphrasing from the cited Theorem: If $A$ is an $m$ by $n$ matrix and $B$ is an $n$ by $m$ matrix with $n \geq m$ then the characteristic polynomial $p_{BA}$ of $BA$ is related to the characteristic polynomial $p_{AB}$ of $AB$ by $$p_{BA}(t) = t^{n-m} p_{AB}(t).$$

In your case, $n = m$, so $p_{BA} = p_{AB}$ and it follows that the eigenvalues (counting multiplicity) of $AB$ and $BA$ are the same.

You can see Horn and Johnson's proof in the Google Books link above. A similar proof was given in this answer from Maisam Hedyelloo.
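Not part of Horn and Johnson's text — just a quick numerical check of the cited relation $p_{BA}(t) = t^{n-m} p_{AB}(t)$ for rectangular factors, sketched with NumPy's `np.poly` (which returns characteristic-polynomial coefficients, highest degree first):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 3, 5  # A is m x n, B is n x m, with n >= m
A = rng.standard_normal((m, n))
B = rng.standard_normal((n, m))

p_AB = np.poly(A @ B)  # char. poly of the m x m product AB (degree m)
p_BA = np.poly(B @ A)  # char. poly of the n x n product BA (degree n)

# p_BA(t) = t^(n-m) p_AB(t): multiplying by t^(n-m) just appends
# n-m zero coefficients at the low-order end.
assert np.allclose(p_BA, np.concatenate([p_AB, np.zeros(n - m)]))
```

In particular, with $n = m$ the padding disappears and the two characteristic polynomials coincide, as in the question.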

  • Nice answer, thanks!! Is there a way to relate the eigenvectors of $AB$ and $BA$ when $A, B$ have different, compatible dimensions? Yes, I see that if $v\ne 0$ is an eigenvector of $AB$, then $BA(Bv)= \lambda(Bv),$ provided $Bv \ne 0.$ So there's a catch there, it seems; what I'm looking for is a theorem that completely relates the eigenvectors of $AB$ and $BA.$ I'm more interested in the case where $A=X, B=X^{T}.$ – Learning Math Apr 06 '20 at 18:22

Here is an alternative proof for this result, following Exercises 6.2.8-9 of Hoffman & Kunze's Linear Algebra (p. 190):

Lemma: Let $A,B\in M_n(\mathbb{F})$, where $\mathbb{F}$ is an arbitrary field. If $I-AB$ is invertible, then so is $I-BA$, and $$(I-BA)^{-1}=I+B(I-AB)^{-1}A.$$
Proof of Lemma: Since $I-AB$ is invertible,

\begin{align} &I=(I-AB)(I-AB)^{-1}=(I-AB)^{-1}-AB(I-AB)^{-1}\\ &\implies (I-AB)^{-1} = I+ AB(I-AB)^{-1}. \end{align}

Then we have

\begin{align} I+B(I-AB)^{-1}A&= I+B[I+ AB(I-AB)^{-1}]A= I+BA+BAB(I-AB)^{-1}A\\ \implies I&=I+B(I-AB)^{-1}A-BA-BAB(I-AB)^{-1}A\\ &=I[I+B(I-AB)^{-1}A]-BA[I+B(I-AB)^{-1}A]\\ &=(I-BA)[I+B(I-AB)^{-1}A]. \qquad\checkmark \end{align}
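As a sanity check (my own addition, not from Hoffman & Kunze), the explicit inverse $I+B(I-AB)^{-1}A$ produced by the lemma can be verified numerically for a random instance:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
I = np.eye(n)

# Assumes I - AB is invertible, which holds for generic random A, B.
M = I + B @ np.linalg.inv(I - A @ B) @ A  # candidate inverse of I - BA

# M is a two-sided inverse of I - BA, as the lemma claims
assert np.allclose((I - B @ A) @ M, I)
assert np.allclose(M @ (I - B @ A), I)
```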

Proposition: $\forall A,B\in M_n(\mathbb{F}):$ $AB$ and $BA$ have the same eigenvalues.

Proof: Let $\alpha\in\mathbb{F}$ be an eigenvalue of $AB$. If $\alpha=0$, then $0=\det(0I-AB)=\det(-A)\det(B)=\det(B)\det(-A)=\det(0I-BA)$ and so $0$ is an eigenvalue of $BA$ also.

Otherwise $\alpha\neq0$. Suppose $\alpha$ is not an eigenvalue of $BA$. Then $0\neq\det(\alpha I-BA)=\alpha^n\det(I-(\frac{1}{\alpha}B)A)$. Then $0\neq\det(I-(\frac{1}{\alpha}B)A),$ so that $I-(\frac{1}{\alpha}B)A$ is invertible. By the lemma above we know that $I-A(\frac{1}{\alpha}B)$ is invertible as well, meaning $0\neq\det(I-A(\frac{1}{\alpha}B))=\det(I-\frac{1}{\alpha}AB) \implies 0\neq\det(\alpha I-AB)$. But we assumed $\alpha$ to be an eigenvalue for $AB$, $\unicode{x21af}$.

Jay Zha
Alp Uzman

Notice that $\lambda$ being an eigenvalue of $AB$ implies that $\det(AB-\lambda I)=0$, hence $$0=\det(A^{-1})\det(AB-\lambda I)\det(B^{-1})=\det(A^{-1}(AB-\lambda I)B^{-1})=\det((B-\lambda A^{-1})B^{-1})=\det(I-\lambda A^{-1}B^{-1}).$$ This further implies that $$\det(BA-\lambda I)=\det(BA(I-\lambda A^{-1}B^{-1}))=\det(BA)\det(I-\lambda A^{-1}B^{-1})=0,$$ i.e., $\lambda$ is an eigenvalue of $BA$. This proof holds only for invertible matrices $A$ and $B$, though. For singular matrices you can show that $0$ is a common eigenvalue, but I can't think of a way to show that the rest of the eigenvalues are equal.
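On that last point: the determinant trick above doesn't extend to singular factors, but a numerical experiment (my own addition, consistent with the characteristic-polynomial argument in the other answers) suggests the full eigenvalue multisets still agree even when $A$ and $B$ are rank-deficient:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 6
# Deliberately rank-deficient factors: rank(A) <= 3, rank(B) <= 4
A = rng.standard_normal((n, 3)) @ rng.standard_normal((3, n))
B = rng.standard_normal((n, 4)) @ rng.standard_normal((4, n))

eig_AB = np.sort_complex(np.linalg.eigvals(A @ B))
eig_BA = np.sort_complex(np.linalg.eigvals(B @ A))

# Full eigenvalue multisets still agree, zeros included
assert np.allclose(eig_AB, eig_BA)
# rank(AB) <= 3, so 0 has algebraic multiplicity >= n - 3
assert np.sum(np.abs(eig_AB) < 1e-6) >= n - 3
```

Again, one random instance is only evidence; the proof for the singular case is the characteristic-polynomial identity $p_{AB}=p_{BA}$.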

J. W. Tanner
Siddharth Joshi