
Show that the determinant of a matrix $A$ is equal to the product of its eigenvalues $\lambda_i$.

So I'm having a tough time figuring this one out. I know that I have to work with the characteristic polynomial of the matrix $\det(A-\lambda I)$. But, when considering an $n \times n$ matrix, I do not know how to work out the proof. Should I just use the determinant formula for any $n \times n$ matrix? I'm guessing not, because that is quite complicated. Any insights would be great.

Rodrigo de Azevedo
onimoni
  • This is only true if there are $n$ distinct eigenvalues. In that case, you will have a diagonalisation of the matrix, so it is immediate from the multiplicative property of $\det$. – user1537366 Jan 22 '15 at 05:45
  • @user1537366 are you saying that this is not necessarily true in cases where the eigenvalues have multiplicity > 1? – makansij Nov 01 '17 at 04:29
  • @user1537366, is it the product of all eigenvalues, or only a product of the set of distinct eigenvalues? Thank you. – makansij Aug 26 '18 at 22:47
  • The statement in the question was correct. The product of all eigenvalues (repeated ones counted multiple times) is equal to the determinant of the matrix. – inavda Mar 23 '19 at 20:40
  • @inavda Why can you say that the determinant is the product of the eigenvalues? Consider $\begin{pmatrix} 0 & -1 \\ 0 & 1 \end{pmatrix}$ **over $\mathbb{R}$**, which doesn't have any eigenvalues but determinant 1. I guess we have to require the underlying field to be algebraically closed. – Ramanujan Jun 01 '19 at 17:39
  • @inavda I meant $\begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$. – Ramanujan Jun 01 '19 at 18:51
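As a quick numerical check of the example in the last comment (a minimal sketch assuming NumPy is available; not part of the original discussion): over $\mathbb{C}$ the rotation matrix does have eigenvalues, namely $\pm i$, and their product equals its determinant.

```python
import numpy as np

# Rotation by 90 degrees: no real eigenvalues, but over C they are +i and -i.
A = np.array([[0.0, -1.0],
              [1.0, 0.0]])

eigenvalues = np.linalg.eigvals(A)  # array of complex eigenvalues
product = np.prod(eigenvalues)      # (+i) * (-i) = 1

print(np.linalg.det(A))             # 1.0 (up to floating point)
print(product)
```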

8 Answers

208

Suppose that $\lambda_1, \ldots, \lambda_n$ are the eigenvalues of $A$. Then the $\lambda$s are also the roots of the characteristic polynomial, i.e.

$$\begin{array}{rcl} \det (A-\lambda I)=p(\lambda)&=&(-1)^n (\lambda - \lambda_1 )(\lambda - \lambda_2)\cdots (\lambda - \lambda_n) \\ &=&(-1) (\lambda - \lambda_1 )(-1)(\lambda - \lambda_2)\cdots (-1)(\lambda - \lambda_n) \\ &=&(\lambda_1 - \lambda )(\lambda_2 - \lambda)\cdots (\lambda_n - \lambda) \end{array}$$

The first equality follows from the factorization of a polynomial given its roots; the leading (highest degree) coefficient $(-1)^n$ can be obtained by expanding the determinant along the diagonal.

Now, by setting $\lambda$ to zero (which we may do, since the identity above holds for every value of $\lambda$) we get $\det(A)$ on the left side and $\lambda_1 \lambda_2\cdots\lambda_n$ on the right side; that is, we indeed obtain the desired result

$$ \det(A) = \lambda_1 \lambda_2\cdots\lambda_n$$

So the determinant of the matrix is equal to the product of its eigenvalues.
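The result is also easy to check numerically; here is a minimal sketch (assuming NumPy, not from the original answer) on a random $5\times 5$ matrix. Complex eigenvalues of a real matrix come in conjugate pairs, so their product is real.

```python
import numpy as np

# Check det(A) = lambda_1 * ... * lambda_n on a random 5x5 matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))

eigenvalues = np.linalg.eigvals(A)  # complex in general
product = np.prod(eigenvalues)

# Conjugate-pair eigenvalues make the imaginary part of the product vanish.
assert np.isclose(product.real, np.linalg.det(A))
assert np.isclose(product.imag, 0.0)
```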

Gaurang Tandon
onimoni
  • Interesting, but don't you also have to show that the leading coefficient of the polynomial is 1? Or is that obvious? – DanielV Sep 28 '13 at 09:05
  • $\det(A-\lambda I)=(\lambda_1 - \lambda)^{m_1}(\lambda_2 - \lambda)^{m_2}\cdots (\lambda_n - \lambda)^{m_n}$, so $\det(A) = \lambda_1^{m_1} \lambda_2^{m_2}\cdots\lambda_n^{m_n}$, where $m_i$ is the multiplicity of $\lambda_i$. – mohamez Jan 14 '14 at 07:39
  • I don't get this. How/why are you setting $\lambda$ to $0$ to begin with? – Spacey Feb 18 '14 at 18:16
  • Fixed a wrong edit that messed up the sign. – leonbloy Mar 22 '14 at 11:01
  • @omar "The leading (highest degree) coefficient $(-1)^{n}$ can be obtained by expanding the determinant along the diagonal." A quick Google search suggests that determinant expansion along the diagonal isn't possible, however! Regardless, where does this $(-1)^{n}$ come from? – Muno Jul 14 '16 at 01:11
  • @Muno: I believe that "expanding the determinant along the diagonal" refers to the permutation method of computing the determinant. This gives a polynomial with $n!$ terms, but only one will contribute the $n$th power of $\lambda$: when the factors come from the diagonal. – Jason DeVito Jan 29 '18 at 20:18
  • I was confused how to justify the coefficient $(-1)^n$ and understood it like this: We know $p(\lambda)$ factors as $(\lambda-\lambda_1) \cdots (\lambda-\lambda_n)$. But if we multiply it out we get that the coefficient of $\lambda^n$ is $1$. This is not what we get when we compute $\det(A-\lambda I)$, because there we get a $(-1)^n \lambda^n$. So we need to add $(-1)^n$ manually so that our factorization is correct. – philmcole Apr 10 '18 at 12:29
  • @philmcole would you please explain why the coefficient of $\lambda^n$ is $(-1)^n$? I did not get it... – Matics Oct 16 '18 at 14:42
  • I don't understand the first equality at all given this explanation. Can someone elaborate? – user3180 Aug 23 '20 at 04:57
  • @user3180 Start with the Leibniz formula for determinants. Note that in each term of this expression $\lambda$ appears between $0$ and $n$ times. Hence it's a polynomial of degree $n$. By the fundamental theorem of algebra, such polynomials can be factored into $n$ possibly repeated roots. The roots are, by definition, the eigenvalues. Exactly $1$ permutation in this entire formula corresponds to the diagonal: the identity permutation. But only this term yields the polynomial term of order $n$. Since each $\lambda$ in this term has a minus sign in front of it, we get the coefficient $(-1)^n$. – Colm Bhandal Mar 11 '21 at 10:08
  • @ColmBhandal Indeed, to put the last nail in the coffin, we should also discuss the sign in the Leibniz formula, which is always positive as the identity permutation has no inversions! – Danish A. Alvi Jun 03 '21 at 12:10
14

I am a beginning Linear Algebra learner and this is just my humble opinion.

One idea presented above is that

Suppose that $\lambda_1, \ldots, \lambda_n$ are eigenvalues of $A$.

Then the $\lambda$s are also the roots of the characteristic polynomial, i.e.

$$\det(A-\lambda I)=(\lambda_1-\lambda)(\lambda_2-\lambda)\cdots(\lambda_n-\lambda).$$

Now, by setting $\lambda$ to zero (simply because it is a variable) we get on the left side $\det(A)$, and on the right side $\lambda_1\lambda_2\cdots\lambda_n$, that is, we indeed obtain the desired result

$$\det(A)=\lambda_1\lambda_2\cdots\lambda_n.$$

I don't think this works in general, but only in the case when $\det(A) = 0$.

Because, when we write down the characteristic equation, we use the relation $\det(A - \lambda I) = 0$. Following the same logic, the only case where $\det(A - \lambda I) = \det(A) = 0$ is when $\lambda = 0$. The relationship $\det(A - \lambda I) = 0$ must be obeyed even for the special case $\lambda = 0$, which implies $\det(A) = 0$.

UPDATED POST

Here I propose a way to prove the theorem for the 2 by 2 case. Let $A$ be a 2 by 2 matrix.

$$ A = \begin{pmatrix} a_{11} & a_{12}\\ a_{21} & a_{22}\\\end{pmatrix}$$

The idea is to use a certain property of determinants,

$$ \begin{vmatrix} a_{11} + b_{11} & a_{12} \\ a_{21} + b_{21} & a_{22}\\\end{vmatrix} = \begin{vmatrix} a_{11} & a_{12}\\ a_{21} & a_{22}\\\end{vmatrix} + \begin{vmatrix} b_{11} & a_{12}\\b_{21} & a_{22}\\\end{vmatrix}$$

Let $\lambda_1$ and $\lambda_2$ be the 2 eigenvalues of the matrix $A$. (The eigenvalues can be distinct or repeated, real or complex; it doesn't matter.)

The two eigenvalues $\lambda_1$ and $\lambda_2$ must satisfy the following condition :

$$\det (A - \lambda I) = 0 $$ where $\lambda$ is an eigenvalue of $A$.

Therefore, $$\begin{vmatrix} a_{11} - \lambda & a_{12} \\ a_{21} & a_{22} - \lambda\\\end{vmatrix} = 0 $$

Therefore, using the property of determinants provided above, I will try to decompose the determinant into parts.

$$\begin{vmatrix} a_{11} - \lambda & a_{12} \\ a_{21} & a_{22} - \lambda\\\end{vmatrix} = \begin{vmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} - \lambda\\\end{vmatrix} - \begin{vmatrix} \lambda & 0 \\ a_{21} & a_{22} - \lambda\\\end{vmatrix}= \begin{vmatrix} a_{11} & a_{12} \\ a_{21} & a_{22}\\\end{vmatrix} - \begin{vmatrix} a_{11} & a_{12} \\ 0 & \lambda \\\end{vmatrix}-\begin{vmatrix} \lambda & 0 \\ a_{21} & a_{22} - \lambda\\\end{vmatrix}$$

The final determinant can be further reduced.

$$ \begin{vmatrix} \lambda & 0 \\ a_{21} & a_{22} - \lambda\\\end{vmatrix} = \begin{vmatrix} \lambda & 0 \\ a_{21} & a_{22} \\\end{vmatrix} - \begin{vmatrix} \lambda & 0\\ 0 & \lambda\\\end{vmatrix} $$

Substituting the final determinant, we will have

$$ \begin{vmatrix} a_{11} - \lambda & a_{12} \\ a_{21} & a_{22} - \lambda\\\end{vmatrix} = \begin{vmatrix} a_{11} & a_{12} \\ a_{21} & a_{22}\\\end{vmatrix} - \begin{vmatrix} a_{11} & a_{12} \\ 0 & \lambda \\\end{vmatrix} - \begin{vmatrix} \lambda & 0 \\ a_{21} & a_{22} \\\end{vmatrix} + \begin{vmatrix} \lambda & 0\\ 0 & \lambda\\\end{vmatrix} = 0 $$

In a monic polynomial $$\lambda^n + a_{n-1}\lambda^{n-1} + \cdots + a_{1}\lambda + a_{0} = 0,$$ the product of the roots is $(-1)^n a_{0}$; here $n = 2$, so the product of the roots is simply the constant term $a_{0}$.

From the decomposed determinant, the only term which doesn't involve $\lambda$ would be the first term

$$ \begin{vmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \\\end{vmatrix} = \det (A) $$

Therefore, the product of the roots, i.e. the product of the eigenvalues of $A$, is equal to the determinant of $A$.

I am having difficulty generalizing this idea of proof to the $n$ by $n$ case though, as it is complex and time-consuming for me.
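For what it's worth, the 2 by 2 computation above can be verified symbolically; a minimal sketch (assuming SymPy, not from the original answer) checking that the $\lambda$-free term of $\det(A - \lambda I)$ is exactly $\det(A)$:

```python
import sympy as sp

# Symbolic 2x2 check: the constant (lambda-free) term of det(A - lambda*I)
# equals det(A), so the product of the two roots equals the determinant.
a11, a12, a21, a22, lam = sp.symbols('a11 a12 a21 a22 lam')
A = sp.Matrix([[a11, a12],
               [a21, a22]])

p = (A - lam * sp.eye(2)).det().expand()
# p == lam**2 - (a11 + a22)*lam + (a11*a22 - a12*a21)

constant_term = p.subs(lam, 0)
assert sp.simplify(constant_term - A.det()) == 0
```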

enedil
Anthony
  • In the previously given proof, the fact that $p(\lambda) = 0$ was not used. Hence it is valid for all $A$ and not just $\det(A) = 0$. – Tyg13 Apr 01 '17 at 23:57
  • The first part seems accurate as long as the characteristic polynomial can be factored into linear terms as you did. This always happens over $\mathbb{C}$ (due to the Fundamental Theorem of Algebra), but not always over $\mathbb{R}$. What happens if $p(\lambda)$ may not be factored into linear terms? Does the formula still hold? – Marra Oct 04 '18 at 23:54
  • Tyg has already clarified, but I want to add one more thought. If Av = λv, then (A-λI)v=0. When a matrix transforms any vector v to 0, it means it is singular. So here (A-λI) is singular which means det of (A-λI) is 0. This holds true for ALL A which has λ as its eigenvalue. Though onimoni's brilliant deduction did not use the fact that the determinant =0, (s)he could have used it and whatever results/theorem came out of it would hold for all A. (for e.g. given the above situation prove that at least one of those eigenvalue should be 0) – Allohvk Sep 06 '21 at 16:05
6

From the eigendecomposition

$A = S \Lambda S^{-1}$, where $\Lambda$ is the diagonal matrix of the eigenvalues of $A$ (this assumes $A$ is diagonalisable),

$\implies \det(A) = \det(S)\,\det(\Lambda)\,\det(S^{-1})$

$\implies \det(A) = \det(\Lambda)$

and $\det(\Lambda)$ is nothing but $\lambda_1\lambda_2\cdots\lambda_n$.
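A numerical illustration of this argument (a sketch assuming NumPy and a diagonalisable, here symmetric, matrix; not part of the original answer):

```python
import numpy as np

# A = S Lambda S^{-1}  =>  det(A) = det(Lambda) = product of eigenvalues.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])            # symmetric, hence diagonalisable

lams, S = np.linalg.eig(A)            # columns of S are eigenvectors
Lam = np.diag(lams)

assert np.allclose(S @ Lam @ np.linalg.inv(S), A)
assert np.isclose(np.prod(lams), np.linalg.det(A))   # both equal 5
```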

  • Is this shown by saying $\det(S) = 1/\det(S^{-1})$? – makansij Nov 01 '17 at 04:27
  • Indeed. The determinant satisfies $\det(AB) = \det(A)\det(B)$ for any two $n\times n$ matrices $A, B$. Since $S S^{-1}=I$, we get $\det(S) \det(S^{-1})= \det(I) = 1$. – eduard Nov 11 '17 at 14:06
  • But this decomposition only exists if $A$ is diagonalizable. –  Apr 01 '18 at 21:01
  • As noted above, this does not work if $A$ is not diagonalizable. Instead, you could use a square matrix $T$ over the algebraic closure of the given field s.t. $T^{-1}AT$ has Jordan form. – qwertz Jul 09 '18 at 14:06
4

The approach I would use is to decompose the matrix into 3 matrices based on the eigenvalues.

Then you know that $\det(AB) = \det(A)\det(B)$, and that $\det(A^{-1}) = \dfrac{1}{\det(A)}$.

You can probably fill in the rest of the details from the article, depending on how rigorous your proof needs to be.

Edit: I just realized this won't work on all matrices, but it might give you an idea of an approach.

M. Al Jumaily
DanielV
  • Why not? Any matrix has some "eigendecomposition": Schur, Jordan,... – Algebraic Pavel Sep 29 '13 at 11:20
  • I liked this. I did it this way, but is this a correct proof? **a)** If matrix $A$ has linearly independent columns. $$A=SDS^{-1}$$ now take $\det$ of both sides $$\det(A)=\det(SDS^{-1})=\det(S)\det(D)\det(S^{-1})=\det(D)$$ and $\det(D)$ is just the product of all $\lambda_i$. **b)** If matrix $A$ has linearly *dependent* columns. Then $$\det(A)=0$$ but what are its eigenvalues? – jacob Mar 08 '14 at 07:11
  • @jacob Having linearly independent columns does not imply diagonalisable... – user1537366 Jan 22 '15 at 05:47
  • The idea is good but needs more argument: the $D$ will be a Jordan block matrix. Such matrices are almost diagonal. – Kori Sep 25 '16 at 02:07
2

Instead of assuming that the matrix is diagonalisable, as done in some of the previous answers, we can use the Jordan form. That is, every matrix has an associated Jordan form through a similarity transformation: $$J=M^{-1}AM$$ for a certain invertible $M$. Since $J$ is triangular, its determinant is simply the product of its diagonal entries, which also happen to be the eigenvalues of $A$. That is, $$\det(J)=\prod_i{\lambda_i(A)}$$ where $\lambda_i(A)$ denotes the $i$th eigenvalue of $A$.

Computing the determinants, we have $$\det(J)=\det(M^{-1})\det(A)\det(M)$$ The determinants of $M$ and $M^{-1}$ cancel to give $1$, and so $$\det(J)=\det(A)$$ Combining our second and fourth equations we have the result: $$\det(A)=\prod_i{\lambda_i(A)}=\lambda_1\lambda_2\cdots\lambda_n$$

TL;DR:

$\det(A)=\det(M^{-1})\det(J)\det(M)$

$\implies \det(A) = \det(J)$.

And since $J$ is triangular and has the eigenvalues along its diagonal, $\det(A)=\det(J)=\lambda_1\lambda_2\cdots\lambda_n$.
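A small check of the Jordan-form argument (a sketch assuming SymPy; not from the original answer), on a matrix that is not diagonalisable:

```python
import sympy as sp

# A non-diagonalisable matrix: a single 2x2 Jordan block for eigenvalue 2.
A = sp.Matrix([[2, 1],
               [0, 2]])

M, J = A.jordan_form()                # A == M * J * M**(-1), J triangular
assert M * J * M.inv() == A

# det(J) is the product of its diagonal entries, which are the eigenvalues.
product = sp.Integer(1)
for i in range(J.shape[0]):
    product *= J[i, i]

assert product == A.det()             # both are 4
```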

Rickson
2

You must know the following:

  • If we take an extension of the base field, then both the determinant and the trace of a (square) matrix remain unchanged when evaluating them in the new field.

  • Take a splitting field of the characteristic polynomial of $\;A\;$ and calculate this matrix's Jordan canonical form. Since the latter is a triangular matrix, its determinant is the product of the elements in its main diagonal, and we know that in this diagonal appear the eigenvalues of $\;A\;$, so we're done.

DonAntonio
0

A few places in this thread I noticed people raised issues about 'what if A doesn't have independent columns' or 'what if the determinant is 0'. I believe the following are all equivalent:

  • $0$ is an eigenvalue of $A$
  • $A$ has linearly dependent columns (or rows)
  • $\det(A)=0$
  • $\prod_i \lambda_i = 0$ (the product of the eigenvalues of $A$)

so we can take care of this issue by saying "Suppose one of the above is true. Then $\det(A)=\prod_i \lambda_i = 0$"; otherwise, proceed with the usual argument. (Note it is clear that $0$ being an eigenvalue results in the product being $0$.)
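A quick numerical illustration of these equivalences (a sketch assuming NumPy; not part of the original answer):

```python
import numpy as np

# Columns are linearly dependent (second = 2 * first), so all four
# conditions in the list hold simultaneously.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

eigenvalues = np.linalg.eigvals(A)    # 0 and 5
assert np.isclose(np.linalg.det(A), 0.0)
assert np.isclose(np.prod(eigenvalues), 0.0)
assert np.any(np.isclose(eigenvalues, 0.0))
```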

-2

I think this is right...

Write $$A = \begin{pmatrix}a_{11} & a_{12}&\cdots&a_{1n}\\ \vdots & \vdots&&\vdots\\ a_{n1}& a_{n2}&\cdots& a_{nn} \end{pmatrix}$$

Let the $n$ eigenvalues of $A$ be $\lambda_1, \cdots, \lambda_n$. Finally, denote the characteristic polynomial of $A$ by $p(\lambda) = |\lambda I - A| = \lambda^n + c_{n-1}\lambda^{n-1} + \cdots + c_1\lambda + c_0$.

Note that since the eigenvalues of $A$ are the zeros of $p(\lambda)$, this implies that $p(\lambda)$ can be factorised as $p(\lambda) = (\lambda - \lambda_1)\cdots(\lambda - \lambda_n)$. Consider the constant term of $p(\lambda)$, $c_0$. The constant term of $p(\lambda)$ is given by $p(0)$, which can be calculated in two ways:

Firstly, $p(0) = (0 - \lambda_1)\cdots(0 - \lambda_n) = (-1)^n\lambda_1 \cdots \lambda_n$.

Secondly, $p(0) = |0I - A| = |-A| = (-1)^n |A|$.

Therefore $c_0 = (-1)^n\lambda_1 \cdots \lambda_n = (-1)^n |A|$, and so $\lambda_1 \cdots \lambda_n = |A|$.

That is, the product of the $n$ eigenvalues of $A$ is the determinant of $A$.
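This constant-term argument can be verified symbolically; a minimal sketch (assuming SymPy, not from the original answer) on a concrete $3\times 3$ matrix:

```python
import sympy as sp

# p(lambda) = |lambda*I - A| is monic; p(0) = |-A| = (-1)^n |A|,
# and p(0) = (-1)^n * lambda_1 * ... * lambda_n by factoring over the roots.
lam = sp.symbols('lam')
A = sp.Matrix([[1, 2, 0],
               [0, 3, 1],
               [4, 0, 1]])
n = A.shape[0]

p = (lam * sp.eye(n) - A).det()       # monic characteristic polynomial
assert sp.expand(p.subs(lam, 0) - (-1)**n * A.det()) == 0
```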

emonHR
Joeypenny
  • This question has been answered for a year now. It is fine to add an answer after a long time, but don't you think it should be different from the other answers? In particular, the accepted answer explains the same argument, and it has proper math formatting. – zarathustra Nov 05 '14 at 18:41
  • @zarathustra your and others' complaint was only partly valid, as Joey had definitely given more algebraic details as to how to prove the theorem; whereas the accepted answer didn't use the actual $p(\lambda)$ for any deduction, and its statement about `expanding the determinant along diagonal` was also questioned by commenters. This answer actually serves me well; had I read this one first I'd have comfortably disregarded the accepted answer, but not the other way round. – stucash Feb 25 '19 at 00:16
  • @zarathustra I don't think the other answers use the characteristic polynomial the way this answer does. Perhaps I found logic in it :) – emonHR May 18 '19 at 12:07