Here's a cute problem that was frequently given by the late Herbert Wilf during his talks.

Problem: Let $A$ be an $n \times n$ matrix with entries from $\{0,1\}$ having all positive eigenvalues. Prove that all of the eigenvalues of $A$ are $1$.


Use the AM-GM inequality to relate the trace and determinant.

Is there any other proof?
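Before turning to proofs, the claim can be sanity-checked numerically for small $n$. The sketch below is my own (not from the original question, and assuming numpy is available): it brute-forces every $3 \times 3$ 0-1 matrix, keeps those whose spectrum is entirely positive real, and confirms every retained eigenvalue equals $1$. The tolerances are a judgment call for separating genuinely real, positive eigenvalues from floating-point noise.

```python
# Brute-force check of the claim for n = 3: every 0-1 matrix whose
# eigenvalues are all positive reals has every eigenvalue equal to 1.
import itertools
import numpy as np

def all_positive_spectrum(n):
    """Yield (matrix, eigenvalues) for n-by-n 0-1 matrices with all-positive spectrum."""
    for bits in itertools.product([0, 1], repeat=n * n):
        A = np.array(bits, dtype=float).reshape(n, n)
        ev = np.linalg.eigvals(A)
        # Integer matrices cannot have eigenvalues this close to the real
        # axis (or to 0) without them being exactly real (or exactly 0).
        if np.all(np.abs(ev.imag) < 1e-4) and np.all(ev.real > 1e-4):
            yield A.astype(int), ev

hits = list(all_positive_spectrum(3))
assert hits, "at least the identity matrix qualifies"
assert all(np.allclose(ev.real, 1.0, atol=1e-4) for _, ev in hits)
print(f"{len(hits)} qualifying matrices, every eigenvalue equal to 1")
```

Exhausting all $2^{9}$ candidate matrices takes well under a second; the same loop at $n=4$ ($2^{16}$ matrices) is still feasible if you want more evidence.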


3 Answers


If one wants to use the AM-GM inequality, one could proceed as follows. Since $A$ has only $1$'s and $0$'s on the diagonal, it follows that $\operatorname{tr}(A)\leq n$. Calculating the determinant by expanding along any row or column, one sees that the determinant is an integer, since it is a sum of products of matrix entries (up to sign). Since all eigenvalues are positive, this integer must be positive. The AM-GM inequality implies $$\det(A)^{\frac{1}{n}}=\left(\prod_{i}\lambda_{i}\right)^{\frac{1}{n}}\leq \frac{1}{n}\sum_{i=1}^{n}\lambda_{i}=\frac{\operatorname{tr}(A)}{n}\leq 1.$$ Since $\det(A)\neq 0$ and $m^{\frac{1}{n}}>1$ for any integer $m>1$, the above inequality forces $\det(A)=1$. Both inequalities are therefore equalities, and equality in AM-GM happens precisely when $\lambda_{i}=\lambda_{j}$ for all $i,j$. Combining this with $\det(A)=\prod_i\lambda_i=1$ gives $\lambda_i=1$ for all $i$.
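As a concrete illustration of the inequality chain in this answer, here is a small numerical check on a sample unitriangular 0-1 matrix (my own example, assuming numpy):

```python
# Checking det(A)^(1/n) <= tr(A)/n <= 1 on a 0-1 matrix with all-positive
# eigenvalues (unitriangular, so the eigenvalues are the diagonal entries).
import numpy as np

A = np.array([[1, 1, 0, 1],
              [0, 1, 1, 0],
              [0, 0, 1, 1],
              [0, 0, 0, 1]], dtype=float)
n = A.shape[0]
gm = np.linalg.det(A) ** (1 / n)   # geometric mean of the eigenvalues
am = np.trace(A) / n               # arithmetic mean of the eigenvalues
assert gm <= am + 1e-12            # AM-GM, with floating-point slack
assert am <= 1 + 1e-12             # tr(A)/n <= 1
assert round(np.linalg.det(A)) == 1  # the (integer) determinant is 1
```

Here both means equal $1$, which is exactly the equality case that forces every eigenvalue to be $1$.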

  • I guess you said you already knew how to do it this way. –  Dec 16 '13 at 00:54
  • How do we know that the only values of $\lambda_i$ which satisfy $\sum_i \lambda_i = n$, $\prod_i \lambda_i = 1$, and $\lambda_i > 0$ are $\lambda_i = 1$ for all $i$? – David Jul 22 '15 at 15:39
  • @David: It was also proved that $\lambda_i=\lambda_j$ for all $i,j$. That yields $\sum_i\lambda_i=n\lambda_1=n$. – joriki Jul 22 '15 at 15:50

Suppose that $A$ has a column with only zero entries; then zero is an eigenvalue (e.g. expand $\det(A-rI)$ along that column). So, to satisfy the OP's requirements, every column must contain a $1$, and by the same argument so must every row. Now suppose there were a linear relationship between the rows; then some linear combination of the rows would produce a matrix with a zero row, which we have just ruled out, so the rows must be linearly independent. I am trying to restrict the form of the permissible matrices enough to give the result. Linear independence of the rows gives us a glimpse of the invertibility of such matrices, but alas not of their diagonalizability. The minimal polynomial $(1-r)^n$ results from upper or lower triangular matrices with ones along the diagonal, and I suspect we may be able to complete this proof by looking at what happens to this polynomial when there are deviations from the triangular shape. The result linked by user1551 may be the key. Trying to gain some intuition about the possibilities leads one to match the binomial theorem with Newton's identities: http://en.wikipedia.org/wiki/Newton%27s_identities#Expressing_elementary_symmetric_polynomials_in_terms_of_power_sums

and the fact that the trace must be $n$ (ones down the diagonal) and the determinant $1$. I would like to show that any deviation from this minimal polynomial must lead to a non-positive eigenvalue. There are two aspects to the analysis: a combinatorial argument to show which modifications (from triangular) are permissible while maintaining row/column independence, and a look at the geometric effects of the non-dominant terms in the resulting characteristic polynomials. Maybe some type of induction argument will surface here.
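The triangular case discussed above can be checked directly. For a unitriangular 0-1 matrix (a sample of my own, assuming numpy), $N = A - I$ is nilpotent, so the characteristic polynomial of $A$ is $(r-1)^n$ and every eigenvalue is $1$:

```python
# For unitriangular A (0-1 entries, ones on the diagonal), N = A - I is
# nilpotent, so the characteristic polynomial of A is (r - 1)^n.
import numpy as np

A = np.array([[1, 1, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 1],
              [0, 0, 0, 1]])
n = A.shape[0]
N = A - np.eye(n, dtype=int)
assert np.all(np.linalg.matrix_power(N, n) == 0)  # N^n = 0: spectrum of A is {1}
# characteristic polynomial coefficients, rounded to integers: (r - 1)^4
assert list(np.rint(np.poly(A)).astype(int)) == [1, -4, 6, -4, 1]
```

The open question in the answer is precisely which non-triangular modifications preserve this characteristic polynomial.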

  • @Potato Fixed. Ty – JEM Mar 31 '13 at 16:48
  • There are non-triangular matrices that have the minimal polynomial, e.g. (1100,0100,1111,1101). Will need a different approach. – JEM Mar 31 '13 at 21:21
  • Newton's theorem on logarithmically convex polynomials surfaced in a number theory context today. Does anyone know of similar results for polynomials that have alternating coefficients? Still working on these characteristic polynomials. – JEM Apr 04 '13 at 00:00
  • These are some interesting thoughts, but I don't think this answers the question in its current form. – Potato Nov 21 '13 at 19:19

Since $\det(A) \neq 0$, $A$ is nonsingular, and all diagonal entries of $A$ must equal one. Next, we use the relationship between the determinant and the trace: \begin{align*} \det(A) &= \exp(\operatorname{tr}(\ln(A))) \\ &= \exp(0) \\ &= \exp\left(\sum_i \ln(\lambda_i)\right). \end{align*} Step 2 holds because all diagonal entries of $A$ are one, so their logs are zero, and step 3 because the natural log is analytic, so $\operatorname{tr}(\ln(A))=\sum_i \ln(\lambda_i)$. This gives $$\sum_i \ln(\lambda_i) = 0, \qquad\text{or equivalently}\qquad \prod_i \lambda_i = 1.$$

Then, because all diagonal entries of $A$ are one, $\operatorname{tr}(A) = \sum_i \lambda_i = n$. The AM-GM inequality states $$ \left[ \prod_{i} \lambda_i \right]^{\frac{1}{n}} \leq \frac{1}{n} \sum_{i} \lambda_{i},$$ with equality holding only when $\lambda_i = \lambda_j$ for all $i,j$. Here both sides equal one, since $$ 1^{\frac{1}{n}} = 1 = \frac{1}{n} \cdot n,$$ so all eigenvalues of $A$ must be equal. Since $\prod_i \lambda_i = 1$, all eigenvalues must be one.
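The identity $\det(A) = \exp(\operatorname{tr}(\ln A))$ can be checked numerically. A nontrivial 0-1 matrix with all eigenvalues $1$ is never diagonalizable, so the sketch below (my own example, assuming numpy) computes $\ln(A) = \ln(I+N)$ from its Mercator series, which terminates because $N$ is nilpotent:

```python
# Verifying det(A) = exp(tr(ln A)) for a unitriangular 0-1 matrix.
# A = I + N with N nilpotent, so ln(A) = N - N^2/2 + ... is a finite sum.
import numpy as np

A = np.array([[1, 1, 0],
              [0, 1, 1],
              [0, 0, 1]], dtype=float)
n = A.shape[0]
N = A - np.eye(n)
logA = sum((-1) ** (k + 1) * np.linalg.matrix_power(N, k) / k
           for k in range(1, n))            # series stops: N^n = 0
assert abs(np.trace(logA)) < 1e-12          # tr(ln A) = 0
assert np.isclose(np.linalg.det(A), np.exp(np.trace(logA)))  # det = exp(0) = 1
```

Note this only confirms the identity for this sample matrix; it does not substantiate the answer's claim that $\operatorname{tr}(\ln A) = 0$ follows from the diagonal of $A$ alone, which the comments below question.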

  • Hmm ... (1) See the OP's comment to his/her question. The OP already knows the AM$\ge$GM trick, and he/she is looking for a different proof. (2) In your first line, why must all diagonal entries of $A$ equal one? (3) If you want to play the AM$\ge$GM trick, you may do so without most of the sidetracks in your proof. See, e.g., derivation (1) on p.2 of [the paper](https://cs.uwaterloo.ca/journals/JIS/VOL7/Sloane/sloane15.pdf) I linked to in my previous comment to the OP's question. – user1551 Dec 26 '12 at 09:17
  • RE: (1), I went back and read the comments, and that's a good question - what are some of the other ways to arrive at such a result? RE: (2), matrix elements can be either 0 or 1. If a diagonal element is zero, then the matrix is singular and the determinant is therefore also zero, thus all eigenvalues cannot be positive. RE: (3), that paper definitely spells out a more direct route, thanks for sharing it. – Bryce Dec 26 '12 at 23:13
  • Re: (2), $B=\begin{pmatrix}0&1\\1&0\end{pmatrix}$ is a 0-1 matrix that has a zero diagonal, but it is nonsingular and one of its eigenvalues is equal to $1$. Certainly, the $B$ in this counterexample is indefinite, and it turns out that any $A$ satisfying all the assumptions of the OP's question must have a diagonal of ones, but the point is, you seem to have jumped to this conclusion in the first line of your proof without substantiating it. – user1551 Dec 27 '12 at 06:13
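For reference, the counterexample in this last comment is easy to confirm (a quick check, assuming numpy):

```python
# user1551's matrix: zero diagonal, yet nonsingular, with spectrum {1, -1}.
import numpy as np

B = np.array([[0, 1],
              [1, 0]])
assert round(np.linalg.det(B)) == -1                 # nonsingular
assert np.allclose(sorted(np.linalg.eigvals(B).real), [-1.0, 1.0])
```

So nonsingularity alone does not force ones on the diagonal; the all-positive-spectrum hypothesis is what rules out matrices like $B$.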