This might be a dumb question; I know only enough group theory to be able to ask dumb questions.

Ken W. Smith has pointed out that one way to get intuition about the determinant is to observe that it maps matrix multiplication to real multiplication. As it is continuous, too, this means that it is a Lie group homomorphism from $\mathrm{GL}_n(\mathbb R)$ to the multiplicative group of the nonzero reals, $\mathbb R^\times$. The natural question to ask, then, is whether it is the only such homomorphism.

Obviously not: any function $\mathbf A \mapsto (\det\mathbf A)^k$ for $k\in\mathbb Z$ is also a homomorphism.

But are all homomorphisms between the two groups of such a form? That is,

Is every Lie group homomorphism from $\mathrm{GL}_n(\mathbb R)$ to $\mathbb R^\times$ identical to $\mathbf A \mapsto f(\det\mathbf A)$ for some homomorphism $f:\mathbb R^\times\to\mathbb R^\times$?

  • 1
    A Lie group homomorphism is determined by the corresponding Lie algebra homomorphism. So what are all of the Lie algebra homomorphisms $\mathfrak{gl}_n(\mathbb R) \to \mathbb R$? These are the elements of the dual space of $\mathfrak{gl}_n(\mathbb R)$ that vanish on commutators. Are multiples of the trace the only ones (the trace is the derivative of $\det$ at the identity)? – Eric O. Korman Mar 26 '14 at 03:52
  • 1
    @Eric: That sounds like a great question which I am completely unqualified to answer! However on Googling I found that [the commutator subgroup of $\mathrm{GL}_n(\mathbb R)$ is precisely $\mathrm{SL}_n(\mathbb R)$](http://twoplusonet.wordpress.com/2011/06/13/commutator-subgroup-of-glnr/), which I assume is equivalent to the commutators of $\mathfrak{gl}_n(\mathbb R)$ being the traceless matrices. So the answer to both our questions is... yes, right? –  Mar 26 '14 at 04:19
  • 1
    I'm not sure I see how knowing that $SL$ is the commutator subgroup of $GL$ tells you that the trace is the unique form that vanishes on commutators. You could probably prove this though using uniqueness of the Killing form on $\mathfrak{sl}_n(\mathbb R)$. – Eric O. Korman Mar 26 '14 at 04:29

3 Answers


In general, if $\phi : G \to H$ is a Lie group homomorphism then we get a Lie algebra homomorphism $\phi_* :\mathfrak g \to \mathfrak h$. If $\phi,\psi : G\to H$ are two Lie group homomorphisms with $G$ connected, then $\phi_* = \psi_*$ implies $\phi = \psi$; since $\mathrm{GL}_n(\mathbb R)$ has two components, the derivative determines the homomorphism on the identity component, and the other component is handled at the end. So we really just have to classify Lie algebra homomorphisms $\mathfrak{gl}_n(\mathbb R)\to \mathbb R$ and then see which ones lift to the groups.

Since the Lie algebra $\mathbb R$ is abelian, these Lie algebra homomorphisms are those elements of $\mathfrak{gl}_n(\mathbb R)^*$ that vanish on commutators. It turns out that this property uniquely characterizes the trace up to scale (I wasn't able to find a good reference for this but it should be easy, if a bit messy, to prove by hand).
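One way to convince yourself of this claim numerically: every commutator $[X,Y]=XY-YX$ is traceless, and random commutators already span the full $(n^2-1)$-dimensional space of traceless matrices, so the functionals vanishing on commutators form a single line, spanned by the trace. A small NumPy sanity check (illustrative only, not a proof):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3

# Every commutator [X, Y] = XY - YX is traceless, and random commutators
# already span the whole (n^2 - 1)-dimensional space of traceless matrices.
# Hence a functional vanishing on all commutators is determined up to scale:
# it must be a multiple of the trace.
comms = []
for _ in range(50):
    X = rng.standard_normal((n, n))
    Y = rng.standard_normal((n, n))
    C = X @ Y - Y @ X
    assert abs(np.trace(C)) < 1e-10
    comms.append(C.ravel())

rank = np.linalg.matrix_rank(np.stack(comms))
assert rank == n**2 - 1  # the span is exactly the traceless matrices
```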

So given $\phi : \mathrm{GL}_n(\mathbb R) \to \mathbb R^\times$, we have $\phi_* = c \operatorname{tr}$ for some $c \in \mathbb R$. On the identity component, $\phi_*$ integrates to $\mathrm{GL}_n(\mathbb R)_+ \ni A\mapsto(\det A)^c \in \mathbb R^+$. For this homomorphism to extend to all of $\mathrm{GL}_n(\mathbb R)$, the map $a \mapsto a^c$ must be well-defined on all of $\mathbb R^\times$, so the Lie group homomorphism is $\det$ followed by a continuous endomorphism of $\mathbb R^\times$.
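To see the integration step concretely: the trace is the derivative of $\det$ at the identity, and exponentiating turns $c\operatorname{tr}$ into $A\mapsto(\det A)^c$ on the identity component via $\det(\exp X)=\exp(\operatorname{tr}X)$. A quick numerical illustration (NumPy, with a hand-rolled power-series matrix exponential, so this is just a sanity check):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
X = rng.standard_normal((n, n))

# The trace is the derivative of det at the identity:
# det(I + tX) = 1 + t*tr(X) + O(t^2).
t = 1e-6
deriv = (np.linalg.det(np.eye(n) + t * X) - 1.0) / t
assert abs(deriv - np.trace(X)) < 1e-3

# Exponentiating: det(exp(X)) = exp(tr(X)), the group-level identity behind
# integrating phi_* = c*tr to A -> (det A)^c on the identity component.
# (exp computed by a truncated power series, fine for a sanity check.)
E = np.eye(n)
term = np.eye(n)
for k in range(1, 60):
    term = term @ X / k
    E = E + term
assert np.isclose(np.linalg.det(E), np.exp(np.trace(X)))
```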

Eric O. Korman

Here's a more elementary proof, which has the advantage of not using the assumption that the homomorphism is continuous. (It is a very minor adaptation of the proof of Theorem 3.2 in Cullen's Matrices and Linear Transformations, an early step in Cullen's axiomatic development of the determinant.)

Let $\varphi\colon \text{GL}_n(\mathbb{R})\to\mathbb{R}^\times$ be a group homomorphism. Denote elementary matrices as follows:

  1. $I_{kR_i}$ is the matrix obtained by multiplying the $i$th row of the identity matrix by $k$.
  2. $I_{kR_i+R_j}$ is the matrix obtained by adding $k$ times the $i$th row of the identity matrix to the $j$th row.
  3. $I_{R_i\leftrightarrow R_j}$ is the matrix obtained by exchanging the $i$th and $j$th rows of the identity matrix.

Let $i\colon\mathbb{R}^\times\to\text{GL}_n(\mathbb{R})$ be defined by $i(x) = I_{xR_1}$. Note that $i$ is a homomorphism. We will show that, for all $A\in\text{GL}_n(\mathbb{R})$, we have $$ \varphi(A) = \varphi(i(\det(A))) \tag{$\ast$} $$ This establishes the desired result with $f=\varphi\circ i$.

First, ($\ast$) holds for $A=I_{xR_1}$, since then $A=i(\det(A))$.

Second, from the identity $$ I_{kR_j} = I_{R_1\leftrightarrow R_j} I_{kR_1} I_{R_1\leftrightarrow R_j} $$ we obtain \begin{align*} \varphi(I_{kR_j}) &= \varphi(I_{R_1\leftrightarrow R_j} I_{kR_1} I_{R_1\leftrightarrow R_j}) \\ &= \varphi(I_{R_1\leftrightarrow R_j} I_{R_1\leftrightarrow R_j} I_{kR_1}) &&\text{(since $\mathbb{R}^\times$ is abelian)} \\ &= \varphi(I_{kR_1}) \end{align*} which yields that ($\ast$) holds for elementary matrices of type 1.
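This conjugation identity is easy to verify numerically; a small NumPy check (rows 0-indexed, helper names `swap` and `scale` are mine):

```python
import numpy as np

def swap(n, i, j):
    """I_{R_i <-> R_j}: identity with rows i and j exchanged."""
    P = np.eye(n)
    P[[i, j]] = P[[j, i]]
    return P

def scale(n, i, k):
    """I_{k R_i}: identity with row i multiplied by k."""
    D = np.eye(n)
    D[i, i] = k
    return D

n, j, k = 4, 2, 5.0
# I_{k R_j} = I_{R_1 <-> R_j} I_{k R_1} I_{R_1 <-> R_j}  (row 1 is index 0 here)
lhs = scale(n, j, k)
rhs = swap(n, 0, j) @ scale(n, 0, k) @ swap(n, 0, j)
assert np.allclose(lhs, rhs)
```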

Similarly, from the identity $$ I_{kR_j+R_i} = I_{k^{-1}R_j} I_{1R_j+R_i} I_{kR_j} $$ we get $\varphi(I_{kR_j+R_i}) = \varphi(I_{1R_j+R_i})$ for all nonzero $k\in\mathbb{R}$ (the case $k=0$ is immediate, since $I_{0R_j+R_i}=I$). But since $I_{1R_j+R_i}^2=I_{2R_j+R_i}$, this yields $\varphi(I_{1R_j+R_i})=1$, whence $\varphi(I_{kR_j+R_i})=1$, verifying ($\ast$) for elementary matrices of type 2.
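Both relations in this step can be checked with a few lines of NumPy (rows 0-indexed; `shear(n, i, j, k)` is my name for $I_{kR_i+R_j}$):

```python
import numpy as np

def scale(n, i, k):
    """I_{k R_i}: identity with row i multiplied by k."""
    D = np.eye(n)
    D[i, i] = k
    return D

def shear(n, i, j, k):
    """I_{k R_i + R_j}: identity with k times row i added to row j."""
    E = np.eye(n)
    E[j, i] = k
    return E

n, i, j, k = 3, 0, 2, 7.0
# I_{k R_j + R_i} = I_{k^{-1} R_j} I_{1 R_j + R_i} I_{k R_j}
assert np.allclose(shear(n, j, i, k),
                   scale(n, j, 1 / k) @ shear(n, j, i, 1) @ scale(n, j, k))
# I_{1 R_j + R_i}^2 = I_{2 R_j + R_i}
assert np.allclose(shear(n, j, i, 1) @ shear(n, j, i, 1), shear(n, j, i, 2))
```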

Similarly again, from the identity $$ I_{R_i\leftrightarrow R_j} = I_{(-1)R_i} I_{1R_i+R_j} I_{(-1)R_j+R_i} I_{1R_i+R_j} $$ we get $\varphi(I_{R_i\leftrightarrow R_j}) = \varphi(i(-1))$, which establishes ($\ast$) for elementary matrices of type 3.
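And the last identity, writing the swap as a product of type-1 and type-2 matrices, checks out the same way (NumPy, rows 0-indexed, with the same hypothetical helpers):

```python
import numpy as np

def scale(n, i, k):
    """I_{k R_i}: identity with row i multiplied by k."""
    D = np.eye(n)
    D[i, i] = k
    return D

def shear(n, i, j, k):
    """I_{k R_i + R_j}: identity with k times row i added to row j."""
    E = np.eye(n)
    E[j, i] = k
    return E

n, i, j = 3, 0, 1
# I_{R_i <-> R_j} = I_{(-1) R_i} I_{1 R_i + R_j} I_{(-1) R_j + R_i} I_{1 R_i + R_j}
prod = (scale(n, i, -1) @ shear(n, i, j, 1)
        @ shear(n, j, i, -1) @ shear(n, i, j, 1))
P = np.eye(n)
P[[i, j]] = P[[j, i]]  # the row-swap matrix
assert np.allclose(prod, P)
```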

Finally, since elementary matrices generate $\text{GL}_n(\mathbb{R})$, this establishes ($\ast$) for all invertible matrices.

  • Old post but I just want to say this is a lovely result. It's actually quite a strong statement. OP's function $f$ need not just be a homomorphism, but must also factor through $GL(n, K)$ in a certain way. – Charles Hudgins Sep 15 '21 at 21:27

Let $f : GL(n, K) \to K^\times$ be a homomorphism.

The subgroup of diagonal matrices is a product of $n$ copies of $GL(1) = K^\times$. The factors are conjugate within $GL(n)$ by swaps, and conjugate elements have the same image under a homomorphism into an abelian group, so $f$ acts identically on each factor. If $f' : K^\times \to K^\times$ is the restriction of $f$ to any such factor, we must have $f(\text{diag}(x_1, \ldots, x_n)) = f'(x_1) \cdots f'(x_n)$.

Every matrix in $GL(n)$ can be reduced to a diagonal matrix using only row operations in which we subtract a multiple of one row from another. For a given pair of distinct rows $i$ and $j$, we have a subgroup corresponding to these row operations. Let $E_k$ for $k \in K$ denote the subtraction of $k$ times the $i$th row from the $j$th row. For all nonzero $k, k' \in K$, the elements $E_k, E_{k'}$ are conjugate within $GL(n)$ via multiplication of the $i$th row by $k/k'$, and hence $f(E_k) = f(E_{k'})$ since $f$ is a homomorphism into an abelian group. In particular $f(E_k) = f(E_{2k}) = f(E_k)^2$, so $f(E_k) = 1$: the row operations don't alter the value of $f$.
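The conjugacy claim is easy to check concretely (a NumPy sketch, rows 0-indexed; the helper `E` is my own notation for the transvection):

```python
import numpy as np

n, i, j = 3, 0, 2
k, k2 = 5.0, -3.0

def E(m):
    """Subtract m times row i from row j (a transvection)."""
    M = np.eye(n)
    M[j, i] = -m
    return M

# E_k and E_{k'} are conjugate via the diagonal matrix multiplying row i
# by k/k', so a homomorphism into an abelian group takes the same value
# on both.
D = np.eye(n)
D[i, i] = k / k2
assert np.allclose(D @ E(k) @ np.linalg.inv(D), E(k2))
```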

Since these row operations change neither the value of $f$ nor the determinant, we get $f(A) = f(\text{diag}(x_1, \ldots, x_n)) = f'(x_1) \cdots f'(x_n) = f'(x_1 \cdots x_n) = f'(\det A)$, using that $f'$ is itself a homomorphism. This reduces the problem to the $n = 1$ case, where the determinant is simply the identity function, so every $f$ must be the composition of $\det$ with some endomorphism of $K^\times$.

When $K = \mathbb{R}$ the endomorphisms are the solutions to the classical functional equation $f(xy) = f(x) f(y)$. If we require continuity, they take the form $f(x) = f(\operatorname{sign}(x))\, |x|^a$ for some $a \in \mathbb{R}$, where $\operatorname{sign} : K^\times \to \{+1, -1\}$ and $f(-1)$ can be $+1$ or $-1$.
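For $K=\mathbb{R}$ these continuous endomorphisms can be written down and their multiplicativity checked directly (a Python sketch; `make_f` and its parameter names are my own illustrative choices):

```python
import math

def make_f(a, f_of_minus_one):
    """Continuous endomorphism of R^x: f(x) = f(sign(x)) * |x|**a,
    where f(-1) is +1 or -1 and a is any real exponent."""
    def f(x):
        s = f_of_minus_one if x < 0 else 1.0
        return s * abs(x) ** a
    return f

# a = 1 with f(-1) = -1 is the identity map, so composing with det
# recovers det itself; other choices give the remaining homomorphisms.
f = make_f(2.5, -1.0)
for x, y in [(2.0, 3.0), (-1.5, 4.0), (-2.0, -0.5)]:
    assert math.isclose(f(x * y), f(x) * f(y))
```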

Per Vognsen