Let the $c_i$ be column vectors of a matrix (rows could equivalently be used).

The formal definition of the determinant (that I'm familiar with) is as follows:

$\det:\Bbb R^{n\times n}\to\Bbb R$, written $\det(c_1,c_2,\cdots,c_n)$, is the unique function that is alternating and multilinear in all its arguments and also satisfies $\det(\operatorname{Id})=1$.

I have seen the proofs that this generates the more familiar ways of computing the determinant and all its properties.

However, it doesn't quite scratch the itch of the geometric meaning. Yes, this determinant is the scaling of volume under a linear map, but it doesn't seem motivated to be that way.

The proof that the Lebesgue measure of a measurable set in $\Bbb R^n$ scales by the determinant under a linear map relies on the following:

Let $A$ be a square matrix. The polar decomposition theorem gives $A=QS$, for an orthogonal $Q$ and a positive semi-definite, symmetric matrix $S$ (I write $S$ rather than $R$ to avoid confusion with the $QR$ decomposition). The spectral theorem gives that $S$ can be orthogonally diagonalised as $S=UDU^{-1}$, for $U$ an orthogonal matrix and $D$ a diagonal matrix. It can be shown that the Lebesgue measure of a set is unchanged under an orthogonal map, and that the effect of a composition of linear maps is the composition of the individual effects - thus the Lebesgue measure of a set under $A$ is changed only by as much as it would be changed under $D$, and as $D$ is diagonal it can be shown that the (absolute value of the) determinant of $D$ is precisely this scaling factor. As $|\det(QUDU^{-1})|=|\det(D)|$, we have that $|\det(A)|$ is precisely this scaling factor.
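As a concrete sanity check of this scaling argument, here is a small pure-Python sketch for the $2\times 2$ case (the helper names `matmul2` and `area_scale` are my own): the area of the image of the unit square under a linear map is given by the parallelogram formula on the column vectors, a rotation leaves it unchanged, a diagonal map scales it by the product of its entries, and composing the maps composes the scalings.

```python
import math

def matmul2(A, B):
    """Multiply two 2x2 matrices given as [[a, b], [c, d]]."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def area_scale(A):
    """Area of the image of the unit square under A:
    the absolute value of the cross product of the two columns."""
    return abs(A[0][0] * A[1][1] - A[0][1] * A[1][0])

theta = 0.7
Q = [[math.cos(theta), -math.sin(theta)],   # a rotation: orthogonal
     [math.sin(theta),  math.cos(theta)]]
D = [[2.0, 0.0],                            # diagonal: scales area by 2 * 3
     [0.0, 3.0]]

print(area_scale(Q))              # ~1.0: rotations preserve area
print(area_scale(D))              # 6.0: product of the diagonal entries
print(area_scale(matmul2(Q, D)))  # ~6.0: composition composes the scalings
```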

So that leads me to consider the following characterisation of the determinant:

$\det(c_1,c_2,\cdots,c_n):\Bbb R^{n\times n}\to\Bbb R$ is the (unique?) function that satisfies:

  • $\det(AB)=\det(A)\cdot\det(B)$ for any square matrices of the same dimension $A,B$
  • $\det(Q)=1$ for any orthogonal matrix $Q$ that preserves orientation (a rotation), and $\det(Q)=-1$ if $Q$ involves a reflection
  • $\det(D)$ is the product of diagonal entries for any diagonal matrix $D$.
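For what it's worth, the familiar $2\times 2$ determinant does satisfy all three of these properties, which can be checked on concrete examples (a minimal pure-Python sketch; `det2` and `matmul2` are my own helpers):

```python
import math

def det2(A):
    """The usual 2x2 determinant, written out explicitly."""
    return A[0][0] * A[1][1] - A[0][1] * A[1][0]

def matmul2(A, B):
    """Multiply two 2x2 matrices given as [[a, b], [c, d]]."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

theta = 1.1
R = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]   # rotation (orthogonal, no reflection)
F = [[1.0, 0.0], [0.0, -1.0]]               # reflection across the x-axis
D = [[4.0, 0.0], [0.0, 0.5]]                # diagonal
A = [[1.0, 2.0], [3.0, 4.0]]
B = [[0.0, 1.0], [5.0, -2.0]]

print(abs(det2(matmul2(A, B)) - det2(A) * det2(B)) < 1e-12)  # multiplicative
print(abs(det2(R) - 1.0) < 1e-12)   # orthogonal, no reflection: +1
print(det2(F) == -1.0)              # orthogonal with a reflection: -1
print(det2(D) == 4.0 * 0.5)         # diagonal: product of entries
```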

Do these two characterisations define the same function $\det$? I know the first characterisation implies mine, but I'm curious whether one can motivate and define the determinant purely geometrically.

Many thanks for any insight.

  • You might appreciate how [this post](https://math.stackexchange.com/q/668/81360) connects the usual definition to the geometric meaning of the determinant. – Ben Grossmann Nov 04 '21 at 20:13
  • Thank you, but I'm looking for formalities here @BenGrossmann. I am familiar already with the arguments presented in that post judging by what I can skim – FShrike Nov 04 '21 at 20:19
  • In fact, I suspect that $\det(AB) = \det(A)\det(B)$ with $\det(I) = 1$ is sufficient, which is even "cheaper" than the definition you give. – Ben Grossmann Nov 04 '21 at 20:28
  • Apparently [it's not quite enough](https://math.stackexchange.com/q/668/81360), but it's close. – Ben Grossmann Nov 04 '21 at 20:32

1 Answer


Yes, your characterization is sufficient to uniquely describe the determinant. One proof of this is as follows.

Let $\delta:\Bbb R^{n \times n} \to \Bbb R$ be a function satisfying the three properties you list. Properties 2 and 3 imply that for any diagonal matrix $D$ and orthogonal matrix $Q$, we have $\delta(D) = \det(D)$ and $\delta(Q) = \det(Q)$.

Let $A$ be an arbitrary $n \times n$ matrix. $A$ has a singular value decomposition $A = U \Sigma V^T$ with $\Sigma$ diagonal (with nonnegative entries) and $U,V$ orthogonal. It follows that \begin{align} \delta(A) &= \delta([U\Sigma]V^T) = \delta(U\Sigma) \delta(V^T) = \delta(U) \delta(\Sigma) \delta(V^T) \\ & = \det(U)\det(\Sigma) \det(V^T) = \det(U\Sigma V^T) = \det(A), \end{align} which was what we wanted.
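One can illustrate this SVD step numerically in the $2\times 2$ case by building $A = U\Sigma V^T$ from a chosen rotation $U$, diagonal $\Sigma$, and rotation $V$, and checking that $\det(A)$ equals $\det(U)\det(\Sigma)\det(V^T)$ (a hedged sketch with my own helper names, not part of the proof):

```python
import math

def matmul2(A, B):
    """Multiply two 2x2 matrices given as [[a, b], [c, d]]."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def det2(A):
    """The usual 2x2 determinant."""
    return A[0][0] * A[1][1] - A[0][1] * A[1][0]

def rot(t):
    """Rotation by angle t: an orthogonal matrix with determinant 1."""
    return [[math.cos(t), -math.sin(t)], [math.sin(t), math.cos(t)]]

def transpose2(A):
    return [[A[0][0], A[1][0]], [A[0][1], A[1][1]]]

U = rot(0.3)
V = rot(-1.2)
Sigma = [[5.0, 0.0], [0.0, 2.0]]    # diagonal with nonnegative entries

A = matmul2(matmul2(U, Sigma), transpose2(V))

# det(A) agrees with det(U) * det(Sigma) * det(V^T) = 1 * 10 * 1
print(det2(A))  # ~10.0
```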

Ben Grossmann
  • I suppose I really could have just taken the measure argument in reverse! I wasn't sure if my deliberately cheap characterisation would match up to the proper formal one – FShrike Nov 04 '21 at 20:22