I recently took a linear algebra course. All I learned about orthogonal matrices is that the transpose of $Q$ is its inverse, which gives a nice computational property. Recently, to my surprise, I learned that transformations by orthogonal matrices generalize rotations and reflections: they preserve lengths and angles. Why is this true?

On a side note, I would like to learn more about linear algebra, in particular with an emphasis on visualization or geometric interpretations such as the one above. Are there any good textbooks or resources for that?

Rodrigo de Azevedo
    Hint regarding the length-preserving property: For every matrix $Q$ you have $$(Qx)^T (Qx) = x^TQ^T Q x = x^T (Q^T Q) x.$$ Now use the fact that $Q^T$ is the inverse of $Q$. – gammatester Dec 19 '13 at 14:08
    I think it's more accurate to say that orthogonal maps _are_ exactly the reflections and rotations, not merely "generalize". – paul garrett Dec 19 '13 at 14:55
  • @paulgarrett but that's just in dimension $\leq3$, no? – Andrea Mori Dec 21 '13 at 20:41
  • @AndreaMori, I suppose it depends on what characterization of "rotations and reflections" one takes in higher dimensions. There might not be unanimity? But one possibility, which I endorse, is that these are actions given by orthogonal groups, ... otherwise I don't know what "reflections and rotations" would mean. Maybe there is a convention... – paul garrett Dec 21 '13 at 20:44
    @MongHNg 's comment posted as answer: *"For some reason I could not comment, but a great reference book is optimization model by Laurent el ghaoui. It is the textbook used by UC Berkeley in one of its upper division course. It merges Lin Alg with many Machine learning application. Fun read! Also I heard Convex Optimization – Boyd and Vandenberghe is pretty good too."* – peterh Feb 03 '18 at 01:18
  • The best resource I know to learn about the geometrical interpretation of the basics of linear is this 3blue1brown series https://www.youtube.com/playlist?list=PLZHQObOWTQDPD3MizzM2xVFitgF8hE_ab – tales Jan 10 '22 at 07:23

3 Answers


If $X$ and $Y$ are column vectors in $\Bbb R^n$, their scalar product can be computed via row-column multiplication as $$ \left<X,Y\right>={}^tX\cdot Y. $$ Now, if $A$ is an orthogonal matrix, we have $$ \left<A\cdot X,A\cdot Y\right>={}^t(A\cdot X)\cdot(A\cdot Y)= {}^tX\cdot{}^tA\cdot A\cdot Y={}^tX\cdot A^{-1}\cdot A\cdot Y={}^tX\cdot Y. $$ This shows that the transformation $X\mapsto A\cdot X$ preserves scalar products, and in this respect it can be considered a generalization of reflections and rotations.
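For instance, this identity can be checked numerically with NumPy (a small sketch; the rotation matrix and the vectors below are just arbitrary examples):

```python
import numpy as np

# A sample orthogonal matrix: rotation by 30 degrees in the plane.
theta = np.pi / 6
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

X = np.array([1.0, 2.0])
Y = np.array([3.0, -1.0])

# <AX, AY> equals <X, Y> because A^T A = I.
print(np.dot(A @ X, A @ Y))  # same as np.dot(X, Y), up to roundoff
print(np.dot(X, Y))
```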

Andrea Mori
    And, conversely, if equality holds for all column vectors $X$ and $Y$, then $A^{-1}\cdot A = I$: Take standard basis vectors $X=\mathbf{e}_i$ and $Y=\mathbf{e}_j$ to conclude that the $(i,j)$-entry of $A^{-1}\cdot A$ is $\delta_{ij}$, a.k.a. the $(i,j)$-entry of the identity matrix $I$. :) – Andrew D. Hwang Dec 19 '13 at 14:27
  • @user86418 : you mean the transpose of $A$, not its inverse. – Andrea Mori Dec 19 '13 at 14:33
  • Oops...yes, of course. – Andrew D. Hwang Dec 19 '13 at 14:50

Since you are interested in visualizations (sorry, I have no reference recommendation), I'll try to explain it that way. There is something called the eigenvalue decomposition of a matrix, which is closely related to the Schur decomposition. Since complex vectors are hard to visualize, I will use here the real Schur decomposition, which tells us that every real matrix $A$ can be decomposed as

$$A = Q T Q^T,$$

where $Q$ is orthogonal and $T$ is quasitriangular (block triangular with the diagonal blocks of order $1$ and $2$).

Now, if we assume that $A$ is also orthogonal, we can show that $T$ is quasidiagonal, i.e., block diagonal with the diagonal blocks of order $1$ and $2$, and also orthogonal.

Since $Q$ is orthogonal, its columns form an orthonormal basis. In other words, this is very much like the standard coordinate system, except possibly rotated (the basis vectors are still mutually orthogonal and have length $1$).

So, in that orthonormal basis, an orthogonal operator is just a quasidiagonal matrix

$$T = T_1 \oplus T_2 \oplus \cdots \oplus T_k = \operatorname{diag}(T_1, T_2, \dots, T_k),$$

and its diagonal elements can be:

  • $T_i = 1$, which is a null-reflector (does nothing);
  • $T_i = -1$, which is a simple reflection along the corresponding coordinate vector (column of $Q$);
  • a matrix of order $2$ with $a^2 + b^2 = 1$, i.e.,

    $$T_i = \begin{bmatrix} a & b \\ -b & a \end{bmatrix} \quad \text{or} \quad T_i = \begin{bmatrix} a & b \\ b & -a \end{bmatrix},$$

    so $|\det T_i| = 1$. The first of these two is a rotation ($\det T_i = 1$, complex conjugate eigenvalues of modulus $1$), and the second one ($\det T_i = -1$, real eigenvalues $\pm 1$) is a rotation composed with a reflection along the second of the two corresponding coordinate axes.

So, basically, an orthogonal matrix is just a combination of one-dimensional reflections and rotations, written in an appropriately chosen orthonormal basis (the coordinate system you're used to, but possibly rotated).
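This quasidiagonal structure can be observed numerically with SciPy's real Schur decomposition (a sketch; the matrix `A` below is just an arbitrary product of two plane rotations, not from the original answer):

```python
import numpy as np
from scipy.linalg import schur

def givens(n, i, j, theta):
    """n x n rotation by theta in the (i, j) coordinate plane."""
    G = np.eye(n)
    c, s = np.cos(theta), np.sin(theta)
    G[i, i] = G[j, j] = c
    G[i, j], G[j, i] = -s, s
    return G

# An orthogonal 3x3 matrix that mixes all three coordinates.
A = givens(3, 0, 1, 0.7) @ givens(3, 1, 2, 1.1)

# Real Schur form: A = Q T Q^T with Q orthogonal, T quasitriangular.
# Since A is orthogonal, T is orthogonal too, hence quasidiagonal:
# here a 2x2 rotation block and a 1x1 block equal to 1.
T, Q = schur(A, output='real')
print(np.round(T, 3))
```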

Fun fact: All orthogonal matrices (even rotations) of order $n$ can be expressed as compositions of at most $n$ reflections. This is called the Cartan–Dieudonné theorem, and it works over fields more general than $\mathbb{R}$ and for all non-degenerate symmetric bilinear forms (read: generalizations of the standard scalar product).
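As a concrete instance of this fact, a plane rotation by $\theta$ is the composition of two reflections across lines meeting at angle $\theta/2$. A small NumPy sketch (the helper `reflector` and the chosen lines are illustrative, not part of the theorem's statement):

```python
import numpy as np

def reflector(v):
    """Householder reflection across the hyperplane with normal v."""
    v = v / np.linalg.norm(v)
    return np.eye(len(v)) - 2.0 * np.outer(v, v)

theta = np.pi / 3  # target rotation angle

H1 = reflector(np.array([0.0, 1.0]))  # reflection across the x-axis
H2 = reflector(np.array([-np.sin(theta / 2), np.cos(theta / 2)]))
# H2 reflects across the line at angle theta/2 to the x-axis.

# Composing the two reflections gives the rotation by theta.
R = H2 @ H1
print(np.round(R, 3))
```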

Vedran Šego

Orthogonal matrices preserve angles because they leave the scalar product intact:

$$(Qv,Qu) = (Q^{\ast}Qv,u) = (v,u).$$

As a consequence, $\|Qu\| = \|u\|$, and therefore the angle between $u$ and $v$ doesn't change under an orthogonal transformation.

As a particular case, rotations and reflections are linear orthogonal transformations.
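Both invariances are easy to check numerically (a sketch; the random orthogonal $Q$ below is generated via a QR factorization purely for illustration):

```python
import numpy as np

# A random orthogonal Q: the Q factor of a random matrix's QR factorization.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))

u = rng.standard_normal(4)
v = rng.standard_normal(4)

def angle(a, b):
    """Angle between vectors a and b, from the scalar product."""
    return np.arccos(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Lengths and angles are unchanged by Q.
print(np.linalg.norm(Q @ u), np.linalg.norm(u))  # equal up to roundoff
print(angle(Q @ u, Q @ v), angle(u, v))          # equal up to roundoff
```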
