**Theorem:**

If $A$ and $B$ are square matrices of the same order such that $A B=I$, then $B A=I$.

**Proof:**

Since the determinant of a product of square matrices is the product of the determinants, we get that

$\det A\cdot\det B=\det(A B)=\det I=1\ne0$.

Hence $\;\det A\ne0\;$ and $\;\det B\ne0\;$.

Now we consider the matrix

$C=\frac{1}{\det A}\cdot\text{adj}(A)$

where $\;\text{adj}(A)\;$ is the adjugate matrix of $A$, that is, the transpose of the cofactor matrix of $A$.

$\text{adj}(A)=
\begin{pmatrix}
A_{1,1} & A_{2,1} & \cdots & A_{n,1} \\
A_{1,2} & A_{2,2} & \cdots & A_{n,2} \\
\vdots & \vdots & \ddots & \vdots \\
A_{1,n} & A_{2,n} & \cdots & A_{n,n}
\end{pmatrix}$

where $\;A_{i,j}\;$ is the cofactor of the element $a_{i,j}$ of the matrix $A$.

So $\;A_{i,j}=(-1)^{i+j}\det M_{i,j}$

where $\;M_{i,j}\;$ is the submatrix of $A$ formed by deleting the $i^{th}$ row and the $j^{th}$ column.
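These definitions translate directly into code. The sketch below is my own illustration (not part of the proof); it uses exact integer arithmetic and computes cofactors and the adjugate of a small matrix of my own choosing:

```python
def det(M):
    # Determinant via Laplace expansion along the first row.
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j]
               * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

def cofactor(M, i, j):
    # A_{i,j} = (-1)^{i+j} det(M_{i,j}), where M_{i,j} deletes row i and
    # column j (indices here are 0-based, unlike the 1-based text).
    minor = [row[:j] + row[j + 1:] for k, row in enumerate(M) if k != i]
    return (-1) ** (i + j) * det(minor)

def adjugate(M):
    # Transpose of the cofactor matrix: adj(A)[i][j] is the cofactor A_{j,i}.
    n = len(M)
    return [[cofactor(M, j, i) for j in range(n)] for i in range(n)]

A = [[2, 1], [1, 1]]
print(det(A))       # 1
print(adjugate(A))  # [[1, -1], [-1, 2]]
```

Since $\det A=1$ for this example, the matrix $C$ of the proof coincides with $\text{adj}(A)$ itself.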

We will use the Laplace expansions, that is, the following equalities (expansion along a row and along a column, respectively):

$a_{i,1}A_{j,1}+a_{i,2}A_{j,2}+\ldots+a_{i,n}A_{j,n}=
\begin{cases}
\det A\;,\quad\text{ if } i=j\\
0\;,\quad\quad\;\,\text{ if } i\ne j
\end{cases}$

$A_{1,i}a_{1,j}+A_{2,i}a_{2,j} +\ldots+A_{n,i}a_{n,j}=
\begin{cases}
\det A\;,\quad\text{ if } i=j\\
0\;,\quad\quad\;\,\text{ if } i\ne j
\end{cases}$
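Both families of equalities can be spot-checked numerically. The sketch below is an illustration with a $3\times3$ matrix of my own choosing; `det` and `cofactor` are naive cofactor-expansion helpers, not part of the proof:

```python
def det(M):
    # Determinant via Laplace expansion along the first row.
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j]
               * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

def cofactor(M, i, j):
    # Cofactor A_{i,j} (0-based indices).
    minor = [row[:j] + row[j + 1:] for k, row in enumerate(M) if k != i]
    return (-1) ** (i + j) * det(minor)

A = [[3, 1, 4], [1, 5, 9], [2, 6, 5]]
n, d = len(A), det(A)

for i in range(n):
    for j in range(n):
        # Row version: a_{i,1}A_{j,1} + ... + a_{i,n}A_{j,n}
        assert sum(A[i][k] * cofactor(A, j, k) for k in range(n)) \
               == (d if i == j else 0)
        # Column version: A_{1,i}a_{1,j} + ... + A_{n,i}a_{n,j}
        assert sum(cofactor(A, k, i) * A[k][j] for k in range(n)) \
               == (d if i == j else 0)
```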

By the first family of equalities, the $(i,j)$ entry of $A\cdot\text{adj}(A)$, namely $a_{i,1}A_{j,1}+\ldots+a_{i,n}A_{j,n}$, equals $\det A$ if $i=j$ and $0$ otherwise; likewise, by the second family, the $(i,j)$ entry of $\text{adj}(A)\cdot A$, namely $A_{1,i}a_{1,j}+\ldots+A_{n,i}a_{n,j}$, equals $\det A$ if $i=j$ and $0$ otherwise. Hence $A\cdot\text{adj}(A)=\text{adj}(A)\cdot A=(\det A)\,I$, and dividing by $\det A\ne0$ gives

$A C = C A = I$.

Since by hypothesis $\;A B =I\;,\;$ we have $\;C(A B)= C I\;$; by associativity, $\;(C A)B=C\;$, hence $\;I B = C\;$, and so $\;B=C$.

Consequently,

$B A = C A = I\;,$

so we have proved that

$B A = I$.
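The whole argument can be checked on a concrete pair of matrices. In the sketch below (my own example; the matrix is chosen with $\det A=1$ so that $C=\text{adj}(A)$ has integer entries), `B` is a right inverse of `A`, and the final assertion confirms it is also a left inverse:

```python
def det(M):
    # Determinant via Laplace expansion along the first row.
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j]
               * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

def cofactor(M, i, j):
    # Cofactor A_{i,j} (0-based indices).
    minor = [row[:j] + row[j + 1:] for k, row in enumerate(M) if k != i]
    return (-1) ** (i + j) * det(minor)

def adjugate(M):
    # Transpose of the cofactor matrix.
    n = len(M)
    return [[cofactor(M, j, i) for j in range(n)] for i in range(n)]

def matmul(X, Y):
    # Product of two square matrices of the same order.
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[2, 1], [1, 1]]       # det A = 1
B = adjugate(A)            # C = adj(A)/det(A) = adj(A) here
I = [[1, 0], [0, 1]]
assert matmul(A, B) == I   # hypothesis: A B = I
assert matmul(B, A) == I   # conclusion: B A = I
```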