Is it possible to divide a matrix by another? If yes, what will be the result of $\dfrac AB$ if $$ A = \begin{pmatrix} a & b \\ c & d \\ \end{pmatrix}, B = \begin{pmatrix} w & x \\ y & z \\ \end{pmatrix}? $$

see for example [here](http://www.mathwords.com/i/inverse_of_a_matrix.htm). You'll have to distinguish between $AB^{-1}$ and $B^{-1}A$. – Raymond Manzoni Nov 03 '12 at 16:43

To add various types of brackets around entries of a matrix, you can use pmatrix or bmatrix or other variants. See [tutorial at meta](http://meta.math.stackexchange.com/a/5023/). – Martin Sleziak Nov 03 '12 at 16:49
7 Answers
For ordinary numbers $\frac{a}{b}$ means the solution to the equation $xb=a$. This is the same as $bx=a$, but since matrix multiplication is not commutative, there are two different possible generalizations of "division" to matrices.
If $B$ is invertible, then you can form $AB^{-1}$ or $B^{-1}A$, but these are not in general the same matrix. They are the solutions to $XB=A$ and $BX=A$ respectively.
If $B$ is not invertible, then $XB=A$ and $BX=A$ may have solutions, but the solutions will not be unique. So in that situation speaking of "matrix division" is even less warranted.
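A quick numerical sketch of the two candidate "quotients", using small hand-picked $2\times2$ matrices (the particular entries are illustrative choices, not taken from the question):

```python
# For 2x2 matrices, the two "divisions" A * inv(B) and inv(B) * A
# generally differ. Matrices are lists of rows; pure-Python helpers.

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv2(M):
    (a, b), (c, d) = M
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is not invertible")
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 1]]

right = matmul(A, inv2(B))   # solves X B = A
left  = matmul(inv2(B), A)   # solves B X = A

print(right)
print(left)
```

Running this shows `right` and `left` are different matrices, even though each one "divides $A$ by $B$" in its own sense; multiplying `right` by `B` on the right recovers `A`.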
There is a way to perform a sort of division, but I am not sure if it is the way you are looking for. For motivation, consider the ordinary real numbers $\mathbb{R}$. For two real numbers, $x/y$ is really the same as multiplying $x$ by $y^{-1}=1/y$. We call $y^{-1}$ the inverse of $y$, and note that it has the property that $yy^{-1}=1.$
The same goes for other algebraic structures. That is, for two elements $x,y$ in such a structure we define $x/y$ as $xy^{-1}$ (under some operation). Most notably, we have a notion of division in any division ring (hence the name!). It turns out that if you consider invertible $n \times n$ matrices with addition and ordinary matrix multiplication, there is a sensible way to define division, since every invertible matrix has, well, an inverse. So just to help you grasp what an inverse is, say that you have a $2\times2$ matrix $$A= \begin{bmatrix} a & b \\ c & d \end{bmatrix}.$$ The inverse of $A$ is then given by $$A^{-1} = \dfrac{1}{ad-bc} \begin{bmatrix} d & -b \\ -c & a \end{bmatrix}$$ and you should check that $AA^{-1}=E$, the identity matrix. Now, for two matrices $B$ and $A$, $B/A = BA^{-1}$.
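You can check the $2\times2$ inverse formula numerically; the entries $a,b,c,d$ below are arbitrary illustrative values, chosen so that $ad-bc\ne0$:

```python
# Check of the 2x2 inverse formula A^{-1} = (1/(ad-bc)) [[d,-b],[-c,a]]
# on sample numbers (a, b, c, d are free choices here, with ad - bc = 1).
a, b, c, d = 2.0, 1.0, 5.0, 3.0
det = a * d - b * c
A = [[a, b], [c, d]]
Ainv = [[d / det, -b / det], [-c / det, a / det]]

# Multiply A by Ainv and confirm the product is the identity matrix E.
E = [[sum(A[i][k] * Ainv[k][j] for k in range(2)) for j in range(2)]
     for i in range(2)]
print(E)
```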
Normally, matrix division is defined as $\frac{A}{B}=AB^{-1}$, where $B^{-1}$ stands for the inverse matrix of $B$. In the case where the inverse doesn't exist, the so-called pseudoinverse may be used.
We can say $$\frac{A}{B}=A\times B^{-1}$$ where $B^{-1}$ is the inverse matrix of $B$.

That is another possibility, since the product of matrices is not commutative. So division of matrices cannot be defined in a unique way. – Adi Dani Nov 03 '12 at 19:02
There are two issues: first, that matrices have divisors of zero; second, that matrix multiplication is in general not commutative.
To give meaning to $A/B$, you need to give meaning to $I/B$ (because then $A/B=A(I/B)$). Now, no one ever writes $I/B$; people actually write $B^{-1}$. Anyway, what is $B^{-1}$? It should be a matrix that, multiplied by $B$, gives you the identity. Now, there exist nonzero matrices $C$, $B$ with $BC=0$. If $B$ had an inverse $B^{-1}$, we would have $$ 0=B^{-1}0=B^{-1}BC=C, $$ a contradiction. So such a matrix $B$ cannot have an inverse, i.e. "$I/B$" does not make sense.
The invertible matrices are exactly those with nonzero determinant. So, if $\det B\ne0$, then $AB^{-1}$ does make sense.
In your case, that would be the condition $wz-xy\ne0$. In that case, $$ \begin{bmatrix}w&x\\ y&z\end{bmatrix}^{-1}=\frac1{wz-xy}\begin{bmatrix}z&-x\\ -y&w\end{bmatrix}. $$
The second issue is a non-issue, because it can be proven that, for matrices, if $B^{-1}A=I$, then $AB^{-1}=I$.
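The zero-divisor argument above can be seen concretely; the matrices $B$ and $C$ below are illustrative choices, not from the question:

```python
# A singular B (det = 0) that kills a nonzero C, so no B^{-1} can exist:
# if it did, 0 = B^{-1}(BC) = C, contradicting C != 0.

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

B = [[1, 1], [1, 1]]      # det = 1*1 - 1*1 = 0
C = [[1, -1], [-1, 1]]    # nonzero matrix

print(matmul(B, C))       # the zero matrix, even though B and C are nonzero
```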
Do you know why matrix multiplication is defined in such a weird way? It is defined thus so that the effect on a column vector of left-multiplying it by one matrix, and then left-multiplying the result by another matrix, is exactly the same as the effect of left-multiplying by a single matrix that is the product of those two matrices. That is, matrix multiplication corresponds to composition of linear operators. If you didn't know that already, then you should try to convince yourself that it's true in a simple class of special cases. I would recommend trying a 2 by 1 column vector with variables for entries, and two 2 by 2 matrices, also with variables for entries.

So "dividing by" a matrix would correspond to "undoing" the effect of a linear operator. For instance, rotating the plane clockwise by a certain angle "undoes" the anticlockwise rotation of the plane by the same angle. However, there are plenty of linear operators that can't be undone (because they have squashed something flat), and the matrices representing these operators (with respect to any given basis) must therefore be "not dividable by". This is unlike the situation with the real numbers, in which there is only one number by which you can't divide, namely zero.
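The rotation example can be sketched numerically; the angle below is an arbitrary choice:

```python
import math

# Rotation by theta (anticlockwise) is undone by rotation by -theta:
# their product is, up to floating-point error, the identity matrix.

def rot(theta):
    return [[math.cos(theta), -math.sin(theta)],
            [math.sin(theta),  math.cos(theta)]]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

theta = 0.7                            # arbitrary sample angle in radians
P = matmul(rot(-theta), rot(theta))    # "undo" then "do": identity
print(P)
```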
Elementwise division is frequently used to normalize matrices in R or MATLAB. For instance, in R, if dtm is a contingency matrix counting occurrences of something, you can normalize the rows so that every row sums to 1.0 as:
dtm.proportions <- dtm / rowSums(dtm)
Or normalize the columns by transposing the matrix first:
dtm.proportions <- t(t(dtm) / colSums(dtm))