Questions tagged [matrices]

For any topic related to matrices. This includes: systems of linear equations, eigenvalues and eigenvectors (diagonalization, triangularization), determinant, trace, characteristic polynomial, adjugate and adjoint, transpose, Jordan normal form, matrix algorithms (e.g. LU, Gauss elimination, SVD, QR), invariant factors, quadratic forms, etc. For questions specifically concerning matrix equations, use the (matrix-equations) tag.

A matrix is a rectangular array of elements, usually numbers or variables, arranged in rows and columns. A matrix with $m$ rows and $n$ columns has $mn$ entries and is called an $m \times n$ matrix. Matrices are a central object of linear algebra.

Matrices of the same shape can be added and subtracted. Furthermore, if they have compatible shapes, they can be multiplied. More precisely, given two matrices $A$ and $B$, the product $AB$ is defined when the number of columns of $A$ equals the number of rows of $B$. In particular, any two $n \times n$ matrices $A$ and $B$ can be multiplied in either order: both $AB$ and $BA$ exist, though in general $AB \neq BA$.
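The shape rule above can be checked quickly with NumPy (assumed available; this sketch is not part of the tag wiki):

```python
import numpy as np

A = np.ones((2, 3))  # 2 rows, 3 columns
B = np.ones((3, 4))  # 3 rows, 4 columns

# A has 3 columns and B has 3 rows, so AB is defined and has shape 2x4.
AB = A @ B

# B @ A would raise an error: B has 4 columns but A has only 2 rows.
# For square matrices of the same size, both products always exist:
C = np.eye(3)
D = np.arange(9.0).reshape(3, 3)
both_defined = (C @ D).shape == (3, 3) and (D @ C).shape == (3, 3)
```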


51,383 questions
763 votes · 17 answers

What's an intuitive way to think about the determinant?

In my linear algebra class, we just talked about determinants. So far I’ve been understanding the material okay, but now I’m very confused. I get that when the determinant is zero, the matrix doesn’t have an inverse. I can find the determinant of a…
Jamie Banks • 12,214
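One common intuition (an illustration, not one of the posted answers): $|\det A|$ is the factor by which $A$ scales areas (in 2D) or volumes (in higher dimensions). A quick NumPy check:

```python
import numpy as np

# A stretches the x-axis by 2 and the y-axis by 3, so it multiplies areas by 6.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
detA = np.linalg.det(A)

# A shear slides the plane sideways but preserves area: determinant 1.
S = np.array([[1.0, 5.0],
              [0.0, 1.0]])
detS = np.linalg.det(S)

# A singular matrix flattens the plane onto a line: zero area, hence no inverse.
Z = np.array([[1.0, 2.0],
              [2.0, 4.0]])
detZ = np.linalg.det(Z)
```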
439 votes · 4 answers

What is the intuitive relationship between SVD and PCA?

Singular value decomposition (SVD) and principal component analysis (PCA) are two eigenvalue methods used to reduce a high-dimensional data set into fewer dimensions while retaining important information. Online articles say that these methods are…
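One standard form of the SVD/PCA connection (sketched with NumPy as an illustration, not one of the posted answers): for a centered data matrix $X$ with $n$ rows, the squared singular values of $X$ divided by $n-1$ are exactly the eigenvalues of the sample covariance matrix that PCA diagonalizes.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
X = X - X.mean(axis=0)                    # PCA requires centered data

# The SVD route: decompose the data matrix directly.
U, s, Vt = np.linalg.svd(X, full_matrices=False)

# The PCA route: eigen-decompose the sample covariance matrix.
cov = X.T @ X / (len(X) - 1)
eigvals = np.linalg.eigvalsh(cov)[::-1]   # sort descending to match s

# The two are linked by s^2 / (n - 1) = eigenvalues of the covariance.
close = np.allclose(s**2 / (len(X) - 1), eigvals)
```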
369 votes · 33 answers

If $AB = I$ then $BA = I$

If $A$ and $B$ are square matrices such that $AB = I$, where $I$ is the identity matrix, show that $BA = I$. I do not understand anything more than the following: elementary row operations, linear dependence, row reduced forms and their…
Dilawar • 5,747
329 votes · 0 answers

Limit of sequence of growing matrices

Let $$ H=\left(\begin{array}{cccc} 0 & 1/2 & 0 & 1/2 \\ 1/2 & 0 & 1/2 & 0 \\ 1/2 & 0 & 0 & 1/2\\ 0 & 1/2 & 1/2 & 0 \end{array}\right), $$ $K_1=\left(\begin{array}{c}1 \\ 0\end{array}\right)$ and consider the sequence of matrices defined by $$ K_L =…
Eckhard • 7,347
285 votes · 9 answers

Is a matrix multiplied with its transpose something special?

In my math lectures, we talked about the Gram determinant, where a matrix is multiplied by its own transpose. Is $A A^\mathrm T$ something special for any matrix $A$?
Martin Ueding • 4,141
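The short answer the question is after: $A A^\mathrm T$ is always symmetric and positive semidefinite, for any rectangular $A$. A quick NumPy check (an illustration, not one of the posted answers):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(3, 5))       # any rectangular matrix works
G = A @ A.T                       # the Gram matrix of the rows of A

symmetric = np.allclose(G, G.T)   # (A A^T)^T = A A^T
eigs = np.linalg.eigvalsh(G)
psd = bool(np.all(eigs >= -1e-12))  # all eigenvalues are nonnegative
```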
276 votes · 3 answers

How does one prove the determinant inequality $\det\left(6(A^3+B^3+C^3)+I_{n}\right)\ge 5^n\det(A^2+B^2+C^2)$?

Let $\,A,B,C\in M_{n}(\mathbb C)\,$ be Hermitian and positive definite matrices such that $A+B+C=I_{n}$, where $I_{n}$ is the identity matrix. Show that $$\det\left(6(A^3+B^3+C^3)+I_{n}\right)\ge 5^n \det \left(A^2+B^2+C^2\right)$$ This problem is…
230 votes · 6 answers

Eigenvectors of real symmetric matrices are orthogonal

Can someone point me to a paper, or show here, why symmetric matrices have orthogonal eigenvectors? In particular, I'd like to see proof that for a symmetric matrix $A$ there exists decomposition $A = Q\Lambda Q^{-1} = Q\Lambda Q^{T}$ where…
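The decomposition the question asks about can be verified numerically with NumPy's symmetric eigensolver (an illustration, not a proof):

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.normal(size=(4, 4))
A = (M + M.T) / 2                 # symmetrize to get a real symmetric matrix

# eigh is NumPy's routine for symmetric/Hermitian matrices;
# the columns of Q are the eigenvectors.
lam, Q = np.linalg.eigh(A)

orthogonal   = np.allclose(Q.T @ Q, np.eye(4))          # Q^T Q = I, so Q^{-1} = Q^T
reconstructs = np.allclose(Q @ np.diag(lam) @ Q.T, A)   # A = Q Lambda Q^T
```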
220 votes · 8 answers

What are the Differences Between a Matrix and a Tensor?

What is the difference between a matrix and a tensor? Or, what makes a tensor, a tensor? I know that a matrix is a table of values, right? But, a tensor?
Aurelius • 2,591
216 votes · 6 answers

Why does this matrix give the derivative of a function?

I happened to stumble upon the following matrix: $$ A = \begin{bmatrix} a & 1 \\ 0 & a \end{bmatrix} $$ And after trying a bunch of different examples, I noticed the following remarkable pattern. If $P$ is a polynomial,…
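The pattern the question describes, $P(A) = \begin{bmatrix} P(a) & P'(a) \\ 0 & P(a) \end{bmatrix}$, is easy to check numerically for a specific polynomial (an illustration, not one of the posted answers):

```python
import numpy as np

a = 2.0
A = np.array([[a, 1.0],
              [0.0, a]])

# Apply P(x) = x^3 to the matrix A.
PA = A @ A @ A

# The claimed pattern: P(A) = [[P(a), P'(a)], [0, P(a)]],
# with P(2) = 8 and P'(2) = 3 * 2^2 = 12.
expected = np.array([[8.0, 12.0],
                     [0.0,  8.0]])
matches = np.allclose(PA, expected)
```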
201 votes · 6 answers

How could we define the factorial of a matrix?

Suppose I have a square matrix $\mathsf{A}$ with $\det \mathsf{A}\neq 0$. How could we define the following operation? $$\mathsf{A}!$$ Maybe we could make some simple example, admitted it makes any sense, with $$\mathsf{A} = \left(\begin{matrix} 1…
194 votes · 4 answers

What is the geometric interpretation of the transpose?

I can follow the definition of the transpose algebraically, i.e. as a reflection of a matrix across its diagonal, or in terms of dual spaces, but I lack any sort of geometric understanding of the transpose, or even symmetric matrices. For example,…
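One geometric handle on the transpose (an illustration, not one of the posted answers): $A^\mathrm T$ is the adjoint of $A$ with respect to the standard dot product, i.e. $\langle Ax, y\rangle = \langle x, A^\mathrm T y\rangle$ for all $x, y$. A NumPy check:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.normal(size=(3, 3))
x = rng.normal(size=3)
y = rng.normal(size=3)

# The transpose moves A to the other side of the inner product:
# <A x, y> = <x, A^T y> for all vectors x and y.
lhs = (A @ x) @ y
rhs = x @ (A.T @ y)
adjoint_property = np.isclose(lhs, rhs)
```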
187 votes · 8 answers

Proof that the trace of a matrix is the sum of its eigenvalues

I have looked extensively for a proof on the internet but all of them were too obscure. I would appreciate if someone could lay out a simple proof for this important result. Thank you.
JohnK • 5,730
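The identity itself is easy to confirm numerically before proving it (an illustration, not a proof; the standard argument reads it off the characteristic polynomial):

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.normal(size=(5, 5))

trace = np.trace(A)                       # sum of the diagonal entries
eig_sum = np.sum(np.linalg.eigvals(A))    # eigenvalues may be complex...
# ...but for a real matrix they come in conjugate pairs, so the sum is real.
matches = bool(np.isclose(trace, eig_sum.real) and np.isclose(eig_sum.imag, 0.0))
```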
181 votes · 12 answers

Inverse of the sum of matrices

I have two square matrices: $A$ and $B$. $A^{-1}$ is known and I want to calculate $(A+B)^{-1}$. Are there theorems that help with calculating the inverse of a sum of matrices? In the general case $B^{-1}$ is not known, but if it is necessary then it…
Tomek Tarczynski • 2,524
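There is no general closed form, but one well-known special case (an illustration, not one of the posted answers) is the Sherman–Morrison formula for a rank-one update $B = uv^\mathrm T$:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 4
A = rng.normal(size=(n, n)) + n * np.eye(n)   # diagonal shift keeps A invertible
A_inv = np.linalg.inv(A)
u = rng.normal(size=(n, 1))
v = rng.normal(size=(n, 1))

# Sherman-Morrison:
# (A + u v^T)^{-1} = A^{-1} - (A^{-1} u v^T A^{-1}) / (1 + v^T A^{-1} u)
denom = 1.0 + (v.T @ A_inv @ u).item()
sm_inv = A_inv - (A_inv @ u @ v.T @ A_inv) / denom

matches = np.allclose(sm_inv, np.linalg.inv(A + u @ v.T))
```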
177 votes · 7 answers

Transpose of inverse vs inverse of transpose

I can't seem to find the answer to this using Google. Is the transpose of the inverse of a square matrix the same as the inverse of the transpose of that same matrix?
Void Star • 2,205
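The answer is yes: $(A^{-1})^\mathrm T = (A^\mathrm T)^{-1}$, often abbreviated $A^{-\mathrm T}$. A quick NumPy check (an illustration, not one of the posted answers):

```python
import numpy as np

rng = np.random.default_rng(6)
A = rng.normal(size=(3, 3)) + 3 * np.eye(3)   # diagonal shift keeps A invertible

inv_then_T = np.linalg.inv(A).T               # transpose of the inverse
T_then_inv = np.linalg.inv(A.T)               # inverse of the transpose

same = np.allclose(inv_then_T, T_then_inv)
```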
173 votes · 14 answers

Intuition behind Matrix Multiplication

If I multiply two numbers, say $3$ and $5$, I know it means add $3$ to itself $5$ times or add $5$ to itself $3$ times. But if I multiply two matrices, what does it mean? I can't think of it in terms of repeated addition. What is the…
Happy Mittal • 3,059
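The usual intuition (an illustration, not one of the posted answers): matrix multiplication is composition of linear maps, so $(AB)x$ means "apply $B$, then apply $A$". A NumPy sketch:

```python
import numpy as np

# R rotates the plane by 90 degrees; S doubles the x-coordinate.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])
S = np.array([[2.0, 0.0],
              [0.0, 1.0]])
x = np.array([1.0, 1.0])

# Applying S first and then R is the same as applying the single matrix R @ S.
step_by_step = R @ (S @ x)
composed = (R @ S) @ x
same = np.allclose(step_by_step, composed)
```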