Questions tagged [linear-algebra]

For questions about vector spaces of all dimensions and linear transformations between them, including systems of linear equations, bases, dimensions, subspaces, matrices, determinants, traces, eigenvalues and eigenvectors, diagonalization, Jordan forms, etc. For questions specifically concerning matrices, use the (matrices) tag. For questions specifically concerning matrix equations, use the (matrix-equations) tag.

Linear algebra is concerned with vector spaces of all dimensions and linear transformations between them (e.g. $(x_1, \dots, x_n)\mapsto a_1x_1 + \dots + a_nx_n$). Concepts include systems of linear equations, bases, dimensions, subspaces, matrices, determinants, kernels, null spaces, column spaces, traces, eigenvalues and eigenvectors, diagonalization, Jordan normal forms, and so forth.

This is a general tag; most of the subjects in its scope also have more specific secondary tags (e.g. (matrices), (determinant), (eigenvalues-eigenvectors), (vector-spaces)). Please use the appropriate secondary tags where applicable.

116,358 questions
763 votes · 17 answers

What's an intuitive way to think about the determinant?

In my linear algebra class, we just talked about determinants. So far I’ve been understanding the material okay, but now I’m very confused. I get that when the determinant is zero, the matrix doesn’t have an inverse. I can find the determinant of a…
Jamie Banks • 12,214
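
One common geometric answer, sketched numerically below (the matrix M is an arbitrary illustration, not from the question): $|\det M|$ is the factor by which $M$ scales areas/volumes, and the sign records orientation.

```python
import numpy as np

# An arbitrary 2x2 example: M sends the unit square to a parallelogram.
M = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# The square's edge vectors map to the columns of M.
u, v = M[:, 0], M[:, 1]

# Area of the parallelogram spanned by u and v (2D cross product).
area = abs(u[0] * v[1] - u[1] * v[0])

print(area)              # 6.0
print(np.linalg.det(M))  # 6.0 -- |det| = area scaling, sign = orientation
```
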
439 votes · 4 answers

What is the intuitive relationship between SVD and PCA?

Singular value decomposition (SVD) and principal component analysis (PCA) are two eigenvalue methods used to reduce a high-dimensional data set into fewer dimensions while retaining important information. Online articles say that these methods are…
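
The link between the two can be stated as a one-line check: PCA of a centered data matrix $X$ is the eigendecomposition of its covariance, and the squared singular values of $X$ give the same variances. A minimal sketch on toy random data:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))   # toy data: 200 samples, 5 features
Xc = X - X.mean(axis=0)         # PCA assumes centered data

# Route 1 (PCA): eigenvalues of the sample covariance matrix.
cov = Xc.T @ Xc / (len(Xc) - 1)
eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]

# Route 2 (SVD): singular values of the centered data matrix.
s = np.linalg.svd(Xc, compute_uv=False)
pca_vals = s**2 / (len(Xc) - 1)

print(np.allclose(eigvals, pca_vals))  # True: same principal variances
```
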
369 votes · 33 answers

If $AB = I$ then $BA = I$

If $A$ and $B$ are square matrices such that $AB = I$, where $I$ is the identity matrix, show that $BA = I$. I do not understand anything more than the following. Elementary row operations. Linear dependence. Row reduced forms and their…
Dilawar • 5,747
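
One standard finite-dimensional argument, for reference (it genuinely uses that the matrices are square, and fails in infinite dimensions): if $Bx = 0$ then $x = (AB)x = A(Bx) = 0$, so $B$ is injective; by rank–nullity an injective square matrix is invertible, and then $$A = A(BB^{-1}) = (AB)B^{-1} = B^{-1} \implies BA = BB^{-1} = I.$$
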
338 votes · 11 answers

What is the importance of eigenvalues/eigenvectors?

What is the importance of eigenvalues/eigenvectors?
Ryan • 5,069
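
One concrete illustration of why eigenpairs matter: diagonalization turns repeated matrix multiplication into scalar powers of the eigenvalues. A minimal sketch with an arbitrary example matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # arbitrary symmetric example

# Eigendecomposition A = V diag(w) V^{-1}.
w, V = np.linalg.eig(A)

# A^10 computed directly vs. via scalar powers of the eigenvalues.
direct = np.linalg.matrix_power(A, 10)
via_eig = V @ np.diag(w**10) @ np.linalg.inv(V)

print(np.allclose(direct, via_eig))  # True
```
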
329 votes · 0 answers

Limit of sequence of growing matrices

Let $$ H=\left(\begin{array}{cccc} 0 & 1/2 & 0 & 1/2 \\ 1/2 & 0 & 1/2 & 0 \\ 1/2 & 0 & 0 & 1/2\\ 0 & 1/2 & 1/2 & 0 \end{array}\right), $$ $K_1=\left(\begin{array}{c}1 \\ 0\end{array}\right)$ and consider the sequence of matrices defined by $$ K_L =…
Eckhard • 7,347
276 votes · 3 answers

How does one prove the determinant inequality $\det\left(6(A^3+B^3+C^3)+I_{n}\right)\ge 5^n\det(A^2+B^2+C^2)$?

Let $\,A,B,C\in M_{n}(\mathbb C)\,$ be Hermitian and positive definite matrices such that $A+B+C=I_{n}$, where $I_{n}$ is the identity matrix. Show that $$\det\left(6(A^3+B^3+C^3)+I_{n}\right)\ge 5^n \det \left(A^2+B^2+C^2\right)$$ This problem is…
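
The inequality can at least be sanity-checked numerically. A hedged sketch (the normalization trick below, conjugating by $S^{-1/2}$, is one way to force $A+B+C=I_n$; random trials are of course not a proof):

```python
import numpy as np
from scipy.linalg import fractional_matrix_power

rng = np.random.default_rng(1)
n = 4

def random_pd(n):
    # A random Hermitian positive definite matrix.
    X = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    return X @ X.conj().T + n * np.eye(n)

X, Y, Z = random_pd(n), random_pd(n), random_pd(n)
S = X + Y + Z
S_inv_half = fractional_matrix_power(S, -0.5)

# Conjugation preserves positive definiteness and gives A + B + C = I.
A, B, C = (S_inv_half @ M @ S_inv_half for M in (X, Y, Z))

lhs = np.linalg.det(6 * (A @ A @ A + B @ B @ B + C @ C @ C) + np.eye(n)).real
rhs = (5**n * np.linalg.det(A @ A + B @ B + C @ C)).real
print(lhs >= rhs)  # True on random trials
```
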
273 votes · 4 answers

Norms Induced by Inner Products and the Parallelogram Law

Let $ V $ be a normed vector space (over $\mathbb{R}$, say, for simplicity) with norm $ \lVert\cdot\rVert$. It's not hard to show that if $\lVert \cdot \rVert = \sqrt{\langle \cdot, \cdot \rangle}$ for some (real) inner product $\langle \cdot, \cdot…
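
For reference, the two identities at the heart of the question: a norm comes from an inner product iff it satisfies the parallelogram law $$\lVert x+y\rVert^2 + \lVert x-y\rVert^2 = 2\lVert x\rVert^2 + 2\lVert y\rVert^2,$$ in which case the (real) inner product is recovered by polarization, $$\langle x, y\rangle = \tfrac{1}{4}\left(\lVert x+y\rVert^2 - \lVert x-y\rVert^2\right).$$
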
238 votes · 7 answers

Why do we care about dual spaces?

When I first took linear algebra, we never learned about dual spaces. Today in lecture we discussed them and I understand what they are, but I don't really understand why we want to study them within linear algebra. I was wondering if anyone knew a…
WWright • 5,190
230 votes · 6 answers

Eigenvectors of real symmetric matrices are orthogonal

Can someone point me to a paper, or show here, why symmetric matrices have orthogonal eigenvectors? In particular, I'd like to see proof that for a symmetric matrix $A$ there exists decomposition $A = Q\Lambda Q^{-1} = Q\Lambda Q^{T}$ where…
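
For eigenvectors with distinct eigenvalues the argument is two lines: if $Ax = \lambda x$, $Ay = \mu y$ with $\lambda \neq \mu$ and $A^T = A$, then $$\lambda \langle x, y\rangle = \langle Ax, y\rangle = \langle x, Ay\rangle = \mu \langle x, y\rangle,$$ so $(\lambda - \mu)\langle x, y\rangle = 0$ forces $\langle x, y\rangle = 0$. Repeated eigenvalues need the spectral theorem, which supplies an orthonormal basis within each eigenspace.
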
218 votes · 5 answers

What is the difference between a linear and an affine function?

I am a bit confused: what is the difference between a linear and an affine function? Any suggestions will be appreciated.
user34790 • 3,772
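
In brief: $f$ is linear if $f(x) = Ax$, which forces $f(0) = 0$ and $f(\alpha x + \beta y) = \alpha f(x) + \beta f(y)$; $f$ is affine if $f(x) = Ax + b$, i.e. a linear map followed by a translation. For example, $x \mapsto 3x$ is linear, while $x \mapsto 3x + 1$ is only affine, since it sends $0$ to $1$.
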
216 votes · 6 answers

Why does this matrix give the derivative of a function?

I happened to stumble upon the following matrix: $$ A = \begin{bmatrix} a & 1 \\ 0 & a \end{bmatrix} $$ And after trying a bunch of different examples, I noticed the following remarkable pattern. If $P$ is a polynomial,…
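
The pattern the question describes, $P(A) = \begin{bmatrix} P(a) & P'(a) \\ 0 & P(a) \end{bmatrix}$, is easy to probe numerically. A minimal sketch with an arbitrary polynomial and an arbitrary value of $a$:

```python
import numpy as np

a = 1.5
A = np.array([[a, 1.0],
              [0.0, a]])

# P(x) = x^3 - 2x + 1 as a matrix polynomial; P'(x) = 3x^2 - 2.
P_of_A = np.linalg.matrix_power(A, 3) - 2 * A + np.eye(2)

print(P_of_A[0, 0], a**3 - 2 * a + 1)  # both equal P(a)
print(P_of_A[0, 1], 3 * a**2 - 2)      # both equal P'(a)
```
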
201 votes · 6 answers

How could we define the factorial of a matrix?

Suppose I have a square matrix $\mathsf{A}$ with $\det \mathsf{A}\neq 0$. How could we define the following operation? $$\mathsf{A}!$$ Maybe we could make some simple example, admitted it makes any sense, with $$\mathsf{A} = \left(\begin{matrix} 1…
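
One natural reading, hedged (it is only one of several possible definitions): extend $n! = \Gamma(n+1)$ to matrices through a matrix function, e.g. with SciPy's funm:

```python
import numpy as np
from scipy.linalg import funm
from scipy.special import gamma

A = np.array([[1.0, 2.0],
              [0.0, 3.0]])   # arbitrary example with det != 0

# Define A! := Gamma(A + I), applying the scalar function via funm.
print(funm(A, lambda x: gamma(x + 1)))

# Sanity check: on a 1x1 matrix this reduces to the ordinary factorial.
print(funm(np.array([[4.0]]), lambda x: gamma(x + 1)))  # [[24.]]
```
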
194 votes · 4 answers

What is the geometric interpretation of the transpose?

I can follow the definition of the transpose algebraically, i.e. as a reflection of a matrix across its diagonal, or in terms of dual spaces, but I lack any sort of geometric understanding of the transpose, or even symmetric matrices. For example,…
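
A coordinate-free anchor for the geometric picture: $A^T$ is characterized by $$\langle Ax, y\rangle = \langle x, A^T y\rangle \quad \text{for all } x, y,$$ i.e. it moves $A$ to the other side of the dot product. Symmetric matrices ($A = A^T$) are exactly the self-adjoint maps, which is why they admit orthonormal eigenbases.
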
191 votes · 1 answer

Derivative of Softmax loss function

I am trying to wrap my head around back-propagation in a neural network with a Softmax classifier, which uses the Softmax function: \begin{equation} p_j = \frac{e^{o_j}}{\sum_k e^{o_k}} \end{equation} This is used in a loss function of the…
Moos Hueting • 2,107
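
The key identity is $\dfrac{\partial p_j}{\partial o_i} = p_j(\delta_{ij} - p_i)$. A minimal sketch checking the resulting Jacobian against finite differences:

```python
import numpy as np

def softmax(o):
    e = np.exp(o - o.max())  # subtract max for numerical stability
    return e / e.sum()

o = np.array([1.0, 2.0, 0.5])
p = softmax(o)

# Analytic Jacobian: J[j, i] = p_j * (delta_ij - p_i).
J = np.diag(p) - np.outer(p, p)

# Numerical Jacobian via central differences.
eps = 1e-6
J_num = np.array([
    (softmax(o + eps * e) - softmax(o - eps * e)) / (2 * eps)
    for e in np.eye(3)
]).T

print(np.allclose(J, J_num, atol=1e-8))  # True
```
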
187 votes · 8 answers

Proof that the trace of a matrix is the sum of its eigenvalues

I have looked extensively for a proof on the internet but all of them were too obscure. I would appreciate if someone could lay out a simple proof for this important result. Thank you.
JohnK • 5,730
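
A compact version of the standard proof: over $\mathbb{C}$ the characteristic polynomial factors as $$\det(tI - A) = \prod_{i=1}^{n}(t - \lambda_i) = t^n - \Big(\sum_i \lambda_i\Big)t^{n-1} + \dots,$$ while expanding the determinant directly shows the coefficient of $t^{n-1}$ is $-(a_{11} + \dots + a_{nn}) = -\operatorname{tr}(A)$; comparing coefficients gives $\operatorname{tr}(A) = \sum_i \lambda_i$.
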