Questions tagged [diagonalization]

For questions about matrix diagonalization. Diagonalization is the process of finding a corresponding diagonal matrix for a diagonalizable matrix or linear map. This tag is NOT for diagonalization arguments common to logic and set theory.

A square matrix $A$ is diagonalizable if there is an invertible matrix $P$ such that $P^{-1}AP$ is a diagonal matrix. One can view $P$ as a change-of-basis matrix: if $A$ is viewed as the standard matrix of a linear map $T$ from a vector space to itself in some basis, it is equivalent to say that there exists an ordered basis in which the standard matrix of $T$ is diagonal. The diagonal entries of the resulting diagonal matrix are the eigenvalues of the corresponding linear transformation. A square matrix that is not diagonalizable is called defective.

Not every matrix is diagonalizable over $\mathbb{R}$ (i.e. only allowing real invertible matrices $P$). For example, the rotation matrix $$\begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$$ has characteristic polynomial $t^2+1$, so it has no real eigenvalues and cannot be diagonalized over $\mathbb{R}$ (though it is diagonalizable over $\mathbb{C}$).
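This can be checked numerically: a minimal sketch with numpy (the matrix is the rotation matrix from the example above) confirms that its eigenvalues are the non-real pair $\pm i$.

```python
import numpy as np

# The 90-degree rotation matrix from the example above.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

# Characteristic polynomial is t^2 + 1, so the eigenvalues are +i and -i:
# no real eigenvalues means no real eigenvectors, hence A is not
# diagonalizable over the reals.
eigenvalues = np.linalg.eigvals(A)
print(eigenvalues)  # the pair +1j, -1j, in some order
```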

Diagonalization can be used to compute the powers of a matrix $A$ efficiently, provided the matrix is diagonalizable: if $A = PDP^{-1}$, then $A^k = PD^kP^{-1}$, and powers of the diagonal matrix $D$ are computed entrywise.
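As an illustration, here is a minimal numpy sketch of this trick for a hypothetical $2\times 2$ example with distinct real eigenvalues; one eigendecomposition plus an elementwise power of the diagonal replaces $k-1$ matrix multiplications.

```python
import numpy as np

# Example diagonalizable matrix (eigenvalues 5 and 2).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# A = P D P^{-1}  implies  A^k = P D^k P^{-1}.
eigvals, P = np.linalg.eig(A)   # columns of P are eigenvectors
k = 10
Dk = np.diag(eigvals ** k)      # D^k: just power the diagonal entries
Ak = P @ Dk @ np.linalg.inv(P)

# Agrees with repeated multiplication.
assert np.allclose(Ak, np.linalg.matrix_power(A, k))
```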

Diagonalization Procedure:

Let $A$ be the $n\times n$ matrix that you want to diagonalize (if possible).

  • Find the characteristic polynomial $p(t)$ of $A$.
  • Find the eigenvalues $\lambda$ of $A$ and their algebraic multiplicities from the characteristic polynomial $p(t)$.
  • For each eigenvalue $\lambda$ of $A$, find a basis of the eigenspace $E_\lambda$. If the geometric multiplicity $\dim(E_\lambda)$ of some eigenvalue $\lambda$ is less than its algebraic multiplicity, then $A$ is not diagonalizable. Otherwise, $A$ is diagonalizable; proceed to the next step.
  • Combining the basis vectors of all the eigenspaces gives $n$ linearly independent eigenvectors $v_1, v_2, \dots, v_n$.
  • Define the nonsingular matrix $$P=[v_1\quad v_2\quad \dots\quad v_n]$$
  • Define the diagonal matrix $D$ whose $(i,i)$-entry is the eigenvalue $\lambda$ such that the $i^{\text{th}}$ column vector $v_i$ lies in the eigenspace $E_\lambda$.
  • Then the matrix $A$ is diagonalized as $$P^{-1}AP=D$$
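The steps above can be sketched with sympy's exact arithmetic, which exposes the algebraic and geometric multiplicities directly; the $2\times 2$ matrix below is a hypothetical example chosen for illustration.

```python
import sympy as sp

# Example matrix to diagonalize (eigenvalues 5 and 2).
A = sp.Matrix([[4, 1],
               [2, 3]])

# Steps 1-2: characteristic polynomial and eigenvalues with multiplicities.
t = sp.symbols('t')
p = A.charpoly(t)       # t**2 - 7*t + 10 = (t - 5)(t - 2)
eig = A.eigenvects()    # list of (eigenvalue, alg. multiplicity, basis of E_lambda)

# Step 3: diagonalizable iff the geometric multiplicity (size of the
# eigenspace basis) equals the algebraic multiplicity for every eigenvalue.
diagonalizable = all(len(basis) == alg_mult for _, alg_mult, basis in eig)

# Steps 4-6: assemble P column-by-column from eigenvectors and form D.
if diagonalizable:
    P, D = A.diagonalize()          # sympy packages these steps
    assert P.inv() * A * P == D     # P^{-1} A P = D
```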

References:

Diagonal Matrix on Wikipedia

Matrix Diagonalization on Wolfram MathWorld

2248 questions
0
votes
0 answers

Continuity of a basis of a family of quadratic forms

Let $p$ be a parameter in a k-sphere $\mathbb{S}^k$ and let $Q^p$ be a family of real quadratic forms over $\mathbb{R}^n$, which is continuous with respect to $p$. For every fixed $p$, there exists an orthonormal basis $\left(e_i(p)\right)_i$ with…
0
votes
0 answers

Absorbing generator matrix in continuous-time Markov chain models

Setting Let $[0,T]$ with $T\in\mathbb{R}^{+}$ be a time horizon over which $N\in\mathbb{N}^{+}$ continuous-time time-homogeneous Markov chains make transitions between $\{1,...,h\}$ states with $h\in\mathbb{N}^{+}$ and state $h$ being an absorption…
0
votes
0 answers

Find P with $PAP^{-1}=D$ with no calculations

I have this exercise, and we were told these two matrices can be diagonalized with no calculations at all. We need to find $P$ such that $PAP^{-1}=D$ is diagonal (if the matrices are diagonalizable) for both $A_1$ and $A_2$: $A_1= \begin{bmatrix} 2 && 0…
0
votes
1 answer

Computing a closed formula for a recurrent sequence using eigenvalues and eigenvectors

How would you use eigenvalues and eigenvectors to compute a closed formula for the following sequence: $$\{x_0=1,\ x_1=2,\ x_n=5x_{n-1} + 14x_{n-2},\ n \ge 2 \}$$ I have come up with the following formula: $$ \begin{bmatrix} 0 & 1 \\ 14 &…
0
votes
1 answer

Why is this matrix not diagonalizable?

Let $A$ be an $n\times n$ matrix such that $\operatorname{rank}(A)=n-1$ and $\operatorname{rank}(A^2)=n-2$. Why is such a matrix not diagonalizable?
0
votes
2 answers

For a linear map $f: V \to V$ if $f^2$ is diagonalizable and $\ker f = \ker f^2$ then is $f$ diagonalizable?

Here $V$ is a finite dimensional vector space, of dimension $n$, over an algebraically closed field $F$. My original approach was to use a minimal polynomial argument by showing that $\pi_f$ (which I will let denote the minimal polynomial of a…
0
votes
1 answer

Hermitian Matrix eigenvalues

I am trying to prove the following statement: let $H$ be Hermitian with $\sigma(H) \subseteq \lbrace-r,r\rbrace$. Show that $H \cdot H = r^2 I$. Since $H$ is Hermitian, we know that we can decompose the matrix as $H=SDS^*$ with $S$ unitary and $D$ diagonal…
0
votes
1 answer

One matrix is diagonalized by an orthonormal basis of another matrix

Let $A\in \mathbb R^{n\times n}$. Suppose $B$ is symmetric and positive definite, and \begin{equation}\label{eq:sym} A^TB=BA, \end{equation} then $A$ is diagonalizable by a $B$-orthonormal basis. Therefore, there exists a $B$-orthogonal matrix…
0
votes
1 answer

Question about inverse of a matrix.

Consider the block upper triangular matrix $$A = \left[ \begin{matrix} A_{11} & A_{12} \\ 0 & A_{22} \end{matrix} \right], $$ where $A\in\mathbb R^{n\times n}$ and $A_{11}\in\mathbb R^{k\times k}$ with $1 \le k \le n$. Suppose $A_{12} \ne 0$ and…
0
votes
1 answer

Diagonalising a matrix: does order of eigenvectors matter?

So suppose I have a matrix $A$ with real distinct eigenvalues, and I am finding the diagonal matrix $D$ such that $D = P^{-1}A P$. Then $P$ consists of columns that are eigenvectors of $A$. In what order do I put these columns in $P$? And does this…
0
votes
0 answers

Every real, skew-symmetric matrix is diagonalisable by a unitary matrix

I need to show that every real, skew-symmetric matrix $M$ can be diagonalized by a unitary matrix $U$. $$ M=-M^T \implies M = U D U^\dagger \quad \textrm{with} \quad U U^\dagger = U^\dagger U = \mathbb{I} $$ I managed to show that $D$ is purely…
0
votes
0 answers

Diagonalization and dimension of the eigenspaces of a matrix $A$

If a matrix $A$ is diagonalizable, what does this imply involving the dimensions of the eigenspaces of $A$?
0
votes
1 answer

Diagonalizability of $AA$ when $A$ has zeros on diagonal and $\lvert a_{ij} \rvert = \lvert a_{ji} \rvert$

Let $S$ and $A$ be two real square matrices such that $S=AA$. I know that, if $A$ is skew-symmetric, then $S$ is symmetric; moreover, its nonzero eigenvalues are negative and have even multiplicity. I also know that, if $A$ is itself symmetric, then…
0
votes
1 answer

T a diagonalizable linear operator on V $\implies$ By the Complex Spectral Theorem, T is normal

Determine whether the following statement is true. Let $V$ be a finite dimensional $\mathbb{C}$-vector space with inner product and $T$ a diagonalizable linear operator on $V$. Then there is a basis of eigenvectors of $T$ for $V$. Applying the Gram-Schmidt orthogonalization…
0
votes
1 answer

If $T(W)⊆W$, show that $W$ is spanned by eigenvectors.

Let $T$ be a linear transformation of a finite dimensional real vector space $V$ and assume that $V$ is spanned by eigenvectors of $T$. If $T(W)⊆W$ for some subspace $W⊆V$, show that $W$ is spanned by eigenvectors. Any suggestions? Thanks.