Questions tagged [orthogonal-matrices]

Matrices with orthonormal rows and columns. An orthogonal matrix is an invertible real matrix whose inverse is equal to its transpose. For complex matrices the analogous term is *unitary*, meaning the inverse is equal to its conjugate transpose.
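In symbols: a real square matrix $Q$ is orthogonal when $Q^TQ = QQ^T = I$, equivalently $Q^{-1} = Q^T$; a complex square matrix $U$ is unitary when $U^*U = UU^* = I$, equivalently $U^{-1} = U^*$.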

961 questions
-1
votes
1 answer

Is there a name for matrices with singular values all equal to $1$?

At first I thought these are just the orthogonal matrices, since the SVD of $X$ is $U \Sigma V^T$, and if $\Sigma$ is the identity matrix then $X$ is orthogonal. However, singular values equal to $1$ don't imply that $\Sigma$ is the identity,…
user56834
  • 11,887
  • 6
  • 32
  • 91
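A worked check of the first step in the excerpt (a sketch, not part of the original post): for square $X$ with $\Sigma = I$,
$$X = U\Sigma V^T = UV^T, \qquad (UV^T)^T(UV^T) = VU^TUV^T = VV^T = I,$$
so $X$ is indeed orthogonal.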
-1
votes
2 answers

Let $A$ be an $n \times k$ matrix with $W$ as its column space. Show that the solution space of $AA^T x = 0$ is given by $W^\perp$.

How can I solve this question? Here $A^T$ denotes the transpose of $A$, and $W^\perp$ denotes the set of vectors orthogonal to $W$.
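A standard step for this kind of problem (a sketch, not from the original post): if $AA^Tx = 0$ then
$$0 = x^TAA^Tx = \|A^Tx\|^2,$$
so $A^Tx = 0$, i.e. $x$ is orthogonal to every column of $A$, which says exactly that $x \in W^\perp$; the reverse inclusion is immediate.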
-1
votes
1 answer

A question related to skew-symmetric and orthogonal matrices

Consider the following problem, asked in a master's exam for which I am self-studying. Write $V$ for the space of $3 \times 3$ skew-symmetric real matrices. (A) Show that for $A\in SO_3(\mathbb{R})$ and $M\in V$, $AMA^t \in V$. (B) Show that…
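For part (A) a one-line computation suffices (a sketch): using $M^t = -M$,
$$(AMA^t)^t = A M^t A^t = -AMA^t,$$
so $AMA^t$ is again skew-symmetric.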
-1
votes
3 answers

If a square matrix's column vectors and row vectors all have norm 1, then is the matrix orthonormal?

Well, this is a question I had to ask myself while solving a problem that asked me to prove a matrix is orthonormal. I could show that both the column vectors and the row vectors of said matrix all had unit length, but didn't know how to proceed…
Francisco
  • 39
  • 2
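A useful example to test the claim against (not from the original post): the $2\times 2$ matrix with every entry equal to $1/\sqrt{2}$ has rows and columns of norm $1$, yet its two columns are equal and hence not orthogonal, so unit-length rows and columns alone do not make a matrix orthogonal.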
-1
votes
1 answer

Does $\sum_i e_i \otimes e_i$ in $V\otimes V$ have a name?

If $V$ is a vector space with orthogonal basis $(e_i)$, consider the element $\sum_i e_i \otimes e_i \in V\otimes V$. It can be shown that this element is independent of the choice of orthogonal basis. Does this element have a name?
-1
votes
2 answers

A continuous extension of the group of permutation matrices

For any $n \in \mathbb{N}$, the collection $\mathcal{P}_n$ of permutation matrices is obtained by permuting the rows of the $n \times n$ identity matrix. Under the operation of matrix multiplication, $\mathcal{P}_n$ is a subgroup of $\mathcal{G}_n$…
-1
votes
2 answers

An orthogonal matrix in $\mathbb{R}^{3\times3}$ with real eigenvalues is diagonalizable

I know there are two nontrivial cases (i.e. if we solve these two, the other cases are trivial): $\lambda_{1,2,3}=1$, and $\lambda_1=1,\ \lambda_{2,3}=-1$. I have been trying to use generalized eigenvectors, the Jordan canonical form, and the fact…
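A step that narrows things down (a sketch): for orthogonal $Q$ and an eigenpair $Qv = \lambda v$ with $v \neq 0$,
$$\|v\| = \|Qv\| = |\lambda|\,\|v\|,$$
so every real eigenvalue satisfies $\lambda = \pm 1$, which is why only cases of the kind listed in the excerpt can occur.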
-1
votes
2 answers

$3\times 3$ orthogonal matrix that doesn't consist of zeros and ones

I'm stuck with my homework in a subject called Matrices in Statistics. Can you guys help with the following task? I would be very thankful! The task is as follows: Find a $3\times 3$ orthogonal matrix that doesn't consist of zeros and ones. A…
M. Smithy
  • 57
  • 3
-1
votes
2 answers

Is this statement true: $AA^t = A^{-1}$?

I want to know whether it's true or not. According to what I have read, this is true for orthogonal matrices. Is it true or not? Are there any other cases in which this could be true?
lemniscate
  • 309
  • 2
  • 12
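For context (a clarification, not from the original post): the identity that holds for an orthogonal matrix is $A^t = A^{-1}$, equivalently $AA^t = I$; the condition $AA^t = A^{-1}$ in the title is a different statement.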
-1
votes
3 answers

Given an orthogonal matrix, how can I find a second orthogonal matrix that gives a product of zero?

Assume I have some orthogonal matrix $\mathbf{Q}^\text{T}\mathbf{Q} = \mathbf{I}$. How can I find a second orthogonal matrix $\mathbf{S}^\text{T}\mathbf{S} = \mathbf{I}$ that gives a product of zero, i.e. $$ \mathbf{Q}^\text{T}\mathbf{S} =…
user2350366
  • 493
  • 1
  • 3
  • 12
-1
votes
2 answers

Prove that $\|Q\mathbf{v}\|=\|\mathbf{v}\|$

Prove that if $Q$ is a real $n\times n$ orthogonal matrix and $\mathbf{v}$ is in $\mathbb{R}^{n}$, then $$\|Q\mathbf{v}\| = \|\mathbf{v}\|.$$ Be sure to set out your arguments clearly and logically, giving full reasons. Hello all, To solve this…
Luke Xu
  • 195
  • 1
  • 1
  • 9
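The usual computation (a sketch): since $Q^TQ = I$,
$$\|Q\mathbf{v}\|^2 = (Q\mathbf{v})^T(Q\mathbf{v}) = \mathbf{v}^TQ^TQ\mathbf{v} = \mathbf{v}^T\mathbf{v} = \|\mathbf{v}\|^2,$$
and taking square roots gives $\|Q\mathbf{v}\| = \|\mathbf{v}\|$.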
-2
votes
2 answers

Proving properties of orthogonal matrices

I was given a task in which it's defined that an $n \times n$ matrix $A$ is orthogonal if $\langle A\vec{u},A\vec{v}\rangle = \langle \vec{u},\vec{v}\rangle$, and I have also been given the property that $\langle A\vec{u},\vec{v}\rangle = \langle…
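One way to connect the two facts (a sketch, assuming the truncated second property is the usual adjoint identity $\langle A\vec u,\vec v\rangle = \langle \vec u, A^T\vec v\rangle$): combining it with the definition gives
$$\langle \vec u, A^TA\vec v\rangle = \langle A\vec u, A\vec v\rangle = \langle \vec u, \vec v\rangle \quad \text{for all } \vec u, \vec v,$$
which forces $A^TA = I$.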
-2
votes
1 answer

Proof of one of the properties of orthogonal matrices

Consider a matrix $O$, and let's assume its columns form an orthonormal basis. If $$O^{T}=O^{-1}$$ is satisfied, then $O$ is an orthogonal matrix. But how does one go about proving that the inverse of an orthogonal matrix is equal to its transpose? (Basically, can…
EPIC Tube HD
  • 181
  • 6
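A way to organize the proof (a sketch): if the columns of $O$ are orthonormal, then entry $(i,j)$ of $O^TO$ is the inner product of columns $i$ and $j$, so
$$O^TO = I,$$
and since $O$ is square this says precisely that $O$ is invertible with $O^{-1} = O^T$.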
-2
votes
1 answer

If matrix $A$ is unitary and $B^2 = A$ then $B$ is also unitary

I need to prove or give a counterexample: if matrix $A$ is unitary and $B^2 = A$, then $B$ is also unitary. I think the statement is true, since the unitary matrix $A$ can only be the identity matrix $I$ or the negative identity matrix $-I$; and $B=A^2$ is an…
Denny Shen
  • 23
  • 2
-2
votes
1 answer

Gram-Schmidt orthogonalization: dealing with complex numbers

A complex-valued matrix $A$ has $n$ columns $a_1$ through $a_n$. The elements of these columns are complex numbers. The orthogonal complex-valued matrix $U$ of $A$ has $n$ columns as well, $u_1$ through $u_n$. $u_1$ is the same as $a_1$. The projection of column $a_x$ onto $u_y$ is…
Raj
  • 99
  • 3
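In the complex case the projection in the excerpt is usually taken with respect to the Hermitian inner product (a sketch, under the convention $\langle a, b\rangle = b^*a$ with $b^*$ the conjugate transpose):
$$\operatorname{proj}_{u_y}(a_x) = \frac{\langle a_x, u_y\rangle}{\langle u_y, u_y\rangle}\, u_y,$$
and the Gram-Schmidt steps are otherwise the same as in the real case.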