Questions tagged [matrix-equations]

This tag is for questions related to equations with matrices as coefficients and unknowns. A matrix equation is an equation in which a variable stands for a matrix.

Definition: Let $v_1, v_2, \dots, v_n$ and $b$ be vectors in $\mathbb{R}^n$. Consider the vector equation $$x_1v_1 + x_2v_2 + \cdots + x_nv_n = b.$$ This is equivalent to the matrix equation $$Ax = b,$$

where $$A=\begin{pmatrix} | & | & & | \\ v_1 & v_2 & \cdots & v_n \\ | & | & & | \end{pmatrix}, \qquad x=\begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix} \qquad\text{and}\qquad b=\begin{pmatrix} b_1 \\ b_2 \\ \vdots \\ b_n \end{pmatrix},$$ i.e. $A$ is the matrix whose columns are $v_1, v_2, \dots, v_n$.
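For example (an illustrative choice of numbers, not part of the tag wiki itself), with $n=2$: $$x_1\begin{pmatrix}1\\3\end{pmatrix}+x_2\begin{pmatrix}2\\4\end{pmatrix}=\begin{pmatrix}5\\11\end{pmatrix} \quad\Longleftrightarrow\quad \begin{pmatrix}1&2\\3&4\end{pmatrix}\begin{pmatrix}x_1\\x_2\end{pmatrix}=\begin{pmatrix}5\\11\end{pmatrix},$$ whose unique solution is $x_1=1$, $x_2=2$.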

Since a matrix equation $AX=B$ (where $X$ is a column vector of variables) is equivalent to a system of linear equations, we can use the same methods we have used on systems of linear equations to solve matrix equations, as sketched in the example after the steps below. Namely:

$(1.)~~$ Write down the augmented matrix $[\,A \mid B\,]$.

$(2.)~~$ Row-reduce to a new augmented matrix $[\,\overline A \mid \overline B\,]$ in row echelon form.

$(3.)~~$ Use this new matrix to write a matrix equation equivalent to the original one.

$(4.)~~$ Use this new, equivalent matrix equation to find the solutions to the original equation.
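A minimal computational sketch of these steps, using SymPy (the matrices below are illustrative and not taken from any particular question; `rref()` produces the reduced row echelon form, which in particular satisfies step $(2.)$):

```python
from sympy import Matrix

# (1) Write down the augmented matrix [A | B].
A = Matrix([[1, 2], [3, 4]])   # coefficient matrix (illustrative values)
B = Matrix([5, 11])            # right-hand side
augmented = A.row_join(B)

# (2) Row-reduce; rref() returns the reduced row echelon form
#     together with the indices of the pivot columns.
reduced, pivots = augmented.rref()

# (3)-(4) The reduced matrix encodes an equivalent matrix equation.
#     Here every column of A is a pivot column, so the unique
#     solution can be read off the last column.
X = reduced[:, -1]
print(X)   # Matrix([[1], [2]])
```

For larger or symbolic systems, `sympy.linsolve` performs steps $(2.)$ through $(4.)$ in a single call.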

In mathematics, a matrix equation (equivalently, a system of linear equations) is a basic and fundamental part of linear algebra, a subject used in most parts of modern mathematics. Computational algorithms for finding the solutions are an important part of numerical linear algebra, and play a prominent role in engineering, physics, chemistry, computer science, and economics. A system of non-linear equations can often be approximated by a linear system (see linearization), a helpful technique when making a mathematical model or computer simulation of a relatively complex system.

Very often, the coefficients of the equations are real or complex numbers and the solutions are sought in the same set of numbers, but the theory and the algorithms apply to coefficients and solutions in any field. For solutions in an integral domain, such as the ring of integers, or in other algebraic structures, other theories have been developed; see Linear equation over a ring. Integer linear programming is a collection of methods for finding the "best" integer solution (when there are many). Gröbner basis theory provides algorithms when the coefficients and unknowns are polynomials. Tropical geometry is another example of linear algebra in a more exotic structure.

3973 questions
1 vote · 2 answers

Intersection of subspaces: $U_1 = Sp\{x^3+2x^2+3x+6, 4x^3-x^2+3x+6, 5x^3+x^2+6x+12\}$, $U_2 = Sp\{x^3-x^2+x+1,2x^3-x^2+4x+5\}.$

Let $U_1, U_2$ be subspaces of $R_4[x]$ such that $$U_1 = Sp\{x^3+2x^2+3x+6, 4x^3-x^2+3x+6, 5x^3+x^2+6x+12\}$$ $$U_2 = Sp\{x^3-x^2+x+1,2x^3-x^2+4x+5\}$$ Find $U_1 \cap U_2$. My idea (I need help with how to proceed): compare a linear combination of…
1 vote · 2 answers

Simplifying matrix expression by centering columns

In an assignment I received, I was asked to show that $$\beta^TX^T(H-\frac{1}{n}J)X\beta=\beta_R^TX_C^TX_C\beta_R$$ where $H=X(X^TX)^{-1}X^T$, $J$ is an $n\times n$ matrix of ones, $X=[1, X_R]$ for a vector of ones, and $X_C$ is $X_R$ with centered…
Quaere Verum · 677 · 3 · 10
1 vote · 1 answer

If $AX = U$ has infinitely many solutions, then prove that $BX=V$ cannot have a unique solution.

$A=\begin{bmatrix} a&1&0\\ 1&b&d\\ 1&b&c \end{bmatrix},\ B=\begin{bmatrix} a&1&1\\ 0&d&c\\ f&g&h\end{bmatrix},\ U=\begin{bmatrix} f\\ g\\ h \end{bmatrix},\ V=\begin{bmatrix} a^2\\ 0\\ 0 \end{bmatrix},\ X=\begin{bmatrix} x\\ y\\ z \end{bmatrix}$ If $AX =…
user3290550 · 3,210 · 8 · 21
1 vote · 1 answer

Proof that $C^{T}DC$ is always symmetric for real matrices

I am trying to prove that $C^{T}DC$ is symmetric. I start by assuming that $B=C^{T}DC$. Using the rule of multiplication of matrices, $$ b_{ij} = \sum \limits_{l} \sum \limits_{k} c_{il}^{T} d_{lk} c_{kj} = \sum \limits_{l} \sum\limits_{k} c_{li}…
Ali Baig · 38 · 5
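A common route for this question (a sketch only; note that it needs the assumption, implicit in the excerpt, that $D$ is symmetric): $$B^{T}=(C^{T}DC)^{T}=C^{T}D^{T}(C^{T})^{T}=C^{T}D^{T}C,$$ so $B^{T}=B$ whenever $D^{T}=D$.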
1 vote · 2 answers

Find all matrices that commute with a Jordan block with $0$ diagonal

Question: Find all $n\times n$ matrices that commute with $$T=\left(\begin{matrix}0&1&&\\&0&\ddots&\\&&\ddots&1\\&&&0\end{matrix}\right)$$ You are probably wondering why I am answering my own question. Well, as I was typing my question I…
trisct · 4,859 · 10 · 31
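For reference, the standard answer (a sketch, not the poster's own write-up): comparing the entries of $MT$ and $TM$ shows that a matrix commutes with this Jordan block exactly when it is upper triangular Toeplitz, i.e. $$M=a_0I+a_1T+a_2T^{2}+\cdots+a_{n-1}T^{n-1},$$ that is, the polynomials in $T$.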
1 vote · 3 answers

Prove that there exists a matrix $M$ such that $(AM)^2 = AM$

Prove that $$\forall A \in\mathbb{M}_n(\mathbb{K}), \exists M\in GL_n(\mathbb{K}): (AM)^2 = AM$$ The case where $A$ is invertible is simple, but the other case is not.
Math Buster · 467 · 2 · 7
1 vote · 1 answer

Proving that $B(u) = \lim_{t\to0^{+}}\frac{e^{tB}u - u}{t}$

I hope I am not posting too many questions in a row regarding the matrix exponential, but I am solving exercises and got stuck trying to prove $$B(u) = \lim_{t\to0^{+}}\frac{e^{tB}u - u}{t}$$ for $B \in M(d \times d, \mathbb{R})$ and $u \in…
mohsen23 · 160 · 8
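The usual sketch, assuming the power-series definition of the matrix exponential: $$\frac{e^{tB}u-u}{t}=\frac{\bigl(I+tB+\tfrac{t^{2}}{2!}B^{2}+\cdots\bigr)u-u}{t}=Bu+\frac{t}{2!}B^{2}u+\cdots\longrightarrow Bu \quad\text{as } t\to0^{+}.$$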
1 vote · 1 answer

Solving a system of non-linear equations with 2 unknowns.

I have this system of equations that I'm solving for a 3DoF robotic joint. I've come to the pinnacle of the problem and I'm kinda stuck. Here $P_x$, $P_y$, $P_z$ and $a_1$, $a_2$, $a_3$ are known constants and $\theta_1$, $\theta_2$ and $\theta_3$…
1 vote · 1 answer

How to eliminate this variable from a set of simultaneous quadratic forms, in terms of determinants?

I am working through some of the derivations in this paper. In the appendix (p. 23, Eq. B.17), we arrive at a set of quadratic equations for variables $E,L$ of the form: $$f_1 E^2 - 2g_1 E L - h_1 L^2 - d_1 = 0$$ $$f_2 E^2 - 2g_2 E L - h_2 L^2 - d_2 =…
1 vote · 1 answer

Searching for $A$ to maximize $\|x\|$ whilst fulfilling a constraint $x^TA^TAx=c$

Let $A \in \mathbb R^{n\times n}$ be a symmetric matrix, $x\in \mathbb R^n$ be a vector and $c\in \mathbb R$ be a constant, such that: $$ \langle Ax, Ax \rangle = c $$ If you want to maximize $\|x\|$, how would you choose $A$?
whitegreen · 1,381 · 1 · 11 · 21
1 vote · 1 answer

Solutions to linear matrix stochastic differential equation

Let $A(t)$, $0\leq t\leq T$, be a random process taking values in the $N\times N$ real matrices, and consider the random matrices $Q(t)$ that satisfy the equation $$\partial_tQ = QA, \quad Q(0) = \hat1$$ then the solution to this equation can be written in terms…
1 vote · 2 answers

Find all solutions $A \in M_{2}(\mathbb{R})$ and $B \in M_{2}(\mathbb{C})$ to $J^2+I=0$

(a) Show that $J=\left(\begin{array}{rr}{0} & {-1} \\ {1} & {0}\end{array}\right) \in M_{2}(\mathbb{R})$ is a solution to $J^{2}+I=O$, by inserting the matrix in the matrix equation. (b) Are there other solutions $A \in M_{2}(\mathbb{R})$ to the above…
Xenusi · 990 · 1 · 4 · 12
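Part (a) is a direct computation (sketch): $$J^{2}=\begin{pmatrix}0&-1\\1&0\end{pmatrix}\begin{pmatrix}0&-1\\1&0\end{pmatrix}=\begin{pmatrix}-1&0\\0&-1\end{pmatrix}=-I,\qquad\text{so}\quad J^{2}+I=O.$$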
1 vote · 1 answer

Some $2\times 2$ real matrix

Suppose $A$ is a real matrix of dimension $2$ having two non-real, conjugate eigenvalues. I want to prove that if $B$ is another real $2\times 2$ matrix commuting with $A$, then the eigenvalues of $B$ are conjugate (same norm). Maybe the proof relies on…
Toni Mhax · 1,122 · 8 · 16
1 vote · 2 answers

Show that if a matrix $A$ of order $3\times 3$ satisfies $A^3=O$, then $I^3-A$ is invertible and its inverse is equal to $I^3+A+A^2.$

I need to show that if $A^3=0$, then $I^3-A$ is invertible and its inverse is equal to $I^3+A+A^2$. I have been at this question for almost an hour and do not know how to approach it; any help would be appreciated :)
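A short verification (sketch; here $I^3$ is simply the identity $I$, since $I^3=I$): $$(I-A)(I+A+A^{2})=I+A+A^{2}-A-A^{2}-A^{3}=I-A^{3}=I,$$ and the same computation in the other order gives $(I+A+A^{2})(I-A)=I$, so $I-A$ is invertible with inverse $I+A+A^{2}$.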
1 vote · 0 answers

Order reduction for a matrix polynomial

I have a polynomial matrix $M(x) = f(x) I + g(x)A + AB$, where $f(x), g(x)$ are polynomials of degree $k, l$ respectively, $I$ is the $N\times N$ identity matrix and $A, B$ are some $N\times N$ matrices. Are there any polynomials $h(x), p(x)$ and $2N \times…
Andrey Gorbunov · 513 · 2 · 14