Questions tagged [matrix-calculus]

Matrix calculus is about doing calculus, especially derivatives and infinite series, over spaces of vectors and matrices.

Matrix calculus studies derivatives and differentials of scalar-, vector- and matrix-valued functions with respect to vectors and matrices. It is widely applied in areas such as machine learning, numerical analysis, and economics.

There are basically two methods; a small worked example follows this list.

  • Direct: Treat vectors and matrices like scalars and compute with the usual rules of calculus. The Matrix Cookbook provides a lot of basic facts.

  • Component-wise: Write everything in index notation and compute component-wise in the usual way. The Einstein summation convention is frequently used.
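As a small worked illustration of the two methods (an editorial example, not part of the tag wiki), take $f(\mathbf{x}) = \mathbf{x}^\top A \mathbf{x}$. Directly: $f(x+h) = x^\top A x + h^\top A x + x^\top A h + O(\|h\|^2)$, and reading off the first-order term gives $\nabla_x\,(x^\top A x) = (A + A^\top)x$. Component-wise: $f = x_i A_{ij} x_j$ with Einstein summation, so $$\frac{\partial f}{\partial x_k} = \delta_{ik} A_{ij} x_j + x_i A_{ij} \delta_{jk} = A_{kj} x_j + A_{ik} x_i = \big[(A + A^\top)x\big]_k,$$ matching the direct computation.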

3392 questions
1 vote · 1 answer

Derivative of a vector-valued function with Kronecker products involved

I would like to calculate the derivative of the following function $f:\mathbb{R}^m \rightarrow \mathbb{R}$ with: $ f(\mathbf{x})=f(x_1,\dots,x_m )= \| A - G B\|_F^2 $, where $A \in \mathbb{C}^{n\times k},$ $B \in \mathbb{C}^{m\times k},$ and $G \in \mathbb{C}^{n\times m}$ with…
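The excerpt is truncated, but the workhorse identity for such objectives is $d\,\|M\|_F^2 = 2\operatorname{tr}(M^\top dM)$ for real matrices. A minimal NumPy sketch, assuming for illustration that $G$ depends linearly on the parameters, $G(\mathbf{x}) = \sum_i x_i G_i$ (a hypothetical setup, real-valued for simplicity, not the asker's exact one), and checking the resulting analytic gradient against finite differences:

    import numpy as np

    rng = np.random.default_rng(0)
    n, m, k, p = 4, 3, 2, 3
    A = rng.standard_normal((n, k))
    B = rng.standard_normal((m, k))
    Gs = rng.standard_normal((p, n, m))      # hypothetical basis matrices G_i

    def f(x):
        G = np.tensordot(x, Gs, axes=1)      # G(x) = sum_i x_i G_i
        R = A - G @ B                        # residual
        return np.sum(R**2)                  # ||A - G B||_F^2

    def grad(x):
        R = A - np.tensordot(x, Gs, axes=1) @ B
        # df = -2 tr(R^T dG B)  =>  df/dx_i = -2 tr(R^T G_i B)
        return np.array([-2.0 * np.trace(R.T @ Gi @ B) for Gi in Gs])

    x = rng.standard_normal(p)
    eps = 1e-6
    fd = np.array([(f(x + eps*e) - f(x - eps*e)) / (2*eps) for e in np.eye(p)])
    print(np.allclose(grad(x), fd, atol=1e-5))   # should print True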
1 vote · 1 answer

Question about the Matrix Cookbook

I've been learning matrix calculus by myself, and sometimes use this as a quick reference: https://www.math.uwaterloo.ca/~hwolkowi/matrixcookbook.pdf. I got confused regarding two equations in this book. Eq. 38 states that…
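When two cookbook equations seem to conflict, a quick numerical check on random matrices usually settles it. The sketch below verifies the cookbook-style identity $\partial(a^\top X b)/\partial X = ab^\top$ by central differences; this particular identity is chosen for illustration and is not claimed to be the Eq. 38 in question:

    import numpy as np

    rng = np.random.default_rng(1)
    n, m = 4, 3
    a, b = rng.standard_normal(n), rng.standard_normal(m)
    X = rng.standard_normal((n, m))

    analytic = np.outer(a, b)                # claimed d(a^T X b)/dX = a b^T

    eps = 1e-6
    fd = np.zeros_like(X)
    for i in range(n):                       # central difference, entry by entry
        for j in range(m):
            E = np.zeros_like(X)
            E[i, j] = eps
            fd[i, j] = (a @ (X + E) @ b - a @ (X - E) @ b) / (2 * eps)

    print(np.allclose(analytic, fd, atol=1e-5))   # should print True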
1 vote · 1 answer

Derivative of the log determinant of the covariance matrix

I have a covariance matrix defined as a rank-one matrix plus a diagonal matrix, with free parameters including a scalar $k$ and a column vector $v$. This covariance matrix can be written as $\Sigma = k\,vv^T + (1-k)\, I \circ (vv^T)$. I am interested in…
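For reference, the standard tool for this kind of problem is the identity $$\frac{\partial}{\partial \theta}\log\det\Sigma = \operatorname{tr}\!\Big(\Sigma^{-1}\,\frac{\partial \Sigma}{\partial \theta}\Big),$$ valid whenever $\Sigma$ is invertible; applied with $\theta = k$ or with the entries of $v$, it reduces the question to computing $\partial\Sigma/\partial\theta$ for the specific parametrization above.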
1 vote · 1 answer

Prove negative definiteness of $\mathbf{1}_{(0, \infty)}(x)$

I would like to show the negative definiteness of $\mathbf{1}_{(0, \infty)}(x)$ under a certain condition. Let $h\left(x_{i}, x_{j}\right)=\mathbf{1}_{(0, \infty)}\left(|x_{i}-x_{j}|\right)$. If $c_{1}+\cdots+c_{n}=0,\ c_{i}\in \mathbb{R}$, then…
1 vote · 0 answers

Second order partial derivative of $\log|PWP^T|$

Let $\phi = \log|PWP^T|$, where $P = J + XU^T$. All matrices except $W$ are rectangular, and $W$ is positive definite. I want to find the second partial derivative of $\phi$ with respect to $X$ and $W$. After referring to some posts in the…
1 vote · 0 answers

Given matrix $A$ and column vector $x$, what is the derivative of $Ax$ with respect to $x$?

What is the correct answer for $\dfrac{\partial (Ax)}{\partial x}$, where $A \in \mathbb{R}^{m \times n}$ and $x \in \mathbb{R}^{n}$ (a column vector)? Here, on page 2, the answer is $$\nabla_{x} A x=\left[\begin{array}{c}{\nabla_{x} \tilde{a}_{1}^{T} x}…
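For reference, in the numerator (Jacobian) layout the answer is $A$ itself: since $(Ax)_i = \sum_k a_{ik} x_k$, $$\frac{\partial (Ax)_i}{\partial x_j} = a_{ij}, \qquad \text{i.e.} \qquad \frac{\partial (Ax)}{\partial x} = A.$$ Texts using the denominator layout report $A^T$ instead; the two differ only by the layout convention, not in substance.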
1 vote · 1 answer

How to compute the derivative of a matrix algebra expression?

I came across a question pertaining to finding the derivative of a particular matrix expression. How do you compute the derivative of a matrix algebra expression? The article the question refers to can be found at:…
1 vote · 0 answers

Relation between two matrix sequences that share the same eigenvalue distribution

Given a sequence $\lbrace A_n \rbrace_n$ of matrices of increasing dimension ($A_n \in M_{d_n}(\mathbb{C})$ with $d_{n+1} > d_n$), we say that the sequence is distributed with respect to the eigenvalues as a function $f \in L^1$ on a domain $D$ if…
1 vote · 0 answers

Row reducing: finding a linear condition on the column space of a matrix

Question: For the matrix $$ A=\begin{pmatrix} 1 & x & x^2 \\ 1 & y & y^2 \\ 1 & z & z^2 \\ \end{pmatrix} $$ the set $$S=\{\textbf b \in \mathbb{R}^3 : \textbf{b} = A\textbf{x} \text{ for some }\textbf x \in \mathbb{R}^3\}$$ is the column space for…
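A quick way to hunt for such a condition (a sketch, with variable names mirroring the question) is to eliminate on the augmented matrix $[A \mid \mathbf{b}]$ with SymPy; whenever a row of $A$ reduces to zero, the corresponding entry of the reduced $\mathbf{b}$ column gives the linear condition:

    import sympy as sp

    x, y, z, b1, b2, b3 = sp.symbols('x y z b1 b2 b3')

    A = sp.Matrix([[1, x, x**2],
                   [1, y, y**2],
                   [1, z, z**2]])
    aug = A.row_join(sp.Matrix([b1, b2, b3]))   # augmented matrix [A | b]

    # Manual elimination keeps the symbolic pivots visible.
    aug[1, :] = aug[1, :] - aug[0, :]           # R2 <- R2 - R1
    aug[2, :] = aug[2, :] - aug[0, :]           # R3 <- R3 - R1
    sp.pprint(aug.applyfunc(sp.simplify))
    # E.g. if z = y the last two rows coincide, forcing b3 = b2 for
    # b to lie in the column space.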
1 vote · 0 answers

Maximizing the determinant $\det (I+SS^T)$

Let $$S=\alpha X + (1-\alpha)Y$$ where $X$ and $Y$ are $m \times n$ matrices with $m > n$ whose singular values are not larger than 1, i.e., $\sigma_{\max}(X) \leq 1$ and $\sigma_{\max}(Y) \leq 1$, where $\sigma_{\max}(\cdot)$ is the largest…
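One general reduction that often helps here (a standard fact, not specific to this question) is the Weinstein–Aronszajn / Sylvester determinant identity $$\det(I_m + SS^\top) = \det(I_n + S^\top S),$$ which moves the problem to the smaller $n \times n$ Gram matrix $S^\top S$, whose eigenvalues are the squared singular values of $S$.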
1 vote · 2 answers

Prove $\sum\limits_{k=0}^{\infty} x^{k} = \frac{1}{1-x}$ also holds for matrices.

I need to prove that $\sum\limits_{k=0}^{\infty} x^{k} = \frac{1}{1-x}$ also holds for matrices, i.e., that $\sum\limits_{k=0}^{\infty} X^{k} = (I-X)^{-1}$ when $x$ is replaced by a suitable square matrix $X$. I honestly have no idea where to start, so any suggestions are welcome.
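A sketch of the standard argument (stated here under the assumption $\|X\| < 1$ for some submultiplicative matrix norm, the natural matrix analogue of $|x| < 1$): the partial sums telescope, $$(I - X)\sum_{k=0}^{N} X^{k} = I - X^{N+1}, \qquad \|X^{N+1}\| \le \|X\|^{N+1} \to 0,$$ so letting $N \to \infty$ shows that the series converges and equals $(I - X)^{-1}$. The sharp hypothesis is that the spectral radius satisfies $\rho(X) < 1$.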
1 vote · 0 answers

Upper bound on the trace of the $k$-th Fréchet derivative of the matrix function $|x^k|$

For any Hermitian matrix $A$, we define $|A|=\sqrt{A^2}$. Let $B$ be a Hermitian matrix. Define $g(t)=\operatorname{Tr} |(A+tB)^k|$. Is it true that $|g^{(k)}(0)|$ is upper bounded by $O(\|B\|_k^k)$, where the hidden constant is independent of $A$ and the dimension…
1 vote · 1 answer

Bound $\|(A+B)-(A^{1/4}(1+A^{-1/2}BA^{-1/2})^{1/2}A^{1/4})^2\|$ in terms of commutator $\|AB-BA\|$

For positive definite matrices $A$ and $B$, can $$ \|(A+B)-(A^{1/4}(1+A^{-1/2}BA^{-1/2})^{1/2}A^{1/4})^2\| $$ be bounded in terms of $\|AB-BA\|$? Note that if the matrices commute, then both norms are zero.
1 vote · 0 answers

Closed form of $\sum_{k=1}^{\infty} X^k M (X^\top)^k$

Let $X,M\in M_n(\mathbb R)$ and additionally suppose that $X$ is a convergent matrix, that is to say its spectral radius $\rho(X)$ is strictly less than $1$. Define matrix $C$ to be $$C=\sum_{k=1}^\infty X^k M (X^\top)^k.$$ Is there a closed…
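For what it's worth, sums of this shape satisfy a discrete Lyapunov (Stein) equation: with $S = \sum_{k\ge 0} X^k M (X^\top)^k$ one has $S - XSX^\top = M$, so $C = S - M$. A minimal SciPy sketch checking this numerically (the scaling of $X$ is only there to force $\rho(X) < 1$):

    import numpy as np
    from scipy.linalg import solve_discrete_lyapunov

    rng = np.random.default_rng(2)
    n = 4
    X = 0.3 * rng.standard_normal((n, n))
    assert np.max(np.abs(np.linalg.eigvals(X))) < 1    # rho(X) < 1
    M = rng.standard_normal((n, n))

    # solve_discrete_lyapunov(a, q) returns S with S = a S a^T + q,
    # i.e. S = sum_{k>=0} X^k M (X^T)^k; dropping the k = 0 term gives C.
    S = solve_discrete_lyapunov(X, M)
    C_closed = S - M

    C_series = sum(np.linalg.matrix_power(X, k) @ M @ np.linalg.matrix_power(X.T, k)
                   for k in range(1, 200))
    print(np.allclose(C_closed, C_series))             # should print True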
1 vote · 0 answers

Directional derivative of a cost function under optimization; finding bounds on decision variables

Given the optimization problem \begin{align} \boldsymbol{x}^*, \boldsymbol{y}^* &= \operatorname*{arg\,min}_{\boldsymbol{x}, \boldsymbol{y}} V(\boldsymbol{x}, \boldsymbol{y})\\ \text{s.t.} \quad & \boldsymbol{A x} + \boldsymbol{I y} = \boldsymbol{0} \end{align} …