Consider the following matrix inequality: $A(x) = x_1A_1 + x_2A_2 + \dots + x_nA_n \preceq B$. In Stephen Boyd's book on convex optimization, it is mentioned that the solution set of the above matrix inequality, $\{x \mid A(x) \preceq B \}$, is the inverse image of the positive semidefinite cone under the affine function $f: \mathbf{R}^n \rightarrow \mathbf{S}^m$ given by $f(x) = B - A(x)$. In other words, $B - A(x) \succeq 0$. But how is this equivalent to $A(x) \preceq B$? The first one is an expression of positive semidefiniteness, while the second one looks like a matrix inequality (a componentwise inequality). How are they related?

See this: http://math.stackexchange.com/questions/669085/what-does-curly-curved-less-than-sign-succcurlyeq-mean/ – Michael Grant Jul 21 '16 at 21:30
1 Answer
Boyd uses $\prec$ and $\preceq$ to denote componentwise inequality when comparing vectors ($v \prec w$ denotes $v_i < w_i$ for all $i$), and positive [semi]definiteness when comparing matrices ($A \prec B$ is defined to mean $B - A \succ 0$). So everything you've written above is a matrix inequality (a statement about positive semidefiniteness). I believe this notation is standard; I don't think people deal with componentwise matrix inequalities very much.
As I am looking through the book, I see that he does not really make this clear. Example 2.15 on page 43 does mention this definition in passing.
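To make the distinction concrete, here is a small numerical sketch (my own example, not from the book or the answer) showing that $A \preceq B$ in Boyd's sense means $B - A$ has nonnegative eigenvalues, which is not the same as $A_{ij} \le B_{ij}$ entrywise:

```python
import numpy as np

# Illustrative matrices (assumed for this sketch): A and B are symmetric.
A = np.array([[1.0, 0.9],
              [0.9, 1.0]])
B = np.array([[2.0, 0.0],
              [0.0, 2.0]])

# Semidefinite order: A ⪯ B  <=>  B - A ⪰ 0  <=>  all eigenvalues of B - A are >= 0.
eigs = np.linalg.eigvalsh(B - A)       # eigenvalues of the symmetric matrix B - A
semidefinite = bool(np.all(eigs >= -1e-12))

# Componentwise comparison, for contrast.
componentwise = bool(np.all(A <= B))

print(semidefinite)    # True:  eigenvalues of B - A are 0.1 and 1.9, so A ⪯ B
print(componentwise)   # False: A[0,1] = 0.9 > B[0,1] = 0.0
```

Here the two notions disagree: $A \preceq B$ holds in the semidefinite (Loewner) order even though $A$ is not entrywise below $B$.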
angryavian

Absolutely right. See the question I posted in the comments above. – Michael Grant Jul 22 '16 at 00:51