
This question is perhaps a little vague; part of what I want to know is what question I should ask.

First, recall the following form of the Cauchy-Schwarz inequality: let $V$ be a real vector space, and suppose $(\cdot, \cdot) : V \times V \to \mathbb{R}$ is a symmetric bilinear form which is positive semidefinite, that is, $(x,x) \ge 0$ for all $x$. Then for any $x,y \in V$ we have $|(x,y)|^2 \le (x,x) (y,y)$.

I'd like to know what happens if we replace $\mathbb{R}$ by some other space $W$. Suppose at first that $W$ is a real vector space, equipped with a partial order $\le$ that makes it an ordered vector space, as well as a multiplication operation $\cdot$ that makes it an algebra. Then it makes sense to speak of a positive semidefinite symmetric bilinear form $(\cdot, \cdot) : V \times V \to W$, and ask whether it satisfies the Cauchy-Schwarz inequality $(v,w)\cdot(v,w) \le (v,v) \cdot (w,w)$.

Under what conditions on $W$ does this "generalized Cauchy-Schwarz inequality" hold?

At a minimum I expect we will need some more structure on $W$; in particular I assume we would like the multiplication and the partial ordering in $W$ to interact in some reasonable way, so that for instance $w\cdot w \ge 0$ for all $w \in W$. Are there other properties that $W$ should have?

There are lots of proofs of the classical Cauchy-Schwarz inequality; presumably one should try to find one of them which generalizes. But I couldn't immediately see how to do this.
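
For reference, the standard discriminant argument over $\mathbb{R}$ (one natural candidate to generalize) goes as follows: for all $t \in \mathbb{R}$, $$0 \le (x + t y, x + t y) = (x,x) + 2t\,(x,y) + t^2 (y,y).$$ If $(y,y) > 0$, this quadratic in $t$ is nonnegative for every $t$, so its discriminant satisfies $4(x,y)^2 - 4(x,x)(y,y) \le 0$, which is the inequality; if $(y,y) = 0$, nonnegativity of $(x,x) + 2t(x,y)$ for all $t$ forces $(x,y) = 0$, and the inequality holds trivially. Both the discriminant step and the alternative substitution $t = -(x,y)/(y,y)$ use division (or square roots) in $\mathbb{R}$, which a general $W$ need not provide.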


Here are some motivating examples.

As a fairly simple one, let $X$ be any set, and $W = \mathbb{R}^X$ the vector space of all real-valued functions on $X$. We can equip $W$ with the pointwise multiplication and ordering. Then let $V$ be any linear subspace of $W$, and let the bilinear form $V \times V \to W$ also be pointwise multiplication. Then of course Cauchy-Schwarz holds since we can just prove it pointwise.
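
As a crude illustration of the pointwise mechanism (not needed for this example, where the bilinear form is just pointwise multiplication, so that $(f,g)\cdot(f,g) = (f,f)\cdot(g,g)$ with equality), here is a small numerical sketch. It models a $W$-valued positive semidefinite symmetric bilinear form on a finite-dimensional $V$ as a family of real positive semidefinite forms indexed by a finite set $X$, and checks the inequality point by point; all names and parameters are illustrative, not part of the question.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative model: X is a finite set of "points"; a W-valued, positive
# semidefinite, symmetric bilinear form on V = R^n is a family of real
# symmetric PSD matrices G[x], one for each point x in X.
n_points, n = 5, 4
A = rng.standard_normal((n_points, n, n))
G = np.einsum("xij,xkj->xik", A, A)  # G[x] = A[x] @ A[x].T is symmetric PSD

v = rng.standard_normal(n)
w = rng.standard_normal(n)

# (v, w) as an element of W = R^X: one real number per point x.
vw = np.einsum("i,xij,j->x", v, G, w)
vv = np.einsum("i,xij,j->x", v, G, v)
ww = np.einsum("i,xij,j->x", w, G, w)

# Pointwise Cauchy-Schwarz, i.e. the inequality in the partial order of W.
assert np.all(vw**2 <= vv * ww + 1e-9)
print("(v,w)^2 <= (v,v)(w,w) holds at every point of X")
```

This is of course just the pointwise argument in disguise: the inequality in the partial order of $W = \mathbb{R}^X$ is checked coordinate by coordinate.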

For a slightly less trivial example, let $(X,\mu)$ be a measure space, and $W = L^0(X,\mu)$ be the vector space of all measurable functions on $X$, mod $\mu$-almost-everywhere equality (so an element of $W$ is in fact an equivalence class of functions). Again let $\cdot$ be pointwise multiplication (which is well defined), and the ordering $f \le g$ when $f(x) \le g(x)$ almost everywhere. Take again a linear subspace $V \subset W$, and pointwise multiplication as the bilinear form. Now Cauchy-Schwarz holds because we can prove it pointwise on a set of full measure.

A related but more complicated example (and my original motivation) is the quadratic variation form from probability. For instance, we could take $V$ to be the vector space of continuous $L^2$ martingales on some filtered probability space over some time interval $[0,T]$, and $W$ the vector space of continuous adapted processes of bounded variation, mod indistinguishability, with pointwise multiplication and the partial order $X \le Y$ iff $X_t \le Y_t$ for all $t$ almost surely. Then the quadratic variation $\langle M,N \rangle$ is a symmetric positive semidefinite bilinear form from $V \times V$ to $W$.

In this case I can prove the Cauchy-Schwarz inequality pointwise: fix $M,N \in V$. For almost every $\omega$, for all $t \in [0,T]$ and all nonzero $q \in \mathbb{Q}$ I can say $$q^2 \langle M,M \rangle_t(\omega) \pm 2 \langle M,N \rangle_t(\omega) + \frac{1}{q^2} \langle N,N \rangle_t(\omega) = \langle q M \pm \frac{1}{q} N \rangle_t(\omega) \ge 0,$$ and then letting $q$ run through rationals approaching $\left( \langle N,N \rangle_t(\omega) / \langle M,M \rangle_t(\omega) \right)^{1/4}$ when $\langle M,M \rangle_t(\omega) > 0$ (and letting $q \to \infty$ otherwise) shows that $$|\langle M,N \rangle_t(\omega)| \le \sqrt{\langle M,M \rangle_t(\omega) \langle N,N \rangle_t(\omega)},$$ which is what we want.
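
To spell out the last optimization step (just the standard AM-GM computation, written out for completeness): writing $a = \langle M,M \rangle_t(\omega)$, $b = \langle N,N \rangle_t(\omega)$, $c = \langle M,N \rangle_t(\omega)$, the displayed inequality gives, when $a > 0$, $$2|c| \le \inf_{q \in \mathbb{Q},\, q > 0}\left( q^2 a + \frac{1}{q^2}\, b \right) = \inf_{r > 0}\left( r^2 a + \frac{1}{r^2}\, b \right) = 2\sqrt{ab},$$ where the two infima agree by continuity in $q$ and the last equality is AM-GM, with the minimum attained at $r = (b/a)^{1/4}$. If instead $a = 0$, then $\pm 2c + b/q^2 \ge 0$ for all rational $q > 0$, and letting $q \to \infty$ forces $c = 0$, so the inequality again holds.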

In each of these examples, we are working on function spaces (or quotients thereof), and the proof essentially operates pointwise. I'm hoping for some kind of more abstract global argument.

Nate Eldredge

1 Answer


I think the space $W$ should first be equipped with a partial order $\leq$ and a zero element $0$ satisfying:

  1. If $a\leq b$, then $ca\leq cb$ for all $a,b\in W$ and all $c\in W$ with $0\leq c$.
  2. If $a\leq b$, then $a-b\leq 0$.

Secondly, a multiplication operation $\cdot$ should be defined on $W$, satisfying $0\leq a\cdot a\stackrel{\triangle}{=}a^2$ for all $a\in W$. Also, the inverse operation of $\cdot$ should be defined on $W$ (alternatively, inverse elements should exist in $W$): that is, if $a\cdot b=c$ with $b\neq 0$, then $a\stackrel{\triangle}{=}c/b$, where $/$ is the inverse operation of $\cdot$. What is more, these operations should be closed in $W$: for all $a,b\in W$ we have $a\cdot b\in W$, and $a/b\in W$ whenever $b\neq 0$. Finally, the operations $\cdot$ and $/$ should satisfy the commutative law.

Thirdly, there should be a multiplication between elements of $W$ and elements of $V$, because we will use it to define the inner product. What is more, for the Cauchy-Schwarz inequality to hold, the properties of the inner product are important. I believe the Cauchy-Schwarz inequality is valid in any space with an inner product whose definition is the classical one. In other words, suppose the space $V$ has been given an inner product $(*,*)$ (that is, a bilinear form $V\times V\rightarrow W$) satisfying the following conditions:

  1. Symmetry: $(x,y)=(y,x)$ for all $x,y\in V$. (If $V$ is a complex space, the right-hand side should be conjugated, but for the sake of simplicity we ignore that here.)
  2. Linearity: $(\alpha x+\beta y, z)=\alpha(x,z)+\beta(y,z)$ for all $x,y,z\in V$ and $\alpha,\beta\in W$.
  3. Positive definiteness: $(x,x)\geq0$ for all $x\in V$, with $(x,x)=0$ iff $x=0$; here the $0$ on the right of the inequality denotes the zero element of $W$.

Then, with this definition, the Cauchy-Schwarz inequality is valid. The proof is as follows:

For all $\lambda\in W$ and all $x,y \in V$, we have \begin{equation} 0\leq (x+\lambda y,x+\lambda y)=(x,x)+2\lambda(x,y)+\lambda^2(y,y). \end{equation} If $y=0$, that is a trivial case and the Cauchy-Schwarz inequality obviously holds. If $y\neq 0$, let $\lambda=-(x,y)/(y,y)$; then \begin{equation} 0\leq(x,x)-2\frac{(x,y)^2}{(y,y)}+\frac{(x,y)^2}{(y,y)^2}(y,y)=(x,x)-\frac{(x,y)^2}{(y,y)}, \end{equation} and multiplying through by $(y,y)\geq 0$ gives \begin{equation} (x,y)^2\leq (x,x)(y,y). \end{equation} This is the Cauchy-Schwarz inequality.
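
As a trivial numerical sanity check of the completed-square step above, in the classical case $W = \mathbb{R}$ and $V = \mathbb{R}^n$ with the standard dot product (the code and all names in it are merely illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

# Classical case: V = R^n, W = R, (x, y) the standard dot product.
n = 6
x = rng.standard_normal(n)
y = rng.standard_normal(n)

# The substitution lambda = -(x, y) / (y, y) used in the proof above
# (y is nonzero with probability one here, so (y, y) > 0).
lam = -np.dot(x, y) / np.dot(y, y)

# 0 <= (x + lam*y, x + lam*y) = (x, x) - (x, y)^2 / (y, y)
lhs = np.dot(x + lam * y, x + lam * y)
rhs = np.dot(x, x) - np.dot(x, y) ** 2 / np.dot(y, y)
assert lhs >= 0 and np.isclose(lhs, rhs)

# ... which rearranges to the Cauchy-Schwarz inequality.
assert np.dot(x, y) ** 2 <= np.dot(x, x) * np.dot(y, y)
print("completed-square identity and Cauchy-Schwarz verified numerically")
```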

In fact, the Cauchy-Schwarz inequality says that the inner product of two elements is at most the product of their lengths, because there is an angle between them. And $W$ is the space in which the inner product on $V$ takes its values. So I think the conditions I assumed at the start are reasonable.

Lion
  • Thank you very much for your answer. I edited my question to add some more (simpler) examples to better explain what I'm looking for, and in light of them it seems that your assumptions are rather onerous. For instance, in my first example, the pointwise multiplication on $W = \mathbb{R}^X$ doesn't admit an inverse (because a function can be zero at some points), and we can't expect that it makes sense to multiply $wv$ where $w \in W$ and $v \in V$ (since $V$ could be a very small subspace of $W$). – Nate Eldredge Feb 06 '14 at 22:23
  • I am a little confused. I do not quite understand the 'pointwise multiplication' you mentioned. Do you mean multiplying two functions at the same point? I think both inverses and the multiplication between $V$ and $W$ are reasonable even in your example. For one thing, the inverse element (alternatively, the division operator) only needs to be defined on some nontrivial subset of $W$, say the nonzero elements. Just as in $\mathbb{R}$, as we all know, $1/0$ makes no sense. – Lion Feb 07 '14 at 10:29
  • I think the inverse element is necessary because, if $W$ has a multiplication and a unit element '1', then the equation $a\cdot b=1$ is a reasonable one to pose for $0\neq a,b\in W$. What is more, the inverse element has been used in both my proof and yours; if we can avoid it, then I think we can drop the condition of inverse elements. For another thing, the multiplication between $W$ and $V$ is also reasonable: if we drop it, how do we define the linearity of the inner product? Defining it using complex scalars is clearly unreasonable, because the 'measure' of $V$ takes its values not in $\mathbb{C}$ but in $W$. – Lion Feb 07 '14 at 10:38
  • Personally, I find it a new and interesting problem what structure to put on a space $W$ that is used to measure $V$ when $W$ is not a set of numbers. But I think the Cauchy-Schwarz inequality describes a property of a well-defined inner product on $V$, so the validity of the inequality depends largely on the definition of the inner product. Maybe part of my answer is wrong, and I would be very glad to find the correct answer together. – Lion Feb 07 '14 at 10:45
  • I agree with @NateEldredge. His `pointwise multiplication` means that $f\cdot g$ is the function from $X$ to $\mathbb{R}$ that maps $x$ to $f(x)\cdot g(x)$. This operation doesn't admit an inverse. – Du Phan Feb 08 '14 at 19:46
  • I see what you mean. But I think we can handle the existence of inverse elements by checking piecewise. We divide $X=Y\cup Z$, where $Y=\{x\in X \mid f(x),g(x)\neq 0\}$ and $Z=\{x\in X \mid f(x)=0 \text{ or } g(x)=0\}$. Then we can check the Cauchy-Schwarz inequality on $Y$ by defining the inverse element there, and the deduction is as before. On $Z$, although we cannot define inverse elements, the Cauchy-Schwarz inequality obviously holds because that is a trivial case. – Lion Feb 09 '14 at 02:33
  • In summary, as in my comment above, we should define the inverse element on the nonzero part of $W$. On the null set we do not need any proof (and certainly do not need to define an inverse element), because the Cauchy-Schwarz inequality is obvious there. – Lion Feb 09 '14 at 02:36
  • @Lion Sorry, I still don't get why the Cauchy-Schwarz inequality is obvious when both $(x,x)$ and $(y, y)$ don't have any inverse. – Du Phan Feb 11 '14 at 11:55
  • @DuPhan There may be some mistakes and ambiguities in my deduction, but what I want to say is that we can check the inequality piecewise. I believe that if $f$ does not have an inverse element on a domain $N\subset X$, then $N$ is the null set of $f$, i.e. $f(x)=0$ for all $x\in N$. Then we cannot define an inverse element of $f$ on $N$, but the Cauchy-Schwarz inequality is obvious in this case, since $f=0$ on $N$. I think that is reasonable. – Lion Feb 11 '14 at 13:38
  • For example, let $V=\mathbb{R}^2$ and $W=\mathbb{R}$, and define the classical inner product on $V$. Surely, we cannot define the inverse (reciprocal) of $(x,x)$ if $(x,x)=0$. But the C-S inequality is obvious in that case, since $x$ is the zero vector. – Lion Feb 11 '14 at 13:42
  • @Lion: We know that the C-S inequality holds if $W = \mathbb{R}$. This implies that the C-S inequality holds when $W = \mathbb{R}^X$, as suggested by Nate Eldredge; the pointwise argument can be applied in this case. But in general, how can you get over the difficulty that $(x,x)$ and $(y,y)$ don't have any inverse in $W$? – Du Phan Feb 12 '14 at 14:09
  • @DuPhan If $(x,x)\in W$ does not have any inverse, then it must be the zero element $0$. Otherwise, the inverse of $0\neq a\in W$ is obtained by solving $a\cdot b=1$, where $b\neq 0$ and $1$ is the identity element; and conversely, $0$ obviously has no inverse element. So I believe the two conditions are necessary and sufficient for each other (though this needs a rigorous proof). Thus, if $(x,x)\in W$ has no inverse element, it must be the zero element, and in this case the C-S inequality is obvious because each side of the inequality is zero. – Lion Feb 12 '14 at 17:00
  • @Lion: Consider the case $X = \{0, 1\}$, and $f \in W = \mathbb{R}^X$ defined by $f(0) = 0, f(1) = 1$. It is clear that $(f, f) = f$ which is still nonzero in $W$ and has no inverse. – Du Phan Feb 13 '14 at 04:39
  • @Lion: Anyway, your exposition is valuable for who gets interested in this problem. Regards, :) – Du Phan Feb 13 '14 at 06:45
  • @DuPhan For your example, we note that $f$ is a map from $X\subset\mathbb{R}$ to $W\subset\mathbb{R}$. If we choose $x=0\in X$, we have $(f,f)=f=0$, which is a zero element in $W$. Thank you for your comment. I am glad to answer this question; it does make sense. :) – Lion Feb 13 '14 at 10:30
  • @Lion: What is the role of $x$ in your argument? Here $f$ is a function from $X$ to $\mathbb{R}$. The zero element of $W$ is the function $0$ (defined by $0 \mapsto 0$ and $1 \mapsto 0$). The unit element of $W$ is the function $1$ (defined by $0 \mapsto 1$ and $1 \mapsto 1$). – Du Phan Feb 13 '14 at 13:10
  • @Lion: I make this comment to clarify for you what Nate Eldredge means in his example. Let $X$ be any nonempty set and let $W = \mathbb{R}^X$. You can regard each element of $W$ as a function from $X$ to $\mathbb{R}$. Define the inner product $(f,g)$ by $(f,g)(x) = f(x)\cdot g(x)$ for all $x\in X$. The order in $W$ is $f \leq g \Leftrightarrow f(x) \leq g(x)$ for all $x\in X$. To prove $((f,g),(f,g)) \leq ((f,f),(g,g))$, we just prove it pointwise (thanks to our definition of the order). This is trivial because each of them is equal to $f(x)^2\cdot g(x)^2$ at every $x$. – Du Phan Feb 13 '14 at 13:21
  • I mean $((f,g),(f,g))(x) = ((f,f),(g,g))(x) = f(x)^2\cdot g(x)^2$. – Du Phan Feb 13 '14 at 13:29
  • I am so sorry I misunderstood your and @NateEldredge's meaning. But I would like to point out that, by your definition, the inner product is a map $W\times W\rightarrow\mathbb{R}$. And in $\mathbb{R}$, as we all know, the inverse element is given by the reciprocal. Your example is a good way to challenge the requirement that inverse elements exist, but in my opinion it is a special case, just as we can check the identity $\sin^2 A+\cos^2 A=1$ at $A=\pi/2$ numerically without any deduction. – Lion Feb 13 '14 at 15:27
  • If you define the inner product using pointwise multiplication, it is obvious that $(f,g)\cdot(f,g)=f\cdot f\cdot g\cdot g=(f,f)\cdot(g,g)$. Moreover, the multiplication you defined happens to satisfy the commutative law, but that by itself is of no use for the three properties of an inner product. In functional analysis, however, all we have to work with are those three properties of an inner product; in that case, it is necessary to make more assumptions to get the C-S inequality. – Lion Feb 13 '14 at 15:41
  • What is more, if inverse elements are not defined in $W$, say $W$ is not a division algebra or an associative algebra, then it is not a 'good' space in which to measure an inner product. Although I cannot prove that being a division algebra is a necessary condition on $W$, I believe the condition is important for abstract analysis. – Lion Feb 13 '14 at 15:49
  • This may be my final comment, because these discussions may distract other people. First, the inner product is a map $W \times W \to W$. Second, all three properties of the inner product hold. Third, when you come to spaces of functions, you can hardly find inverse elements there. Fourth, being a division algebra cannot be a necessary condition on $W$, as the example above points out. – Du Phan Feb 14 '14 at 10:41
  • Fifth, in general, if you have two spaces $A$ and $B$ with $A \subset B$, you may want to work with $A$ because $B$ is too big to handle. But think of it this way: if you work with $B$, you may get results that hold in $B$ but not in $A$, because $A$ is too small for those results to occur. I think this is why mathematicians try to generalize results. – Du Phan Feb 14 '14 at 10:43
  • Ok, thank you very much. – Lion Feb 14 '14 at 13:13