Questions tagged [inner-products]

For questions about inner products and inner product spaces, including questions about the dot product. An inner product space is a vector space equipped with an inner product. The dot product (seen in multivariable calculus and linear algebra) is a simple example of an inner product—other inner products may be seen as generalizations of the dot product.

Given vectors $x = (x_1, x_2, \dotsc, x_n)$ and $y = (y_1, y_2, \dotsc, y_n)$ in $\mathbb{R}^n$, the dot product of $x$ and $y$ is $$ x \cdot y = \sum_{j=1}^{n} x_j y_j. $$ The dot product on $\mathbb{R}^n$ is linear in both $x$ and $y$ and has the property that $x\cdot x \ge 0$ for all $x$, with equality if and only if $x = 0$. Moreover $x \cdot y = \lVert x\rVert \lVert y\rVert \cos(\theta)$, where $\lVert x\rVert$ denotes the length of $x$ and $\theta$ is the measure of the angle between the vectors $x$ and $y$. The dot product is then an algebraic tool which can be used to describe geometric properties of $\mathbb{R}^n$ (e.g. distance and angle).
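As a quick sanity check, the angle formula can be inverted to recover $\theta$ from the dot product; a minimal Python sketch (the vectors are arbitrary example choices):

```python
import math

def dot(x, y):
    """Dot product of two vectors in R^n."""
    return sum(a * b for a, b in zip(x, y))

def norm(x):
    """Euclidean length of x, i.e. sqrt(x . x)."""
    return math.sqrt(dot(x, x))

x = (3.0, 0.0)
y = (1.0, 1.0)

# x . y = ||x|| ||y|| cos(theta), so theta = arccos(x.y / (||x|| ||y||))
theta = math.acos(dot(x, y) / (norm(x) * norm(y)))
print(dot(x, y))            # 3.0
print(math.degrees(theta))  # ~45.0
```

Here the angle between $(3,0)$ and $(1,1)$ comes out as $45^\circ$, matching the geometric picture.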

An inner product is a generalization of the dot product. An inner product space is a vector space $V$ over a field $\mathbb K$ (either $\mathbb R$ or $\mathbb C$) endowed with a map $\langle\cdot,\cdot\rangle\colon V\times V\longrightarrow\mathbb K$ such that

  1. $(\forall v_1,v_2,v\in V):\langle v_1+v_2,v\rangle=\langle v_1,v\rangle+\langle v_2,v\rangle$;
  2. $(\forall v_1,v_2\in V)(\forall\lambda\in\mathbb{K}):\langle\lambda v_1,v_2\rangle=\lambda\langle v_1,v_2\rangle$;
  3. $(\forall v_1,v_2\in V):\langle v_1,v_2\rangle=\overline{\langle v_2,v_1\rangle}$;
  4. $(\forall v\in V):\langle v,v\rangle\geqslant0$ and $\langle v,v\rangle=0\iff v=0$.

Such a map is called an inner product. As an example, consider the space $\mathcal{C}\bigl([0,1]\bigr)$ of all continuous functions from $[0,1]$ into $\mathbb C$. If $f,g\in\mathcal{C}\bigl([0,1]\bigr)$, define$$\langle f,g\rangle=\int_0^1f(t)\overline{g(t)}\ \mathrm dt.$$
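This integral inner product can be explored numerically; a sketch using a midpoint Riemann sum (the particular functions and step count are illustrative choices, not part of the definition):

```python
import cmath

def inner(f, g, n=100_000):
    """Midpoint-rule approximation of <f,g> = integral over [0,1] of f(t)*conj(g(t)) dt."""
    h = 1.0 / n
    return sum(f((k + 0.5) * h) * complex(g((k + 0.5) * h)).conjugate() * h
               for k in range(n))

f = lambda t: cmath.exp(2j * cmath.pi * t)   # e^{2 pi i t}
g = lambda t: 1.0                            # the constant function 1

fg = inner(f, g)
ff = inner(f, f)
print(abs(fg))  # ~0: e^{2 pi i t} is orthogonal to the constant 1
print(abs(ff))  # ~1: |e^{2 pi i t}| = 1, so <f,f> = 1
```

Note the conjugate on the second argument: it is exactly what makes property 4 work over $\mathbb C$, since $f(t)\overline{f(t)}=|f(t)|^2\geqslant 0$.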

4492 questions
22 votes, 3 answers

The difference between Hermitian, symmetric and self-adjoint operators.

I am struggling with the concepts of Hermitian operators, symmetric operators and self-adjoint operators. All of the relevant material seems quite self-contradictory, and the only notes I have do not quite seem to do the job. Overall I would like to…
Ellya
22 votes, 1 answer

Maximum angle between a vector $x$ and its linear transformation $A x$

Let $A \in \mathbb{R}^{n \times n}$ be a given symmetric positive definite matrix. I would like to find the maximal rotation $A$ can create over any unit vector $x \in \mathbb{R}^n$. In other words, the minimum value of (or a lower bound…
20 votes, 3 answers

What is a complex inner product space "really"?

To be clear: I know the definition of an inner product space and some properties and theorems about them. What I am asking for is intuition for this definition in the complex case. In the real case, the intuition (or at least one…
KotelKanim
20 votes, 3 answers

Relation between metric spaces, normed vector spaces, and inner product spaces.

I am wondering what exactly is the relationship between the three aforementioned spaces. All of them seem to show up many times in: Linear Algebra, Topology, and Analysis. However, I feel like I'm missing the bigger picture of how these spaces…
20 votes, 5 answers

Proof: Sum of dimension of orthogonal complement and vector subspace

Let $V$ be a finite dimensional real vector space with inner product $\langle \, , \rangle$ and let $W$ be a subspace of $V$. The orthogonal complement of $W$ is defined as $$ W^\perp= \left\{ v \in V \,:\, \langle v,w \rangle = 0 \text{ for all…
Jake
20 votes, 3 answers

To show that the orthogonal complement of a set $A$ is closed.

To show that the orthogonal complement of a set $A$ is closed. My try: I first show that the inner product is a continuous map. Let $X$ be an inner product space. For all $x_1,x_2,y_1,y_2 \in X$, by the Cauchy-Schwarz inequality we get $$|\langle…
User8976
19 votes, 3 answers

Derivation of the polarization identities?

For a real (or complex) inner product space $V$, the inner product can be expressed in terms of the norm as either $$ \langle x,y\rangle=\frac{1}{4}(\|x+y\|^2-\|x-y\|^2) $$ or $$ \langle…
Jon Butler
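The real polarization identity quoted in the excerpt above is easy to confirm numerically; a sketch for the real case only (the complex version carries two additional imaginary terms), with randomly chosen example vectors:

```python
import random

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def norm_sq(x):
    """Squared Euclidean norm ||x||^2 = <x, x>."""
    return dot(x, x)

random.seed(0)
x = [random.uniform(-1, 1) for _ in range(5)]
y = [random.uniform(-1, 1) for _ in range(5)]

# <x,y> = (1/4) * (||x+y||^2 - ||x-y||^2)
lhs = dot(x, y)
xp = [a + b for a, b in zip(x, y)]
xm = [a - b for a, b in zip(x, y)]
rhs = 0.25 * (norm_sq(xp) - norm_sq(xm))
print(abs(lhs - rhs) < 1e-12)  # True
```

The identity follows by expanding $\|x\pm y\|^2=\|x\|^2\pm 2\langle x,y\rangle+\|y\|^2$ and subtracting.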
19 votes, 3 answers

Double dot product vs double inner product

Anything involving tensors has 47 different names and notations, and I am having trouble getting any consistency out of it. This document (http://www.polymerprocessing.com/notes/root92a.pdf) clearly ascribes to the colon symbol (as "double dot…
Nick
19 votes, 5 answers

What is the general formula for calculating dot and cross products in spherical coordinates?

I was writing a C++ class for working with 3D vectors. I have written operations in the Cartesian coordinates easily, but I'm stuck and very confused at spherical coordinates. I googled my question but couldn't find a direct formula for vector…
hkBattousai
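One standard route for the question above is to convert both vectors to Cartesian coordinates and use the ordinary dot product; a sketch assuming the physics convention ($\theta$ the polar angle from the $z$-axis, $\varphi$ the azimuth), with the equivalent closed form noted in a comment:

```python
import math

def sph_to_cart(r, theta, phi):
    """Spherical (r, polar theta from +z, azimuth phi) -> Cartesian (x, y, z)."""
    return (r * math.sin(theta) * math.cos(phi),
            r * math.sin(theta) * math.sin(phi),
            r * math.cos(theta))

def dot_sph(v1, v2):
    """Dot product of two vectors given in spherical coordinates.

    Closed form: r1*r2*(sin t1 * sin t2 * cos(p1 - p2) + cos t1 * cos t2),
    obtained by expanding the Cartesian dot product and collecting terms.
    """
    r1, t1, p1 = v1
    r2, t2, p2 = v2
    return r1 * r2 * (math.sin(t1) * math.sin(t2) * math.cos(p1 - p2)
                      + math.cos(t1) * math.cos(t2))

# Example pair (arbitrary values), checked against the Cartesian computation
v1 = (2.0, math.pi / 3, math.pi / 4)
v2 = (1.0, math.pi / 2, 0.0)
a, b = sph_to_cart(*v1), sph_to_cart(*v2)
dot_cart = sum(p * q for p, q in zip(a, b))
print(abs(dot_cart - dot_sph(v1, v2)) < 1e-12)  # True
```

The cross product has no comparably tidy componentwise formula in spherical coordinates, which is why converting to Cartesian first is the usual advice.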
18 votes, 3 answers

Canonical examples of inner product spaces that are not Hilbert spaces?

That is, what are some good examples of vector spaces which are inner product spaces but in which not every Cauchy sequence converges?
Huck Bennett
18 votes, 4 answers

Show that $(2,0,4) , (4,1,-1) , (6,7,7)$ form a right triangle

What I tried: Let $A(2,0,4)$, $B(4,1,-1)$, $C(6,7,7)$ then $$\vec{AB}=(2,1,-5), \vec{AC}=(4,7,3), \vec{BC}=(2,6,8)$$ Then I calculated the angle between vectors: $$\begin{aligned} \alpha_1 &=…
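The dot product shortens the angle computation in the excerpt above: a right angle at a vertex corresponds to a zero dot product of the two side vectors meeting there. A sketch using the vectors already computed in the question:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

A, B, C = (2, 0, 4), (4, 1, -1), (6, 7, 7)
AB = tuple(b - a for a, b in zip(A, B))  # (2, 1, -5)
AC = tuple(c - a for a, c in zip(A, C))  # (4, 7, 3)
BC = tuple(c - b for b, c in zip(B, C))  # (2, 6, 8)

# A right angle at a vertex means the two side vectors meeting there
# have zero dot product.
print(dot(AB, AC))  # 0 -> right angle at A
```

Here $\vec{AB}\cdot\vec{AC}=8+7-15=0$, so the triangle is right-angled at $A$ with no trigonometry needed.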
18 votes, 4 answers

Why is the matrix-defined Cross Product of two 3D vectors always orthogonal?

By matrix-defined, I mean $$\langle a,b,c\rangle \times \langle d,e,f\rangle = \left| \begin{array}{ccc} i & j & k\\ a & b & c\\ d & e & f \end{array} \right|$$ ...instead of the definition of the product of the magnitudes multiplied by the…
Justin L.
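The determinant definition referenced above expands into explicit coordinates, and orthogonality then follows from two dot products vanishing; a sketch with an arbitrary example pair:

```python
def cross(u, v):
    """Cross product via cofactor expansion of the symbolic determinant
    | i  j  k |
    | a  b  c |
    | d  e  f |
    """
    a, b, c = u
    d, e, f = v
    return (b * f - c * e, c * d - a * f, a * e - b * d)

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

u, v = (1, 2, 3), (4, 5, 6)
w = cross(u, v)
print(w)                     # (-3, 6, -3)
print(dot(w, u), dot(w, v))  # 0 0  -> w is orthogonal to both u and v
```

Expanding $(u\times v)\cdot u$ symbolically shows every term appears twice with opposite signs, which is the coordinate-level reason the result is always orthogonal.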
17 votes, 6 answers

Does taking the dot product of two column vectors involve converting one of the vectors into row vectors first?

If you have two vectors living in a subspace $V$ and you want to take their dot product, it seems that you cannot technically do this operation, because if you write both vectors in matrix form, they would both be column vectors living in the same subspace.…
Firesauce
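The two viewpoints in the question above, the dot product of two vectors versus the product of a row matrix and a column matrix, give the same number; a sketch in plain Python (the 3-vectors are arbitrary examples):

```python
def matmul(A, B):
    """Multiply an m x n matrix by an n x p matrix (lists of rows)."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

x = [[1.0], [2.0], [3.0]]   # column vector as a 3x1 matrix
y = [[4.0], [5.0], [6.0]]   # column vector as a 3x1 matrix

# Matrix view: transpose x to 1x3 so the shapes (1,3)(3,1) conform;
# the product is a 1x1 matrix whose single entry is the dot product.
x_T = [[row[0] for row in x]]          # [[1.0, 2.0, 3.0]]
as_matrix = matmul(x_T, y)[0][0]

# Vector view: the dot product is defined directly on coordinates,
# with no transposing required.
as_vectors = sum(a[0] * b[0] for a, b in zip(x, y))

print(as_matrix, as_vectors)  # 32.0 32.0
```

The transpose is bookkeeping for the matrix formalism, not part of the dot product itself: $x\cdot y$ and $x^{\mathsf T}y$ describe the same operation from two notational angles.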
17 votes, 2 answers

Multiplicative norm on $\mathbb{R}[X]$.

How to prove that there is no function $N\colon \mathbb{R}[X] \rightarrow \mathbb{R}$ such that $N$ is a norm on the $\mathbb{R}$-vector space $\mathbb{R}[X]$ and $N(PQ)=N(P)N(Q)$ for all $P,Q \in \mathbb{R}[X]$? Once, my teacher asked if there is a multiplicative…
user10676
17 votes, 1 answer

What exactly is an integral kernel?

I am not sure if I have seen integral transforms in the right way, but a transform like the Fourier transform is actually a basis transformation, right? $$ F(y) = \int K(x,y) f(x) \,\text{d}x $$ where $K(x,y) = \text{e}^{-ixy}$ for the case…