12

Let $f : \mathbb R \to \mathbb R$ be a given function with $\lvert f(x) \rvert \le 1$ and $f(0) = 1$. Is there a nice simplified expression for $$\begin{align}F(x) &= f(x) f(x/2) f(x/4) f(x/8) \cdots \\ &= \prod_{i=0}^\infty f(x/2^i)?\end{align}$$ If there isn't a general solution, as seems likely, can anything useful be said about the case when $f(x) = \operatorname{sinc} x = \frac{1}{x} \sin x$?

This question arose when idly wondering about the limit of convolving an infinite number of dyadically(?) scaled versions of a kernel $g$ together. Taking the Fourier transform of $g(x) * 2g(2x) * 4g(4x) * 8g(8x) * \cdots$ yields the above expression.
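
(To spell out the correspondence, with the convention $\hat g(\xi) = \int g(t) e^{-i\xi t}\,dt$ — other conventions change only constants: substituting $s = 2^k t$ gives
$$\mathcal{F}\bigl[2^k g(2^k t)\bigr](x) = \int 2^k g(2^k t)\,e^{-ixt}\,dt = \int g(s)\,e^{-i(x/2^k)s}\,ds = \hat g\bigl(x/2^k\bigr),$$
and since convolution becomes multiplication under $\mathcal F$, the transform of the infinite convolution is $\prod_{k=0}^\infty \hat g(x/2^k)$, i.e. $F(x)$ with $f = \hat g$.)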

KON3
  • Please add tags as appropriate; I couldn't think of any others. –  Jan 09 '11 at 08:45
  • f(x) = sin x is the only case I know where the answer is nice. It's a nice exercise. I guess something like f(x) = e^x as well. – Qiaochu Yuan Jan 09 '11 at 13:44
  • @Qiaochu Yuan: If sin(x) is nice, then I would have to conclude (based on the method given below) that sinc(x) will also work out. But perhaps you meant sinc(x) since sin(0) = 0. – hardmath Jan 09 '11 at 18:31
  • Oh, this reminds me, I went ahead and calculated a related product a few months ago: http://deoxygerbe.wordpress.com/2010/10/13/infinite-product-of-n1-to-infinity-of-n-sin1n – graveolensa Jan 09 '11 at 23:52
  • @Qiaochu: I guess you mean f(x) = cos x, for which F(x) = sinc 2x. It's fun to visualize that in terms of convolutions. –  Jan 09 '11 at 23:54
  • @Rahul: yes, that's the example I was thinking of. – Qiaochu Yuan Jan 09 '11 at 23:54

3 Answers

9

Assuming analyticity of $f(x)$, it seems more tractable to work with the form involving sums after taking logarithms:

$$ \log F(x) = \sum_{i=0}^\infty \log f(x/2^i)$$

Now suppose $\log f(x)$ has a convergent power series expansion in a neighborhood of zero:

$$ \log f(x) = \sum_{k=1}^\infty a_k x^k$$

Note that since $f(0) = 1$, the constant term $a_0$ of $\log f(x)$ is zero and is thus omitted.

Then:

$$ \log F(x) = \sum_{k=1}^\infty a_k \left(\sum_{i=0}^\infty 2^{-ik}\right) x^k$$

Note that the inner sums are geometric series, $\sum_{i=0}^\infty 2^{-ik} = \frac{1}{1 - 2^{-k}}$, and are thus uniformly bounded by $2$ for $k \ge 1$.
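
As a sanity check, here is a minimal numerical sketch of this recipe for $f(x) = \operatorname{sinc}(x)$ (it assumes SymPy is available; the truncation constants `ORDER` and `N_FACTORS` are arbitrary choices, not part of the argument). It expands $\log(\sin x / x)$, rescales each coefficient by $1/(1 - 2^{-k})$, and compares $\exp$ of the truncated series against a truncated product:

```python
# Minimal sketch (assumes SymPy): check  log F(x) = sum_k a_k x^k / (1 - 2^{-k})
# for f(x) = sinc(x) = sin(x)/x, inside the disk of convergence (|x| < pi).

import math
import sympy as sp

x = sp.symbols('x')
ORDER = 14        # truncation order of the power series (arbitrary)
N_FACTORS = 40    # truncation of the infinite product (arbitrary)

# a_k: Taylor coefficients of log f(x) = log(sin(x)/x) at 0 (only even k are nonzero)
log_f = sp.series(sp.log(sp.sin(x) / x), x, 0, ORDER).removeO()
a = [log_f.coeff(x, k) for k in range(ORDER)]

# b_k = a_k / (1 - 2^{-k}): Taylor coefficients of log F(x)
b = [sp.Integer(0)] + [a[k] / (1 - sp.Rational(1, 2**k)) for k in range(1, ORDER)]

def F_from_series(x0):
    """exp of the truncated power series of log F, evaluated at x0."""
    return math.exp(sum(float(c) * x0**k for k, c in enumerate(b)))

def F_from_product(x0, n=N_FACTORS):
    """Truncated product  prod_{i=0}^{n-1} sinc(x0 / 2^i)  (x0 assumed nonzero)."""
    out = 1.0
    for i in range(n):
        t = x0 / 2**i
        out *= math.sin(t) / t
    return out

for x0 in (0.5, 1.0, 2.0):
    print(x0, F_from_series(x0), F_from_product(x0))
```

For $f = \operatorname{sinc}$ only the even-order coefficients are nonzero, and rescaling them reproduces the Bernoulli-number formula given in the comments below.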

hardmath
  • Thanks, that looks like a good way to go about it. Unfortunately there doesn't seem to be a nice expression for the power series expansion of log f(x) in terms of that of f(x). –  Jan 09 '11 at 23:54
  • @Rahul Narain: Apparently Sage (via Maxima) can produce the power series expansion of $\log \operatorname{sinc}(x)$. Let me play around with it and see what comes out. – hardmath Jan 10 '11 at 01:27
  • @RahulNarain Using this approach yields $F(x)=\exp\left(\sum_{n=1}^{\infty}\frac{(-1)^n 2^{4 n-1}B_{2 n} x^n}{\left(4^n-1\right)n(2n)!}\right)$ where $B_i$ is the $i$th [Bernoulli number](http://en.wikipedia.org/wiki/Bernoulli_number) – kram1032 Dec 15 '12 at 13:22
  • $x^n$ above should be $x^{2n}$. My bad. – kram1032 Dec 15 '12 at 13:31
  • This approach cannot hold beyond the first zero of $F(x)$ though, can it? Because the series *has* to become $-\infty$ at that point to satisfy it. And beyond that point, it would have to become complex to go negative, which it definitely does... (see my graph of it [here](http://math.stackexchange.com/q/240687/49989)) – kram1032 Dec 15 '12 at 13:54
  • @kram1032: Yes, the representation of $\log F(x)$ as a power series should blow up at the nearest zero of $F(x)$ to the origin. Thanks for taking up this problem. Note the OP's conclusion (other answer here) that the solution is infinitely differentiable but nowhere analytic for $f(x) = \operatorname{sinc}(x)$. – hardmath Dec 15 '12 at 14:34
  • @hardmath Yeah, just like the Fourier transform of it, which my question was all about. I wonder if there are algorithmic ways to quickly approximate the limit more directly than doing a finite sum or product, stopping at an arbitrary term. I suspect the Fabius function could be valuable for interpolation and/or as a wavelet kernel. Googling something along those lines yielded nothing, though. – kram1032 Dec 15 '12 at 15:48
  • @kram1032: About the coefficients of the expansion of $\log(\operatorname{sinc}(x))$: could you please add the explicit formula (via Bernoulli numbers) to OEIS, [link](http://oeis.org/A046989)? As you found above, A046989 is the denominator of $(-1)^n 2^{2n-1}B_{2n}/(2n)!$ or something like this. – Egor Maximenko Feb 17 '14 at 23:10
  • @EgorMaximenko: this was an old comment. After a few minutes, you can no longer edit comments. The actual answer isn't from me. So unfortunately, I cannot fulfill your request. – kram1032 Feb 22 '14 at 00:41
4

For what it's worth, I found a partial answer to the thought that motivated my original question. When $f(x) = \operatorname{sinc}(x)$, it is the Fourier transform of a rectangular function, $g(t) = 1/2$ for $t \in [-1,1]$ and $g(t) = 0$ otherwise (or some scaling thereof, depending on convention). The infinite product $f(x) f(x/2) f(x/4) f(x/8) \cdots$ corresponds to the convolution which I wrote as $g(t) * 2g(2t) * 4g(4t) * 8g(8t) * \cdots$ in a small abuse of notation. Each term in the latter convolution is the probability density function of a uniformly distributed random variable, and the convolution is the pdf of their sum. This distribution is given by (some scaling of) the Fabius function: a compactly supported, infinitely differentiable function that is not analytic anywhere in its support. So the original infinite product falls off faster than any power of $x$ (though not exponentially fast, since that would force the density to be analytic in a strip), but probably cannot be expressed in any nice form.
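
As a quick numerical illustration of this correspondence, here is a minimal Monte Carlo sketch (it assumes NumPy and the convention $\operatorname{sinc} x = \sin x / x$, which is the characteristic function of a uniform variable on $[-1,1]$; the truncation `N` and sample size are arbitrary). It checks that the truncated product matches the empirical characteristic function of $\sum_i U_i / 2^i$:

```python
# Minimal Monte Carlo sketch (assumes NumPy): compare the truncated product
# prod_{i=0}^{N-1} sinc(x / 2^i) with the empirical characteristic function of
# S = sum_{i=0}^{N-1} U_i / 2^i, where the U_i are iid uniform on [-1, 1].

import numpy as np

rng = np.random.default_rng(0)
N = 30               # number of factors / summands (truncation, arbitrary)
SAMPLES = 200_000    # Monte Carlo sample size (arbitrary)

U = rng.uniform(-1.0, 1.0, size=(SAMPLES, N))
S = (U / 2.0 ** np.arange(N)).sum(axis=1)   # samples of the (truncated) limiting sum

def F_product(x, n=N):
    """Truncated product of sinc(x / 2^i) for i = 0..n-1 (x assumed nonzero)."""
    out = np.ones_like(x, dtype=float)
    for i in range(n):
        t = x / 2.0 ** i
        out *= np.sin(t) / t
    return out

xs = np.array([0.5, 1.0, 2.0, 4.0])
emp = np.array([np.cos(x0 * S).mean() for x0 in xs])  # E[cos(x S)]; S is symmetric
print(np.column_stack([xs, F_product(xs), emp]))
```

The last two columns agree to within Monte Carlo error.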

1

I've uploaded here plots of some approximations of the limit distributions defined by the inverse Fourier transform: $$ \mathrm{h}_a(x):=\mathcal{F}^{-1}\biggl[\prod\limits_{k=1}^{\infty}\operatorname{sinc}(t\cdot a^{-k})\biggr](x). $$
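
For readers who want to reproduce such plots, here is a minimal sketch (it assumes NumPy and Matplotlib; the base $a = 2$ and the truncation/grid parameters are arbitrary choices). Since the truncated product is real and even in $t$, the inverse transform reduces to a cosine integral, $h_a(x) \approx \frac{1}{\pi}\int_0^T \prod_{k=1}^{K}\operatorname{sinc}(t\, a^{-k})\cos(xt)\,dt$:

```python
# Minimal sketch (assumes NumPy/Matplotlib): approximate h_a(x) by numerically
# inverting a truncated product. The integrand is real and even in t, so
# h_a(x) ~ (1/pi) * integral_0^T  prod_{k=1}^{K} sinc(t / a^k) * cos(x t)  dt.

import numpy as np
import matplotlib.pyplot as plt

a, K, T, M = 2.0, 40, 60.0, 6000      # base, number of factors, cutoff, grid points
t = np.linspace(1e-9, T, M)           # start just above 0 to avoid 0/0 in sin(t)/t
dt = t[1] - t[0]

prod = np.ones_like(t)
for k in range(1, K + 1):
    s = t / a**k
    prod *= np.sin(s) / s

xs = np.linspace(-1.5, 1.5, 301)
h = (prod * np.cos(np.outer(xs, t))).sum(axis=1) * dt / np.pi   # Riemann sum

plt.plot(xs, h)
plt.title("Numerical approximation of $h_2(x)$")
plt.show()
```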

Oleg Kravchenko