For any $x\in\mathbb{R}$, the series $$ \sum_{n\geq 1}\tfrac{1}{n}\,\sin\left(\tfrac{x}{n}\right) $$ is trivially absolutely convergent. It defines a function $f(x)$ and I would like to show that $f(x)$ is unbounded over $\mathbb{R}$. Here are my thoughts/attempts:

  1. $$(\mathcal{L} f)(s) = \sum_{n\geq 1}\frac{1}{1+n^2 s^2} = \frac{-s+\pi\coth\frac{\pi}{s}}{2s}=\sum_{m\geq 1}\frac{(-1)^{m+1}\,\zeta(2m)}{s^{2m}}$$ is a function with no secrets. It behaves like $\frac{\pi}{2s}$ in a right neighbourhood of the origin, like $\frac{\pi^2}{6s^2}$ in a left neighbourhood of $+\infty$. The origin is an essential singularity and there are simple poles at each $s$ of the form $\pm\frac{i}{m}$ with $m\in\mathbb{N}^+$. These facts do not seem to rule out the possibility that $f$ is bounded;
  2. For any $N\in\mathbb{N}^+$ there clearly is some $x\ll e^N$ such that $\sin(x),\sin\left(\frac{x}{2}\right),\ldots,\sin\left(\frac{x}{N}\right)$ are all positive and large enough, making a partial sum of $ \sum_{n\geq 1}\tfrac{1}{n}\,\sin\left(\tfrac{x}{n}\right) $ pretty close to $C\log N$. On the other hand I do not see an effective way to control the tail $ \sum_{n>N}\tfrac{1}{n}\,\sin\left(\tfrac{x}{n}\right) $ - maybe by summation by parts, exploiting the boundedness of the sine integral function?
  3. Some probabilistic argument might be effective. For any $n\geq 3$ we may define $E_n$ as the set of $x\in\mathbb{R}^+$ such that $\sin\left(\frac{x}{n}\right)\geq \frac{1}{\log n}$. The density of any $E_n$ in $\mathbb{R}^+$ is close to $\frac{1}{2}$, so by a Borel-Cantellish argument it looks reasonable that the set of points such that $|f(x)|\geq \frac{\log x}{100}$ is unbounded, but how to make it really rigorous?
  4. To compute $\lim_{x\to x_0}f(x)$ through convolutions with approximate identities seems doable but not really appealing.
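The closed form in point 1 is easy to sanity-check numerically. A minimal sketch (the truncation level `TERMS` and the sample points are arbitrary choices):

```python
import math

TERMS = 100_000  # arbitrary truncation; the omitted tail is O(1/(s^2 * TERMS))

def lhs(s: float) -> float:
    """Partial sum of sum_{n>=1} 1/(1 + n^2 s^2), i.e. the Laplace transform series."""
    return sum(1.0 / (1.0 + (n * s) ** 2) for n in range(1, TERMS + 1))

def rhs(s: float) -> float:
    """Closed form (-s + pi*coth(pi/s)) / (2s)."""
    coth = math.cosh(math.pi / s) / math.sinh(math.pi / s)
    return (-s + math.pi * coth) / (2.0 * s)
```

Comparing `lhs` and `rhs` at a few points, and comparing `rhs` against the leading asymptotic term $\frac{\pi^2}{6s^2}$ for large $s$, confirms the displayed identity and expansion.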
Jack D'Aurizio
  • @JayTuma: no, $\sin\left(\frac{0}{n}\right)=0$, and $$\lim_{x\to 0^+}f(x) = \lim_{m\to +\infty}\sum_{n\geq 1}\frac{m}{1+m^2 n^2}=0.$$ – Jack D'Aurizio Nov 27 '17 at 18:47
  • If $\sin(x) \approx 1$, how can $\sin(x/2) \approx 1$? [this relates to your point 2] Now, if we had $\cos(x/n)$ then we could consider $x = 2\pi k!$ and then $\cos(x/n) = 1$ for $n \in \{1, ..., k\}$. – Michael Nov 27 '17 at 18:59
  • @Michael: at $x=\frac{2\pi}{3}$ both $\sin(x)$ and $\sin(x/2)$ are pretty large. – Jack D'Aurizio Nov 27 '17 at 19:03
  • Well, pretty large is not the same as "close to 1." I assume you really want to say "within $[1/2,1]$" rather than "close to 1." – Michael Nov 27 '17 at 19:06
  • @Michael: $\sqrt{3}/2$ is an element of such interval. Anyway, the issue is not to find some $x$ such that $\sum_{n=1}^{N}\frac{1}{n}\sin\frac{x}{n}$ is approximately $\log N$. The real issue is to show the tail of the series does not compensate such initial grow. – Jack D'Aurizio Nov 27 '17 at 19:09
  • $$f'(x)=\sum _{n=1}^{\infty } \frac{\cos \left(\frac{x}{n}\right)}{n^2}<\sum _{n=1}^{\infty } \frac{1}{n^2}=\frac{\pi ^2}{6}$$ is bounded. Why do you think that $f(x)$ is unbounded? – Raffaele Nov 27 '17 at 19:22
  • @Raffaele: $\log(1+x)$ has a bounded derivative on $\mathbb{R}^+$ and it is an unbounded function. Evidence for the unboundedness of $f(x)$ is given by point 2., mostly. – Jack D'Aurizio Nov 27 '17 at 19:24
  • I have plotted the function in $[-10000,10000]$. It stays in the strip $-3<f(x)<3$. – Raffaele Nov 27 '17 at 19:38
  • @Raffaele: so? The function $$ g(z)=\max_{x\in[0,z]}f(x)$$ is expected to grow veeery slowly, so a plot of $f(x)$, even if over a huge interval, is not enough to prove that $f(x)$ is bounded (i.e. disprove my claim). Also: numerical errors accumulate pretty fast, so it is not simple to have an accurate graph over a huge interval. – Jack D'Aurizio Nov 27 '17 at 19:43
  • I'm not sure how helpful this is, but one can write $f(x)=\sum_{k\ge1}(-1)^{k+1}x^{2k-1}\zeta(2k)/(2k-1)!$. – Jason Nov 27 '17 at 19:49
  • Look at the integral $$\int_1^{\infty } \frac{\sin \left(\frac{x}{t}\right)}{t} \, dt=\text{Si}(x)$$ It is bounded. Its values are inside $(-2,2)$ Why should integer $t$ behave so different? – Raffaele Nov 27 '17 at 19:50
  • @Raffaele: but in order to prove the behaviour of the series is the same as the behaviour of such integral you have to show that some form of the EML or Abel-Plana formula applies. If it is the case, why is it the case? For instance $$\sum_{n\geq 1}\frac{\cos(2\pi n\cdot n!)}{n}$$ is divergent while the associated integral is convergent. – Jack D'Aurizio Nov 27 '17 at 19:52
  • For $x>0$ one can write $f(x)=g(\sqrt{2\pi x})$, where $$g(x)=\sum_{k\geq 0}\frac{\binom{4k+2}{2k+1}}{(4k+2)!}\,\frac{B_{2k+2}}{2k+2}\,x^{4k+2}$$ and $B_n$ is the $n$-th Bernoulli number. – Fimpellizzeri Nov 27 '17 at 20:35
  • @Fimpellizieri: and how such series representation is useful for proving/disproving the boundedness of $f(x)$? – Jack D'Aurizio Nov 27 '17 at 20:45
  • One can also write $$f(x)=\pi\cdot\sum_{k\geq0}\frac{B_{2k+2}}{{(2k+2)!}^2}\,(2k+2)\,{(2\pi x)}^{2k+1}$$ Formally at least, we could then write $$F(x)=\frac12 \sum_{k\geq0}\frac{B_{2k+2}}{{(2k+2)!}^2}\,{(2\pi x)}^{2k+2}$$ and have $F'(x)=f(x)$. Now, the expression for $F(x)$ is reminiscent of $\frac{x}2\coth(\pi x)$, but the squared factorial looks troublesome. – Fimpellizzeri Nov 27 '17 at 20:48
  • @JackD'Aurizio who knows – Fimpellizzeri Nov 27 '17 at 20:49
  • Interesting question! (+1) Based on numerical simulation, it seems to me that for large $x$, the fractional part $n \mapsto \{ x/(2\pi n) \}$ experiences a sudden transition from behaving like a sample of i.i.d. random numbers uniformly drawn from $[0, 1]$ to behaving like a deterministic curve $u \mapsto \{1/(2\pi u) \}$ w.r.t. the rescaled time $u = n/x$. So I am also prone to believe that $\sup_{[0,x]}f$ diverges, but probably at a speed much slower than $\log x$. Of course this explanation provides no single improvement to the current situation... :( – Sangchul Lee Nov 28 '17 at 02:27
  • I found a reference to Hardy/Littlewood, stating that the function is unbounded; see my comment to the (duplicate) question here: https://math.stackexchange.com/questions/2630757/is-fx-sum-n-1-infty-frac1n-sin-left-fracxn-right-bounded?noredirect=1#comment5432612_2630757. – Martin R Feb 01 '18 at 06:50
  • http://onlinelibrary.wiley.com/doi/10.1112/jlms/s2-4.3.385/full – Feb 01 '18 at 07:35
  • https://www.sciencedirect.com/science/article/pii/S0377042712002804 – Feb 01 '18 at 07:41
  • http://www.numdam.org/article/RSMUP_1984__72__373_0.pdf – Feb 01 '18 at 07:41
  • https://www.cs.purdue.edu/homes/wxg/slidesHL.pdf – Feb 01 '18 at 08:47
  • @HJol: this entirely solves the problem. Hardy and Littlewood's approach is elementary and beautiful; it might be interesting to summarize their proof in an answer. The other proofs have interesting parts, too (a variation of the Denjoy-Koksma inequality and the idea of exploiting $\mathcal{L}^{\color{red}{-1}}f$), so this might be a good moment for outlining classical techniques for similar problems. – Jack D'Aurizio Feb 01 '18 at 10:17
  • @JackD'Aurizio , do it yourself and good luck ;) –  Feb 01 '18 at 10:43
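The numerical side of the discussion above (the plot staying in a narrow strip, the very slow growth of $g(z)=\max_{x\in[0,z]}f(x)$) can be reproduced with a short script. A sketch (the grid range, step, and truncation level are arbitrary choices; since $|\sin t|\le|t|$, truncating at $N$ terms costs at most $\sum_{n>N}x/n^2\le x/N$, and grid sampling can of course miss narrow peaks):

```python
import math

def f_trunc(x: float, N: int) -> float:
    """Partial sum f_N(x) = sum_{n<=N} sin(x/n)/n; |f(x) - f_N(x)| <= x/N for x >= 0."""
    return sum(math.sin(x / n) / n for n in range(1, N + 1))

def running_max(x_max: float, step: float, N: int) -> float:
    """Sample |f_N| on a grid over [0, x_max] and return the largest value seen."""
    best = 0.0
    for i in range(int(x_max / step) + 1):
        best = max(best, abs(f_trunc(i * step, N)))
    return best
```

For example, `running_max(200.0, 0.5, 2000)` stays comfortably inside the strip reported by Raffaele, while of course proving nothing about $\sup f$.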

2 Answers


Thanks to MartinR and HJol for referencing a classical result of Hardy and Littlewood, which I am going to outline. The final result is

$$ \sum_{n\geq 1}\frac{1}{n}\,\sin\frac{x}{n}=\Omega\left(\sqrt{\log\log x}\right)\text{ as }x\to +\infty.\tag{FR}$$


Let $q$ denote a number of the form $\prod p^\alpha$, where each prime $p$ is of the form $4m+1$. Since $\mathbb{Z}[i]$ is a UFD, the set of such $q$ is exactly the set of odd numbers representable as a sum of two coprime squares. Let $K=\prod_{q\leq 4k+1}q$, $x=\frac{\pi}{2}K$ and $x_j=(4j+1)x$ for any $j\in[1,K]$. Introducing $$ Q^*(x) = \sum_{n=1}^{K}\frac{1}{n}\,\sin\frac{x}{n},$$ it is enough to approximate $Q^*$ at each $x_j$.

If $n\mid K$ then $\frac{K}{n}$ is a number of the form $4m+1$ and $\sin\left(\frac{x_j}{n}\right)=1$, hence $$ Q^*(x_j) = \sum_{n\mid K}\frac{1}{n}+\sum_{\substack{1\leq n \leq K\\ n\nmid K}}\frac{1}{n}\,\sin\left(\frac{x_j}{n}\right)=\lambda(k)+R(x_j)$$ and $$ \frac{1}{K}\sum_{j=1}^{K}Q^*(x_j) = \lambda(k)+\frac{1}{K}\sum_{\substack{1\leq n \leq K\\ n\nmid K}}\frac{1}{n}\sum_{j=1}^{K}\sin\frac{(4j+1)x}{n}.$$

In the inner sum on the right, $\frac{2x}{n}$ differs from any integer multiple of $\pi$ by at least $\frac{C}{n}$, so that $\left|\sum_{j=1}^{K}\sin\frac{(4j+1)x}{n}\right|\ll\frac{1}{\left|\sin\frac{2x}{n}\right|}=O(n)$ and the LHS of the above equation is $\geq \lambda(k)(1-\varepsilon)$. On the other hand it is trivial that $$ \lambda(k)\geq \sum_{q\leq 4k+1}\frac{1}{q}.$$ Let $N(n)$ be the number of $q$ not exceeding $n$. By sieve methods it is well known that $N(n)\sim\frac{Dn}{\sqrt{\log n}}$, hence the inequality

$$ \sum_{q\leq 4k+1}\frac{1}{q}=\sum_{n\leq 4k+1}\frac{N(n)-N(n-1)}{n}\geq \sum_{n\leq 4k+1}\frac{N(n)}{n(n+1)}\gg\sqrt{\log k} $$ finishes the proof: $\lambda(k)\gg\sqrt{\log k}$, while $\log x_j\ll k\sqrt{\log k}$ gives $\log\log x_j\sim\log k$, so $Q^*$ reaches values of size $\Omega\left(\sqrt{\log\log x}\right)$.
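The arithmetic backbone of the construction can be verified directly for a small $k$. A sketch (the cutoff $4k+1=17$, i.e. $q\in\{5,13,17\}$ and $K=1105$, is an arbitrary small choice): for every divisor $n\mid K$ we have $\frac{K}{n}\equiv 1\pmod 4$, hence $(4j+1)\frac{K}{n}\equiv 1\pmod 4$ and $\sin\left(\frac{x_j}{n}\right)=1$ exactly.

```python
import math

def special_numbers(limit: int):
    """Odd q <= limit all of whose prime factors are congruent to 1 (mod 4)."""
    out = []
    for q in range(2, limit + 1):
        m, ok = q, True
        p = 2
        while p * p <= m:
            while m % p == 0:
                if p % 4 != 1:
                    ok = False
                m //= p
            p += 1
        if m > 1 and m % 4 != 1:
            ok = False
        if ok and q % 2 == 1:
            out.append(q)
    return out

qs = special_numbers(17)                 # the q's up to 4k+1 = 17
K = math.prod(qs)                        # K = 1105
divisors = [n for n in range(1, K + 1) if K % n == 0]
lam = sum(1.0 / n for n in divisors)     # lambda(k) = sum_{n | K} 1/n

# For every n | K and every j, (4j+1)*(K/n) is congruent to 1 (mod 4),
# so sin(x_j / n) = sin((4j+1) * (K/n) * pi/2) = 1.
for n in divisors:
    assert (K // n) % 4 == 1
    for j in (1, 7, K):
        m = (4 * j + 1) * (K // n)
        assert m % 4 == 1
        assert abs(math.sin(m * math.pi / 2) - 1.0) < 1e-6
```

This only checks the exact part of the argument (the main term $\lambda(k)$); bounding the error term $R(x_j)$ on average is the analytic content above.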

Similarly it can be proved that

$$ \sum_{n\geq 1}\frac{1}{n}\,\cos\frac{x}{n}=\Omega\left(\log\log x\right)\text{ as }x\to +\infty.\tag{CR}$$

Alternative approaches: replace $\sum_{n=m}^{M}\frac{1}{n}\sin\frac{x}{n}$ with a similarly shaped integral and carefully estimate the error terms. A classical tool for performing such manipulations is the Denjoy-Koksma inequality. It can be combined with van der Corput's trick and known inequalities to produce interesting generalizations, as shown by Flett and Codecà.

Jack D'Aurizio

[Edit: according to https://math.stackexchange.com/q/182491 Hardy and Littlewood originally showed that this function was unbounded]

A probabilistic argument on the head $\sum_{n\leq N}\tfrac1n\sin(x/n)$ shows that it must take an $O(1)$ value in any interval of length $\Omega(N).$ The tail $\sum_{n>N}\tfrac1n\sin(x/n)$ is $O(1/N)$-Lipschitz, so it varies by $O(1)$ over this interval. So if the head is unbounded above but $f$ is bounded above, then $f$ is unbounded below.

This doesn't rule out the possibility that $f$ is bounded above or below as $x\to+\infty,$ though it must be unbounded above and below on $\mathbb R$ by oddness.

In more detail: define

$$f_{\leq N}(x)=\sum_{n\leq N}\frac1n\sin\left(\frac xn\right),$$ $$f_{> N}(x)=\sum_{n> N}\frac1n\sin\left(\frac xn\right).$$

For any $x,$ $$\frac1{2N}\int_{x-N}^{x+N}f_{\leq N}(t)dt=\frac1{2N}\sum_{n\leq N}\left[-\cos\left(\frac tn\right)\right]_{x-N}^{x+N}\leq 1.$$ So there is some $t\in[x-N,x+N]$ with $\sum_{n\leq N}\frac1n\sin\left(\frac tn\right)\leq 1.$ To bound the Lipschitz constant of the tail we compute $$\left|\frac{d}{dx}f_{>N}(x)\right|=\left|\sum_{n>N}\frac1{n^2}\cos\left(\frac xn\right)\right|\leq 1/N.$$

Assume $x$ satisfies $f_{\leq N}(x)\geq C\log N$ and $f(x)\leq\tfrac 12C\log N.$ Then $f_{>N}(x)\leq -\tfrac 12C\log N.$ We showed there is a $t\in[x-N,x+N]$ with $f_{\leq N}(t)\leq 1,$ and the Lipschitz bound gives $f_{>N}(t)\leq 1-\tfrac 12C\log N.$ So $f(t)\leq 2-\tfrac 12C\log N,$ which tends to $-\infty$ as $N\to\infty.$
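Both quantitative ingredients of this argument are easy to check numerically. A minimal sketch (the values $N=50$, $X=1000$ and the grid step are arbitrary choices):

```python
import math

N, X = 50, 1000.0  # arbitrary test values

def head(t: float) -> float:
    """f_{<=N}(t) = sum_{n<=N} sin(t/n)/n."""
    return sum(math.sin(t / n) / n for n in range(1, N + 1))

# Exact average of the head over [X-N, X+N], using the antiderivative
# of sin(t/n)/n, namely -cos(t/n): each summand is at most 2 in absolute
# value, so the average is at most 2N/(2N) = 1.
avg = sum(math.cos((X - N) / n) - math.cos((X + N) / n)
          for n in range(1, N + 1)) / (2 * N)

# Hence some t in [X-N, X+N] has head(t) <= 1; a grid search exhibits one.
t_best = min((X - N + 0.1 * i for i in range(int(2 * N / 0.1) + 1)), key=head)

# Lipschitz constant of the tail: |f'_{>N}| <= sum_{n>N} 1/n^2 <= 1/N.
tail_lip = sum(1.0 / n ** 2 for n in range(N + 1, 100 * N))
```

The grid search only approximates the minimizer, so the small slack in the bound on `head(t_best)` accounts for the discretization.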
