
How to prove this identity? $$\pi=\sum_{k=-\infty}^{\infty}\left(\dfrac{\sin(k)}{k}\right)^{2}\;$$ I found the above interesting identity in the book $\bf \pi$ Unleashed.

Does anyone know how to prove it?

Thanks.

J. M. ain't a mathematician
Neves

7 Answers


Find a function whose Fourier coefficients are $\sin{k}/k$. Then evaluate the integral of the square of that function.

To wit, let

$$f(x) = \begin{cases} \pi & |x|<1\\0&|x|>1 \end{cases}$$

Then, if

$$f(x) = \sum_{k=-\infty}^{\infty} c_k e^{i k x}$$

then

$$c_k = \frac{1}{2 \pi} \int_{-\pi}^{\pi} dx \: f(x) e^{-i k x} = \frac{\sin{k}}{k}$$

By Parseval's Theorem:

$$\sum_{k=-\infty}^{\infty} \frac{\sin^2{k}}{k^2} = \frac{1}{2 \pi} \int_{-\pi}^{\pi} dx \: |f(x)|^2 = \frac{1}{2 \pi} \int_{-1}^{1} dx \: \pi^2 = \pi $$
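
As a quick sanity check (not part of the proof), the truncated sum can be compared against $\pi$ numerically. This is a minimal sketch in plain Python; the helper name `partial_sum` is arbitrary, and the truncation error is roughly $O(1/N)$ because the discarded tail behaves like $\sum_{|k|>N}1/(2k^2)$.

```python
import math

def partial_sum(N):
    """Truncated sum of sin(k)^2 / k^2 over |k| <= N; the k = 0 term is 1 (the limit of sinc^2)."""
    s = 1.0
    for k in range(1, N + 1):
        s += 2.0 * math.sin(k) ** 2 / k ** 2  # the k and -k terms are equal
    return s

print(partial_sum(10**6), math.pi)  # both print as 3.14159...
```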

ADDENDUM

This result is easily generalizable to

$$\sum_{k=-\infty}^{\infty} \frac{\sin^2{a k}}{k^2} = \pi\, a$$

where $a \in[0,\pi)$ and the $k=0$ term is understood as its limiting value $a^{2}$, using the function

$$f(x) = \begin{cases} \pi & |x|<a\\0&|x|>a \end{cases}$$
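
A similar hedged check for the generalized identity (again plain Python, my own sketch): here the $k=0$ term is taken to be its limiting value $a^2$, consistent with $c_0=a$ in the Fourier series above.

```python
import math

def general_sum(a, N=10**6):
    s = a * a  # k = 0 term: the limit of sin(a k)^2 / k^2 as k -> 0
    for k in range(1, N + 1):
        s += 2.0 * math.sin(a * k) ** 2 / k ** 2
    return s

for a in (0.5, 1.0, 2.0, 3.0):
    print(a, general_sum(a), math.pi * a)  # the two columns agree to about 1e-6
```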

Ron Gordon
  • On the Wolfram MathWorld web page for the sinc function it states, "The remarkable fact that the sums of $\text{sinc}(k)$ and $\text{sinc}^{2}(k)$ are equal appears to have first been published in Baillie (1978)." I remember reading that a long time ago and thinking that summing $\text{sinc}^{2}(k)$ must be really difficult if no one realized they were the same until 1978. But the evaluation is mostly straightforward as you showed. Strange. – Random Variable Mar 15 '13 at 18:56
  • @RandomVariable: Huh, how odd. Interesting find. (+1) – Ron Gordon Mar 15 '13 at 19:13
  • I accidentally plugged $\sum _{k=-\infty }^{\infty } \frac{\sin (k)}{k}$ (without the square) into Mathematica and it also gave $\pi$. I wonder why this identity with the square is the common one. The evaluations of this sum for the first few powers are: $\left\{\pi ,\pi ,\frac{3 \pi }{4},\frac{2 \pi }{3},\frac{115 \pi }{192},\frac{11 \pi }{20}\right\}$ (in case you see these on Jeopardy). The seventh power evaluates to $\frac{129423 \pi -201684 \pi ^2+144060 \pi ^3-54880 \pi ^4+11760 \pi ^5-1344 \pi ^6+64 \pi ^7}{23040}$ and the subsequent powers are all complicated-looking like this. – amr Mar 16 '13 at 20:26
  • @amr: see the reference that Random provided. It is kind of interesting that the sum over sinc equals this sum. I'd like to see it derived as something other than a coincidence. Interesting also about the seventh power thing. Even powers of sinc may be summed using Parseval similar to how I outline above. – Ron Gordon Mar 16 '13 at 20:30
  • @amr This discussion here was very helpful for me in this topic which you might wish to see: https://mathematica.stackexchange.com/questions/157444/possible-bug-in-infinite-sum-sumsink-km-k-0-%e2%88%9e/157496#157496 – Dr. Wolfgang Hintze Oct 10 '17 at 15:54
  • Since you linked this answer from [this question](https://math.stackexchange.com/q/3611785), I should point out that for $a\ge0$, we get $$ \sum_{k\in\mathbb{Z}}\frac{\sin^2(ak)}{k^2}=\pi a+\left(2\pi a-\pi^2\right)\lfloor a/\pi\rfloor-\pi^2\lfloor a/\pi\rfloor^2 $$ which agrees with your result for $a\in[0,\pi)$. – robjohn Apr 06 '20 at 07:40
  • @robjohn: very interesting result. It must use a different derivation because $a$ can be greater than the period of the Fourier series representing the function we used in the Parseval relation. – Ron Gordon Apr 09 '20 at 16:17
  • @RonGordon: The extension just brings arguments outside of $[0,2\pi)$ back into that range. The result for $a+\pi$ is the same as the result for $a$. – robjohn Apr 09 '20 at 18:49
  • @robjohn: d’oh! Of course. Thanks. – Ron Gordon Apr 09 '20 at 18:50
  • @RonGordon: BTW, the other derivation uses Poisson Summation; so, yes, it is a different approach. – robjohn Apr 09 '20 at 19:06

Assume $a\in\left[0,\frac\pi2\right]$.

An integral $$ \begin{align} \int_0^a\frac{\sin(2kx)}{k}\mathrm{d}x &=\int_0^a\frac{2\sin(kx)}{k^2}\mathrm{d}\sin(kx)\\ &=\left.\frac{\sin^2(kx)}{k^2}\right]_0^a\\ &=\frac{\sin^2(ka)}{k^2}\tag{1} \end{align} $$ and a sum $$ \begin{align} \sum_{k=1}^\infty\frac{\sin(2kx)}{k} &=\sum_{k=1}^\infty\frac{e^{i2kx}-e^{-i2kx}}{2ik}\\ &=\frac1{2i}\left(-\log(1-e^{i2x})+\log(1-e^{-i2x})\right)\\ &=\frac1{2i}\log(-e^{-i2x})\\[4pt] &=\frac\pi2-x\quad\text{for }x\in\left(0,\pi\right)\tag{2} \end{align} $$ Putting $(1)$ and $(2)$ together $$ \begin{align} \sum_{k=1}^\infty\frac{\sin^2(ka)}{k^2} &=\int_0^a\left(\frac\pi2-x\right)\,\mathrm{d}x\\ &=\frac\pi2a-\frac{a^2}2\tag{3} \end{align} $$ If we take $\dfrac{\sin(ka)}{ka}=1$ when $k=0$, we get the answer to the question using $a=1$: $$ \sum_{k\in\mathbb{Z}}\left(\frac{\sin(ka)}{ka}\right)^2=\frac\pi a\tag{4} $$
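
A quick numerical sketch of $(3)$ (plain Python, nothing assumed beyond the formula itself; the truncation error of the left-hand side is $O(1/N)$):

```python
import math

def lhs(a, N=10**6):
    """Truncation of sum_{k>=1} sin(k a)^2 / k^2."""
    return sum(math.sin(k * a) ** 2 / k ** 2 for k in range(1, N + 1))

def rhs(a):
    return math.pi * a / 2 - a * a / 2

for a in (0.25, 1.0, 1.5):
    print(a, lhs(a), rhs(a))
```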


Application to a Riemann Sum

If we multiply $(4)$ by $a$ and set $a=\frac1n$, we get $$ \sum_{k\in\mathbb{Z}}\frac{\sin^2(k/n)}{(k/n)^2}\frac1n=\pi\tag{5} $$ $(5)$ is a Riemann sum which shows that $$ \int_{-\infty}^\infty\frac{\sin^2(x)}{x^2}\,\mathrm{d}x=\pi\tag{6} $$


First Power of Sinc

We can also use $(2)$ at $x=\frac a2$, assuming $\frac{\sin(ka)}{ka}=1$ when $k=0$, to get $$ \sum_{k\in\mathbb{Z}}\frac{\sin(ka)}{ka}=\frac\pi a\tag{7} $$ Again, multiplying $(7)$ by $a$ and letting $a=\frac1n$, we get $$ \sum_{k\in\mathbb{Z}}\frac{\sin(k/n)}{k/n}\frac1n=\pi\tag{8} $$ and $(8)$ is a Riemann sum which shows that $$ \int_{-\infty}^\infty\frac{\sin(x)}{x}\,\mathrm{d}x=\pi\tag{9} $$
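
The first-power identity $(7)$ converges only conditionally, but a truncated check still works; this is a rough sketch (plain Python), and by the Dirichlet test the truncation error is on the order of $1/N$.

```python
import math

def sinc_sum(a, N=10**6):
    # k = 0 term is 1; the k and -k terms are equal since sin(k a)/(k a) is even in k
    return 1.0 + sum(2.0 * math.sin(k * a) / (k * a) for k in range(1, N + 1))

for a in (0.5, 1.0, 1.5):
    print(a, sinc_sum(a), math.pi / a)
```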

robjohn
  • Interesting approach (+1). How easily does it generalize to other sums of powers of sincs? – Ron Gordon Mar 23 '13 at 08:52
  • @RonGordon: I noticed that $(2)$ gives the result for the first power of sinc, and then noticed that these results apply to Riemann sums. I will look into higher powers. – robjohn Mar 23 '13 at 12:15
  • Very nifty! I'd upvote you again, but...well, you know. Anyway, that's a really important observation and may also help add more insight into a question I answered yesterday about the link between Fourier transforms and Fourier series. – Ron Gordon Mar 23 '13 at 12:18
  • This is that question to which I referred: http://math.stackexchange.com/questions/329576/fourier-transform-on-the-circle/337896#337896 – Ron Gordon Mar 23 '13 at 13:14
  • @RonGordon: I generalized this to third powers in [this answer](http://math.stackexchange.com/a/453260). – robjohn Jul 27 '13 at 14:35
  • Nice technique! I love when things work out to Riemann sums. – Ron Gordon Jul 27 '13 at 14:39
  • @robjohn The method you used here (extended to higher powers) was very helpful for me in a related topic which you might wish to see: https://mathematica.stackexchange.com/questions/157444/possible-bug-in-infinite-sum-sumsink-km-k-0-%e2%88%9e/157496#157496 – Dr. Wolfgang Hintze Oct 10 '17 at 16:00

I'll try to tackle the sum for general powers of $\operatorname{sinc}{x}$. It seems that the general formula is $$ \begin{align} \sum_{m = -\infty}^\infty& \left(\frac{\sin{(m)}}{m}\right)^n \tag{0}\\ & = \frac{(-1)^{n}\pi}{2^{n}(n-1)!}\sum_{\ell = -\lfloor n/(2\pi)\rfloor}^{\lfloor n/(2\pi)\rfloor}\left(\sum_{k = 0}^n(-1)^k{n\choose k} (2\pi \ell - n+2k)^{n-1}\operatorname{sign}(2\pi \ell-n+2k)\right). \end{align} $$

Before I get into that, I'm going to give another solution to the problem posed originally in the question. I'll use the Poisson summation formula, namely, $$ \sum_{n = -\infty}^\infty f(x+n) = \sum_{\ell = -\infty}^\infty \hat{f}(2\pi \ell) e^{2\pi i \ell x}. $$ Here $f$ is a function with reasonable regularity and decay properties, and $$ \hat{f}(t) = \int_{-\infty}^\infty f(x)e^{-itx}\,dx $$ is the Fourier transform of $f$. The idea is of course to take $f(x) = \operatorname{sinc}^2{x}$.
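
As a sanity check on the normalization conventions just stated, here is a minimal sketch (plain Python) of the Poisson summation formula applied to a Gaussian; the Gaussian is my choice of test function, used only because both sides then converge extremely fast.

```python
import math

# f(x) = exp(-a x^2) has Fourier transform (with the convention above)
# fhat(t) = sqrt(pi / a) * exp(-t^2 / (4 a)).
a, x = 0.7, 0.3
lhs = sum(math.exp(-a * (x + n) ** 2) for n in range(-50, 51))
rhs = sum(math.sqrt(math.pi / a) * math.exp(-(2 * math.pi * l) ** 2 / (4 * a))
          * math.cos(2 * math.pi * l * x) for l in range(-50, 51))
print(lhs, rhs)  # the two sides agree to machine precision
```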


I'm going to start by proving directly that $$ \begin{align} \int_{-\infty}^\infty (\operatorname{sinc}{x})^2e^{-itx}\,dx = \int_{-\infty}^\infty \left(\frac{\sin{x}}{x}\right)^2e^{-itx}\,dx = \frac{\pi}{4}(|t-2| + |t+2| - 2|t|). \tag{1} \end{align} $$ The right hand side is just a concise expression (that happens to be suggestive of the general situation) for the triangle-shaped function that is $\pi(1-|t|/2)$ when $|t| < 2$ and $0$ otherwise. An application of the Poisson summation formula referenced above then gives $$ \sum_{n = - \infty}^\infty \left(\frac{\sin{(x+n)}}{x+n}\right)^2 = \pi, $$ for all $x \in \mathbb{R}$. Taking $x = 0$ will give the result you seek.

To prove the identity $(1)$, write $$ \operatorname{sinc}{x} = \frac{\sin{x}}{x} = \frac{1}{2}\int_{-1}^1 e^{-ixt}\,dt = \hat{g}(x), $$ where $g = (1/2)\chi_{[-1,1]}$ is one half times the characteristic function of the interval $[-1,1]$. The Fourier transform converts convolution to multiplication, so $$ \widehat{g*g}(x) = \left(\frac{\sin{x}}{x}\right)^2. $$ The right-hand side (the square of $\operatorname{sinc}{x}$) is integrable, so an application of the Fourier inversion formula gives us $$ \int_{-\infty}^\infty \left(\frac{\sin{x}}{x}\right)^2e^{itx}\,dx = 2\pi (g*g)(t) = \frac{\pi}{2}\int_{-\infty}^\infty \chi_{[-1,1]}(t-y)\chi_{[-1,1]}(y)\,dy. $$ The evaluation of the integral on the right is straightforward (but a little tedious), and it gives us $(1)$.


The method just used to prove the identity $(1)$ has the benefit of being elementary, and does generalize to higher powers of $\operatorname{sinc}{x}$ (in the sense that $2\pi$ times the $n$-fold convolution of $(1/2)\chi_{[-1,1]}$ with itself is the Fourier transform of $\operatorname{sinc}{x}$ raised to the $n$th power) but unfortunately the computations become quite involved. Thus, I'll use another method, one that requires some basic knowledge of distribution theory, to prove—for positive integers $n$—that $$ \begin{align} \widehat{\operatorname{sinc}^n}{(x)} &= \int_{-\infty}^\infty \left(\frac{\sin{t}}{t}\right)^ne^{-ixt}\,dt \\ & = \frac{(-1)^n\pi}{2^{n}(n-1)!}\sum_{k = 0}^n(-1)^k{n\choose k} (x - n+2k)^{n-1}\operatorname{sign}(x-n+2k). \tag{2} \end{align} $$ Applying the Poisson summation formula to this identity leads to $(0)$—almost. I still have to explain why the upper and lower bounds of summation in the right hand side of $(0)$ are finite, which I'll do now. Since $(2)$ is equal to a constant multiple of the $n$-fold convolution of $\chi_{[-1,1]}$ with itself, it is supported in $[-n,n]$. (This can also be seen directly.) Therefore, in the right hand side of the Poisson summation formula, we need only sum over those indices $\ell$ satisfying $|\ell|<n/(2\pi)$. This is where the upper and lower bounds come from in the equation $(0)$.

So all that remains is to prove the equation $(2)$. Let me briefly outline the tools we'll need. Let $\mathscr{S} = \mathscr{S}(\mathbb{R})$ denote the Schwartz space on $\mathbb{R}$. Let $u_n : \mathscr{S} \to \mathbb R$ be the distribution defined for $\varphi \in \mathscr{S}$ by $$ u_n(\varphi) = \lim_{\epsilon \to 0^+}\int_{|x|>\epsilon} \frac{\varphi(x) - \sum_{k=0}^{n-2}\varphi^{(k)}(0)x^k/k!}{x^n}\,dx. $$ Basically, $u_n$ is the distribution best resembling the function $x^{-n}$. In fact, if $h$ is a smooth function that vanishes to order $n$ at the origin, then the distribution $h\cdot u_n$ is equal to the function $h(x)/x^n$. As one would expect from the relation $\partial^k x^{-1} = (-1)^{k}k! x^{-k-1}$, the $k$th distributional derivative of $u_1$ is given by $\partial^ku_1 = (-1)^kk!\, u_{k+1}$. This is all proved straightforwardly.

It turns out that $$ \hat{u}_1(t) = -i\pi \operatorname{sign}(t), $$ where $\operatorname{sign}(t)$ is the usual sign function that returns $0$ when $t = 0$ and $t/|t|$ otherwise. (This can be proved by regarding $\operatorname{sign}(t)$ as a limit $\lim_{k\to \infty} \chi_{(0,k]}(t) - \chi_{[-k,0)}(t)$, taking the distributional limit of the Fourier transforms, and then applying the inverse Fourier transform.) Now, for any distribution $v$, one has $$ \widehat{\partial^k v} = (it)^k \hat{v}. $$ Thus, using $u_n = (-1)^{n-1}(n-1)!^{-1}\partial^{n-1} u_1$, $$ \hat{u}_n(t) = \frac{(-1)^{n-1}}{(n-1)!}\widehat{\partial^{n-1} u_1}(t) = \frac{(-1)^{n-1}}{(n-1)!}(it)^{n-1}\hat{u}_1(t) = \frac{(-1)^{n}i^n\pi}{(n-1)!}t^{n-1} \operatorname{sign}(t). \tag{3} $$ Finally, another property of the Fourier transform dictates that $$ \widehat{e^{ihx}u_n}(t) = \hat{u}_n(t-h). \tag{4} $$ We're now prepared to evaluate the Fourier transform of $(\operatorname{sinc}{x})^n$.

Since $(\sin{x})^n$ vanishes to order $n$ at the origin, we have $$ (\operatorname{sinc}{x})^n = (\sin{x})^n x^{-n} = (\sin{x})^nu_n $$ as distributions. If we now expand $(\sin{x})^n$ into powers of $e^{ix}$, we get $$ (\sin{x})^nu_n = \frac{1}{(2i)^n} \sum_{k = 0}^n(-1)^k{n\choose k}e^{i(n-2k)x}u_n. $$ Taking the term-wise Fourier transform of the right hand side and inserting the formulas $(3)$ and $(4)$, we arrive at $$ \begin{align} \widehat{\operatorname{sinc}^n}\,(t) &= \frac{1}{(2i)^n} \sum_{k = 0}^n(-1)^k{n\choose k}\widehat{e^{i(n-2k)x}u_n}(t) \\ & = \frac{1}{(2i)^n} \sum_{k = 0}^n(-1)^k{n\choose k}\hat{u}_n(t-n+2k)\\ & = \frac{1}{(2i)^n} \sum_{k = 0}^n(-1)^k{n\choose k}\frac{(-1)^{n}i^n\pi}{(n-1)!}(t-n+2k)^{n-1}\operatorname{sign}(t-n+2k). \end{align} $$ Simplifying then gives $(2)$.
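
To double-check the closed form $(0)$ (and hence $(2)$), here is a hedged numerical sketch in plain Python comparing it against a brute-force truncation of the left-hand side for $n=1,\dots,7$; the helper names are arbitrary, and for $n=1$ the truncation error is only about $10^{-5}$, so expect agreement to four or five decimal places there and much better for larger $n$.

```python
import math
from math import comb, copysign, factorial, pi, sin

def closed_form(n):
    """Right-hand side of (0)."""
    L = int(n // (2 * pi))  # floor(n / (2*pi)); the summand vanishes for larger |l|
    total = 0.0
    for l in range(-L, L + 1):
        for k in range(n + 1):
            t = 2 * pi * l - n + 2 * k
            total += (-1) ** k * comb(n, k) * t ** (n - 1) * (copysign(1.0, t) if t else 0.0)
    return (-1) ** n * pi / (2 ** n * factorial(n - 1)) * total

def truncated_sum(n, M=10**5):
    """Left-hand side of (0), truncated at |m| <= M; the m = 0 term is 1."""
    return 1.0 + sum(2.0 * (sin(m) / m) ** n for m in range(1, M + 1))

for n in range(1, 8):
    print(n, closed_form(n), truncated_sum(n))
```

For $n=1,\dots,6$ the closed form reproduces the values $\pi,\pi,\frac{3\pi}{4},\frac{2\pi}{3},\frac{115\pi}{192},\frac{11\pi}{20}$ quoted in a comment above.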

Alon Amit
Nick Strehlke
  • In your equation (1), are you integrating over $x$ or $t$? You integrate over $x$ in the LHS but have a function of $x$ in the RHS. – David Zhang May 27 '16 at 04:04

This sum may be calculated by computing a Mellin transform of a suitable function and then inverting that to get the sum. Start by rearranging some terms, so that the target sum $S$ becomes $$ S = 1 + 2 \sum_{k\ge 1} \frac{\sin(k)^2}{k^2}.$$ Now introduce $$ f(x) = \sum_{k\ge 1} \frac{\sin(xk)^2}{k^2}$$ so that we are looking for $f(1).$ Rewrite $f(x)$ as follows: $$ f(x) = - \frac{1}{4} \sum_{k\ge 1} \frac{e^{2ixk}-2+e^{-2ixk}}{k^2} = \frac{1}{2}\sum_{k\ge 1} \frac{1}{k^2} - \frac{1}{4} \sum_{k\ge 1} \frac{e^{2ixk}+e^{-2ixk}}{k^2} \\= \frac{\pi^2}{12} - \frac{1}{4} \sum_{k\ge 1} \frac{e^{2ixk}+e^{-2ixk}}{k^2}.$$ Using Mellin transforms, we find $$\mathfrak{M}\left(\sum_{k\ge 1}\frac{e^{2ixk}}{k^2};s\right)= \Gamma(s) \sum_{k\ge 1} \frac{1}{(2ik)^s k^2} = \frac{1}{(2i)^s}\Gamma(s) \zeta(s+2).$$ Similarly, $$\mathfrak{M}\left(\sum_{k\ge 1}\frac{e^{-2ixk}}{k^2};s\right)= \Gamma(s) \sum_{k\ge 1} \frac{1}{(-2ik)^s k^2} = \frac{1}{(-2i)^s}\Gamma(s) \zeta(s+2).$$ Now observe that $$ \frac{1}{(2i)^s} + \frac{1}{(-2i)^s} = e^{-s \log(2i)} + e^{-s \log(-2i)} = e^{-s \log 2 -s i\pi/2} + e^{-s \log 2 + s i\pi/2} \\ = 2^{-s} 2 \cos(s\pi/2).$$ Putting these two together, we obtain $$ \mathfrak{M}\left(\sum_{k\ge 1}\frac{e^{2ixk}+e^{-2ixk}}{k^2};s\right) = g^*(s)= 2 \times 2^{-s} \cos(s\pi/2) \Gamma(s) \zeta(s+2).$$ We will apply Mellin inversion to this term. There is a pole from the zeta and gamma functions at $s=-1$ (which the cosine turns from a double into a single pole), one from the gamma function at $s=0$, and another one from the gamma function at $s=-2.$ The cosine term cancels the remaining poles of the gamma term at negative odd integers, and the zeta term the ones at negative even integers. We have $$ \operatorname{Res}(g^*(s)x^{-s}; s=0) = \frac{\pi^2}{3},$$ $$ \operatorname{Res}(g^*(s)x^{-s}; s=-1) = -2\pi x,$$ $$ \operatorname{Res}(g^*(s)x^{-s}; s=-2) = 2 x^2.$$ This yields for the Mellin inversion integral $$ \mathfrak{M}^{-1}(g^*(s);x) = \frac{1}{2\pi i}\int_{1-i\infty}^{1+i\infty} g^*(s) x^{-s}\, ds = 2 x^2 - 2\pi x + \frac{\pi^2}{3}.$$ Returning to $S$ we have shown that $$S = 1 + 2\left(\frac{\pi^2}{12}-\frac{1}{4} \left( 2-2\pi + \frac{\pi^2}{3}\right) \right) = 1 - \frac{1}{2} (2-2\pi) =\pi. $$
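
The inversion result can be sanity-checked numerically before trusting the residue bookkeeping; a rough sketch in plain Python (my own helper names), with partial sums of the cosine series converging like $O(1/N)$:

```python
import math

def cosine_series(x, N=10**6):
    """Partial sum of sum_{k>=1} (e^{2ixk} + e^{-2ixk}) / k^2 = 2 cos(2 x k) / k^2."""
    return sum(2.0 * math.cos(2.0 * x * k) / k ** 2 for k in range(1, N + 1))

def inversion(x):
    return 2.0 * x ** 2 - 2.0 * math.pi * x + math.pi ** 2 / 3.0

for x in (0.3, 1.0, 2.5):
    print(x, cosine_series(x), inversion(x))  # agreement for x in (0, pi)
```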

Remark, Feb 29 2020. It is proved at the following MSE link that the contribution from the shifted vertical line $\Re(s)=\sigma$ vanishes as we move the contour left to $\sigma=-\infty$, when $x\in (0,\pi).$

Marko Riedel

There is a simple way to compute the sum. Note that $$ \sum_{k=-\infty}^\infty\frac{\sin^2k}{k^2}=1+2\sum_{k=1}^\infty \frac{\sin^2k}{k^2}=1+\sum_{k=1}^\infty\frac{1-\cos(2k)}{k^2}=1+\frac{\pi^2}{6}-\sum_{k=1}^\infty\frac{\cos(2k)}{k^2}.$$ Letting $x=e^{2\theta i}$ in $\sum_{k=1}^\infty\frac{x^k}{k}=-\ln(1-x)$ gives us $$ \sum_{k=1}^\infty\frac{1}{k}(\cos(2k\theta)+i\sin(2k\theta))=-\ln(1-\cos(2\theta)-i\sin(2\theta)). $$ So $$ \sum_{k=1}^\infty\frac{1}{k}\sin(2k\theta)=-\Im\ln(1-\cos(2\theta)-i\sin(2\theta))=-\arctan(-\cot \theta)=\frac{\pi}{2}-\theta. $$ Integrating this gives $$ -\sum_{k=1}^\infty\frac{1}{2k^2}\cos(2k\theta)=\frac{\pi}{2}\theta-\frac{1}{2}\theta^2+C. $$ Letting $\theta=0$, we have $C=-\frac{\pi^2}{12}$. Thus $$ \sum_{k=1}^\infty\frac{1}{k^2}\cos(2k\theta)=-\pi \theta+\theta^2+\frac{\pi^2}{6}. $$ Letting $\theta=1$, we have $$ \sum_{k=1}^\infty\frac{\cos(2k)}{k^2}=-\pi+1+\frac{\pi^2}{6} $$ and hence $$ \sum_{k=-\infty}^\infty\frac{\sin^2k}{k^2}=1+\frac{\pi^2}{6}-\sum_{k=1}^\infty\frac{\cos(2k)}{k^2}=1+\frac{\pi^2}{6}-(-\pi+1+\frac{\pi^2}{6})=\pi.$$ It is easy to use the same trick to generalize this result to $\sum_{k=-\infty}^\infty\frac{\sin^2(ak)}{k^2}$. I omit the details.
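
The key step above is the sawtooth evaluation $\sum_{k\ge1}\frac{\sin(2k\theta)}{k}=\frac{\pi}{2}-\theta$ for $\theta\in(0,\pi)$; a small sketch in plain Python (arbitrary helper name, slow $O(1/N)$ convergence since the series is only conditionally convergent):

```python
import math

def sawtooth(theta, N=10**6):
    """Partial sum of sum_{k>=1} sin(2 k theta) / k."""
    return sum(math.sin(2.0 * k * theta) / k for k in range(1, N + 1))

for theta in (0.5, 1.0, 2.0):
    print(theta, sawtooth(theta), math.pi / 2 - theta)
```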

xpaul

$\newcommand{\+}{^{\dagger}}% \newcommand{\angles}[1]{\left\langle #1 \right\rangle}% \newcommand{\braces}[1]{\left\lbrace #1 \right\rbrace}% \newcommand{\bracks}[1]{\left\lbrack #1 \right\rbrack}% \newcommand{\ceil}[1]{\,\left\lceil #1 \right\rceil\,}% \newcommand{\dd}{{\rm d}}% \newcommand{\ds}[1]{\displaystyle{#1}}% \newcommand{\equalby}[1]{{#1 \atop {= \atop \vphantom{\huge A}}}}% \newcommand{\expo}[1]{\,{\rm e}^{#1}\,}% \newcommand{\fermi}{\,{\rm f}}% \newcommand{\floor}[1]{\,\left\lfloor #1 \right\rfloor\,}% \newcommand{\half}{{1 \over 2}}% \newcommand{\ic}{{\rm i}}% \newcommand{\imp}{\Longrightarrow}% \newcommand{\isdiv}{\,\left.\right\vert\,}% \newcommand{\ket}[1]{\left\vert #1\right\rangle}% \newcommand{\ol}[1]{\overline{#1}}% \newcommand{\pars}[1]{\left( #1 \right)}% \newcommand{\partiald}[3][]{\frac{\partial^{#1} #2}{\partial #3^{#1}}} \newcommand{\pp}{{\cal P}}% \newcommand{\root}[2][]{\,\sqrt[#1]{\,#2\,}\,}% \newcommand{\sech}{\,{\rm sech}}% \newcommand{\sgn}{\,{\rm sgn}}% \newcommand{\totald}[3][]{\frac{{\rm d}^{#1} #2}{{\rm d} #3^{#1}}} \newcommand{\ul}[1]{\underline{#1}}% \newcommand{\verts}[1]{\left\vert\, #1 \,\right\vert}% \newcommand{\yy}{\Longleftrightarrow}$ $$ \!\!\!\!\!\sum_{k=-\infty}^{\infty}{\sin^{2}\pars{k} \over k^{2}} = \sum_{k=-\infty}^{\infty}\int_{-\infty}^{\infty}{\sin^{2}\pars{x} \over x^{2}}\, \delta\pars{x - k}\,\dd x = \int_{-\infty}^{\infty}{\sin^{2}\pars{x} \over x^{2}} \bracks{\sum_{k=-\infty}^{\infty}\delta\pars{x - k}}\,\dd x\tag{1} $$

The sum over $k$ in the right hand side of $\pars{1}$ is a periodic function of $x$ with period $1$ and it can be rewritten as follows: $$ \sum_{k=-\infty}^{\infty}\delta\pars{x - k} = \sum_{n = -\infty}^{\infty}a_{n}\expo{2n\pi\ic x}\tag{2} $$ The set of coefficients $\braces{a_{n}\ \ni\ n \in {\mathbb Z}}$ are given by: $$ 1 = \int_{-1/2}^{1/2}\expo{-2m\pi\ic x}\sum_{k=-\infty}^{\infty} \delta\pars{x - k}\,\dd x = \sum_{n = -\infty}^{\infty}a_{n}\int_{-1/2}^{1/2}\expo{2\pars{n - m}\pi\ic x}\,\dd x = a_{m} $$ which yields $\pars{~\mbox{see expression}\ \pars{2}~}$: $$ \sum_{k=-\infty}^{\infty}\delta\pars{x - k} = \sum_{n=-\infty}^{\infty}\expo{2n\pi\ic x}\tag{3} $$

We replace $\pars{3}$ in the right hand side of $\pars{1}$ to get: \begin{align} &\color{#0000ff}{\large\sum_{k=-\infty}^{\infty}{\sin^{2}\pars{k} \over k^{2}}} = \int_{-\infty}^{\infty}{\sin^{2}\pars{x} \over x^{2}} \sum_{n=-\infty}^{\infty}\expo{2n\pi\ic x}\,\dd x \\[3mm]&= \sum_{n=-\infty}^{\infty}\int_{-\infty}^{\infty} \pars{\half\int_{-1}^{1}\expo{\ic tx}\,\dd t} \pars{\half\int_{-1}^{1}\expo{\ic t'x}\,\dd t'} \expo{2n\pi\ic x}\,\dd x\tag{4} \\[3mm]&= {\pi \over 2}\sum_{n=-\infty}^{\infty} \int_{-1}^{1}\dd t\int_{-1}^{1}\dd t'\int_{-\infty}^{\infty}\expo{\pars{t + t' + 2n\pi}\ic x}\, {\dd x \over 2\pi} \\[3mm]&= {\pi \over 2}\sum_{n=-\infty}^{\infty} \int_{-1}^{1}\dd t\int_{-1}^{1}\dd t'\delta\pars{t + t' + 2n\pi} \\[3mm]&= {\pi \over 2}\sum_{n=-\infty}^{\infty} \int_{-1}^{1}\Theta\pars{1 - \verts{-t - 2n\pi}}\,\dd t = {\pi \over 2}\sum_{n=-\infty}^{\infty}\overbrace{\left.\int_{-1}^{1}\dd t \right\vert_{-1 - 2n\pi\ <\ t\ <\ 1 - 2n\pi}}^{\ds{=\ 2\,\delta_{n0}}} = \color{#0000ff}{\Large\pi} \end{align} where we use the identity $\ds{{\sin\pars{x} \over x} = \half\int_{-1}^{1}\expo{\ic tx}\,\dd t}.\quad$ See line $\pars{4}$.
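
The last step amounts to measuring the overlap of $(-1,1)$ with its translates by $2n\pi$, which is $2$ for $n=0$ and $0$ otherwise. A tiny sketch (my own illustration in plain Python, not part of the derivation) makes that bookkeeping explicit:

```python
import math

# Length of the overlap of (-1, 1) with (-1 - 2*pi*n, 1 - 2*pi*n): the Theta factor above.
total = 0.0
for n in range(-5, 6):
    lo = max(-1.0, -1.0 - 2.0 * math.pi * n)
    hi = min(1.0, 1.0 - 2.0 * math.pi * n)
    overlap = max(0.0, hi - lo)
    print(n, overlap)   # 2.0 for n = 0, otherwise 0.0
    total += overlap
print(math.pi / 2 * total)  # pi/2 times 2 = pi
```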

Felix Marin

This is a great place to apply Abel-Plana's formula

$$\sum_{n\geqslant 0} f(n)=\int_0^\infty f(x)\, dx+\frac{f(0)}{2}+i\int_0^\infty\frac{f(ix)-f(-ix)}{e^{2\pi x}-1}\, dx$$

Let $f(x)=\left(\frac{\sin x}{x}\right)^2$.

We note that, since $f$ is even, $$\sum_{n\geqslant 0} \left(\frac{\sin n}{n}\right)^2=\frac{1}{2}\left(\sum_{n\in\Bbb{Z}}\left(\frac{\sin n}{n}\right)^2+1\right)$$

We already know that $$\int_0^\infty \left(\frac{\sin x}{x}\right)^2\, dx=\frac{\pi}{2};$$ and $$\frac{\sin 0}{0}:=1$$

Also, we easily determine that:$$\left(\frac{\sin ix}{ix}\right)^2=\left(\frac{\sin (-ix)}{-ix}\right)^2=\frac{\sinh^2 x}{x^2}$$

With these, the result follows: since $f(ix)=f(-ix)$, the last integral in Abel-Plana's formula vanishes, so $\sum_{n\geqslant 0} \left(\frac{\sin n}{n}\right)^2=\frac{\pi}{2}+\frac{1}{2}$, and therefore $$\sum_{n\in\Bbb{Z}}\left(\frac{\sin n}{n}\right)^2=2\left(\frac{\pi}{2}+\frac{1}{2}\right)-1=\pi.$$
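
A short numerical sketch of this bookkeeping (plain Python; the correction integral is omitted because $f(ix)=f(-ix)$ makes it vanish identically):

```python
import math

N = 10**6
# Left-hand side of Abel-Plana: sum over n >= 0 of sinc(n)^2, with the n = 0 term equal to 1.
lhs = 1.0 + sum(math.sin(n) ** 2 / n ** 2 for n in range(1, N + 1))
# Right-hand side: the integral term pi/2 plus f(0)/2 = 1/2; the last integral is zero here.
rhs = math.pi / 2 + 0.5
print(lhs, rhs)
```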

I'm sorry, I know this is a rather heavyweight formula to use on this question, but I thought it was worth noting.

Small Margin