This is a follow-up question to this: How does $ \sum_{p<x} p^{-s} $ grow asymptotically for $ \text{Re}(s) < 1 $? There it is stated that: $$ \sum_{p\leq x}p^{-s}= \mathrm{li}(x^{1-s}) + O\left(\frac{x^{1-s}}{1-s}e^{-c\sqrt{\log(x)}}\right). $$ Does the correctness of the Riemann Hypothesis imply a better bound, especially if $\mathrm{Re}(s)=0$? Or are there any subsets of primes, e.g. primes of the form $6n\pm1$, for which the bound gets significantly better?

I believe it would; in fact, any improved zero-free region for $\zeta(s)$ should translate into an improved error term. (Maybe one has to be careful for $\Re s < 1/2$, since it's a little more "natural" to consider the sum over all prime powers instead of all primes, and that has to be corrected for.) – Greg Martin Dec 30 '11 at 22:42

@Greg Martin: But wouldn't a sum over prime powers just translate into $\sum p^{-ks}$? I don't see the connection to $\Re (s)<1/2$. And: Is it possible to decrease the error by adding more addends, like ones containing the roots of the zeta function? – draks ... Dec 31 '11 at 08:24

Yeah, I should be more explicit. The reason I say that a sum over prime powers is "natural" is because $\log \zeta(s) = \sum_p \log (1-p^{-s})^{-1}$, which is $\sum_p \sum_{k=1}^\infty \frac1k (p^{-s})^k = \sum_{p^k} \frac1k (p^k)^{-s}$. Therefore $\sum_p p^{-s}$ is going to equal something like $\sum_{n=1}^\infty \frac{\mu(n)}n \log\zeta(ns)$. – Greg Martin Jan 01 '12 at 05:18
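This Möbius-inversion identity is easy to check numerically in the region of absolute convergence. A minimal sketch using sympy and mpmath, at the sample point $s=2$ (the truncation limits $10^5$ and $40$ are arbitrary choices, not part of the identity):

```python
from mpmath import mp, zeta, log, mpf
from sympy import mobius, primerange

mp.dps = 20
s = 2  # check at a real point with Re(s) > 1, where everything converges absolutely

# Left side: the prime zeta function P(s) = sum_p p^{-s}, truncated over primes
lhs = sum(mpf(p)**(-s) for p in primerange(2, 10**5))

# Right side: sum_{n >= 1} mu(n)/n * log zeta(ns), truncated; the terms
# decay roughly like 2^{-ns}, so 40 terms are far more than enough
rhs = sum(int(mobius(n)) / mpf(n) * log(zeta(n * s)) for n in range(1, 41))

print(lhs, rhs)
```

The remaining discrepancy comes entirely from the tail of the prime sum beyond $10^5$, which is on the order of $10^{-6}$ here.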

@GregMartin: Ah, got it. And why do we have to be careful when $\Re (s)<1/2$, especially $\Re(s)=0$? Shouldn't this lead to something simpler? By the way: Happy new year! – draks ... Jan 01 '12 at 13:29

$e^{-c\sqrt{\log x}}$ is the kind of error term that shows up in the Prime Number Theorem and related work, and is based on current knowledge of the zeros of zeta. Under RH, it can be replaced (in the PNT) with $x^{(1/2)+\epsilon}$, so I speculate that the same would be true for the error term under consideration. – Gerry Myerson Jan 01 '12 at 16:39
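For concreteness, one can check numerically how far $\pi(x)$ actually is from $\mathrm{li}(x)$ compared with the RH-quality size $\sqrt{x}\log x$; a small sketch with sympy and mpmath (the choice $x=10^6$ is illustrative only):

```python
from mpmath import li, sqrt, log, mpf
from sympy import primepi

x = 10**6
err = abs(int(primepi(x)) - li(mpf(x)))  # |pi(x) - li(x)|
bound = sqrt(mpf(x)) * log(mpf(x))       # sqrt(x) * log(x), the RH-quality size
print(err, bound)  # the actual error (about 130 here) sits far below the bound
```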

@GerryMyerson: So do I. Do you know if one has to include a term that sums over the zeros of zeta, like $\sum_{\sigma}\text{li}(x^\sigma)$, to get the result correct, despite the error? – draks ... Jan 01 '12 at 16:54

Sorry, that's beyond me. – Gerry Myerson Jan 01 '12 at 19:35
3 Answers
The key to the proof in my other answer was the quantitative prime number theorem $$\pi(x)=\text{li}(x)+O\left(xe^{-c\sqrt{\log x}}\right),\ \ \ \ \ \ \ \ \ \ (0)$$ along with partial summation. Because we can use partial summation, all that really matters is the case $s=0$, and this case, which is looking at $\pi(x)$, tells us about everything else. The Riemann Hypothesis implies that $$\pi(x)=\text{li}(x)+O\left(x^{\frac{1}{2}}\log x\right),\ \ \ \ \ \ \ \ \ \ \ \ (1)$$ and we will look at why this is true later on. For now, let's look at the consequence, and what happens to the sum $\sum_{p\leq x}p^{-s}$.

Going back to the other proof, the error term was just $$t^{-s}\left(\pi(t)-\text{li}(t)\right)\biggr|_{2}^{x}+s\int_{2}^{x}t^{-s-1}\left(\pi(t)-\text{li}(t)\right)dt,$$ which after substituting $(1)$ becomes $$O\left(x^{-\text{Re}(s)+\frac{1}{2}}\log x+|s|\int_{2}^{x}t^{-\text{Re}(s)-\frac{1}{2}}\log t\,dt\right).$$ The integral is then $$\ll\frac{|s|}{\left|\text{Re}(s)-\frac{1}{2}\right|}x^{-\text{Re}(s)+\frac{1}{2}}\log x,$$ so that for $\text{Re}(s)\neq\frac{1}{2}$, $\text{Re}(s)<1$, $$\sum_{p\leq x}p^{-s}=\text{li}\left(x^{1-s}\right)+O\left(\frac{|s|}{\left|\text{Re}(s)-\frac{1}{2}\right|}x^{-\text{Re}(s)+\frac{1}{2}}\log x\right).$$

The cases $\text{Re}(s)=\frac{1}{2}$ and $\text{Re}(s)=1$ are special and must be dealt with separately. For example $$\sum_{p\leq x}p^{-\frac{1}{2}+i\gamma}=\text{li}\left(x^{\frac{1}{2}-i\gamma}\right)+O\left(\gamma\log^{2}x\right).$$ (We do not consider $\text{Re}(s)>1$, since the series converges absolutely there.)

Notice that if I choose $\epsilon>0$ we can actually remove the denominator involving $\text{Re}(s)-\frac{1}{2}$. This is done by looking at the two cases, and then taking minimums so the error depends only on $\epsilon$. In particular $$\sum_{p\leq x}p^{-s}=\text{li}\left(x^{1-s}\right)+O_\epsilon\left(|s|\,x^{-\text{Re}(s)+\frac{1}{2}+\epsilon}\right).$$
Remark: I realized that in my last post I might have been a bit careless about complex $s$. Some real parts need to be put in for the bounding to make sense, and $|s|$ in some places as well, all of which can be ignored for real $s$.
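As a sanity check on the shape of this asymptotic, here is a short numerical comparison at the real sample value $s=1/4$ (the choices $x=10^6$ and $s=0.25$ are illustrative only), using sympy for the primes and mpmath for $\mathrm{li}$:

```python
from mpmath import li, mpf
from sympy import primerange

x, s = 10**6, mpf("0.25")

# Left side: sum over primes p <= x of p^{-s}
lhs = sum(mpf(p)**(-s) for p in primerange(2, x + 1))

# Main term: li(x^{1-s})
main = li(mpf(x)**(1 - s))

print(lhs, main)  # the two agree to within a small power-saving error
```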
Why do we have equation (1)? This is quite an important question, and I won't give a complete answer here. For a complete proof see Titchmarsh's book, or Montgomery and Vaughan's Multiplicative Number Theory.
Using some complex analysis (we need some lemmas bounding certain things so everything works out nicely) we can prove that
$$
\sum_{p^k\leq x} \log p=x-\sum_{\rho:\zeta(\rho)=0}\frac{x^\rho}{\rho}-\frac{\zeta'(0)}{\zeta(0)}.
$$
The left hand side is a step function which jumps on the prime powers (often written as $\psi(x)=\sum_{n\leq x}\Lambda(n)$), whereas the right hand side is a continuous function plus a sum over all of the zeros of the zeta function. The zeros magically conspire at prime powers to make this conditionally convergent series suddenly jump. We can remove the trivial zeros and create an error bounded by $\log x$, so that this sum really depends on the nontrivial zeros of zeta. Specifically, if we can bound the real part of the zeros, then we can bound this error term. (Being careful about convergence and all that, and taking certain limits properly.) The best bound possible is $\text{Re}(\rho)=\frac {1}{2}$, which is why the best error will be just slightly larger than $\sqrt{x}$. (About $\log^2x$ larger.) Using partial summation then takes us to a bound for $\pi(x)$; in particular we get $(1)$.
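One can watch the zeros "conspire" numerically. The sketch below (sympy for the prime powers, mpmath's `zetazero` for the nontrivial zeros; the values $x\approx 1000$ and 100 zero pairs are arbitrary truncation choices) compares $\psi(x)$ with the truncated explicit formula, using $\frac{\zeta'(0)}{\zeta(0)} = \log 2\pi$ and dropping the tiny trivial-zero term:

```python
import math
from mpmath import mp, zetazero, mpf, log, re
from sympy import primerange

mp.dps = 15
X = 1000            # sum over prime powers up to X
x = mpf("1000.5")   # evaluate the analytic side away from a jump of psi

# Left side: psi(X) = sum_{p^k <= X} log p
psi = mpf(0)
for p in primerange(2, X + 1):
    pk = p
    while pk <= X:
        psi += log(p)
        pk *= p

# Right side: x - sum over zeros - zeta'(0)/zeta(0), truncated to the
# first 100 pairs of nontrivial zeros; zeta'(0)/zeta(0) = log(2*pi)
approx = x - log(2 * math.pi)
for n in range(1, 101):
    rho = zetazero(n)               # n-th zero with positive imaginary part
    approx -= 2 * re(x**rho / rho)  # rho and its conjugate contribute together

print(psi, approx)  # the truncated formula already tracks psi closely
```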
I hope this gives an idea why it is true; I suggest looking in some of those books. Another good question to ask is why does equation $(0)$ hold? This requires even more time to prove, as we need to construct a zero-free region for $\zeta(s)$. (Again, this will be in Montgomery and Vaughan's book.)
Hope that helps,

Just 3 comments: i.) I stared at the inequalities after "The integral is then" and they still look the same to me? ii.) I think $s$ is not part of the integral, or does an additional $s$ appear here? iii.) when you deal with the case $\Re(s)=1/2$, you take $s=1/2-i\gamma$. I think you meant $\text{li}(x^{1/2+i\gamma})$ on the RHS, right? – draks ... Jan 03 '12 at 08:51

(i) You are right, I think I changed something at some point, so the same expression appeared twice. (ii) The $s$ appears because there is an $s$ in front of the integral, and we have to take that into account. There might be a way to deal with everything to remove the $s$; I don't believe it is necessary, but for the quick bounding of the integral that I did above, it should remain. – Eric Naslund Jan 03 '12 at 10:00

(iii) Well this is just the way I wrote things. Notice $p^{-\frac{1}{2}+i\gamma}=p^{-(\frac{1}{2}-i\gamma)}$ so $s=\frac{1}{2}-i\gamma$. Maybe it would be clearer with opposite signs, but it is correct as is. (Ran out of space in last comment) – Eric Naslund Jan 03 '12 at 10:01

And for $\Re(s)=1/2$ and $\gamma = 0$, is $\mathrm{li}(\sqrt{x}) + O(x^\epsilon)$ the best asymptotic behaviour possible under the RH? Because here $O(\log^2 x)$ makes no sense. Or would it be $O(1)$? – user3141592 Mar 27 '17 at 20:40
The exact answer is Theorem IV, due to von Mangoldt, reproduced in Landau's book "Handbuch der Lehre von der Verteilung der Primzahlen", visible at Google Books.


I have a paper entitled "The Riemann Hypothesis concerning the zeta function" that contributes to solving this problem, but I don't know how to send it to you. My email: aldopperetti@gmail.com – apperetti Dec 29 '17 at 19:30
Consider the analogous problem for $$ \sum_{p^k < x} \frac{\log p}{p^{ks}}. $$ Perron's formula tells us that this is equal to $$ \frac{1}{2\pi i} \int_{c - i \infty}^{c + i \infty} -\frac{\zeta'}{\zeta} ( s + w) \frac{x^w}{w}\, dw. $$ Shifting contours, we collect a pole at $w = 1 - s$ and a pole at every zero of $\zeta$, that is, when $w = \rho - s$ with $\rho$ a zero of $\zeta(s)$. Therefore (heuristically) we derive the following formula: $$ \sum_{p^k < x} \frac{\log p}{p^{ks}} = \frac{x^{1 - s}}{1 - s} - \sum_{\rho} \frac{x^{\rho - s}}{\rho - s}. $$ There are some delicate convergence issues here, and also we are missing some small negligible terms, but for simplicity let's not worry about them.
The way to think about this formula is as follows: The main term is $x^{1-s}/(1-s)$. When the imaginary part of $s$ is very small, the main term dominates and the contribution of the zeros is negligible. When the imaginary part of $s$ is very large, one sees only the zeros, and in particular those zeros $\rho$ with imaginary part close to the imaginary part of $s$, because of the factor $\rho - s$ in the denominator. There is also a medium range in which both the zeros and the main term contribute.
Note also that the Riemann Hypothesis always gives a good bound for this sum, since then the real part of every zero is $1/2$. Provided that $|\Re s - 1/2| > \epsilon$, the Riemann Hypothesis gives the bound $$ \frac{x^{1-s}}{1-s} + O(x^{1/2 - \Re s} x^{\varepsilon} + x^{\varepsilon}). $$ This does not immediately follow from the heuristic formula above. Notice that this is better than what follows from integration by parts (if we did what Eric did) as soon as $|s| > x^{\varepsilon}$.
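The heuristic formula can also be tested numerically. A sketch with sympy and mpmath ($x\approx 1000$, $s=1/4$, and 100 zero pairs are arbitrary choices here; the residue at $w=0$, one of the "small negligible terms" dropped above, is restored since it is of constant size):

```python
from mpmath import mp, zetazero, zeta, mpf, log, re
from sympy import primerange

mp.dps = 15
X = 1000
x, s = mpf("1000.5"), mpf("0.25")

# Left side: sum over prime powers p^k <= X of log(p) / p^{ks}
lhs = mpf(0)
for p in primerange(2, X + 1):
    pk = p
    while pk <= X:
        lhs += log(p) / mpf(pk)**s
        pk *= p

# Right side: main term, plus the residue at w = 0 (which equals
# -zeta'(s)/zeta(s)), minus the first 100 pairs of zeros
rhs = x**(1 - s) / (1 - s) - zeta(s, derivative=1) / zeta(s)
for n in range(1, 101):
    rho = zetazero(n)
    rhs -= 2 * re(x**(rho - s) / (rho - s))

print(lhs, rhs)
```

The leftover discrepancy comes from the truncation of the zero sum and the ignored trivial-zero terms, both small at this scale.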
You can pass from this formula to $$ \sum_{p < x} p^{-s} $$ by integration by parts and some simple bounds for the prime powers.