Let $X_{i}$, $i=1,2,\dots, n$, be independent random variables with a geometric distribution, that is, $P(X_{i}=m)=p(1-p)^{m-1}$. How does one compute the PMF of their sum $\sum_{i=1}^{n}X_{i}$?

I know intuitively it's a negative binomial distribution $$P\left(\sum_{i=1}^{n}X_{i}=m\right)=\binom{m-1}{n-1}p^{n}(1-p)^{m-n}$$ but how to do this deduction?
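For a quick empirical check of the claimed formula (a simulation sketch of my own; the helper functions `geometric` and `empirical_pmf` are names I made up, not from any library):

```python
import random
from math import comb

def geometric(rng, p):
    """Number of Bernoulli(p) trials up to and including the first success
    (support {1, 2, ...}, matching the question's convention)."""
    k = 1
    while rng.random() >= p:
        k += 1
    return k

def empirical_pmf(n, p, trials=100_000, seed=1):
    """Estimate the PMF of S_n = X_1 + ... + X_n by simulation."""
    rng = random.Random(seed)
    counts = {}
    for _ in range(trials):
        s = sum(geometric(rng, p) for _ in range(n))
        counts[s] = counts.get(s, 0) + 1
    return {m: c / trials for m, c in counts.items()}

n, p = 3, 0.4
pmf = empirical_pmf(n, p)
# Compare empirical frequencies with C(m-1, n-1) p^n (1-p)^(m-n)
for m in range(n, n + 8):
    exact = comb(m - 1, n - 1) * p**n * (1 - p) ** (m - n)
    assert abs(pmf.get(m, 0.0) - exact) < 0.01
```

This does not replace a derivation, but it makes the negative binomial claim easy to trust before proving it.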

  • I think the probabilistic interpretation leads quite naturally to the desired formula. One could do an induction on $n$ and use convolution, but that is less informative. – André Nicolas Nov 02 '13 at 01:28
  • 1
    I think the language interpretation cannot be treated as math deduction. I know I should use convolution, but could anyone teach me that? – TonyLic Nov 02 '13 at 01:52
  • 3
    Typo: $P\left(\sum_{i=1}^{n}X_{i}=n\right)$ should be replaced by: $P\left(\sum_{i=1}^{n}X_{i}=m\right)$ – drhab Nov 02 '13 at 12:12
  • @AndreNicolas what's the probabilistic interpretation? – Bananach Oct 12 '21 at 07:43

2 Answers


Let $X_{1},X_{2},\ldots$ be independent random variables having the geometric distribution with parameter $p$, i.e. $P\left[X_{i}=m\right]=pq^{m-1}$ for $m=1,2,\ldots$ (here $p+q=1$).

Define $S_{n}:=X_{1}+\cdots+X_{n}$.

With induction on $n$ it can be shown that $S_{n}$ has a negative binomial distribution with parameters $p$ and $n$, i.e. $P\left\{ S_{n}=m\right\} =\binom{m-1}{n-1}p^{n}q^{m-n}$ for $m=n,n+1,\ldots$.

This is obviously true for $n=1$. For the induction step, assume it holds for $S_{n}$; then for $S_{n+1}$ and $m=n+1,n+2,\ldots$ we find:

$P\left[S_{n+1}=m\right]=\sum_{k=n}^{m-1}P\left[S_{n}=k\wedge X_{n+1}=m-k\right]=\sum_{k=n}^{m-1}P\left[S_{n}=k\right]\times P\left[X_{n+1}=m-k\right]$

Working this out leads to $P\left[S_{n+1}=m\right]=p^{n+1}q^{m-n-1}\sum_{k=n}^{m-1}\binom{k-1}{n-1}$, so it remains to be shown that $\sum_{k=n}^{m-1}\binom{k-1}{n-1}=\binom{m-1}{n}$ (the hockey-stick identity).

This can be done with induction on $m$: for the step from $m$ to $m+1$, apply Pascal's rule $\binom{m-1}{n}+\binom{m-1}{n-1}=\binom{m}{n}$.
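As a numerical sanity check of the convolution step (a sketch of my own, not part of the proof; `convolve_pmf` is a name I made up), one can build the PMF of $S_n$ by repeatedly convolving the geometric PMF and compare it with the closed form:

```python
from math import comb

def convolve_pmf(n, p, m_max):
    """PMF of S_n by iterated convolution of the geometric PMF p*q^(m-1).
    Truncation at m_max is exact for m <= m_max, since the event S_n = m
    only involves summands of size at most m."""
    q = 1 - p
    geo = [0.0] + [p * q ** (m - 1) for m in range(1, m_max + 1)]
    pmf = geo[:]                      # PMF of S_1
    for _ in range(n - 1):            # fold in one more X_i each time
        new = [0.0] * (m_max + 1)
        for k in range(1, m_max + 1):
            for j in range(1, m_max - k + 1):
                new[k + j] += pmf[k] * geo[j]
        pmf = new
    return pmf                        # pmf[m] == P(S_n = m) for m <= m_max

n, p, m_max = 4, 0.3, 20
pmf = convolve_pmf(n, p, m_max)
# Compare with the negative binomial formula C(m-1, n-1) p^n q^(m-n)
for m in range(n, m_max + 1):
    closed = comb(m - 1, n - 1) * p ** n * (1 - p) ** (m - n)
    assert abs(pmf[m] - closed) < 1e-12
```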



Another way to do this is by using moment-generating functions (MGFs). In particular, we use the theorem that a probability distribution is uniquely determined by its MGF.
Calculation of the MGF of the negative binomial distribution (parametrized here by the number of failures $x$ before the $r$-th success, so the support is $\{0,1,2,\ldots\}$):
$$X\sim \mathrm{NegBin}(r,p),\qquad P(X=x) = p^{r}q^{x}\binom{x+r-1}{r-1}.$$ Then, using the definition of the MGF: $$E(e^{tX})=\sum_{x=0}^{\infty}p^{r}q^{x}\binom{x+r-1}{r-1}e^{tx} = p^{r}(1-qe^{t})^{-r}=M(t)^{r},$$ where $M(t)=p/(1-qe^{t})$ is the MGF of a random variable $Y \sim \mathrm{Geo}(p)$ counting the failures before the first success. Since the $X_i$ are independent, $$E\left(e^{t(X_1+X_2+\dots+X_n)}\right)=\prod_{i=1}^{n}E(e^{tX_i}),$$ so the sum of $n$ such geometric variables has the negative binomial MGF with $r=n$, and by uniqueness we are done. (In the question's convention, where each $X_i$ has support $\{1,2,\ldots\}$, every MGF picks up a factor $e^{t}$, which simply shifts the sum by $n$.)
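To see the MGF identity concretely (a numerical sketch of my own, using the same failures-counting convention as this answer), one can sum the negative binomial series directly and compare it with $M(t)^{r}$:

```python
from math import comb, exp

# Parameters; we need q * e^t < 1 for the series to converge.
p, r, t = 0.35, 5, -0.2
q = 1 - p

# MGF of Geo(p) counting failures before the first success: p / (1 - q e^t)
mgf_geo = p / (1 - q * exp(t))

# MGF of NegBin(r, p) by direct summation of its defining series;
# the tail beyond 2000 terms is negligible since q * e^t < 1.
series = sum(p**r * q**x * comb(x + r - 1, r - 1) * exp(t * x)
             for x in range(2000))

# The r-th power of the geometric MGF matches the negative binomial MGF.
assert abs(mgf_geo ** r - series) < 1e-12
```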

  • Is it then, that the expectation of the sum of the $n$ iid geometrically distributed random variables $X_i$ is: $E(\sum_{i=1}^n X_i) = \prod _{i=1}^n E(e^{X_i}) $ ? – kentropy Apr 15 '18 at 20:37
  • No, where did I imply this? If there is some confusion regarding MGFs: the MGF of a random variable $X$ is defined as $M_X(t) = E(e^{tX})$, whenever the right-hand side converges. – IamThat Apr 16 '18 at 19:18