I am stuck on the following problem. I believe that my solution is right so far,
but I do not know how to finish the problem. Ideally, I would like to do this
problem without using moment generating functions or the idea of the convolution. Maybe that is not a realistic goal.
Thanks,
Bob
Problem:
Let $X$ and $Y$ be independent binomial r.v.'s with parameters $(n,p)$ and $(m,p)$,
respectively. Let $Z = X + Y$. What is the distribution of $Z$?
Answer:
\begin{eqnarray*}
P(Z = k) &=& \sum_{i = 0}^{k} P(X = i)P(Y = k-i) \\
P(Z = k) &=&
\sum_{i = 0}^{k} {n \choose i}p^i(1-p)^{n-i}
{m \choose {k-i} } p^{k-i}(1-p)^{m-(k-i)} \\
P(Z = k) &=&
\sum_{i = 0}^{k} {n \choose i}p^k(1-p)^{n-i}
{m \choose {k-i} } (1-p)^{m-k+i} \\
P(Z = k) &=&
\sum_{i = 0}^{k} {n \choose i}p^k(1-p)^{n+m-k}
{m \choose {k-i} } \\
\end{eqnarray*}
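As a sanity check (not part of the proof), the last line of the derivation can be evaluated numerically and compared against the Binomial$(n+m, p)$ pmf, which is the distribution $Z$ should have; the values of `n`, `m`, and `p` below are arbitrary examples:

```python
from math import comb

def pmf_sum_formula(k, n, m, p):
    # Last line of the derivation:
    # P(Z = k) = sum_i C(n,i) p^k (1-p)^(n+m-k) C(m,k-i)
    # (math.comb returns 0 when the lower index exceeds the upper one)
    return sum(comb(n, i) * p**k * (1 - p)**(n + m - k) * comb(m, k - i)
               for i in range(k + 1))

def pmf_binomial(k, N, p):
    # Binomial(N, p) pmf
    return comb(N, k) * p**k * (1 - p)**(N - k)

n, m, p = 5, 7, 0.3   # arbitrary example parameters
for k in range(n + m + 1):
    assert abs(pmf_sum_formula(k, n, m, p) - pmf_binomial(k, n + m, p)) < 1e-12
```

The check passing for all $k$ suggests the algebra so far is consistent with $Z \sim \text{Binomial}(n+m, p)$.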
Reference: https://math.stackexchange.com/questions/1176385/sum-of-two-independent-binomial-variables – asdf Jul 02 '18 at 10:59
2 Answers
$\sum_{i=0}^{k}\binom n i \binom m {k-i} = \binom {n+m} k$ is a standard equality (Vandermonde's identity); use it to finish the proof.
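A quick numerical check of the identity over a range of small parameters (an illustration, not a proof):

```python
from math import comb

# Vandermonde's identity: sum_i C(n,i) C(m,k-i) = C(n+m,k)
# (math.comb returns 0 when the lower index exceeds the upper one,
#  so out-of-range terms vanish automatically)
for n in range(6):
    for m in range(6):
        for k in range(n + m + 1):
            lhs = sum(comb(n, i) * comb(m, k - i) for i in range(k + 1))
            assert lhs == comb(n + m, k)
```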
Sumit Kumar Kar
You can go on with: $$=p^k(1-p)^{n+m-k}\sum_{i=0}^k\binom{n}{i}\binom{m}{k-i}=p^k(1-p)^{n+m-k}\binom{n+m}k$$
Btw, you can also deduce more directly that the sum of two independent binomials with equal parameter $p$ is binomial again.
This is based on the fact that a binomial random variable is actually a sum of i.i.d. Bernoulli-distributed random variables.
See here for that.
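The Bernoulli argument can be illustrated by brute-force enumeration for small parameters: writing $Z$ as a sum of $n+m$ independent Bernoulli$(p)$ indicators and grouping the $2^{n+m}$ outcomes by their number of successes recovers the Binomial$(n+m, p)$ pmf. (A sketch with made-up small values of `n`, `m`, `p`.)

```python
from itertools import product
from math import comb

n, m, p = 3, 2, 0.4   # small illustrative parameters

# Z = X + Y is a sum of n + m independent Bernoulli(p) indicators.
# Enumerate all 2^(n+m) outcomes and accumulate probability by success count.
pmf = [0.0] * (n + m + 1)
for bits in product((0, 1), repeat=n + m):
    ones = sum(bits)
    pmf[ones] += p**ones * (1 - p)**(n + m - ones)

# The grouped probabilities match the Binomial(n+m, p) pmf.
for k in range(n + m + 1):
    assert abs(pmf[k] - comb(n + m, k) * p**k * (1 - p)**(n + m - k)) < 1e-12
```

Because the indicators for $X$ and the indicators for $Y$ are exchangeable once they share the same $p$, only their total count matters, which is exactly why the sum is binomial again.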
drhab