How can I formally prove that the sum of two independent binomial random variables X and Y with the same parameter p is also binomial?

4 Answers
Let $(B_k)_k$ be a sequence of iid Bernoulli-distributed random variables with $P(B_k=1)=p$ for $k=1,2,\dots$
Then $$X:=B_1+\cdots+B_n$$ is binomially distributed with parameters $n,p$ and $$Y:=B_{n+1}+\cdots+B_{n+m}$$ is binomially distributed with parameters $m,p$. It is evident that $X$ and $Y$ are independent.
Now realize that $$X+Y=B_1+\cdots+B_{n+m}$$ is binomially distributed with parameters $n+m,p$.
This spares you any computations.
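This argument can be illustrated with a quick simulation (a sketch; the function name and parameter values are ad hoc): build $X$ and $Y$ from disjoint slices of one iid Bernoulli sequence, and observe that $X+Y$ is, by construction, the success count over all $n+m$ trials.

```python
import random

def bernoulli_split_check(n, m, p, trials=1000, seed=0):
    """Draw n+m iid Bernoulli(p) variables; X counts successes among
    the first n, Y among the remaining m. Then X+Y is exactly the
    total success count over n+m trials, i.e. a Bin(n+m, p) draw."""
    rng = random.Random(seed)
    for _ in range(trials):
        b = [1 if rng.random() < p else 0 for _ in range(n + m)]
        x = sum(b[:n])           # X ~ Bin(n, p)
        y = sum(b[n:])           # Y ~ Bin(m, p), independent of X
        assert x + y == sum(b)   # X+Y is a sum of n+m Bernoullis
    return True
```

Here the identity `x + y == sum(b)` holds in every trial by construction, which is precisely why no computation with the pmf is needed.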

Thanks for your answer. Can you explain in more detail how you got to your first result? – Piyush Maheshwari Mar 09 '15 at 20:54

If $X$ has binomial distribution with parameters $n,p$ then it can be identified as the number of "successes" by having $n$ independent experiments all having a probability of $p$ to succeed. Set $B_i=1$ if experiment $i$ succeeds and $B_i=0$ otherwise. This for $i=1,\dots,n$. Then the number of successes (that is $X$) equals $B_1+\cdots +B_n$. – drhab Mar 10 '15 at 09:44

Oh sorry, I misread the term Bernoulli as Binomial. Thank you so much, this is very easy to reason about. – Piyush Maheshwari Mar 10 '15 at 17:35
Just compute. Suppose $X \sim \def\Bin{\mathord{\rm Bin}}\Bin(n,p)$, $Y \sim \Bin(m,p)$. Now let $0 \le k \le n+m$, then \begin{align*} \def\P{\mathbb P}\P(X+Y = k) &= \sum_{i=0}^k \P(X = i, Y = k-i)\\ &= \sum_{i=0}^k \P(X=i)\P(Y=k-i) & \text{by independence}\\ &= \sum_{i=0}^k \binom ni p^i (1-p)^{n-i} \binom m{k-i} p^{k-i} (1-p)^{m-k+i}\\ &= p^k(1-p)^{n+m-k}\sum_{i=0}^k \binom ni \binom m{k-i} \\ &= \binom {n+m}k p^k (1-p)^{n+m-k} \end{align*} Hence $X+Y \sim \Bin(n+m,p)$.
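The convolution computation can be checked numerically for small parameters (a sketch; the helper names and the values of $n$, $m$, $p$ are arbitrary choices):

```python
from math import comb, isclose

def binom_pmf(k, n, p):
    """P(X = k) for X ~ Bin(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def conv_pmf(k, n, m, p):
    """P(X + Y = k) via the convolution sum, restricting i to
    indices where both factors are nonzero (avoids comb with
    negative arguments)."""
    return sum(binom_pmf(i, n, p) * binom_pmf(k - i, m, p)
               for i in range(max(0, k - m), min(n, k) + 1))

# The convolution matches the Bin(n+m, p) pmf at every k.
n, m, p = 4, 6, 0.35
assert all(isclose(conv_pmf(k, n, m, p), binom_pmf(k, n + m, p))
           for k in range(n + m + 1))
```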

Thanks for your answer. Can you please explain how $ \sum_{i=0}^k \binom ni \binom m{k-i} $ is equal to $ \binom {n+m}{k} $? – Piyush Maheshwari Mar 05 '15 at 19:33

@Piyush see [Vandermonde's identity](https://en.wikipedia.org/wiki/Vandermonde%27s_identity) – polpetti Jan 03 '17 at 20:13
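Vandermonde's identity is an exact integer statement, so it can be verified without any floating-point tolerance (a small sketch; names and values are ad hoc):

```python
from math import comb

def vandermonde_lhs(n, m, k):
    """sum_i C(n, i) * C(m, k - i); terms with i > n or k - i > m
    vanish because math.comb returns 0 there."""
    return sum(comb(n, i) * comb(m, k - i) for i in range(k + 1))

# Vandermonde's identity: the sum collapses to C(n+m, k).
assert vandermonde_lhs(7, 5, 6) == comb(12, 6)  # both equal 924
```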
Another way: Suppose $X\sim$ Bin$(n, p)$ and $Y\sim$ Bin$(m, p)$. The characteristic function of $X$ is then $$\varphi_X(t) = E[e^{itX}]=\sum_{k=0}^ne^{itk}{n\choose k}p^k(1-p)^{n-k}=\sum_{k=0}^n{n\choose k} (pe^{it})^k(1-p)^{n-k}=(1-p+pe^{it})^n.$$
Since $X$ and $Y$ are independent, $$\varphi_{X+Y}(t)=\varphi_{X}(t)\varphi_Y(t)=(1-p+pe^{it})^n(1-p+pe^{it})^m=(1-p+pe^{it})^{n+m}.$$
By uniqueness, we get $X+Y\sim$ Bin$(n+m, p)$.
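Both steps of this argument can be spot-checked numerically (a sketch using the stdlib `cmath`; the function names and the values of $t$, $n$, $m$, $p$ are arbitrary):

```python
import cmath
from math import comb

def phi_binomial_sum(t, n, p):
    """phi_X(t) computed directly from the defining sum E[e^{itX}]."""
    return sum(cmath.exp(1j * t * k) * comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n + 1))

def phi_binomial_closed(t, n, p):
    """The closed form (1 - p + p e^{it})^n."""
    return (1 - p + p * cmath.exp(1j * t)) ** n

# The closed form agrees with the definition, and the product
# phi_X * phi_Y equals the Bin(n+m, p) characteristic function.
t, n, m, p = 0.7, 3, 5, 0.4
assert cmath.isclose(phi_binomial_sum(t, n, p), phi_binomial_closed(t, n, p))
assert cmath.isclose(phi_binomial_closed(t, n, p) * phi_binomial_closed(t, m, p),
                     phi_binomial_closed(t, n + m, p))
```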
We can also prove this using the moment generating function (MGF), for anyone not comfortable with the characteristic functions used in the answer above:
Let $X \sim B(n_1,p_1)$ and $Y \sim B(n_2,p_2)$ be independent random variables.
We know the MGF of the binomial distribution (writing $q_i=1-p_i$):
$M_X(t)=(q_1+p_1e^t)^{n_{1}},\qquad M_Y(t)=(q_2+p_2e^t)^{n_{2}}$
Since $X$ and $Y$ are independent,
$M_{X+Y}(t)=M_X(t) \cdot M_Y(t) =(q_1+p_1e^t)^{n_{1}} \cdot (q_2+p_2e^t)^{n_{2}}$
We see that this cannot be expressed in the form $(q+pe^t)^{n}$, so by the uniqueness property of the MGF, $X+Y$ is not a binomial variate. However, if we take $p_1=p_2=p$ (and hence $q_1=q_2=q$), then we have:
$M_{X+Y}(t)=M_X(t) \cdot M_Y(t) =(q+pe^t)^{n_{1}} \cdot (q+pe^t)^{n_{2}} =(q+pe^t)^{n_{1}+n_{2}},$
which is the MGF of a binomial variate with parameters $(n_1+n_2,p)$.
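Both cases of this MGF argument can be demonstrated numerically (a sketch; the parameter values, including the mean-matched $\bar p$ used for the negative case, are arbitrary choices):

```python
import math

def mgf_binomial(t, n, p):
    """MGF of Bin(n, p): (q + p e^t)^n with q = 1 - p."""
    return (1 - p + p * math.exp(t)) ** n

# With equal success probabilities, the product of MGFs is again a
# binomial MGF with parameters (n1 + n2, p).
n1, n2, p = 3, 5, 0.4
assert math.isclose(mgf_binomial(1.0, n1, p) * mgf_binomial(1.0, n2, p),
                    mgf_binomial(1.0, n1 + n2, p))

# With p1 != p2, the product differs from the MGF of the binomial
# that merely matches E[X+Y], so X+Y is not binomial in that case.
p1, p2 = 0.2, 0.7
pbar = (n1 * p1 + n2 * p2) / (n1 + n2)   # matches the mean only
mixed = mgf_binomial(1.0, n1, p1) * mgf_binomial(1.0, n2, p2)
assert not math.isclose(mixed, mgf_binomial(1.0, n1 + n2, pbar))
```

Checking a single $t$ only shows the two MGFs differ somewhere, which is already enough to rule out that particular binomial; the uniqueness theorem does the rest.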