I think the question is a simple one, but I've been unable to answer it or find an answer for it yet:

There's a simple game: if you flip heads you win a dollar (from the house), but if you flip tails you lose a dollar (to the house).

If I start with $n$ dollars (and the house has infinite money), how many flips can I expect to make before I've lost all my money? This is different from the common question of how many flips I can make before I have a run of length $n$. In this case you can lose your money without ever having a run of length more than $2$, for example, simply by repeating: win 1, lose 2, win 1, lose 2, etc.

I can write out a decision tree on this, but I haven't been able to generalize it into a formula yet.

Kang Su
  • [This question](http://math.stackexchange.com/q/153123/25554) uses a biased coin, but it's otherwise the same as yours, so the answers there might be helpful. – MJD Jul 16 '12 at 16:50
  • [Wikipedia says](http://en.wikipedia.org/wiki/Random_walk): "If $a$ and $b$ are positive integers, then the expected number of steps until a one-dimensional simple random walk starting at $0$ first hits $b$ or $−a$ is $ab$." If the house has infinite money, $b$ goes to infinity, and so does the expected game duration $ab$. – hmakholm left over Monica Jul 16 '12 at 16:56
  • @HenningMakholm: So even if $a=1$? – Thomas Jul 16 '12 at 17:10
  • @Thomas: If you wish to hit $-1$ and you went to $1$ instead then there is a dramatic decrease of the probability that you will pass through $-1$ in the next step. – Asaf Karagila Jul 16 '12 at 17:12
  • @AsafKaragila: ahh yes. I see. thanks. – Thomas Jul 16 '12 at 17:13
  • @Thomas: It must be either infinite for every $n$ or finite for every $n$. -- If the expected length for $n=1000$ is infinite, then the expected length for $n=1$ must account for the risk that we lose each of the first 999 games and then find ourselves in the $n=1000$ situation. So the expected length at $n=1$ is _at least_ $2^{-999}(999+\infty)$, which is infinite. – hmakholm left over Monica Jul 16 '12 at 17:32
  • ... or rather, "win each of the first 999 games", of course. – hmakholm left over Monica Jul 16 '12 at 17:39
  • If you have a probability textbook, look for "gambler's ruin" in the index. – GEdgar Jul 16 '12 at 17:48
  • Would the series fail to converge if you tried to compute the number of flips before you had an N% chance of going bust? – Dan Is Fiddling By Firelight Jul 16 '12 at 21:16
  • @Dan: No - the probability of going bust eventually is $1$ (compared with $\frac{a}{a+b}$ if the bank has a finite amount) so the expected time to have an N% chance of going bust is finite, if N% < 100%. – Henry Jul 17 '12 at 07:42

5 Answers


Here is a direct calculation: Let $f(n)$ be the expected length of the game when we start with $n$ dollars. Obviously $f(0)=0$, and for $n\ge 1$ we must have $$ f(n) = 1 + \frac12 f(n-1) + \frac12 f(n+1)$$ Thus, if $f(n+1)$ is infinite, $f(n)$ will also be infinite, since half of infinity is itself infinity, and adding $1+\frac 12 f(n-1)$ can only make that worse. So if $f(n)$ is infinite for any $n$ it must be infinite all the way down to $f(1)$.

Okay, let us assume that $f(1)$ is finite and see if we can reach a contradiction. The above formula can be rearranged (with $k=n+1$) to give $$ f(k) = 2f(k-1) - f(k-2) - 2$$ so knowing $f(0)$ and $f(1)$ will enable us to calculate $f(n)$ for any $n$. Let's set $f(1)=a$ and see where that gets us: $$ f(0)=0 $$ $$ f(1)=a $$ $$ f(2)=2a-2 $$ $$ f(3)=3a-6 $$ $$ f(4)=4a-12 $$ at which point we notice a pattern and conjecture $$ f(n) = na - n(n-1) = n(a+1-n) $$ This definition fits the recurrence (some tedious but straightforward algebra omitted here), so it must be right. On the other hand, it says that $f(n)$ becomes negative when $n>a+1$, so no matter what finite value we take $a$ to be, we get absurd results. The only way out is to conclude that $f(1)$ was not finite after all.
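The recurrence and the conjectured closed form are easy to check numerically (a sketch; the choice $a=5$ is arbitrary):

```python
def expected_lengths(a, n_max):
    """Generate f(0..n_max) from f(0)=0, f(1)=a via f(k) = 2 f(k-1) - f(k-2) - 2."""
    f = [0, a]
    for _ in range(2, n_max + 1):
        f.append(2 * f[-1] - f[-2] - 2)
    return f

a = 5
f = expected_lengths(a, 10)
# matches the closed form n(a + 1 - n) ...
assert all(f[n] == n * (a + 1 - n) for n in range(11))
# ... which turns absurdly negative as soon as n > a + 1
assert f[a + 2] < 0
```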

hmakholm left over Monica

Here is a solution where one computes nothing.

Let us consider $t_k$ the expected number of flips until one is out of money when one's initial fortune is $k$, and let us compute $t_1$. After the first flip, either one is out of money or one's fortune is $2$, thus $t_1=\frac12(1)+\frac12(1+t_2)$. To compute $t_2$, note that, to be out of money when one's initial fortune is $2$, one first needs to lose $1$, which takes $t_1$ flips on average, and then to lose $1$ again, which again takes $t_1$ flips on average. Thus $t_2=2t_1$, hence $t_1=1+t_1$.

The equation $t_1=1+t_1$ has a unique solution, which is $t_1=+\infty$.
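A small simulation illustrates what $t_1=+\infty$ looks like in practice: ruin always happens, yet the empirical mean keeps growing as the truncation cap grows (a sketch; the caps and the trial count are arbitrary choices):

```python
import random

def ruin_time(n, cap, rng):
    """Number of fair flips until a fortune of n first hits 0, truncated at cap flips."""
    fortune, t = n, 0
    while fortune > 0 and t < cap:
        fortune += 1 if rng.random() < 0.5 else -1
        t += 1
    return t

rng = random.Random(0)
means = []
for cap in (10**3, 10**4, 10**5):
    means.append(sum(ruin_time(1, cap, rng) for _ in range(5000)) / 5000)
print(means)  # the truncated mean grows (roughly like sqrt(cap)) with no sign of settling
```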

Did
  • This is nice because it generalizes to a biased coin. If you lose with probability $p$, then $t_2=2t_1$ still; but now $t_1 = p + (1-p)(1 + t_2) = 1 + (1-p)(2t_1) \implies t_1 = 1/(2p-1)$, which diverges as $p\rightarrow 1/2+$. – mjqxxxx Jul 16 '12 at 21:38
  • I'm off to Reno... – copper.hat Jul 19 '12 at 07:31
  • @copper.hat Too bad you are... The expected time until out of money is infinite but the time until out of money is finite with full probability (as you know), hence you WILL get broke (almost surely, as they say). – Did Jul 19 '12 at 07:38
  • I'm broke already, I'll just watch :-). – copper.hat Jul 19 '12 at 08:26
  • @copper.hat Whew. Relieved I am. – Did Jul 19 '12 at 08:34
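mjqxxxx's biased-coin formula $t_1 = 1/(2p-1)$ is easy to check by simulation (a sketch; $p=0.6$ and the trial count are arbitrary choices, and the large cap merely guards against a rare long run):

```python
import random

def ruin_time_biased(p, cap, rng):
    """Flips until a fortune of 1 hits 0, losing each flip with probability p > 1/2."""
    fortune, t = 1, 0
    while fortune > 0 and t < cap:
        fortune += -1 if rng.random() < p else 1
        t += 1
    return t

rng = random.Random(2)
p, trials = 0.6, 50000
mean = sum(ruin_time_biased(p, 10**6, rng) for _ in range(trials)) / trials
print(mean)  # should be close to 1/(2p - 1) = 5
```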

The expected length of the coin flip game is infinite, as shown in Corollary 1 of this package. If two people start with $m$ and $n$ coins, the expected length is $mn$.

Ross Millikan
  • Thanks, everyone, for all the answers. Very informative. I'm extremely surprised by this; I find it remarkable that the length is linear in both $m$ and $n$ (but I believe it). With that said, I do plan to write a little program to test this. :-) – Kang Su Jul 16 '12 at 21:23
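The program Kang Su describes might look like this (a sketch; $m=3$, $n=4$, and the trial count are arbitrary choices):

```python
import random

def game_length(m, n, rng):
    """Fair flips until either player is broke; players start with m and n coins."""
    fortune, t = m, 0            # track the first player's coins
    while 0 < fortune < m + n:
        fortune += 1 if rng.random() < 0.5 else -1
        t += 1
    return t

rng = random.Random(1)
trials = 20000
mean = sum(game_length(3, 4, rng) for _ in range(trials)) / trials
print(mean)  # should be close to m * n = 12
```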

Your question can be described via a simple random walk on $\mathbb Z$: Choose i.i.d. random variables $X_1, X_2, \ldots$ such that

$$P[X_i = +1] = P[X_i = -1] = \frac 12$$

$X_i$ describes the outcome of the $i$-th toss of a coin: $+1$ means we win one dollar, $-1$ we lose one dollar. Furthermore, let $S_k = \sum_{i=1}^k X_i$ be the amount of money lost or won until time $k$.

Now, for a positive integer $n$ (which describes the initial amount of money we own), define $$T_{-n} = \inf\{k\in \mathbb N\mid S_k = -n\}$$ to be the number of coin tosses until we go bankrupt. Then we have

Lemma: $P[T_{-n} > k] = \Theta(1/\sqrt k) \qquad \text{ as }k\to \infty$

which is to say: The probability of not being bankrupt after the $k$-th toss of a coin decreases like $1/\sqrt{k}$ (in particular, this goes to zero; so we will go bankrupt, eventually). On the other hand the calculation $$E[T_{-n}]= \sum_{k=1}^\infty P[T_{-n}>k] \approx \sum_{k=1}^\infty \frac{1}{\sqrt{k}} = \infty$$ shows

Corollary: $E[T_{-n}] = \infty$

which says that the expected waiting time until bankruptcy is infinite: we may well have to wait a very, very long time before we go bankrupt.

In an attempt to be self-contained, here is a

Proof of the Lemma: The proof involves two steps:

Claim 1: We have $$P[S_n = 2k-n] = \binom{n}{k}2^{-n}$$ for $0 \le k\le n$ (here $n$ temporarily counts coin tosses rather than the initial fortune), and $P[S_n = x] = 0$ for all other $x$.


Claim 2: For $k>0$ we have $$P[T_{-n} \le k] = P[S_k \notin (-n, n]\,] $$

For Claim 1, just note that in order for $S_n = 2k-n$, we need to win $k$ times and lose $n-k$ times, and there are $\binom nk$ possibilities to choose the $k$ winning tosses out of $n$.

For Claim 2, write $P[T_{-n} \le k] = \sum_{b = -\infty}^\infty P[T_{-n} \le k, S_k = b]$ and notice that for $b > -n$ we have $$P[T_{-n} \le k, S_k = b] = P[S_k = -2n - b]$$ To see this, take any path that visits $-n$ and ends at $b$, and reflect it at the first time it hits $-n$: after that first visit, swap every step $X_i = \pm 1$ to $X_i = \mp 1$. The reflected path ends at $-n-(b+n) = -2n-b$. This gives a one-to-one correspondence between paths ending at $b>-n$ that visit $-n$, and paths ending at $-2n-b$ (which necessarily visit $-n$, since $-2n-b < -n$). This is the classical reflection principle. So

\begin{align} P[T_{-n} \le k] &= \sum_{b = -\infty}^\infty P[T_{-n} \le k, S_k = b] \\ &= \sum_{b\le -n} P[S_k = b] + \sum_{b> -n} P[S_k = -2n -b] \\ &= P[S_k = -n] + 2P[S_k < -n] \\[6pt] &= P[S_k \notin (-n,n]\, ] \end{align}

Therefore (even values of $k$ suffice by monotonicity)

\begin{align} P[T_{-n} > 2k] &\ge P[S_{2k} = 0\, ] = \binom{2k}{k} 2^{-2k} \sim \frac{1}{\sqrt{\pi k}} \\ P[T_{-n} > 2k] &\le 2n \cdot P[S_{2k} = 0\, ] = 2n\cdot \binom{2k}{k} 2^{-2k} \sim \frac{2n}{\sqrt{\pi k}} \end{align} q.e.d.
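Claim 2 (and with it the reflection principle) can be verified exactly by brute force for small $k$ (a sketch; the particular $(n,k)$ pairs are arbitrary):

```python
from math import comb
from itertools import product

def first_passage_prob(n, k):
    """P[T_{-n} <= k], by enumerating all 2^k equally likely paths."""
    hits = 0
    for steps in product((+1, -1), repeat=k):
        s, low = 0, 0
        for x in steps:
            s += x
            low = min(low, s)
        if low <= -n:
            hits += 1
    return hits / 2**k

def tail_prob(n, k):
    """P[S_k not in (-n, n]], using Claim 1: P[S_k = 2j - k] = C(k, j) / 2^k."""
    return sum(comb(k, j) for j in range(k + 1)
               if not (-n < 2 * j - k <= n)) / 2**k

for n, k in [(1, 6), (2, 7), (3, 8), (2, 10)]:
    assert abs(first_passage_prob(n, k) - tail_prob(n, k)) < 1e-12
```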


Assuming this is a symmetric simple random walk (with probability $0.5$ of gaining $1$ unit of fortune) started from a fortune of $n$, it is simply a matter of calculating the expected time to hit the origin, $0$.

Since I have assumed the random walk to be symmetric, the probability that the walk, say $S$, first hits $0$ at the $k$-th step when started from $n$ is the same as the probability that the walk, started from $0$, first hits $n$ at the $k$-th step, under the assumption that the gambler is allowed to go into debt. (That assumption applies NOT to the original problem, but to the equivalent one I devised, where the gambler starts from fortune $0$ and has to reach fortune $n$.)

This probability is:

$$f_{n}(k) = \frac{n}{k}\mathbb{P}(S_{k} = n)$$

where $$\mathbb{P}(S_{k} = n) = {k\choose \frac{1}{2}(k + n)}0.5^{k} $$ (which is nonzero only when $k \ge n$ and $k+n$ is even).

With the probability mass function $f_{n}(k)$ available, you can readily compute the expectation.
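As a sanity check, the hitting-time formula $f_n(k) = \frac nk \mathbb P(S_k = n)$ can be verified against brute-force path enumeration for small $k$ (a sketch):

```python
from math import comb
from itertools import product

def hitting_pmf(n, k):
    """P(first visit of the walk to n is at step k), via the hitting-time theorem."""
    if (k + n) % 2 or k < n:
        return 0.0
    return (n / k) * comb(k, (k + n) // 2) / 2**k

def brute_force(n, k):
    """Same probability, by enumerating all 2^k equally likely paths."""
    count = 0
    for steps in product((+1, -1), repeat=k):
        s, first = 0, None
        for i, x in enumerate(steps, 1):
            s += x
            if s == n and first is None:
                first = i
        if first == k:
            count += 1
    return count / 2**k

for n in (1, 2):
    for k in range(1, 9):
        assert abs(hitting_pmf(n, k) - brute_force(n, k)) < 1e-12
```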

Source: Probability and Random Processes, by Geoffrey Grimmett and David Stirzaker (3ed)

Please correct me, if I made any errors in my reasoning.

EDIT: As mentioned by others, the expectation computed by the aforementioned method diverges, i.e. it is $\infty$. Still, I would like to see whether my reasoning is correct!
