This question was inspired by another question posted today: Monty Hall Problem Extended.

So I thought that the comments and answers brought up a great point about increasing the number of doors to 100 or something much larger, and using that as a way to help visualize why switching is always the best choice when trying to explain the problem to others.

And then I was thinking about the game show Deal or No Deal. For those unfamiliar with it: there are 26 cases, each containing an amount of money ranging from \$0.01 to one million dollars. You choose one case, and it's "yours" and out of play (this is analogous to choosing the first door in the Monty Hall problem). Throughout the game you open 24 of the remaining 25 cases, and you see how much money was in each case.

In the end, you are left with 2 cases: "your" case, that you chose in the beginning, and the only other case you didn't open. This is where it becomes Monty Hall: you can either choose to keep your case, or switch cases and get the other one.

So what I'm wondering is, does the Monty Hall logic of "always switch doors/cases" apply here? The differences:

1) It's not a case of there being simply 1 car and a bunch of goats. All the money values in the cases are different. You aren't always going to end up with a choice between a million dollars and something small... The two remaining cases might end up being \$10,000 and \$250,000. Or \$10 and a million dollars. Or \$10 and \$100.

2) I think part of what makes Monty Hall work is that the car always remains in play. Your first choice has a 1/26 probability of selecting the car/million dollar case. But in Deal or No Deal, the car/million dollar case can be eliminated partway through the game. So I'm thinking that probably changes things.

My first vague thoughts are... If you make it to the end and the million dollar case is still in play, Monty Hall applies and you should switch cases. Because it's the same idea: I had a 1/26 shot at the million, 24 cases have been eliminated, so it's much more likely that the other case has the million.

But if the million is eliminated while you're playing, what then? Can Monty Hall not help us, because you can't compare the probability of selecting the million dollar case because now it's zero? I'm trying to think of a way to figure out whether or not you should switch, in an attempt to get the case with the most money in it. We know that \$1,000,000 is no longer available. But is there anything we can do to decide which case is likely to be more valuable? Or is this outside Monty Hall's bounds?

    Say you opened 24 of the 25 cases and didn't find the million. That's either because you were holding the million all along (1 chance in 26), or because you managed to open all the cases except the million (25 chances in 26 that you didn't originally choose the million, times 1 chance in 25 that you left the million until the end). These are equally probable: it's still 50-50 that you're holding the million. The same applies whichever two cases are left at the end. – mjqxxxx Dec 17 '13 at 04:19
    Don't forget about the deal aspect of the game; as far as I can recall, there's a guy who tries to buy your case off you for a given amount of dollars based on the probability of your case having the million dollars. Not sure if that'd affect the theory here though. – Lost Dec 17 '13 at 04:22
  • Agree with @mjqxxxx. An intuitive way of argument: in the Monty Hall problem, Monty helps you because he will never open the door with the car. This causes the chance of the remaining doors to contain the car to be higher. In Deal or No Deal, when we pick a suitcase to open, we could pick any of the amounts (including the million.) Hence we aren't changing up the probabilities, unlike the Monty Hall problem. – Kelvin Soh Dec 17 '13 at 04:29
    So why not make any of these comments an answer? – SQB Dec 17 '13 at 11:00
  • @KelvinSoh This is the part where I get a little confused. I don't see the difference between these two scenarios: I pick a case, open 24, and the million is still left (either in my case or the remaining one). I pick a case, the host opens 24 he knows are not the million, and the million is still left (either in my case or the remaining one). The end situation is the same, regardless of how I got there. So why does the probability change just because I opened the cases instead of the host? – WendiKidd Dec 17 '13 at 13:50

6 Answers


The key is: Monty knows where the car is (and will never open that door). We don't know where the million dollars are, so we MIGHT open that case. For an illustration, we look at how the tree diagram differs in the two scenarios.

Suppose we have 3 doors, A, B and C and our car/million is in door A. We further assume we will always switch. (Once we understand this, we can extend it to $n$ doors and see that the situation will be similar.)

Case 1: Monty Hall Problem (tree diagram)

If we switch, $P($Win$) = \frac{2}{3}$.

Case 2: Deal or No Deal scenario (tree diagram)

Notice that the assumption in the question is that we only look at the situation where the million has not been opened. So we are in essence calculating a conditional probability. If we switch,

$P($Win $|$ Million not opened$) = \displaystyle \frac{P(\textrm{Win}\cap \textrm{Million not opened})}{P(\textrm{Million not opened})} = \frac{\frac{1}{6}+\frac{1}{6}}{\frac{1}{6}+\frac{1}{6}+\frac{1}{6}+\frac{1}{6}}=\frac{1}{2}$.
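This conditional probability can be checked with a quick Monte Carlo sketch in Python (the function names here are mine, not from the answer): simulate the game where the remaining doors are opened at random, discard the runs where the million is revealed, and measure how often switching wins among the rest.

```python
import random

def play_once(n_doors, rng):
    """One play-through: the player picks a door, then all but one of the
    remaining doors are opened at random (no host knowledge involved).
    Returns (prize_survived, switch_wins)."""
    prize = rng.randrange(n_doors)
    pick = rng.randrange(n_doors)
    others = [d for d in range(n_doors) if d != pick]
    rng.shuffle(others)
    remaining = others[-1]          # the one unopened door besides the pick
    prize_survived = prize == pick or prize == remaining
    return prize_survived, remaining == prize

def switch_win_rate_given_survival(trials=200_000, n_doors=3, seed=1):
    rng = random.Random(seed)
    survived = switch_wins = 0
    for _ in range(trials):
        ok, win = play_once(n_doors, rng)
        if ok:                      # condition on the prize not being opened
            survived += 1
            switch_wins += win
    return switch_wins / survived

print(switch_win_rate_given_survival())            # close to 0.5
print(switch_win_rate_given_survival(n_doors=26))  # still close to 0.5
```

Running the same simulation with 26 doors shows the conditional probability stays at 1/2, matching the tree-diagram calculation rather than the Monty Hall value.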

Kelvin Soh

"Odds of picking $1 million immediately: 1/26

Odds million is not picked right away: 25/26

If you get through picking 24 briefcases and \$1 million still remains, then when given the option to switch briefcases with 2 left, the odds that the other briefcase (not the one you picked) contains \$1 million are 25/26. SWITCH!"

Literally, all you need to do is replace "1 million" with "1" in this scenario and the fallacy in this logic becomes obvious.

Simply put, in a scenario with 26 cases, there is a 1/26 chance that the 1 million case is picked. Subsequently, there is a 1/25 chance that the 1 remains after RANDOM selection, given that randomly eliminating 24 of 25 cases equates to randomly selecting one case to survive. So, what are the odds that the case picked is 1 million and the case remaining is 1? (1/26)×(1/25) = 1/650.

Now, the odds of selecting the 1 case in the beginning are 1/26, and again we have a 1/25 probability that the 1 million dollar case will remain at the end. So what are the odds of selecting the 1 case and having the 1 million case remain at the end? (1/26)×(1/25)=1/650

The odds of selecting EITHER the 1 case OR the 1 million case first are 2/26. The odds that the other of these two cases will remain until the very end is again 1/25. Therefore, the probability of either of these scenarios occurring is: (2/26)×(1/25)=2/650

In conclusion, the odds that the 1 million and 1 prizes remain regardless of which was picked and which remains, is 2/650. The odds that the 1 million was the picked case is 1/650. The odds that the 1 million is the remaining case is 1/650. This means that there are only TWO scenarios where the 1 million and 1 cases are the last two cases standing, and it is EQUALLY probable that the 1 million (or 1) case is the selected case, or remaining case.

I know it sounds similar to the Monty Hall Problem, but since ALL selections are random (NOT the case in the MHP), it really only requires very elementary probability to analyze. There are 325 different combinations of final two cases (assuming 26 distinct values), and for each pair of remaining cases, regardless of value (call them x and y), there is a 50/50 probability of which one is the selected case and which is the remaining case.
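The counting in this answer can be verified exactly with Python's `fractions` module (a sketch; the variable names are my own, not from the answer):

```python
from fractions import Fraction
from math import comb

n = 26  # number of cases

# P(hold the million AND the 1 is the last case left) = (1/26) * (1/25)
p_hold_million_leave_one = Fraction(1, n) * Fraction(1, n - 1)
# ...and symmetrically for holding the 1 with the million left.
p_hold_one_leave_million = Fraction(1, n) * Fraction(1, n - 1)
p_pair_survives = p_hold_million_leave_one + p_hold_one_leave_million

assert p_hold_million_leave_one == Fraction(1, 650)
assert p_pair_survives == Fraction(2, 650) == Fraction(1, 325)

# There are C(26, 2) = 325 unordered final pairs, each equally likely,
# and together they exhaust all possible outcomes.
assert comb(n, 2) == 325
assert p_pair_survives * comb(n, 2) == 1
```

Since the two orderings of any surviving pair each have probability 1/650, it is indeed 50/50 which of the final two values sits in the chosen case.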

  • This is true assuming the game is not allowed to *change*, which probably contributes to the psychological effect. *Do I know they really have not switched the goats?* – mathreadler Mar 21 '16 at 16:48

The probability of correctly guessing the million dollar case in the beginning is $\frac1{26}$. So if it is left at the end then there is a $\frac1{26}$ chance that staying is the winning choice (numbers will be fixed later).

Now consider switching. The probability of initially guessing wrong is $\frac{25}{26}$. After that, the probability of opening all remaining cases except the million dollar one is $\frac{24!}{25!} = \frac{1}{25}$. Multiplied by $\frac{25}{26}$, this leaves us with a $\frac1{26}$ chance that switching gives us the million dollar case. Since both choices have an equal chance, it is 50/50 between the two remaining cases. The $\frac1{26}$ chance on both sides also shows that this works for every case, chosen or not: each case is equally likely, at $\frac1{26}$, to end up being the one you get, and Deal or No Deal is pure luck.
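A simulation supports the unconditional claim too (a sketch in Python; the helper name is mine): over many games, the kept case and the last unopened case each hold the million with probability 1/26.

```python
import random

def one_game(rng, n=26):
    """Play one random game; report whether staying or switching
    would have won the million (unconditionally, no filtering)."""
    million = rng.randrange(n)
    pick = rng.randrange(n)
    others = [c for c in range(n) if c != pick]
    last = others[rng.randrange(n - 1)]   # a uniformly random survivor
    return pick == million, last == million

rng = random.Random(42)
trials = 260_000
stay_wins = switch_wins = 0
for _ in range(trials):
    s, w = one_game(rng)
    stay_wins += s
    switch_wins += w

print(stay_wins / trials, switch_wins / trials)   # both near 1/26 ≈ 0.0385
```

Picking one of the 25 non-chosen cases uniformly at random is equivalent to opening 24 of them in random order and keeping whichever is left, so this models the show's mechanics.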


Odds of picking $1 million immediately: 1/26

Odds million is not picked right away: 25/26

If you get through picking 24 briefcases and \$1 million still remains, then when given the option to switch briefcases with 2 left, the odds that the other briefcase (not the one you picked) contains \$1 million are 25/26. SWITCH!

  • Your solution seems to be wrong. Let's simplify the problem. Suppose there were only three cases A, B and C containing 1, 2 and 3 (in some order). Suppose further the game plays out as follows: you choose A, B is eliminated, and so you must choose between A and C. There are six possible ways that 1, 2 and 3 can be distributed. For two of them you lose (i.e. 3 is in B); that leaves 4 possibilities where 3 is still in one of the two cases (for two of them swapping helps, while for two of them it does not). – Nex Dec 06 '15 at 02:58
  • @JackFrost I think this is an argument by analogy - blow the numbers up a bit, in which case it becomes clearer that the initial chance of being wrong has been collapsed into switching. – pjs36 Dec 06 '15 at 02:58
  • I think I see where you went wrong. MHP works because there's a 0/2 chance of Monty revealing the car, the rules say he won't do it. In Deal or No Deal there is nothing preventing you from choosing the million, thus you don't learn anything about what hasn't already been opened. It's identical to having 26 numbered balls, randomly putting 25 of them in a bag, drawing 24 of them, and then considering the probability of the number still in the bag. – Kaithar May 27 '18 at 03:47

If the contestant knew beforehand, that he/she would end up in a situation with two briefcases left, where one of them was the million dollar briefcase, then he/she should switch.

In Monty Hall, you know that the host will open a door with a goat behind it. If the host didn't know which door had the car behind it, then it would be 50/50. There is a rule applied to the door with the car: it cannot be opened.

To turn the question around: why isn't the chance of opening the \$0.01 briefcase larger than 50%? Why specifically the million dollar briefcase? For the odds to shift, there would have to be a rule that the million dollar briefcase cannot be opened. Since there is a fairly large chance it will be opened during the course of the game, you end up with a 50/50 chance.

Not exactly answering with math, but I hope this helps someone understand the difference.


Firstly, let me preface this with "I'm happy to be wrong"... I'm not a probability or stats guru, but surely the previous arguments are incorrect because they are not considering the pick of the initial box?

At the start of each Deal or No Deal game, someone at the studio picks a random box from the 26 boxes to be the contestant's box. That person has a 1/26 chance of selecting the million for the player.

After this, all the player really does, in a very long-winded way, is pick a second box before deciding whether to swap (if we are looking at conditional probability, where we assume he/she makes it that far with the million intact).

Then the player picks from a selection of 25 boxes. Now, is it not fair to say that it would be better to pick from the wider set of boxes and stick with the box the other person chose first, rather than yours?

Worded another way: if I put 100 bits of paper on the floor, and one had a coin underneath, and I ask whether you want to try to guess which one has the coin under it first or let a second person go first... while you might think that your odds would be better if the first person went first and got it wrong (they would, obviously), is it not better to be the first person in this scenario statistically?

Might be wrong. Interesting thought, anyway.
