On the TV show "Deal or No Deal", a contestant is faced with a number of briefcases (26 in the US version) in which various amounts of money have been placed. The contestant is asked to select one to keep, unopened. Then the player plays a number of rounds.
In each round, the contestant is asked to select some of the other briefcases to open (usually several in the early rounds, decreasing in number to the later rounds). After seeing what is in the open briefcases, the contestant has a better idea what possible amounts of money could be in his briefcase.
A banker, who has also seen what has been opened, then offers the contestant a sum of money; this offer is usually smaller than the average of the remaining prizes (significantly so at first, but closer to the average in later rounds).
The contestant can either take it, or reject the offer and continue to play another round. The excitement of the game centers around the decision that the contestant faces: "Deal or No Deal?"
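The arithmetic behind each round can be sketched in a few lines. Below, the prize list is the one used in the US version of the show; the banker's offer function is a hypothetical illustration (a flat fraction of the average), not the show's actual formula.

```python
# Prize amounts from the US version of "Deal or No Deal" (26 cases).
PRIZES = [0.01, 1, 5, 10, 25, 50, 75, 100, 200, 300, 400, 500, 750,
          1000, 5000, 10000, 25000, 50000, 75000, 100000, 200000,
          300000, 400000, 500000, 750000, 1000000]

def expected_value(remaining):
    """Average of the prizes still in play -- the fair value of the
    contestant's own unopened case."""
    return sum(remaining) / len(remaining)

def banker_offer(remaining, fraction=0.6):
    """Hypothetical banker offer: a flat fraction of the expected value.
    On the show the fraction starts low and rises toward 1 in later
    rounds; 0.6 here is an illustrative assumption."""
    return fraction * expected_value(remaining)
```

For example, if the rounds have left only $5, $1,000, $100,000, and $500,000 in play, the expected value is $150,251.25, and an offer at 60% of that would be about $90,151.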
What should the contestant do?
The Math Behind the Fact:
The answer depends on the contestant's attitude toward risk. In the language of game theory, whether the contestant accepts depends on how risk-averse the contestant is.
We can understand this concept mathematically in the following way: if I gave you the choice between a sure $50 or a 50-50 chance of getting $100 or nothing, which would you take? You are risk-neutral if you are indifferent between the options, because they have the same expected value. You are risk-averse if you would take the sure $50, and risk-loving if you prefer to take the gamble.
Thus, on "Deal or No Deal", the banker usually assumes that players are risk-averse, so he lowers the offer so that players will be more likely to take the gamble and continue to play.
Modern utility theory, developed by von Neumann and Morgenstern, sheds more light on what risk aversion is.
We could ask a different question: given an amount $x, at what probability level p would you be indifferent between a sure $x or a lottery in which you had probability p of getting $100 and probability (1 - p) of getting nothing?
Your von Neumann-Morgenstern utility for $x is based on the answer to this question. If $x = $0, then certainly your choice for p will be 0. If $x = $100, then your choice for p will be 1. If you are risk-neutral and $x = $50, then your choice for p will be 0.5; but if you are risk-averse, then you will probably want p to be higher than 0.5 before the lottery becomes just as desirable as the sure $50. So then, if you are risk-averse, the graph of p(x) versus x will be concave down. (Similarly, if you are risk-neutral this graph will be linear, and if you are risk-loving, then p(x) will be concave up.)
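The shape of p(x) can be computed for a concrete concave utility. At indifference, u(x) = p * u(100), so p(x) = u(x) / u(100); with the illustrative choice u(x) = sqrt(x) (an assumption, just one concave function among many), p(x) = sqrt(x/100), which lies above the risk-neutral line p(x) = x/100 everywhere between the endpoints.

```python
import math

def p_risk_neutral(x):
    """Risk-neutral indifference probability: linear, p(x) = x/100."""
    return x / 100

def p_risk_averse(x, u=math.sqrt):
    """Indifference probability for a concave utility u (sqrt is an
    illustrative choice): u(x) = p * u(100), so p = u(x) / u(100)."""
    return u(x) / u(100)

# Tabulate both curves; the risk-averse p(x) sits above the straight
# line p = x/100 except at the endpoints x = 0 and x = 100, which is
# exactly the "concave down" shape described above.
for x in (0, 25, 50, 75, 100):
    print(x, p_risk_neutral(x), round(p_risk_averse(x), 3))
```

For instance, p_risk_averse(50) is about 0.707, matching the claim that a risk-averse person wants p higher than 0.5 before the lottery is as good as a sure $50.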
Understanding human behavior in terms of utilities and preferences is the subject of game theory, which forms much of the modern language of mathematical economics.
See also the St. Petersburg Paradox.
How to Cite this Page:
Su, Francis E., et al. "Deal or No Deal."
Math Fun Facts.
<http://www.math.hmc.edu/funfacts>.
