## [Computational Complexity] Deal or No Deal

A new US game show, Deal or No Deal, started last week, hosted by Howie Mandel (the same Howie from this post). A New York Times article describes the game as an exercise in probability.
Twenty-six briefcases are distributed to 26 models before the show begins. Each case contains a paper listing a different dollar amount, from one penny to \$1 million. At the start of the game, a contestant chooses one case, which becomes his; he is then allowed to see the sums in six of the remaining cases. After these have been disclosed, a mysterious figure known as the Banker calls the set, offering to buy the contestant's case for a sum derived, somehow, from the cash amounts that are still unrevealed.

The contestant can take the offer and cash out, or move on to the next round, during which he's allowed to open five more briefcases before the Banker's next offer. The second offer might exceed or fall short of the first, but it clearly reflects the newly adjusted odds about what the contestant is holding. If the contestant refuses it, he asks to see the contents of four, three, two, and then one more case, with offers from the Banker coming at the end of each round. Each time the contestant can accept and end the game, or proceed to the next round. If he doesn't accept any of the offers, he is left with the sum in his own case.

Is it wise to take a bank offer when it's below the mathematical expectation, as it always seems to be? As the game goes on, the offers asymptotically approach mathematical expectation; maybe contestants should wait.
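The expectation in question is easy to compute: the contestant's case is equally likely to hold any of the amounts not yet revealed, so its expected value is just the mean of those amounts. A minimal sketch (the board here is the commonly listed set of 26 US amounts, and the opened cases are a made-up example, both purely illustrative):

```python
from statistics import mean

# The 26 dollar amounts, one penny to $1 million (illustrative board).
amounts = [0.01, 1, 5, 10, 25, 50, 75, 100, 200, 300, 400, 500, 750,
           1000, 5000, 10000, 25000, 50000, 75000, 100000, 200000,
           300000, 400000, 500000, 750000, 1000000]

def expected_value(unrevealed):
    """Expected value of the contestant's case: each unrevealed
    amount is equally likely, so it's just the mean."""
    return mean(unrevealed)

# Before any case is opened, every amount is still in play.
print(round(expected_value(amounts), 2))

# Hypothetical first round: six cases opened, removing these amounts.
opened = {0.01, 10, 500, 300, 75000, 400000}
remaining = [a for a in amounts if a not in opened]
print(round(expected_value(remaining), 2))
```

A Banker's offer below this mean is, in pure expected-value terms, a bad deal; the question the post raises is whether a risk-averse contestant should take it anyway.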

If contestants just wanted to maximize the expected value of their winnings, they should always turn down the Banker, yet many accept the Banker's offer. Are they acting rationally?

When the amount of money involved becomes a significant fraction of the contestant's net worth, the contestant becomes risk averse and is often willing to accept a sure amount rather than a gamble with a higher expected value.

Economists model this phenomenon using utility theory. A person has a utility function of their net worth, usually with positive first derivative and negative second derivative (think square root or logarithm). They aim to maximize not expected net worth but expected utility, which leads to risk aversion. For example, with a square-root utility you would be indifferent between a guaranteed net worth of 360,000 and a gamble giving you a net worth of 1,000,000 or 40,000 with equal probability.
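The square-root example can be checked in a few lines (the numbers are those from the paragraph above):

```python
import math

def utility(wealth):
    # Square-root utility: increasing (u' > 0) and concave (u'' < 0).
    return math.sqrt(wealth)

# Certain net worth of 360,000 ...
sure = utility(360_000)                                   # sqrt(360000) = 600

# ... versus a 50/50 gamble between 1,000,000 and 40,000.
gamble = 0.5 * utility(1_000_000) + 0.5 * utility(40_000)  # (1000 + 200)/2 = 600

print(sure, gamble)  # equal expected utility: indifference

# The gamble's expected *dollar* value is higher, so a risk-averse
# agent is trading away expected money in exchange for certainty.
expected_dollars = 0.5 * 1_000_000 + 0.5 * 40_000  # 520,000 > 360,000
```

The concavity is what does the work: the utility gained by winning big is smaller than the utility lost by ending up poor, so the certain amount can be well below the gamble's expected dollar value and still be equally attractive.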

Economists can't afford to run these experiments at their universities; they can't offer enough money for serious risk-aversion effects to kick in. But television game shows like this one give us a chance to see risk aversion in action.

--
Posted by Lance to Computational Complexity at 12/26/2005 07:10:00 AM
