The Infinite Money Paradox - YouTube
Vsauce! Kevin here, and I have a simple coin-flipping game that requires no skill, has no catch or trick, and can lead to infinite wealth. The thing is… nobody really wants to play it.

Why?

How is it possible that an incredibly easy game with infinite upside causes virtually everyone to react with a massive yawn?
To play this game, we’ll turn to the most rational, calculating man in history: long-time friend of Vsauce2, Dwight Schrute. You walk up to the table to flip a coin. Your prize starts at $2. If the coin flip results in FALSE, the game is over and you win $2. If it lands on FACT, you play another round and your prize doubles. Every time you get a FACT, you keep playing and the prize keeps doubling -- from $2 to $4 to $8 to $16 to $32 to $64 to $128, and so on… forever.
[71]
But as soon as you get a FALSE, you are done
and you collect your winnings. So if you hit
[77]
a FALSE in the third round, then your prize
is $8. If your first FALSE comes in Round
[83]
14, you’d walk away with $16,384. No matter
how unlucky you are, you’ll never win less
[92]
than $2. If things go really well… then
things could go really well.
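The payoff rule above is just powers of two. Here is a minimal sketch in Python (the function name is my own, not from the video):

```python
def payout(first_false_round: int) -> int:
    """Prize when the first FALSE lands in round n: the $2 starting
    prize has doubled once per FACT, giving 2**n dollars."""
    return 2 ** first_false_round

print(payout(1))   # FALSE on the very first flip -> $2, the minimum
print(payout(3))   # first FALSE in round 3 -> $8
print(payout(14))  # first FALSE in round 14 -> $16,384
```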
Now that you know the potential payoffs, how much would you be willing to pay to play this game? $3? What about $20, or $100? The winnings could be infinite, so the question is: how much is a chance at infinite wealth worth to you?
We can determine the precise answer, but first we need to know the game’s expected value, which is the sum of all its possible outcomes weighted by their probability. That determines the point at which we choose to play a game -- or, in the real world, the point at which we decide to take out insurance on our house or buy a life insurance policy. If our risk is less than our likely reward, we should play. If we’re paying too much relative to what we’re likely to get out of playing, we should not play.
Here’s the expected value of the Schrute game.

You’ve got a 50/50 chance of losing on your first flip and heading back to the beet farm with $2. With a probability of ½ and a payoff of $2, your expected value in the first round is $1. The probability of winning two rounds is ½ * ½, or ¼, and your prize there would be $4. That’s another $1 in expected value. For three successful flips, it’s (½)^3 -- or ⅛ -- times $8. Another dollar. 1/16 * $16… 1/32 * $32… 1/64 * $64…
For n rounds, the expected value is the probability (½)^n times the payoff of 2^n -- so no matter the value of n, the result will be $1. The expected value of the game is 1 + 1 + 1 + 1 + 1… forever, because each round adds $1 of value no matter how rare the occurrence might be. The expected value is infinite.
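That each-round-adds-a-dollar argument is easy to check exactly, for example with Python’s exact fractions:

```python
from fractions import Fraction

# Contribution of round n to the expected value: probability (1/2)**n
# of the game ending there, times the payoff of 2**n dollars.
def round_value(n: int) -> Fraction:
    return Fraction(1, 2) ** n * 2 ** n

# Every round contributes exactly $1...
assert all(round_value(n) == 1 for n in range(1, 101))

# ...so the expected value truncated at N rounds is exactly N dollars,
# growing without bound as N does.
for n_rounds in (10, 1000):
    print(sum(round_value(n) for n in range(1, n_rounds + 1)))
# prints 10, then 1000
```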
And there’s our paradox. You’d think a rational person would pay all the money they have to play this game. Mathematically, it makes sense to pay any amount of money less than infinity to play. No matter what amount of money you risk, you’re theoretically getting the deal of a lifetime: the reward justifies the risk. But nobody wants to do that. Who would empty their bank account to play a game where they know there’s a 75% chance they walk away with $4 or less?
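The 75% figure follows from the first two flips alone: P($2) + P($4) = ½ + ¼ = ¾. A small Monte Carlo sketch confirms it (the payouts follow the rules above; the simulation itself is my own illustration):

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def play() -> int:
    """One play: the prize starts at $2 and doubles on each FACT;
    the first FALSE ends the game."""
    prize = 2
    while random.random() < 0.5:  # FACT with probability 1/2
        prize *= 2
    return prize

wins = [play() for _ in range(100_000)]
frac_small = sum(w <= 4 for w in wins) / len(wins)
print(frac_small)  # close to 0.75: half the plays end at $2, a quarter at $4
```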
It’s confusing because expected value is, mathematically, how you determine whether you’ll play a game. Look, if I offered you a coin-flipping game where you won $5 on heads and lost $1 on tails, your expected value for each round would be the sum of those possible outcomes: (50% chance * +$5) + (50% chance * -$1). Half the time you’ll win $5, half the time you’ll lose $1. In the long run, you’ll average +$2 for every round you play. So paying anything under $2 to play that game would be a great deal. When the price to play is less than your expected value, it’s a no-brainer.
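The same bookkeeping in code, both exactly and as a long-run average (a sketch, not from the video):

```python
import random

# Exact expected value of one round: win $5 on heads, lose $1 on tails.
ev = 0.5 * 5 + 0.5 * (-1)
print(ev)  # 2.0

# The long-run average agrees with the expected value.
random.seed(1)
n = 100_000
total = sum(5 if random.random() < 0.5 else -1 for _ in range(n))
print(total / n)  # hovers near +2 per round
```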
And since the expected value of the Schrute game is infinite, paying anything less than infinite money to play it should also be a no-brainer. But it’s not. Why?
The thing that’s so interesting about this game is how the math conflicts with… actual humans. Enter: Prospect Theory, an element of cognitive psychology in which people make choices based on the value of wins and losses instead of just theoretical outcomes.

The reason people don’t want to empty their pockets to play this game despite its infinite gains is that the expected marginal utility -- its actual value to them -- goes down as those mathematical gains increase forever. This solution was discovered a few hundred years ago.
In 1738, Daniel Bernoulli published his "Exposition of a New Theory on the Measurement of Risk" in the Commentaries of the Imperial Academy of Science of Saint Petersburg -- and what we now call the St. Petersburg Paradox was born. Bernoulli didn’t dispute the expected value of the St. Petersburg game; those are cold, hard numbers. He just realized there was a lot more to it. Bernoulli introduced the concept of the expected utility of a game -- what was, until the 20th century, called moral expectation to differentiate it from mathematical expectation.
The main point of Bernoulli’s resolution was that utility, or how much a thing matters to you, is relative to an individual’s wealth, and that each unit tends to be worth a little less to you as you accumulate it.
So, as an example, not only would winning $1,000 mean a lot more to someone who’s broke than it would to, say, Tony Stark, but even winning $1 million wouldn’t meaningfully affect the research and development budget at Stark Industries.
And there’s also a limit on a player’s comfort with risk, with John Maynard Keynes arguing that a high relative risk is enough to keep a player from engaging in a game even with infinite expected value. Iron Man can afford to lose a few billion. You probably can’t.

And value itself is subjective. If I won 1,000 peanut butter and jelly sandwiches, I would be THRILLED. If someone allergic to peanuts won them, they’d be… less thrilled.
So. Okay, okay. Given all this, how much can YOU afford to lose in the St. Petersburg game? How badly do you want to play? Bernoulli used the logarithmic function to come up with price points that factored in not only the expected value of the game, but also the wealth of the player and the game’s expected utility to them. A millionaire should be comfortable paying as much as $20.88 to flip Schrutes, while someone with only $1,000 would top out at $10.95. Someone with a total of $2 of wealth should, according to the logarithmic function, borrow $1.35 from a friend to pay $3.35.
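Those price points can be reproduced numerically. Under Bernoulli’s log-utility resolution, a player with wealth w should pay at most the price c where the expected log-utility of playing equals the utility of staying out: Σ (½)^n · ln(w − c + 2^n) = ln(w). A sketch solving that by bisection (the solver itself is my own construction, not from the video):

```python
import math

def expected_log_utility(w: float, c: float, max_rounds: int = 100) -> float:
    """Expected ln-utility after paying c to play: with probability
    (1/2)**n the player ends the game holding w - c + 2**n."""
    return sum(0.5 ** n * math.log(w - c + 2 ** n)
               for n in range(1, max_rounds + 1))

def fair_price(w: float) -> float:
    """Largest price c at which playing is no worse than not playing,
    found by bisection; c stays below w + 2 so every outcome is positive."""
    lo, hi = 0.0, w + 2 - 1e-9
    for _ in range(60):
        mid = (lo + hi) / 2
        if expected_log_utility(w, mid) > math.log(w):
            lo = mid  # playing still beats staying out; the price can rise
        else:
            hi = mid
    return (lo + hi) / 2

print(round(fair_price(1_000_000), 2))  # ≈ 20.88, the millionaire's price
print(round(fair_price(1_000), 2))      # ≈ 10.95
print(round(fair_price(2), 2))          # ≈ 3.35, hence borrowing $1.35
```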
Ultimately, everyone has their own price that factors in their wealth, their desires, their comfort with risk, their preferences, how they want to spend their time, what else they could be doing with their money, their own happiness…
And the thing is… this game can’t even exist. Economist Paul Samuelson pointed out that a game of potentially infinite gain requires the other party to be comfortable with potentially infinite loss. And no one is cool with that.
So if the important elements are variable and the game can’t exist, what’s the point? The St. Petersburg Paradox reminds us that we’re all more than math. The raw numbers might convince a robot that it’s a good idea to wager its robohouse on a series of coin flips, but you know deep down that’s a really bad idea.
Because you aren’t an expected value calculation. You aren’t a logarithmic function. The numbers are a part of you and help you live your life. But in the end, you are… you.

Fact.

And as always, thanks for watching.