馃攳
Heavy-Tailed Distributions: What Lurks Beyond Our Intuitions? - Anders Sandberg - YouTube
Channel: Science, Technology & the Future
[8]
So typically when people talk about
[10]
probability, they think about a nice probability distribution like the bell curve or the Gaussian curve.
[16]
So this means that it's most likely that you get something close to zero and then less and less likely
[21]
that you get very positive or very negative values.
[27]
It's a rather nice looking curve.
[29]
However, many things in the world turn out to have much nastier
[32]
probability distributions. A lot of disasters, for example, have a power-law distribution.
[38]
So if this is the size of a disaster and
[41]
this is the probability, it falls off like this. This doesn't look very dangerous from the start.
[49]
Most disasters are fairly small: there's a high probability of something close to zero and a low probability of something large.
[57]
But it turns out that the probability of getting a really large one can be quite big.
[62]
So suppose this one has alpha equal to one. That means that the chance of getting
[70]
a disaster of size 10 is proportional to 1 in 10, and a disaster that is 10 times as large
[77]
has just a tenth of that probability, and one that is ten times as large as that big disaster again has a tenth of that.
[85]
That means we have quite a lot of probability of getting very, very large disasters. So in the Gaussian case, getting something that is
[92]
very far out here is exceedingly unlikely, but in the case of power laws you can actually expect to see some very, very large outbreaks.
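A minimal sketch of that tenfold scaling, in Python; the exponent alpha = 1 and minimum size of 1 are illustrative assumptions, and scipy is used only for the Gaussian comparison:

```python
# A sketch of the tenfold scaling described above, assuming a Pareto-type
# power law with survival function P(X > x) = (x_min / x)**alpha,
# with x_min = 1 and alpha = 1 (illustrative values, not from the talk).
from scipy.stats import norm  # used only for the Gaussian comparison


def power_law_tail(x, alpha=1.0, x_min=1.0):
    """P(X > x) for a power law: each tenfold size costs a factor of ten."""
    return (x_min / x) ** alpha


for x in [10, 100, 1000]:
    p_power = power_law_tail(x)
    p_gauss = norm.sf(x)  # P(Z > x) for a standard normal; underflows to 0 quickly
    print(f"size {x:>4}:  power law {p_power:.0e}   Gaussian {p_gauss:.0e}")
```

Running this shows the power-law tail shrinking only by a factor of ten per step, while the Gaussian tail is already astronomically small at size 10.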
[100]
So if you think about the times at which various disasters happen:
[105]
they happen irregularly, and occasionally one goes through the roof, and then another one, and you
[111]
can't, of course, tell when they are going to happen; that's random.
[114]
And you can't really tell how big they are going to be, except that they're going to be distributed in this way.
[119]
The real problem is that when something is bigger than any threshold that you imagine, well,
[125]
it's not just going to be a little bit taller,
[128]
it's going to be a whole lot taller.
[130]
So if we're going to see a war, for example, larger even than the
[132]
Second World War, we shouldn't expect it to kill just a million people more. We could expect it to kill tens of millions, most likely
[139]
hundreds of millions or thereabouts, and maybe even a billion people more,
[142]
which is a rather scary prospect.
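A small sketch of why the overshoot is so large, assuming a power law with exponent alpha = 1: given that a threshold is already exceeded, the chance of exceeding it by a factor k is k to the minus alpha, no matter how big the threshold is. The exponent and the WWII-scale figure below are illustrative assumptions, not numbers from the talk:

```python
# "Not just a little bit bigger": for a power law with exponent alpha, the
# conditional probability of exceeding k times a threshold t, given that t
# is already exceeded, is k**(-alpha) and does not depend on t at all.


def conditional_exceedance(k, alpha=1.0):
    """P(X > k*t | X > t) for a power law; note it does not depend on t."""
    return k ** (-alpha)


threshold = 70e6  # a rough WWII-scale death toll, used purely for illustration
for k in [2, 10, 100]:
    print(f"given a war worse than {threshold:.0e} deaths, "
          f"chance it is over {k}x worse: {conditional_exceedance(k):.0%}")
```

With alpha = 1 this prints 50%, 10%, and 1%: even conditional on exceeding an enormous threshold, multiples of that threshold remain quite likely.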
[144]
So the problem here is that disasters seem to have these heavy tails. A heavy tail
[150]
in probability slang means that the probability mass over here, the chance that something very large happens,
[157]
falls off very slowly.
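A quick simulation sketch of what a heavy tail means in practice, comparing a Pareto distribution with alpha = 1 to a standard normal; the sample size and exponent are arbitrary illustrative choices:

```python
# Heavy versus light tails: with the same number of draws, the largest normal
# sample stays within a few standard deviations, while the largest power-law
# (Pareto, alpha = 1) sample can be enormous.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

normal_max = rng.normal(size=n).max()
# numpy's pareto(a) draws X - 1 for a classical Pareto with x_min = 1,
# so adding 1 gives samples with P(X > x) = x**(-a) for x >= 1.
pareto_max = (rng.pareto(1.0, size=n) + 1.0).max()

print(f"largest of {n} normal draws: {normal_max:.1f}")   # typically around 4-5
print(f"largest of {n} Pareto draws: {pareto_max:.1f}")   # often tens of thousands or more
```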
[158]
And this is of course a big problem, because we tend to think in terms of normal distributions.
[164]
Normal distributions are nice; we say they're normal because a lot of the things in our everyday life are distributed like this.
[170]
The tallness of people, for example:
[173]
very rarely do we meet somebody who's a kilometer tall. However,
[177]
when we meet people and think about how much they're making or how much money they have,
[183]
well, Bill Gates: he is far, far richer than just ten times you and me, and
[190]
he's actually far out here.
[192]
So when we get to the land where we have these heavy tails, both the riches,
[198]
if we are talking about rich people, and the dangers, if we are talking about disasters, also
[202]
tend to be much bigger than we can normally think about.
[206]
(Off camera) Hmm, yes, definitely unintuitive.
[209]
Mmm, and the problem is of course that our intuitions are all shaped by what's going on here in the normal realm.
[215]
We have this experience of what has happened so far in our lives, and
[220]
once we venture out here and talk about very big events, our intuitions suddenly become very bad. We make mistakes,
[227]
we don't really understand the consequences, cognitive biases take over, and this can of course completely mess up our planning.
[234]
So we invest far too little in handling the really big disasters
[237]
and we're far too uninterested in going for the big wins in technology and science.