THE BLACK SWAN SUMMARY (BY NASSIM TALEB) - YouTube

Channel: The Swedish Investor

[0]
Takeaway number 1: The Black Swan problem
[3]
For an event to qualify as a Black Swan, it must fulfill the following:
[8]
1. It's an outlier
[11]
Nothing that has happened before can convincingly point to even the possibility of the event.
[17]
2. It carries an extreme impact
[20]
3. It becomes explainable only after the fact
[25]
Human nature fools us into believing that we should have been able to know it would happen all along.
[30]
Other examples of Black Swans are: the outbreak of WW1,
[36]
9/11, Black Monday and the Indian Ocean earthquake and tsunami.
[41]
The Black Swan problem is also called "the problem of induction".
[45]
The basic principle is that there are great uncertainties when trying to forecast the future,
[51]
given the knowledge of the past.
[53]
How can we expect to be able to figure out the properties of something that is infinite and
[58]
unknown, based on something that is finite and known?
[63]
Let's take another example, a textbook one.
[66]
Imagine that you're a turkey, living on a farm.
[69]
From the day that you were born, some friendly creatures with two legs and two arms have been feeding you.
[75]
Every day, they keep coming back with food!
[77]
Also, they built a nice fence which protected you from that hairy thing on four legs,
[82]
which looked like it wanted to rip your guts out.
[85]
They even bring female turkeys to you occasionally. What lovely creatures they seem to be!
[90]
With every passing day, as you are fed, protected and stimulated,
[96]
you become more and more certain that these must be the friendliest creatures on earth!
[100]
Until that day called Thanksgiving, when they're no longer so friendly anymore.
[106]
Thanksgiving definitely came as a Black Swan for this turkey. From the point of view of the turkey,
[112]
it was totally unexpected, had an extreme impact, and if he could have reasoned around the event after its occurrence,
[119]
he might even have found explanations for it.
[122]
From this, we can conclude that a Black Swan is a sucker's problem.
[126]
It's only a Black Swan if you're not informed. This goes for randomness in general. It's nothing other than a lack of knowledge.
[134]
For the turkey, ending up on the Thanksgiving table of some hungry human family was totally unexpected.
[141]
But the family had probably known the exact date of the event for months.
[146]
There's a cousin of the Black Swan which is called the Grey Swan. This concerns the "known unknowns" -
[152]
things that we know that we don't know.
[155]
We don't know how fast a man can run 100 meters, but we know that we don't know that!
[161]
A Black Swan, on the other hand, is an "unknown unknown".
[165]
We don't even know that we don't know that.
[168]
Imagine a book that you have never read, and that you haven't watched a summary of on my channel, of course.
[175]
That contains a lot of unknown unknowns. If you decide to read the book one day, at some point, you'll exclaim:
[181]
"Oh, I didn't know that!" but you can't foresee today what content you'll be exclaiming that about.
[192]
Takeaway number 2: The implications of Black Swan blindness
[196]
Nassim Taleb talks about five issues related to the Black Swan that emerge from our blindness to it.
[202]
I've discussed some of these previously, in my summary of Thinking Fast and Slow.
[208]
1. The error of confirmation. We, as humans, are prone to draw conclusions from what we've seen
[216]
to the unseen. For instance, I've heard a strange argument about whether graduating from college is a good idea.
[223]
It goes something like this:
[226]
"Well, Bill Gates, Thomas Edison, Richard Branson ... and many other billionaires are school dropouts!"
[233]
Let's exaggerate a bit. Even if we pretend that ALL billionaires are school dropouts, we can never, I repeat never,
[241]
conclude the opposite from that statement - that all school dropouts are billionaires.
[247]
That billionaires are school dropouts doesn't confirm that it's a good idea to drop out from school.
[252]
Yet, using it as an argument for leaving school is not uncommon, but the argument is flawed, at best.
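The flawed inference can be made concrete with a toy example. The names and set memberships below are invented purely for illustration:

```python
# Hypothetical sets, for illustration only.
billionaires = {"Gates", "Edison", "Branson"}
dropouts = {"Gates", "Edison", "Branson", "Alice", "Bob"}

# Even if "all billionaires are dropouts" were true ...
print(billionaires <= dropouts)   # True

# ... the converse, "all dropouts are billionaires", does not follow.
print(dropouts <= billionaires)   # False
```

A statement about a subset tells you nothing about the rest of the larger set.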
[260]
2. The narrative fallacy. Last year, a friend of mine and I went to Australia for a few weeks of vacation.
[267]
We went to this kinda remote place called Mission Beach, and the only reason for going there is basically skydiving and rafting.
[274]
We had our eyes set on skydiving, but not for long ...
[279]
A friendly fella at our hostel told us about an accident which had happened the week before. In a tandem skydive,
[285]
the parachute of the tandem jumpers and that of their cameraman had apparently twisted around each other during the jump, which caused all
[294]
three of them to fall to their deaths.
[297]
Me and my friend went rafting instead ... It didn't matter that when I asked my friend Siri,
[302]
he told me that I was very unlikely to die in a skydive.
[318]
Stories stick.
[320]
Statistics do not.
[322]
This is the narrative fallacy. Sorry for making you less likely to go skydiving.
[328]
3. We are not programmed for Black Swans
[332]
Humans are prone to believe in linear progression.
[335]
We think that a certain input will gradually result in a desired output.
[341]
Not so when Black Swans exist.
[344]
Imagine the author who's been spending many years writing books.
[348]
After 10 years of intensely hard work, she finally has her first book published, and it becomes a blockbuster.
[355]
That's a typical Black Swan event. Her effort to progression curve looked something like this. Not even remotely
[361]
resembling the linear progression which would have looked something like this.
[366]
Imagine how demoralizing the first nine years must have been for this author as all her friends
[371]
expected that her progress would look something like this, while it actually looked like this.
[378]
4. The distortion of silent evidence
[382]
History has a tendency of hiding Black Swans from us, by filtering the reality that we are presented with.
[388]
Consider the sailors of the 15th century who came back from their voyages to tell how they survived many storms by praying together.
[397]
Does this imply that praying together makes it less likely for your ship to sink?
[402]
Well, maybe, but we must also consider the rest of the sailors - those that didn't survive the voyages.
[410]
Did they pray? Or well, did they not?
[413]
We may never know, because they can't really tell us from the bottom of the ocean!
[419]
5. Tunneling
[421]
We tend to focus too much on what we know and shy away from what we don't know.
[427]
For instance, school teaches us many models on how to interpret reality.
[432]
Many of the inventors of these models were great thinkers, yet they couldn't think outside the box of their own models.
[440]
And sometimes, being too narrow and relying too much on these models can be devastating for the
[445]
interpretation of the real world, as we shall see in takeaway number four.
[452]
Takeaway number 3: Mediocristan vs Extremistan
[457]
Let's explore two new countries, shall we? Starting with Mediocristan.
[463]
From Gothenburg to
[467]
Mediocristan ...
[470]
Search!
[473]
Mediocristan is the land of the average. In Mediocristan, the first
[479]
100 observations of a variable will give you a good expectation of what you might see for the, say, next 1,000 observations.
[487]
The supreme law of this country is as follows:
[490]
"When your sample size is large, no single instance will significantly change the aggregate or the total."
[498]
For instance - weight, height,
[501]
mortality rates, car accidents and the salary of a college graduate (in Sweden) are all matters that belong to Mediocristan.
[511]
Consider the total height of the people in a sample of 100 Swedish males.
[515]
Let's pretend that their total height is 182 meters, using
[520]
182 centimeters per person, which is the height of the average Swedish male.
[525]
Now, let's add the world's tallest man in history to the sample - Robert Wadlow, who stood a staggering 272 centimeters tall.
[534]
The total height is now 184.7 meters, which is an increase of about 1.5 percent.
[541]
Not too radical, right?
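The arithmetic of this example can be checked in a few lines, using the figures above:

```python
# Mediocristan: even the most extreme single observation
# barely moves the aggregate.
n = 100
avg_height_cm = 182              # average Swedish male, per the example
total_cm = n * avg_height_cm     # 18,200 cm = 182 m

wadlow_cm = 272                  # Robert Wadlow, tallest man in history
new_total_cm = total_cm + wadlow_cm

print(new_total_cm / 100)        # 184.72 meters
print(wadlow_cm / total_cm)      # ~0.0149, i.e. about 1.5%
```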
[544]
Now, let's buy a last-minute ticket and travel to Extremistan instead.
[552]
Extremistan.
[553]
In Extremistan, we introduce Black Swans.
[557]
This makes it so that the first 100 observations might not give much information about the next 1,000 observations at all.
[565]
Remember the life of the turkey?
[567]
His first 100 days of being fed and protected by humans couldn't really tell him that he was about to become Thanksgiving dinner on the 101st day.
[577]
Most human-made or social matters belong to Extremistan - such as wealth, income,
[583]
book sales per author,
[585]
number of subs per youtuber, deaths in war,
[589]
sizes of planets, and, of course, financial markets.
[594]
Consider the total wealth of the people in a sample of 100 Swedes.
[598]
Let's pretend that their total wealth is $19 million, using $190,000 per person,
[605]
which is the wealth of the average Swede.
[608]
Now let's add Warren Buffett to the sample, who currently has a net worth of $84.2 billion.
[615]
The total wealth is now $84.22 billion,
[619]
which is an increase of approximately 443,100%.
[625]
That's quite the difference.
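Running the same check for Extremistan shows how a single observation dominates the whole sample (figures as above):

```python
# Extremistan: a single observation can dwarf the whole sample.
n = 100
avg_wealth = 190_000             # average Swede, per the example
total = n * avg_wealth           # $19 million

buffett = 84_200_000_000         # net worth used in the example
new_total = total + buffett

print(new_total)                 # 84_219_000_000, i.e. ~$84.22 billion
print(buffett / total)           # ~4431.6, an increase of roughly 443,000%
```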
[628]
The implications are as follows: In Mediocristan,
[632]
we can safely make some predictions. In Extremistan, it's much more difficult, maybe even impossible.
[639]
Problems arise because we seem to think that we are living in Mediocristan,
[643]
while almost all of the matters that we're trying to forecast are matters of Extremistan.
[648]
Sometimes we base important functions in our society around these predictions, such as the banking system, and that leads to economic
[655]
crises, such as the financial crisis of 2007 to 2008.
[663]
Takeaway number 4: Gaussian Schmaussian!
[668]
The bell curve, or the normal distribution as we refer to it in school, or the
[673]
Gaussian curve, as
[674]
we refer to it when we think that we're honoring its original inventor, Carl Friedrich Gauss, is a very common tool used for risk
[682]
management among regulators and central bankers, among others.
[686]
Let's talk about what this curve is first and foremost.
[689]
Its fundamental property is that data from a given random variable, say height, will hover around the average.
[697]
For instance, take the example of the Swedish males presented previously. The average height is
[703]
182 centimeters. If you look at the likelihood that someone is taller than, say
[708]
189 centimeters, you may see that only 1 in 6.3 males is.
[715]
Taller than 196 centimeters?
[718]
1 in 44.
[720]
Taller than 203 centimeters?
[722]
1 in 740.
[725]
What's important to notice here is not the actual numbers,
[729]
but that the same incremental increase in our random variable
[732]
(i.e. height), leads to an ever faster decline in its number of observations.
[738]
You can say that the variable is experiencing an ever-increasing headwind whenever it tries to deviate from the mean.
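The odds quoted above are consistent with a mean of 182 cm and an implied standard deviation of about 7 cm - a figure inferred from the numbers, not stated in the video. They can be reproduced with the standard normal tail:

```python
import math

def normal_tail(z):
    """P(Z > z) for a standard normal variable."""
    return 0.5 * math.erfc(z / math.sqrt(2))

mean, sd = 182, 7   # sd inferred from the odds quoted above
for height_cm in (189, 196, 203):
    z = (height_cm - mean) / sd
    print(f"taller than {height_cm} cm: 1 in {1 / normal_tail(z):.1f}")
```

This prints roughly 1 in 6.3, 1 in 44 and 1 in 741 - each extra 7 cm makes the odds collapse faster than the step before.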
[746]
Now, the normal distribution is awesome when it's applied under the right conditions, which is in Mediocristan.
[753]
Then, our previous observations can tell a whole lot about potential future ones.
[759]
It does not work with variables from Extremistan though.
[763]
For example, if you had modeled daily stock market returns as normally distributed, and you were faced with the Black Monday of
[770]
1987, where the market crashed by 22.6% in a single day, you would have to call this event an outlier.
[779]
Because, according to the normal distribution, it should only happen once in several billion lifetimes.
[786]
Many investors went bankrupt because they didn't even consider the possibility of something like this happening.
[792]
It was a Black Swan to them.
[795]
So, we see that a normal distribution can be limited, nay,
[799]
dangerous to use for decision-making under circumstances when we deal with matters from Extremistan.
[804]
And remember, this is pretty much all human-made social matters!
[808]
Then the question becomes, what can we do instead?
[813]
We can use something Nassim Taleb calls Mandelbrotian randomness instead of pretending that everything is normally distributed.
[821]
Mandelbrotian randomness does not assume that deviations from the mean become increasingly difficult.
[827]
Instead, it suggests that, for instance, if we talk about losses in a single day in the stock market,
[832]
it's just as rare to see a day resulting in a return of -5% instead of -2.5% as
[839]
it is to see -10% rather than -5%.
[843]
Going from
[845]
-1.25 to -2.5 to -5 to -10 to
[850]
-20 are all the same. In terms of probability,
[854]
I mean. If we had assumed this on the Friday before Black Monday,
[858]
we would at least have considered a -22.6% decline a
[863]
possibility, and we might have been able to protect ourselves from such an event.
[868]
By assuming Mandelbrotian randomness instead of the normal distribution, we can turn some Black Swans into Grey Swans.
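The scalable rule described above can be sketched numerically. The baseline probability here is an arbitrary assumption for illustration, not a market estimate:

```python
# Mandelbrotian (power-law-style) tail: doubling the size of the loss
# halves its probability, no matter how far out you already are.
p = 0.10          # assumed P(daily return worse than -1.25%)
loss = 1.25
while loss <= 20:
    print(f"P(return < -{loss:g}%) = {p:g}")
    loss *= 2
    p /= 2
```

Each step out is equally likely relative to the previous one, unlike the Gaussian, where every step becomes dramatically less likely than the last.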
[876]
Grey Swans are known unknowns as we discussed in the first takeaway.
[881]
They are better than Black Swans because at least we can adapt our decision-making to something that we know that we don't know about.
[892]
Takeaway number 5: How to act as an investor in an environment of Black Swans
[898]
And now to the most interesting part.
[900]
In a world dominated by Black Swans, where we fool ourselves with the error of
[905]
confirmation, the narrative fallacy and tunnelling and where we must be very careful when using the Platonic normal distribution for anything useful,
[913]
what should we do?
[917]
Nassim Taleb suggests two different approaches:
[921]
1. The hyper-conservative and hyper-aggressive approach
[926]
Don't put your money in some medium-risk investments, because let's face it - how do we know that it's medium risk anyway?
[932]
Did some "expert" compute that using a normal distribution perhaps?
[937]
Instead, put a majority of your money in something
[940]
extremely safe, like Treasury bills. These aren't hedged against Black Swans either,
[945]
but if you lose your money in Treasury bills, you'll have bigger problems than just losing your investment capital ...
[958]
The rest of the money should be put in something extremely speculative, like options or angel investments.
[965]
With this type of portfolio, you are limited in your risk because of your hyper-conservative investments,
[971]
but you are also exposed to the possibility of hitting a positive Black Swan with your hyper-aggressive ones.
[978]
Nassim Taleb refers to this as a convex combination.
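A minimal sketch of this barbell-shaped payoff, with made-up allocation and return figures just to show its shape:

```python
def barbell(capital, safe_frac=0.90, safe_return=0.02, spec_return=0.0):
    """Value after one period: the safe part earns a small yield,
    the speculative part can go to zero or multiply many times over.
    All figures are illustrative assumptions."""
    safe = capital * safe_frac * (1 + safe_return)
    spec = capital * (1 - safe_frac) * (1 + spec_return)
    return safe + spec

# Worst case: every speculative bet goes to zero -> loss capped near 8%.
print(round(barbell(100, spec_return=-1.0), 2))   # 91.8
# Positive Black Swan: the speculative sleeve returns 50x.
print(round(barbell(100, spec_return=49.0), 2))   # 591.8
```

The downside is bounded by construction, while the upside stays open-ended - that asymmetry is the convexity.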
[983]
2. The speculative, insured portfolio
[987]
The second option is to have a very speculative portfolio, but to insure it against losses that are greater than, for example, 15%.
[995]
This might not always be possible though, depending on what your portfolio consists of.
[1000]
Instead of using an actual insurance company,
[1002]
you can create this effect yourself by putting up stop losses at minus 15% and
[1008]
taking multiple bets with small parts of your equity.
[1011]
This strategy is also convex. Your risk is limited, but your upside is exposed to positive Black Swans.
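A sketch of that do-it-yourself insurance; the bet outcomes below are invented for illustration, and real stop-losses can slip past their level in a fast market:

```python
def realized(r, stop=-0.15):
    """Return of one bet when losses are cut at the stop level."""
    return max(r, stop)

bets = [-0.60, -0.30, 0.05, 2.40]   # hypothetical raw outcomes
stake = 25                          # equal small parts of 100 in equity
final = sum(stake * (1 + realized(r)) for r in bets)
print(round(final, 2))              # crashes truncated, big wins kept
```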
[1021]
This is by far the longest video I've made to date, so I think a summary is in order.
[1027]
Humans love to predict, but Black Swans make the future a little less certain than we'd like to think.
[1035]
From Black Swan blindness, many other themes arise, like the error of confirmation, the narrative fallacy, silent evidence and
[1043]
tunneling.
[1044]
We tend to think that we live in Mediocristan,
[1047]
but most of the matters in our society belong to Extremistan. We must treat them accordingly.
[1054]
Using Mandelbrotian randomness instead of the normal distribution can help us turn some Black Swans into grey ones.
[1063]
In investing, Nassim Taleb suggests that we should expose ourselves to the possibility of
[1068]
positive Black Swans and limit our risk by decreasing our exposure to negative ones.
[1075]
Well, that's it for this time, I hope to see you again soon!