I’ve just finished the main part (without the postscript essays) of the famous and oft-discussed book, once a best seller: The Black Swan. The author is knowledgeable, and the book is insightful and well-crafted, with his unique style of discussing serious topics through occasional anecdotes and vivid storytelling. It was a fantastic ride.
Human Thinking Fallacy
Humans tend to think and live in Mediocristan, where probabilities follow something like a normal distribution - and that’s where most familiar quantities live, like human height and weight.
Black Swan events are the ones people can barely predict and often grossly overlook. Examples include the 9/11 attacks, the 2008 stock market crash, and so on.
But many other quantities are best described by a power-law distribution. That is Extremistan, where extreme cases dominate - think of human wealth.
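To make the contrast concrete, here is a minimal sketch - my own illustration, not anything from the book - that samples a Mediocristan-like quantity (height, normal distribution) and an Extremistan-like one (wealth, Pareto/power-law distribution), then checks how much the single largest observation contributes to the total. All parameters are made-up assumptions.

```python
# Minimal sketch (not from the book): Mediocristan vs. Extremistan.
# All numbers below are illustrative assumptions, not calibrated to real data.
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000

# Mediocristan: heights in cm, roughly normal.
heights = rng.normal(loc=170, scale=10, size=n)

# Extremistan: wealth following a heavy-tailed Pareto distribution.
# a ≈ 1.16 gives an 80/20-like tail; scale of 10,000 is arbitrary.
wealth = (rng.pareto(a=1.16, size=n) + 1) * 10_000

for name, sample in [("height", heights), ("wealth", wealth)]:
    share_of_max = sample.max() / sample.sum()
    print(f"{name:>6}: largest single observation is "
          f"{share_of_max:.6%} of the total")

# Typical result: the tallest person contributes a negligible share of total
# height, while the richest person can hold a sizable share of total wealth.
```

In Mediocristan no single observation moves the aggregate; in Extremistan one observation can dominate it, which is exactly the environment where Black Swans live.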
It’s human nature to draw conclusions, find correlations, and assume that everything stays close to what we have observed and that extreme cases are extremely unlikely. That’s the basic recipe for being blindsided by a Black Swan.
Think of a turkey well-fed by its owner. It quickly concludes that the owner is a friend, until the day before Thanksgiving. The author advises in the book: don’t be a turkey.
The author discussed a few cognitive biases we’re vulnerable to:
- Confirmation Bias: People seek validation and reinforce their existing beliefs.
- Narrative Fallacy: People tend to look for causes, because stories are much easier to digest when they come with causes and reasons. Jumping to conclusions is part of our natural tendency.
- The Antechamber of Hope: Success in certain careers requires an extraordinary amount of input and long, lonely hours of waiting for hope. Many people don’t realize that, not even the pursuers of these careers themselves.
- Survivorship Bias: People tend to look at survivors and success stories while overlooking the failures, and thus misjudge the true probabilities (see the sketch below).
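Survivorship bias is easy to see with a quick simulation. Below is a minimal sketch, again my own illustration with made-up numbers rather than anything from the book: most ventures fail, but if we average only the survivors we hear about, the expected payoff looks far better than it really is.

```python
# Minimal sketch of survivorship bias (illustrative assumptions only):
# compare the average outcome of ALL ventures with the average outcome of
# the survivors we actually get to observe.
import numpy as np

rng = np.random.default_rng(0)
n_ventures = 100_000

# Each venture fails with 90% probability (payoff 0); otherwise it returns
# a lognormally distributed payoff on a 1.0 initial stake.
survived = rng.random(n_ventures) < 0.10
payoffs = np.where(
    survived,
    rng.lognormal(mean=1.0, sigma=1.0, size=n_ventures),
    0.0,
)

print(f"average payoff over ALL ventures:   {payoffs.mean():.2f}")
print(f"average payoff over SURVIVORS only: {payoffs[survived].mean():.2f}")

# Averaging only the survivors makes the venture look much better than it
# really is - the graveyard of failures is invisible.
```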
How We Are Bad at Prediction
Human beings are particularly bad at making predictions. One phenomenon: the more information we have, the more confident we become, yet no more accurate. It’s called the “toxicity of information,” where noise is mistaken for signal.
The author argues that human technological advances are particularly unpredictable: “if you expect to expect something tomorrow, you should expect it today.” If we understood a new technology well enough right now to predict it in detail, we would already know how it works, and in effect already have it today.
In the book, the author slams the so-called economists, social scientists, and the like who build complicated mathematical models and beautiful charts to “forecast” economic trends, the stock market, and so on, without accounting for the role chance plays in the outcomes. That leaves them utterly vulnerable to Black Swans.
The author points out, however, that we should not try to predict Black Swans. Instead, we should build robustness against negative Black Swans and position ourselves to catch positive ones.
Gray Swans of Extremistan
In the final part of the book, the author argues that the foundation of Black Swans is the power-law distribution. It shows up everywhere: economies, companies, national power, wherever winners take all. This has several implications:
- Nobody is safe in Extremistan, but nobody is threatened with total extinction either.
- More concentrated power means more devastating collapse, too.
- There are always ways to soften Extremistan, e.g., taxes to redistribute wealth or religion to bind people together. But Extremistan is here to stay.
- Black Swans are always going to happen. We can make them grayer by approaching them with the right attitude.
Many book reviews have already gone through what readers dislike about the author’s arrogant tone in this book and his dismissal of all social science as pseudoscience. The author also loves to paint himself as a lone wise oracle shunned by ordinary people, but that’s not quite true: many others hold similar or closely related ideas about impending dangers and what we should do about them.
Nevertheless, the ideas in the book are still worth reading and paying close attention to, especially in a world changing as fast as today’s.
One of the best examples might be the coronavirus sweeping across the world right now, as I sit in my own house, unable to visit the restaurants and coffee shops I love. In retrospect, when the news first broke, I never expected it to have such a drastic impact. Many people, myself included, along with popular news anchors, technologists, the President of the US, and so many more on social media, regarded the virus as “something just like the flu” that would “just go away when the season passes.” Media outlets today love to dig up those old comments (especially outlets with a different political agenda) and use them to mock how ignorant and short-sighted the speakers were, even though the outlets themselves are not so innocent. I see this more as a common flaw in human prediction, just as the book describes: as humans, we’re particularly bad at predictions.
There are also voices pointing out that it didn’t need to be a Black Swan. Nassim Taleb, the author of this book, said in a recent interview that the coronavirus shouldn’t have been a Black Swan to governments, medical professionals, and epidemiologists who have dealt with situations like this before. He was not alone: Bill Gates once warned us about the dangers of a looming pandemic. We didn’t take the advice seriously, and the pandemic still arrived as a Black Swan to the rest of us.
Now, instead of engaging in bitter political bickering, it would be wiser to treat this as a lesson for all of humanity and work together to make the next Black Swan grayer.