The Black Swan

I’ve been offline for a while as I needed a bit of a mental break in the midst of a cross-country move and settling in at home. But now that I’ve picked up more work to sustain me, I’ve had a bit more chance to read and have decided to record my notes in blog form.

Recently, a friend referred me to OverDrive, a website that helps you find e-books available from different libraries. I managed to find a book that’s been on my list for about six years now – The Black Swan by Nassim Nicholas Taleb. No, I’m not considering becoming a dancer; I have enough mental health issues to deal with. It’s all about the impact of the highly improbable on life.

After a few chapters, I realized I wouldn’t wish reading it upon even my worst enemies. The author’s style is somewhat endearing at first, but as the chapters wear on it starts to feel rather arrogant and pedantic. I’ve begun skipping around because it’s just too tiring to actually read every meandering word.

Some main ideas:

  1. The Black Swan is meant as the prototypical example of Hume’s problem of induction. Basically, Europeans thought all swans were white until black swans were discovered in Australia. In practical matters (history, financial markets), you often don’t realize that x is really possible (dot-com bust, 9/11) until it happens. History is often shaped by these seemingly random events that couldn’t be fully predicted (think World War I).
  2. Don’t trust people who think they can predict history or finance 25 years into the future; they can barely predict the next 6 months. (Personal note: keep this in mind for things like Obamacare / the national debt.)
  3. We’re always constructing narratives, because that’s simply how we remember things. We come up with explanations ex post facto, looking at history and pretending that we can find reasons for what occurred even if it was actually random or hard to understand. We try to fit things into our self-constructed patterns, matching them to a Platonic ideal – our map of the world – instead of recognizing the messy reality. (Example: the French thought Hitler would be out of power quickly, which is why they didn’t react faster. It’s only afterwards that we call the French stupid and easily conquerable for not realizing it.)
  4. Remember the difference between “there is no evidence of cancer” and “there is evidence of no cancer.” Just because there is no evidence that black swans exist doesn’t mean there is evidence that they don’t exist; the absence of evidence for something is not evidence of its absence.
  5. Interesting idea: maybe capitalism works because it allows people to be wildly unpredictable and try totally unusual things.
  6. Sometimes the more you know, the worse off you are. As you develop theories, you construct more bad ideas, and then fit future information into those bad theories. (In one psychology experiment, subjects watched an initially blurry image slowly come into focus. Given the same total viewing time, those shown fewer intermediate frames identified the image faster – the ones who saw more frames formed early hypotheses and kept fitting the new detail into them.)
  7. We are generally bad at predicting risk, because the riskiest things are the least predictable. Don’t trust the “suits” – financial advisors, political consultants, etc.
  8. The Ludic Fallacy: studying probabilities via games is misleading, because theirs is a sterilized, domesticated uncertainty, whereas in the real world you need to discover both the odds and your areas of uncertainty. Example: a casino spends most of its risk-management effort on security and its clientele, but those numbers are highly predictable. Its four biggest potential financial crises came from completely unexpected places: a tiger maiming its beloved trainer (they’d only planned for the tiger attacking an audience member), an injured construction worker trying to blow the casino up, an employee failing to properly file tax paperwork, and the owner’s daughter being kidnapped. The worst risks are not computable, because they are hard to foresee and therefore impossible to model.
  9. A big problem: most risk assessments don’t attach a reasonable error rate to their estimates. The error rate is often larger than the projection itself! Uncertainty is not found in bell curves; it is found in irregularities. We should expect deviations from the norm. It is like the turkey who, after being regularly fed day in and day out for a year, is stunned by his death on Thanksgiving. The regular past doesn’t always give a clue to the end of the story. (A quick simulation sketch after this list makes the turkey’s problem concrete.)
  10. Hayek’s Nobel Prize acceptance speech was “The Pretence of Knowledge” – is this all just a variation on Socrates’s wise realization, “I know that I know nothing”?
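
Since item 9 is really a statistical claim, here’s a minimal Python sketch of the turkey’s problem. This is my own illustration, not from the book – the feeding numbers and the four-sigma “model” are made-up assumptions:

```python
import statistics

# Hypothetical feeding data: days 1..1000, generous and steadily growing.
# Every day looks like more evidence that tomorrow will be like today.
feed = [100 + 0.01 * day for day in range(1, 1001)]

mean = statistics.mean(feed)
stdev = statistics.stdev(feed)

# Naive "risk model": tomorrow should land within four standard
# deviations of the historical average.
low, high = mean - 4 * stdev, mean + 4 * stdev
print(f"Model for day 1001: {low:.1f} to {high:.1f} grams of feed")

# Day 1001 is Thanksgiving. Observed feed: zero (and then some).
day_1001 = 0
print(f"Actual day 1001: {day_1001} -> outside the model entirely")
```

Every additional day of data makes the naive model more confident, yet none of it carries any information about Thanksgiving; the forecast’s error dwarfs the forecast itself.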

I’ll add more notes as I keep reading – or rather, skimming.