Kevin Stevens
The Black Swan

The Black Swan: a single unexpected, high-impact event that invalidates a generally accepted belief

We will make forecast errors; what we cannot afford is to be unaware that we will make them.

We can set ourselves up to take advantage of positive Black Swans by maximizing our exposure to them

The most important thing for investors is to create processes that allow them to recognize Black Swans when they present themselves.

PN: We tend to learn the precise, not the general.

E: Humans struggle to learn rules; we learn facts. Metarules (such as the rule that we tend not to learn rules) we don't seem to get at all.

Almost everything in social life is produced by rare, consequential jumps, yet all of our rules focus on the "normal." The bell curve ignores large deviations while making us confident that we understand uncertainty.

The Map Is Not the Territory (Platonicity): we mistake the map for the territory and focus on the well-defined.

We don't know where the map was wrong until after the fact

The mistakes often lead to severe consequences

When the gap between what you think you know (the map) and what you don't know (reality) becomes wide, you are more prone to Black Swans

Platonic fold - where our representation of reality (our map) ceases to apply, but we are unaware of it

E: Ideas come and go, stories stay.

Our minds are great at remembering stories, and thus we have the tendency to create narratives after the fact to explain things to ourselves.

The more intelligent we are, the better our explanations and stories tend to be.

As we progress and our knowledge grows, we become more susceptible to the unknown, and our futures become less predictable because the shocks lie beyond our ever-increasing knowledge.

The Triplet of Opacity

  • the illusion of understanding
  • the retrospective distortion - making sense of things after the fact
  • the overvaluation of factual information and the handicap of authoritative and learned people

Mediocristan: particular events don't contribute much individually, only collectively - no single instance will significantly change the aggregate or total

Type 1 randomness, the tyranny of the collective

Extremistan: inequalities are such that one single observation can disproportionately impact the aggregate or the total - venture capital and wealth

Type 2 randomness, the tyranny of the singular
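
The two regimes can be sketched with a quick simulation (not from the book; the distributions and parameters are illustrative): in a Gaussian sample such as human height, the single largest observation is a rounding error in the total, while in a Pareto sample such as wealth it can dominate on its own.

```python
# Illustrative sketch: Mediocristan (Gaussian heights) vs
# Extremistan (Pareto-distributed wealth). All numbers are hypothetical.
import random

random.seed(42)
N = 100_000

# Mediocristan: heights in cm, roughly Gaussian around 170.
heights = [random.gauss(170, 10) for _ in range(N)]

# Extremistan: wealth from a Pareto distribution; alpha = 1.16 is the
# tail exponent often associated with the 80/20 rule.
wealth = [random.paretovariate(1.16) for _ in range(N)]

def top_share(xs):
    """Fraction of the total contributed by the single largest value."""
    return max(xs) / sum(xs)

print(f"tallest person's share of total height: {top_share(heights):.6%}")
print(f"richest person's share of total wealth: {top_share(wealth):.2%}")
# The tallest person contributes roughly 1/N of total height; the richest
# person can contribute several percent of total wealth by themselves.
```

This is the tyranny of the collective versus the tyranny of the singular: in the first list no observation changes the aggregate, in the second one observation can.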

Everything works until it doesn’t; and what we learn in the past from what does work can be misleading for the future

Don’t mistake observations of the past for predictions of the future

Black Swans can occur instantaneously (9/11, earthquakes) or unfold over time (technological revolutions)

Themes that arise from our lack of awareness of the Black Swan:

We focus on the seen and generalize it to the unseen - the error of confirmation

We fool ourselves with narratives

We behave as though Black Swans don't exist; it's hard for humans to grasp the unknown

The distortion of silent evidence

We focus on well-defined Black Swans at the expense of ones that don’t easily come to mind

Our reactions and intuition depend on the context in which evidence is presented

We naturally look for evidence to confirm our story and our vision; such evidence is easy to find.

This is why it’s important to not come to a conclusion too early in #venture #investing

A series of related facts is not necessarily evidence - the confirmation of white swans doesn’t confirm the non-existence of black swans.

We get closer to the truth by negative instances, not by verification.

Sometimes a lot of data is meaningless, at other times a single data point can be very meaningful. A thousand days cannot prove you right, but one day can prove you wrong.
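
The asymmetry can be made concrete with Taleb's turkey example (the numbers here are hypothetical): a thousand feedings "confirm" the turkey's theory that the farmer is benevolent, and one Thanksgiving falsifies it.

```python
# Sketch of the turkey problem: confirmation accumulates for 1,000 days,
# falsification takes a single observation on day 1,001.
observations = ["fed"] * 1000 + ["slaughtered"]

confirming_days = sum(1 for o in observations if o == "fed")
falsified = any(o == "slaughtered" for o in observations)

print(f"confirming observations: {confirming_days}")  # 1000
print(f"hypothesis survives: {not falsified}")        # False
```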

E: True self-confidence is the ability to look at the world without needing to find signs that you are right, and thus to avoid confirmation bias.

E: The narrative fallacy is our vulnerability to overinterpretation, and our preference for narratives over raw, complicated truths.

We have a limited ability to look at facts without trying to weave them into an explanation. Explanations bind facts together, and make them easier to understand and remember. The problem is when the understanding is wrong.

It’s impossible for our brain to see anything in its raw form without trying to interpret it.

PN: The central problem of probability and information theory is that information is costly to obtain, store, and retrieve. The more orderly, less random, and narrativized the information, the easier these problems are to solve.

The more random the information, the greater the dimensionality, and thus the more difficult to summarize. The more you summarize, the more order you put in, the less random the data in your mind and the more prone you are to be surprised by what you do not see.

People who work in randomness-laden professions (#VC is one) are more likely to suffer burnout from constantly second-guessing their past actions. Thus, keeping a diary is the most important thing you can do in these circumstances.

E: People would rather be wrong with infinite precision than approximately right.

After a Black Swan like 9/11, people expect it to recur with a higher frequency, when in fact the odds are arguably lowered by our new awareness of the possibility.

Business bets in which one wins big but infrequently, yet loses small but frequently, are worth making as long as you have the personal and intellectual stamina to stick with them.

E: To understand success, we need to understand the traits present in failures.

E: Bias has a vicious trait, it hides best when its impact is largest.

Confirmation fallacy - people are great at telling you what they did, not what they didn’t do

Evolution is a series of flukes, some good, many bad. You only see the good, but in the short term it’s not obvious which ones are good.

E: The reference point argument: do not compute odds from the vantage point of the winning gambler, but from all of those who started in the cohort.
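
A minimal sketch of the reference point argument, with made-up parameters: simulate a cohort of gamblers who each bet with a small negative edge. Interviewing only the winners suggests the game is profitable; averaging over the full starting cohort shows the expected loss.

```python
# Hypothetical cohort simulation: survivorship bias in computing odds.
import random

random.seed(0)

def play(rounds=100, p_win=0.45):
    """Bet $1 per round with a losing edge, starting from $50."""
    bankroll = 50
    for _ in range(rounds):
        bankroll += 1 if random.random() < p_win else -1
        if bankroll <= 0:  # ruined gamblers stop playing
            break
    return bankroll

cohort = [play() for _ in range(10_000)]
winners = [b for b in cohort if b > 50]

# Winners' vantage point: everyone you interview made money.
print(f"winners interviewed: {len(winners)} (all profitable by construction)")

# Full cohort's vantage point: the expected outcome is a loss.
avg = sum(cohort) / len(cohort)
print(f"average final bankroll across the whole cohort: ${avg:.2f} (started at $50)")
```

The winners are real, but they are the wrong reference class for estimating the odds you faced when you started.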

We think it is more intelligent to say "because" than to accept randomness. But we should ensure our "because" is derived from experiments, not from backward-looking history.

The attributes of uncertainty that we face in real life have little connection to the sterilized ones we encounter in exams and games.

In real life you do not know the odds, you have to discover them and the sources of uncertainty are not defined.

We have Black Swans and do not learn from them, because the ones that do not happen are abstract.

The gains in our ability to model the future are dwarfed by the increases in its complexity - implying a greater role for the unpredictable.

Our increase in knowledge creates an increase in confusion and ignorance - we think we know more than we do.

We overestimate what we know, and underestimate uncertainty, by compressing the range of possible uncertain states.

Once we produce a theory, we are not likely to change our minds - so those who delay their theories are better off. When you develop opinions based on weak evidence, it becomes harder to accept and interpret new information that contradicts your theory.

Things that move (finance, politics, etc.), and therefore require knowledge of the future, do not have experts, while things that don't move (physics, math, accounting) do have some experts.

Things that move are prone to black swans.

E: The same processes that make you satisfied with your knowledge also make you more prone to knowing less.

E: It is easy to predict the ordinary, but not the irregular.

When we become experts, we attribute our success to our knowledge and our failures to randomness.

Forecasting without an error rate raises three issues:

  • Variability matters - decisions depend on the range of possible outcomes, not just the expected final number
  • Degradation - our predictions get worse the farther out they are
  • Variables are random

We forget about unpredictability when it is our turn to predict.

It has traditionally been more profitable to herd together in the wrong direction than to stand alone in the right one.

When we think of tomorrow, we project it as another yesterday.

E: Randomness is unknowledge. The world is opaque and fools us.

E: Rank your assumptions by the harm they may cause, not according to their predictability.

Many rare events are not easy to compute probabilistically, but we can get a general idea about the possibility of their occurrence. (COVID-19 is definitely this.)

The bell curve makes probabilities drop at an ever-faster rate as you move away from the mean; as a result, it cannot measure the impact of outliers the way a power law can.
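
The contrast in tail behavior can be checked directly (the Pareto exponent alpha = 2 and unit scale are illustrative choices, not from the book): the Gaussian tail collapses super-exponentially with distance from the mean, while the power-law tail thins out polynomially and keeps large deviations in play.

```python
# Compare tail probabilities P(X > k) for a standard normal vs a
# Pareto distribution with minimum 1 and tail exponent alpha (assumed 2).
import math

def gaussian_tail(k):
    """P(X > k) for a standard normal, via the complementary error function."""
    return 0.5 * math.erfc(k / math.sqrt(2))

def pareto_tail(k, alpha=2.0):
    """P(X > k) for a Pareto with minimum 1 and tail exponent alpha."""
    return k ** -alpha if k >= 1 else 1.0

for k in (2, 4, 10, 20):
    print(f"k={k:>2}  gaussian: {gaussian_tail(k):.2e}  pareto: {pareto_tail(k):.2e}")
# Under the bell curve a 20-sigma event is effectively impossible;
# under the power law it remains an ordinary, if rare, occurrence.
```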