Investors lucky enough to secure a spot in one of Bernie Madoff's funds had it pretty good. Year after year, Madoff produced remarkably consistent positive returns. By all accounts, their volatility was quite low. Madoff's investors must have slept soundly at night, knowing that the best models the industry had to offer showed only a very small chance that they would ever lose any of their wealth. Then one morning they woke up and realized they had just lost it all. This was a black swan: a sudden event that was totally outside their experience and that seemed to violate the mental model they used to understand the world.
The Origin of the "Black Swan"
Let's go back to the example given in the article on volatility, and imagine that you are standing in your driveway recording the number of cars that go by during each fifteen-minute interval of the working day. Suppose you have done this over many days during the same four-hour stretch, and each day you get a data series that is tightly distributed around 8 cars per fifteen minutes. Some days it might look like "8, 7, 9, 10, 6, 10, 10, 6, 6, 8, 7, 9", other days like "7, 9, 6, 10, 8, 9, 10, 6, 7", but over time you become increasingly confident in a model of the world that says the number of cars passing your house will average 8, with about an 80% chance of falling between 6 and 10, and very occasionally much less or more.
Then one day you head out to the driveway and see no cars within the first fifteen minutes. This is a bit strange, but even under the model you have accepted, it occurs about 0.5% of the time, so you do not think much of it. Another fifteen minutes passes; still no cars. After the first two hours, still no cars, and this is now roughly a fifteen-standard-deviation event. Your model says that this kind of event should occur once every 10 million years or so.
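The arithmetic behind that fifteen-standard-deviation figure can be sketched in a few lines. This is a rough illustration, not a precise model: it assumes per-interval counts are approximately normal with mean 8 and a standard deviation chosen so that about 80% of intervals fall between 6 and 10, and that intervals are independent.

```python
import math

# Assumed model (illustrative): cars per 15-minute interval are roughly
# normal with mean 8, and ~80% of observations fall between 6 and 10.
mean_per_interval = 8.0

# An 80% central interval spans about +/-1.28 standard deviations,
# so sigma = 2 / 1.28 ~= 1.56 cars.
sigma_per_interval = 2.0 / 1.28

# Two hours = 8 independent fifteen-minute intervals, so the total
# count has mean 8 * 8 = 64 and standard deviation sigma * sqrt(8).
n_intervals = 8
expected_total = mean_per_interval * n_intervals           # 64 cars
sigma_total = sigma_per_interval * math.sqrt(n_intervals)  # ~4.4 cars

# Observing zero cars over the whole two hours:
z_score = (expected_total - 0.0) / sigma_total
print(f"z-score of zero cars in two hours: {z_score:.1f}")  # -> 14.5
```

Under these assumptions, a two-hour stretch with no cars sits about 14.5 standard deviations below the expected total, which is the kind of event the model says should essentially never happen.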
What happened? Your model did not incorporate the fact that the world might change. Your road had been closed, and the cars took a detour instead. You had a black swan event. Your model of how the world operates under "normal" conditions suddenly became useless.
Even the most sophisticated model we can possibly build still leaves open the gaping hole that the future might be nothing like the past, or at least nothing like the model we are currently using to explain the past. This kind of paradigm-shattering event was dubbed a "black swan" by famous author, philosopher, and occasional trader Nassim Taleb, who wrote a book about them (all of his books are gems; read them).
The "black swan" name comes with an accompanying story. For many years, Europeans defined swans as white creatures, because that is all that they knew. If a European had to estimate the probability that the next swan he would see would be white, he would have said 100%, because in the mental model that he used to understand the world around him, 100% of all swans were white.
Then Europeans reached Australia and discovered that not all swans were white. The sighting of the first black swan was itself a black swan event.
How do you rationally determine a "worst-case scenario"?
The dangers of over-reliance on the past can be illustrated with a very simple example. Imagine that you are thinking about investing in a portfolio of US stocks and you want to know your worst-case scenario. In other words, if things really get ugly, what is the maximum loss you could be looking at?
One approach might be to look at the very worst historical performance the US stock market has ever seen. That happens to have occurred during the Great Depression, when US stocks lost nearly 90% of their value from peak to trough.
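The "look at the worst historical loss" approach amounts to computing the maximum peak-to-trough drawdown in whatever record you happen to have. Here is a minimal sketch using an entirely made-up series of annual portfolio values (the numbers are illustrative, not real market data):

```python
# Hypothetical annual portfolio values -- illustrative data only.
values = [100, 112, 125, 118, 70, 45, 52, 80, 95, 110]

# Maximum drawdown: the largest percentage decline from any prior peak.
peak = values[0]
max_drawdown = 0.0
for v in values:
    peak = max(peak, v)                              # highest value so far
    max_drawdown = max(max_drawdown, (peak - v) / peak)

print(f"worst historical drawdown: {max_drawdown:.0%}")  # -> 64%
```

The catch, of course, is that this number can only ever reflect the history you feed it: an investor running this calculation in 1928 would have gotten a reassuringly small answer.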
But the problem with this logic is easy to see. Just think about how people in 1928 would have figured out their worst-case scenario. They would have looked at pre-Great Depression history and concluded that the worst case for stocks was roughly equal to the worst performance seen up to that point. And they would have been badly wrong. If relying on the worst historical outcome was misleading in 1928, then why should it be reliable now?
Risk and Uncertainty
Of course, there is no good precise answer to the risk management problem. The probabilistic models that underlie the concept of risk as volatility were born to solve problems like casino slot machines and blackjack tables, where the distributions of outcomes really are "fixed" by the house. But estimating the bounds of future stock returns is an entirely different class of problem, because the future is inherently uncertain and unknowable. If a nuclear bomb goes off in the center of Manhattan tomorrow, all bets are off, and the past will provide little if any guide to the future.
This highlights a difference cited by famous economist John Maynard Keynes (and promptly forgotten by many of the architects of Modern Portfolio Theory) between "risk" and "uncertainty." Risk we can measure, estimate, and ultimately control. Uncertainty we have to live with.
We will see later that there are ways to partially manage black swan risks, even if we cannot measure them with much accuracy.