I have just finished reading “The Black Swan” by Nassim Nicholas Taleb. Black swan theory describes rare, high-impact events that are difficult to predict and that leave a significant legacy.
The term “black swan” comes from the old European conviction that all swans were white; all the historical data available backed this belief, so black swans were assumed not to exist. When black-feathered swans were discovered in Western Australia in the late 17th century, the term “black swan” came to be widely used to denote something thought or perceived to be impossible, but which nonetheless happens.
Taleb gives World War I, the internet, personal computers and the 9/11 attacks as examples of Black Swan events, all of which have three things in common. Later articles by Taleb (HBR 10/09) add the mortgage meltdown of 08/09 to this list.
First, they were outliers. Each of these events was so far beyond expectations that its very occurrence came as a surprise; nothing pointed to what was coming.
Second, the event has an extensive and far-reaching impact.
Third, even though we did not see it coming, we can go back after the event and construct a plausible trail of evidence pointing to what was going to happen.
This third characteristic is, for me, particularly interesting. In hindsight there was a trail that could have led one to understand that the Black Swan event was coming, and to do something about it before its full seriousness was realized.
These very-low-probability but very-high-impact events are by definition almost impossible to predict and very difficult to see coming. Traditional risk management depends largely on past experience to predict the future, but that model breaks down completely when there is nothing to compare against.
When a project team starts working on a product introduction into the factory, we are tasked with identifying risks and mitigating them to reduce both their likelihood and their impact. We use the experience of each team member to identify, quantify and understand each risk, and to create a plan to reduce its effects.
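As a rough illustration of that scoring approach (a sketch, not any particular tool we use), a risk register often boils down to rating each risk’s likelihood and impact and ranking by their product. A minimal version in Python, where the risks, the 1–5 scales and the threshold are all hypothetical examples:

```python
# Minimal sketch of a likelihood x impact risk register.
# Risks, 1-5 scales and the action threshold are hypothetical.
from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    likelihood: int  # 1 (rare) .. 5 (almost certain)
    impact: int      # 1 (minor) .. 5 (severe)
    mitigation: str

    @property
    def score(self) -> int:
        return self.likelihood * self.impact

register = [
    Risk("Vendor part late", likelihood=4, impact=3, mitigation="Second source"),
    Risk("Tooling not ready", likelihood=2, impact=4, mitigation="Early trial fit"),
    Risk("Spec change mid-build", likelihood=3, impact=5, mitigation="Change freeze date"),
]

# Rank the highest-scoring risks first; anything at or above the
# threshold gets an active mitigation plan rather than just monitoring.
THRESHOLD = 10
for risk in sorted(register, key=lambda r: r.score, reverse=True):
    action = "mitigate" if risk.score >= THRESHOLD else "monitor"
    print(f"{risk.score:>2}  {risk.name:<22} -> {action}: {risk.mitigation}")
```

The obvious limitation, and Taleb’s point, is that every row in such a register comes from someone’s past experience; a true Black Swan never makes it onto the list.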
Occasionally something comes out of nowhere (a Black Swan event) that no one has seen before, and it causes exactly the extensive, far-reaching impact Taleb talks of.
During an aircraft interior fit at a customer site, it was discovered that the conversion of some Computer Aided Design (CAD) models from one software system to another had caused the “envelope” of the models to shrink by 10%. The physical and digital mock-ups were therefore incorrect, and when vendor-supplied parts built to the wrongly converted models were installed in production, nothing fitted and there were substantial gaps between the side of the aircraft and the furnishings.
There was nothing in the team’s experience that even suggested this could happen. It was a Black Swan event that delayed the aircraft’s return to service by six months and pushed back the entire retrofit program, which in turn meant the new-build program became concurrent with the retrofit program instead of following it.
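In hindsight, a simple automated check on the converted geometry would have caught this class of error. Here is a hedged sketch, assuming each model’s bounding-box dimensions can be extracted before and after conversion; the data structures and the 0.1% tolerance are hypothetical:

```python
# Sketch of a post-conversion sanity check: compare each model's
# bounding-box ("envelope") dimensions before and after conversion.
# The Envelope structure and the 0.1% tolerance are hypothetical.
from typing import NamedTuple

class Envelope(NamedTuple):
    length: float  # mm
    width: float   # mm
    height: float  # mm

def check_conversion(name: str, before: Envelope, after: Envelope,
                     tolerance: float = 0.001) -> bool:
    """Return True if every dimension survived conversion within tolerance."""
    ok = True
    for dim, a, b in zip(before._fields, before, after):
        drift = abs(b - a) / a
        if drift > tolerance:
            print(f"FAIL {name}: {dim} drifted {drift:.1%} ({a} -> {b})")
            ok = False
    return ok

# A 10% uniform shrink, as in the incident above, is flagged immediately.
source = Envelope(length=2000.0, width=600.0, height=1100.0)
converted = Envelope(length=1800.0, width=540.0, height=990.0)
check_conversion("galley_unit_G1", source, converted)
```

Of course, as Taleb would point out, this check only exists because we now know the failure mode; before the event, no one knew to look.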
Taleb argues that with ever more tangled webs of relationships and interdependent actions (in this case vendors, customers, regulatory authorities and three or four internal organizations), Black Swan events are becoming more and more likely, because an outlier event in just one of those places has a huge impact on all the rest.
For me, probably the most important lesson from the book is that it is better to reduce our vulnerability to low-probability/high-impact events than to try to anticipate the events themselves. Experience and past lessons learned can help manage many risks, but true Black Swan events mean it is not about controlling what may happen; it is about preparing for what can.