This is an interview with Matt Bilderbeck of Navigatus. Matt has recently returned from a trip to North America where he attended training.
Q. Tell us about the courses you took.
A. The first course was at the Real World Risk Institute (RWRI) in New York. It was run by Nassim Taleb, author of The Black Swan. The second course was at the New England Complex Systems Institute (NECSI).
Q. That’s a long way to go to attend training. What was it that attracted you to these courses?
A. I’ve been a fan of Nassim Taleb for many years, since discovering his books, and I’ve also been a complex systems enthusiast, having read a lot on the subject. When Nassim announced he would be giving a course I was keen to get along as soon as possible, and given the generous support of Navigatus, the opportunity was too good to pass up.
Q. You mentioned Nassim Taleb; can you tell me a bit more about him?
A. He’s a former full-time trader, a philosopher, mathematician and author. His most famous book is The Black Swan, which describes the rare, unpredictable events that dominate the world. After the book came out, many people were asking: so there is a problem, now what can we do about it? His latest book, Antifragile, is the answer to that question. It’s effectively a blueprint for living in a world buffeted by Black Swans.
Q. Can you describe the scope of the two courses?
A. The RWRI course covered the applications and limits of statistics with a focus on decision-making under uncertainty. The NECSI course covered data analytics, complex system models and engineering network robustness.
The principles of these courses apply across many domains. We looked at applications in transport, technology, medicine, war, pandemics and finance. However, throughout each of these the focus was very much on extreme events and the system characteristics that can generate these sorts of events. The reason for focusing on the extremes is that they are often overlooked yet can have a greater impact than all the smaller events combined.
Q. People now quite frequently mention the term Black Swan, for example ‘The Kaikoura earthquake was a Black Swan’. In your opinion, does this suggest the concept of a Black Swan is now generally well understood?
A. A Black Swan is something that our past experience can’t point to. It’s a regime shift, which means our past statistics are no longer valid. Many of the events that are called Black Swans today are simply rare events that are consistent with what our past statistics indicate are possible. Living in New Zealand, we should expect to be exposed to a range of natural hazards. That said, the Christchurch earthquakes certainly taught us we should not be overconfident about our understanding of how these events may play out.
Q. What were the key learnings for you?
A. I now look at risk differently, using categories like fragile, robust, and antifragile. I focus more on factors like unseen consequences, second-order effects, and local vs. systemic impacts. There were many key learnings for me, some more technical than others. To pick just a few:
“Framing is important” – how probabilities are presented affects how people interpret them. Calling something a one-in-fifty-year event elicits a different response from saying there is a one-in-fifty chance of that event this year.
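As a back-of-the-envelope illustration (not from the interview itself): under the simplifying assumption that years are independent, an annual chance of 1-in-50 compounds to roughly a two-thirds chance of at least one occurrence over a 50-year horizon, which shows how different the two framings can feel for the same underlying number.

```python
# Comparing two framings of the same hazard (illustrative sketch only).
annual_p = 1 / 50  # "one-in-fifty chance of the event this year"

# Chance of at least one occurrence over a 50-year horizon,
# assuming independent years.
p_50_years = 1 - (1 - annual_p) ** 50

print(f"Annual framing:  {annual_p:.1%} chance this year")
print(f"Horizon framing: {p_50_years:.1%} chance of at least one event in 50 years")
```

The 50-year figure works out to about 64%, even though each individual year carries only a 2% chance.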
“X is not f(X)” – that is to say people tend to spend a lot of time studying some variable, say where the next earthquake will be, when we would be better served directing that effort to reducing our vulnerability to that variable.
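A small numerical sketch of that point (my own illustration, with hypothetical numbers, not from the course materials): the same hazard X produces very different outcomes depending on our exposure function f, so effort spent capping the exposure can matter more than effort spent predicting X.

```python
# Illustrative only: the hazard X versus our exposure to it, f(X).
def loss_unprotected(x):
    # Harm grows with the square of the hazard's magnitude (convex harm).
    return x ** 2

def loss_protected(x, cap=4.0):
    # Reducing vulnerability bounds the loss no matter how large x gets.
    return min(x ** 2, cap)

hazards = [0.5, 1.0, 2.0, 8.0]  # hypothetical hazard magnitudes
print([loss_unprotected(x) for x in hazards])  # the rare large x dominates
print([loss_protected(x) for x in hazards])    # vulnerability is bounded
```

No amount of forecasting changes the last line; the improvement comes entirely from changing f, not from knowing X better.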
“Overcentralisation is fragile” – there is a trend towards centralisation across many domains (e.g. financial, political and engineered systems) with efficiency often cited as the key benefit. But this efficiency comes at a cost of hidden fragility. Errors and events in one area can more easily propagate through the whole system causing widespread damage. I think it’s important to build or maintain barriers, or “circuit breakers”, to prevent this.
Ideally, systems would have localised harm but non-localised benefits. Consider aviation: if an aircraft crashes, it doesn’t have negative flow-on effects for all the other aircraft, yet the whole aviation system can learn from the accident investigation and improve. This has led to the very high safety levels of today. I think this is a good model to aspire to.
Q. How do you see this knowledge being applied in the New Zealand context?
A. Many ways. For example, how we structure our complex engineered systems so that they can evolve and incorporate new technologies safely and efficiently. As society and its systems become more complex, traditional engineering tools run into limitations. Top-down design has given us so much, but increasingly we are seeing its limitations manifested through major cost blowouts and delays on projects where the complexity is too great for this approach.
Systems thinking can help us avoid risks with non-localised harm. Releasing genetically engineered organisms into the environment is one example of this sort of risk. It allows the creation of novel gene sequences which nature itself couldn’t come up with in a billion years. We can’t absolutely know how such gene sequences will behave, interact and evolve in a complex ecosystem. And because we live in a highly connected world, with disrupted barriers, harm could be irreversible and non-localised.
Finally, as a society I think we need more focus on aligning incentives. We are beginning to see this in the safety realm with changes to director accountability following Pike River. In Antifragile, Nassim Taleb calls it ‘Skin in the Game’ (also the name of his next book). It means that those making decisions or taking risks must be exposed in some way to the consequences of their decisions. I think the lack of ‘Skin in the Game’ amongst top decision makers is the best explanatory factor for the political events we have been seeing recently around the world. Ordinary people are rebelling against those who have been making decisions for society, taking the benefits but not facing the consequences.
Q. And what are you now studying?
A. I’m working on improving my mathematical capability with the intention of being able to employ some more advanced techniques for modelling and analysing complex systems.
Close: Thank you, Matt. We look forward to learning more about that in due course.