Alumni Voice


“Systemic Risk and Resilience”

Thoriq Gibran Ibrahim

University student (Finance)

Like past generations that went through World Wars and the Great Depression, I expect our society to develop more persistence, resilience, and conviction. Being thrown into an unprecedented crisis, with all the pain and trauma, comes with an opportunity for great learning. We are forced to retreat to our own closed circles, forced to be alone with our thoughts — to reflect, refine, and re-assess the decisions we have made and the framework behind them. From this, I truly hope we can emerge as stronger, more anchored, more self-aware individuals who can better re-align our thoughts, actions, and aspirations with the needs and realities of tomorrow’s world. We will step out of this crisis into a different world, hopefully as better people. But a lot of work must be done.

An issue I have wrestled with since the beginning of the pandemic is that of uncertainty – more specifically, our understanding of, and responses to, Risk and Probabilities. I argue that our minds and our predictive capacity lag behind, ill-equipped for the demands of an increasingly digitalized world saturated with information. It took 8,000 years for the first Homo sapiens to develop the earliest forms of language and script-writing, then another 3,000 years to develop currency, and another 2,000 years to develop Industries. The Industrial Revolution came, and we never looked back – 100 years from Steam Engines to space travel, 25 years from space travel to the Internet, 15 years from a functioning internet to a World Wide Web. 20 years since then, and we can talk to anyone, anywhere in the world, at any time, for however long, for free. Sophisticated AIs have beaten humans in strategy games like Chess, Go, even StarCraft. It is well-known that the development and sophistication of our tools and technologies have grown exponentially – but it is often forgotten that our minds, our prefrontal cortex and frontal lobes, our memory capacity, have barely changed at all. We have the same biological make-up as our ancestors 13,000 years ago, but face completely different realities. I implore us to recognize this mismatch, and to develop decision-making frameworks more robust and better equipped for today’s risks.

With the development of technology, the nature and interaction of information and risk have also accelerated. Humans have spent 98% of our existence operating under a Gaussian reality (read: normally-distributed, bell-curved, predictable, relatively stable and non-random). Hunters and gatherers worried most about natural predators and wild animals – they counteracted this by climbing trees, hiding in caves, making spears. They could do this because they had observed that no tigers are taller than 5 meters, that no Mammoths can fit into caves, and that caves typically protect you from thunderstorms and typhoons. We can call this the Pre-Modern World – where information comes from physical observation, and risks are visible and mostly predictable (absent a climate disaster, or a meteor). We know there won’t be 10-meter-tall tigers, or ultra-flexible Mammoths, or spear-proof Hyenas. Our experiences were defined and dominated by the Ordinary, the predictable, the visible – how much food we hunted, how far we could migrate in a day, how far the nearest source of water was.

The Modern World (post-Industrial Revolution) is markedly Random, and very much NOT Gaussian. The Ordinary and the predictable no longer shape our experiences – meals, water, and shelter are (mostly) a given in the equation of our everyday. Our beloved Internet, and the World Wide Web, and Computers were all moonshots, all unpredictable, all beyond-the-model variables. Instead, it is the unpredictable and the invisible that can hurt us the most: financial crises, nuclear warfare, cellular cancer – none of which are Gaussian, none of which are normally-distributed, none of which we are equipped to respond to. And these extra-ordinary, seemingly low-probability tragedies happen more and more frequently, hurting us more and more each time — yet we have done nothing about our predictive frameworks, which have clearly failed to accommodate the fundamental change in today’s Probabilities. How many “once in a century” financial crises have happened? Seemingly once every ten years. Our very existence today is owed to Vasili Arkhipov deliberately refusing orders to launch a nuclear torpedo against US forces during the Cuban Missile Crisis — a variable no model could have predicted at the time. We grossly underestimate the probability, and impact, of non-ordinary things happening. We act upon the same Gaussian probability we have evolved to learn, but the world today is far more random than it has ever been. We have observed the same fallacy with the Coronavirus – world leaders everywhere, in developed as well as developing countries, underestimated the magnitude of this virus, resulting in fatally delayed responses, weak half-hearted measures, even lifting restrictions far too early. We underestimated this virus and we have paid the price, even though there were several virus-related warning signals not too long ago (MERS, SARS, Ebola) – we got lucky last time; we are not so lucky this time.
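The gap between a Gaussian worldview and a fat-tailed reality can be made concrete with a small numerical sketch. The code below (an illustration only; the Student-t distribution with 3 degrees of freedom stands in as one hypothetical fat-tailed model, not anything the essay specifies) computes how rare a 6-sigma daily event “should” be under a bell curve, then compares how often extreme draws actually appear under the two models:

```python
import math
import random

def gaussian_tail(sigmas: float) -> float:
    """One-sided tail probability P(X > k) for a standard normal variable."""
    return 0.5 * math.erfc(sigmas / math.sqrt(2))

# Under a Gaussian model, a 6-sigma daily move is essentially impossible:
p6 = gaussian_tail(6.0)
years_between = 1.0 / p6 / 365.25  # expected years between such daily events

def student_t_unit_var(df: float, rng: random.Random) -> float:
    """Draw from a Student-t(df), rescaled to unit variance (requires df > 2)."""
    chi2 = rng.gammavariate(df / 2.0, 2.0)            # chi-squared with df degrees of freedom
    t = rng.gauss(0.0, 1.0) / math.sqrt(chi2 / df)    # standard Student-t draw
    return t / math.sqrt(df / (df - 2.0))             # rescale so the variance equals 1

rng = random.Random(42)
N = 1_000_000
# Count 6-sigma exceedances in a million draws from each model (same variance in both):
gauss_hits = sum(1 for _ in range(N) if abs(rng.gauss(0.0, 1.0)) > 6.0)
fat_hits = sum(1 for _ in range(N) if abs(student_t_unit_var(3.0, rng)) > 6.0)

print(f"Gaussian 6-sigma daily event: p = {p6:.2e}, once every ~{years_between:,.0f} years")
print(f"In {N:,} draws: Gaussian exceedances = {gauss_hits}, fat-tailed (t, df=3) = {fat_hits}")
```

The bell curve says a 6-sigma day should arrive roughly once in millions of years, yet the fat-tailed model with the *same variance* produces such days routinely – exactly the “once in a century, every ten years” mismatch described above.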

It is a fatal mistake to ignore these non-ordinary events because they are low-probability – we must change how we fundamentally perceive the realities of today’s world, and be honest that crises and disasters are happening far too frequently to still be considered “low-probability” events. It is an even bigger mistake not to admit the biases in our own decision-making framework. If we want to be more robust against these kinds of risks, we must de-stigmatize saying “I don’t yet know enough, so I won’t make predictions and I will reduce risk-taking.” Delaying inference and conclusions is far more beneficial than making the wrong ones. The UK’s early Coronavirus policy, for example, was almost entirely driven by a rushed, poorly constructed, un-reviewed research paper by academics at Imperial College London, produced in a week – the consequence was the highest death toll per 1,000 people in Europe. Contrast that with Vietnam – they locked down early, imposed restrictions, and put in place an incredibly comprehensive track-and-trace mechanism that proved to be one of the most robust models in the world, even though their macroeconomic fundamentals were (arguably) far weaker than the UK’s. Experts and Industry Leaders should be encouraged to say “I don’t know and I don’t pretend to know”, instead of being urged to say things they can’t know, or forced to make misguided predictions. We must learn to prioritize robustness rather than precision – striving for robustness reduces the impact of bad things happening, while striving for precision amplifies it. In other words, it is more important to be less wrong ten times out of ten than to be right one time and completely miss the target on the other nine – especially given that our current world is far more random than we would like to admit. In short, I want society to be more robust against risk. This requires a systemic change in how we view reality, and in the probabilities we assign to non-ordinary things happening.
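The “less wrong ten times out of ten” point can be made numerically. A hypothetical sketch (the error values are invented for illustration): under a squared loss, where the cost of a mistake grows with the square of its size, a forecaster who is modestly wrong every time beats one who is exactly right once but badly wrong the other nine times:

```python
# Hypothetical forecast errors over ten rounds (illustrative numbers only):
precise_errors = [0] + [10] * 9   # spot-on once, misses by 10 the other nine times
robust_errors = [2] * 10          # modestly wrong every single time

def mean_squared_loss(errors):
    """Average squared loss: large misses dominate because cost grows with error squared."""
    return sum(e * e for e in errors) / len(errors)

precise_loss = mean_squared_loss(precise_errors)  # (9 * 100) / 10 = 90.0
robust_loss = mean_squared_loss(robust_errors)    # (10 * 4) / 10 = 4.0
print(precise_loss, robust_loss)
```

Any convex loss – where one catastrophic miss costs more than many small ones combined – produces the same ranking, which is why robustness beats precision when large errors are possible.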
We should expect more crazy events, more crises, more moonshots, more beyond-the-model variables, so that we prepare for them better. As a future leader, I urge policy-makers to build systemic resilience by emphasizing things that help people live better across a broader array of circumstances. It starts with higher investment and safety nets for “necessity variables” such as healthcare, education, and employment. Beyond that, we should use the right kinds of technology to tailor scalable solutions that help people live better – cheaper nourishment, more affordable housing, stable employment, sustainable production methods, and so on. Under financial crises, world wars, and global pandemics, people who can eat well, sleep well, and work well will fare far better, and recover much quicker, than people who are homeless and jobless. Policies should thus focus on making MORE people eat and sleep well, rather than making already well-off people sleep in king-sized beds rather than double beds. And if a bigger chunk of the population is more robust, the aggregate is more robust. Systemic robustness comes from more people living better. Today’s world entails high levels of correlation between us all; it is time we equitably share the rewards as well as the risks.




