AI to help predict financial crashes in order to prevent them
Some statisticians believe artificial intelligence, or AI, can predict crashes in the economy before they happen. Financial forecasting that combines advanced analytics with machine learning could soon identify early red flags in the economy before they lead to disaster.
This would offer an unprecedented grace period in which interventions could safeguard societies worldwide, along with financial systems that increasingly rely upon one another in an international market.
Training AI on historical records to anticipate natural disasters has been possible for many years. Whether it is the seismic activity that precedes an earthquake or the air currents telegraphing a hurricane, AI, despite its infancy, has a proven track record of being trained on large data sets to spot impending disasters in the natural world.
Spotting a man-made crisis, however, relies upon identifying far more complex and unpredictable factors. Previous attempts to learn from historical patterns, such as the 2008 financial crisis or the Great Depression, relied on linear events and trends to identify risks. Today's complex and interwoven global financial system, however, is even more volatile than its predecessors, meaning that hidden connections and factors can spring ugly surprises on analysts reliant on human brainpower and conventional computing alone.
Financial crashes come in many forms. A bank may suddenly go under, a nation's currency can see its value evaporate, or war and debt can disrupt a previously stable system. These events all leave data trails: problems that gradually worsen until an event triggers rapid deterioration. The trigger itself is relatively straightforward to spot. It is the undercurrent of societal issues bubbling in the background beforehand, leaving the system vulnerable to a single trigger, that is both of greater interest and harder to detect.
This is made all the more difficult by the fact that all economies tolerate ongoing societal issues as a reality of life.
Distinguishing the benign from the insidious is where it is hoped AI can make a difference.
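To make the idea concrete, the slow build-up described above can be sketched as a simple statistical alarm: a rolling z-score that flags when an indicator breaks away from its own recent history. This is a minimal illustration, not a real early-warning system; the stress indicator series and the threshold of 3 are entirely hypothetical, and production models are far richer.

```python
# Minimal sketch of an early-warning alarm: flag points that deviate
# sharply from their own recent history. Data and threshold are hypothetical.
import statistics

def rolling_zscore(series, window):
    """Yield (index, z-score) for each point, measured against the
    mean and standard deviation of the preceding `window` points."""
    flags = []
    for i in range(window, len(series)):
        recent = series[i - window:i]
        mean = statistics.fmean(recent)
        stdev = statistics.stdev(recent)
        z = (series[i] - mean) / stdev if stdev else 0.0
        flags.append((i, z))
    return flags

# Synthetic "stress indicator": long stability, then a slow build-up.
series = [1.0] * 30 + [1.0 + 0.2 * k for k in range(1, 11)]

# Alert wherever the indicator is more than 3 standard deviations
# above its recent baseline.
alerts = [i for i, z in rolling_zscore(series, window=10) if z > 3]
```

Note the realistic failure mode this exposes: once the deterioration persists, it becomes the new baseline and the alarm goes quiet again, which is one reason distinguishing the benign from the insidious is hard.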
AI algorithms can rapidly process such data and spot correlations hidden deep in vast quantities of numbers. This can enable policies to be put in place preemptively, before a situation deteriorates.
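The kind of hidden linkage at issue can be illustrated with a correlation scan across indicator series. The sketch below uses entirely hypothetical indicator names and values, and a plain Pearson correlation; real systems would scan vastly larger data sets and use nonlinear methods.

```python
# Minimal sketch: scan pairs of economic indicators for strong correlation.
# Indicator names and values are hypothetical, for illustration only.
import statistics

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

indicators = {
    "housing_debt":  [2.0, 2.1, 2.3, 2.6, 3.0, 3.5],  # quietly tracks leverage
    "bank_leverage": [4.0, 4.2, 4.6, 5.2, 6.0, 7.0],
    "unemployment":  [5.0, 5.1, 4.9, 5.0, 5.2, 4.8],  # roughly flat noise
}

# Flag strongly correlated pairs as candidate hidden linkages.
names = list(indicators)
linked = [(a, b) for i, a in enumerate(names) for b in names[i + 1:]
          if abs(pearson(indicators[a], indicators[b])) > 0.9]
```

Here only the debt/leverage pair is flagged; at the scale of a real economy, with thousands of interacting series, exhaustive scans like this are exactly where machine speed outstrips human analysis.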
Producing an AI forecast is one thing, however; understanding it is another.
Understanding why an AI algorithm has made its prediction is crucial, and difficulties in doing so will, no doubt, hamper trust in the system as well as lead to difficulties in applying appropriate remedies.
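One simple way to surface the "why" behind a model's alert is attribution: measuring how much each input contributes to the score. Below is a leave-one-out sketch against a toy weighted-sum model; every weight, name, and value is hypothetical, and real explainability work uses more sophisticated techniques such as Shapley-value attribution.

```python
# Illustrative only: a toy "risk score" as a weighted sum of indicators,
# plus leave-one-out attribution showing which input drove an alert.
# All weights, names, and values are hypothetical, not a real model.
WEIGHTS = {"housing_debt": 0.5, "bank_leverage": 0.3, "unemployment": 0.2}

def risk_score(inputs):
    return sum(WEIGHTS[name] * value for name, value in inputs.items())

def attributions(inputs, baseline):
    """How much of the score disappears when each input is reset
    to its calm-period baseline value."""
    full = risk_score(inputs)
    return {name: full - risk_score({**inputs, name: baseline[name]})
            for name in inputs}

current  = {"housing_debt": 3.5, "bank_leverage": 7.0, "unemployment": 4.8}
baseline = {"housing_debt": 2.0, "bank_leverage": 4.0, "unemployment": 5.0}

attr = attributions(current, baseline)
driver = max(attr, key=attr.get)  # the indicator contributing most
```

An output like "bank leverage drove most of this alert" is something a policymaker can act on; a bare score is not, which is why opaque predictions erode both trust and the ability to choose a remedy.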
There are also fears that inherent biases in the data fed into the technology, along with errors in the data sets themselves, could lead to inaccurate forecasts.
The widespread implementation of AI in the finance sector means that any attempts to use the technology in a predictive manner must be tightly regulated for these reasons.
Governments must ensure that AI systems are implemented transparently and with measures to guarantee accountability in order to protect the general public. The interconnected nature of international financial markets means that any mistakes could have severe knock-on effects for countless people around the world.
Humans must be trained in the art of deciphering AI forecasts in a way that is meaningful and applicable to rapidly changing situations. It takes skill and experience to judge exactly which remedies should be applied to safeguard the economy once AI has flagged an alert. Whether that means increasing cash flow to certain financial sectors or reducing interest or borrowing rates, having an independent body of experts drawn from across disciplines increases the chances of success.
AI holds much promise in protecting us from economic harm. It can aid financial institutions and guide policymakers with greater situational awareness in what is becoming an increasingly volatile financial environment. Its development must, however, be closely scrutinized and applied with care.