In financial markets, automated trading has been around for many years. Software such as ORC Liquidator provides automated trading engines for derivatives, and trading firms and banks rely on it and similar software to automate a wide variety of sophisticated trading strategies on trading venues globally. Algorithmic trading is designed to react extremely quickly to market changes: the algorithms seek out and exploit small windows of trading opportunity, often measured in tiny fractions of a second.
Michael Lewis's book Flash Boys: A Wall Street Revolt, published in 2014, is a non-fiction account of the rise of high-frequency trading in dark pools on the US equity market. In the book, Lewis states that “The market is rigged” by high-frequency traders who front-run orders placed by investors, using highly automated software and hardware.
In 2015, the wizards of financial algorithmic trading technology were talking about artificial intelligence as the next big thing for hedge funds seeking an edge. Last year more than 40% of new hedge funds were “systematic”, meaning they used computer models for the majority of their trades.
I have noticed the increasing use of systematic algorithms when speaking about innovation at financial industry conferences, and I came to think of how extreme the evolution is becoming when I read an article by Arjan van Os today. Van Os is head of the Innovation Centre at ABN Amro Bank, and the article was published on the Finextra website this morning. In the article, van Os argues that a perfect storm at the crossroads of big data, standardised APIs (programming interfaces that allow systems to interconnect) and computing power now makes it possible to build financial models and detect patterns in real time that human minds could not create and cannot even comprehend.
Van Os writes that the number of FinTech start-ups focusing on artificial intelligence is rapidly increasing. Engineers and big-data scientists are founding new companies to offer advanced machine-learning solutions for investment banking, family offices and cyber security. In the field of Risk Management, too, the core safety function of every bank, predictive and prescriptive systems are being introduced to produce more efficient algorithmic risk models than the current, more descriptive ones.
I think there are dangers with complex algorithmic models, just like the problems of VaR (Value-at-Risk), which was introduced in the 1990s to limit losses in volatile markets but which gave false confidence to traders and risk managers, opened the door to systematic exploitation by arbitrage traders, and ultimately led to excessive risk-taking and leverage at financial institutions. When I was involved in developing financial trading and risk management software in the 1990s, I think we had a healthy scepticism about taking automation too far. Such reluctance seems to be rare today.
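To make the VaR discussion concrete, here is a minimal sketch of one-day historical-simulation Value-at-Risk. The function name and the return series are my own illustrative inventions, not real market data or any bank's actual model; it also shows why VaR breeds false confidence: it says nothing about how bad losses can get beyond the chosen cutoff.

```python
def historical_var(returns, confidence=0.95):
    """One-day VaR as a fraction of portfolio value: the loss that was
    exceeded on only (1 - confidence) of past trading days."""
    losses = sorted((-r for r in returns), reverse=True)  # worst losses first
    cutoff = round((1 - confidence) * len(losses))        # e.g. worst 5% of days
    return losses[cutoff]

# Hypothetical daily returns for a portfolio (20 made-up observations).
daily_returns = [0.012, -0.008, 0.004, -0.021, 0.009, -0.015,
                 0.003, -0.002, 0.018, -0.011, 0.006, -0.027,
                 0.001, -0.005, 0.014, -0.009, 0.007, -0.013,
                 0.002, -0.019]

portfolio_value = 1_000_000  # hypothetical $1m portfolio
var_fraction = historical_var(daily_returns, confidence=0.95)
print(f"95% one-day VaR: ${portfolio_value * var_fraction:,.0f}")
# -> 95% one-day VaR: $21,000
```

Note that on 1 day in 20 the loss is expected to exceed this number, and the model is silent about the size of that excess: in this sample the worst day lost $27,000, and a real tail event can be far worse. That is exactly the gap that bred false confidence.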
At increasing speed, financial firms are turning to machines to do the job humans have done for decades. The practice of investment bankers and traders shouting and using hand signals to buy and sell is now long outdated; the work real people once did has been replaced by a much quieter competitor: the computer. But this means that even small faults can be costly. In 2012, for example, the US market maker Knight Capital lost over $400m in 30 minutes because of a computer glitch. And last summer, as the reader may recall, trading was halted at the New York Stock Exchange following a software problem.
Such costly events have raised doubts about the stability of computerised trading systems. Critics argue the systems make markets more volatile, as I wrote in The Consequences of Frequent Trading here on this blog some years ago.