As is often said, “Moore’s Law is Dead, Long Live Moore’s Law”. Skyrocketing worldwide data volumes have made Moore’s Law of computer processing power an outmoded maxim for financial market participants analyzing that data for trading and investing opportunities and risks. But is the solution to this conundrum hidden in the Solace of Quantum, or The X-Files?
Moore’s Law, the observation that computing power doubles roughly every two years, was the industry’s guideline for keeping pace with ever-increasing data rates. The increase in data, however, is not the first or only reason the applicability of Moore’s Law started to erode.
Financial technologists started talking about these changes on conference panels about 15 years ago, during the explosive growth of electronic trading (video). The number of trading venues and participants, the volume of real-time market data feeds, and the amount of trading data being generated, consumed, multiplied and regenerated all increased. Around 2007-2008, it became evident that the growth of all these applications and data would surpass the growth of computing power, leaving a “technology gap”: existing computing models would fall short of the real-time data-processing demands of new trading algorithms.
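The gap follows directly from the arithmetic of compounding: anything that doubles every two years grows as a power of two, so a faster-doubling curve pulls away from a slower one at an accelerating rate. A minimal sketch (the one-year data-doubling period below is an assumed figure chosen purely for illustration, not a statistic from this article):

```python
# Illustrative sketch of exponential doubling, as in Moore's Law.
# A quantity that doubles every `doubling_period` years grows by a
# factor of 2 ** (years / doubling_period).

def growth_factor(years: float, doubling_period: float = 2.0) -> float:
    """Growth multiple after `years`, doubling every `doubling_period` years."""
    return 2.0 ** (years / doubling_period)

# Computing power under Moore's Law: ~32x over a decade.
compute_growth = growth_factor(10)            # 2**5 = 32

# Hypothetical data volumes doubling every year (assumed rate):
# ~1024x over the same decade.
data_growth = growth_factor(10, doubling_period=1.0)  # 2**10 = 1024

# The ratio between the two curves is the widening "technology gap".
print(compute_growth, data_growth, data_growth / compute_growth)
```

Even with these toy numbers, data outgrows compute by a factor of 32 in ten years, which is the shape of the divergence the panelists were describing.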
Following that electronic trading boom, “Big Data” arrived in the financial markets and further widened this gap, more appropriately named the “data deluge gap”. The rise of alternative data sources and of data generated by the Internet of Things (IoT) compounded the explosion of big data. This growth has been described (video) as having created “more data … in 2017 than the previous 5,000 years of human history”; similarly, a 2017 IBM study found that 90 percent of all data in human history until that time had been generated in the previous two years.
This exponential increase in data has benefited quantitative hedge funds because artificial intelligence and machine learning systems now have enough data to produce meaningful analysis and insights about the market. But doing so requires massive amounts of computing power. Cloud computing has emerged as a bright spot that can narrow the data deluge gap: its scalability helps technology keep up with market demands.
Quants have also begun to turn to quantum computing as a resource that could erase the gap entirely. IBM and Google are researching and developing quantum computing capability, which applies the principles of quantum mechanics to fundamentally change how computations are done. IBM presented its latest quantum computing research at the Battle of the Quants Big Data London conference in December 2019, and Newport, R.I.-based Entanglement Institute, a quantum computing research facility, discussed its work at the Battle of the Quants Big Data New York event in May 2019. Computers that harness the behavior of sub-atomic particles promise far more powerful processing: enough to push compute power ahead of the data growth curve.
So, what does the future hold? Speakers at the Battle of the Quants Big Data London event in December 2019 said that Moore’s Law has now been broken, meaning that we are now in the “data deluge gap”. Unless quantum computing saves us, we can no longer play the speed game. So we have to change the game.
What we need in order to keep up is humans and machines working together, and smart insights into “Dark Data”. This means using human intelligence and artificial intelligence in collaboration to generate fresh insights from data that we may already have – but may have overlooked. As they used to say on The X-Files, “the truth is out there”. You just have to know where – and how – to look.