What Was The Event?

The Battle of the Quants Big Data (a client project) took place at the De Vere Grand Connaught Rooms on 10th December, 2019.  The event was for quantitative hedge fund managers, investors in and allocators to quantitative funds, quantitative hedge fund data buyers, and big data and alternative data providers to quant funds.

What Was It About?

Now in its fourteenth year, the Battle of the Quants – Big Data has become the definitive event in the quantitative space for investors, managers and data buyers looking for key industry influencers, decision makers and investment opportunities.  The Battle of the Quants curates a carefully selected group of systematic investors and managers leading the way into the new world of Artificial Intelligence, Alternative Data Sets, Machine Learning, Digital Assets and Quantum Computing.

The event attendees included carefully selected HNWIs, Family Offices, Fund of Funds, Institutional Investors, Quantitative Hedge Fund Managers, Data Scientists, Data Providers and Data Buyers.

The packed agenda included cutting-edge discussions in quantitative finance, extensive networking, capital introduction and data introduction breakfasts, and pre-scheduled one-on-one meetings that helped attendees form valuable professional business relationships.  Careful consideration was given to including issues confronting investors and traders participating in the increasingly quantitative global financial system.

Who Was There?

A sampling of who was in attendance:

  • Quantitative hedge funds – heavily systematic, and advanced users of artificial intelligence and machine learning (AI/ML)
  • Fundamental hedge funds
  • Institutional investors & allocators – pensions, endowments, asset managers, fund of funds, family offices, and more
  • High Net Worth Individual investors
  • Exchanges
  • Big Data and Alternative Data providers
  • Technology & analytics providers
  • Quantum computing pioneers
  • Academics
  • Authors
  • Industry analysts, consultants and innovators

The speaker list is here.  It was a full house at the Grand Connaught Rooms.

Quick Hits

Photos from the Battle of the Quants Big Data event are here.  These items stood out:

Capital Introduction / Data Introduction Roundtable Breakfasts

Quant fund managers introduced their strategies to investors, and data providers introduced their data sets to quant fund data buyers, at exclusive roundtable breakfasts.  These sessions took place before the opening of the main conference content.

[Want to know how to prepare for future sessions?  See these posts for fund managers and investors, by Castle Hill Capital Partners]

Keynote Presentation: Quant Supremacy, How Renaissance Launched the Quant Revolution

Gregory Zuckerman, Staff Writer at The Wall Street Journal, spoke about his experience writing his new book, The Man Who Solved the Market: How Jim Simons Launched the Quant Revolution.  The book reveals the origins of the quantitative trading and investing methods used by many of the quant fund managers in attendance and on panels later in the day, some of whom are also mentioned in the book.  The author’s own takeaway from the writing experience was a paradox: the man who was about to solve the market was, at the time, the last person who should have been able to do so.  He was a mathematician, not an investor or trader, and not someone who even really cared about the markets.

Greg signed copies of the book for attendees during one of the coffee breaks.

Read the book to learn everything you always wanted to know about how the Quant Revolution began but were afraid to ask.

Land of the Giants

The London Battle of the Quants Big Data December 2019 event marked the entry of technology and big data behemoths into the conference series, bringing in perspectives from outside the industry.  IBM had recently challenged Google’s claim of “quantum supremacy”, indicating that the battle for dominance in one of the most powerful technologies to enter the industry has begun.  Both weighed in with views for the quant community:

  • IBM’s Quantum Computing Ambassador presented Quants, Quantum and the Need for Speed: How Machine Learning and Big Data will Benefit from Post Digital Computing
  • Google joined the CME Group for a fireside chat about their cloud computing collaboration to enable market data delivery, analytics, and machine learning across the financial world

Democratizing Data for The Next Quant Revolution

Proposed data distribution innovations could democratize access to financial data, or simply become the new normal: the new baseline level of access for all market participants.  These possibilities were apparent in the remarks and conversations at the London Battle of the Quants Big Data event held in December 2019, which, like the Hong Kong event the previous month, also featured big data insights.

A “Spotify it” comment succinctly captured the notion of market data distribution modeled on the Spotify streaming music service: both in how data licensing is paid for and delivered, and in the idea of charging an hourly or interval-based subscription rate for access to data.  The comment sparked a discussion across panels about how data licensing needs to be reconstructed.  Competitive forces generally make it easy for anyone to consume any data from anywhere at lower cost.  But what happens to the total spend on data as lower unit costs draw in more data sources?
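
To make the metering idea concrete, here is a minimal sketch comparing a flat annual data license to an hourly, pay-as-you-go subscription.  All rates, figures and names are hypothetical, not quotes from any vendor at the event.

```python
# Hypothetical comparison of a flat annual data license vs. a
# metered, pay-as-you-go ("Spotify-style") subscription.
# All prices and usage figures are illustrative, not real quotes.

FLAT_ANNUAL_LICENSE = 120_000.00   # flat fee, unlimited access
METERED_RATE_PER_HOUR = 25.00      # charged only for hours actually used

def metered_annual_cost(hours_per_week: float, weeks: int = 52) -> float:
    """Annual cost if access is billed by the hour of actual use."""
    return hours_per_week * weeks * METERED_RATE_PER_HOUR

def breakeven_hours_per_week(weeks: int = 52) -> float:
    """Usage level above which the flat license becomes cheaper."""
    return FLAT_ANNUAL_LICENSE / (METERED_RATE_PER_HOUR * weeks)

if __name__ == "__main__":
    for hours in (5, 20, 80):
        print(f"{hours:>3} h/week -> ${metered_annual_cost(hours):>10,.2f} per year")
    print(f"breakeven: {breakeven_hours_per_week():.1f} h/week")
```

The point of the toy model: a light or occasional user of a data set pays a small fraction of the flat-license price, which is exactly what could draw many more consumers, and many more data sources, into the market.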

Pay-as-you-go market data plans could be especially important given the continued rise of alternative data sources, to the point where subscribing to such sources becomes a prerequisite for doing business.  Adding alternative data to quantitative investing strategies means adding non-traditional information such as social media, communications metadata and satellite imagery.  With this data, firms may achieve a more informed estimate of the value of securities and a better-targeted trading strategy.  However, the investor panel described an emerging expectation placed upon the quant funds managing their assets: as alternative data becomes just data, its information is priced into the market.  Alternative data therefore becomes table stakes, i.e. if you don’t have it, you fall behind.  The same was true not long ago of direct market data feeds as they displaced consolidated market data feeds.  This is the low latency arms race, déjà vu.
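
As a toy illustration of what “adding alternative data” can mean in practice, the sketch below blends a hypothetical satellite-derived foot-traffic signal with a traditional price-momentum signal.  The data, column names and weights are invented for illustration; in a real strategy the weights would be fit and validated, not fixed by hand.

```python
# Minimal sketch: blending a hypothetical alternative-data signal
# (satellite-derived store foot traffic) with a traditional price
# signal. The data, weights, and column names are all illustrative.
import pandas as pd

frames = pd.DataFrame({
    "week": pd.date_range("2019-10-07", periods=4, freq="W"),
    "price_momentum": [0.012, -0.004, 0.008, 0.015],   # traditional signal
    "foot_traffic_yoy": [0.05, 0.02, -0.01, 0.07],     # alternative signal
})

# Simple linear blend of percentile ranks; purely for illustration.
W_PRICE, W_ALT = 0.6, 0.4
frames["combined_score"] = (
    W_PRICE * frames["price_momentum"].rank(pct=True)
    + W_ALT * frames["foot_traffic_yoy"].rank(pct=True)
)
print(frames[["week", "combined_score"]])
```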

Panelists also discussed advances in the tools required to extract value from data, for instance artificial intelligence (AI), specifically machine learning (ML) and natural language processing (NLP).  Using these tools effectively requires a robust infrastructure for data content, processes to clean and transform multiple data sources, data model development and production, talent, and strong leadership.  Lowering the barriers to accessing data is not enough when strong barriers to using that data remain.
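
For a sense of what such a pipeline looks like at its very simplest, here is a toy sketch: raw text is vectorized and a basic classifier is fit.  The documents and sentiment labels are invented, and a production pipeline would add far more data cleaning, validation and monitoring than this.

```python
# Toy sketch of an NLP pipeline of the kind panelists described:
# vectorize raw text, then fit a simple classifier. The documents
# and sentiment labels here are invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

docs = [
    "revenue beat expectations and guidance was raised",
    "margins collapsed amid rising input costs",
    "record order backlog and strong cash flow",
    "the company warned of weaker demand next quarter",
]
labels = [1, 0, 1, 0]  # 1 = positive tone, 0 = negative tone

model = make_pipeline(
    TfidfVectorizer(lowercase=True, stop_words="english"),
    LogisticRegression(),
)
model.fit(docs, labels)

# Likely predicts [1] (positive), given the training vocabulary.
print(model.predict(["guidance raised on strong demand"]))
```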

And getting over that utility barrier raises new questions, several of which emerged at the event: If machines using NLP can read company reports thousands of times faster than humans, are fundamental analysts out of a job?  Or do quants and fundamental investors start to look more alike, as quants pick up new data sets and fundamental analysts learn to code?  The investing world may be getting more “quantamental” by the day.

“Cloud explainability” also entered the discussions on data management tooling, referring to explainable AI embedded in cloud computing services, where that is possible.  “Explainable AI” means that the results of using AI to evaluate data, and the rationale the AI used to reach them, can be described in a way that people can follow and understand.  The results obtained from AI may not always be explainable in this manner.  Having explainable AI produce more insight from certain data sets could make those data sets more valuable.  Or it could mean that we now have more sources of potentially valuable data, which could be another way of democratizing data.
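
One minimal way to picture explainable AI is a linear model, where each feature’s contribution to a prediction can be read off directly.  The sketch below is purely illustrative: the features, data and resulting coefficients are hypothetical, and real explainability tooling handles far more complex models.

```python
# Illustration of "explainable AI" in its simplest form: a linear
# model whose per-feature contributions can be read off directly.
# Features, data, and coefficients are hypothetical.
import numpy as np
from sklearn.linear_model import LinearRegression

features = ["foot_traffic", "web_mentions", "price_momentum"]
X = np.array([[0.05, 120, 0.01],
              [0.02,  80, -0.02],
              [0.07, 200, 0.03],
              [-0.01, 60, -0.01]])
y = np.array([0.8, -0.2, 1.4, -0.5])  # hypothetical next-week return score

model = LinearRegression().fit(X, y)

# "Explanation" for one new observation: each feature's contribution.
x_new = np.array([0.04, 150, 0.02])
contributions = model.coef_ * x_new
for name, c in zip(features, contributions):
    print(f"{name:>15}: {c:+.3f}")
print(f"{'intercept':>15}: {model.intercept_:+.3f}")
print(f"{'prediction':>15}: {model.predict([x_new])[0]:+.3f}")
```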

Around 15 years ago, financial technologists on conference panels were already warning that increases in data rates would soon outpace Moore’s Law.  So perhaps the most ominous observation at this Battle of the Quants Big Data event was that Moore’s Law had now indeed been broken, and that what we need in order to keep up is humans collaborating with machines to produce smart insights on “Dark Data” (to unpack this dense insight, see the blog post “Moore’s Law is Dead, Again”).  This notion echoed a similar theme from the New York Battle of the Quants Big Data in May 2019: constantly fresh and unique data isn’t necessarily the key to producing unique alpha.  Rather, the key is fresh insight on data you already have, which requires combining technology and human market expertise.

Data is now easier to obtain than ever, thanks to the democratization of data.  However, finding the information within it that leads to unique, uncorrelated alpha remains as challenging as ever.  So, who will solve the market in the next quant revolution, in 2020 and beyond?  It may be an AI, a human expert on the markets, or a combination of humans and machines from outside the industry with no particular view on the market at all.  Because if we have learned one thing, it’s that history may not repeat, but it does rhyme.

Book: The Man Who Solved the Market: How Jim Simons Launched the Quant Revolution, by Gregory Zuckerman