Article by Campbell Harvey, Partner and Head of Research at Research Affiliates
We are all quants
In my “Man vs. Machine” paper, I undertake an intriguing exercise. [1] The analysis required a lengthy sample of hedge funds. Half of the sample declared whether they were systematic or discretionary. The other half made no declaration but did provide detailed descriptions of what the fund did. We set out to do the following natural language processing exercise: we would look for words and phrases that distinguished systematic from discretionary in our training sample (where we knew the truth) and then apply this to the thousands of unclassified funds.
Certain words made a lot of sense, such as “algorithmic.” We were also keen on the words “quant” and “quantitative.” To our surprise, the word “quant” did not separate systematic from discretionary. Indeed, “quant” was more likely to appear in discretionary fund descriptions!
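A minimal sketch of this kind of exercise, using a simple text classifier rather than the paper’s actual pipeline (the fund descriptions and labels below are hypothetical stand-ins, not our sample):

```python
# Toy version of the classification exercise: learn which words separate
# systematic from discretionary descriptions, then label unclassified funds.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

labeled_descriptions = [
    "algorithmic trend-following across futures markets",   # systematic
    "rules-based signals executed without human override",  # systematic
    "manager selects names using quantitative screens",     # discretionary
    "fundamental stock picking aided by quant models",      # discretionary
]
labels = ["systematic", "systematic", "discretionary", "discretionary"]

vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform(labeled_descriptions)
model = LogisticRegression().fit(X, labels)

# Words with the largest absolute coefficients are the most discriminating;
# this is where a word like "quant" can surprise you by leaning discretionary.
for word, coef in zip(vectorizer.get_feature_names_out(), model.coef_[0]):
    print(f"{word}: {coef:+.2f}")

# Apply the fitted model to funds that never declared a style.
unlabeled = ["systematic macro driven by algorithmic signals"]
print(model.predict(vectorizer.transform(unlabeled)))
```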
What does this mean? It is simple: quantitative analysis is a crucial part of the investment process for both discretionary and systematic funds. While in the past the discretionary portfolio manager might have provided a spreadsheet with valuation models for her favorite names, today’s information environment demands the use of quantitative tools. Thousands of databases are now available to investment professionals, and it is implausible that any single manager could manually process all of the data these tools demand.
In today’s environment, we are all quants, but we do not all run systematic portfolios. In systematic investing, trades are generated by rules or algorithms, which are of course designed by humans. In live trading, these algorithms operate independently. In discretionary portfolios, managers make the final trading decision even though they may use plenty of quantitative tools to assist their decision process. In the end, however, a person, not an algorithm, generates the trade idea.
Origins
Thirty-five years ago, systematic investing was a niche investment style, mainly focused on trend-following systems. The initial algorithms operationalized a century-old investing approach called technical analysis. Although technical analysis has many flavors, the identification and extrapolation of trends is its cornerstone. One drawback is the inevitable turning point: at some moment, the trend will reverse. Algorithms evolved so that risk was reduced after an extended trend (or a very strong trend signal). This capability effectively accounted for reversals and reduced the losses suffered at turning points.
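As a rough illustration of the idea, and not any particular fund’s system, here is a toy trend rule that cuts exposure once the trend signal becomes unusually strong; the moving-average windows and scaling constants are arbitrary choices for the sketch:

```python
import numpy as np

def trend_position(prices, fast=20, slow=200, cap=2.0):
    """Toy trend-following rule: go long when the fast moving average is
    above the slow one, short when below, with exposure cut back when the
    trend is unusually extended (a crude guard against turning points)."""
    prices = np.asarray(prices, dtype=float)
    fast_ma = np.convolve(prices, np.ones(fast) / fast, mode="valid")[-1]
    slow_ma = np.convolve(prices, np.ones(slow) / slow, mode="valid")[-1]
    signal = (fast_ma - slow_ma) / slow_ma       # normalized trend strength
    z = signal / 0.05                            # illustrative scaling only
    position = np.sign(z) * min(abs(z), cap)     # follow the trend, capped
    if abs(z) > cap:                             # extended trend: de-risk
        position *= 0.5
    return position

# Example: a steadily rising price series produces a capped, de-risked long.
print(trend_position(np.linspace(100.0, 150.0, 250)))
```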
The next wave was quantitative stock-selection models, which used an algorithmic approach to identify the stocks a strategy should buy or sell. For long-only portfolios, the models determined the over- and underweighting of securities. They typically went beyond price data to include fundamental information such as valuation, growth, profitability, and quality metrics.
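A stylized sketch of such a stock-selection model, with made-up fundamentals and an illustrative tilt parameter: standardize each metric across the universe, average into a composite score, and over- or underweight relative to an equal-weight benchmark accordingly.

```python
import pandas as pd

# Hypothetical fundamentals for three tickers; a real model would draw on
# a far larger universe and more carefully constructed metrics.
fundamentals = pd.DataFrame(
    {"value": [0.8, 0.2, 0.5], "growth": [0.1, 0.9, 0.4],
     "profitability": [0.6, 0.7, 0.3], "quality": [0.7, 0.5, 0.2]},
    index=["AAA", "BBB", "CCC"],
)

# Standardize each metric cross-sectionally, then average into one score.
zscores = (fundamentals - fundamentals.mean()) / fundamentals.std()
composite = zscores.mean(axis=1)

# Long-only version: tilt away from equal benchmark weights in proportion
# to the composite score, keeping weights non-negative and summing to one.
benchmark = pd.Series(1 / len(composite), index=composite.index)
tilted = (benchmark * (1 + 0.5 * composite)).clip(lower=0)
weights = tilted / tilted.sum()
print(weights)
```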
The next significant innovation in systematic investment was the emergence of so-called smart beta strategies. These low-cost products might focus on a particular factor or strategy, such as value. The name, typically applied to a wide array of formulaic or algorithmic strategies, often with impressive backtest results, plants the impression that the strategies are smart. However, plenty of strategies offered under this rubric are not smart. Smart beta strategies create an index using an algorithmic approach, and investors can access the strategy in many forms, such as exchange-traded funds or mutual funds. Smart beta strategies also come in multifactor versions.
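As one toy example of algorithmic index construction, a value-oriented smart beta index might weight constituents by a fundamental anchor such as book value rather than by market capitalization (the figures below are invented):

```python
import pandas as pd

# Hypothetical book values for index constituents. Weighting by a
# fundamental anchor instead of market cap is one common smart beta recipe.
book_value = pd.Series({"AAA": 50.0, "BBB": 120.0, "CCC": 30.0})
index_weights = book_value / book_value.sum()
print(index_weights)  # a formulaic, transparent, rules-based index
```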
Simultaneously, as more capital entered the market, many managers realized the easiest way to increase alpha was to reduce costs, and one way to reduce costs was through improved execution. Hence, the next wave was the emergence of systematic high-frequency trading. Such trading can produce stand-alone profitability for funds such as Renaissance Technologies, or it can be part of the execution strategies of both systematic and discretionary funds.
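To give a flavor of the execution side, one of the simplest cost-reduction techniques is to split a large parent order into smaller child orders spread over time; the sketch below is a bare-bones time-weighted schedule, not any fund’s actual execution logic:

```python
def twap_schedule(total_shares, n_slices):
    """Split a parent order into equal child orders over time; spreading
    execution reduces market impact, one simple way to cut trading costs."""
    base = total_shares // n_slices
    slices = [base] * n_slices
    slices[-1] += total_shares - base * n_slices  # remainder on last slice
    return slices

print(twap_schedule(10_000, 7))  # seven child orders summing to 10,000
```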
Currently, we are at the beginning of the era of using artificial intelligence (AI) tools in both systematic and discretionary strategies. For example, large language models hold the promise of helping researchers analyze vast amounts of financial information and isolate risk factors.
Machine learning
In recent years, machine-learning tools have emerged to drive systematic investment strategies. These tools have actually been around for quite a while; indeed, I tried to implement some deep-learning tools on equity returns almost 25 years ago. The model failed because it was too simple, and it was simple because of computational constraints.
Three specific factors have led to the surge in machine-learning applications. First, computing speed greatly increased. In 1990, a Cray 2 supercomputer cost $32 million (in today’s dollars), weighed 5,500 pounds, and needed a cooling unit. It could perform 1.9 billion floating-point operations per second. Today, your mobile phone is 500 times faster than the Cray 2.
The second factor is data. In the time of the Cray 2, the cost of a gigabyte of storage was $10,000. Today, the cost of a gigabyte is less than one cent, which allows for the cheap collection and storage of vast amounts of data. In addition to cheap storage, the scope of data has expanded beyond financial and price information to include unstructured data from a multitude of sources (text, voice, web, geosat, pictures, etc.).
The third factor is open-source software. In the past, software development was siloed. Today, the situation is completely different: development is much more efficient because engineers do not have to reinvent the wheel. They go to GitHub and find that many others have dealt with the same problem they are facing, and the solutions are freely available to them.