
Northern Trust Asset Management recently expanded its quantitative investing team significantly, prompting a shift in its approach. “AI now allows us to explore ideas that were previously impractical,” says newcomer Gijsbert de Lange. But he is quick to add that there is nothing random about the process: “Our research always starts with an economic assumption.”
For five years, De Lange led the quant equity team at APG Asset Management, until its largest client, Dutch pension fund ABP, decided to shift the bulk of its equity portfolio into passive strategies. That left no room for a 25-strong team of researchers, portfolio managers, and IT specialists working around the clock to refine the active management of equity portfolios using advanced techniques. APG had little choice but to wind down the team.
Northern Trust Asset Management (NTAM) seized the opportunity, hiring about half of the group earlier this year. The US asset manager has been pioneering quantitative strategies since 1994. Including the new hires, the team now numbers 63 professionals across Chicago, London, Melbourne, and Amsterdam, managing more than $40 billion in assets.
A few months into the job, De Lange is happy to reflect on the transition and how his arrival has influenced Northern Trust’s quant investing.
Northern Trust is a different kind of employer than APG. Has your work changed significantly?
“In terms of research, the differences aren’t that big, but organizationally you feel in every way that this is a truly global company. The commercial opportunities are greater. We talk to European clients, but also to clients in the Far East, in Australia, in the US, you name it. I now spend much more time thinking about the different ways we can apply the results of our research.”
What is the main focus of that research?
“Our work aims to improve equity and fixed income strategies that use factors. The key factors are value, quality, and momentum, although here at Northern Trust we call momentum ‘sentiment.’ How factor strategies are implemented varies by asset manager; this is what we call proprietary implementation. On top of NTAM’s specific approach, our quant teams use innovative criteria to target alpha, or outperformance, in a highly focused way.”
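For readers unfamiliar with factor investing, the mechanics can be sketched in a few lines. The example below is purely illustrative: the metric values and equal weights are invented, and this is not NTAM's proprietary model. The idea is simply that each stock gets standardized scores on value, quality, and sentiment, which are combined into one ranking.

```python
# Illustrative sketch of a composite factor score.
# Metric values and weights are hypothetical, not NTAM's model.
from statistics import mean, stdev

def zscores(values):
    """Standardize raw metric values across the stock universe."""
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

def composite_scores(value, quality, sentiment, weights=(1/3, 1/3, 1/3)):
    """Weighted combination of value, quality, and sentiment z-scores."""
    zv, zq, zs = zscores(value), zscores(quality), zscores(sentiment)
    wv, wq, ws = weights
    return [wv * a + wq * b + ws * c for a, b, c in zip(zv, zq, zs)]

# Toy universe of four stocks: cheapness, profitability, price momentum.
scores = composite_scores(
    value=[0.8, 0.2, 0.5, 0.9],
    quality=[0.4, 0.9, 0.6, 0.3],
    sentiment=[0.1, 0.7, 0.8, 0.2],
)
# Stocks ranked from the highest composite score to the lowest.
ranked = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
```

In practice each manager picks its own metrics, weights, and refinements, which is exactly the “proprietary implementation” De Lange describes.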
So quant investing is the icing on the cake?
“If the cake is factor investing, yes. We further refine the implementation of factor strategies through stock selection, and at times through timing as well.”
On what criteria do you base those refinements?
“We add two specific characteristics we believe are valuable. First, we look for innovative companies. Second, we look for companies that consider all their stakeholders. The thinking behind innovation is obvious; the stakeholder dimension reflects our view that companies maintaining strong relationships with all parties around them tend to run into fewer incidents over the long term, and that adds value compared with companies that do less well on that front.”
How do you identify such dimensions? Data mining?
“Absolutely not. In our approach, the starting point is always an economic assumption, a theoretical notion we believe makes sense. We then test it against the data to verify whether it holds. We never work the other way around; randomly looking for patterns is a dead end. If a correlation or pattern can’t be explained, it can vanish overnight. You can see it in the flood of research literature on ever-new factors and sub-factors, the so-called ‘factor zoo.’ A few years ago there was even an excellent paper on this called Taming the Factor Zoo.”
What does artificial intelligence add to this?
“For our type of quantitative research, in addition to traditional datasets and numerical series, we work extensively with alternative data such as text. Two technologies are critical here: natural language processing (NLP) and machine learning. AI enables us to deploy these technologies more effectively and at lower cost. The hardware component is significant, as the costs of computing power and data storage have dropped dramatically. That makes it easier for us to test our ideas, for example for new selection models. AI allows us to explore ideas that were previously impractical because they required too much data or computing power.”
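To give a sense of what turning text into a signal can look like, here is one of the simplest NLP techniques: dictionary-based scoring of headlines. The word lists and headline are invented for illustration; production research uses far more sophisticated language models, not fixed word lists.

```python
# Toy illustration of dictionary-based text scoring, a basic NLP
# technique for converting news text into a numeric signal.
# Word lists and the headline are invented examples.

POSITIVE = {"beats", "growth", "record", "upgrade"}
NEGATIVE = {"miss", "lawsuit", "recall", "downgrade"}

def headline_score(headline):
    """Net count of positive minus negative words, normalized by length."""
    words = headline.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return (pos - neg) / len(words)

score = headline_score("Chipmaker beats estimates on record data-center growth")
```

Scaling this kind of processing across millions of documents is where the cheaper computing power and storage De Lange mentions come into play.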
So is this a revolution?
“For quant investors, I’d call AI more of an evolution. We started with spreadsheets on PCs, then moved to in-house servers, and now we do everything in the cloud with access to massive data centers. But the principles of our work haven’t changed: we look for patterns and relationships in new data that enable more refined portfolio selection.
“That said, this is expanding rapidly, and for certain parts of our work, today’s explosion in data and computing power is indeed transformative. Take building networks, for example. We map out a company’s relationships with other companies, competitors, customers, suppliers, other partners. We can also map networks of companies using the same software, holding similar patents, or sharing a business model. They might operate in completely different sectors, but face similar risk factors. Comparing their valuations can lead to interesting insights. The challenge is that building these networks requires processing enormous volumes of data, and AI is now increasingly central to that.”
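The network-building De Lange describes can be sketched as a graph of companies linked by extracted relationships. The companies, relation labels, and triples below are made up; in practice these links are extracted at scale from filings, news, and supply-chain data.

```python
# Illustrative sketch: a company-relationship network built from
# (company, relation, company) triples. All names and links are invented.
from collections import defaultdict

edges = [
    ("AcmeChips", "supplier_of", "CarCo"),
    ("AcmeChips", "supplier_of", "PhoneCo"),
    ("CarCo", "competitor_of", "AutoWorks"),
    ("PhoneCo", "shares_patent_class", "CamTech"),
]

graph = defaultdict(set)
for a, rel, b in edges:
    graph[a].add((rel, b))
    graph[b].add((rel, a))  # treat each relation as an undirected link

def neighbors(company):
    """Companies one hop away, regardless of relation type."""
    return {b for _, b in graph[company]}

# Firms linked to AcmeChips may share risk factors even if they
# sit in completely different sectors.
peers = neighbors("AcmeChips")
```

Comparing valuations across such peer sets, rather than within traditional sector buckets, is the kind of insight the interview points to; the hard part is extracting reliable links from enormous volumes of raw data.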
Does AI make quant investing easier?
“I’d argue not. For this type of research, much of the added value still comes from how you execute. Are you doing it right? Are you precise? Are you leaving nothing out? A lot of what we do is actually quite boring: you have to give meticulous attention to every detail. For example, you want to remove every uncompensated risk from the model, and that’s painstaking work. Or you might need to run a backtest. But how do you design a robust backtest? That’s a massive job if you want to get it right. It’s not exciting, but it’s essential craftsmanship.”
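One of the details that makes backtesting painstaking is avoiding look-ahead bias: a signal may only use information available before the return it is traded against. The walk-forward sketch below illustrates that single point; the price series and momentum rule are toy assumptions, not a real strategy.

```python
# Minimal walk-forward backtest sketch. The key detail: the signal at
# time t is computed from prices up to t-1 only (no look-ahead bias).
# Prices and the trading rule are toy assumptions.

prices = [100, 102, 101, 105, 107, 104, 108]

def momentum_signal(history):
    """Go long (+1) if the last price exceeds the previous one, else stay flat (0)."""
    if len(history) < 2:
        return 0
    return 1 if history[-1] > history[-2] else 0

equity = 1.0
for t in range(1, len(prices)):
    signal = momentum_signal(prices[:t])   # information up to t-1 only
    ret = prices[t] / prices[t - 1] - 1    # return realized over period t
    equity *= 1 + signal * ret
```

A production backtest adds transaction costs, survivorship-bias-free data, point-in-time fundamentals, and much more, which is why De Lange calls getting it right a massive job.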