
Quant prop-trading firm Hudson River Trading hopes to supercharge its trading power by partnering with Google Cloud


Hudson River Trading, the market maker that handles some 10% of all trades in the US stock market, is bringing its quantitative research and data science efforts to the public cloud.

New York-based Hudson River Trading is best known for its market-making capabilities. It uses machine-learning algorithms to quote buy and sell prices to individuals and firms wanting to make a trade. Whether Hudson River Trading wins a given trade over competitors like Citadel Securities, Virtu, and Jane Street comes down to its pricing engine.

“It is a very competitive thing to provide the best price to buy or sell at, because the person who provides the best price to buy or sell at is the person who does the trade,” Iain Dunning, lead algorithm developer who runs the AI team at HRT, said during Nvidia’s annual GTC conference in March.

To help it get there, Hudson River Trading turned to Google Cloud, where its quants have been conducting their research for several months without the constraints of an on-premises environment. With the public cloud, HRT can access technology infrastructure that can be scaled up and down depending on its needs. Plus, moving to Google Cloud will give HRT researchers access to coveted Nvidia AI chips, which are increasingly important in quantitative finance. Beyond chip access, the public cloud could also offer HRT other benefits: testing new strategies more quickly and running more simulations cost-effectively.

“Our researchers rely on extensive compute resources to derive new insights about the market, train cutting edge machine learning models, and simulate trading,” Kevin Lee, Hudson River Trading’s head of research and development, told BI in a written statement. “Google Cloud allows us to do that without limits to computing power.”

Why Hudson River Trading needs AI and the cloud to stay competitive

HRT operates in some 200 markets around the world, giving it a global view of the various financial instruments and transactions that change hands every day. HRT doesn’t just monitor the trades that were executed but also the prices quoted and the shifts in how much traders are willing to pay to buy or sell a given financial instrument, Dunning said. This universe of data, which Dunning described as a “firehose” of events coming at you every day, is used to fuel its models.

“We need to keep up with the firehose of data,” Dunning said at the spring conference.

This information, as well as non-market data like posts on the social-media platform X, is used to make predictions about the price of any financial instrument, like stocks, bonds, or ETFs, so HRT can provide liquidity at a risk-adjusted price, Dunning said. These forecasts are then tested against historical market conditions or simulated market events to see how they’d perform.

“In other words, what I’m saying is that we need to predict the future of the stock market,” Dunning said.

Once trained, these models are used to execute trades in the market automatically. Training these models, however, requires an immense amount of computing power, he said.
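Dunning didn’t detail HRT’s models, but the workflow he describes, fitting a predictive model to historical market data and then replaying held-out history to see how the signal would have fared, can be illustrated with a toy sketch. Everything below is a hypothetical stand-in: a synthetic price series, lagged-return features, and a plain least-squares predictor in place of the firm’s machine-learning models.

```python
# Toy sketch of the train-then-backtest loop: all data here is synthetic,
# and the model is a stand-in for the far larger ML models HRT trains.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical price history: a random walk standing in for real market data.
prices = 100 + np.cumsum(rng.normal(0, 0.1, size=5_000))
returns = np.diff(prices) / prices[:-1]

# Features: the last `lags` returns; target: the next return.
lags = 5
X = np.column_stack([returns[i:len(returns) - lags + i] for i in range(lags)])
y = returns[lags:]

# Train on the first 80% of history, hold out the rest to simulate trading on it.
split = int(0.8 * len(y))
coef, *_ = np.linalg.lstsq(X[:split], y[:split], rcond=None)

# "Backtest": trade in the direction of the predicted return on unseen history
# and tally the hit rate and a toy P&L for the signal.
pred = X[split:] @ coef
hit_rate = np.mean(np.sign(pred) == np.sign(y[split:]))
pnl = np.sum(np.sign(pred) * y[split:])

print(f"hit rate on held-out data: {hit_rate:.2%}, toy P&L: {pnl:.5f}")
```

Even this toy loop hints at the scaling problem: swap in hundreds of markets, order-book-level event data, and modern deep-learning models, and the training and simulation steps become the kind of compute load Dunning is talking about.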

That’s part of the reason why the public cloud is becoming table stakes in the quant world. Two Sigma, the quantitative hedge fund, turned to the cloud when it was in a bind, giving its researchers access to highly sought-after Nvidia AI chips. And Citadel Securities moved its entire research platform to Google Cloud, Ken Griffin’s market maker announced in April.

The reason? Speed and efficiency, Italo Brito, Google Cloud’s head of hedge funds and exchanges, told BI in an email.

“Cloud technology can reduce the amount of time it takes to turn a hypothesis into a production trading signal,” Brito said. He said it speeds up the onboarding and analysis of data and makes compute resources immediately available, which means faster testing and simulations. That’s a stark difference from on-premises data centers, which require a lot of maintenance.

On top of the constant intake of data, market makers’ demand for compute resources also fluctuates greatly from day to day based on market volume and volatility, Brito said. The move to the cloud could present some cost efficiencies, he added, because firms pay only for what they use, compared with an on-premises environment where they pay for those resources regardless.
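Brito’s pay-for-what-you-use point is easiest to see with back-of-the-envelope arithmetic. The sketch below uses entirely made-up numbers, not Google Cloud pricing or HRT’s actual usage; it only shows how billing a fluctuating workload by the hour can undercut provisioning for the peak.

```python
# Back-of-the-envelope comparison of peak-provisioned vs. usage-based compute cost.
# Every number is a made-up assumption, not Google Cloud pricing or HRT's workload.

HOURLY_RATE = 3.00      # hypothetical cost of one GPU node per hour
PEAK_NODES = 1_000      # capacity needed on the busiest, most volatile days
QUIET_NODES = 200       # capacity needed on an ordinary day
HOURS_PER_DAY = 24
DAYS = 30

# Hypothetical month: a few volatile days need peak capacity, the rest far less.
daily_demand = [PEAK_NODES if day % 10 == 0 else QUIET_NODES for day in range(DAYS)]

# On-premises-style cost: pay for peak capacity every day, whether it's used or not.
fixed_cost = PEAK_NODES * HOURLY_RATE * HOURS_PER_DAY * DAYS

# Usage-based cost: pay only for the nodes each day actually required.
elastic_cost = sum(nodes * HOURLY_RATE * HOURS_PER_DAY for nodes in daily_demand)

print(f"provision for peak:   ${fixed_cost:,.0f}")
print(f"pay for what you use: ${elastic_cost:,.0f}")
```

With these made-up numbers, provisioning for the peak costs roughly three and a half times as much as paying only for what each day actually required.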

“The future of capital markets will require a next-generation, globally distributed infrastructure to open access to more participants, geographies, and asset classes,” Brito said.
