HRT's Python fork: Leveraging PEP 690 for faster imports

45 davidteather 37 8/8/2025, 4:12:03 PM hudsonrivertrading.com ↗

Comments (37)

roadside_picnic · 1h ago
Interviewed with HRT a while back. While I didn't get past the final round, their Python internals interview (which I did pass) was an absolute blast to prepare for, and required a really deep dive into implementation-specific details of CPython: exactly how collisions are handled in dict, details about memory management, etc. I pretty much had to spend a few weeks in the CPython source to prep, and for me it was worth the interview just to really learn what's going on.

For most teams I would be pretty skeptical of an internal Python fork, but the Python devs at HRT really know their stuff.

ActorNightly · 9m ago
>Python devs at HRT really know their stuff.

It's a finance firm - i.e. a scam firm. "We have a fancy trading algorithm that statistically is never going to outperform just buying VOO and holding it, but the thing is if you get lucky, it could".

Scammers are not tech people. And it's pretty clear from their post.

> In Python, imports occur at runtime. For each imported name, the interpreter must find, load, and evaluate the contents of a corresponding module. This process gets dramatically slower for large modules, modules on distributed file systems, modules with slow side-effects (code that runs during evaluation), modules with many transitive imports, and C/C++ extension modules with many library dependencies.

As they should.

The idea that you type something in the code and the interpreter just doesn't execute it is how you end up with Java-like services, where the dependency injection chains are so massive that the first time everything has to get lazily injected, the code takes a massive amount of time to run. Then you have to go figure out where the initialization code that slows everything down lives, and start figuring out how to modify your code to make it load first, which leads to a mess.

If your Python module takes a long time to load, that's a module problem. There is a reason you can import submodules of a module directly, and overall the __init__.py of a module shouldn't import all the submodules by default. Structure your modules so they don't run massive initialization routines and the problem is solved.

Furthermore, because of Python's dynamic nature, you can do runtime imports, including imports inside functions. In use, whether you import something at the top and it gets lazily loaded, or you import it right when you need it, makes absolutely no difference other than syntax - and the latter is actually better, because you can see what is going on rather than the lazy loading being hidden away in the interpreter.

Or if you really care, you can implement lazy initialization inside the modules themselves, so that importing and then first using them works exactly like lazy imports.

Spending time building a new interpreter with lazy loading just to be able to keep all your import statements at the top screams that those devs prefer ideology over practicality.

Danjoe4 · 2m ago
Runtime imports are a maintenance nightmare and can quickly fragment a codebase. Static analysis of imports is so desirable that it is almost always worth the initialization performance hit. Tradeoffs.
mhh__ · 6m ago
> "We have a fancy trading algorithm that statistically is never going to outperform just buying VOO and holding it, but the thing is if you get lucky, it could".

You wish lol. How do you think they pay for all the developers?

Firms like HRT don't even take outsider money, they don't really need to.

And besides, we don't get paid for beating stocks; a lot of funds will do worse than equities in a good year for the latter. The whole point is that you're benchmarked to the risk-free rate, because your skill is in making money while staying overall market-neutral. So you rarely take a drawdown anywhere near as bad as equities.

As a service, this is often a portfolio diversification tool for large allocators rather than something they put all their money into.

It is true, however, that some firms are basically just rubbish beta vehicles that in an ideal world probably should shut down.

ladberg · 4m ago
> Its a finance firm - i.e scam firm. "We have a fancy trading algorithm that statistically is never going to outperform just buying VOO and holding it, but the thing is if you get lucky, it could".

HRT trades their own money so if it didn't beat VOO then they'd just buy VOO. There are no external investors to scam.

cjj_swe · 3m ago
What a deeply ignorant statement
nly · 1h ago
I interviewed with them as well. Something like 6-8 interviews, only to then be told they were circulating my CV among teams and didn't have a fit for me...

But yes, like you I had a great experience

htrp · 1h ago
when milliseconds mean millions
ActorNightly · 6m ago
Those days are all over btw.

Most trading firms are past the whole "beat the other guys to the buy" game. Established large investment firms already have all that on lockdown, in terms of infrastructure and influence, to the extent that they basically just run the stock market at this point (e.g. Tesla posts horrible quarterly numbers, but the stock goes up).

Most of the smaller firms basically try to figure out the patterns of the larger firms and capitalize on that. The timescales have shifted quite a bit.

PufPufPuf · 52m ago
If that was the case, why use Python in the first place?
mhh__ · 12m ago
It's not uncommon to have a fast core and then an API that alpha / research teams feed signals into

If you are someone like HRT, I presume the bulk of the money comes at very short holding periods. So you have, e.g., fast signals that work short term, and then mid-frequency alpha signals that spit out a forecast over a few timeframes - i.e. it might not be that they buy (aggressively) really quickly, but rather that someone sells to them and then they hold onto the position longer than they would if they had no opinions.

Similarly, this shapes where you post your orders: if you really want the fill, you want to be top of the book.

dmoy · 45m ago
Well there's gonna be people writing code who can't do it in say a high performance C/C++ setup. Not professional programmers, but professional <some finance discipline>.

Sometimes it will be worth the tradeoff to put that person and a programmer together to code up a solution in another language. Sometimes it will be worth it to have the non-programmer write it in Python and then do Herculean things in the background to make it fast enough.

procaryote · 15m ago
The gulf between high-performance C/C++ and Python is vast and includes most other programming languages, many of which are friendly to write (or can be made friendly for a limited domain) with significantly less rocket science than making Python faster.
instig007 · 41m ago
> Sometimes it will be worth it to have the non-programmer write it in Python and then do Herculean things in the background to make it fast enough.

Nim exists, Crystal exists

dmoy · 38m ago
It would not surprise me if some shops end up using less common languages to fill a niche (or hell, invent their own DSL).

But it also wouldn't surprise me if a lot of shops land on python because that's what their hiring pool knows.

shepardrtc · 37m ago
Honestly if you're a millisecond too slow you might as well not trade at all. From my own experience with trying to get Python to go fast for crypto trading, you can get it pretty fast using Cython - single digit microseconds on an average AWS instance for a simple linear regression was my proudest moment. They're probably pushing it even faster because nanoseconds are where the money's at. Many HFT firms are down in the double digit nanoseconds, I believe. Maybe lower.
charcircuit · 5m ago
For crypto you pay the miners to put your transaction first. You don't need millisecond-precision reaction time.
tracnar · 18m ago
While I see the usefulness of lazy imports, it has always seemed a bit backwards to me for the importer to ask for a lazy import, especially if you make it an import keyword rather than a Python flag. Instead, I'd expect modules to declare (and maybe enforce) that they have no side effects; that way you know they can be lazily imported, and it opens the door to more optimizations, like declaring the module immutable. That ties into the performance barrier Python faces due to its dynamic nature, as discussed in https://news.ycombinator.com/item?id=44809387

Of course that doesn't solve the overhead of finding the modules, but that could be optimized without lazy import, for example by having a way to pre-compute the module locations at install time.

nasretdinov · 1h ago
I wonder how much can be saved by using a local file system for imports, though. In my testing, the mere presence of a home directory on NFS already slows down imports dramatically (by ~10x), because Python searches for modules in the home directory too by default.
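A quick way to see where import time goes (a sketch; `json` stands in for a module suspected to be slow):

```python
import sys
import time

# Every import scans sys.path in order; a slow (e.g. NFS-mounted) entry
# early in this list taxes every module lookup.
for entry in sys.path:
    print(entry)

# Time a single import directly.
start = time.perf_counter()
import json  # stand-in for the suspect module
elapsed = time.perf_counter() - start
print(f"import took {elapsed * 1000:.2f} ms")
```

For a full breakdown, `python -X importtime your_script.py` (available since 3.7) prints cumulative per-module import times to stderr, which makes filesystem-bound imports easy to spot.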
its-summertime · 56m ago
> There’s also no way to make imports of the form from module import * lazy

I'd say if you see

    from typing import Final
    [...]
    __all__: Final = ("a", "b", "c")
It's probably 99% safe to pull that from a quick pass over the AST (and cache it for the later import if you want to be fancy)

Of course, should one be doing a star import in a proper codebase?
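The AST pass described above is easy to sketch with the stdlib `ast` module. A simplified version that only handles a literal tuple/list assignment (annotated or not), and gives up on anything dynamic:

```python
import ast

def extract_all(source: str):
    """Return the names in a module's __all__ if it is a simple literal
    assignment; None if it can't be determined statically."""
    tree = ast.parse(source)
    for node in tree.body:
        # Handles both `__all__ = (...)` and `__all__: Final = (...)`.
        if isinstance(node, ast.Assign):
            targets, value = node.targets, node.value
        elif isinstance(node, ast.AnnAssign) and node.value is not None:
            targets, value = [node.target], node.value
        else:
            continue
        for target in targets:
            if isinstance(target, ast.Name) and target.id == "__all__":
                if not isinstance(value, (ast.Tuple, ast.List)):
                    return None  # computed __all__: give up
                names = []
                for elt in value.elts:
                    if isinstance(elt, ast.Constant) and isinstance(elt.value, str):
                        names.append(elt.value)
                    else:
                        return None  # non-literal element: give up
                return names
    return None
```

Note that `ast.parse` never executes the module, so the `Final` annotation doesn't need to be importable for this to work.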

fabioz · 2h ago
It would have been really nice to have that PEP accepted, as it would have saved me from writing local imports everywhere.

As it is, top-level imports IMHO should only be used for modules required at startup; everything else should be a local import. Getting everyone convinced of that is the main issue, though, as it really goes against the regular coding style of most Python modules (but the time saved starting up the apps I work on definitely makes it worth it).

theLiminator · 53m ago
Yeah, imo that's the way that python should've worked in the first place.

Import-time side effects are definitely nasty though and I wonder what the implications on all downstream code would be. Perhaps a lazy import keyword is a better way forward.

davidteather · 3h ago
The author interviewed me and talked about this project, so it was cool seeing a blog post posted about it
rasjani · 3h ago
I know a few modules that can take seconds to import, but it would have been nice to hear how much they actually gained.

Also, maybe this approach could yield stats on whether some import was needed at all?

patrick91 · 1h ago
I really really want lazy imports in Python, it would be a godsend for CLIs
nomel · 23m ago
Libraries for this have always existed, triggering import on first access. The problem was, they would break linters. But that's not an issue anymore with typing.TYPE_CHECKING.

A PEP is very much welcome, but using lazy import libraries is a fairly common, very old method of speeding things up. My pre-PEP 690 code looks like this:

    import typing
    from lazy import LazyImport

    member = LazyImport('my_module.subpackage', 'member')
    member1, member2 = LazyImport('my_module', 'member1', 'member2')
    
    if typing.TYPE_CHECKING:
        # normal import, for linter/IDE/navigation. 
        from my_module.subpackage import member
        from my_module import member1, member2
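The `lazy` library above isn't stdlib; a minimal proxy along those lines can be sketched with `importlib` (illustrative only - real lazy-import libraries handle far more edge cases, like non-callable members and pickling):

```python
import importlib

class _LazyAttr:
    """Proxy that resolves getattr(import_module(module), name) on first use."""
    def __init__(self, module, name):
        self._module, self._name, self._obj = module, name, None

    def _resolve(self):
        # Import and cache the real object the first time it's needed.
        if self._obj is None:
            self._obj = getattr(importlib.import_module(self._module), self._name)
        return self._obj

    def __call__(self, *args, **kwargs):
        return self._resolve()(*args, **kwargs)

    def __getattr__(self, attr):
        return getattr(self._resolve(), attr)

def LazyImport(module, *names):
    proxies = tuple(_LazyAttr(module, n) for n in names)
    return proxies[0] if len(proxies) == 1 else proxies
```

Usage mirrors the snippet above, e.g. `dumps = LazyImport('json', 'dumps')` - `json` is only imported when `dumps` is first called.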
formerly_proven · 16m ago
Well, if you use argparse or one of the many argparse wrappers for a moderately complex CLI, you end up lazifying the CLI parser itself, because just fully populating the argparse data structures can easily take half a second or more. With other startup costs, you easily end up with "program --help" taking >1s, and any CLI parsing error also taking >1s.
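One common shape of that workaround (a sketch, not any particular tool's implementation; the `tool`/`sync` names are hypothetical): dispatch on the subcommand name before building the full argparse tree, so only the invoked subcommand pays its parser-construction cost.

```python
import argparse

def build_sync_parser():
    # Imagine dozens of arguments here; constructing all subparsers on
    # every invocation is what makes startup slow.
    p = argparse.ArgumentParser(prog="tool sync")
    p.add_argument("--dry-run", action="store_true")
    return p

SUBCOMMANDS = {"sync": build_sync_parser}  # hypothetical subcommand table

def main(argv):
    if argv and argv[0] in SUBCOMMANDS:
        # Build only the parser for the subcommand actually invoked.
        parser = SUBCOMMANDS[argv[0]]()
        return parser.parse_args(argv[1:])
    # Cheap top-level fallback that just lists the available subcommands.
    top = argparse.ArgumentParser(prog="tool")
    top.add_argument("command", choices=sorted(SUBCOMMANDS))
    return top.parse_args(argv[:1] or ["--help"])
```

Any imports the subcommand needs can live inside its builder function, combining this with the lazy-import tricks discussed elsewhere in the thread.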
ecshafer · 2h ago
I thought HRT was a Cpp shop? Is Python used in their main business applications, or more for quants / data scientists?
almostgotcaught · 2h ago
every quant shop has QR and QT people who can barely write passable Python, let alone C++ - then the QD people have to integrate that stuff with prod C++ pipelines.
instig007 · 52m ago
If only compiled languages with dead code elimination existed...
zzzeek · 2h ago
Gonna call this an antipattern. Do you need all those modules imported in every script? Then you save nothing on startup time; the time will be spent regardless. Does every script not need those imports? Then they shouldn't be importing those things, and this small set of top-level imports should be curated into a better, more fine-grained list (and if you want to write tools, you can certainly identify these patterns using tooling similar to what you wrote for LazyImports).
its-summertime · 1h ago

    import argparse
    parser = argparse.ArgumentParser()
    parser.parse_args()
    import requests
Is an annoying bodge that a programmer should not have to think about, as a random example
spicybright · 2h ago
For personal one file utility scripts, I'll sometimes only import a module on a code path that needs it. And make it global if the scope gets in the way.

It's dirty, but speeds things up vs putting all imports at the top.
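That dirty pattern can be done without even touching `global` machinery elsewhere, because `import` is a binding statement (a sketch; `json` stands in for the heavy dependency):

```python
json = None  # placeholder until the code path that needs it runs

def dump_report(data):
    global json
    if json is None:
        import json  # with the global declaration, this binds at module scope
    return json.dumps(data)
```

After the first call, every subsequent call sees the module already bound in the global namespace and skips the import entirely.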

sunshowers · 2h ago
There are often large programs where not every invocation imports every module.

The lazy import approach was pioneered in Mercurial I believe, where it cut down startup times by 3x.

Spivak · 1h ago
> This process gets dramatically slower for … modules on distributed file systems, modules with slow side-effects

Oh no. Look, I'm not saying you're holding it wrong - it's perfectly valid to host your modules on what is presumably NFS, as well as to have modules with side effects - but what if you didn't?

I've been down this road with NFS (and SMB, if it matters) and pain is the only thing that awaits you. It seems like they're feeling it. Storing what is spiritually executable code on shared storage was a never-ending source of bugs and mysterious performance issues.

globular-toast · 1h ago
Imagine if these guys put their intelligence towards improving the world.
rirze · 3h ago
Oh, it's a trading firm. That's why they can fund an internal fork of Python... That sounds nice...