If you find this interesting, definitely consider checking out contraction hierarchies: one of the early algorithms used by mapping software to calculate fastest routes between any pair of places. It's exact, and it's orders of magnitude faster than plain graph searches like Dijkstra. This webpage has a very intuitive graphical explanation of how it works: https://www.mjt.me.uk/posts/contraction-hierarchies/
(I had the joy of implementing this in Python with OSM data a few years ago. Planning a three hour car trip with software I wrote and having it come back with the same path recommended by Google Maps in a matter of milliseconds was a very rewarding feeling.)
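For a flavor of why queries are so fast: after preprocessing assigns every node a rank and adds shortcut edges, a query is just two Dijkstra searches that only ever relax edges toward higher-ranked nodes. A minimal sketch of the query phase (the toy graph, ranks, and adjacency structure are hand-built illustrations, not any library's API):

```python
import heapq

def upward_search(up_adj, source):
    """Dijkstra that only relaxes edges leading to higher-ranked nodes."""
    dist, pq, done = {source: 0.0}, [(0.0, source)], set()
    while pq:
        d, u = heapq.heappop(pq)
        if u in done:
            continue
        done.add(u)
        for v, w in up_adj.get(u, []):
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                heapq.heappush(pq, (d + w, v))
    return dist

def ch_query(up_adj, s, t):
    """Two upward searches; the shortest path meets at its highest-ranked node."""
    fwd, bwd = upward_search(up_adj, s), upward_search(up_adj, t)
    return min((fwd[u] + bwd[u] for u in fwd.keys() & bwd.keys()),
               default=float("inf"))

# Toy undirected road network with nodes ranked 0 < 1 < 2 < 3 by the offline
# contraction step; up_adj keeps each edge only in its upward direction.
# Shortcut edges added during contraction would live in up_adj too.
up_adj = {
    0: [(1, 3.0), (3, 20.0)],
    1: [(2, 2.0)],
    2: [(3, 4.0)],
}
print(ch_query(up_adj, 0, 3))  # 9.0: the 0-1-2-3 path, not the direct 20.0 edge
```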
> I assume you mean non-LLM.

Graph transformers, which use the same underlying technology as LLMs, have been a research topic for many years, including for routing problems. So yes, it is non-LLM, but they could still have a lot in common.
svcphr · 13h ago
Nice. Very light-weight compared to proper local routers like Graphhopper, OSRM, etc., which can be overkill for simple tasks. Although the 'routing' here is nx.shortest_path, which is just Dijkstra, so it's pretty slow compared to other easy-to-implement routing algorithms (even just bidirectional Dijkstra or A*... although contraction hierarchies would be a huge gain here, since edge weights are fixed). Also not sure why the readme describes it as an approximation? Dijkstra is guaranteed to return the lowest-cost path. Maybe it's an approximation because of the free-flow assumption, or because the NAR dataset is incomplete?
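For what it's worth, A* is nearly a drop-in replacement in networkx if the nodes carry coordinates. A sketch with a great-circle heuristic (the node/edge attribute names here are assumptions, not this project's actual schema):

```python
import math
import networkx as nx

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km: an admissible A* heuristic when edge
    weights are road lengths in km (it never overestimates)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * 6371.0 * math.asin(math.sqrt(a))

def astar_route(G, src, dst):
    # Assumes each node has 'lat'/'lon' attributes and edges a 'length_km' weight.
    def h(u, v):
        return haversine_km(G.nodes[u]["lat"], G.nodes[u]["lon"],
                            G.nodes[v]["lat"], G.nodes[v]["lon"])
    return nx.astar_path(G, src, dst, heuristic=h, weight="length_km")
```

For travel-time weights you'd divide the heuristic by the network's maximum speed to keep it admissible.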
shoo · 2h ago
> Although the 'routing' here is nx.shortest_path, which is just Dijkstra, so pretty slow compared to other easy to implement routing algorithms
networkx has the advantages of being popular, well-documented, and pure Python (less hassle to maintain), with code that is easy to read and modify. But one big downside of being pure Python is fundamentally poor performance: it can't use a CPU efficiently, and the way the graphs are represented means it can't use memory, memory bandwidth, or cache efficiently either.
Orthogonal to switching the search algorithm, one quick way to potentially get a large speedup is to try swapping out networkx for rustworkx (or any other graph library with Python bindings whose data structures and graph algorithms have native implementations).
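A minimal sketch of that swap, assuming edges carry a 'weight' attribute (rustworkx's `networkx_converter` with `keep_attributes=True` stashes the original node under a `__networkx_node__` key):

```python
import networkx as nx
import rustworkx as rx

nx_graph = nx.Graph()
nx_graph.add_edge("a", "b", weight=2.0)
nx_graph.add_edge("b", "c", weight=3.0)

# One-shot conversion; the Dijkstra itself then runs in native code.
rx_graph = rx.networkx_converter(nx_graph, keep_attributes=True)
idx = {rx_graph[i]["__networkx_node__"]: i for i in rx_graph.node_indices()}

lengths = rx.dijkstra_shortest_path_lengths(
    rx_graph, idx["a"], edge_cost_fn=lambda e: e["weight"]
)
print(lengths[idx["c"]])  # 5.0
```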
another thing to check would be to avoid storing auxiliary node/edge attributes in the graph that aren't necessary during search, so that cache and memory bandwidth can be focused on node indices and edge weights.
I went down a rabbit hole playing around with this some years ago (using Cython, not Rust). Relatively simple things like "store the graph in an array-oriented way (CSC/CSR sparse matrix format or similar)" and "eliminate all memory allocation and pure Python code from the Dijkstra search, replacing it with simple C code using indices into preallocated arrays" get you pretty far.

It is possible to get further performance increases by reviewing and tweaking the search code to avoid unnecessary branches, and by investigating variants of the priority queue used to order partial paths by distance (I found switching the heap from a binary tree to a 4-ary tree gave a 30% reduction in running time). It can also help to reindex the graph's nodes so that nodes with similar indices are spatially close and more likely to share cache lines (another 30% or so reduction in running time from Hilbert curve ordering). Some of this is quite problem- and data-dependent and not necessarily a good tradeoff for other graphs.

All up, I got around a 30x speedup vs baseline networkx for Dijkstra searches computing path distances to all nodes from a few source nodes, on a street network graph with 3.6M nodes & 3.8M edges (big enough not to fit in the L3 cache of the CPU I ran experiments on).
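The array-oriented end state looks roughly like scipy's csgraph module, which runs Dijkstra over a CSR matrix entirely in compiled code. A sketch (not the exact setup I benchmarked):

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import dijkstra

# Directed toy graph as parallel arrays: (src, dst, weight) per edge.
src = np.array([0, 0, 1, 2])
dst = np.array([1, 2, 3, 3])
wgt = np.array([3.0, 10.0, 2.0, 4.0])

graph = csr_matrix((wgt, (src, dst)), shape=(4, 4))

# Distances from node 0 to every node, with no Python in the hot loop.
print(dijkstra(graph, directed=True, indices=0))  # [ 0.  3. 10.  5.]
```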
karussell · 45m ago
What does light-weight mean in this case? Less data? Ease of installation?
ivanbelenky · 12h ago
Thx for the heads-up on the available optimizations. The "Approximations" note does not apply to the shortest-path calculation, but rather to the distance and upper-bound travel-time estimates. This is a consequence of enabling routing for points that don't exist as nodes (closest-node approximation).
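For the curious, the closest-node step can be as simple as a k-d tree over node coordinates. A sketch with made-up node IDs (at continental scale you'd project first or use a geodesic-aware index, since raw lat/lon degrees distort east-west distances):

```python
import numpy as np
from scipy.spatial import cKDTree

node_ids = np.array([101, 102, 103])
coords = np.array([[40.71, -74.00],    # New York
                   [41.88, -87.63],    # Chicago
                   [34.05, -118.24]])  # Los Angeles

tree = cKDTree(coords)

def nearest_node(lat, lon):
    _, i = tree.query([lat, lon])  # Euclidean in degree space
    return node_ids[i]

print(nearest_node(40.7, -74.1))  # 101
```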
simonw · 14h ago
From poking around in the source code I found this 282M SQLite database:
I can't figure out how to read it though. I get this error:
Connection to NTAD_North_American_Roads_3862439624850511818.geodatabase failed check: no such module: VSRS
As far as I can tell VSRS is a proprietary Esri thing.
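The container itself is plain SQLite, so the schema should still be inspectable even without the Esri extension. Something like this ought to list the tables, though queries that touch VSRS-backed virtual tables will still fail:

```python
import sqlite3

con = sqlite3.connect("NTAD_North_American_Roads_3862439624850511818.geodatabase")
try:
    # sqlite_master is an ordinary table, so no loadable module is needed here.
    for (name,) in con.execute("SELECT name FROM sqlite_master WHERE type='table'"):
        print(name)
finally:
    con.close()
```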
ivanbelenky · 14h ago
"This NTAD dataset is a work of the United States government as defined in 17 U.S.C. § 101 and as such are not protected by any U.S. copyrights. This work is available for unrestricted public use."
I based my work on this; maybe the link is down, thx for testing. The dataset has already been consumed and collapsed into a smaller graph representation.
Centigonal · 14h ago
if you follow the link in the github readme, there are a few other download options for the dataset, like CSV or shapefile
schemathings · 2h ago
The feature I always wish for with nav software is 'go back the same way'.
django-john · 5h ago
Nice. Clean and lightweight compared to full routing stacks like OSRM or Graphhopper, which can be a bit much for smaller/local tasks. Curious why the README calls it an "approximation" though — Dijkstra gives exact shortest paths unless you're simplifying inputs. Maybe that's referring to free-flow assumptions or limitations in the underlying network data? Still, cool to see a focused US-only tool.
culopatin · 10h ago
I'd love to see if I could assist in adding road-type filters, such as "avoid multi-lane highways", for example.
ivanbelenky · 10h ago
Contributions are welcome!! For context, these are the road classes in the dataset:

FREEWAY = 1    # Freeway (multi-lane, controlled access)
PP_TH = 2      # Primary provincial/territorial highway
SP_TH_MA = 3   # Secondary provincial/territorial highway / municipal arterial
MC_USP_TH = 4  # Municipal collector / unpaved secondary provincial/territorial highway
LS_WR = 5      # Local street / winter road

3 was the sweet spot. The dataset can be explored here in case you want to get an intuition on the detail level: https://geodata.bts.gov/datasets/usdot::north-american-roads...
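If anyone wants a starting point, here is a hedged sketch of such a filter over a networkx graph, assuming a simple `Graph` whose edges carry a `road_class` attribute with the codes above:

```python
import networkx as nx

def without_freeways(G):
    """'Avoid multi-lane highways': keep every edge except class 1 (FREEWAY)."""
    keep = [(u, v) for u, v, c in G.edges(data="road_class") if c != 1]
    return G.edge_subgraph(keep).copy()

def up_to_class(G, max_class=3):
    """Keep edges whose class is at most max_class (3 was the sweet spot above)."""
    keep = [(u, v) for u, v, c in G.edges(data="road_class") if c <= max_class]
    return G.edge_subgraph(keep).copy()
```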
protocolture · 11h ago
Came here to complain about US Telcos being willing to do anything other than enabling dynamic routing.
Glad to see this is for roads.
CamperBob2 · 11h ago
Edit: thanks very much for the suggestions, especially adding the Python version to the uv command line. I totally missed that, and that totally fixed it. Apologies for the OT tech support derailment.
--------------
Question for those familiar with uv. US Routing apparently requires a very specific Python version (3.11 and nothing else), but my system has Python 3.10.9 installed at the moment and I'd rather not upgrade the global version just now. My understanding from reading a lot of uv evangelism on HN and elsewhere is that uv fixes this type of dilemma. But, having just tried to use it to install this package, it's just giving me the same old Python version errors:
C:\devel\us-routing-master\us_routing>uv venv
Using CPython 3.10.9 interpreter at: c:\WinPython-31090\python-3.10.9.amd64\python.exe
Creating virtual environment at: .venv
Activate with: .venv\Scripts\activate

C:\devel\us-routing-master\us_routing>.venv\Scripts\activate

(us_routing) C:\devel\us-routing-master\us_routing>uv pip install us-routing
x No solution found when resolving dependencies:
`-> Because the current Python version (3.10.9) does not satisfy Python>=3.11,<3.12 and us-routing==0.1.0 depends on Python>=3.11,<3.12, we can conclude that us-routing==0.1.0 cannot be used.
And because only us-routing==0.1.0 is available and you require us-routing, we can conclude that your requirements are unsatisfiable.
Am I misunderstanding the whole uv thing, or just doing something wrong? Or is us-routing somehow incompatible with it?
nighthawk454 · 11h ago
By the looks of things, uv is telling you it’s creating a venv with Python 3.10.9 still. Since you didn’t specify you wanted another version, it probably defaulted to the first available system version.
What you want is `uv venv --python 3.11` to create a virtual environment with Python 3.11 without messing up your global system env. This should also install a portable version of Python 3.11 if needed (which it will be, since you don't have it). See https://docs.astral.sh/uv/pip/environments/#creating-a-virtu...
This library can use any version of Python 3.11, which you can install alongside your existing 3.10.9 without changing your global Python version. I don't typically work in Windows, so the code block below is AI generated, but it follows the path I would normally take: manage installed Python versions using pyenv, change the Python version only for this directory via .python-version, and then create the uv environment using that.
```
# Step 1: Install pyenv-win (make sure Git is installed)
git clone https://github.com/pyenv-win/pyenv-win.git "$env:USERPROFILE\.pyenv"

# Step 2: Add pyenv to PATH (run in PowerShell, or add to your profile)
$env:PYENV = "$env:USERPROFILE\.pyenv\pyenv-win"
$env:PATH += ";$env:PYENV\bin;$env:PYENV\shims"

# Step 3: Restart your terminal or reload the environment if needed
# (you can paste the above $env:... lines again after restart)

# Step 4: Install Python 3.11
pyenv install 3.11.9

# Step 5: Set the local Python version for your project folder
cd C:\devel\us-routing-master\us_routing
pyenv local 3.11.9

# Step 6: Verify the correct Python is selected
pyenv which python  # should point to 3.11.x

# Step 7: Create the uv environment using Python 3.11
uv venv .venv --python "$(pyenv which python)"

# Step 8: Activate the environment
.venv\Scripts\activate

# Step 9: Install your package
uv pip install us-routing
```
pyenv is a great way to have many versions of Python installed, whether or not your global is mapped to the latest. You don't even need to set the local .python-version; you could just do `uv venv .venv --python=python3.11`.
ivanbelenky · 11h ago
This is a project I did not maintain much over the past few months, but I recently migrated many other codebases to be uv-managed and find it optimal for many reasons. Happy to receive contributions that relax the quite strict requirements I set; that would probably be the fastest way.
thelastbender12 · 11h ago
You need to request a specific Python version compatible with this project. Give `uv venv --python 3.11` a try; see https://docs.astral.sh/uv/pip/environments/