Just a few months back I said I would never use uv. I was already used to venv and pip. No need for another tool I thought.
I now use uv for everything Python. The reason for the switch was a shared server where I did not have root and there were all sorts of broken packages/drivers and I needed pytorch. Nothing was working and pip was taking ages. Each user had 10GB of storage allocated and pip's cache was taking up a ton of space & not letting me change the location properly. Switched to uv and everything just worked
If you're still holding out, really just spend 5 minutes trying it out, you won't regret it.
tetha · 2m ago
For me, the big key was: uv is so much easier to explain and especially use - especially for people who sometimes script something in python and don't do this daily.
pip + config file + venv requires you to remember ~2 steps to get the right venv - create one and install stuff into it, and for each test run, script execution and such, you need to remember a weird shebang-format, or to activate the venv. And the error messages don't help. I don't think they could help, as this setup is not standardized or blessed. You just have to beat a connection of "Import Errors" to venvs into your brain.
It's workable, but teaching this to people unfamiliar with it has reminded me how... squirrely the whole tooling can be, for lack of a better word.
Now, team members need to remember "uv run", "uv add" and "uv sync". It makes the whole thing so much easier and less intimidating to them.
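For anyone curious, a sketch of what that daily loop looks like (project and package names invented):

    $ uv init myproj && cd myproj    # writes a starter pyproject.toml
    $ uv add requests                # record the dependency and install it into .venv
    $ uv run python main.py          # runs inside the project venv, creating/syncing it as needed
    $ uv sync                        # bring .venv back in line with the lockfile

No activation step anywhere, which is most of the teaching win.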
yjftsjthsd-h · 40m ago
> Each user had 10GB of storage allocated and pip's cache was taking up a ton of space & not letting me change the location properly. Switched to uv and everything just worked
Is it better about storage use? (And if so, how? Is it just good at sharing what can be shared?)
fsh · 35m ago
uv hardlinks identical packages, so adding virtual envs takes up very little space.
snerbles · 15m ago
Unless you cross mount points, which uv will helpfully warn about.
psychoslave · 47m ago
I wonder how it compares with something more generalist like "mise", to which I migrated after using "asdf" for some time.
wrboyce · 40m ago
I use both! uv installed globally with mise, and uv tools can then be managed via "mise use -g pipx:foo".
espdev · 1h ago
> Just a few months back I said I would never use uv. I was already used to venv and pip. No need for another tool I thought
Really? :)
requirements.txt is just hell and torture. If you've ever used modern project/dependency management tools like uv, Poetry, PDM, you'll never go back to pip+requirements.txt. It's crazy and a mess.
uv is super fast and a great tool, but it still has rough edges and bugs.
aequitas · 59m ago
Pip-tools+requirements.txt helped me survive the past few years. I also never thought I needed uv, but after all the talk about it I gave it a spin and never went back. It's just so blazing fast and convenient.
_Algernon_ · 49m ago
pip also works with pyproject.toml. Sticking with requirements.txt is a self-imposed constraint.
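For instance, with a recent pip/setuptools, a minimal pyproject.toml is enough (names are placeholders):

    [project]
    name = "myapp"
    version = "0.1.0"
    dependencies = ["requests"]

and `pip install -e .` resolves and installs the dependencies with no requirements.txt in sight.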
d00mB0t · 1m ago
Rust...now with more Rust! Rust is turning into systemd, soon we'll only need to ship a single systemd binary written in Rust :-/
polivier · 2h ago
The first time I used `uv`, I was sure that I had made a mistake or typed something wrong because the process finished so much more quickly than anything I had ever experienced with `pip`.
tux3 · 1h ago
I've sometimes had uv take up to 200ms to install packages, so you could feel a slight delay between pressing enter and the next shell prompt
You don't have that problem with Poetry. You go make a cup of coffee for a couple minutes, and it's usually done when you come back.
Numerlor · 38m ago
It's funny when the exact same thing was probably said about pipenv and poetry
baby · 2h ago
Same here lol! The experience is so smooth it doesn't feel like python
johnfn · 1h ago
That makes sense, because it's Rust. :)
augustflanagan · 2h ago
I just had this same experience last week, and was certain it wasn’t working correctly as well. I’m a convert.
nialse · 2h ago
Likewise. I was skeptical, then I tried it and won’t go back.
larkost · 28m ago
Just a warning in case others run into it: on very anemic systems (e.g.: AWS T2.micro running Windows, yes... I know...) uv will try to open too many simultaneous downloads, overloading things, resulting in timeouts.
You can use the env variable UV_CONCURRENT_DOWNLOADS to limit this. In my case it needed to be 1 or 2. Anything else would cause timeouts.
An extreme case, I know, but I think that uv is too aggressive here (a download thread for every module); it should use the aggregate speed from each source server as a way of auto-tuning per-server threading.
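For reference, the workaround is just (the right limit varies by machine):

    $ UV_CONCURRENT_DOWNLOADS=2 uv pip install torch

or export the variable in your shell profile so it sticks.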
ehsankia · 9m ago
Not extreme at all. A lot of people use the cheapest, smallest VPS for their hobby work. I know I do (albeit not AWS). Thanks for sharing, hope they improve the automatic detection there.
theLiminator · 3h ago
uv and ruff are a great counterexample to all those people who say "never reinvent the wheel". Don't ever do it just for the sake of doing it, but if you have focused goals you can sometimes produce a product that's an order of magnitude better.
eviks · 2h ago
They didn't reinvent the wheel, "just" replaced all the wood with more durable materials to make it handle rotation at 10 times the speed
socalgal2 · 1h ago
I'd be curious to know exactly what changed. Python -> Rust won't make network downloads faster nor file I/O faster. My naive guess is that all the speed comes from choosing better algorithms and/or parallelizing things. Not from Python vs Rust (though if it's hard to parallelize in Python and easy in rust that would certainly make a difference)
ekidd · 1h ago
I've translated code from Ruby to Rust, and other code from Python to Rust.
Rust's speed advantages typically come from one of a few places:
1. Fast start-up times, thanks to pre-compiled native binaries.
2. Large amounts of CPU-level concurrency with many fewer bugs. I'm willing to do ridiculous threading tricks in Rust I wouldn't dare try in C++.
3. Much lower levels of malloc/free in Rust compared to some high-level languages, especially if you're willing to work a little for it. Calling malloc in a multithreaded system is basically like watching the Millennium Falcon's hyperdrive fail. Also, Rust encourages abusing the stack to a ridiculous degree, which further reduces allocation. It's hard to "invisibly" call malloc in Rust, even compared to a language like C++.
4. For better or worse, Rust exposes a lot of the machinery behind memory layout and passing references. This means there's a permanent "Rust tax" where you ask yourself "Do I pass this by value or reference? Who owns this, and who just borrows it?" But the payoff for that work is good memory locality.
So if you put in a modest amount of effort, it's fairly easy to make Rust run surprisingly fast. It's not an absolute guarantee, and there are a couple of traps for the unwary (like accidentally forgetting to buffer I/O, or benchmarking debug binaries).
the8472 · 1h ago
NVMe hungers, and keeping it fed is hard work. A serial read, decompress, checksum, write loop will leave it starved (QD<1) whenever you're doing anything but the last step.
Disk IO isn't async unless you use io_uring (well ok, writeback caches can be). So threads are almost a must to keep NVMe busy.
Conversely, waiting for blocking IO (e.g. directory enumeration) will keep your CPU starved. Here too the answer is more threads.
jerpint · 1h ago
From just my observations they basically parallelized the install sequence instead of having it be sequential (among many other optimizations most likely)
physicsguy · 58m ago
The package resolution is a big part of it, it's effectively a constraint solver. I.e. if package A requires package B constrained between version 1.0 < X <= 2.X and Package B requires package C between... and so on and so on.
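A toy example of the kind of conflict the solver has to untangle (names and versions invented):

    app   requires A and C
    A 2.0 requires B >=1.5
    C 3.0 requires B <1.4
    -> no version of B satisfies both, so the solver backtracks
       and tries older releases of A or C until the set is consistent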
Conda rewrote their package resolver for similar reasons
tl;dw Rust, a fast SAT solver, micro-optimisation of key components, caching, and hardlinks/CoW.
0cf8612b2e1e · 2h ago
The history of Python package management is clear that everyone thinks they can do a better job than the status quo.
psunavy03 · 2h ago
In this case, they were right.
nickelpro · 1h ago
uv is purely a performance improvement, it changes nothing about the mechanics of Python environment management or packaging.
The improvements came from lots of work from the entire python build system ecosystem and consensus building.
0cf8612b2e1e · 1h ago
Disagree in that uv makes switching out the underlying interpreter so straightforward. Becomes trivial to swap from say 3.11 to 3.12. The pybi idea.
Sure, other tools could handle the situation, but being baked into the tooling makes it much easier to bootstrap different configurations.
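As a sketch, the swap can be as small as (assuming uv-managed interpreters):

    $ uv python install 3.12
    $ uv python pin 3.12    # writes .python-version
    $ uv sync               # recreates the venv on 3.12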
nickelpro · 1h ago
Yes, it's faster and better than pyenv, but the mechanism it's using (virtual environments) is not a uv invention.
uv does the Python ecosystem better than any other tool, but it's still the standard Python ecosystem as defined in the relevant PEPs.
pityJuke · 21m ago
Are the lock files standardised, or a uv-specific thing?
globular-toast · 27m ago
Actually not true. One of the main differences with uv is you don't have to think about venvs any more. There's a talk about it from one of the authors at a recent PyCon here: https://www.youtube.com/watch?v=CV8KRvWKYDw (not the same talk I linked elsewhere in the thread).
henry700 · 1h ago
Of course they do, this tends to happen when the history is it being hot flaming garbage.
mort96 · 2h ago
Honestly "don't reinvent the wheel" makes absolutely no sense as a saying. We're not still all using wooden discs as wheels, we have invented much better wheels since the neolithic. Why shouldn't we do the same with software?
simonw · 1h ago
When asked why he had invented JSON when XML already existed, Douglas Crockford said:
The good thing about reinventing the wheel is that you can get a round one.
https://scripting.wordpress.com/2006/12/20/scripting-news-fo...
You can get a round one. Or you can make yet another wonky shaped one to add to the collection, as ended up being the case with JSON.
haiku2077 · 1h ago
Right, wheels are reinvented every few years. Compare tires of today to the ones 20 years ago and the technology and capability is very different, even though they look identical to a casual eye.
My primary vehicle has off-road capable tires that offer as much grip as a road-only tire would have 20-25 years ago, thanks to technology allowing Michelin to reinvent what a dual-purpose tire can be!
nightpool · 31m ago
> Compare tires of today to the ones 20 years ago and the technology and capability is very different, even though they look identical to a casual eye
Can you share more about this? What has changed between tires of 2005 and 2025?
haiku2077 · 3m ago
In short: Better materials and better computational models.
https://www.caranddriver.com/features/a15078050/we-drive-the...
> In the last decade, the spiciest street-legal tires have nearly surpassed the performance of a decade-old racing tire, and computer modeling is a big part of the reason
(written about 8 years ago)
aalimov_ · 1h ago
I always took this saying as meaning that we don’t re-invent the concept of the wheel. For example the Boring company and Tesla hoping to reinvent the concept of the bus/train.. (iirc your car goes underground on some tracks and you get to bypass traffic and not worry about steering)
A metal wheel is still just a wheel. A faster package manager is still just a package manager.
haiku2077 · 1h ago
That's not how I've ever seen it used in practice. People use it to mean "don't build a replacement for anything functional."
jjtheblunt · 2h ago
> an order of magnitude better
off topic, but i wonder why that phrase gets used rather than 10x which is much shorter.
BeetleB · 1h ago
Short answer: Because the base may not be 10.
Long answer: Because if you put a number, people expect it to be accurate. If it was 6x faster, and you said 10x, people may call you out on it.
screye · 1h ago
It's meant to signify a step change.
Order of magnitude change = no amount of incremental changes would make up for it.
In common conversation, the multiplier can vary from 2x - 10x. In the context of some algorithms, order of magnitude can be over the delta rather than absolutes. E.g., an algorithm sees a 1.1x improvement over the previous 10 years; a change that shows a 1.1x improvement by itself overshadows an order-of-magnitude more effort.
For salaries, I've used order-of-magnitude to mean 2x. Good way to show a step change in a person's perceived value in the market.
bxparks · 2h ago
I think of "an order of magnitude" as a log scale. It means somewhere between 3.16X and 31.6X.
jjtheblunt · 1h ago
yeah that's what i meant with 10x, like it's +1 on the exponent, if base is 10.
but i'm guessing what others are thinking, hence the question.
bmacho · 53m ago
Because it's not 10x?
fkyoureadthedoc · 2h ago
- sounds cooler
- 10x is a meme
- what if it's 12x better
Scene_Cast2 · 2h ago
10x is too precise.
chuckadams · 1h ago
Because "magnitude" has cool gravitas, something in how it's pronounced. And it's not meant to be precise, it just means "a whole lot more".
refulgentis · 2h ago
"10x" has been cheapened / heard enough / de facto, is a more general statement than a literal interpretation would indicate. (i.e. 10x engineer. Don't hear that much around these parts these days)
Order of magnitude faces less of that baggage, until it does :)
neutronicus · 1h ago
5x faster is an order of magnitude bc of rounding
mh- · 3h ago
Started using this recently for personal stuff on my laptop. When you're used to pip, it's just confusingly fast. More than once I thought maybe it didn't work because it returned too quickly..
leonheld · 3h ago
I adore the
uv add <mydependencies> --script mycoolscript.py
And then shoving
#!/usr/bin/env -S uv run
on top so I can run Python scripts easily. It's great!
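Put together, a minimal self-contained script might look like this (httpx is just an example dependency; the comment block is the PEP 723 inline metadata that `uv add --script` writes for you):

    #!/usr/bin/env -S uv run
    # /// script
    # dependencies = [
    #     "httpx",
    # ]
    # ///
    import httpx

    print(httpx.get("https://example.com").status_code)

chmod +x it and ./mycoolscript.py fetches its own dependencies on first run.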
simonw · 2h ago
I built a Claude Project with special instructions just teaching it how to do this, which means it can output full scripts for me with inline dependencies based on a single prompt: https://simonwillison.net/2024/Dec/19/one-shot-python-tools/
Claude 4's training cutoff date is March 2025 though, I just checked and it turns out Claude Sonnet 4 can do this without needing any extra instructions:
Python script using uv and inline script dependencies
where I can give it a URL and it scrapes it with httpx
and beautifulsoup and returns a CSV of all links on
the page - their URLs and their link text
Here's the output, it did the right thing with regards to those dependencies: https://claude.ai/share/57d5c886-d5d3-4a9b-901f-27a3667a8581
Using your system instructions for uv for every LLM now since first seeing your post last year, thanks! It's insanely helpful just asking e.g. Claude to give me a python script for XYZ and just using "uv run". I also added:
If you need to run these scripts, use "uv run script-name.py". It will automatically install the dependencies. Stdlibs don't need to be specified in the dependencies array.
since e.g. Cursor often gets confused because the dependencies are not installed and it doesn't know how to start the script. The last sentence is for when LLMs get confused and want to add "json" for example to the dependency array.
varunneal · 2h ago
claude sonnet typically forgets about uv script syntax in my experience. I usually find myself having to paste in the docs every time. By default it wants to use uv project syntax.
intellectronica · 2h ago
It's so cool. I now habitually vibe-code little scripts that I can immediately run. So much nicer than having to manage environments and dependencies:
- https://everything.intellectronica.net/p/the-little-scripter
- https://www.youtube.com/watch?v=8LB7e2tKWoI
- https://github.com/intellectronica/ez-mcp
~~That mutates the project/env in your cwd. They have a lot in their docs, but I think you’d like run --with or uv’s PEP723 support a lot more~~
https://docs.astral.sh/uv/guides/scripts/
PEP723 support is exactly what the poster is using?
kristjansson · 2h ago
Ach, missed the --script, thanks.
jsilence · 2h ago
Using this trick with Marimo.io notebooks in app-mode.
Instant reactive reproducible app that can be sent to others with minimal prerequisites (only uv needs to be installed).
Such a hot combo.
bunderbunder · 13m ago
uv is indeed fast. But I'm also finding that the maintainers' efforts to make it work like Cargo mean it can be more difficult to use in more complex project structures. As sensible as Rust's project management ethos is, you're never going to escape Python's underlying design in a Python project, and friction between the two philosophies may not be avoidable.
One possible alternative is Pants. It's also written in Rust for performance, but has more flexibility baked into the design.
nrvn · 1h ago
this is my new fav for running small executable scripts:
    #!/usr/bin/env -S uv run --script
I really wish that hashbang line was something way WAY easier to remember like `#!/usr/bin/env uvx`. I have to look this up every single time I do it.
PufPufPuf · 2m ago
Sadly hashbangs are technically limited to:
1) Support only absolute paths, making it necessary to use /usr/bin/env, which is in a standardized location, to look up the uv binary
2) Support only a single argument (everything after the space is passed as a single arg, it's not parsed into multiple args like a shell would), making it necessary to use -S to "S"plit the arguments. It's a feature of env itself, for this very use case.
So there isn't really much to do to make it simpler.
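Concretely, those two limitations are why the line has to read like this (`env -S` needs GNU coreutils 8.30+ or a BSD env):

    #!/usr/bin/env -S uv run --script

Without -S, the kernel hands env the single string "uv run --script" as the program name to look up, and it fails.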
marifjeren · 4m ago
Seems a lot of people like this and are happy about it, but I for one am tired of the proliferation of python package management tools.
Many languages have multiple package management tools, but in most languages there are one or two really popular ones.
For python you just have to memorize this basically:
- Does the project have a setup.py? if so, first run several other commands before you can run it. python -m venv .venv && source .venv/bin/activate && pip install -e .
- else does it have a requirements.txt? if so python -m venv .venv && source .venv/bin/activate && pip install -r requirements.txt
- else does it have a pyproject.toml? if so poetry install and then prefix all commands with poetry run
- else does it have a pipfile? pipenv install and then prefix all commands with pipenv run
- else does it have an environment.yml? if so conda env create -f environment.yml and then look inside the file and conda activate <environment_name>
- else I have not had to learn the rules for uv yet
Thank goodness these days I just open up a cursor tab and say "get this project running"
eats_indigo · 3h ago
Love UV!
Also love Ruff from the Astral team. We just cut our linting + formatting over from pylint + Black to Ruff.
Saw lint times drop from 90 seconds to < 1.5 seconds. crazy stuff.
pu_pe · 3h ago
Tried uv a while ago and I was shocked by how fast and easy it is to use. There's basically no reason to use pip anymore, and if you're using only Python there's basically no reason to use conda either.
oceansky · 2h ago
It seems to make pyenv and poetry droppable too.
findalex · 1h ago
and pipx.
6ak74rfy · 2h ago
UV is fast, like FAST. Plus, it removes the need for pyenv (for managing different Python versions) and pip for me. Plus, no need to activate env or anything, `uv run ...` automatically runs your code through the env.
It's nice software.
nomel · 1h ago
> Plus, it removes the need for pyenv
I don't see a way to change current and global versions of python/venvs to run scripts, so that when I type "python" it uses that, without making an alias.
adamckay · 1h ago
If they're your scripts (i.e. your writing/editing them) then you can declare dependencies following the PEP723 format and uv will respect that.
https://docs.astral.sh/uv/guides/scripts/#declaring-script-d...
For running scripts on my personal computer I really don't care for all the dependency management stuff. I just want a single, globally installed latest version of a library, like what pip does. I've never had the problem of an old script breaking, I guess because I just don't run that much software. These things for writing out explicit versions of everything and reinstalling libraries for every single project just add a lot of overhead, extra typing and an extra layer of complexity that I don't care for. I like just typing "pip install requests" and then being able to do "import requests" in any REPL or script for the next few years, occasionally running a Bash alias to upgrade all dependencies (which is a feature that pip incredibly still doesn't have, 14 years later).
I can see how if you've had issues with dependencies you would rave about systems that let you control down to the commit what an import statement actually means, but I like the system that requires the least amount of typing/thinking and I imagine I'm part of a silent majority.
chuckadams · 1h ago
I've been out of the python world for a while now, but I would think a global install should just be a special case of a local install, one that's shared among scripts, which is basically how npm and yarn work. I'm kind of surprised uv doesn't support something like this already. Maybe it should be up to the distro to base the global python package management on uv then?
shpx · 1h ago
I think it does support it, like this
uv pip install --system requests
but it's more typing. If I type 5 characters per second, making me also type "uv --system" is the same as adding 2 seconds of runtime to the actual command, except even worse because the chance of a typo goes up and typing takes energy and concentration and is annoying.
chuckadams · 1h ago
If only there were a way to alias commands, eh?
globular-toast · 17m ago
Word of warning, if you use Linux you can easily break your system like this. Many distros stop you being able to modify the system packages now but that's quite a recent development. You should look into doing user-level installs instead. Don't know about Mac. If you use Windows then you do you. It might break eventually but probably not irrecoverably.
psunavy03 · 3h ago
I'm sold. Never going back to pip/twine/requirements.txt again if I don't have to. I have several projects that all share a common wheel hosted on an internal GitLab instance, and I was able to replace like 10 lines of YAML with just "uv build" and "uv publish." Importing is quick and easy, and you can see what your core dependencies are as opposed to everything just being in one mess of a requirements.txt.
It's sad that Python-only tooling is apparently so incapable that you have to write it in a compiled language.
After that many years of optimization, pure Python still seems to be wishful thinking. Its AI/ML success also comes only from being a shim language around library calls.
globular-toast · 22m ago
So? It's the best language there is for shimming around library calls. Use the right tool for the job. There's no single language that can do it all.
xnyan · 2h ago
Worth using just for 'uv pip' as a replacement for pip on the grounds of being much much faster. I've completely switched to uv and I can't imagine ever going back to pip for this and lots of other outstanding features.
tandav · 2h ago
Still no option to store virtual envs outside projects after a year
https://github.com/astral-sh/uv/issues/1495
This is the most important missing feature to me as well.
I've written a lightweight replacement script to manage named central virtual envs using the same command syntax as virtualenvwrapper. Supports tab completion for zsh and bash: https://github.com/sitic/uv-virtualenvwrapper
A problem I have now, though, is when I jump to a def in my editor it no longer knows which venv to load because it's outside of the project. This somehow used to work with virtualenvwrapper but I'm not sure how.
nchmy · 51m ago
I've been happily using it for a long time, and rye before that.
Just today I set it up on 20 PCs in a computer lab that doesn't have internet, along with vs code and some main packages. Just downloaded the files, made a powershell script and it's all working great with Jupyter etc... Now to get kids to be interested in it...
incognito124 · 2h ago
uv is almost perfect. my only pet peeve is updating dependencies. sometimes I just want to go "uv, bump all my dependencies to the latest versions possible while respecting their constraints". I still haven't found an elegant way to do this, but I have written a script that parses pyproject.toml, removes the deps, and invokes `uv add --upgrade` with them.
other than that, it's invaluable to me, with the best features being uvx and PEP 723
jmtulloss · 2h ago
Does `uv lock --upgrade` not do what you want?
incognito124 · 2h ago
Unfortunately, no. Only `uv.lock` gets updated, but the dependencies in `pyproject.toml` are frozen at their original constraints.
What I want is, if my project depends on `package1==0.4.0` and there are new versions of package1, for uv to try to install the newer version, and to do that for a) all the deps, simultaneously, b) without me explicitly stating the dependencies in the command line since they're already written in the pyproject.toml. an `uv refresh` of sorts
Eridrus · 2h ago
Why not depend on package1>=0.4.0 rather than specifying an explicit version? Then uv will upgrade it to the latest version.
pyproject.toml is meant to encode the actual constraints for when your app will function correctly, not hardcode exact versions, which is what the lockfile is for.
hxtk · 1h ago
If you specify your constraints in pyproject.toml like this: `package1==0.4.0`; then that is the latest (and only) version satisfying your constraints. Not upgrading is expected behavior, because upgrading would violate constraints.
pyproject.toml’s dependency list specifies compatibility: we expect the program to run with versions that satisfy constraints.
If you want to specify an exact version as a validated configuration for a reproducible build with guaranteed functionality, well, that’s what the lock file is for.
In serious projects, I usually write that dependency section by hand so that I can specify the constraints that match my needs (e.g., what is the earliest version receiving security patches or the earliest version with the functionality I need?). In unserious projects, I’ll leave the constraints off entirely until a breakage is discovered in practice.
If `uv` is adding things with `==` constraints, that’s why upgrades are not occurring, but the solution is to relax the constraints to indicate where you are okay with upgrades happening.
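To make that concrete, a dependencies block that permits upgrades might look like (versions illustrative):

    [project]
    dependencies = [
        "package1>=0.4.0",   # any newer release is acceptable
        "package2~=2.1",     # 2.x releases: >=2.1, <3.0
        "package3==1.0.5",   # pinned; will never upgrade
    ]

With specifiers like the first two, `uv lock --upgrade` followed by `uv sync` pulls everything forward as far as the constraints allow.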
incognito124 · 1h ago
> ... the solution is to relax the constraints to indicate where you are okay with upgrades happening.
Yeah, that's pretty much what I've been doing with my workaround script. And btw most of my projects are deeply unserious, and I do understand why one should not do that in any other scenario.
Still, I dream of `uv refresh` :D
wtallis · 2h ago
> What I want is, if my project depends on `package1==0.4.0` and there are new version of package1, for uv to try install the newer version.
I think you're just specifying your dependency constraints wrong. What you're asking for is not what the `==` operator is for; you probably want `~=`.
petters · 1h ago
You are writing your project file incorrectly. It's not a lock file
incognito124 · 1h ago
I never, ever, write my project file[1]. uv {add,remove} is all I ever use.
[1]: I do sometimes write the title or the description. But never the deps themselves
wtallis · 56m ago
Even using `uv add`, you don't have to limit yourself to declaring exact versions when your intention is to allow newer versions.
gschizas · 1h ago
I think what you want is `uv sync --upgrade`
ketozhang · 2h ago
You could either delete the .venv and recreate it or run `uv pip install --upgrade .`
Much prefer not thinking about venvs.
incognito124 · 2h ago
Actually, it won't work. I tried it and running `uv run script.py` just reinstalls the deps back... which is, I admit, the behaviour I expect and want as a user.
neves · 2h ago
It is a venture-capital-backed startup. If I start using uv, what's our protection against the company going rogue?
zffr · 2h ago
Why wouldn't you be able to switch back to using pip ?
kylecordes · 1h ago
If uv disappeared tomorrow, five projects would spring up to make compatible implementations of its functionality.
nullhole · 1h ago
It seems like that'd work as long as you restrict yourself entirely to the pip interface. Stray outside of that, and you start accumulating a real dependency on uv itself.
xyst · 2h ago
Community will fork it and move on. See the following examples:
* Redis -> redict, valkey
* elastic search -> opensearch
* terraform -> opentofu
(Probably a few more but those are the ones that come to mind when they "go rogue")
johncole · 1h ago
I love uv. I am a convert, I use it for everything. One area I find it incredible for: deployment. If I have to launch a docker container and install a lot of python packages it saves so much time and compute.
I also appreciate that it handles most package conflicts and constantly maintains the list of packages as you go. I have gotten myself into a hole or two with packages and dependencies; I can usually solve it by deleting the venv and just using uv to reinstall.
jimjag · 1h ago
This has the same issue as so many package managers for Python, namely, it doesn't provide a way for --no-binary to remain sticky.
There are times when you do NOT want the wheel version to be installed (which is what --no-binary implements in pip), but so many package managers including uv don't provide that core, basic functionality. At least for those that do use pip behind the scenes, like pipenv, one can still use the PIP_NO_BINARY environment variable to ensure this.
So I'll not be migrating any time soon.
csl · 1h ago
Maybe not exactly what you need (sticky) but you can set UV_NO_BINARY=1
See https://docs.astral.sh/uv/reference/environment/#uv_no_binar...
It helps, that's for sure. But this sort of knowledge should not exist in the environment in any case. It should be part of the canonical package list, and not hidden away elsewhere. The whole idea of a dependency manager should be a centralized and consistent way to install everything you need and not be dependent on what values may or may not exist as a env-var.
wtallis · 1h ago
Can you elaborate on the reasons why a package would need to declare that its dependencies must be installed from source rather than from pre-built binaries? I'm having trouble imagining a scenario where that capability would be used as anything other than a workaround for a deeper problem with how your dependencies are packaged.
csl · 1h ago
Yes, I can see how it would make sense to be able to set this in pyproject.toml (typically for private packages).
uv is still quite new though. Perhaps you can open an issue and ask for that?
mixmastamyk · 1h ago
> There are times when you do NOT want the wheel version to be installed
When, why? Should I be doing this?
jimjag · 39m ago
Some wheels, 'lxml' for example, bundle external libraries into their binary that may be incompatible or older than what you would like. This can cause library conflicts.
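For reference, forcing a source build for one package with pip looks like:

    $ pip install --no-binary lxml lxml

(or `--no-binary :all:` to refuse wheels entirely); PIP_NO_BINARY and UV_NO_BINARY are the environment-variable equivalents.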
octo888 · 1h ago
uv has converted people in the same way Google search, Chrome, Git and SSDs did.
Fast is a massive factor.
I haven't used it much, but being so fast, I didn't even stop to think "is it perfect at dependency management?" "does it lack any features?".
lucideng · 2h ago
UV solved any issue I had getting python to run on a machine.
Just `git clone someproject`, `uv run somescript.py`, then mic drop and walk away.
oezi · 2h ago
`uvx` directly from the repo also works nicely
FL33TW00D · 33m ago
uv has completely changed the experience of Python for me.
AJRF · 1h ago
Is there something inherent about Rust that means it's faster at dep resolution than Python? Like where is the speedup coming from?
Or would it be possible to go this fast in python if you cared enough about speed?
Is it a specific thing that rust has an amazing library for? Like Network or SerDe or something?
simonw · 1h ago
The biggest speed-up in uv comes from the way it uses caching and hard links. When you install a package into a virtual environment uv uses a hard link to a previously cached version rather than copying files.
Using Rust is responsible for a lot of speed gains too, but I believe it's the hard linking trick (which could be implemented in any language) that's the biggest win.
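A rough way to observe the hard linking on Linux/macOS, assuming the venv and uv's cache share a filesystem:

    $ uv venv && uv pip install requests
    $ find .venv -type f -name '*.py' -links +1 | head -3

Any file with a link count above 1 is shared with the cache rather than copied.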
tcdent · 1h ago
They came up with a faster resolution algorithm, in addition to implementing it in a faster language.
pip could be made faster based on this, but maybe not quite as fast.
_bent · 1h ago
The resolution algorithm is the PubGrub algorithm from Dart's package manager, pub, implemented in Rust for Cargo https://youtu.be/LGXx5Bqcgq8
spennant · 1h ago
I moved from pip to poetry a while back, but was introduced to uv recently while working on MCP stuff. I now use uv for everything and haven’t looked back.
decko · 51m ago
What made you switch from poetry?
oezi · 2h ago
I continue to be puzzled why sometimes running uvx (uv tool run) will redownload all dependencies even though it just downloaded them for another tool. Downloading torch 15 times per day gets old even on 500 Mbit/s.
I haven't been able to find any kind of rhyme or reason to it, so I don't know how to explain when it happens or how to better debug it for a bug report.
veganjay · 2h ago
Initially, I used `uv tool run <name>`, but later discovered `uv tool install <name>`. The `install` command downloads dependencies once and caches them for future use. It is similar to how `pipx install <name>` works.
Maybe that functionality isn't implemented the same way for uvx. You could try this equivalent command that is under "uv run" to see if it behaves differently: https://docs.astral.sh/uv/concepts/tools/#relationship-to-uv...
e.g.
$ uv tool install asciinema
$ asciinema play example.cast
carlosdp · 2h ago
I love uv, not just for local development, but it also makes it WAY easier to manage python environments you setup for running python workers / user code in the cloud.
forrestthewoods · 2h ago
Here’s my #1 complaint about uv: I’m new to the python ecosystem. I don’t know anything about pip or the existing tools. I would love for uv to at least have documentation and a user guide that doesn’t assume knowledge of the old bad tools that uv replaces.
Perhaps uv will continue its ascendancy and get there naturally. But I’d like to see uv be a little more aggressive with “uv native” workflows. If that makes sense.
wrs · 2h ago
That exists! [0] But IMHO the guides should be linked in big text as the first thing on the homepage. Right now you have to read through a bunch of meaningless comparative bullet points and reassurances then click a few times to get to the guides. If it weren’t for everyone telling me I need to switch, I might not have had the patience to find them.
[0] https://docs.astral.sh/uv/guides/
uv has become essential for me. conda and virtualenv never worked smoothly for me, but uv was easy and "just worked" from day 1.
gamegod · 1h ago
Bragging that your program is faster than anything written in Python is a low bar, lol.
Also, it seems like a sign that even Python tooling needs to not be written in Python now to get reasonable performance.
renewiltord · 1h ago
The Astral projects are all great. I hope the company finds a revenue stream in the future with some hosted something or the other because these tools are so useful I don't want them to become pay-for etc.
mikevm · 1h ago
I think http://pixi.sh is much cooler because it supports conda environments, so you can install non-Python packages as well (e.g., gcc).
I don't want to charge people money to use our tools, and I don't want to create an incentive structure whereby our open source offerings are competing with any commercial offerings (which is what you see with a lot of hosted-open-source-SaaS business models).
What I want to do is build software that vertically integrates with our open source tools, and sell that software to companies that are already using Ruff, uv, etc. Alternatives to things that companies already pay for today.
An example of what this might look like (we may not do this, but it's helpful to have a concrete example of the strategy) would be something like an enterprise-focused private package registry. A lot of big companies use uv. We spend time talking to them. They all spend money on private package registries, and have issues with them. We could build a private registry that integrates well with uv, and sell it to those companies. [...]
But the core of what I want to do is this: build great tools, hopefully people like them, hopefully they grow, hopefully companies adopt them; then sell software to those companies that represents the natural next thing they need when building with Python. Hopefully we can build something better than the alternatives by playing well with our OSS, and hopefully we are the natural choice if they're already using our OSS.
leobuskin · 18m ago
They are hijacking the entire Python ecosystem in a very smart way, that's all. At some point we will probably find ourselves vendor locked-in, just because the initial offer was so appealing. Take a closer look at it: package manager, formatter/linter, types, LSP. What's left before it pokes CPython one way or another? Maybe a cloud-based IDE, some interesting WASM relationship (but RustPython is not there yet; they just don't have enough money). Otherwise, Astral is on a pretty straightforward path to `touchdown` in a few years. It's both the blessing and the curse.
Let's be honest, all attempts to bring a CPython alternative have failed (niche boosters like PyPy are a separate story, but it's not up to date, and not entirely exact). For some reason, people think that 1:1 compatibility is not critical and too costly to pursue (hello, all LLVM-based compilers). I think it's doable and there's a solid way to solve it. What if Astral thinks so too?
serjester · 2h ago
Anaconda makes on the order of 100M a year “solving” data science package management. I would argue it has a significantly worse product, attacking a much smaller part of the ecosystem.
It seems easy to imagine Astral following a similar path and making a significant amount of money in the process.
wrs · 1h ago
In theory, Anaconda solves the next higher level of the Python package management nightmare, namely knowing what versions are compatible with each other. But that could presumably be done on top of uv.
colechristensen · 2h ago
Anaconda isn't free. I don't want to pay per-seat fees for slightly improved versions of open source tools which is why I'm very skeptical of Astral and uv.
One day they're going to tell me I have to pay $10/month per user and add a bunch of features I really don't need just because nobody wants to prioritize the speed of pip.
And most of that fee isn't going to go towards engineers maintaining "pip but faster", it's going to fund a bunch of engineers building new things I probably don't want to use, but once you have a company and paying subscribers, you have to have developers actively doing things to justify the cost.
serjester · 2h ago
Enterprises don't care about faster, but they do care an enormous amount about security. Astral is very well positioned here.
SSchick · 2h ago
Apparently VC(1) so far; I'd assume there will be LTS support contracts and tailored enterprise features down the line. For the moment I'd assume it's just a bunch of talented devs fixing problems they've been tired of / see as long-term existential threats to the Python ecosystem.
1: https://old.reddit.com/r/Python/comments/12rk41t/astral_next...
Moved to this and have no need for anything else, especially since `uv pip install whatever` works and is faster than pip (though I usually use uv add).
putna · 3h ago
unfairly fast
mixmastamyk · 1h ago
People always mention the speed, but that's rarely a factor for me.
Rather, pip was broken intentionally two years ago and they are still not interested in fixing it:
https://github.com/pypa/packaging/issues/774
I tried uv and it just worked.