I've recently updated a Python script that I originally wrote about 10 years ago. I'm not a programmer - I just have to get stuff done - think sysops.
For me there used to be a clear delineation between scripting languages and compiled languages. Python has always seemed to want to be both and I'm not too sure it can really. I can live with being mildly wrong about a concept.
When Python first came out, our processors were 80486 at best and RAM was measured in MB at roughly £30/MB in the UK.
"For the longest time, ..." - all distros have had scripts that find the relevant Python or Java or whatevs so that's simply daft. They all have shebang incantations too.
So we now have uv written in Rust for Python. Obviously you should install it via a shell script directly from curl!
I love all of the components involved here but please for the love of a nod to security at least suggest that the script is downloaded first, looked over and then run.
I recently came across a Github hosted repo with scripts that changed Debian repos to point somewhere else and install ... software. I'm sure that's all fine too.
curl | bash is cute and easy and very, very insecure.
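Concretely, the safer pattern is only two extra commands (using uv's published install URL):
$ curl -LsSf https://astral.sh/uv/install.sh -o install.sh
$ less install.sh   # actually read it before running it
$ sh install.sh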
puika · 2h ago
Like the author, I find myself going more for cross-platform Python one-offs and personal scripts for both work and home and ditching Go. I just wish Python typechecking weren't the shitshow it is. Looking forward to ty, pyrefly, etc. to improve the situation a bit
ACAVJW4H · 4h ago
finally feels like Python scripts can Just Work™ without a virtualenv scavenger hunt.
Now if only someone could do the same for shell scripts. Packaging, dependency management, and reproducibility in shell land are still stuck in the Stone Ages. Right now it’s still curl | bash and hope for the best, or a README with 12 manual steps and three missing dependencies.
Sure, there’s Nix... if you’ve already transcended time, space, and the Nix manual. Docker? Great, if downloading a Linux distro to run sed sounds reasonable.
There’s got to be a middle ground: simple, declarative, and built for humans.
wpm · 21m ago
I simply do not write shell scripts that use or reference binaries/libraries that are not pre-installed on the target OS (which is the correct target; writing shell scripts for portability is silly).
There is no package manager that is going to make a shell script I write for macOS work on Linux if that script uses commands that only exist on macOS.
If you're allowed to install any deps go with uv, it'll do the rest.
I'm also kinda in love with https://babashka.org/ check it out if you like Clojure.
traverseda · 3h ago
I don't think nix is that hard for this particular use case. Installing nix on other distros is pretty easy, and once it's installed you just do something like this:
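For instance, a nix-shell shebang that pins the script's dependencies (jq and curl here are just stand-in packages):
#! /usr/bin/env nix-shell
#! nix-shell -i bash -p jq curl
# jq and curl come from nixpkgs, regardless of what the host distro ships
curl -s https://api.github.com/repos/astral-sh/uv | jq .stargazers_count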
> Packaging, dependency management, and reproducibility in shell land are still stuck in the Stone Ages.
IMO it should stay that way, because any script that needs those things is way past the point where shell is a reasonable choice. Shell scripts should be small, 20 lines or so. The language just plain sucks too much to make it worth using for anything bigger.
We use mise at $work to manage dev envs and it's much easier than Docker and Nix.
It also installs things in parallel, which is a huge bonus over plain Dockerfiles.
andenacitelli · 52m ago
+1 for Mise, it has just totally solved the 1..N problem for us and made it hilariously easy to be more consistent across local dev and workflows
password4321 · 2h ago
I'm unable to resist responding that clearly the solution is to run Nix in Docker as your shell since packaging, dependency management, and reproducibility will be at theoretical maximum.
fouronnes3 · 4h ago
Consider porting your shell scripts to Python? The language is vastly superior and subprocess.check_call is not so bad.
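For example, a typical one-liner ports over as something like this (rsync and the paths are illustrative):
import subprocess

# shell equivalent: rsync -a src/ backup/
# raises CalledProcessError if the command exits non-zero
subprocess.check_call(["rsync", "-a", "src/", "backup/"])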
bjackman · 4h ago
For the specific case of solving shell script dependencies, Nix is actually very straightforward. Packaging a script is a writeShellApplication call and calling it is a `nix run`.
I guess the issue is just that nobody has documented how to do that one specific thing so you can only learn this technique by trying to learn Nix as a whole.
So perhaps the thing you're envisaging could just be a wrapper for this Nix logic.
pxc · 3h ago
I use Nix for this with resholve and I like it a lot.
Note that PEP 723 is also supported by pipx run:
https://pipx.pypa.io/latest/examples/#pipx-run-examples
https://simonwillison.net/2024/Dec/19/one-shot-python-tools/
And there was a March discussion of a different blog post:
https://news.ycombinator.com/item?id=43500124
I hope this stays on the front page for a while to help publicize it.
jkingsman · 5h ago
uv has been fantastic to use for little side projects. Combining uv run with `uv tool run` AKA `uvx` means one can fetch, install within a venv, and execute Python scripts from GitHub super easily. No git clone, no venv creation + entry + pip install.
And uv is fast — I mean REALLY fast. Fast to the point of suspecting something went wrong and silently errored, when in fact it did just what I wanted but 10x faster than pip.
It (and especially its docs) are a little rough around the edges, but it's bold enough and good enough I'm willing to use it nonetheless.
lxgr · 5h ago
Truly. uv somehow resolves and installs dependencies more quickly than pyenv manages to print its own --help output.
mikepurvis · 4h ago
I know there are real reasons for slow Python startup time, with every new import having to examine swaths of filesystem paths to resolve itself, but it really is a noticeable breath of fresh air working with tools implemented in Go or Rust that have sub-ms startup.
lxgr · 4h ago
The Python startup latency thing makes sense, but I really don't understand why it would take `pyenv` a long time to print each line of its "usage" output (the one that appears when invoking it with `--help`) once it's already clearly in the code branch that does only that.
It feels like it's doing heavy work between each line printed! I don't know of any other CLI tool that does that, either.
heavyset_go · 2h ago
There's a launcher wrapper shell script + Python startup time that contributes to pyenv's slow launch times.
theshrike79 · 4h ago
The "slowness" and the utter insanity of trying to make a "works on my computer" Python program work on another computer pushed me to just rewrite all my Python stuff in Go.
About 95% of my Python utilities are now Go binaries cross-compiled to whatever env they're running in. The few remaining ones use (API) libraries that aren't available for Go or aren't mature enough for me to trust them yet.
Spivak · 3h ago
Not to derail the Python speed hate train but pyenv is written in bash.
It's a tool for installing different versions of Python, it would be weird for it to assume it already had one available.
lxgr · 2h ago
Oh, that might actually explain the slow line printing speed. Thank you, solves a long standing low stakes mystery for me :)
heavyset_go · 2h ago
Last time I looked, pyenv contributors were considering implementing a compiled launcher for that reason.
But that ship has sailed for me and I'm a uv convert.
satvikpendem · 3h ago
Very nice. I believe Rust is doing something similar too, which is where I initially learned of this idea of single-file shell-type scripts in other languages (with dependency management included, which is how it differs from existing ways of writing single-file scripts in e.g. scripting languages) [0].
Hopefully more languages follow suit on this pattern as it can be extremely useful for many cases, such as passing gists around, writing small programs which might otherwise be written in shell scripts, etc.
[0] https://rust-lang.github.io/rfcs/3424-cargo-script.html
So far I've only run into one minor ergonomic issue when using `uv run --script` with embedded metadata, which is that sometimes I want to test changes to the script via the Python REPL, but that's a bit harder to do since you have to run something like:
$ uv run --python=3.13 --with-requirements <(uv export --script script.py) -- python
>>> from script import X
I'd love if there were something more ergonomic like:
$ uv run --with-script script.py python
Edit: this is better:
$ "$(uv python find --script script.py)"
>>> from script import X
That fires up the correct python and venv for the script. You probably have to run the script once to create it.
You can make `--interactive` (or whatever you want) a CLI flag from the script. I often make these small Typer CLIs with something like that (or in this case, in another dev script like this, I have `--sql` for entering a DuckDB SQL repl).
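A minimal sketch of that pattern (the flag and the REPL behavior are illustrative, not the parent's actual script):
import code

import typer

app = typer.Typer()

@app.command()
def main(interactive: bool = False):
    # Typer turns the keyword argument into an --interactive flag
    if interactive:
        # drop into a REPL with the script's globals in scope
        code.interact(local=dict(globals()))
        raise typer.Exit()
    typer.echo("normal CLI path")

if __name__ == "__main__":
    app()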
mayli · 2h ago
You are welcome:
cat ~/.local/bin/uve
#!/bin/bash
# Export the script's PEP 723 dependencies into a temporary requirements
# file, then edit the script in vim inside a venv that has those deps.
temp=$(mktemp)
uv export --script "$1" --no-hashes > "$temp"
uv run --with-requirements "$temp" vim "$1"
unlink "$temp"
nomel · 2h ago
This is rather silly.
kristianp · 1h ago
If you want to manually manage envs and you're using conda, you can activate the env in a shell wrapper for your python script, like so (this is with conda):
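A sketch of such a wrapper (the env name and script path are hypothetical):
#!/bin/bash
# make `conda activate` usable in a non-interactive shell
source "$(conda info --base)/etc/profile.d/conda.sh"
conda activate myenv
exec python /path/to/myscript.py "$@"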
Admittedly this isn't self contained like the PEP 723 solution.
sambaumann · 4h ago
Between yesterday's thread and this thread I decided to finally give uv a shot today - I'm impressed, both by the speed and how easy it is to manage dependencies for a project.
I think their docs could use a little bit of work, especially there should be a defined path to switch from a requirements.txt based workflow to uv. Also I felt like it's a little confusing how to define a python version for a specific project (it's defined in both .python-version and pyproject.toml)
tdhopper · 3h ago
I've written an ebook on Python developer tooling. I've attempted to address some of the weaknesses in the official documentation.
How to migrate from requirements.txt: https://pydevtools.com/handbook/how-to/migrate-requirements....
How to change the Python version of a uv project: https://pydevtools.com/handbook/how-to/how-to-change-the-pyt...
Let me know if there are other topics I can hit that would be helpful!
wrboyce · 3h ago
This would’ve been really handy for me a few weeks ago when I ended up working this out for myself (not a huge job, but more effort than reading your documentation would’ve been). While I can’t think of anything missing off the top of my head, I do think a PR to uv to update the official docs would help a lot of folk!
Actually, I’ve thought of something! Migrating from poetry! It’s something I’ve been meaning to look at automating for a while now (I really don’t like poetry).
Try `uvx migrate-to-uv` (see https://pypi.org/project/migrate-to-uv/)
> it's defined in both .python-version and pyproject.toml
The `requires-python` field in `pyproject.toml` defines a range of compatible versions, while `.python-version` defines the specific version you want to use for development. If you create a new project with uv init, they'll look similar (>=3.13 and 3.13 today), but over time `requires-python` usually lags behind `.python-version` and defines the minimum supported Python version for the project. `requires-python` also winds up in your package metadata and can affect your callers' dependency resolution, for example if your published v1 supports Python 3.[old] but your v2 does not.
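Concretely, with made-up versions:
# pyproject.toml -- the public compatibility range
[project]
requires-python = ">=3.10"
while .python-version contains just the single line 3.13.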
furyofantares · 41m ago
Same, although I think it doesn't support my idiosyncratic workflow. I have the same files synced (via Dropbox at the moment) on all my computers, macOS and Windows and WSL alike, and I just treat every computer like it's the same computer. I thought this might be a recipe for disaster when I started doing it years ago but I have never had problems.
Some stuff like npm or dotnet do need an npm update / dotnet restore when I switch platforms. At first attempt uv seems like it just doesn't really like this and takes a fair bit of work to clean it up when switching platforms, while using venvs was fine.
zahlman · 2h ago
> how to define a python version for a specific project (it's defined in both .python-version and pyproject.toml)
pyproject.toml is about allowing other developers, and end users, to use your code. When you share your code by packaging it for PyPI, a build backend (uv is not one, but they seem to be working on providing one - see https://github.com/astral-sh/uv/issues/3957 ) creates a distributable package, and pyproject.toml specifies what environment the user needs to have set up (dependencies and python version). It has nothing to do with uv in itself, and is an interoperable Python ecosystem standard. A range of versions is specified here, because other people should be able to use your code on multiple Python versions.
The .python-version file is used to tell uv specifically (i.e., no other tool) exactly (i.e., an exact version) what to do when setting up your development environment.
(It's perfectly possible, of course, to just use an already-set-up environment.)
0cf8612b2e1e · 4h ago
I have never researched this, but I thought the .python-version file only exists to benefit other tools which may not have a full TOML parser.
zahlman · 2h ago
Read-only TOML support has been in the standard library since Python 3.11, though. And it's based on an easily obtained third-party package (https://pypi.org/project/tomli/).
(If you want to write TOML, or do other advanced things such as preserving comments and exact structure from the original file, you'll want tomlkit instead. Note that it's much less performant.)
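For reference, the usual compatibility idiom when you still need to support pre-3.11 Pythons:
try:
    import tomllib  # standard library on Python 3.11+
except ModuleNotFoundError:
    import tomli as tomllib  # third-party backport with the same API

config = tomllib.loads('answer = 42')
print(config["answer"])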
gschizas · 4h ago
> there should be a defined path to switch from a requirements.txt based workflow to uv
Some years ago I thought it would be interesting to develop a tool to make a python script automatically install its own dependencies (like uvx in the article), but without requiring any other external tool, except python itself, to be installed.
The downside is that there are a bunch of seemingly weird lines you have to paste at the beginning of the script :D
If anyone is curious it's on pypi (pysolate).
Not quite the same but interesting!
Grace Hopper technology: A well formed Python program shall define an ENVIRONMENT division that specifies the environment in which the program will be compiled and executed. It outlines the hardware and software dependencies. This division is crucial for making COBOL^H^H^H^H^HPython programs portable across different systems.
tpoacher · 3h ago
What's going on? This whole thread reads like paid amazon reviews
indosauros · 3h ago
What's going on is "we have 14 standards so we need to create a 15th" actually worked this time
kibwen · 2h ago
It works far more of the time than people give it credit for. There are a lot of good XKCDs, but that one is by far the worst one ever made, as far as being a damaging meme goes.
mturmon · 57m ago
"xkcd 927 Considered Harmful" ?
oblio · 2h ago
Occasionally the reviews match reality.
divbzero · 1h ago
If momentum for uv in the community continues, I’d love to see it distributed more broadly. uv can already be installed easily on macOS via Homebrew (like pyenv). uv can also be installed on Windows via WinGet (unlike pyenv). It would be nice to see it packaged for Linux as well.
quibono · 2h ago
Last time I looked at switching from poetry to uv I had an issue with pinning certain dependencies to always install from a private PyPI repository. Is there a way to do that now?
(also: possible there's always been a way and I'm an idiot)
> uv is an extremely fast Python package and project manager, written in Rust.
Is there a version of uv written in Python? It's weird (to me) to have an entire ecosystem for a language and a highly recommended tool to make your system work is written in another language.
sgeisenh · 3h ago
Similar to ruff, uv mostly gathers ideas from other tools (with strong opinions and a handful of thoughtful additions and adjustments) and implements them in Rust for speed improvements.
Interestingly, the speed is the main differentiator from existing package and project management tools. Even if you are using it as a drop-in replacement for pip, it is just so much faster.
zahlman · 2h ago
They are not making a Python version.
There are many competing tools in the space, depending on how you define the project requirements.
Contrary to the implication of other replies, from all the evidence available to me the lion's share of uv's speed advantage over Pip does not come from being written in Rust. It comes from:
* not bootstrapping Pip into each new environment (most people don't know that you don't actually have to bootstrap Pip into a new environment; see https://zahlman.github.io/posts/2025/01/07/python-packaging-... for some hints; my upcoming post will be more direct about it - unfortunately I've been putting it off...)
* being designed up front to install cross-environment (if you want to do this with Pip, you'll eventually and with much frustration get a subtly broken installation using the old techniques; since 22.3 you can just use the `--python` flag, but this limits you to environments where the current Pip can run, and re-launches a new Pip process taking perhaps an additional 200ms - but this is still much better than bootstrapping another copy of Pip!)
* using heuristics when solving for dependencies (Pip's backtracking resolver is exhaustive, and proceeds quite stubbornly in order)
* having a smarter caching strategy (it stores uncompressed wheels in its cache and does most of the "installation" by hard-linking these into the new environment; Pip goes through a proxy that uses some opaque cache files to simulate re-doing the download, then unpacks the wheel again)
* not speculatively pre-loading a bunch of its own code that's unlikely to execute (Pip has large complex dependencies, like https://pypi.org/project/rich/, which it vendors without tree-shaking and ultimately imports almost all of, despite using only a tiny portion)
* having faster default behaviours; e.g. uv defaults to not pre-compiling installed packages to .pyc files (since Python will do this on the first import anyway) while Pip defaults to doing so
* not (necessarily) being weighed down by support for legacy behaviours (packaging worked radically differently when Pip first became publicly available)
* just generally being better architected
None of these changes require a change in programming language. (For example, if you use Python to make a hard link, you just use the standard library, which will then use code written in C to make a system call that was most likely also written in C.) Which is why I'm making https://github.com/zahlman/paper .
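(To make that last point concrete, a sketch of the hard-link trick in pure Python, with hypothetical paths:)
import os
import shutil

cached = "/home/user/.cache/mytool/wheels/requests"  # hypothetical cache dir
target = ".venv/lib/python3.13/site-packages/requests"

# "install" by hard-linking every file out of the cache, so the package's
# bytes exist on disk only once no matter how many venvs use it
shutil.copytree(cached, target, copy_function=os.link)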
jaapz · 2h ago
But also, because it's written in Rust. There are tools written in Python that do these smart caching and resolving tricks as well, and they are still orders of magnitude slower.
zahlman · 56m ago
Such as?
Poetry doesn't do this caching trick. It creates its own cache with the same sort of structure as Pip's, and as far as I can tell it uses its own reimplementation of Pip's core installation logic from there (including `installer`, which is a factored-out package for the part of Pip that actually unpacks the wheel and copies files).
ebb_earl_co · 3h ago
Well, I use Debian and Bash: pretty much everything to make my system work, including and especially Python development, is written in C, another language!
dralley · 3h ago
pip?
A tool written in Python is never going to be as fast as one written in Rust. There are plenty of Python alternatives and you're free to use them.
There has been a flurry of `uv` posts on HN recently. I don't have any experience with it, is it really the future, or is it a fad?
As I've gotten older I've grown weary of third-party tools, and almost always try to stick with the first-party built-in methods for a given task.
Does uv provide enough benefit to make me reconsider?
Disposal8433 · 4h ago
I'm not a Python master but I've struggled with all the previous package managers, and uv is the first tool that does everything easily (whether it's installing or generating packages or formatting or checking your code).
I don't know why there is such a flurry of posts since it's a tool that is more than a year old, but it's the one and only CLI tool that I recommend when Python is needed for local builds or on a CI.
Hatch was a good contender at the time but they didn't move fast enough, and the uv/ruff team ate everybody's lunch. uv is really good and IMHO it's here to stay.
Anyway try it for yourself but it's not a high-level tool that is hiding everything, it's fast and powerful and yet you stay in control. It feels like a first-party tool that could be included in the Python installer.
eipipuz · 4h ago
The learning curve is so low that yes.
Try it for <20mins and if you don't like it, leave it behind. These 20mins include installation, setup, everything.
codethief · 2h ago
Yes, IMO it does. I wrote my first lines of Python 16 years ago and have worked with raw pip & venv, PDM and Poetry. None of those solutions come close to how easy it is to use (and migrate to) uv. Just give it a try for half an hour, you likely won't want to use anything else after that.
collinmcnulty · 4h ago
I also went through a similar enlightenment of just sticking to pip, but uv convinced me to switch and I'm so glad I did. You can dip your toe in by just using the `uv pip` subcommand as a drop-in replacement for pip, but way faster.
padjo · 2h ago
I’m a moron when it comes to python tooling but switching a project to uv was a pleasant experience. It seems well thought out and the speed is genuinely a feature compared to other python tooling I’ve used.
zahlman · 2h ago
A lot of people like all-in-one tools, and uv offers an opinionated approach that works. It's essentially the last serious attempt at this since Poetry, except that uv is also supporting a variety of new Python packaging standards up front (most notably https://peps.python.org/pep-0621/ , which Poetry lagged on for years - see https://github.com/python-poetry/roadmap/issues/3 ) and seems committed to keeping on top of new ones.
How much you can benefit depends on your use case. uv is a developer tool that also manages installations of Python itself (and maintains separate environments for which you can choose a Python version). If you're just trying to install someone else's application from PyPI - say https://pypi.org/project/pycowsay/ as an example - you'll likely have just as smooth of an experience via pipx (although installation will be even slower than with pip, since it's using pip behind the scenes and adding its own steps). On the other hand, to my understanding, to use uv as a developer you'll still need to choose and install a build backend such as Flit or Hatchling, or else rely on the default Setuptools.
One major reason developers are switching to uv is lockfile support. It's worth noting here that an interoperable standard for lockfiles was recently approved (https://peps.python.org/pep-0751/), uv will be moving towards it, and other tools like pip are moving towards supporting it (the current pip can write such lockfiles, and installing from them is on the roadmap: https://github.com/pypa/pip/issues/13334).
If you, like me, prefer to follow the UNIX philosophy, a complete developer toolchain in 2025 looks like:
* Python itself (if you want standalone binaries like the ones uv uses, you can get them directly; you can also build from source like I do; if you want to manage Python installations then https://github.com/pyenv/pyenv is solid, or you can use the multi-language https://asdf-vm.com/guide/introduction.html with https://github.com/asdf-community/asdf-python I guess)
* Ability to create virtual environments (the standard library takes care of this; some niche uses are helped out by https://virtualenv.pypa.io/)
* Package installer (Pip can handle this) and manager (if you really want something to "manage" packages by installing into an environment and simultaneously updating your pyproject.toml, or things like that; but just fixing the existing environment is completely viable, and installers already resolve dependencies for whatever it is they're currently installing)
* Build frontend (the standard is https://build.pypa.io/en/stable/; for programmatic use, you can work with https://pyproject-hooks.readthedocs.io/en/latest/ directly)
* Build backend (many options here - by design! but installers will assume Setuptools by default, since the standard requires them to, for backwards compatibility reasons)
* Support for uploading packages to PyPI (the standard is https://twine.readthedocs.io/en/stable/)
* Optional: typecheckers, linters, an IDE etc.
A user on the other hand only needs
* Some version of Python (the one provided with a typical Linux distribution will generally work just fine; Windows users should usually just install the current version, with the official installer, unless they know something they want to install isn't compatible)
* Ability to create virtual environments and also install packages into them (https://pipx.pypa.io/stable/ takes care of both of these, as long as the package is an "application" with a defined entry point; I'm making https://github.com/zahlman/paper which will lift that restriction, for people who want to `import` code but not necessarily publish their own project)
* Ability to actually run the installed code (pipx handles this by symlinking from a standard application path to a wrapper script inside the virtual environment; the wrappers specify the absolute path to the virtual environment's Python, which is generally all that's needed to "use" that virtual environment for the program. It also provides a wrapper to run Pip within a specific environment that it created. PAPER will offer something a bit more sophisticated here, for both aspects.)
giantrobot · 3h ago
It is difficult to use Python for utility scripts on the average Linux machine. Deploying Python projects almost requires using a container. Popular distros try managing Python packages through the standard package manager rather than pip, but not all packages are readily available. Sometimes you're limited by Python version and it can be non-trivial to have multiple versions installed at once. Python packaging has become a shit show.
If you use anything outside the standard library the only reliable way to run a script is installing it in a virtual environment. Doing that manually is a hassle and pyenv can be stupidly slow and wastes disk space.
With uv it's fast and easy to set up throw away venvs or run utility scripts with their dependencies easily. With the PEP-723 scheme in the linked article running a utility script is even easier since its dependencies are self-declared and a virtual environment is automatically managed. It makes using Python for system scripting/utilities practical and helps deploy larger projects.
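For anyone who hasn't seen it, a PEP 723 header looks like this (requests is just an example dependency):
#!/usr/bin/env -S uv run --script
# /// script
# requires-python = ">=3.12"
# dependencies = ["requests"]
# ///
import requests

print(requests.get("https://example.com").status_code)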
zahlman · 1h ago
> Deploying Python projects almost requires using a container.
Really? `apt install pipx; pipx install sphinx` (for example) worked flawlessly for me. Pipx is really just an opinionated wrapper that invokes a vendored copy of Pip and the standard library `venv`.
The rest of your post seems to acknowledge that virtual environments generally work just fine. (Uv works by creating them.)
> Sometimes you're limited by Python version and it can be non-trivial to have multiple versions installed at once.
I built them from source and make virtual environments off of them, and pass the `--python` argument to Pipx.
> If you use anything outside the standard library the only reliable way to run a script is installing it in a virtual environment. Doing that manually is a hassle and pyenv can be stupidly slow and wastes disk space.
If you're letting it install separate copies of Python, sure. (The main use case for pyenv is getting one separate copy of each Python version you need, if you don't want to build from source, and then managing virtual environments based off of that.) If you're letting it bootstrap Pip into the virtual environment, sure. But you don't need to do either of those things. Pip can install cross-environment since 22.3 (Pipx relies on this).
Uv does save disk space, especially if you have multiple virtual environments that use the same packages, by hard-linking them.
> With uv it's fast and easy to set up throw away venvs or run utility scripts with their dependencies easily. With the PEP-723 scheme in the linked article running a utility script is even easier since its dependencies are self-declared and a virtual environment is automatically managed.
Pipx implements PEP 723, which was written to be an ecosystem-wide standard.
kristianp · 3h ago
Does this create a separate environment for each script? If so, won't that create lots of bloat?
zahlman · 1h ago
It does create separate environments.
Each environment itself only takes a few dozen kilobytes to make some folders and symlinks (at least on Linux). People think of Python virtual environments as bloated (and slow to create) because Pip gets bootstrapped into them by default. But there is no requirement to do so.
The packages take up however much space they take up; the cost there is unavoidable. Uv hard-links packages into the separate environments from its cache, so you only pay a disk-space cost for shared packages once (plus a few more kilobytes for more folders).
(Note: none of this depends on being written in Rust, but Pip doesn't implement this caching strategy. Pip can, however, install cross-environment since 22.3, so you don't actually need the bootstrap. Pipx depends on this, managing its own vendored copy of Pip to install into multiple environments. But it's still using a copy of Pip that interacts with a Pip-styled cache, so it still can't do the hard-link trick.)
JimDabell · 3h ago
Yes, it creates a separate environment for each script. No, it doesn’t create a lot of bloat. There’s a separate cache and the packages are hard-linked into the environments, so it’s extremely fast and efficient.
k__ · 5h ago
Pretty nice!
Some Python devs told me it's an awesome language, but they envy the Node.js ecosystem for its package management.
Seems like uv finally removed that roadblock.
Y_Y · 4h ago
I think they must have been joking!
wavemode · 4h ago
Probably not. NPM has its problems but Python packaging has always been significantly messier (partly because Python is much older than Node and, indeed, much older than the very concept of resolving dependencies over the internet).
int_19h · 2h ago
The upside in Python is that dependencies tend to be more coarse grained and things break less when you update. With JS you have to be on the treadmill constantly to avoid bitrot, and because packages tend to be so small and dependency trees so large, there's a lot of potential points of failure when updating anything.
oblio · 2h ago
The bigger problem in Python has been its slowness and reliance on C dependencies.
Maven solved Java packaging circa 2005, for example. Yes, XML is verbose, but it's an implementation detail. Python still lags on many fronts, 20 years later.
An example: even now it makes 0 sense to me why virtual envs are not designed and supposed to be portable between machines with the same architecture (!). Or why venvs need to be activated with shell-variety specific code.
zahlman · 1h ago
> An example:
None of this example has anything to do with performance or reliance on C dependencies, but ok.
> even now it makes 0 sense to me why virtual envs are not designed and supposed to be portable between machines with the same architecture (!).
They aren't designed to be relocatable at all - and that's the only actual stumbling block to it. (They may even contain activation scripts for other platforms!)
That's because a bunch of stuff in there specifies absolute paths. In particular, installers (Pip, at least) will generate wrapper scripts that specify absolute paths. This is so that you can copy them out of the environment and have them work. Yes, people really do use that workflow (especially on Windows, where symlinking isn't straightforward).
It absolutely could be made to work - probably fairly easily, and there have been calls to sacrifice that workflow to make it work. It's also entirely possible to do a bit of surgery on a relocated venv and make it work again. I've done it a few times.
The third-party `virtualenv` also offers some support for this. Their documentation says there are some issues with this. I'm pretty sure they're mainly talking about that wrapper-script-copying use case.
> Or why venvs need to be activated with shell-variety specific code.
The activation sets environment variables for the current shell. That isn't possible (at least in a cross-platform way) from Python since the Python process would be a child of that shell. (This is also why you have to e.g. use `source` explicitly to run the Linux versions.)
But venvs generally don't need to be activated at all. The only things the activation script effectively does:
* Set the path environment variable so that the virtual environment's Python (or symlink thereto) will be found first.
* Put some fancy stuff in the prompt so that you can feel like you're "in" the virtual environment (a luxury, not at all required).
* Set `VIRTUAL_ENV`, which some Python code might care about (but they could equally well check things like `sys.executable`)
* Unset (and remember) `PYTHONHOME` (which is a hack that hardly anyone has a good use case for anyway)
* (on some systems that don't have a separate explicit deactivate script) set up the means to undo all those changes
The actually important thing is the path variable change, and even then you don't need that unless the code is going to e.g. start a Python subprocess and ask the system to find Python. (Or, much more commonly, because you have a `#!/usr/bin/env python` shebang somewhere.) You can just run the virtual environment's Python directly.
In particular, you don't have to activate the virtual environment in order to use its wrapper scripts, as long as you can find them. And, in fact, Pipx depends on this.
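In other words, the no-activation workflow is just (script name illustrative):
$ .venv/bin/python myscript.py
(or .venv\Scripts\python on Windows)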
Thanks! I'm guessing it's blocked on your work/uni network too? Stupid over-eager firewall.
kzrdude · 3h ago
I like uv run and uvx, the Swiss Army knives of Python that they are, but the PEP 723 stuff I think is mostly just a gimmick. I'm not convinced it's more than a cool trick.
zahlman · 1h ago
It's useful for people who don't want to create a "project" or otherwise think about the "ecosystem". People who, if they share their code at all, will email it to coworkers or something. It lets you get by without a pyproject.toml file etc.
gigatexal · 3h ago
Ok, I didn’t know about this PEP. But I love uv. I use it all day long. Going to use this to change up a lot of my shell scripts into easily runnable Python!
bjourne · 1h ago
> For the longest time, I have been frustrated with Python because I couldn’t use it for one-off scripts.
Bruh, one-off scripts is the whole point of Python. The cheat code is to add "break-system-packages = true" to ~/.config/pip/pip.conf. Just blow up ~/.local/lib/pythonX.Y/site-packages/ if you run into a package conflict (exceedingly rare) and reinstall. All these venv, uv, metadata peps, and whatnot are pointless complications you just don't need.
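For anyone tempted, the cheat code spelled out (standard pip config location on Linux):
# ~/.config/pip/pip.conf
[global]
break-system-packages = true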
Noumenon72 · 5h ago
If PEP 723 is only an enhancement proposal does it work only because `uv` happens to support it?
Can you not use `uvx` with your script because it only works on packages that are installed already or on PyPI?
I don't think running with uv vs uvx imposes any extra limitations on how you specify dependencies. You should either way be able to reference dependencies not just from PyPI but also by git repo or local file path in a [tool.uv.sources] table, the same as you would in a pyproject.toml file.
You can also use uvx to run scripts with a combination of the --with flag to specify the dependencies and invoking python directly, e.g.
uvx --with youtube-transcript-api python transcript.py
But you won't get the benefit of PEP 723 metadata.
rahimnathwani · 3h ago
PEP 723 is final and most relevant tools will support it:
https://discuss.python.org/t/40418/82
I honestly don't like that this is expressed as a comment but I guess it makes the implementation easy and backwards compatible...
babuloseo · 3h ago
been doing this with Pipenv before, but uv is like Pipenv on steroids.
korijn · 3h ago
There's no lockfile or anything with this approach right? So in a year or two all of these scripts will be broken because people didn't pin their dependencies?
I like it though. It's very convenient.
js2 · 3h ago
> There's no lockfile or anything with this approach right?
There are options to both lock the dependencies and limit by date:
https://docs.astral.sh/uv/guides/scripts/#locking-dependenci...
https://docs.astral.sh/uv/guides/scripts/#improving-reproduc...
PEP 723 allows you to specify version numbers for direct dependencies, but of course indirect dependencies aren't guaranteed to be the same.
zahlman · 2h ago
> So in a year or two all of these scripts will be broken because people didn't pin their dependencies?
People act like this happens all the time but in practice I haven't seen evidence that it's a serious problem. The Python ecosystem is not the JavaScript ecosystem.
nomel · 2h ago
I think it's because you don't maintain much python code, or use many third party libraries.
An easy way to prove that this is the norm is to take some existing code you have now, and update to the latest versions your dependencies are using, and watch everything break. You don't see a problem because those dependencies are using pinned/very restricted versions, to hide the frequency of the problem from you. You'll also see that, in their issue trackers, they've closed all sorts of version related bugs.
zahlman · 1h ago
> An easy way to prove that this is the norm is to take some existing code you have now, and update to the latest versions your dependencies are using
I have done this many times and watched everything fail to break.
nomel · 7m ago
Are you sure you’re reading what I wrote fully? Getting pip, or any of them, to ignore all version requirements, including those listed by the dependencies themselves, required modifying source, last I tried.
I’ve had to modify code this week due to changes in some popular libraries. Some recent examples: NumPy 2.0 broke most code that used numpy. They changed the C side (full interpreter crashes with trimesh) and removed/moved common functions, like array.ptp(). Scipy moved a bunch of stuff lately, and fully removed some image related things.
If you think python libraries are somehow stable in time, you just don’t use many.
tpoacher · 3h ago
> If you are not a Pythonista (or one possibly living under a rock)
That's bait! / Ads are getting smarter!
I would also have accepted "unless you're geh", "unless you're a traitor to the republic", "unless you're not leet enough" etc.
SpaceNugget · 3h ago
I'm not a python dev, but if you read HN even semi-regularly you have surely come across it several times in at least the past few months if not a year by now. It is all the rage these days in the Python world, it seems.
And so, if you are the kind of person who has not heard of it, you probably don't read blogs about Python, therefore you probably aren't reading _this_ blog. No harm no foul.