The uv build backend is now stable
116 points by NeutralForest on 7/3/2025, 11:51:33 AM | 61 comments | docs.astral.sh
E.g.
Then on another machine:
The above is from memory, typed on a phone, so there may be some minor syntax issues, but the point I tried to make was that we can kinda emulate the convenience of statically compiled binaries à la Go these days.
Before (analogous to go mod init):
On another machine (still the before scenario, this time analogous to maybe go run):
In the after scenario:
That's it. Comparable to:
First, you can move that script to a different machine and do `uv run {script}`, no need to recreate a venv or provide install instructions (I believe uv will now even grab an appropriate version of Python if you don't have it?). This comes from PEP 723, and multiple tools support doing this, such as hatch.
Second, when you "add" a requirement instead of "install" a requirement, it manages that with the knowledge of all requirements that were added before. For example, if I `pip install foo` and then `pip install bar`, pip does not consider foo or its dependencies as required when installing bar, so it's possible to break `foo` by installing completely incompatible dependencies. But when you "add foo" and then "add bar" with uv (and other tools that are declarative, like Poetry), your environment gets updated to take everything into account.
If managing Python dependencies is second nature to you, then these might seem like extra concepts to keep in your head, but lots of people find them useful because they end up having to think less about Python dependencies.
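As a rough sketch of the kind of script being described (the filename, dependency, and URL here are purely illustrative, not from the thread), a PEP 723 script declares its own requirements in a comment block at the top:

  # fetch.py -- on any machine with uv installed, run:  uv run fetch.py
  # /// script
  # requires-python = ">=3.12"
  # dependencies = ["requests"]
  # ///
  import requests

  # uv reads the metadata block above, builds a throwaway environment with
  # requests installed, and then executes the script in it.
  print(requests.get("https://example.com").status_code)

There is no venv to create or activate by hand; uv resolves the declared dependencies before running, and, as the comment above suggests, recent versions can also fetch a matching Python interpreter if one isn't already available.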
I am interested in how they're going to make money eventually, but right now it's working for me.
Does anyone have an idea about how they're going to monetize?
As a rule, you should never meddle with the globally installed Python, because so many packages look for the system-installed Python and use it; better to let your package manager handle it.
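A tiny, hypothetical sanity check along those lines (not from the thread): before installing anything, confirm you're inside a virtual environment rather than pointing at the system interpreter.

  # check_env.py -- warn before accidentally installing into the system Python
  import sys

  # Inside a venv (python -m venv, uv venv, etc.) sys.prefix points at the
  # environment, while sys.base_prefix points at the base interpreter.
  in_venv = sys.prefix != sys.base_prefix
  print(f"interpreter: {sys.executable}")
  print("inside a virtual environment" if in_venv else "WARNING: this is the system Python")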
Will it support the wide range of options setuptools does? Or maybe a build.rs equivalent - build.py, but in a sane way unlike setup.py.
As such, they do not currently support C extensions, nor running arbitrary code during the build process. I imagine they will add features slowly over time, but with the continued philosophy that the simple and common cases should be zero configuration.
For Python experts who don't have special needs from a build backend, I would recommend flit_core, the simplest and most stable build backend, or hatchling, which is very stable and has lots of features. While uv_build is great, it does mean that users building (but not installing) your project need to be able to run native code rather than pure Python. But this is a pretty small edge case that won't be an issue for most people.
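For context, picking a backend is just the [build-system] table in pyproject.toml; the version bounds below are illustrative, so check each backend's docs for the currently recommended pins. A minimal sketch for uv_build:

  [build-system]
  requires = ["uv_build>=0.7,<0.8"]
  build-backend = "uv_build"

The flit_core equivalent uses requires = ["flit_core>=3.2,<4"] with build-backend = "flit_core.buildapi", and hatchling uses requires = ["hatchling"] with build-backend = "hatchling.build"; the trade-off is as described above, since uv_build ships native code while the other two are pure Python.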
I wish Python could provide an "official" solution to each problem (like in Rust, where there's cargo, end of story), or at least an official document describing the current best practice for doing things.
For the last year or so, I've been trying to provide an alternative resource that stays abreast of the best options and offers simple guides: https://pydevtools.com/.
Makes me wonder: did the Python core team fail to see the opportunity in Python tooling, have no desire to build it, or did they lack the skills?
I never learned python the way I wanted to because for years I would first look at the excruciating transition from v2 to v3 and just not see a point of entry for a newb like me.
Now the same thing is happening with tooling for v3. pip? pipenv? python -m pip? python3 -m pip? I don't freakin' know. Now there's uv, and I'm kinda excited to try again.
They spend a lot of time on improving Python itself, and then you have pip, which is a way to install packages and that's it: it's not a package manager, nor a Python version manager.
That said, the people left in the CPython team generally have a low regard for bloat-free, correct and fast solutions, so external solutions are most welcome.
I don't blame the core Python team for not building super-optimized tools the way Astral does.
I don't think there is enough money in package registries to pay for all of the VC investment in Astral.
That said, I've checked Anaconda's site, and while it used to be about the "Anaconda [Python] Commercial Distribution", "On-Prem repositories", "Cloud notebooks and training"... over the last year they've changed the product name to "Anaconda AI Platform", and now it's all "The operating system for AI" and "Tools for the Complete AI Lifecycle". Eeeeh, no thanks.
With that said — it’s uv or die for me
Among the many things it's improved, scripting with Python finally just works, without the pain of some odd env issue.
From what I can tell, uv doesn't (unlike poetry) assist with venvs whatsoever.
What is a trivial «poetry run» becomes the same venv horrors Python is famous for when I use uv and «uv run».
Based on that, your comment strikes me as the polar opposite of my experience (which is why I still resort to poetry).
Care to outline how you use uv to solve venv issues, since from what I can tell, uv explicitly doesn't?
I’m very curious.
Here are a couple of links to discussions about it on HN:
https://news.ycombinator.com/item?id=42198256
https://news.ycombinator.com/item?id=42855258
And how does that work on Windows, which to my knowledge doesn’t even support shebangs?
Its main benefit is that it is well maintained and does everything you used to need a string of tools for.
---
time uv
real 0m0.005s
user 0m0.000s
sys 0m0.004s
---
time npm
real 0m0.082s
user 0m0.068s
sys 0m0.020s
---
time pip
real 0m0.320s
user 0m0.179s
sys 0m0.031s
My good uv experience: I tried installing some tensor/CUDA Python code recently. Plain pip just failed; uv pip actually told me WHY it failed.
It definitely felt like magic.
They do if you instruct them to.
Before uv I was doing everything in a devcontainer on my Mac since that was easiest, but uv is so fast that I skip that unless I have some native libraries that I need for Linux.