Show HN: Wetlands – a lightweight Python library for managing Conda environments

27 points by arthursw | 38 comments | 5/28/2025, 2:54:04 PM | arthursw.github.io
When building a plugin system for an application, avoiding dependency conflicts is critical. To address this, I created Wetlands – a lightweight Conda environment manager.

Wetlands not only simplifies the creation of isolated Conda environments with specific dependencies, but also allows you to run arbitrary Python code within those environments and retrieve the results. It uses the multiprocessing.connection and pickle modules for inter-process communication. Additionally, one can easily use shared memory between the environments, making data exchange more efficient.
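To make that concrete, here is a minimal sketch of the stdlib plumbing involved (simplified: a worker thread stands in for the separate process Wetlands would spawn in the isolated environment, and the names are illustrative, not the actual Wetlands API):

    import threading
    from multiprocessing.connection import Listener, Client

    ADDRESS, KEY = ('localhost', 6000), b'secret'

    def worker():
        # In Wetlands, this side would run in the isolated environment.
        with Client(ADDRESS, authkey=KEY) as conn:
            func, arg = conn.recv()        # unpickled automatically
            conn.send({'called': func, 'result': arg * 2})

    with Listener(ADDRESS, authkey=KEY) as listener:
        threading.Thread(target=worker).start()
        with listener.accept() as conn:
            conn.send(('double', 21))      # pickled automatically
            print(conn.recv())             # {'called': 'double', 'result': 42}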

Docs: https://arthursw.github.io/wetlands/latest/
Source: https://github.com/arthursw/wetlands/

I’d really appreciate any feedback. Thanks!

Comments (38)

mushufasa · 12h ago
> Wetlands not only simplifies the creation of isolated Conda environments with specific dependencies, but also allows you to run

I've been using Conda for 10 years as my default package manager on my devices (not pipenv or poetry etc). I started because it was "the way" for data science, but I stuck with it because the syntax is really intuitive to me (conda create, conda activate).

I'm not sure what problem you are solving here -- the issue with conda IMO is that it is overkill for the rest of the Python community, so conda-forge has gradually declined, and I typically create a conda environment and then use pip for the latest libraries. Managing the conda environments, though, is not my issue -- that part works so well that I stick with it.
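That hybrid workflow, concretely (package name illustrative):

    conda create -n myenv python=3.12
    conda activate myenv
    pip install shiny-new-library    # newer than what conda-forge carries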

If you could explain why you created this and what problems you are solving with an example, that would be helpful. All package managers are aimed at "avoiding dependency conflicts" so that doesn't really communicate to me what this is and what real problem it solves.

N1H1L · 11h ago
I used to use conda, but have switched entirely to uv now
oezi · 10h ago
As someone new to Python: what was ever the appeal of conda that uv doesn't satisfy?
aeroevan · 10h ago
conda doesn't just package Python libraries, but also the C/Fortran/other bits that the scipy stack has often depended on. With the rise of binary wheels that is less needed, though.
jaimergp · 8h ago
I think it's not about tool X vs tool Y, but about ecosystems and packaging approaches; in other words, Python packaging (which has tools like pip, uv, or poetry) vs conda packaging (which has tools like conda itself, mamba, or pixi). https://pypackaging-native.github.io/ is an excellent starting point to learn about the limitations of Python packaging for native dependencies and compiled extensions.
aldanor · 7h ago
Distributing non-Python packages via the same channel that Python packages may depend on. E.g., h5py depending on libhdf5.
kjkjadksj · 9h ago
Fundamentally it is a fresh /usr/bin per environment, with all that can go into that. Not just Python tooling. R packages. Binaries. All of that. An env can be exported as a YAML file and trivially shared, without appending some header to all the scripts you write.
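The round trip is two commands:

    conda env export > environment.yml    # captures Python and non-Python deps alike
    conda env create -f environment.yml   # rebuilds the env on another machine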
jpecar · 11h ago
Why not just call it swamp? It would better describe the python ecosystem mess ;)

Jokes aside, this feels very meta: package manager for a package manager for a package manager. Reminds me of the old RFC1925: "you can always add another layer of abstraction". That RFC also says "perfection has been reached not when there is nothing left to add, but when there is nothing left to take away".

And as an HPC admin, I'm not offering my users any help with conda and let them suffer on their own. Instead I'm showing them the greener pastures of spack, easybuild and eessi whenever I can. And they're slowly migrating over.

vindex10 · 8h ago
it was also my first thought on abstractions of abstractions, thanks for sharing :)

could you elaborate a bit more on why the HPC world is special when it comes to configuring the environment?

I always feel it is a typical problem in software development: separating the operating system env from the application env.

do you use spack / easybuild on your personal computer, for example if you need to install a package that is not part of the distribution?

tgamblin · 53m ago
I do for macOS and Linux :). Windows support is also coming along.

There isn’t anything particularly special about the HPC world other than the need for many different configurations of the same software for many different CPU and GPU architectures. You might want to have several versions of the same application installed at once, with different options or optimizations enabled. Spack enables that. Environments are one way to keep the different software stacks separate (though, like nix, spack uses a store model to keep installs in separate directories, as well).
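For example (versions and compilers illustrative):

    # two configurations of the same library, side by side
    spack install hdf5@1.14 +mpi %gcc@13
    spack install hdf5@1.12 ~mpi %clang

    # keep a whole stack separate in an environment
    spack env create mystack
    spack env activate mystack
    spack add hdf5 +mpi
    spack install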

barapa · 11h ago
Why do people use Conda instead of uv?
phronimos · 10h ago
Conda manages binaries and their native dependencies together, including shared libraries[0]. This offers significant advantages over uv and pip when distributing packages with C extensions, such as dependency resolution that accounts for shared library requirements, and better package isolation.

[0]: https://docs.conda.io/projects/conda-build/en/latest/resourc...

reedf1 · 10h ago
Use PDM with the uv backend - it accomplishes the same thing in a much more lightweight and performant way.
agoose77 · 7h ago
The PyPI ecosystem cannot, for the foreseeable future, replicate the scope of the conda ecosystem. From microarch builds to library deduplication, conda is a more general-purpose solution. That doesn't mean that one "wins out" (and, for reference, I predominantly use Python's PyPI), but they're not the same tools.
aeroevan · 10h ago
Does PDM manage C/Fortran library dependencies? IIRC conda was the only solution for managing both native and python dependencies but I haven't really looked elsewhere.

With wheels and the manylinux specifications there's less of a use case for that, but it could still be useful.

reedf1 · 10h ago
Not sure about Fortran - but C for sure, yes.
jaimergp · 8h ago
Where does it fetch the C packages from? I always thought PDM was a _Python_ package manager, so the only source is PyPI or another index.
oblvious-earth · 6h ago
PDM has plugins, such as being able to invoke conda commands: https://github.com/pdm-project/awesome-pdm

Otherwise I don't know what they're talking about, it is indeed a Python package manager.

ElectricalUnion · 11h ago
The conda ecosystem was an early adopter of standardized binary packages.

Now it's mostly behind us, but there used to be a time when PyPI didn't have wheels (a 2012 thing), or manylinux wheels (a 2016 thing), for most libraries. pip install was a world of pain if you didn't have the "correct source packages" on your system.

And several of those projects built back then are no longer projects but deployed systems, so they might as well stick with what is working.

krapht · 11h ago
Conda is still the recommended way to work with Intel's Python distribution, so there's a reason for it to live on my work computer.

The fastest graph library I know, graph-tool, needs to be installed with Conda too.

zzzeek · 7h ago
that appears to be because the maintainers have chosen not to release any files

https://pypi.org/project/graph-tool/

that's not a limitation of pip, that's a limitation of the maintainers of graph-tool

krapht · 2h ago
I agree, and yet here we are. If it wasn't built with automake I would even consider getting wheels built on the project's behalf, but I can't make heads or tails of M4.

EDIT: But let's not pretend getting cibuildwheel going for every supported variant of PythonMajVer/OS/(32/64) bit is an easy lift for small open source projects. Conda is still more ergonomic than this, although it's slowly dying because big projects have the manpower to stand up and maintain the CI pipeline for PyPI, so Conda is less and less necessary.

zzzeek · 8h ago
there was also a time when "open source maintainer has to build binaries for x86, ARM, OSX, Windows in order to not get complaints, but owns no Windows licenses, ARM machines, or OSX servers", and GitHub now gives you all of that for free / automatic / declaratively with actions + cibuildwheel. so the value add from "conda" is way down from where it started.
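the whole setup can be a sketch as small as this (action versions illustrative):

    # .github/workflows/wheels.yml
    name: wheels
    on: [push]
    jobs:
      build_wheels:
        strategy:
          matrix:
            os: [ubuntu-latest, windows-latest, macos-latest]
        runs-on: ${{ matrix.os }}
        steps:
          - uses: actions/checkout@v4
          - uses: pypa/cibuildwheel@v2.19   # builds wheels into ./wheelhouse
          - uses: actions/upload-artifact@v4
            with:
              name: wheels-${{ matrix.os }}
              path: wheelhouse/*.whl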
blactuary · 9h ago
I use it because I do not need to create packages, and I often do a lot of interactive coding within a conda environment, not just running full Python scripts. At any given time I have a primary conda env I'm using with my set of daily use packages, eventually creating a new one for testing when there are major version upgrades to Python or a package I use frequently.

When I read the uv docs and see other people's examples, I have a hard time understanding how it works for my workflow. It seems I could continue using conda for environment management and only use uv for package installation, and it would be much faster, but combining the two tools also feels a little shaky and error-prone, and since mamba became the default solver conda is pretty fast, even when building a new env from scratch.

It feels like conda and its ability to have multiple Python versions, with env management built in, gives me more than uv, just without the package installation speed. But I am certainly open to someone explaining uv to me in a way that disproves that.
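For concreteness, the combination I have in mind (package name illustrative; as far as I can tell, uv pip installs into whatever env is active):

    conda create -n work python=3.12
    conda activate work
    uv pip install pandas    # fast resolution/install into the active conda env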

jessekv · 11h ago
These days I see Conda (and micromamba) used as a reliable cross-platform winget/apt/yum/brew. For example, to install GDAL.
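Concretely:

    micromamba create -n geo -c conda-forge gdal
    micromamba activate geo
    gdalinfo --version    # full native GDAL, not just the Python bindings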

uv replaces pip; conda and pip have been complementary for a long time. But I would be surprised if uv does not take on conda at some point, e.g. with a micromamba subcommand.

martinky24 · 11h ago
Because conda has been around for years and uv is more or less brand new. It's pretty much as simple as that.
superkuh · 11h ago
A Python dependency manager manager manager. Truly we live in an age of unprecedented code abstraction and complexity. And I love that you install and manage Wetlands itself with pip. An ouroboros matryoshka of code.
chillpenguin · 11h ago
The fact that this exists shows that there is a serious problem in the python ecosystem. I'm sure it solves a real problem, so I'm not knocking the author. It's more of a "state of our industry" problem.
whalesalad · 10h ago
I maintain there is no issue. It's really not hard. conda is a smell for me though.

pyenv is all you need. it manages python versions and python virtual environments. you can create and destroy them just as easily as git branches.

pyenv + good ol' requirements.txt is really all you need.

if your env dictates containers, it's even easier to work with. FROM python:version and done.
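spelled out, with illustrative version numbers (pyenv virtualenv needs the pyenv-virtualenv plugin):

    pyenv install 3.12.3
    pyenv virtualenv 3.12.3 myapp    # via the pyenv-virtualenv plugin
    pyenv local myapp                # auto-activates in this directory
    pip install -r requirements.txt

and the container case:

    FROM python:3.12-slim
    COPY requirements.txt .
    RUN pip install -r requirements.txt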

tuckerman · 10h ago
The issue is you think pyenv has solved everything, someone else thinks poetry solves everything, I think uv solves everything, and someone else is apt installing things. And then there is installing torch and cuda...

I think having a very widely accepted and supported default would let the community rally around it and improve the experience and documentation (though I am not a fan of the centralized hosting that npm and cargo push, I much prefer the go approach, but we are already there with PyPI)

whalesalad · 10h ago
uv + poetry are a higher level in the stack than something like pyenv.

pip, uv, poetry are all analogous. they ensure the correct packages are installed. we have some internal apps where devs decided to start with poetry, and it has some nice ergonomics ... but on the other hand I find the simplicity of a requirements file to be so ... simple. People get caught up on the file, too, but really it's just a convention. you can call it whatever you want, deps.txt, deps.foo, packages.bar ... it's just a plaintext file with newline-delimited packages. since it's "just a file" and "just unix", this has the added perk of being able to cat many files together and use that with pip. it's all just lines of text, pip does not care.
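for example:

    # concatenate requirement sets however you like
    cat base.txt dev.txt > combined.txt
    pip install -r combined.txt

    # or just pass several files to pip directly
    pip install -r base.txt -r dev.txt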

pyenv + pip works. pyenv + poetry works. pyenv + uv works. those become inconsequential decisions you can make on a case by case basis and try or forget as needed.

tuckerman · 7h ago
You're right, not all of those need to be conflated, but they aren't entirely orthogonal either: uv can install/manage Python installations for you, and poetry can manage environments as well if you want. On top of that, you have tools like mise that are more cross-cutting.

I agree that there is a really nice simplicity to requirements.txt, but I've found myself enjoying uv enough to just fully embrace it (and not farm out to uv pip), and as a result I now find myself in pyproject.toml land.

mrweasel · 9h ago
I just install the libraries I need using the operating system's package manager. Works perfectly fine. In development I do use virtualenvs, because I need to keep track of which dependencies are required, but in production I just apt-get install.
whalesalad · 9h ago
that will 100% come to bite you in the ass one day
mrweasel · 9h ago
Probably, but it keeps my dependencies low and packages are automatically updated with backported security patches. So far it has never been an issue, but there have been frustrations when features aren't available because the Debian packages aren't the newest versions.

I wouldn't recommend it for something with hundreds of dependencies, but I also wouldn't recommend having hundreds of dependencies.

Poetry has messed up packages more often than apt ever did, in my use cases. So far using apt as my package manager has failed me exactly zero times.

whalesalad · 9h ago
If it works, it works. Just be mindful and prepared for the day where it stops working.
mrweasel · 9h ago
> Just be mindful and prepared

Absolutely, but that's true for pretty much all the package managers, for all languages. They all break at some point.

One thing I would note is: don't install your dependencies with apt in a development environment; you need a clean environment to avoid dragging unneeded dependencies into your production environments. That does mean that you need to find the exact version of your dependency in Debian, but it's a good exercise and ensures that you're mindful of your dependencies.
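Finding that exact version is a one-liner (package name illustrative):

    apt-cache policy python3-requests    # shows the version Debian ships
    apt-get install python3-requests     # installs the distro-pinned version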

Be mindful, be prepared is good advice, in all aspects of life really.

mrweasel · 12h ago
> If the user doesn't have pixi or micromamba installed, Wetlands will download and set it up automatically.

Please don't. Never have a tool that automatically reaches out to the internet to fetch a binary and then runs it. Just let the user know that they need to install either pixi or micromamba. It's inherently unsafe, and you don't know what will be put into those binaries in the future.

Maybe it's because I don't have a use case for this, but I don't really get what this is for. It's interesting, but I'm not really sure where I'd use it.
