Show HN: Wetlands – a lightweight Python library for managing Conda environments
27 arthursw 38 5/28/2025, 2:54:04 PM arthursw.github.io ↗
When building a plugin system for an application, avoiding dependency conflicts is critical. To address this, I created Wetlands – a lightweight Conda environment manager.
Wetlands not only simplifies the creation of isolated Conda environments with specific dependencies, but also allows you to run arbitrary Python code within those environments and retrieve the results. It uses the multiprocessing.connection and pickle modules for inter-process communication. Additionally, one can easily use shared memory between the environments, making data exchange more efficient.
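For those wondering how it works under the hood: the heavy lifting is done by the standard library, multiprocessing.connection for exchanging pickled messages between processes and multiprocessing.shared_memory for large arrays. The snippet below is not Wetlands' actual API, just a plain-stdlib sketch of that pattern (the port, the authkey and the NumPy array are made up for the example):

    # Stdlib-only sketch of the IPC pattern described above (not Wetlands' API).
    import numpy as np
    from multiprocessing import Process, shared_memory
    from multiprocessing.connection import Listener, Client

    ADDRESS = ("localhost", 6000)   # hypothetical local port
    AUTHKEY = b"demo-secret"        # hypothetical shared secret

    def worker():
        # In Wetlands this code would run inside a separate Conda environment;
        # here it is just another process on the same interpreter.
        with Client(ADDRESS, authkey=AUTHKEY) as conn:
            msg = conn.recv()                                  # small dict, unpickled automatically
            shm = shared_memory.SharedMemory(name=msg["shm"])  # attach to the parent's block
            view = np.ndarray(msg["shape"], dtype=msg["dtype"], buffer=shm.buf)
            conn.send({"sum": float(view.sum())})              # only the small result is pickled back
            del view                                           # drop the view before closing the block
            shm.close()

    if __name__ == "__main__":
        # Put a large array into shared memory so it is never pickled or copied.
        data = np.arange(1_000_000, dtype=np.float64)
        shm = shared_memory.SharedMemory(create=True, size=data.nbytes)
        buf = np.ndarray(data.shape, dtype=data.dtype, buffer=shm.buf)
        buf[:] = data

        with Listener(ADDRESS, authkey=AUTHKEY) as listener:
            p = Process(target=worker)
            p.start()
            with listener.accept() as conn:
                conn.send({"shm": shm.name, "shape": data.shape, "dtype": data.dtype})
                print(conn.recv())                             # {'sum': 499999500000.0}
            p.join()

        del buf
        shm.close()
        shm.unlink()                                           # release the shared segment

Pickled messages keep small inputs and results trivial to pass back and forth, while shared memory avoids copying large arrays through the connection; Wetlands builds on these same mechanisms.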
Docs: https://arthursw.github.io/wetlands/latest/
Source: https://github.com/arthursw/wetlands/
I’d really appreciate any feedback. Thanks!
I've been using Conda for 10 years as my default package manager on my devices (not pipenv or poetry, etc.). I started because it was "the way" for data science, but I stuck with it because the syntax is really intuitive to me (conda create, conda activate).
I'm not sure what problem you are solving here -- the issue with conda IMO is that it is overkill for the rest of the Python community, so conda-forge has gradually declined and I typically create a conda environment and then use pip for the latest libraries. Managing the conda environments, though, is not my issue -- that part works so well that I stick with it.
If you could explain why you created this and what problems you are solving with an example, that would be helpful. All package managers are aimed at "avoiding dependency conflicts" so that doesn't really communicate to me what this is and what real problem it solves.
Jokes aside, this feels very meta: a package manager for a package manager for a package manager. Reminds me of the old RFC 1925: "you can always add another layer of abstraction". That RFC also says "perfection has been reached not when there is nothing left to add, but when there is nothing left to take away".
And as an HPC admin, I'm not offering my users any help with conda and let them suffer on their own. Instead I'm showing them the greener pastures of spack, easybuild and eessi whenever I can. And they're slowly migrating over.
Could you elaborate a bit more on why the HPC world is special when it comes to configuring the environment?
I always feel this is a typical problem in software development: separating the operating-system environment from the application environment.
Do you use spack / easybuild on your personal computer, for example if you need to install a package that is not part of the distribution?
There isn’t anything particularly special about the HPC world other than the need for many different configurations of the same software for many different CPU and GPU architectures. You might want to have several versions of the same application installed at once, with different options or optimizations enabled. Spack enables that. Environments are one way to keep the different software stacks separate (though, like nix, spack uses a store model to keep installs in separate directories, as well).
[0]: https://docs.conda.io/projects/conda-build/en/latest/resourc...
With wheels and the manylinux specifications there's less of a use case for that, but it could still be useful.
Otherwise I don't know what they're talking about; it is indeed a Python package manager.
Now it's mostly behind us, but there used to be a time when PyPI didn't have wheels (a 2012 thing) or manylinux wheels (a 2016 thing) for most libraries. pip install was a world of pain if you didn't have the "correct source packages" on your system.
And now several of those projects built back then are no longer projects but deployed systems, so they might as well stick with what is working.
The fastest graph library I know, graph-tool, needs to be installed with Conda too.
https://pypi.org/project/graph-tool/
That's not a limitation of pip; that's a limitation of the maintainers of graph-tool.
EDIT: But let's not pretend getting cibuildwheel going for every supported variant of Python version / OS / 32- vs 64-bit is an easy lift for small open source projects. Conda is still more ergonomic than this, although it's slowly dying because big projects have the manpower to stand up and maintain the CI pipeline for PyPI, so Conda is less and less necessary.
When I read the uv docs and see other people's examples, I have a hard time understanding how it works for my workflow. It seems I could continue using conda for environment management and only use uv for package installation, and it would be much faster, but combining the two tools feels a little shaky and error-prone, and since mamba became the default solver, conda is pretty fast even when building a new env from scratch.
It feels like conda, with its ability to have multiple Python versions and env management built in, gives me more than uv, just without the package installation speed. But I am certainly open to someone explaining uv to me in a way that disproves that.
uv replaces pip; conda and pip have been complementary for a long time. But I would be surprised if uv does not take on conda at some point, e.g. with a micromamba subcommand.
pyenv is all you need. It manages Python versions and Python virtual environments; you can create and destroy them just as easily as git branches.
pyenv + good ol' requirements.txt is really all you need.
If your env dictates containers, it's even easier to work with: FROM python:version and done.
I think having a very widely accepted and supported default would let the community rally around it and improve the experience and documentation (though I am not a fan of the centralized hosting that npm and cargo push; I much prefer the Go approach, but we are already there with PyPI).
pip, uv, poetry are all analogous: they ensure the correct packages are installed. We have some internal apps where devs decided to start with poetry, and it has some nice ergonomics ... but on the other hand I find the simplicity of a requirements file to be so ... simple. People get caught up on the file name, too, but really it's just a convention: you can call it whatever you want -- deps.txt, deps.foo, packages.bar -- it's just a plaintext file with newline-delimited packages. Since it's "just a file" and "just Unix", this has the added perk of being able to cat many files together and use that with pip. It's all just lines of text; pip does not care.
pyenv + pip works. pyenv + poetry works. pyenv + uv works. Those become inconsequential decisions you can make on a case-by-case basis and try or forget as needed.
I agree that there is a really nice simplicity to requirements.txt, but I've found myself enjoying uv enough to just fully embrace it (and not farm out to uv pip), and as a result I now find myself in pyproject.toml land.
I wouldn't recommend it for something with hundreds of dependencies, but I also wouldn't recommend having hundreds of dependencies.
Poetry has messed up packages more often than apt ever did, in my use cases. So far, using apt as my package manager has failed me exactly zero times.
Absolutely, but that's true for pretty much all the package managers, for all languages. They all break at some point.
One thing I would note: don't install your dependencies with apt in a development environment; you need a clean environment to avoid dragging unneeded dependencies into your production environments. That does mean you need to find the exact version of your dependency in Debian, but it's a good exercise and it ensures that you're mindful of your dependencies.
"Be mindful, be prepared" is good advice, in all aspects of life really.
Please don't. Never have a tool that automatically reaches out to the internet to grab a binary and then runs it. Just let the user know that they need to install either pixi or micromamba. It's inherently unsafe, and you don't know what will be put into those binaries in the future.
Maybe it's because I don't have a use case for this, but I don't really get what this is for. It's interesting, but I'm not really sure where I'd use it.