I'm going to admit that what I really want to see is a strong push to standardize and fully incorporate package management and distribution into Python's core. Despite the work done on it, it's still a mess as far as I can see, and there is no single source of truth (that I know of) on how to do it.
For that matter, pip can't even search for packages any more, and instead directs you to browse the PyPI website in a browser. Whatever the technical reasons for that, it's a user interface failure. Conda can do it! (as can just about every other package management system I've ever used)
> I'm going to admit that what I really want to see is a strong push to standardize and fully incorporate package management and distribution into Python's core. Despite the work done on it, it's still a mess as far as I can see, and there is no single source of truth (that I know of) on how to do it.
Package management is standardized in a series of PEPs[1]. Some of those PEPs are living documents that have versions maintained under the PyPA packaging specifications[2].
The Python Packaging User Guide[3] is, for most things, the canonical reference for how to do package distribution in Python. It's also maintained by the PyPA.
(I happen to agree, even with all of this, that Python packaging is a bit of a mess. But it's a much better defined mess than it was even 5 years ago, and initiatives to bring packaging into the core need to address ~20 years of packaging debt.)
[1]: https://peps.python.org/topic/packaging/
[2]: https://packaging.python.org/en/latest/specifications/index....
[3]: https://packaging.python.org/
Is there anything in there about managing dependencies within a Python project? What is the canonical way to do that in Python today?
It depends (unfortunately) on what you mean by a Python project:
* If you mean a thing that's ultimately meant to be `pip` installable, then you should use `pyproject.toml` with standard PEP 621 project metadata, which includes your project's dependencies; the PyPUG linked above has an example of that, and there's a minimal sketch after this list.
* If you mean a thing that's meant to be deployed with a bunch of Python dependencies, then `requirements.txt` is probably still your best bet.
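To make the first option concrete, here's a minimal sketch of a PEP 621 `pyproject.toml`; the project name, dependencies, and pins are hypothetical placeholders:

```toml
[build-system]
requires = ["setuptools>=61"]  # setuptools gained PEP 621 support in 61.0
build-backend = "setuptools.build_meta"

[project]
name = "myapp"                 # hypothetical project name
version = "0.1.0"
requires-python = ">=3.9"
dependencies = [
    "requests>=2.28",          # hypothetical runtime dependencies
    "click~=8.1",
]
```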
I meant the second. requirements.txt is a really bad solution for that, and that's the frustration many of us have who have used languages with much better solutions.
requirements.txt feels like a dirty hack, but it does work fine. It supports ==, ~=, and >= for version constraints, as well as environment markers for flagging dependencies for different target OSes, etc. And then you can add a setup.py if you need custom steps. But yes, it feels dirty to maintain requirements.txt, requirements-dev.txt, etc.
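To illustrate, a hedged requirements.txt sketch (package names and versions are placeholders):

```
requests==2.31.0                       # exact pin
urllib3~=2.0                           # compatible release: >=2.0, <3.0
packaging>=23.0                        # minimum version only
pywin32>=306; sys_platform == "win32"  # environment marker: install on Windows only
```

A common pattern for the multi-file annoyance is to have requirements-dev.txt start with `-r requirements.txt`, so the dev file only lists the extras.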
Poetry is the most common solution I've seen in the wild. You spec everything in pyproject.toml, run `poetry install`, and it manages a venv of its own. But you still need to tell people to `pip install poetry` as step 0, which is annoying.
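A rough sketch of what that looks like, using Poetry's classic `[tool.poetry]` tables (the name, author, and dependencies are placeholders):

```toml
[tool.poetry]
name = "myapp"                             # hypothetical project name
version = "0.1.0"
description = "Example app"
authors = ["Jane Doe <jane@example.com>"]  # placeholder

[tool.poetry.dependencies]
python = "^3.10"
requests = "^2.31"                         # hypothetical dependency

[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
```

`poetry install` then resolves everything into a poetry.lock file and a virtualenv that Poetry manages for you.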
If you don't care about deploying Python files, and rather just the final product, I'd recommend either Nuitka or PyInstaller. These bundle your project into an executable that doesn't need a Python runtime on the target machine (`--onefile`-type options give single-file output). Neither supports cross-compilation, though.
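For example, assuming a hypothetical entry point `main.py` (both tools accept a `--onefile` flag):

```sh
# PyInstaller: bundles the interpreter and dependencies into one binary (in dist/)
pip install pyinstaller
pyinstaller --onefile main.py

# Nuitka: compiles to C first, then links a single executable (needs a C compiler)
pip install nuitka
python -m nuitka --onefile main.py
```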
What flow do you use with requirements.txt that gives you reproducible builds across a team and environments? Using ==, ~=, and >= will not give you reproducible builds, since none of them pin your transitive dependencies.
Hash-pinning with requirements.txt will get you the closest to this, but it's not possible in the general case to have a cross-environment reproducible build with Python. The closest you can hope for is a build that reproduces in the same environment.
This problem is shared by the majority of language packaging ecosystems; the only one I'm aware of that might avoid it is Java.
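For reference, a sketch of the hash-pinning flow using pip-tools (the file names are placeholders):

```sh
pip install pip-tools
pip-compile --generate-hashes -o requirements.txt requirements.in  # pin and hash the full tree
pip install --require-hashes -r requirements.txt                   # pip rejects anything unhashed
```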
This is not raw requirements.txt, but isn’t too far off: Pants/PEX can consume one to produce a hash-pinned lock file.
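Something like the following, though the exact `pex3 lock` syntax here is from memory and may differ across PEX versions, so treat it as a sketch:

```sh
pip install pex
pex3 lock create -r requirements.txt -o lock.json  # hash-pinned lockfile (syntax may vary)
pex --lock lock.json -o app.pex                    # build a runnable .pex from the lock
```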