The number of extra tools used in this article boggles my mind.

Are you writing a simple library? Create a setup.py: copy-paste an existing one and modify it to suit your purposes. Now you have a working, pip-installable Python package.
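Something like this is all it takes (the name and metadata here are placeholders, not from any real project):

    # setup.py -- minimal sketch; "mylib" and the metadata are placeholders
    from setuptools import setup, find_packages

    setup(
        name="mylib",
        version="0.1.0",
        description="One line describing the library.",
        packages=find_packages(),
        python_requires=">=3.6",
    )

With that in place, pip install -e . gives you a local editable install.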

Want to publish to PyPI? Use twine. It's standard and it's simple.
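For reference, the whole flow (assuming setuptools, wheel, and twine are installed) is roughly:

    python setup.py sdist bdist_wheel
    twine upload dist/*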

You don't need complicated tooling for simple projects.

How do you handle version pinning? Hash checking? CI? Testing on multiple platforms? Multiple Python versions? Deployment? Credential management? Package data? Version bumps?

Sure, experts know how to do all these things because they spent many days learning them, but I'd rather outsource to a tool.

Iteratively. You don't need to solve all those problems at once.

Version pinning can be done in setup.py using the same syntax you would see in a requirements.txt file. You should be very conservative when pinning versions in a library, though.
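For example, install_requires takes the same specifiers; for a library, a floor plus a cap on the next major version is about as tight as I'd go (the packages and pins below are purely illustrative):

    # setup.py -- the pins are illustrative, not recommendations
    from setuptools import setup, find_packages

    setup(
        name="mylib",
        version="0.1.0",
        packages=find_packages(),
        install_requires=[
            "requests>=2.20,<3",  # floor plus a major-version cap
            "click>=7.0",         # floor only; let applications pin exactly
        ],
    )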

You can lean on your CI tool (e.g. GitHub Actions) to handle testing, hash checking, credential management, etc. But I recommend all of this start as a bunch of locally runnable scripts.
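As a sketch of what I mean (the file name and steps are hypothetical; adjust to your project):

    # scripts/check.py -- hypothetical local check runner; CI invokes this
    # same script, so local runs and CI stay identical
    import subprocess
    import sys

    STEPS = [
        # editable install plus the test runner (assumes pytest)
        ["python", "-m", "pip", "install", "-e", ".", "pytest"],
        ["python", "-m", "pytest"],
    ]

    for cmd in STEPS:
        print("+", " ".join(cmd))
        if subprocess.run(cmd).returncode != 0:
            sys.exit(1)

A GitHub Actions workflow then reduces to checking out the repo and running python scripts/check.py across a matrix of Python versions.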

I typically bump the version directly in a version file and move on with my life.
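One common shape of that pattern (the layout is just an example): keep the version string in exactly one file and have setup.py read it without importing the package, so setup.py still works before dependencies are installed.

    # mylib/_version.py -- the only place the version number lives
    __version__ = "0.2.0"

    # setup.py -- read the string with a regex instead of importing mylib
    import pathlib
    import re
    from setuptools import setup, find_packages

    version = re.search(
        r'__version__\s*=\s*"(.+?)"',
        pathlib.Path("mylib/_version.py").read_text(),
    ).group(1)

    setup(name="mylib", version=version, packages=find_packages())

A release is then: edit that one line, commit, tag, upload.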

This stuff usually builds up iteratively and, at least for me, has never been the starting point. The starting point should be a library worth sharing. It is not the end of the world if you release the first few versions manually.

TBH, as someone trying to use Python professionally, it is extremely frustrating that such basic parts of package management are something you have to iterate towards, as opposed to just being obvious and default.

I sympathize. It is unfortunate that the Python community never settled around a tool like Leiningen for Clojure, Cargo for Rust, or npm for Node.

What we saw with npm was the entire community iterating towards a feature set and everyone reaping the benefits automatically with npm updates. package-lock.json is a good example of this.

Worth noting is that Cargo and npm weren't "settled around"; they were developed and presented, from the beginning, alongside the relevant compiler and runtime. There was never a question; the batteries were included.

Leiningen is the weird one, where people actually did settle fairly well around an unofficial solution in the absence of an official one. I think the norm for languages that forgo official tooling is closer to what we've seen in Python.

The Python community has considered an "official" packaging tool in the past, but in those conversations found that the community had too many preferences to find a good compromise. That's the trouble with having a highly diverse set of uses and integrations, and lots of legacy.

If you're curious, the email threads about Conda and defining wheels are interesting.

I feel like the entire point of a BDFL is that they can just ignore this sort of thing and make the hard call, but it never happened.

Maybe it could still happen? It seems like a super-high-value challenge the BDFL could take on: build out the official set of tools (setup.py, twine, virtualenv, pip) to support the features that make people seek out alternatives (pyproject.toml, poetry, flit, conda, pyenv, pipenv).

I realize this is controversial but from reading the docs I really thought Pipenv was the official solution. Took me a while to realize this wasn't the case.

I went through the same progression, thinking pipenv was the official solution before deciding it wasn't. Then, just now, I realized that pipenv [1] is currently owned by the Python Packaging Authority (PyPA), which also owns pip [2] and virtualenv [3]. I don't know the right answer, but this illustrates the confusion that comes from not coalescing around an official solution.

[1]: https://github.com/pypa/pipenv

[2]: https://github.com/pypa/pip

[3]: https://github.com/pypa/virtualenv