So instead of BDFL for Python, he's going to "make using Python better".

Congrats to him for finding something fun to do in retirement - dictators usually end up with a different outcome. ;)

I'm looking forward to seeing the future of Python - I think this move will be great for the whole community, and it lets him push boundaries without getting bogged down in the management side.

An official package manager with great dependency resolution would be fantastic. Or take over pipenv or poetry and sponsor it with Microsoft $$$.

The biggest hurdle for Python right now is the sorry state of its package managers. We need a cargo for Python.

I only know a bit about Python - in what sense is pip not a package manager?

> in what sense is pip not a package manager?

It is a package manager, but it lacks features that the package managers for Ruby, Node, Elixir, and other languages have.

For example there's no concept of a separate lock file with pip.

Sure, you can `pip freeze` your dependencies out to a file, but that includes dependencies of dependencies, not just your app's top-level dependencies.

The frozen file is good for replicating versions across builds, but it's really bad for human readability.
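To make that concrete: install a single package and `pip freeze` hands you every transitive dependency too (versions here are illustrative):

```
$ pip install requests
$ pip freeze
certifi==2024.2.2
charset-normalizer==3.3.2
idna==3.6
requests==2.31.0
urllib3==2.2.1
```

Only `requests` was asked for; nothing in the output says which lines are top-level and which are just along for the ride.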

Ideally we'd have a file made for humans to define their top-level dependencies (with version pinning support) and a lock file that pins every dependency, transitive ones included, to an exact version.
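Concretely, that pair could look something like this (file names and contents are a hypothetical sketch):

```
# deps.in - hand-written, top-level dependencies only
requests>=2.28

# deps.lock - generated, every dependency pinned exactly
certifi==2024.2.2
charset-normalizer==3.3.2
idna==3.6
requests==2.31.0
urllib3==2.2.1
```

You edit the first file; a tool resolves it and regenerates the second, which is what builds actually install from.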

FWIW I had a lot of success using https://github.com/jazzband/pip-tools to automatically manage dependencies in a virtualenv.

* Basically I would have a single bash script (sketched after this list) that every `.py` entrypoint symlinks to.

* Beside that symlink is a `requirements.in` file that just lists the top-level dependencies I know about.

* There's a `requirements.txt` file generated via pip-tools that lists all the dependencies with explicit version numbers.

* The bash script then makes sure there's a virtual environment in that folder & that the installed package list exactly matches the `requirements.txt` file (i.e. any extra packages are uninstalled, and any missing or version-mismatched packages are installed correctly).
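Roughly, the wrapper looked like this (a simplified sketch, not the exact script; it assumes each tool `foo.py` gets a sibling symlink `foo` pointing at this file, and uses pip-tools' `pip-sync` for the install/uninstall step):

```bash
#!/usr/bin/env bash
set -euo pipefail

# $0 is the symlink (e.g. ./foo -> this script), so its directory is
# where requirements.in / requirements.txt / the venv live.
dir="$(cd "$(dirname "$0")" && pwd)"
name="$(basename "$0")"

venv="$dir/.venv"
if [ ! -x "$venv/bin/python" ]; then
  python3 -m venv "$venv"
  "$venv/bin/pip" install --quiet pip-tools
fi

# pip-sync (from pip-tools) installs/uninstalls until the venv
# matches requirements.txt exactly.
"$venv/bin/pip-sync" "$dir/requirements.txt"

# Run the real entrypoint in-place with the synced environment.
exec "$venv/bin/python" "$dir/$name.py" "$@"
```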

This was great because during development, if you wanted to add a new dependency or change an installed version (e.g. `pip-compile -U` to update the dependency set), it didn't matter what the build server had & it could test any diff independently & inexpensively. When developers pulled a new revision, they didn't have to muck about with the virtualenv - they could just launch the script without thinking about Python dependencies. Finally, unrelated pieces of code would have their own dependency chains, so there wasn't even a global project-wide set of dependencies (e.g. if one tool depends on component A, the other tools don't need to pull it in).
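For reference, the day-to-day pip-tools commands behind that workflow are roughly:

```
pip-compile requirements.in      # resolve & pin everything into requirements.txt
pip-compile -U requirements.in   # re-resolve, upgrading pins to the newest allowed versions
pip-sync requirements.txt        # make the active virtualenv match the file exactly
```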

I viewed the lack of `setup.py` as a good thing - deploying new versions of tools was a git push away rather than relying on Chef or having users install new versions manually.

This was the smoothest setup I've ever used for running Python from source without adopting something like Bazel/BUCK (which add a lot of complexity for ingesting new dependencies, as you can't leverage pip & they don't support running the Python scripts in-place).