How can Python dependency management be improved at this point? Should we hold our breath for Python 4 to solve it?

The current paradigm feels OK to me. Unfortunately there are a lot of old crappy "intro to Python" blog posts and tutorials still telling people to use 'sudo pip install' to install packages on their system Python. pip/setuptools do not support that workflow. I think it's slowly getting more widely recognized.

I'm one of those people using 'sudo pip install'. What should I be using instead?

Python -m venv in your project directory. Then source ./bin/activate. Pip install from the project directory as your regular user (no sudo). You can also set up venvs in your home directory or somewhere else instead of one per project. This is called a Python virtual environment, and it's very easy to adopt.
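A minimal sketch of that workflow in the shell (the package name 'requests' here is just a placeholder; substitute whatever you actually need):

```shell
# Create a virtual environment directly in the current project directory.
# This writes an isolated Python environment (bin/, lib/, pyvenv.cfg) here.
python3 -m venv .

# Activate it for the current shell session; after this, 'python' and
# 'pip' resolve to the environment's copies, not the system ones.
source ./bin/activate

# Install packages as your regular user -- no sudo, and nothing touches
# the system site-packages.
pip install requests

# When you're done, drop back to the system environment.
deactivate
```

Many people prefer 'python3 -m venv .venv' (with the environment in a subdirectory) so the project root stays clean; the activate path then becomes ./.venv/bin/activate.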

> Python -m venv in your project directory. Then source ./bin/activate.

And how do I do that from inside an already running Emacs session? (Or VSCode?)

Needing to "prepare" an environment before you can work with it prevents you from simply "jumping" into a task from an existing workflow.

Working with .NET, Rust, Node, Go, Haskell, etc., I have no such issues.

This "venv"-requirement only applies to Python, and it's really the opposite of ergonomic. I really think it's about time the Python-community starts recognizing this.

VSCode will detect virtual environments within the directory structure.

Node definitely has this issue. There's a reason nvm[1] exists.

[1] https://github.com/nvm-sh/nvm