Today I spent at least an hour fighting with Python packaging. The more I think about it, the more I feel that self-contained static binaries are the way to go. Trying to load source files from all over the filesystem at runtime is hell. Or at least it's hell to debug when it goes wrong.

I would love to see a move towards "static" binaries that package everything together into a single, self-contained unit.

Deno (https://deno.land) solves this problem very well for JavaScript. I'd like to see more languages adopt the "dependencies are URLs" approach. No more package hell.
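In Deno the import statement itself carries the dependency, with the version pinned right in the URL. A minimal sketch (the std version and module path are just illustrative):

    // no package.json, no registry config -- the URL is the dependency
    import { assertEquals } from "https://deno.land/std@0.140.0/testing/asserts.ts";

    assertEquals(1 + 1, 2);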

How does this solve the problem? As far as I can tell, it replaces a centralized dependency list (requirements.txt) pointing at a known and trusted registry (pypi.org) with URLs scattered across all the files in your project (and in your third-party libraries' files). That makes maintenance harder and security updates harder still, and it makes debugging harder too, since you now have to keep in mind that multiple versions of the same package can be in use in your source tree. Also, requirements.txt already supports URLs if you want to pull your sources from other places.
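For instance, pip accepts direct URL references in a requirements file alongside normal pinned releases (package names and the URL below are just placeholders):

    # requirements.txt -- a registry pin next to a direct URL reference
    requests==2.31.0
    some-internal-lib @ https://example.com/pkgs/some-internal-lib-1.2.0.tar.gz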

For me, most of the pain with Python packaging went away after I started using Pip-tools[0]. It's just a simple utility that adds lockfile capabilities to Pip. Nothing new to learn, no new philosophies or paradigms, no PEP waiting to be adopted by everyone. Just good old requirements.txt + Pip.
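Roughly, the workflow looks like this (package names are just placeholders):

    # requirements.in -- only your top-level dependencies
    django
    requests

    $ pip-compile requirements.in    # resolves and writes a fully pinned requirements.txt
    $ pip-sync requirements.txt      # installs/removes packages so the env matches the lockfile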

[0] https://github.com/jazzband/pip-tools