It's trickier than just pinning dependencies because some libraries also need to build C code, etc. Once you bring in external build tools, you have that many more potential points of failure. It's great. Also, what happens if your dependencies don't pin their dependencies? Possibly, uploading a package to PyPI should require freezing dependencies, or do it automatically.

Modern Python tooling like pipenv pins the dependencies of your dependencies as well, so this is no longer an issue.
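
As a rough sketch (assuming a reasonably recent pipenv; check its docs for exact behavior):

    pipenv install requests    # adds requests to Pipfile
    # Pipfile.lock now pins requests *and* its transitive deps
    # (certifi, idna, urllib3, ...) to exact versions with hashes
    pipenv sync                # on another machine: install exactly what's locked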

I used a requirements.in file to list out all the top-level direct dependencies & then used pip-compile from pip-tools to convert that into a frozen list of versioned dependencies. pip-compile is also nice because it doesn't upgrade unless explicitly asked to, which makes collaboration really nice.

I then used the requirements.txt & various supporting tooling to auto-create & keep updated a virtualenv (so that my peers didn't need to care about Python details & just running the tool was reliable on any machine). It was super nice, but there's no existing tooling out there to do anything like that & it took about a year or two to get the tooling into a nice place. It's surprisingly hard to create Python scripts that work reliably out-of-the-box on everyone's environments without the user having to do something (which in my experience always means that something doesn't work right).

C modules were more problematic (needing Xcode installation on OSX, potentially precompiled external libraries not available via pip but also not installed by default), but I created additional scripts to help bring a new developer's machine to a "good state" to take manual config out of the equation. That works in a managed environment where "clean" machines all share the same known starting state + configs - I don't know how you'd tackle this problem in the wild.
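
Roughly, the pip-tools half of that workflow looks like this (a sketch; my wrapper scripts around it are omitted):

    # requirements.in -- only the direct, top-level deps
    requests
    click>=7

    # compile it into a fully pinned requirements.txt
    pip-compile requirements.in            # writes requirements.txt with every transitive dep pinned
    pip-compile --upgrade requirements.in  # upgrades only when explicitly asked

    # make a virtualenv match the pinned set exactly
    pip-sync requirements.txt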

I do think there's a lot of low-hanging fruit here: Python could bake something in to auto-setup a virtualenv for a script entrypoint, have the developer just list the top-level dependencies, keep the frozen dependency list version controlled, & rebuild the virtualenv if it & the frozen version-controlled dependency list disagree.
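
Nothing like this is built in today; the script below is just a hypothetical sketch of the idea (Unix paths assumed, "main.py" & the file names are made up):

    # bootstrap.py -- hypothetical: create/refresh a venv from a pinned
    # requirements.txt, then re-exec the real entrypoint inside it.
    import hashlib, os, subprocess, sys, venv

    VENV_DIR = ".venv"
    PINNED = "requirements.txt"
    STAMP = os.path.join(VENV_DIR, ".requirements.sha256")

    def pinned_hash():
        with open(PINNED, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    def ensure_venv():
        current = pinned_hash()
        stamped = open(STAMP).read() if os.path.exists(STAMP) else None
        if stamped != current:  # venv missing or frozen list changed -> rebuild
            venv.create(VENV_DIR, with_pip=True, clear=True)
            pip = os.path.join(VENV_DIR, "bin", "pip")
            subprocess.check_call([pip, "install", "-r", PINNED])
            with open(STAMP, "w") as f:
                f.write(current)

    if __name__ == "__main__":
        ensure_venv()
        py = os.path.join(VENV_DIR, "bin", "python")
        os.execv(py, [py, "main.py"] + sys.argv[1:])  # hand off to the real script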

I don't know if it'd work the same way, but I've had a lot of success with Twitter's Pex files. They package an entire Python project into an archive with autorun functionality. You distribute a Pex file, users run it just like a Python file, and it'll build/install its dependencies, etc. before running the main script in the package.
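
Rough usage, from memory (double-check against the Pex docs; the package & entrypoint names here are made up):

    # bundle the deps + an entrypoint into a single self-running archive
    pex requests click -e myapp.cli:main -o myapp.pex
    # ship myapp.pex anywhere with a compatible interpreter & just run it
    ./myapp.pex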

I used it to distribute dependencies to YARN workers for PySpark applications and it worked flawlessly, even with crazy dependencies like tensorflow. I'm a really big fan of the project; it's well done.

https://github.com/pantsbuild/pex