This highlights the main issue I have with python today: running python apps.

Having to ship a compiler to a host or container to build python with pyenv, needing all kinds of development headers like libffi for poetry. The hoopla around poetry init (or pipenv's equivalent) just to get a deterministic environment and set of packages. Or you use requirements files and don't get deterministic results at all.

Or you run what is effectively an entire operating system on top of your OS just to use a conda derivative.

And we still haven't executed a lick of python.

Then there's the rigmarole around getting one of these environments to play nice with cron, having to manipulate your PATH so you can manipulate your PATH further to make the call.
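For illustration, this is roughly the kind of thing you end up with in the crontab (the paths and script name here are made up, not from any real setup): you set PATH once so cron's minimal environment can even find the wrapper, and the wrapper then rearranges PATH again for its own interpreter.

    # hypothetical crontab: prepend the pyenv shims dir so cron's bare-bones
    # PATH can find the wrapper, which then sets up PATH again inside its venv
    PATH=/home/me/.pyenv/shims:/usr/local/bin:/usr/bin:/bin
    */15 * * * * cd /home/me/tools && ./report.sh >> /tmp/report.log 2>&1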

It's really started me questioning my assumptions about what language "quick wins" should be written in.

I decided to drag myself kicking and screaming into the 21st century and start writing my handy-dandy utility scripts in python instead of bash. All was well and good until I made them available to the rest of my team, and suddenly I was in python dependency hell. I searched the internet and there are plenty of different solutions, but they all have their problems and there's no standard answer.

I decided "to heck with it" and went back to bash. There's no built-in JSON parser but I can use 'grep' and 'cut' as well as anyone so the end result is the same. I push it to our repo, I tell coworkers to run it, and I wash my hands of the thing.

When I was at Google I had a similar problem (the team wasn't using Blaze). What I did was put a wrapper entrypoint around every python entrypoint that would just run that python entrypoint (e.g. foo would execute foo.py). The advantage was that the shell script would first set up a virtual environment for that entrypoint and install all the packages in the requirements.txt that sat beside it (removing any that were no longer listed). Each requirements.txt was compiled from a requirements.in file via pip-compile from pip-tools [1], which meant devs only had to worry about declaring the packages they directly depended on. Any change to requirements.in required you to have re-run pip-compile, which wouldn't (by default) upgrade any packages and would only pin whatever versions were currently resolved (automated unit tests validated that every requirements.txt matched its requirements.in file).
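A rough sketch of what such a wrapper could look like, reconstructed from the description above rather than the actual script (the foo/foo.py names and venv layout are assumptions), using pip-tools [1]:

    #!/usr/bin/env bash
    # foo — hypothetical wrapper that runs the foo.py sitting next to it.
    # Creates a per-entrypoint venv, syncs it to the pinned requirements.txt
    # (installing what's missing, removing what's no longer listed), then
    # execs the real entrypoint.
    set -euo pipefail
    here="$(cd "$(dirname "$0")" && pwd)"
    venv="$here/.venv-foo"

    [ -d "$venv" ] || python3 -m venv "$venv"
    "$venv/bin/pip" install --quiet pip-tools
    "$venv/bin/pip-sync" "$here/requirements.txt"

    exec "$venv/bin/python" "$here/foo.py" "$@"

Developers would only ever edit requirements.in; running pip-compile requirements.in regenerates requirements.txt and by default keeps existing pins rather than upgrading them.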

This didn't solve the problem of multiple versions of python on the host. That was managed by a bootstrap script, written in python2, that anyone wanting to run the tools would run (no "getting started guides"). It set the development environment up to a consistent state (e.g. install homebrew, install required packages), versioned itself, and was idempotent (generally robust against being run multiple times). We also shipped this to our external partners in the factory. It generally worked well, since once you'd run the necessary scripts no further internet access was required.
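The real script was python2, but the idempotent, self-versioning shape of it looks roughly like this in shell (all names, version numbers, and packages below are invented for illustration):

    #!/usr/bin/env bash
    # bootstrap.sh — hypothetical sketch of an idempotent, versioned bootstrap.
    set -euo pipefail

    BOOTSTRAP_VERSION=3
    marker="$HOME/.bootstrap_version"

    # "Versioned itself": skip everything if this version has already run.
    if [ -f "$marker" ] && [ "$(cat "$marker")" -ge "$BOOTSTRAP_VERSION" ]; then
      echo "already bootstrapped (version $(cat "$marker"))"
      exit 0
    fi

    # Each step checks state before acting, so re-running is harmless.
    if ! command -v brew >/dev/null 2>&1; then
      # official Homebrew installer one-liner
      /bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
    fi

    for pkg in git jq python@3.11; do
      brew list "$pkg" >/dev/null 2>&1 || brew install "$pkg"
    done

    echo "$BOOTSTRAP_VERSION" > "$marker"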

It wasn't easy, but eventually it worked super reliably.

[1] https://github.com/jazzband/pip-tools