I 100% agree with this. But it's been pretty hilarious how obstinate tool makers are about moving to Python's new pyproject.toml [1]

Python packaging is a bit of a mess already, so when I recently started a new small package I wanted to choose tooling that wouldn't clutter the project root, but a fair number of tool makers were reluctant to accept code into their repos that would use the unified toml rather than a ton of separate files.

A .config directory is probably a better solution than a single top-level toml, IMHO, but far more important is doing something unified rather than continuing to pollute the top-level namespace, which merely obscures the project structure.
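
For anyone who hasn't run into it yet, the consolidation that the single toml buys you looks roughly like this (the package name and settings are just illustrative; each tool documents its own [tool.*] table):

    # pyproject.toml -- replaces setup.py/setup.cfg plus a pile of per-tool dotfiles
    [build-system]
    requires = ["setuptools>=61", "wheel"]
    build-backend = "setuptools.build_meta"

    [project]
    name = "mypackage"
    version = "0.1.0"

    # config that would otherwise each live in its own top-level file:
    [tool.black]
    line-length = 100

    [tool.pytest.ini_options]
    testpaths = ["tests"]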

[1] https://snarky.ca/what-the-heck-is-pyproject-toml/

Python is unified around a requirements.txt file with pip for managing dependencies.

requirements.txt has a lot of deficiencies for package management (compare it to the various features Yarn or Cargo offer). The biggest weakness is the lack of transitive dependency locking: even if you specify the exact versions of the dependencies you want, pip can still resolve their dependencies to different versions, and even then there's no verification that the content matches what is expected beyond the version number (other tools record a content checksum). This can cause a long tail of reproducibility and maintenance issues.
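
To make that concrete (package names here are just an example):

    # requirements.txt -- the direct dependency is pinned exactly...
    requests==2.31.0
    # ...but requests itself pulls in urllib3, certifi, idna and
    # charset-normalizer, none of which are listed here, so two installs
    # a month apart can legitimately resolve those to different versions.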

It’s also a tiny subset of what the parent describes.

I strongly recommend https://github.com/jazzband/pip-tools to solve this. It provides a simple script to take a requirements file and "compile" a full specification of all transitive dependencies. You check both files into the repo, point pip at the generated file, and manually modify the other one. It means you often don't need to pin requirements manually at all, and the versions will be explicitly updated whenever you choose to recompile your requirements.
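
The workflow looks roughly like this (package versions and the exact output format are illustrative):

    # requirements.in -- the file you edit by hand
    requests

    $ pip-compile requirements.in                      # writes requirements.txt
    $ pip-compile --generate-hashes requirements.in    # same, plus sha256 hashes

    # requirements.txt (generated) -- every transitive dependency pinned:
    #   certifi==2023.7.22            # via requests
    #   charset-normalizer==3.2.0     # via requests
    #   idna==3.4                     # via requests
    #   requests==2.31.0              # via -r requirements.in
    #   urllib3==2.0.4                # via requests

    $ pip install -r requirements.txt     # or: pip-sync requirements.txt

When you want newer versions, you rerun pip-compile (optionally with --upgrade) and review the diff of the generated file.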