I don't know what the solution is, but it feels like this is a much bigger issue and we need to rethink how OSes work by default. Apple has taken some steps over the last two macOS releases: access to certain folders is blocked for most executables until the user explicitly grants permission. Unfortunately, for things like Python the permission is granted to the Terminal app, so once given, every program running under the terminal inherits it.

Microsoft has started adding short-lived VMs. No idea if that's good. Both Microsoft and Apple offer their app stores with more locked-down experiences, though I'm sad they conflate app security with app markets.

Basically, any time I run software, every time I run "make" or "npm install" or "pip install" or download a game on Steam, I'm having to trust thousands of strangers not to grab my keys, my photos, my docs, etc...

I think you should be in control of your machine, but IMO it's time to default to locked down instead of defaulting to open.

When I first realized that any and all code I execute has read/write permission to most of my filesystem, it blew my mind. The OS grants every process its own virtual memory space specifically to prevent malicious or accidental interference with other processes. The file system really should operate on a similar principle: every application should run in a sandboxed environment by default, with the user granting exceptions for the specific applications that actually do need access to the whole file system.

This is already possible on Linux with mount namespaces; systemd, for example, uses them to block a service's access to /home if the user configures it that way.
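To make that concrete, here's a minimal sketch of what such a systemd drop-in might look like. The directive names (ProtectHome=, ProtectSystem=, PrivateTmp=, ReadWritePaths=) are real systemd options implemented via mount namespaces; the service name "myservice" and its state path are hypothetical placeholders.

```ini
# /etc/systemd/system/myservice.service.d/sandbox.conf
# Drop-in for a hypothetical service called "myservice"
[Service]
ProtectHome=yes                    # /home, /root, /run/user appear empty and inaccessible
ProtectSystem=strict               # the entire file system is mounted read-only for the service
PrivateTmp=yes                     # the service gets its own private /tmp
ReadWritePaths=/var/lib/myservice  # explicit exception: the one path it actually needs to write
```

This is the "deny by default, grant exceptions" model from the previous paragraph, just scoped to services rather than desktop apps.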