God damn... this is it, this is the end-game. There's no way to fight this unless you customize and maintain blocking scripts for each individual website.

Yes, websites could always have done this, but the cost of the CDN-bypassing REST requests and the manual maintenance of the telemetry endpoints and storage were an impediment. Google now just gives them a drop-in solution :(

I think Google is happy to eat some of the cost of the "proxy" server given the abundance of data they'll be gobbling up: not just each request's query string and the user's IP address but, the endpoint being a first-party subdomain, all the first-party cookies as well. I don't have the time or energy to block JavaScript and/or manually inspect each domain's requests to figure out whether they use server-side tracking.
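One way to semi-automate that inspection: server-side tagging endpoints are often just a first-party subdomain whose canonical DNS name points at a known collection host. A minimal sketch of that heuristic, assuming a hand-maintained (and definitely not exhaustive) suffix list; `looks_like_tagging_host` and the suffixes are my own invention, not anything Google documents:

```python
# Hypothetical heuristic: flag a first-party subdomain whose canonical DNS
# name ends in a suffix associated with tagging/collection infrastructure.
# The suffix list is an assumption and will need per-site maintenance.
SUSPECT_SUFFIXES = (
    ".googletagmanager.com",
    ".google-analytics.com",
    ".run.app",  # Google Cloud Run, where server-side GTM containers are often hosted
)

def looks_like_tagging_host(canonical_name: str) -> bool:
    """Return True if the resolved canonical name ends in a suspect suffix."""
    name = canonical_name.rstrip(".").lower()
    return name.endswith(SUSPECT_SUFFIXES)

# In practice you'd first resolve the subdomain's CNAME yourself, e.g. with
# socket.gethostbyname_ex() or `dig +short CNAME metrics.example.com`,
# then feed the canonical name into the check:
print(looks_like_tagging_host("gtm-abc123.uc.r.run.app"))  # True
print(looks_like_tagging_host("cdn.example.com"))          # False
```

Note this only catches CNAME-based setups; a site that reverse-proxies the tagging server behind its own IP is invisible to DNS inspection.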

I honestly don't know if there's any solution to this at all. Maybe an archive.is-like service that renders a static copy of the page (as an image, at the extreme), or a Tor-like service that randomizes one's IP address and browser fingerprint.

Just block Google Tag Manager itself. Gets two birds stoned at the same time.

How would you do that? Isn't it the server that talks to Google Tag Manager, not the browser?

Use uMatrix or uBlock Origin and block individual domains (note that uMatrix is archived and no longer maintained):

https://github.com/gorhill/uMatrix
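For the per-site case, a sketch of what the "My filters" entries might look like in uBlock Origin's static filter syntax. The GTM rule is standard syntax; `metrics.example.com` is a hypothetical first-party tracking subdomain standing in for whatever you identify on a given site:

```
! Block Google Tag Manager's loader everywhere
||googletagmanager.com^

! Hypothetical: block one site's server-side tagging subdomain,
! which you have to identify manually per site
||metrics.example.com^
```

This still doesn't help against the server-side setups discussed above, where the tracking hides behind a subdomain you can't distinguish from legitimate first-party traffic without inspection.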