So Google would need to build a discriminator that detects machine-generated content. It will be interesting to see these discriminators fight the generators of other big companies.

I'll be taking a front-row seat for that show, if it happens. Perhaps in the future we'll have to deal with discriminators that approximate some originality index. It will be fun fighting those algorithms just to interact with the internet as a normal user (to some extent we already do: proving that you're human is becoming more and more tedious).

In practice I think this is less an arms race Google has a chance of winning and more another policy reason for Google to banhammer prolific and irritating blogspammers.

Google isn't yet effective at detecting blogspam generated by naive scripts that simply swap words in the source material for thesaurus synonyms. I don't think they're going to start picking up the continuity errors, factual errors, or general "weirdness" in GPT-3 output (which often satisfies human readers) any time soon.
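For context, the naive "article spinner" described above is trivially simple. A minimal sketch, using a tiny stand-in synonym dict (real spinners use large thesaurus databases and are slightly smarter about capitalization):

```python
import re

# Stand-in thesaurus; a real spammer's synonym database is far larger.
THESAURUS = {
    "big": "large",
    "fast": "quick",
    "important": "crucial",
}

def spin(text: str) -> str:
    """Replace each known word with a synonym, leaving punctuation intact.

    Note: this deliberately ignores capitalization and word sense, which is
    exactly the kind of "weirdness" such spam exhibits.
    """
    def swap(match: re.Match) -> str:
        word = match.group(0)
        return THESAURUS.get(word.lower(), word)

    return re.sub(r"[A-Za-z]+", swap, text)

print(spin("A big and important update, delivered fast."))
# → A large and crucial update, delivered quick.
```

The point is that this requires no model at all, yet the output still evades detection at scale.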

Google engineers can't even filter out GitHub- or Stack Overflow-scraping sites like gitmemory, nor offer users a way to block those sites themselves. I'm not sure we should expect them to handle more advanced techniques, like detecting word swaps, any time soon.

This filter list for uBO will likely be useful to you:

https://github.com/quenhus/uBlock-Origin-dev-filter
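For anyone unfamiliar with how such a list works: uBlock Origin supports cosmetic filters that hide matching elements on a page, including procedural `:has()` selectors that can hide a search result based on its destination link. An illustrative rule (not verbatim from that list; the class name `.g` for a Google result container is an assumption about Google's current markup) might look like:

```
! Hide Google results linking to a scraper site (illustrative example)
google.*##.g:has(a[href*="gitmemory.com"])
```

Rules like this hide the offending entries client-side rather than blocking the sites outright, so clicking through still works if you ever want to.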