Here are a few things I do to combat nasty websites:

- blacklist entire domains using wildcards (I run an "unbound" DNS resolver, force all DNS traffic to it, and prevent my browser from using DoH -- I can still use DoH if I want, from unbound itself); see the first sketch after this list

- reject or drop a huge number of known bad actors, from regularly updated lists: they go into gigantic "ip sets" firewall rules (second sketch below)

- (I came up with this one): use a little firewall rule that prevents any IDN from resolving. It's a one-line UDP rule and it stops IDN homograph attacks cold: it simply looks for the "xn--" string (the Punycode prefix) in outgoing UDP packets (third sketch below).
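
For the unbound part, here is a minimal sketch. Assumptions on my side: unbound runs on the gateway box, your distro's unbound includes drop-in config from /etc/unbound/unbound.conf.d/ (the path varies), and iptables does the NAT. The blocked domain name is a placeholder.

```sh
cat > /etc/unbound/unbound.conf.d/blacklist.conf <<'EOF'
server:
    # NXDOMAIN for the domain and everything under it -- the "wildcard" effect
    local-zone: "evil-tracker.example." always_nxdomain
    # Firefox's DoH canary domain: an NXDOMAIN here tells Firefox to keep
    # its default DoH switched off
    local-zone: "use-application-dns.net." always_nxdomain
EOF
unbound-checkconf && systemctl restart unbound

# force any plain-DNS traffic leaving the LAN back to the local unbound
iptables -t nat -A PREROUTING -p udp --dport 53 -j REDIRECT --to-ports 53
iptables -t nat -A PREROUTING -p tcp --dport 53 -j REDIRECT --to-ports 53
```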
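
For the ip sets, roughly this -- "badactors" and blocklist.txt are placeholder names, the real entries come from whatever feeds you trust:

```sh
# one CIDR per line in blocklist.txt, re-imported on a schedule
ipset create badactors hash:net -exist
while read -r net; do
    ipset add badactors "$net" -exist
done < blocklist.txt
iptables -A INPUT  -m set --match-set badactors src -j DROP
iptables -A OUTPUT -m set --match-set badactors dst -j REJECT
```

For really big lists, writing an `ipset restore` file and loading it in one shot is much faster than adding entries one by one.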
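
And the IDN rule itself. DNS writes the query name into the packet label by label, so the Punycode prefix of any IDN shows up as the literal bytes "xn--"; restricting the match to port 53 (my choice in this sketch) keeps it to DNS queries only:

```sh
# drop any outgoing UDP DNS query that contains the IDN prefix "xn--"
iptables -A OUTPUT -p udp --dport 53 -m string --algo bm --string "xn--" -j DROP
```

On a gateway you would probably want the same rule on the FORWARD chain for LAN clients, and a twin `-p tcp --dport 53` rule if you care about DNS over TCP.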

I do not care what this breaks. The Web still works totally fine for me, including Google's G Suite (yeah, I know).

EDIT: just to be clear, seeing the comments I realize I wasn't very precise... I'm not saying all IDN domains are bad! What I'm saying is that in my day-to-day Web surfing, 99.99% of the websites I use do not use IDNs, so in my case blocking IDNs has, up until today, been totally fine: it doesn't prevent me from surfing the Web (I haven't seen a single site I need break) and it protects me from IDN homograph attacks. Your mileage may vary: if you live in a country where it's normal to visit websites with internationalized domain names, then obviously you cannot simply drop all UDP packets attempting to resolve IDNs.

Steven Black maintains a regularly updated hosts file on GitHub: https://github.com/StevenBlack/hosts

There are a bunch of file variants to weed out specific bad actors.
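
If you want to try it, something along these lines works; the raw URL below is the repo's base list, and the category variants live under alternates/ in the repo -- check it for the exact path of the combination you want:

```sh
# keep a copy of your original hosts file, then append the blocklist to it
cp /etc/hosts /etc/hosts.orig
curl -fsSL https://raw.githubusercontent.com/StevenBlack/hosts/master/hosts -o /tmp/sb-hosts
cat /etc/hosts.orig /tmp/sb-hosts > /etc/hosts
```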

It's well curated, though fair warning: it has broken a few websites for me in the past. Maybe that's a good thing.