I use a slightly customized version of the Energized Protection[1] block list, which acts as a DNS sinkhole but is really just a text file that you paste into /etc/hosts. Before that I was using Pi-hole, but I found it too cumbersome to maintain properly. (Additionally, /etc/hosts entries are way easier to scan, modify, and verify for non-maliciousness, IMO.)
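For reference, sinkhole entries in /etc/hosts just map unwanted hostnames to a non-routable address, so lookups resolve locally and never reach the ad server. A minimal sketch (the blocked domains below are placeholders, not entries from the actual list):

```
# /etc/hosts
127.0.0.1   localhost

# Sinkholed hostnames -- 0.0.0.0 fails fast without touching the network
0.0.0.0     ads.example.com
0.0.0.0     tracker.example.net
```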

In my browser I use uMatrix, since it gives me fine-grained control over what websites can do. I have very strict default policies that break most sites, but you can set them to whatever you want.
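A strict default policy in uMatrix's rule syntax (`source destination request-type action`) might look something like the following; this is an illustrative ruleset, not the one I actually use:

```
* * * block
* 1st-party css allow
* 1st-party image allow
* 1st-party * allow
```

This blocks everything by default and only re-enables first-party requests, which is roughly why most sites break until you whitelist their third-party dependencies per site.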

Additionally, I've written my own regex-based request blocker[2] for YouTube mid-roll and page ads, since I don't trust other, more opaque ad-blocking solutions that handle those (like Adblock Plus). However, it does break every other Google service I'm aware of. (Which I could patch, but I don't really mind.)
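The core idea of a regex-based request blocker is simple: test each outgoing request URL against a list of patterns and cancel it on a match. A minimal sketch of that matching logic, with illustrative patterns that are my own guesses rather than the addon's actual rules:

```python
import re

# Illustrative ad-related URL patterns -- NOT the actual rules
# shipped in ytblocker; real rules would be tuned against
# YouTube's current ad endpoints.
BLOCK_PATTERNS = [
    re.compile(r"doubleclick\.net/"),
    re.compile(r"/pagead/"),
    re.compile(r"/api/stats/ads"),
]

def should_block(url: str) -> bool:
    """Return True if the request URL matches any blocking pattern."""
    return any(p.search(url) for p in BLOCK_PATTERNS)
```

In a WebExtension this predicate would run inside a `webRequest.onBeforeRequest` listener that returns `{cancel: True}` on a match; the coarse patterns above are also why unrelated Google services can break.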

[1]: https://github.com/EnergizedProtection/block

[2]: https://addons.mozilla.org/en-US/firefox/addon/ytblocker

Don't you notice a slowdown in your connection when using a 20 MB hosts file?

In my experience, the size of the hosts file matters on some devices and some operating systems.

On older versions of Windows, for example, networking and browsing slow noticeably as the size of the hosts file increases.

The same can be said for rootable mobile devices, though it's less noticeable off Wi-Fi, because the much higher cellular latency masks the extra lookup time.

I would guess that low-end consumer and home routers also suffer with larger hosts files, but I don't have sufficient experience to claim this for certain.
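One common way to shrink a large hosts file without dropping entries relies on the fact that hosts(5) allows several hostnames after a single address, so the address only has to be repeated once per line. A sketch of that packing step, assuming the usual one-entry-per-line `0.0.0.0 host` input format:

```python
def pack_hosts(lines, names_per_line=9):
    """Repack one-hostname-per-line sinkhole entries onto fewer lines.

    "0.0.0.0 a.com b.com c.com" blocks all three names, so packing
    several hostnames per line cuts file size by repeating the
    address far less often.
    """
    names = []
    for line in lines:
        entry = line.split("#", 1)[0].split()  # strip comments, tokenize
        if len(entry) >= 2 and entry[0] in ("0.0.0.0", "127.0.0.1"):
            names.extend(entry[1:])
    return [
        "0.0.0.0 " + " ".join(names[i:i + names_per_line])
        for i in range(0, len(names), names_per_line)
    ]
```

Whether this actually helps depends on how a given resolver parses the file; on implementations that scan line by line, fewer lines can also mean faster lookups.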

Background: years of discussions and issues at https://github.com/StevenBlack/hosts, which I maintain.