As a non-English speaker, I find this pretty useless. I always want the English version of any website, not some poor translation, and honestly the fingerprinting entropy is negligible. This seems like PR-driven development, just to score points with the privacy-aware community.

Every time I see tech being proposed for privacy instead of legislation I wonder how the topic is kept so vague. There are a handful of companies that can track you across multiple websites, so any real solution has to start by enumerating and addressing those companies.

It feels like we're always discussing curing "diseases" without explicitly saying that malaria, TB, etc. are the targets.

> There are a handful of companies that can track you across multiple websites, so any real solution has to start by enumerating and addressing those companies.

I think it's more like hundreds than handfuls. But they're all connecting your behavior across sites using the same few techniques:

* Explicit methods: cookies, link decoration, and other browser-supported ways of adding entropy. Browsers are working on removing these, but if they move too aggressively here then adtech just moves to:

* Fingerprinting: using existing browser entropy. Generally worse than explicit methods because the user doesn't have control (ex: shared fingerprint between successive private browsing sessions). Browsers are also working on reducing this, see the article, but it's very hard because the number of techniques is large and they generally use features users/sites depend on.

* Timing attacks (pretty sure no one is doing this commercially yet)
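To make the fingerprinting point concrete: a tracker doesn't need to store anything in the browser, it just hashes attributes the browser already exposes into a stable identifier. A minimal sketch (the attribute names and values here are hypothetical examples, not any specific vendor's signal set):

```python
import hashlib

def fingerprint(attributes: dict) -> str:
    """Combine passively observable browser attributes into a stable ID.

    No cookies involved: the same browser yields the same hash even across
    successive private-browsing sessions, which is why the user has no
    control over it.
    """
    # Canonical ordering so the hash doesn't depend on dict insertion order.
    canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# Hypothetical values a script could read without any stored state.
browser = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64) ...",
    "screen": "2560x1440x24",
    "timezone": "America/New_York",
    "language": "en-US",
    "installed_fonts_hash": "a3f9c2",
}

print(fingerprint(browser))  # same ID on every visit, private browsing or not
```

This also shows why reduction is hard: each input doubles as a feature sites legitimately use (layout, localization, font rendering), so browsers can't simply remove the entropy.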

You might be interested in https://github.com/michaelkleber/privacy-model

(Disclosure: I used to work on ads at Google)

> I think it's more like hundreds than handfuls.

Could you be a bit more explicit here? I'd be curious how these entities coordinate, since I don't have hundreds of cookies set by any website, so there must be a few networks correlating this data (FB, Google, some IAB groups, who else?). Going after those networks would seem like the obvious next step.

Thanks for the link. It seems like an honorable direction, but it's a bit nebulous on how browsers would be incentivized to implement it with good intentions.

> I don't have hundreds of cookies set by any website

Are you sure? These are third-party cookies, and it's not easy to get a full list. One way to do it is to go to a major publisher (NYT, CNN, etc) with devtools open and networking enabled. Filter to third party requests and look for ones sending cookies. Trying this on the NYT front page I saw 3p requests with cookies to amazon-adsystem.com, doubleclick.net, prebid.media.net, rubiconproject.com, adnxs.com, 3lift.com, openx.net, google.com, scorecardresearch.com, casalemedia.com, pubmatic.com, bluekai.com, adsrvr.org, bing.com, twitter.com, everesttech.net, criteo.com, dotomi.com, bidswitch.net, mfadsrvr.com, agkn.com, pswec.com, adtdp.com, demdex.net, bidr.io, adition.com, brand-display.com, intentiq.com, w55c.net, pippio.com, rlcdn.com, and adsymptotic.com before I got bored and stopped counting. Some of these might not be for personalized advertising, but most of them look like it.
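If counting by hand in devtools gets tedious, you can export the Network panel to a HAR file ("Save all as HAR") and filter it with a few lines of script. A rough sketch; the synthetic HAR below stands in for a real export, and the two-label host collapsing is a simplification that mishandles suffixes like co.uk:

```python
from urllib.parse import urlsplit

def third_party_cookie_hosts(har: dict, first_party: str) -> set[str]:
    """Hosts that received a Cookie header and aren't the first-party site."""
    hosts = set()
    for entry in har["log"]["entries"]:
        host = urlsplit(entry["request"]["url"]).hostname or ""
        sent_cookie = any(
            h["name"].lower() == "cookie" for h in entry["request"]["headers"]
        )
        if sent_cookie and not host.endswith(first_party):
            # Collapse subdomains, e.g. securepubads.g.doubleclick.net
            hosts.add(".".join(host.split(".")[-2:]))
    return hosts

# Tiny synthetic HAR in the same shape DevTools exports.
har = {"log": {"entries": [
    {"request": {"url": "https://www.nytimes.com/",
                 "headers": [{"name": "Cookie", "value": "nyt-a=x"}]}},
    {"request": {"url": "https://securepubads.g.doubleclick.net/gampad/ads",
                 "headers": [{"name": "Cookie", "value": "IDE=y"}]}},
    {"request": {"url": "https://static.example-cdn.com/lib.js",
                 "headers": []}},
]}}

print(sorted(third_party_cookie_hosts(har, "nytimes.com")))
# → ['doubleclick.net']
```

Run against a real front-page HAR, this spits out a list much like the one above.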

> browsers would be incentivized to implement with good intentions.

Browsers compete on privacy, and what they do is open source. So while their incentives aren't perfect, external groups (and competing browsers!) can help keep them honest by paying attention and calling attention to bad decisions.

A great example of this was Mozilla's thorough and careful privacy analysis of FLoC (https://blog.mozilla.org/en/privacy-security/privacy-analysi...), and looking at Topics (https://github.com/patcg-individual-drafts/topics) Chrome seems to have spent a lot of time addressing that feedback.