What does HackerNews think of psl-problems?

Ryan Sleevi has written about this before on Hacker News, and here's his list: https://github.com/sleevi/psl-problems

It's definitely possible that Ryan would consider using this for HN a reasonable choice, since the use here is mostly cosmetic, but in general you just shouldn't add more dependencies.

Before you begin to make use of the PSL, consider some of its problems: https://github.com/sleevi/psl-problems

FWIW, the link above successfully convinced me and a coworker not to use the PSL.

(googler here, but this is my opinion)

I think there's a big abstraction gap between what we use domains for and what they were designed for, such that we shouldn't assume ownership based on the domain alone.

For instance, you can have a number of sites that use separate domains but are owned by the same entity (N domains for 1 party). You can also have the same base domain serving several unrelated parties, think of hosting a store on Shopify (1 domain for N parties). This is so ambiguous that even inside the browser there are two different implementations of how this attribution is handled: one for cookies and one for the Same-Origin Policy.
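
To make the two notions concrete, here's a minimal Python sketch. It assumes the third-party tldextract library (which bundles a PSL snapshot); the hostnames are made up, and the origin check is simplified (it doesn't normalize default ports):

```python
from urllib.parse import urlsplit

import tldextract  # third-party; bundles a snapshot of the PSL

# Include the PSL's "private" section so entries like myshopify.com count.
extract = tldextract.TLDExtract(include_psl_private_domains=True)


def same_origin(a: str, b: str) -> bool:
    """Same-Origin Policy: scheme, host, and port must all match.
    Simplified: default ports (80/443) are not normalized."""
    ua, ub = urlsplit(a), urlsplit(b)
    return (ua.scheme, ua.hostname, ua.port) == (ub.scheme, ub.hostname, ub.port)


def same_site(a: str, b: str) -> bool:
    """Cookie-style attribution: compare registrable domains (eTLD+1)."""
    return extract(a).registered_domain == extract(b).registered_domain


# Different hosts, so different origins, but the same registrable domain:
print(same_origin("https://alice.example.com/", "https://bob.example.com/"))  # False
print(same_site("https://alice.example.com/", "https://bob.example.com/"))    # True

# myshopify.com is on the PSL, so each store is treated as its own "site":
print(same_site("https://store1.myshopify.com/", "https://store2.myshopify.com/"))  # False
```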

There's a good write-up about this problem at https://github.com/sleevi/psl-problems. Sometimes I wonder how the web got here, given the amount of kludge we have to carry.

Reminder that the Public Suffix List is a non-scalable hack, and platforms should be reducing their reliance on it, not increasing it:

https://github.com/sleevi/psl-problems

It sounds like there are two cases:

1. Multi-tenant domains that probably should've always been in the PSL (e.g., to provide cookie silos) but whose operators are only now realizing it, due to the arrival of PCM (Private Click Measurement).

2. Sites that want to abuse an eTLD to do something like give all users on their social network a custom subdomain so that they're not polluting the same pool.

--

I think it was actually reasonable for Apple to consider the PSL, as it's basically the most comprehensive eTLD list we have, and using it would allow them to match browser behavior.

The problem now is that case (1) is submitting a bunch of inclusion requests at once, because something will now actually break for these sites. Before now it was really just them being lax about security and not considering that cookies should be siloed. This isn't a unique situation, by the way: the PSL also saw a large increase in inclusion requests when Let's Encrypt added rate limits based on eTLDs.
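
To illustrate why PSL inclusion matters for that kind of limit: a rate limiter keyed on registrable domain (eTLD+1) treats every tenant of a PSL-listed suffix as its own bucket. This is a rough sketch in the spirit of Let's Encrypt's per-registered-domain limit, not their actual implementation; the cap and helper names are made up:

```python
from collections import Counter

import tldextract  # third-party; bundles a snapshot of the PSL

extract = tldextract.TLDExtract(include_psl_private_domains=True)

WEEKLY_CERT_CAP = 50  # illustrative cap, not Let's Encrypt's actual policy
issued = Counter()


def rate_limit_key(hostname: str) -> str:
    """Group certificate issuance by registrable domain (eTLD+1)."""
    return extract(hostname).registered_domain


def may_issue(hostname: str) -> bool:
    """Allow issuance until the bucket for this registrable domain is full."""
    key = rate_limit_key(hostname)
    if issued[key] >= WEEKLY_CERT_CAP:
        return False
    issued[key] += 1
    return True


# github.io is on the PSL, so each user site is its own bucket; if it
# weren't listed, all of *.github.io would share one "github.io" bucket.
print(rate_limit_key("alice.github.io"))  # alice.github.io
print(rate_limit_key("www.example.com"))  # example.com
```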

(2) is obviously bad and there's really no other justification for these sites being in the PSL.

Therefore I think it's reasonable for PSL to deny inclusion requests that are solely for PCM reasons.

This all being said, the PSL is a massive hack [1] and really needs to be replaced by something else. It's probably about time for these companies to invest in a replacement.

[1]: https://github.com/sleevi/psl-problems

Ryan Sleevi suggests (perhaps that's putting it too mildly) that further reliance on the PSL is a bad idea:

https://github.com/sleevi/psl-problems and https://news.ycombinator.com/item?id=24441942

His recommendation is to use the Same-Origin Policy as much as possible or, if you must, a slightly weakened variant, such as one that ignores port numbers.
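
For concreteness, here's a tiny sketch of such a weakened check (my own illustration, not Sleevi's): compare scheme and host, ignore the port:

```python
from urllib.parse import urlsplit


def same_origin_ignoring_port(a: str, b: str) -> bool:
    """Weakened origin check: scheme and host must match; port is ignored."""
    ua, ub = urlsplit(a), urlsplit(b)
    return (ua.scheme, ua.hostname) == (ub.scheme, ub.hostname)


print(same_origin_ignoring_port("https://example.com:8443/a",
                                "https://example.com/b"))        # True
print(same_origin_ignoring_port("https://example.com/a",
                                "https://evil.example.com/b"))   # False
```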

I suspect I'll make another comment elsewhere in this discussion that says more about the problems with that.

PSL maintainer here: please don’t use the PSL!

Yes, it’s weird to have a maintainer asking people not to use their project, but the PSL was a very specific (and unfortunate) hack for a very specific (and unfortunate, and browser-created) problem. It is something we live with, not something we like. While the ideal world is “don’t use any list at all, use the protocols as God, the IETF, and IANA intended”, if you are going to use a list, using the IANA list, updated daily, is much better than the PSL.
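
For what "using the IANA list, updated daily" might look like in practice, here's a minimal sketch (needs network access; the parsing is mine, but the URL is IANA's published TLD list):

```python
import urllib.request

IANA_TLDS_URL = "https://data.iana.org/TLD/tlds-alpha-by-domain.txt"


def fetch_iana_tlds() -> set[str]:
    """Download IANA's TLD list; the first line is a '#' comment header."""
    with urllib.request.urlopen(IANA_TLDS_URL) as resp:
        lines = resp.read().decode("ascii").splitlines()
    return {line.strip().lower() for line in lines
            if line and not line.startswith("#")}


tlds = fetch_iana_tlds()
print("com" in tlds)    # True
print("co.uk" in tlds)  # False: IANA lists only TLDs, not PSL-style suffixes
```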

Do not use the PSL for anything that is not “cookies abusing the Host header”:

https://github.com/sleevi/psl-problems
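
To make that one sanctioned use concrete: the PSL's job in cookie handling is to stop a page on a shared suffix from setting a Domain cookie that every sibling tenant would receive. A toy sketch of that check, with a tiny stand-in suffix set instead of the real list:

```python
# Toy stand-in for the real list; the logic mirrors the cookie Domain check.
PUBLIC_SUFFIXES = {"com", "co.uk", "github.io"}


def cookie_domain_allowed(request_host: str, cookie_domain: str) -> bool:
    """A host may set a cookie for itself or a parent domain, but never for
    a bare public suffix, which would leak it to every sibling tenant."""
    cookie_domain = cookie_domain.lstrip(".")
    if cookie_domain in PUBLIC_SUFFIXES:
        return False
    return (request_host == cookie_domain
            or request_host.endswith("." + cookie_domain))


print(cookie_domain_allowed("alice.github.io", "alice.github.io"))  # True
print(cookie_domain_allowed("alice.github.io", "github.io"))        # False
print(cookie_domain_allowed("www.example.com", "example.com"))      # True
```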