For sites without account creation, sure, you might be losing a ton of good content... but it won't matter. The spam, harassment, trolling, illegal content, and the pinning of responsibility on the platform owner will almost certainly get you shut down. The problem of how to scale moderation of human communication is unsolved. Until it gets solved, accounts--really, identity and authentication--are not really optional... at least not on any website I would choose to run.

The first part of solving this problem may lie in tying identity to micropayments for posting and reading updates/comments on content. We could do this by putting the long-reserved 402 Payment Required HTTP status code to work over the Lightning Network. A proxy that implements this already exists: https://github.com/lightninglabs/aperture
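As a rough sketch of the challenge/response shape such a paywall proxy could use: the server answers an unpaid request with 402 and a Lightning invoice, and accepts a retry that carries proof of payment (the preimage of the invoice's payment hash). The header name, token format, and field layout below are illustrative, not Aperture's exact wire protocol.

```python
import hashlib


def paywall_response(invoice: str, macaroon: str) -> tuple[int, dict]:
    """Build a 402 challenge asking the client to pay a Lightning invoice
    before the request is served (header shape is illustrative)."""
    return 402, {
        "WWW-Authenticate": f'L402 macaroon="{macaroon}", invoice="{invoice}"'
    }


def check_payment(auth_header: str, macaroon: str, payment_hash: str) -> bool:
    """Accept a retried request only if it presents the macaroon plus a
    preimage whose SHA-256 matches the invoice's payment hash."""
    try:
        scheme, token = auth_header.split(" ", 1)
        presented_macaroon, preimage_hex = token.split(":", 1)
        preimage = bytes.fromhex(preimage_hex)
    except ValueError:
        return False
    return (
        scheme == "L402"
        and presented_macaroon == macaroon
        and hashlib.sha256(preimage).hexdigest() == payment_hash
    )
```

The nice property is that the proxy never needs a user database: possession of the preimage is the proof that this request was paid for.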

The second part is to increase the cost of producing bad content for a particular identity. A proof-of-stake-style system, staked on the identity and its content rather than on a coin, might allow moderation to scale reasonably. Proof-of-stake systems:

> work by selecting validators in proportion to their quantity of holdings

In this particular case, an identity's body of content plays the role of the holdings, and the other identities who validate it act as the selecting validators. The more high-quality validations an identity accumulates through the consensus of other identities, the cheaper it becomes for that identity to comment. If the identity later becomes a bad actor, subsequent consensus from other identities raises the cost of posting new content linked (through the identity) to its older, higher-quality work. In a way, the content an identity produces becomes its "ego" or "account".
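One way the pricing could work, as a sketch only: start from a base fee, discount it as net consensus approval grows, and multiply it as net disapproval grows. The base fee, the discount curve, and the doubling penalty here are all assumed parameters, not a fixed protocol.

```python
def posting_cost(base_fee_sats: int, approvals: int, disapprovals: int) -> int:
    """Price a new post by an identity's validation history.

    Net consensus approval discounts the fee toward a 1-sat floor;
    net disapproval doubles it per negative validation. All constants
    here are illustrative.
    """
    score = approvals - disapprovals
    if score >= 0:
        # Each net positive validation further discounts the fee.
        return max(1, base_fee_sats // (1 + score))
    # Each net negative validation doubles the fee.
    return base_fee_sats * (2 ** (-score))
```

A new identity with no history pays the full base fee; a trusted identity posts almost for free; a bad actor quickly prices itself out of the forum, which is the whole point.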

But with cryptographic identities, no traditional "account" is needed. A tag may be added to content in a particular system to indicate which transaction and identity paid to add it. The system preserves the privacy of participating individuals, beyond whatever they leak by posting their thoughts to a public forum.
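Concretely, such a tag might just bind a content hash to the paying identity's public key and the payment that funded the post. This sketch (field names are assumptions; a real system would also carry a signature from the identity's key) shows the shape:

```python
import hashlib
from dataclasses import dataclass


@dataclass(frozen=True)
class ContentTag:
    """Binds a piece of content to the identity and payment that added it."""
    content_hash: str     # SHA-256 of the posted content
    identity_pubkey: str  # the poster's public key: their only "account"
    payment_id: str       # the payment that funded the post


def tag_content(content: bytes, identity_pubkey: str, payment_id: str) -> ContentTag:
    """Create the tag attached alongside the content when it is published."""
    return ContentTag(hashlib.sha256(content).hexdigest(), identity_pubkey, payment_id)


def verify_tag(content: bytes, tag: ContentTag) -> bool:
    """Anyone can check that a tag really refers to this exact content."""
    return hashlib.sha256(content).hexdigest() == tag.content_hash
```

Because the identity is just a public key, the reputation mechanics above can accumulate against it without the platform ever storing an email address or password.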