Through the Tor Uplift Project,[1] Tor Browser's Fingerprinting Protection feature is now available in Firefox on desktop and Android. The feature makes Firefox hide or normalize some pieces of identifying information for the sites you interact with, such as your timezone (which is reported as UTC), some of your fonts, your keyboard layout/language, and parts of your user agent (for example, your browser version is reported as the latest ESR version).[2]

To enable Fingerprinting Protection in Firefox, go to about:config and set privacy.resistFingerprinting to true.
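If you keep your prefs in a file, the same setting can go in user.js in your Firefox profile directory instead; this is standard Firefox pref machinery, nothing specific to this feature:

    // user.js in your Firefox profile directory
    // (about:support shows where the profile lives)
    user_pref("privacy.resistFingerprinting", true);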

Some Firefox forks enable Fingerprinting Protection by default, including LibreWolf[3] (desktop) and Mull[4] (Android). If you are on Android, the release version of Firefox does not include access to about:config, and you'll need to either switch to Firefox Beta/Nightly or use a fork like Mull, Fennec F-Droid, or Iceraven to take advantage of this feature.

[1] https://wiki.mozilla.org/Security/Tor_Uplift

[2] https://support.mozilla.org/en-US/kb/firefox-protection-agai...

[3] https://librewolf.net

[4] https://f-droid.org/en/packages/us.spotco.fennec_dos/

I just altered that setting and now Firefox resets my zoom level to 100% on every page I read. It makes HN unreadable, as my default zoom for this site is 170%, and having to set it on every page I visit gets old very quickly!

Interesting side effect though.

Yes, disabling site-specific zoom is one of the things that Resist Fingerprinting does.[1] If you want to use site-specific zoom while keeping Resist Fingerprinting enabled, the Zoom Page WE add-on[2] should allow this. I've just tried it and it worked for me.

[1] https://bugzilla.mozilla.org/show_bug.cgi?id=1369357

[2] https://addons.mozilla.org/en-US/firefox/addon/zoom-page-we

Why does a website need to know my zoom level, i.e., why is this information even made available?

This is the problem with the modern web, which has become an app distribution platform. When you treat the browser as an OS, you need to expose a lot of information for stuff to work.
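To make that concrete (my own illustration, not something from the article): there's no explicit "zoom level" API, but a script can estimate it from values every browser exposes, and that's exactly the kind of incidental signal Resist Fingerprinting tries to normalize.

    // Rough, illustrative heuristics -- not exact, but close enough to fingerprint with.
    // On a standard-DPI display, desktop page zoom scales devicePixelRatio directly.
    const zoomFromDpr = Math.round(window.devicePixelRatio * 100);
    // Comparing outer and inner window widths gives another noisy estimate.
    const zoomFromWidths = Math.round((window.outerWidth / window.innerWidth) * 100);
    console.log(zoomFromDpr, zoomFromWidths);
    // With privacy.resistFingerprinting enabled, Firefox resets page zoom to 100%
    // (the behaviour described above), so these values stop varying per site.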

It would be very interesting to develop a modern web based purely on declarative content (modern HTML/CSS). HTMX is an interesting take on this, although it's currently implemented as server-provided JS; I don't see a reason why such patterns couldn't be implemented by the browser itself.
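For reference, here is roughly what that declarative pattern looks like with htmx today (the hx-* attributes are real htmx ones; the /search endpoint is just a made-up placeholder):

    <!-- declarative "active search": the attributes describe the behaviour,
         and htmx's server-provided JS is what currently interprets them -->
    <input type="search" name="q"
           hx-get="/search"
           hx-trigger="keyup changed delay:300ms"
           hx-target="#search-results">
    <div id="search-results"></div>

Nothing in that markup inherently needs a script tag; a browser could interpret those attributes natively, which is the point above.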

i would very much like for htmx to not have to exist and just have the functionality subsumed into the HTML spec

it wouldn't be much work

Hey, thanks for taking the time to reply. Have you maybe gotten in touch with hacker-friendly browsers such as nyxt? There may be some interested people over there.

Also, are there any good venues to discuss the semantic/declarative web with you htmx folks and hopefully people from other like-minded projects? IRC? XMPP? Matrix?

Sorry for the delayed response: nope, never talked w/ the nyxt folks. I tried to post a topic on the working group thingie but they, understandably, weren't very receptive.

We use discord for chat right now:

https://htmx.org/discord

Well, the good thing about nyxt is that it's super extensible, so a PoC doesn't require proper "reception" on their side.

Do you maybe have a gateway/bridge to a libre network such as IRC/XMPP/Matrix? I find HTMX pretty interesting, but I wouldn't touch Discord with a 10-foot pole, if only because my limited computing resources won't allow such a resource-hungry app to run in the background.

It seems like matterbridge supports a Discord backend, but I don't have a Discord account to try it with. If you're not willing to host matterbridge, I'm already hosting one and would just need credentials to try connecting it to Discord. If you're willing to give that a try, feel free to mail me at my username @ thunix.net.

https://github.com/42wim/matterbridge
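For context, a matterbridge gateway between a Discord channel and an IRC channel is just a short TOML config, roughly this shape (section and key names are from memory of matterbridge's sample config, so double-check against matterbridge.toml.sample in the repo; the token, server ID, and channel names are placeholders):

    [irc.libera]
    Server="irc.libera.chat:6667"
    Nick="htmx-bridge"

    [discord.htmx]
    Token="your-bot-token"
    Server="your-discord-server-id"

    [[gateway]]
    name="htmx"
    enable=true

    [[gateway.inout]]
    account="irc.libera"
    channel="#htmx"

    [[gateway.inout]]
    account="discord.htmx"
    channel="general"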