f.lux author here, still slogging through the article. The light levels are hard to interpret because they report "power" (uW, mW) from a laser rather than "irradiance" (uW/cm^2 or mW/cm^2), so it is unclear what area that light was concentrated over. All I can tell is that it came from a laser, so the irradiance could be extremely high.
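
To make the units concrete (my numbers, not the paper's): even a modest power becomes a huge irradiance when focused onto a small spot.

    # Illustrative only -- these values are mine, not from the paper.
    # Irradiance = power / area, so the spot size matters enormously.
    import math

    power_mw = 1.0          # a modest 1 mW beam
    spot_diameter_mm = 0.5  # assume it is focused onto a 0.5 mm spot

    spot_area_cm2 = math.pi * (spot_diameter_mm / 10 / 2) ** 2
    irradiance_mw_cm2 = power_mw / spot_area_cm2

    # ~509 mW/cm^2, several times the total irradiance of full sunlight
    # (roughly 100 mW/cm^2 integrated across the whole spectrum)
    print(f"{irradiance_mw_cm2:.0f} mW/cm^2")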

The human lens filters out most light at the absorption peak of the spectrum given for free retinal (383nm), so by the time you get to 450nm, where LEDs peak, the hazard function in the visible range is 100x less sensitive; see Fig 1 here:

https://onlinelibrary.wiley.com/doi/pdf/10.1111/j.1751-1097....

It bears repeating that computer screens have <5% the hazard-weighted irradiance of a blue sky. (Can't make a direct comparison with lasers.)
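
For anyone unfamiliar with the term, "hazard-weighted irradiance" just means spectral irradiance integrated against a hazard action spectrum. A minimal sketch of the calculation, with placeholder numbers (a real analysis would use a published action spectrum, such as the blue-light hazard function in IEC 62471):

    # Hazard-weighted irradiance: sum of spectral irradiance x hazard
    # weight x bandwidth. All values below are made-up placeholders,
    # shaped roughly like a screen spectrum and a blue-light hazard curve.

    spectral_irradiance = {  # uW/cm^2 per nm, keyed by wavelength in nm
        430: 0.05, 440: 0.08, 450: 0.12, 460: 0.10, 470: 0.07,
    }
    hazard_weight = {        # action spectrum, normalized to 1.0 at peak
        430: 0.95, 440: 1.00, 450: 0.90, 460: 0.80, 470: 0.60,
    }

    bandwidth_nm = 10
    weighted = sum(spectral_irradiance[wl] * hazard_weight[wl] * bandwidth_nm
                   for wl in spectral_irradiance)
    print(f"hazard-weighted irradiance: {weighted:.2f} uW/cm^2")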

I'm not sure the conversion to white light is correct, and it is unclear to me right now whether this much retinal is available in vivo.

Hijacking this for a moment to say thanks for f.lux :) I've been running it for years and it's one of those things where you don't really notice the effect until you turn it off and then EGADS MY EYES.

Windows 10 has this built in (Night light), and so does macOS (Night Shift).

So does Fedora, courtesy of recent versions of GNOME, which include a "Night Light" feature.
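
It's also scriptable. A small sketch via gsettings (schema and key names are from recent GNOME versions; verify with "gsettings list-keys org.gnome.settings-daemon.plugins.color"):

    # Toggling GNOME's Night Light from Python by shelling out to
    # gsettings. Key names are from recent GNOME versions; check them
    # with: gsettings list-keys org.gnome.settings-daemon.plugins.color
    import subprocess

    SCHEMA = "org.gnome.settings-daemon.plugins.color"

    def gset(key: str, value: str) -> None:
        """Write one gsettings key, raising if the command fails."""
        subprocess.run(["gsettings", "set", SCHEMA, key, value], check=True)

    gset("night-light-enabled", "true")
    gset("night-light-temperature", "3500")  # kelvin; lower = warmer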

Ubuntu got it too with the switch to GNOME. Before that, I used the open-source https://github.com/jonls/redshift
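
The core trick behind redshift-style tools is simple: pick per-channel RGB gains for a target color temperature and scale the display's gamma ramps by them. A rough sketch of the idea (not redshift's actual code; the gain table is illustrative, not a calibrated whitepoint):

    # Approximate a color temperature by scaling RGB gamma ramps.
    # The whitepoint table is illustrative; real tools use a finer,
    # calibrated table derived from the Planckian locus.

    WHITEPOINT = {  # kelvin -> (r, g, b) gains, 6500 K = neutral
        3000: (1.00, 0.69, 0.40),
        4000: (1.00, 0.83, 0.65),
        5000: (1.00, 0.92, 0.82),
        6500: (1.00, 1.00, 1.00),
    }

    def gains_for(temp_k: int) -> tuple:
        """Linearly interpolate RGB gains between table entries."""
        temps = sorted(WHITEPOINT)
        temp_k = max(temps[0], min(temps[-1], temp_k))  # clamp to table
        for lo, hi in zip(temps, temps[1:]):
            if lo <= temp_k <= hi:
                t = (temp_k - lo) / (hi - lo)
                return tuple((1 - t) * a + t * b
                             for a, b in zip(WHITEPOINT[lo], WHITEPOINT[hi]))

    def gamma_ramp(gain: float, size: int = 256) -> list:
        """A linear 16-bit ramp scaled by one channel's gain (XRandR-style)."""
        return [int(gain * i / (size - 1) * 65535) for i in range(size)]

    r, g, b = gains_for(3500)
    ramps = [gamma_ramp(x) for x in (r, g, b)]  # handed to the display server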