What does HackerNews think of WebGL_Compute_shader?
WebGL 2.0 Compute shader Demos
https://registry.khronos.org/webgl/specs/latest/2.0-compute/
https://github.com/9ballsyndrome/WebGL_Compute_shader
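For context, a rough sketch of what the draft API looked like, based on the spec and demos linked above. This is illustrative only: the 'webgl2-compute' context only ever existed behind a Chrome flag and never shipped, so don't expect it to run in today's browsers.

    // Draft WebGL 2.0 Compute: get the (never-shipped) compute-capable context.
    const gl: any = document.createElement('canvas').getContext('webgl2-compute');

    // GLSL ES 3.10 compute shader: doubles every element of a storage buffer.
    const src = `#version 310 es
    layout(local_size_x = 64) in;
    layout(std430, binding = 0) buffer Data { float data[]; };
    void main() {
      data[gl_GlobalInvocationID.x] *= 2.0;
    }`;

    const shader = gl.createShader(gl.COMPUTE_SHADER);
    gl.shaderSource(shader, src);
    gl.compileShader(shader);
    const program = gl.createProgram();
    gl.attachShader(program, shader);
    gl.linkProgram(program);

    // Upload 256 floats into a shader storage buffer bound at binding 0.
    const input = new Float32Array(256).fill(1);
    const ssbo = gl.createBuffer();
    gl.bindBuffer(gl.SHADER_STORAGE_BUFFER, ssbo);
    gl.bufferData(gl.SHADER_STORAGE_BUFFER, input, gl.DYNAMIC_COPY);
    gl.bindBufferBase(gl.SHADER_STORAGE_BUFFER, 0, ssbo);

    // Dispatch 256 / 64 = 4 workgroups and read the result back.
    gl.useProgram(program);
    gl.dispatchCompute(input.length / 64, 1, 1);
    gl.memoryBarrier(gl.SHADER_STORAGE_BARRIER_BIT);
    const result = new Float32Array(input.length);
    gl.getBufferSubData(gl.SHADER_STORAGE_BUFFER, 0, result);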
And then Google decided not to implement it after all, because of WebGPU.
https://github.com/9ballsyndrome/WebGL_Compute_shader/issues...
What I can say about that, and about where computation is headed generally, is that we will all be writing code for vector processors soon. It's a sea change, and the throughput advantage is an edge for anyone who can use it now and in the near future. That's what this article is about. Google is the only other company making vector processors for compute at scale, and they have the BEST tools for people making use of them in their cloud service, even though they are generally provisioning Nvidia GPUs.
AWS doesn't ship computers and is lagging behind in its compute services. Nvidia with CUDA changes the landscape in a crazy way. You might not care about it now, but having an understanding of it is, IMO, critical for anyone who plans to be working on computers in the next 2-10 years, unless you have a really untouchable position. Even if you think your position is untouchable, you might be in for a shock when someone blows the doors off your business logic with CUDA and you can't catch up.
Regardless of GPU choice, "write once" (or close to it) cross-platform compute shaders[0][1][2] are coming in 2020, and there's no way anyone is going to bump CUDA out of being at the front of that.
[0] https://en.wikipedia.org/wiki/WebGPU
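For a sense of what that "write once" compute path turned into, here is a minimal sketch of a WebGPU compute dispatch. It uses the API shape that eventually shipped (WGSL, createComputePipeline, dispatchWorkgroups); the 2019-era drafts this comment refers to looked different, so treat the names here as assumptions based on the final spec rather than what was available at the time.

    // Doubles every element of `values` on the GPU via a WebGPU compute pass.
    async function doubleOnGpu(values: Float32Array): Promise<Float32Array> {
      const adapter = await navigator.gpu.requestAdapter();
      const device = await adapter!.requestDevice();

      // WGSL compute shader: one invocation per array element.
      const module = device.createShaderModule({
        code: `
          @group(0) @binding(0) var<storage, read_write> data: array<f32>;
          @compute @workgroup_size(64)
          fn main(@builtin(global_invocation_id) id: vec3<u32>) {
            data[id.x] = data[id.x] * 2.0;
          }`,
      });

      const pipeline = device.createComputePipeline({
        layout: 'auto',
        compute: { module, entryPoint: 'main' },
      });

      // Storage buffer the shader writes, plus a staging buffer for readback.
      const storage = device.createBuffer({
        size: values.byteLength,
        usage: GPUBufferUsage.STORAGE | GPUBufferUsage.COPY_SRC | GPUBufferUsage.COPY_DST,
      });
      device.queue.writeBuffer(storage, 0, values);
      const readback = device.createBuffer({
        size: values.byteLength,
        usage: GPUBufferUsage.MAP_READ | GPUBufferUsage.COPY_DST,
      });

      const bindGroup = device.createBindGroup({
        layout: pipeline.getBindGroupLayout(0),
        entries: [{ binding: 0, resource: { buffer: storage } }],
      });

      // Encode the compute pass, copy the result out, and submit.
      const encoder = device.createCommandEncoder();
      const pass = encoder.beginComputePass();
      pass.setPipeline(pipeline);
      pass.setBindGroup(0, bindGroup);
      pass.dispatchWorkgroups(Math.ceil(values.length / 64));
      pass.end();
      encoder.copyBufferToBuffer(storage, 0, readback, 0, values.byteLength);
      device.queue.submit([encoder.finish()]);

      await readback.mapAsync(GPUMapMode.READ);
      return new Float32Array(readback.getMappedRange().slice(0));
    }

The point of the sketch is that the same JavaScript and shader code is meant to run on top of D3D12, Metal, or Vulkan, which is the "write once" part; CUDA still sits underneath none of this, which is why the comment doubts it gets displaced.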
And here is a draft spec for WebGL2 Compute: https://www.khronos.org/registry/webgl/specs/latest/2.0-comp...
Only a matter of time until this is available in Chrome and Firefox by default. Probably never in Safari though, since it doesn't even support WebGL 2.