As someone who works with Node.js often, this is cool and I'm glad it exists. But if you're doing heavy server-side computation where serious parallelization actually matters, I'm skeptical that writing those computations in JS is a reasonable choice outside of small hobby projects. I could be wrong, but I suspect the performance would compare very unfavorably to other GPU-parallelized alternatives, and the low performance ceiling for JS in this domain means you also wouldn't have a rich ecosystem of complementary libraries.

Those criticisms haven't really been true of JS for years. Google has invested tens of millions of dollars in making JS fast. And GPU acceleration is largely independent of the particular language that happens to drive the GPU: the heavy lifting runs in kernels on the device, and the host language mostly marshals data and launches them.

If the critique is "static typing good, JS bad" then that's a fair argument. But personally I'm excited to have something akin to a scripting language for a GPU.

> But personally I'm excited to have something akin to a scripting language for a GPU.

There are a few of these already, depending on what you mean by a scripting language.

You have Numba for Python [0], with official backing from NVIDIA, and turbo.js [1] for JS. (There's a small Numba sketch after the links below.)

Alea for F# might qualify too, thanks to its REPL and how quickly you can get off the ground. [2]

[0] https://developer.nvidia.com/how-to-cuda-python

[1] https://turbo.js.org/

[2] https://developer.nvidia.com/alea-gpu
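
To make "scripting language for a GPU" concrete, here's a minimal sketch of what a Numba CUDA kernel looks like (my own example, not from the Numba docs; it assumes a CUDA-capable GPU with numba and the CUDA toolkit installed):

    import numpy as np
    from numba import cuda

    @cuda.jit
    def saxpy(a, x, y, out):
        i = cuda.grid(1)          # global thread index
        if i < out.shape[0]:      # guard threads past the end of the array
            out[i] = a * x[i] + y[i]

    n = 1_000_000
    x = np.random.rand(n).astype(np.float32)
    y = np.random.rand(n).astype(np.float32)
    out = np.zeros_like(x)

    threads = 256
    blocks = (n + threads - 1) // threads
    # Launch the kernel; Numba copies the NumPy arrays to the device and back.
    saxpy[blocks, threads](2.0, x, y, out)

The point is you write ordinary Python, decorate it, and launch it straight from a REPL, with no separate CUDA C toolchain in your build.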

Neanderthal also exists, giving you REPL-based GPU programming in Clojure:

https://github.com/uncomplicate/neanderthal