As a non-web-developer, I'm kinda excited about WebGPU. Specifically WebGPU native, which has the potential to be a portable modern 3D graphics API without the difficulty of using Vulkan or DX12.

It can only expose a subset of those APIs' capabilities.

Don't expect to use mesh shaders on WebGPU, for example.

I'll say I am definitely biased in favor of WebGPU up front[2].

WebGPU is focused on landing 1.0 currently, so yes, more features are not on the immediate roadmap.

But they've been considering/investigating adding support for them since before mesh shaders in DX12 were even finalized[0], so it's not like there is some "we cannot / will not expose that functionality" mandate going on here. Despite its name, WebGPU is actually pretty much a 1:1 mapping of what is commonly available across the underlying APIs.

Separately, AMD doesn't support mesh shaders in Vulkan either; in fact, from what I understand, Vulkan doesn't support them at all outside of an Nvidia extension - and it sounds like Khronos has some concerns about whether they even map to all GPU architectures in a reasonable way at all.[1]

If you want to argue "but by dropping to the level of DX12, Vulkan, Metal, etc. I can use some very specific new features", that's totally true, but you can do that with WebGPU native too: after all, Dawn and other WebGPU implementations are just abstractions over those 3 APIs. You can just as easily hop into their code and add your own extension to make use of mesh shaders in DX12. At least Dawn's codebase is set up to allow for such extensions, from what I've seen.

[0] https://github.com/gpuweb/gpuweb/issues/445

[1] https://github.com/KhronosGroup/Vulkan-Docs/issues/1423#issu...

[2] https://devlog.hexops.com/2021/mach-engine-the-future-of-gra...

When WebGPU runs in the browser you definitely cannot do that; after all, working around the app stores is the whole point being discussed here.

Given the distance between WebGL 2.0 and GL ES 3.2, and the 10 years it took to make WebGL 2.0 available everywhere, good luck waiting for WebGPU to become mature enough for adoption.

Mesh shaders were the easiest example to refer to; there are plenty of 2021 native features that won't make it into version 1.0.

Then to top that, we get a shading language that seems like C++, Rust and HLSL had a child.
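For context, here's what that looks like in practice - a minimal WGSL vertex shader (names illustrative), mixing Rust-style `fn`/`->`/attribute syntax with HLSL-style builtin semantics:

```wgsl
// Rust-like attributes and function syntax, HLSL-like builtins.
@vertex
fn vs_main(@builtin(vertex_index) idx: u32) -> @builtin(position) vec4<f32> {
    // Hard-coded triangle in clip space, indexed by the vertex ID.
    var pos = array<vec2<f32>, 3>(
        vec2<f32>(0.0, 0.5),
        vec2<f32>(-0.5, -0.5),
        vec2<f32>(0.5, -0.5)
    );
    return vec4<f32>(pos[idx], 0.0, 1.0);
}
```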

> Given the distance between WebGL 2.0 and GL ES 3.2, and the 10 years it took to make WebGL 2.0 available everywhere, good luck waiting for WebGPU to become mature enough for adoption.

WebGL2 took four and a half years to complete (OpenGL ES 3.0 was published in August 2012, and final implementations of WebGL2 shipped in browsers in early 2017). Why did Apple then refuse to implement it for four more years? I don't know, but at least that isn't happening with WebGPU.

> Then to top that, we get a shading language that seems like C++, Rust and HLSL had a child.

Is that supposed to be a bad thing?

What guarantees do you have that it won't happen again?

Even if Apple had done it on time, it was a 2012 hardware API for 2017 hardware, and never exposed a complete ES 3.0 API surface, nor anything beyond it up to ES 3.2.

Intel had two failed attempts to bring compute into the browser.

Yes, it is a very bad thing, when Vulkan can keep using GLSL and HLSL, DX12 will happily accept any HLSL from the API's history, and Metal can use proper C++14 shaders.

WGSL is web politics as usual.

> What guarantees do you have that it won't happen again?

Apple was pretty clear about their intent not to ship WebGL2, and they did the opposite for WebGPU, so it's not gonna be the same story. Of course, I can't be 100% sure that Apple won't change their mind (like they eventually did for WebGL2), but there is no reason to believe they will.

> Even if Apple had done it on time, it was a 2012 hardware API for 2017 hardware

Most games released in 2017 had to support hardware from 2012 anyway. Even AAA games released this year support GPUs released in 2012[0]! For non-AAA games, a 5-year-old API is probably the newest you can afford to target. We're not talking about bringing bleeding-edge GPU tech to the web (that won't happen; it's never standardized anyway, like the mesh shaders you mentioned). The goal is to provide modern, standardized tech to developers, and WebGPU does that in a portable way, which makes it even more affordable.

> Intel had two failed attempts to bring compute into the browser.

So what?

> Yes, it is a very bad thing, when Vulkan can keep using GLSL and HLSL, DX12 will happily accept any HLSL from the API's history, and Metal can use proper C++14 shaders.

Vulkan uses SPIR-V, not HLSL or GLSL. Translation tooling exists for those, but it also exists for WGSL -> SPIR-V[1].

Each platform has its own shading language. Will this one be better than the others? I don't know, but I don't think it's gonna be worse either.

> WGSL is web politics as usual.

Not really. See this summary[2] from a Mozilla gfx engineer:

[0]: https://support.activision.com/black-ops-cold-war/articles/m...

[1]: https://github.com/gfx-rs/naga

[2]: https://kvark.github.io/webgpu-debate/SPIR-V.component.html