AFAIK modern browsers all use GPU-accelerated rendering, so using HTML/CSS does give you a GPU-backed user interface.
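For example, animating transform/opacity with a will-change hint is usually enough to get an element promoted to its own compositor layer, so the GPU handles the per-frame work. A rough sketch (plain DOM + Web Animations API, not Construct's actual code):

    // Hypothetical example: hint the browser to give this element its own
    // GPU-composited layer, then animate compositor-friendly properties.
    const panel = document.createElement('div');
    panel.textContent = 'GPU-composited panel';
    panel.style.willChange = 'transform, opacity'; // layer promotion hint
    document.body.appendChild(panel);

    // transform/opacity animations typically run on the compositor thread,
    // with the GPU blending the pre-rasterized layer each frame.
    panel.animate(
      [
        { transform: 'translateX(0px)', opacity: 1 },
        { transform: 'translateX(200px)', opacity: 0.5 },
      ],
      { duration: 500, iterations: Infinity, direction: 'alternate' }
    );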
Also desktop PWAs are IMO underrated and a really good way to distribute a desktop app without having to bundle a whole browser engine with your app. We do this for Construct (www.construct.net) - a complete game development IDE - and I think it's worked out great.
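The installable part is surprisingly small, too. Roughly speaking (a sketch with the conventional file names, not exactly what we ship), the page links a web app manifest and registers a service worker, and the browser then offers to install it as a desktop app:

    // Sketch of the minimum for an installable desktop PWA. Assumes index.html
    // links a manifest (<link rel="manifest" href="manifest.json"> with name,
    // icons, display: "standalone") and that sw.js exists; names are placeholders.
    async function registerPwa(): Promise<void> {
      if (!('serviceWorker' in navigator)) return; // falls back to a normal web page
      await navigator.serviceWorker.register('sw.js');
    }

    registerPwa().catch(console.error);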
Saying HTML/CSS is "GPU-backed" gives the wrong impression.
What's the point of saying a UI is GPU-backed (which usually means a UI hand-crafted with meshes, texture atlases, shaders, minimal draw calls, etc.) if it's also grouped together with one of the most bloated and inefficient means of putting pixels on the screen (HTML/CSS)?
I love the web, not hating at all, but this is comparing apples and oranges.
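To be concrete about what I mean by that hand-crafted style: all the widgets get packed into one vertex buffer that samples a shared texture atlas, and the whole UI goes out in a single draw call. A rough sketch (TypeScript/WebGL2 just for a familiar API; natively it'd be the same idea in OpenGL/Vulkan/Direct3D, and the data layout here is made up):

    // Each widget is one textured quad: a screen-space rectangle plus the
    // UV rectangle of its sprite in the shared atlas.
    interface Widget {
      x: number; y: number; w: number; h: number;     // screen-space rect
      u0: number; v0: number; u1: number; v1: number; // atlas UVs
    }

    // Pack every widget into one vertex buffer: 6 vertices per quad,
    // 4 floats per vertex (x, y, u, v).
    function buildQuadVertices(widgets: Widget[]): Float32Array {
      const out = new Float32Array(widgets.length * 6 * 4);
      let i = 0;
      for (const q of widgets) {
        const verts = [
          q.x,       q.y,       q.u0, q.v0,
          q.x + q.w, q.y,       q.u1, q.v0,
          q.x,       q.y + q.h, q.u0, q.v1,
          q.x + q.w, q.y,       q.u1, q.v0,
          q.x + q.w, q.y + q.h, q.u1, q.v1,
          q.x,       q.y + q.h, q.u0, q.v1,
        ];
        out.set(verts, i);
        i += verts.length;
      }
      return out;
    }

    // Draw the entire UI with one buffer upload and one draw call.
    // `program` is a trivial textured-quad shader; its setup is omitted.
    function drawUI(gl: WebGL2RenderingContext, program: WebGLProgram,
                    vbo: WebGLBuffer, atlas: WebGLTexture, widgets: Widget[]): void {
      gl.useProgram(program);
      gl.bindBuffer(gl.ARRAY_BUFFER, vbo);
      gl.bufferData(gl.ARRAY_BUFFER, buildQuadVertices(widgets), gl.DYNAMIC_DRAW);
      gl.enableVertexAttribArray(0);
      gl.vertexAttribPointer(0, 2, gl.FLOAT, false, 16, 0); // position
      gl.enableVertexAttribArray(1);
      gl.vertexAttribPointer(1, 2, gl.FLOAT, false, 16, 8); // atlas UV
      gl.activeTexture(gl.TEXTURE0);
      gl.bindTexture(gl.TEXTURE_2D, atlas);
      gl.drawArrays(gl.TRIANGLES, 0, widgets.length * 6);
    }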
The point is that GPU-backed is faster than not, which is why your browser (and your OS) is doing all those things except maybe hand-crafted meshes (and I don't agree that hand-crafted meshes are the usual definition of a GPU-backed UI). The GPU-backed UI does benefit from simple things like faster scrolling, especially on high-DPI displays, in addition to speeding up the rendering of new pixels for HTML/CSS/SVG/image/video elements. The point is that browsers are actively trying to fix the inefficiency you're referring to, and it might now be further along than you think.
BTW, the GPU-backed browser UI and the GPU-backed OS UI come with a bunch of things that someone's hand-rolled UI doesn't, namely transparent software fallback when a GPU isn't available, full compatibility with the UI spec, accessibility support, and tolerance for a wide range of GPUs. People making their own UIs directly almost never do these things (I'm guilty as charged), and as such their UIs are often some combination of more brittle, device-dependent, and limited to a narrower audience.
Sure, hand-crafted meshes might push the envelope too much (I work in game dev), but GPU-Backed UI refers to a specialized form of UI.
Have you read the article? They specifically mention how nowadays apps are just web pages instead of GPU-Backed UIs.
Trying to push the narrative that HTML / CSS is somehow a GPU-Backed UI is almost comical.
The browser is GPU-accelerated, it is not what anyone in the industry would consider a GPU-Backed UI.
> Trying to push the narrative that HTML / CSS is somehow a GPU-Backed UI is almost comical.
To me it rather seems like there are different definitions of what GPU-backed means.
> The browser is GPU-accelerated, it is not what anyone in the industry would consider a GPU-Backed UI.
In which industry?
As a former web dev I would consider any calculation GPU-backed if it is utilizing a GPU.
With that definition, and considering HTML/CSS is used to build user interfaces, rendering in modern browsers could certainly be called GPU-Backed UI.
If this is a term that already has a different meaning than just the words it is composed of, that's fair to mention. But it doesn't help a discussion to escalate with wording such as "Trying to push the narrative ... is almost comical".
It is stated right there in the article that "GPU-Backed UI" is one of the alternatives to the mainstream way of building apps nowadays, which are basically just web pages.
Now people argue that web pages are "GPU-Backed UI".
I get that the term might be misleading, but the article is talking about a specific way of building UI with the GPU, which does not include web pages, so trying to fit the web into that category makes no sense.
In some loose sense every UI is "GPU-Backed", so how do we name the specific way of building UI that the article is describing? They chose "GPU-Backed UI", and if anyone had said those words to me I would immediately have known what they were talking about, and I would never have included HTML/CSS in that category (even if in some broad, loose sense of the words it does fit).
Where to draw the line?