Can someone explain to me why the virtual DOM is faster than the browsers' own DOM implementations?

Actually changing the DOM has a lot of implications: recalculating CSS, layout, possible side effects that generate events, repainting, and so on. That's a lot of work to go through for an intermediate state that may not even last long enough to be visible to the user. Buffering and batching those changes can save a lot of that effort. In theory the browser could optimize this as well, but in practice most browsers are still optimized for rendering static pages quickly rather than for handling rapid changes.
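To make the cost concrete, here is a hypothetical example (the .box selector is made up): interleaving layout reads with writes forces the browser to recalculate layout on every iteration, while separating the reads from the writes lets it recalculate once.

    // Layout thrashing: after the first write, every offsetWidth read
    // forces a synchronous layout recalculation.
    const boxes = document.querySelectorAll('.box');
    boxes.forEach(box => {
      box.style.width = (box.offsetWidth / 2) + 'px'; // read + write, interleaved
    });

    // Batched version: all reads first, then all writes, so the browser
    // only has to recalculate layout once for the whole loop.
    const widths = Array.from(boxes, box => box.offsetWidth); // reads
    boxes.forEach((box, i) => {
      box.style.width = (widths[i] / 2) + 'px';               // writes
    });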

Yes, but WHY don't the browser makers optimize for what seems to be a very common use case? Why does a third-party library outperform them?

Because when your code changes the DOM, the browser does not know whether you are going to stop there for now or keep making changes, so it may have to recalculate styles and layout after each one (and it definitely will if you read a layout-dependent property in between). In React, your changes go to the virtual DOM first, and the resulting diff is applied to the real DOM in a single batch once you are done.
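A minimal sketch of that diffing idea (hypothetical node shape and function names, nothing like React's actual internals): compare the old and new trees and touch the real DOM only where they differ. Attribute diffing and keyed reordering are omitted for brevity.

    // Virtual nodes are either strings (text) or { tag, children } objects.
    function render(node) {
      if (typeof node === 'string') return document.createTextNode(node);
      const el = document.createElement(node.tag);
      node.children.forEach(child => el.appendChild(render(child)));
      return el;
    }

    function diffAndPatch(parent, oldNode, newNode, index = 0) {
      const el = parent.childNodes[index];
      if (typeof oldNode === 'string' || typeof newNode === 'string') {
        if (oldNode !== newNode) el.replaceWith(render(newNode)); // text changed
      } else if (oldNode.tag !== newNode.tag) {
        el.replaceWith(render(newNode)); // element type changed: rebuild subtree
      } else {
        // Same tag: recurse into the children both trees have,
        // then append new extras and drop old leftovers.
        const common = Math.min(oldNode.children.length, newNode.children.length);
        for (let i = 0; i < common; i++) {
          diffAndPatch(el, oldNode.children[i], newNode.children[i], i);
        }
        for (let i = common; i < newNode.children.length; i++) {
          el.appendChild(render(newNode.children[i]));
        }
        while (el.childNodes.length > newNode.children.length) {
          el.removeChild(el.lastChild);
        }
      }
    }

Unchanged subtrees never trigger a DOM mutation, which is exactly the work the browser would otherwise repeat on every update.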

In theory they could add new methods to the DOM API to allow for this, but they would be non-standard, and currently no browser has them.

Something like this maybe?

    batchDomChanges();
    /* ... DOM changes here ... */
    flushDomChanges();
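Something close to that can be approximated in userland today by queuing mutations and flushing them in one requestAnimationFrame callback, which is essentially what batching libraries do. A rough sketch (the queue and the batchDomChange name are made up):

    const pendingWrites = [];
    let flushScheduled = false;

    // Queue a DOM mutation instead of performing it immediately.
    function batchDomChange(write) {
      pendingWrites.push(write);
      if (!flushScheduled) {
        flushScheduled = true;
        // All queued writes run back to back right before the next paint,
        // so the browser recalculates layout at most once per batch.
        requestAnimationFrame(() => {
          pendingWrites.forEach(fn => fn());
          pendingWrites.length = 0;
          flushScheduled = false;
        });
      }
    }

    // Usage: these three mutations are coalesced into one layout pass.
    const el = document.body;
    batchDomChange(() => { el.style.width = '100px'; });
    batchDomChange(() => { el.style.minHeight = '50px'; });
    batchDomChange(() => { el.textContent = 'resized'; });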

Or use fastdom[1], which batches reads and writes to reduce layout thrashing.

[1]: https://github.com/wilsonpage/fastdom
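For reference, the fastdom pattern looks roughly like this (a sketch; the measure/mutate methods were named read/write in earlier versions, and the .box selector is made up):

    const element = document.querySelector('.box');

    fastdom.measure(() => {
      const width = element.offsetWidth;            // reads run together
      fastdom.mutate(() => {
        element.style.width = (width / 2) + 'px';   // writes run together
      });
    });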