Why is it always analog output with these kinds of devices? It seems to me like it should be easier to do some form of digital out, but you never see small microcontrollers implementing any kind of digital video output. Is it just because current video standards are bloated and terrible (HDCP, really complicated specs, etc.) or is there some more fundamental reason why analog out is so much easier?

I looked at how USB was bit-banged on an Arduino, and I'm sure some of the problems involved are common to this.

One constraint is raw clock speed: VGA needs a pixel clock of at least around 25 MHz, HDMI's serial lanes run at a minimum of around 250 Mbit/s each, and a Pico runs at around 133 MHz.
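
To put rough numbers on that for 640x480@60 (about the minimum mode), here is a back-of-envelope sketch assuming the standard 25.175 MHz pixel clock and a stock 133 MHz RP2040 system clock:

    #include <stdio.h>

    int main(void) {
        const double pixel_clock_hz  = 25.175e6; /* 640x480@60 pixel clock */
        const double bits_per_symbol = 10.0;     /* TMDS sends 8 data bits as 10 */
        const double pico_clock_hz   = 133e6;    /* RP2040 rated system clock */

        double lane_bit_rate    = pixel_clock_hz * bits_per_symbol;
        double cycles_per_pixel = pico_clock_hz / pixel_clock_hz;
        double cycles_per_bit   = pico_clock_hz / lane_bit_rate;

        printf("TMDS bit rate per lane : %.1f Mbit/s\n", lane_bit_rate / 1e6);
        printf("CPU cycles per pixel   : %.2f\n", cycles_per_pixel);
        printf("CPU cycles per TMDS bit: %.2f\n", cycles_per_bit);
        return 0;
    }

That works out to roughly 5 CPU cycles per pixel and about half a cycle per TMDS bit, so a serial DVI lane can't be bit-banged in software alone.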

It also takes clock cycles to update the data and run the encoding algorithms. Some of that work can be offloaded to the silicon that drives the pins, since those blocks often contain a small amount of programmable logic for this purpose or can be repurposed (the RP2040's PIO is one example), but usually not for all of the pins or all of the algorithms.
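
As a toy sketch of that kind of offloading on the RP2040 (the pin choice, clock divider and data here are my own arbitrary assumptions, and this is nowhere near real video timing), a PIO state machine can serialise whatever the CPU or DMA feeds its FIFO without the cores ever touching the pin:

    #include "hardware/pio.h"
    #include "hardware/pio_instructions.h"

    #define OUT_PIN 0  /* assumption: any free GPIO */

    int main(void) {
        PIO pio = pio0;
        uint sm = 0;

        /* One-instruction program: shift one bit of the OSR to the pin per cycle.
           Autopull refills the OSR from the TX FIFO every 32 bits. */
        uint16_t instr = pio_encode_out(pio_pins, 1);
        struct pio_program prog = { .instructions = &instr, .length = 1, .origin = -1 };
        uint offset = pio_add_program(pio, &prog);

        pio_gpio_init(pio, OUT_PIN);
        pio_sm_set_consecutive_pindirs(pio, sm, OUT_PIN, 1, true);

        pio_sm_config c = pio_get_default_sm_config();
        sm_config_set_out_pins(&c, OUT_PIN, 1);
        sm_config_set_out_shift(&c, true, true, 32); /* shift right, autopull at 32 bits */
        sm_config_set_clkdiv(&c, 1.0f);              /* run at the full system clock */
        sm_config_set_wrap(&c, offset, offset);      /* loop on the single instruction */
        pio_sm_init(pio, sm, offset, &c);
        pio_sm_set_enabled(pio, sm, true);

        /* Keep the FIFO fed from the CPU; a real design would use DMA. */
        while (true) {
            pio_sm_put_blocking(pio, sm, 0xA5A5A5A5u);
        }
    }

The point is only that the state machine shifts bits out at a fixed rate on its own; everything else (timing, sync, encoding) is still your problem.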

There are tricks that might get it working at certain lower resolutions and color depths, like overclocking, taking advantage of some harmonic, or clever use of an existing peripheral, but the more tricks you pile on to go faster, the more stability issues you have to mitigate or solve. And that barely touches physical constraints like cable length, signal integrity, voltage and current levels, overheating, etc., which may or may not apply.

A TMDS symbol is 10 bits long, so a pixel clock of 25 MHz needs a DVI bit rate of 250 Mbit/s per lane. You can generate DVI with an overclocked RP2040[1], but even then you have to halve the horizontal resolution because the general TMDS encoding algorithm is expensive to perform in software. You could, however, replace the VGA resistor DAC with a dedicated DVI encoder IC like the TFP410.
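
For anyone curious why it's expensive: below is a rough, untested C transcription of the per-channel TMDS encode step as the DVI spec describes it (not PicoDVI's actual code). Every pixel, on each of the three channels, you get bit counting, data-dependent branches, and a running disparity to maintain.

    #include <stdint.h>
    #include <stdio.h>

    static int ones(uint8_t v) {                 /* popcount of an 8-bit value */
        int n = 0;
        for (int i = 0; i < 8; i++) n += (v >> i) & 1;
        return n;
    }

    /* Encode one 8-bit value into a 10-bit TMDS symbol (returned in the low
       10 bits). 'disparity' carries the running DC balance between calls. */
    static uint16_t tmds_encode(uint8_t d, int *disparity) {
        /* Stage 1: XOR or XNOR chain, chosen to minimise transitions. */
        int n1 = ones(d);
        int use_xnor = (n1 > 4) || (n1 == 4 && (d & 1) == 0);
        uint16_t q_m = d & 1;
        for (int i = 1; i < 8; i++) {
            int prev = (q_m >> (i - 1)) & 1;
            int bit  = (d >> i) & 1;
            q_m |= (uint16_t)(use_xnor ? !(prev ^ bit) : (prev ^ bit)) << i;
        }
        if (!use_xnor) q_m |= 1u << 8;           /* bit 8 records which chain was used */

        /* Stage 2: DC balancing against the running disparity. */
        int n1q = ones((uint8_t)q_m), n0q = 8 - n1q;
        int bit8 = (q_m >> 8) & 1;
        uint16_t q_out;
        if (*disparity == 0 || n1q == n0q) {
            q_out = (uint16_t)((!bit8) << 9) | (uint16_t)(bit8 << 8)
                  | (bit8 ? (q_m & 0xFF) : (~q_m & 0xFF));
            *disparity += bit8 ? (n1q - n0q) : (n0q - n1q);
        } else if ((*disparity > 0 && n1q > n0q) || (*disparity < 0 && n0q > n1q)) {
            q_out = (uint16_t)(1u << 9) | (uint16_t)(bit8 << 8) | (~q_m & 0xFF);
            *disparity += 2 * bit8 + (n0q - n1q);
        } else {
            q_out = (uint16_t)(bit8 << 8) | (q_m & 0xFF);
            *disparity += -2 * (1 - bit8) + (n1q - n0q);
        }
        return q_out;
    }

    int main(void) {
        int disparity = 0;
        for (int d = 0; d < 4; d++)
            printf("0x%02x -> 0x%03x\n", d, tmds_encode((uint8_t)d, &disparity));
        return 0;
    }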

[1] https://github.com/Wren6991/PicoDVI