My first video accelerator was the Nvidia NV1, because a friend of mine was on the design team and he assured me that NURBS were going to be the dominant rendering model, since you could do a sphere with just 6 of them whereas triangles needed like 50 and it still looked like crap. But Nvidia was so tight-fisted with development details and all their "secret sauce" that none of my programs ever worked on it.

Then I bought a 3dfx Voodoo card and started using Glide, and it was night and day. I had something up the first day, and every day thereafter it seemed to get more and more capable. That was a lot of fun.

In my opinion, DirectX was what killed it most. OpenGL was well supported on the Voodoo cards, and Microsoft was determined to kill anyone using OpenGL (which they didn't control) to program games if they could. After about 5 years (DirectX 7 or 8) it had reached feature parity, but long before that the "co-marketing" dollars Microsoft used to enforce their monopoly had done most of the work.

Sigh.

Part of the success of DirectX over OpenGL was that very few graphics card companies seemed capable of producing a fully functional OpenGL driver; for the longest time Nvidia was the only option.

I recall ATI and Matrox both failing in this regard despite repeated promises.

Fully functional OpenGL was not exactly the issue, or not the only one.

OpenGL was stagnating at the time, and vendors started a feature war. In OpenGL you can have vendor-specific extensions, because it was meant for tightly integrated hardware and software. Vendors started leaning heavily on extensions to one-up each other.

The Khronos Group took ages to catch up and standardize modern features.

By that time GL extension checks had become nightmarishly complicated, and cross-compatibility was further damaged by vendors lying about their actual extension support: drivers started claiming support for features, but using the extension caused the scene to render incorrectly or crash outright.
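To make the extension-check pain concrete, here's a rough sketch of what that dance looked like in code. The extension names are real ones from that era, but the specific fallback chain is just illustrative and assumes a GL context is already current.

    // Minimal sketch of an era-typical GL extension check (C++ / classic GL).
    // Assumes a current GL context; platform/window-system setup omitted.
    #include <GL/gl.h>
    #include <cstring>
    #include <cstdio>

    static bool hasExtension(const char* name) {
        // Pre-GL3 query: one big space-separated string of all extensions.
        const char* exts =
            reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
        // Naive substring search; real code also had to guard against
        // one extension name being a prefix of another.
        return exts && std::strstr(exts, name) != nullptr;
    }

    void chooseTexturePath() {
        if (hasExtension("GL_ARB_multitexture")) {
            std::puts("using ARB multitexture path");
        } else if (hasExtension("GL_EXT_texture_env_combine")) {
            std::puts("using EXT combine fallback");
        } else {
            std::puts("single-texture, multi-pass fallback");
        }
        // And a 'yes' here was no guarantee the path actually rendered
        // correctly on every driver, hence the per-vendor workarounds.
    }

Multiply that by every feature an engine cared about, plus vendor-specific variants of the same extension, and you get the mess described above.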

Developers looked at that, and no wonder they didn't want any part of it.

This all exploded beautifully a few years later when Compiz started gaining a foothold; it required this or that extension, and finally caused enough rage to get Khronos working on bringing the mess back under control.

By that time MS was already at DirectX 9; you could use XNA to target different architectures, and it brought networking and I/O libraries with it, making it a very convenient development environment.

*This is all a recollection from the late nineties and early 2000s, and it's by now a bit blurred; it's hard to fill in the details on the specific extensions. Nvidia was the one producing the most, but it's not like the blame is on them; Matrox's and ATI's wonky catch-up support was more damaging overall. MS didn't really need to do much to win hearts with DX.

How did Microsoft solve the "extensions problem"? Did they publish standardized APIs in time so vendors wouldn't come up with any extensions? Even then, how did MS prevent them from having the driver lie about the card's features to make it look better than it was?

MS had a rigorous certification and internal testing process, new D3D versions came out quickly to support new hardware features, and through the Xbox Microsoft had more real-world experience of what games actually need than the GPU vendors themselves, which probably helped rein in some of the more bizarre ideas of the GPU designers.

I don't know how the D3D design process worked in detail, but it is obvious that Microsoft had a 'guiding hand' (or perhaps rather an 'iron fist') to harmonize new hardware features across GPU vendors.

Over time there have been a handful of 'sanctioned' extensions that had to be activated with magic FOURCC codes, but those were soon integrated into the core API (IIRC hardware instancing started like this).
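For what it's worth, the FOURCC probe looked roughly like this. This is a sketch from memory, and the 'INST' pseudo-format (the ATI hardware-instancing check on D3D9, as I recall) is exactly the kind of detail to double-check.

    // Hedged sketch: probing a 'sanctioned' feature via a magic FOURCC code
    // on D3D9. Asking whether a fake surface format is "supported" was the
    // back door for exposing a feature before it had a real cap bit or
    // core API entry point.
    #include <d3d9.h>

    bool SupportsInstancingHack(IDirect3D9* d3d) {
        HRESULT hr = d3d->CheckDeviceFormat(
            D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
            0, D3DRTYPE_SURFACE,
            static_cast<D3DFORMAT>(MAKEFOURCC('I', 'N', 'S', 'T')));
        return SUCCEEDED(hr);
    }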

Also, one decision that was controversial at the time but worked really well in hindsight was that D3D was an entirely new API in each new major version, which made it possible to leave historical baggage behind quickly and keep the API clean (while still supporting those 'frozen' old D3D versions in new Windows versions).
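To illustrate the 'entirely new API' point: each major D3D version is a completely separate set of entry points and interfaces, and the old and new ones can coexist in one process. A rough sketch (error handling omitted; link against d3d9.lib and d3d11.lib):

    #include <d3d9.h>
    #include <d3d11.h>

    void CreateOldAndNew() {
        // D3D9: a 'frozen' API, still shipped with new Windows versions.
        IDirect3D9* d3d9 = Direct3DCreate9(D3D_SDK_VERSION);

        // D3D11: a clean-slate interface, not an extension of the D3D9 one.
        ID3D11Device* dev11 = nullptr;
        ID3D11DeviceContext* ctx11 = nullptr;
        D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                          nullptr, 0, D3D11_SDK_VERSION,
                          &dev11, nullptr, &ctx11);

        if (ctx11) ctx11->Release();
        if (dev11) dev11->Release();
        if (d3d9)  d3d9->Release();
    }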

> Also, one decision that was controversial at the time but worked really well in hindsight was that D3D was an entirely new API in each new major version, which made it possible to leave historical baggage behind quickly and keep the API clean (while still supporting those 'frozen' old D3D versions in new Windows versions).

This is interesting. I have always wondered if that is a viable approach to API evolution, so it is good to know that it worked for MS. We will probably add a (possibly public) REST API to a service at work in the near future, and versioning / evolution is certainly going to be an issue there. Thanks!

I've never had to migrate between DirectX versions, but I don't imagine it's the easiest thing in the world due to this approach. Somewhat related: I saw a library that translates DirectX 9 calls to DirectX 12, because apparently so much of the world is still using DirectX 9.

That's what the drivers for the new Intel GPUs have to do, since the GPUs were only designed for DX12 and need earlier versions to be emulated.

Ah yeah, and it looks like this is what they're using: https://github.com/microsoft/D3D9On12
It looks like DirectX 11 to DirectX 12 translation exists as well: https://github.com/microsoft/D3D11On12