Is Apple not counted because they're not favoring Nvidia at the moment?

Apple has basically bailed out of building modern developer workstations for the time being because of their rejection of Nvidia. It's so frustrating. I miss the perfect tool that my 2013 MacBook Pro was when I bought it. eGPUs are not an option, and neither side is supporting drivers anymore. Eventually something like CUDA might come to the platform and they'll get back in it, but IMO Nvidia holds so many of the cards right now it's crazy. The next couple of years are going to be an interesting shift after nearly a decade of amazing workstation standardization.

It depends on what you’re developing. I have zero need for anything except an on-chip GPU for my development, but that’s only because I develop for the web. If I need horsepower on the go I can VPN into my home workstation or server. If you’re doing ML or even video editing, a multi-GPU desktop will be so much faster than any portable laptop.

However, I agree that eGPUs aren’t a solution. I had a bad experience with Nvidia on my 2011 MBP, but my 2017 MBP with an AMD chip is as good as the 2011 was in its day.

I lucked out a few years ago and ended up at a startup that was focused on scaling GPU computation for a web service. I'd spent 15 years as a web/mobile developer before that: first Perl/PHP for about 5 years, then the last 10 mostly Python/C/C++/Obj-C/C#/Go.

What I can say about that, and about what's currently happening to computation, is that we will all be writing code for vector processors soon. It's a sea change, and the throughput advantage is an edge for anyone who can use it now and in the near future. That's what this article is about. Google is the only other company making vector processors for compute at scale, and they have the BEST tools for people making use of them in their cloud service, even though generally they are provisioning Nvidia GPUs.
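For anyone who hasn't touched it, here's a minimal CUDA sketch of what I mean by "code for vector processors" -- just my own toy SAXPY example, not anything from the article or the startup work above. You write the loop body once and launch it across a million threads instead of iterating on the CPU:

    // Toy sketch: one scalar loop body, run in parallel across the whole array.
    #include <cstdio>
    #include <cuda_runtime.h>

    // Each thread handles one element; the hardware runs them in wide batches.
    __global__ void saxpy(int n, float a, const float *x, float *y) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) y[i] = a * x[i] + y[i];
    }

    int main() {
        const int n = 1 << 20;
        float *x, *y;
        // Unified memory keeps the example short; real code often manages copies explicitly.
        cudaMallocManaged(&x, n * sizeof(float));
        cudaMallocManaged(&y, n * sizeof(float));
        for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

        // 256 threads per block, enough blocks to cover all n elements.
        saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, x, y);
        cudaDeviceSynchronize();

        printf("y[0] = %f\n", y[0]);  // expect 5.0
        cudaFree(x);
        cudaFree(y);
        return 0;
    }

The shape is the point: a data-parallel kernel plus an explicit launch configuration, which is roughly what all of these vector-processor toolchains look like from the programmer's side.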

AWS doesn't ship computers and is lagging behind in its compute services. Nvidia with CUDA changes the landscape in a crazy way. You might not care about it now, but having an understanding of it is, IMO, critical for anyone who plans to be working on computers in the next 2-10 years, unless you have a really untouchable position. Even if you think your position is untouchable, you might be in for a shock when someone blows the doors off your business logic with CUDA and you can't catch up.

Regardless of GPU choice, "write once" (or close to it) cross-platform compute shaders[0][1][2] are coming in 2020, and there's no way anyone is going to bump CUDA out of being at the front of that.

[0] https://en.wikipedia.org/wiki/WebGPU

[1] https://www.khronos.org/vulkan/

[2] https://github.com/9ballsyndrome/WebGL_Compute_shader