This is easily among the highest-quality articles/comments I've read in the past weeks, perhaps months (on LLMs/AI, since that's what I'm particularly interested in). And this was for internal consumption before it was made public. It reinforces my recent impression that so much of what's being made for public consumption now is shallow and that it's hard to find the good stuff. And sadly, increasingly so even on HN. As I write this, I acknowledge I discovered this on HN :) Wish we had ways to incentivize the public sharing of such high-quality content that don't die at the altar of micro-rewards.

I've been saying the same things for weeks, right here and in the usual places. Basically: OpenAI will not be able to keep commercialising GPT-3.5; they will have to move to GPT-4 because the open-source alternatives will catch up. Their island of exclusivity is shrinking fast. In a few months nobody will want to pay for GPT-4 either when they can have private, cheap equivalents. So GPT-5 it is for OpenAI.

But the bulk of tasks can probably be solved at the GPT-3.5 level, and another, more difficult chunk with GPT-4. I'm wondering how many requests will be so complex as to require GPT-5. Probably less than 1%.

There's a significant distinction between web search and generative AI. You can't download "a Google", but you can download "a LLaMA". This marks the end of the centralisation era and an increase in user freedom. Chatting and generating images without being tracked is now possible, while searching, browsing the web, or torrenting are still tracked.

And where are these open-source models where I can go to a URL and do all the things I can do in ChatGPT or through OpenAI API keys? I googled a couple of weeks ago to find hosted versions of these open-source models to try, and every one was either down or woefully poor.

OpenAI and MS are going to win because they have a ready-to-go package that is available and working well - they have set the benchmark. I'm not seeing any evidence of this in the OSS community thus far.

Until I can spin up a Docker image capable of the same as OpenAI on Hetzner for 30 bucks a month, it's not in the same league.

>Until I can spin up a Docker image capable of the same as OpenAI on Hetzner for 30 bucks a month

I do exactly this with https://github.com/nsarrazin/serge
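
Roughly, spinning it up looks like the sketch below. The image name, tag, port, and volume paths are from memory of the project's README and may have changed, so check the repo for the exact current command; the Docker flags themselves are standard.

  # Run Serge (a llama.cpp-based chat UI) as a single container on a small Hetzner box.
  # Image name, port 8008, and volume paths are assumptions -- verify against the README.
  docker run -d \
    --name serge \
    -v weights:/usr/src/app/weights \
    -v datadb:/data/db \
    -p 8008:8008 \
    ghcr.io/nsarrazin/serge:latest

  # Then open http://<server-ip>:8008, download a model from the UI, and start chatting.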

Hetzner will install any hardware you send them for $100. So you can send them a $200 Tesla P40 (24 GB) and run 33B-parameter models on the GPU at ChatGPT speeds without increasing your monthly cost.