GPT-3 is open source. GPT-4 isn't.
1.7k forks…
https://github.com/openai/gpt-3
Apart from that, several pre-trained corpora had been around for a while.
Can you please explain why you think GPT-3 is closed? (I don't understand it.) Yes, they won't share the trained model, but they're sharing the research here[0][1] so you can reproduce it easily, aren't they? As I understand it, that's fair: training the model is a separate thing from doing (and sharing) the research, it's very costly, and it would not happen if they were forced to open that too. I also don't understand why they should be.
This isn't strictly true. This is comparing against "GPT-3 as a few-shot learner"[1] as opposed to the fine-tuned models that everyone else uses.
Few-shot GPT-3 outperforms a BERT-based baseline.
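To make the distinction concrete, here's a minimal sketch of what "few-shot" means in the paper: the task is specified entirely in the prompt through a handful of labeled demonstrations, with no gradient updates, as opposed to fine-tuning a BERT-style model on a labeled training set. The sentiment task and examples below are made up purely for illustration:

    # Hypothetical sentiment task, only to illustrate the few-shot prompt format.
    few_shot_examples = [
        ("The movie was a delight from start to finish.", "positive"),
        ("I walked out halfway through.", "negative"),
        ("An instant classic.", "positive"),
    ]

    def build_prompt(query: str) -> str:
        """Concatenate K labeled demonstrations, then the unlabeled query."""
        lines = ["Classify the sentiment of each review."]
        for text, label in few_shot_examples:
            lines.append(f"Review: {text}\nSentiment: {label}")
        lines.append(f"Review: {query}\nSentiment:")  # the model completes the label
        return "\n\n".join(lines)

    print(build_prompt("The plot made no sense at all."))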
> For GPT-2, their repository is archived
Fun fact: the GPT-3 repo (https://github.com/openai/gpt-3 ) is archived too, but it does use the GitHub archive feature, unlike the GPT-2 repo.
Since the demos on this page use zero-shot learning and the model used has a 2020-05-03 timestamp, that implies this API is using some form of GPT-3: https://news.ycombinator.com/item?id=23345379 (EDIT: the accompanying blog post confirms this: https://openai.com/blog/openai-api/ )
Recently, OpenAI set the GPT-3 GitHub repo to read-only: https://github.com/openai/gpt-3
Taken together, this seems to imply that GPT-3 was more intended for a SaaS such as this, and it's less likely that it will be open-sourced like GPT-2 was.
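For reference, here's a minimal sketch of what a call to that API looks like, assuming the openai Python client and a key from the invite-only beta; the engine name and parameters are illustrative rather than taken from the thread, so check the docs linked from the blog post:

    import os
    import openai

    # Assumes an API key from the (invite-only) beta.
    openai.api_key = os.environ["OPENAI_API_KEY"]

    # Zero-shot style request: just a natural-language prompt, no demonstrations.
    response = openai.Completion.create(
        engine="davinci",   # illustrative engine name
        prompt="Translate English to French:\ncheese =>",
        max_tokens=10,
        temperature=0.0,
    )
    print(response.choices[0].text)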
Is there a fast.ai-like library that allows a novice to try GPT-3?
https://github.com/openai/gpt-3 only contains the dataset.
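Since the repo ships only data and no weights, you can't load GPT-3 locally. If you just want a novice-friendly way to play with the same family of models, one option (my suggestion, not anything OpenAI provides) is the released GPT-2 through the Hugging Face transformers pipeline:

    from transformers import pipeline

    # Loads the publicly released 124M-parameter GPT-2, not GPT-3.
    generator = pipeline("text-generation", model="gpt2")
    out = generator("GPT-3 is", max_length=30, num_return_sequences=1)
    print(out[0]["generated_text"])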