>> In order to maximize the reproducibility of our results, we provide code in JAX (Bradbury et al., 2018) for our proposed X-UNet neural architecture from Section 2.3.
Nice.
OpenAI shitting their pants even more.
Oh, OpenAI does more or less release that much. People don't have issues implementing the models from their papers.
What they don't do is release the actual models and datasets, and it's very expensive to retrain those.
They released CLIP (both model and code[1]), which is widely used in DALL-E alternatives. Stable Diffusion, for example, uses it.
They also released the Whisper model and code.[2]