What does HackerNews think of jax?
Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more
Language: Python
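To make that tagline concrete, here is a minimal sketch (illustrative function names, not taken from the repo or the comments below) of how those transformations compose in JAX:

```python
import jax
import jax.numpy as jnp

# NumPy-style code written against jax.numpy (toy example).
def predict(w, x):
    return jnp.tanh(jnp.dot(x, w))

def loss(w, x, y):
    return jnp.mean((predict(w, x) - y) ** 2)

# Compose transformations: differentiate w.r.t. w, vectorize over a
# batch of (x, y) pairs, and JIT-compile the result via XLA.
per_example_grads = jax.jit(jax.vmap(jax.grad(loss), in_axes=(None, 0, 0)))

w = jnp.ones(3)
x = jnp.array([[0.1, 0.2, 0.3], [0.4, 0.5, 0.6]])
y = jnp.array([0.0, 1.0])
print(per_example_grads(w, x, y))  # shape (2, 3): one gradient per example
```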
> Python/Numpy: 5360ms (Xeon(R) CPU E5-2698 v4 @ 2.20GHz)
> CuPy: 10.6ms (A100)
> MatX: 2.54ms (A100)
Are they even comparing apples to apples to claim that they see these improvements over NumPy?
> While the code complexity and length are roughly the same, the MatX version shows a 2100x over the Numpy version, and over 4x faster than the CuPy version on the same GPU.
NumPy doesn't use the GPU by default unless you use something like JAX [1] to compile NumPy code to run on GPUs. A more honest comparison would pit MatX against NumPy on the same CPU and, for the GPU numbers, focus on comparing against CuPy.

[1] https://github.com/google/jax
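For readers unfamiliar with how that works, a minimal sketch (a made-up workload, not the benchmark from the article): the same array expression you would write in NumPy, run through jax.numpy and jax.jit so XLA can execute it on a GPU/TPU when one is visible:

```python
import jax
import jax.numpy as jnp

# Same code you would write with plain NumPy, but against jax.numpy.
@jax.jit
def power_spectrum(x):
    return jnp.abs(jnp.fft.fft(x)) ** 2

x = jnp.linspace(0.0, 1.0, 1 << 20)
y = power_spectrum(x)

# JAX dispatches asynchronously; block before timing anything.
y.block_until_ready()
print(jax.devices())  # e.g. [CudaDevice(id=0)] when a GPU is visible, else CPU
```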
I've been noticing more and more ML projects, such as Hugging Face, adding support for JAX, which gives Python automatic differentiation, a key feature of Julia.
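For anyone who hasn't tried it, a minimal sketch of what that autodiff looks like (toy function, chosen only for illustration):

```python
import jax
import jax.numpy as jnp

def f(x):
    return jnp.sin(x) * x ** 2

df = jax.grad(f)    # derivative function, produced by tracing f
d2f = jax.grad(df)  # higher-order derivatives by composing grad

print(f(1.5), df(1.5), d2f(1.5))
```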
> I love their JAX
For those out of the loop: https://github.com/google/jax
JAX should be mentioned [1]. It's also from Google and is getting popular these days. Not PyTorch-popular, but the progress and investment seem promising.
You should give JAX a go.