What does HackerNews think of autograd?
Efficiently computes derivatives of numpy code.
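A minimal sketch of what that one-line description means in practice, close to the example autograd's own README uses:

```python
# autograd differentiates ordinary numpy code by tracing it, so the
# gradient of a hand-written tanh comes for free.
import autograd.numpy as np   # thin wrapper around numpy
from autograd import grad

def tanh(x):
    y = np.exp(-2.0 * x)
    return (1.0 - y) / (1.0 + y)

grad_tanh = grad(tanh)        # a new function computing d tanh / dx
print(grad_tanh(1.0))         # ~0.41997, i.e. 1 - tanh(1)**2
```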
Numba compiles functions down to machine code or CUDA kernels; that's it. XLA is "higher level" than what Numba produces.
You may be able to get the equivalent of jax via numba+numpy+autograd[1], but I haven't tried it before.
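For context, a minimal sketch of the Numba half of that comparison, using the usual @njit decorator; this is only illustrative, not the numba+numpy+autograd setup the comment speculates about:

```python
# Numba JIT-compiles this plain numpy/Python function to machine code the
# first time it is called with concrete argument types.
import numpy as np
from numba import njit

@njit
def sum_of_squares(x):
    total = 0.0
    for v in x:          # explicit loop is fine; Numba compiles it away
        total += v * v
    return total

print(sum_of_squares(np.arange(5.0)))  # 30.0
```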
IMHO, jax is best thought of as a numerical computation library that happens to include autograd, vmapping, and pmapping, and provides a high-level interface to XLA.
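A hedged sketch of how those pieces fit together; the function and variable names below are illustrative, not from the comment:

```python
import jax
import jax.numpy as jnp

def loss(w, x):
    return jnp.sum((x @ w) ** 2)

grad_loss = jax.grad(loss)                        # autograd: d loss / d w
batched = jax.vmap(grad_loss, in_axes=(None, 0))  # vmap over rows of x, no Python loop
fast = jax.jit(batched)                           # trace and compile via XLA

w = jnp.ones(3)
xs = jnp.arange(12.0).reshape(4, 3)
print(fast(w, xs).shape)                          # (4, 3): one gradient per row
```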
I have built a numerical optimisation library with it, and although a few things became verbose, it was a rather pleasant experience: the natural vmapping made everything a breeze, and I didn't have to write the gradients for my test functions by hand, except for special cases involving exponents and logs that needed a bit of delicate care.
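A hedged sketch of that workflow: the comment doesn't name its test functions, so the classic Rosenbrock function stands in here, and the step count and learning rate are arbitrary choices.

```python
import jax
import jax.numpy as jnp

# Classic optimisation test function; its minimum is at (1, 1).
def rosenbrock(p):
    x, y = p
    return (1.0 - x) ** 2 + 100.0 * (y - x ** 2) ** 2

grad_f = jax.grad(rosenbrock)            # gradient for free, no hand derivation

def descend(p0, steps=20_000, lr=1e-3):
    def step(p, _):
        return p - lr * grad_f(p), None
    p, _ = jax.lax.scan(step, p0, None, length=steps)
    return p

# vmap runs one descent per starting point without writing a batching loop.
starts = jnp.array([[-1.0, 1.0], [0.0, 0.0], [0.5, 2.0]])
finals = jax.jit(jax.vmap(descend))(starts)
print(finals)                            # each row should drift toward (1, 1)
print(jax.vmap(rosenbrock)(finals))      # losses well below the starting values
```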
I assume you mean autograd?