What does HackerNews think of autograd?

Efficiently computes derivatives of numpy code.

Language: Python

They solve different problems.

Numba compiles functions down to machine code or CUDA kernels; that's it. XLA is "higher level" than what Numba produces.

You may be able to get the equivalent of JAX via Numba + NumPy + autograd[1], but I haven't tried it.
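
A minimal sketch of just the autograd piece of that combination (the Numba half is the untried part), using only autograd's documented `grad` and its wrapped NumPy module; the example function is arbitrary:

```python
# autograd differentiates ordinary NumPy-style code, provided you use its
# wrapped numpy module so operations can be traced.
import autograd.numpy as np   # thinly wrapped NumPy
from autograd import grad

def tanh_sum(x):
    return np.sum(np.tanh(x))            # plain NumPy-style code

d_tanh_sum = grad(tanh_sum)              # gradient w.r.t. the first argument
print(d_tanh_sum(np.array([0.0, 0.5, 1.0])))   # elementwise 1 - tanh(x)**2
```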

IMHO, JAX is best thought of as a numerical computation library that happens to include autograd-style differentiation, vmapping, and pmapping, and that provides a high-level interface to XLA.

I have built a numerical optimisation library with it, and although a few things became verbose, it was a rather pleasant experience. The natural vmapping made everything a breeze, and I didn't have to write the gradients for my testing functions, except for special cases involving exponents and logs that needed a bit of delicate care.

[1] https://github.com/HIPS/autograd
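
To make that workflow concrete, here is a hedged sketch using only JAX's documented grad and vmap; the Rosenbrock test function is an illustrative stand-in, not anything from the library described above:

```python
import jax
import jax.numpy as jnp

def rosenbrock(x):
    # Classic 2-D test function with its minimum at (1, 1).
    return (1.0 - x[0]) ** 2 + 100.0 * (x[1] - x[0] ** 2) ** 2

grad_rosenbrock = jax.grad(rosenbrock)    # no hand-written gradient needed
batched_grad = jax.vmap(grad_rosenbrock)  # "natural vmapping" over a batch of points

points = jnp.array([[0.0, 0.0], [1.0, 1.0], [-1.0, 2.0]])
print(batched_grad(points))               # one gradient per row
```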

I stumbled on this excellent article from 2015 that describes a really cool mechanism for tracing function execution in Python. This technique is used in the [autograd library](https://github.com/HIPS/autograd), which is a precursor to [JAX](https://github.com/google/jax), a competitor to TensorFlow and PyTorch that's starting to gain traction.
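
For a flavour of that mechanism, here is a toy sketch of the operator-overloading idea (not autograd's actual implementation): values are wrapped in a box whose overloaded operators append each primitive operation to a tape as the function runs.

```python
class Box:
    """Wraps a value and records every add/mul applied to it."""
    def __init__(self, value, tape):
        self.value, self.tape = value, tape

    def __add__(self, other):
        other = other if isinstance(other, Box) else Box(other, self.tape)
        self.tape.append(("add", self.value, other.value))
        return Box(self.value + other.value, self.tape)

    def __mul__(self, other):
        other = other if isinstance(other, Box) else Box(other, self.tape)
        self.tape.append(("mul", self.value, other.value))
        return Box(self.value * other.value, self.tape)

def trace(fn, x):
    tape = []
    result = fn(Box(x, tape))   # running fn records the operations it performs
    return result.value, tape

value, tape = trace(lambda x: x * x + 3.0, 2.0)
print(value)  # 7.0
print(tape)   # [('mul', 2.0, 2.0), ('add', 4.0, 3.0)]
```
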
> fun fact, the Jax folks at Google Brain did have a Python source code transform AD at one point but it was scrapped essentially because of these difficulties

I assume you mean autograd?

https://github.com/HIPS/autograd

Defining a loss function comes to mind. Check out JAX[1] or Autograd[2]. As this is essentially a different programming paradigm there are a ton of opportunities, but it is fresh enough that there is little to no history of software engineering for differentiable programs. Design patterns, rules of thumb, and standard practices have yet to be defined. (A minimal loss-gradient sketch follows the links below.)

[1] https://github.com/google/jax

[2] https://github.com/HIPS/autograd
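
A minimal sketch of "defining a loss function" and differentiating it, here with autograd; the toy linear-regression data and the hyperparameters are made up for illustration:

```python
import autograd.numpy as np
from autograd import grad

xs = np.array([0.0, 1.0, 2.0, 3.0])
ys = np.array([1.0, 3.0, 5.0, 7.0])          # exactly y = 2x + 1

def loss(params):
    w, b = params[0], params[1]
    preds = w * xs + b                        # linear model
    return np.mean((preds - ys) ** 2)         # mean squared error

loss_grad = grad(loss)                        # d(loss)/d(params), for free

params = np.array([0.0, 0.0])
for _ in range(500):                          # plain gradient descent
    params = params - 0.05 * loss_grad(params)
print(params)                                 # approaches roughly [2.0, 1.0]
```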

I really like JAX as well: https://github.com/google/jax. It's younger than PyTorch and TF, but feels cleaner and more expressive. It has a very nice autodiff implementation (based on https://github.com/HIPS/autograd) and performance is comparable to TF in my experience.
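
As a hedged illustration of those two points, composing JAX's documented grad and jit (the function here is arbitrary):

```python
import jax
import jax.numpy as jnp

def f(x):
    return jnp.sin(x) * x ** 2

df  = jax.grad(f)             # first derivative
d2f = jax.grad(jax.grad(f))   # second derivative, just by composing grad
fast_df = jax.jit(df)         # XLA-compiled version of the derivative

x = 1.5
print(df(x), d2f(x), fast_df(x))
```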