What does HackerNews think of PySR?

High-Performance Symbolic Regression in Python and Julia

Language: Python

I encourage everyone to read this paper. It's well written and easy to follow. For the uninitiated, SR is the problem of finding a mathematical (symbolic) expression that most accurately describes a dataset of input-output examples (regression). The most naive implementation of SR is basically a breadth-first search over program trees, starting from the simplest: x -> sin(x) -> cos(x) ... sin(cos(tan(x))), until timeout. However, we can prune equivalent expressions, and the problem is in general embarrassingly parallel, which gives some hope that we can solve it pretty fast in practice (check out PySR[1] for a modern implementation). I find SR fascinating because it can be used for model distillation: learn a DNN approximation and then "distill" it into a symbolic program.
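
For anyone who wants to see the naive search concretely, here is a minimal toy sketch (my own code, not how PySR works internally; PySR uses an evolutionary search with expression simplification and parallelism):

    import math
    from collections import deque

    # Toy grammar: a single variable x wrapped in unary operators.
    UNARY_OPS = {"sin": math.sin, "cos": math.cos, "tan": math.tan}

    def naive_sr(xs, ys, max_depth=3, tol=1e-6):
        """Breadth-first search over nested unary expressions of x."""
        queue = deque([("x", lambda x: x)])  # (expression string, callable)
        while queue:
            desc, f = queue.popleft()
            # Does this candidate fit every input-output example?
            if all(abs(f(x) - y) < tol for x, y in zip(xs, ys)):
                return desc
            # Otherwise expand it by wrapping it in each unary operator.
            if desc.count("(") < max_depth:
                for name, op in UNARY_OPS.items():
                    queue.append((f"{name}({desc})",
                                  lambda x, op=op, f=f: op(f(x))))
        return None  # search exhausted / depth limit reached

    # Recovers "sin(cos(x))" from samples of that function.
    xs = [0.1 * i for i in range(10)]
    ys = [math.sin(math.cos(x)) for x in xs]
    print(naive_sr(xs, ys))

A real implementation would also canonicalize candidates (e.g. simplify them with a CAS) so that algebraically equivalent expressions are only explored once.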

Note that the paper talks about the decision version of the SR problem, i.e. can we discover the globally optimal expression? I think this proof is important for the SR community but not particularly surprising (to me). However, I'm excited by the potential future work building on this paper! A couple of discussion points:

* First, SR is technically a bottom-up program synthesis problem where the DSL (math) has an equivalence operator. Can we use this proof to impose stronger guarantees on the "hyperparameters" for bottom-up synthesis? Conversely, does the theoretical foundation of the inductive synthesis literature [2] help us define tighter bounds?

(EDIT: I was thinking a bit more about this, and [2] is a bad reference for it... Jha et al. give proofs for synthesis with CEGIS, where the synthesizer queries an SMT solver for counterexamples until there are none... kinda like a GAN. Apologies.)

* Second, while SR itself is NP-hard, can we say anything about approximate algorithms (e.g. distilling a deep neural network to find a solution [3])? Specifically, what does this proof tell us about the PAC learnability of SR?
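
To make the distillation idea in [3] concrete, the pipeline is roughly: fit a black-box network to the data, then run SR on the network's predictions. A rough sketch of that, assuming scikit-learn for the network and PySR for the SR step (parameter names taken from my reading of PySR's README, so double-check against the docs):

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from pysr import PySRRegressor

    # Ground-truth process we pretend not to know: y = cos(x0) * x1 + noise.
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(500, 2))
    y = np.cos(X[:, 0]) * X[:, 1] + rng.normal(0, 0.05, size=500)

    # Step 1: fit a black-box neural network approximation of the data.
    net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000).fit(X, y)

    # Step 2: "distill" the network by running symbolic regression on its
    # predictions (a denser or cleaner grid of inputs also works here).
    model = PySRRegressor(
        niterations=40,
        binary_operators=["+", "-", "*", "/"],
        unary_operators=["cos", "sin"],
    )
    model.fit(X, net.predict(X))

    print(model.sympy())  # best discovered expression as a SymPy object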

Anyhow, pretty cool seeing such work get more attention!

[1] https://github.com/MilesCranmer/PySR

[2] https://susmitjha.github.io/papers/togis17.pdf

[3] https://astroautomata.com/paper/symbolic-neural-nets/

I found it curious that one of the implementations of symbolic regression (the "machine scientist" referenced in the article) is a Python wrapper around Julia code: https://github.com/MilesCranmer/PySR

I don't think I've seen a Python wrapper around Julia code before.
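
My guess at how it works: an in-process bridge that embeds a Julia runtime inside the Python process (juliacall and PyJulia are the usual packages for this; I haven't checked which one PySR uses internally). A minimal sketch of that kind of bridge, assuming juliacall:

    # pip install juliacall -- the first import boots (and, if needed,
    # downloads) a Julia runtime inside the current Python process.
    from juliacall import Main as jl

    # Evaluate Julia code and get the result back as a Python-usable object.
    print(jl.seval("sum(abs2, [1.0, 2.0, 3.0])"))  # 14.0

    # Define a function on the Julia side and call it from Python.
    jl.seval("f(x) = sin(x) * cos(x)")
    print(jl.f(0.5))

A wrapper like PySR then layers a scikit-learn-style Python API on top of calls like these into its Julia backend (SymbolicRegression.jl).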