What does HackerNews think of PySR?
High-Performance Symbolic Regression in Python and Julia
Note that the paper addresses the decision version of the SR problem, i.e., can we discover the globally optimal expression? I think this proof is important for the SR community, though not particularly surprising (to me). However, I'm excited by the potential future work building on this paper! A couple of discussion points:
* First, SR is technically a bottom-up program synthesis problem where the DSL (math) has an equivalence operator. Can we use this proof to impose stronger guarantees on the "hyperparameters" of bottom-up synthesis? Conversely, does the theoretical foundation of the inductive synthesis literature [2] help us derive tighter bounds?
(EDIT: I was thinking a bit more about this and [2] is a bad reference for this... Jha et al. give proofs for synthesis with CEGIS, where the synthesizer queries an SMT solver for counterexamples until there are none... kinda like a GAN. Apologies.)
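To make the bottom-up framing concrete, here's a minimal sketch of enumerative bottom-up synthesis over a toy math DSL, pruning by observational equivalence (two expressions that agree on all sample inputs are treated as the same program). This is purely illustrative; the DSL, function names, and sample points are my own, not PySR's.

```python
import itertools

# Sample inputs acting as the (approximate) equivalence oracle.
INPUTS = [-2.0, -1.0, 0.5, 3.0]

def signature(f):
    """Evaluate an expression on the sample inputs to get its equivalence key."""
    return tuple(round(f(x), 9) for x in INPUTS)

def synthesize(target, max_size=5):
    """Enumerate expressions by size (leaf count); return a string matching target."""
    # size -> list of (string form, evaluator)
    bank = {1: [("x", lambda x: x), ("1", lambda x: 1.0)]}
    seen = {signature(f) for _, f in bank[1]}
    goal = signature(target)
    for s, f in bank[1]:
        if signature(f) == goal:
            return s
    ops = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}
    for size in range(2, max_size + 1):
        bank[size] = []
        for lsize in range(1, size):
            rsize = size - lsize
            for (ls, lf), (rs, rf) in itertools.product(bank[lsize], bank[rsize]):
                for name, op in ops.items():
                    # Bind lf/rf/op as defaults to avoid late-binding closure bugs.
                    def g(x, lf=lf, rf=rf, op=op):
                        return op(lf(x), rf(x))
                    sig = signature(g)
                    if sig in seen:  # observationally equivalent to a smaller expr
                        continue
                    seen.add(sig)
                    expr = f"({ls} {name} {rs})"
                    bank[size].append((expr, g))
                    if sig == goal:
                        return expr
    return None
```

The equivalence pruning is what keeps the bank from exploding combinatorially, which is exactly where tighter theoretical bounds on the "hyperparameters" (expression size, number of sample points) would bite.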
* Second, while SR itself is NP-hard, can we say anything about the approximate algorithms (e.g., distilling a deep neural network to find a solution [3])? Specifically, what does the proof tell us about the PAC learnability of SR?
Anyhow, pretty cool seeing such work get more attention!
[1] https://github.com/MilesCranmer/PySR
I don't think I've seen a Python wrapper on Julia code before.