Don't underestimate the impact this has on getting funding, or even just tenure or other recognition, for working on numpy. I'm in industry these days, but coming from the academic side, it's _really_ hard to get recognized for building the underlying infrastructure that tons of people use. I've built and maintained libraries that are used in a ton of publications, but was always told my work was "utterly and completely useless". It was also always unpublishable, as methods are never publishable in my field. Numpy has (obviously) vastly more respect and impact than my work, but the general problem remains.

Articles like this are a _huge_ deal for that reason. It's an immense delayed recognition for over a decade of work from a lot of folks.

I recall a story where a friend was unable to publish a paper in which he wrote an alternative to a very commonly used commercial tool (that virtually everybody used) with roughly 10 times better performance. He open sourced it and all, it was extremely useful, but there was no new methodology, it was simply very well implemented.

At a talk of his, it led to a very heated discussion in which an older professor accused him of wasting government money on such nonsense.

Been there. A few years back I got a government scholarship for my PhD (which is still in progress, due to my follow-up work). I basically built the foundation upon which to establish a new field for my university, and for the region where I live. There are some professors who think that scholarship (and the little money it gave me) was wasted on me, because I chose to build all of that from the ground up instead of rushing through my PhD.

By the way, those of that opinion are all professors who wanted me in their labs, but I turned them down...

For every story like this, I believe there are many more in which the student simply writes their own implementation due to not invented here syndrome or engineering as a form of procrastination.

If you talked to me about my PhD for a few minutes you would surely put me into your "had to reinvent the wheel for no reason" category.

Indeed, I wrote an analysis framework for my data (from a gaseous detector used for axion searches) [0] instead of using the existing framework used by my predecessor. However, things are always more complicated than they seem. Many of those students nobody talks about, who rewrite stuff, probably have their reasons!

In my case the existing framework [1] was a monster that had been bent to work with our kind of data in the first place. My detector had several additional features, which fit _even less_ into the existing framework. Making it work well would have been a hack and still a significant amount of work.

To be fair, when I started this I expected it to be less work than it ended up being. But that's the story of software development.

The advantages now are significant of course. I know the whole codebase. It does exactly what I want. I can extend it easily as I see fit.

That doesn't mean I didn't also partly procrastinate writing software. Far from it. Hell, there was no reason to write a freaking plotting library (a sort of port of ggplot2 for Nim) [2]. But again, this means my thesis will have plots created natively using a TikZ backend, while at the same time providing links to Vega-Lite plots for each and every plot in my thesis (which of course will include the data for each plot!).

Finally, the most important point: A university / professor who only pays me for 20h a week does not get to tell me how I do my PhD.

[0]: https://github.com/Vindaar/TimepixAnalysis
[1]: https://ilcsoft.desy.de/portal/software_packages/marlintpc/
[2]: https://github.com/Vindaar/ggplotnim