One thing I’d add to this conversation, though I’m sure it’s already been stated: as many have mentioned, there is a large subset of the user base that uses Python for applied purposes in unrelated fields and couldn’t care less about the more granular aspects of optimization. I work as a research assistant for international finance faculty, and compared to the average Hacker News reader I’m technologically illiterate, but compared to the average 60-80 y/o econ/finance faculty member, I’m practically a Turing Award winner.
Most of these applied fields use Python and R as little more than data-gathering tools and fancy calculators, something for which the added complexity of other languages just isn’t justified.
The absolute beauty of Python, for what I do, is that I can write code and hand it off to a first-year with a semester of coding experience. Even if they couldn’t write it themselves, they can still understand what it does after a bit of study. I can also hand it off to 75-year-old professors who still send fax memos to the Federal Reserve, and they’ll achieve a degree of comprehension.
For these reasons, Python, although not perfect, has been so incredibly useful.
When it comes to slightly less simple use cases involving parallelism and concurrency, Python and its imperative kin start falling well short of basic needs that are easily satisfied by FP languages like OCaml, Haskell, Racket, Common Lisp, Erlang, and Elixir, or by Rust/Go.
But even when the code is single-threaded and not hampered by GIL limitations, Python tends to be very slow, IMHO. Also, debugging dynamic, imperative, stateful Python gets extremely painful once a code base grows past roughly 10k LOC.
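To make the GIL point concrete, here is a minimal sketch (names like `busy` are my own, purely illustrative): on CPython, pure-Python CPU-bound work in two threads takes roughly as long as running it twice serially, because the GIL lets only one thread execute bytecode at a time.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def busy(n: int) -> int:
    # Pure-Python CPU-bound work: holds the GIL the whole time.
    total = 0
    for i in range(n):
        total += i
    return total

N = 2_000_000

start = time.perf_counter()
serial = [busy(N), busy(N)]
serial_time = time.perf_counter() - start

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=2) as ex:
    threaded = list(ex.map(busy, [N, N]))
threaded_time = time.perf_counter() - start

# Results are identical; on CPython, threaded_time is typically close to
# serial_time (no ~2x speedup). multiprocessing or a C extension that
# releases the GIL is the usual escape hatch.
assert serial == threaded
```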
A lot of these problem spaces can get away with single-threaded performance because they're generating a report or running an analysis once a day, or even less often. I work in a field where numerical correctness and readability are important for prototyping control algorithms (I work on advanced sensors), and Python satisfies those requirements for our analysis and prototyping work.
When we really want or need performance, we rewrite the slow part in C++ and use pybind to call into it. All the real implementations that run in a soft real-time system are done in C++ or C, depending on the ecosystem.
> debugging dynamic python and imperative stateful python after a certain code base size >10k LOC gets extremely painful
At any meaningful scale, you are better served by the basic FP hygiene found in Haskell, Elixir, CL/Racket, or Rust/Go.
I don’t get it. Go is as imperative as a language can be.