Lisp - oh what it could have been. It had such potential, but then it got broken. I find it fascinating that those who are dedicated to the proselytisation of Lisp don't see the brokenness of the language. For them, all of the broken things are the features of the language.

Scheme was one attempt to fix some of those flaws.

More recently, we have seen the development of Kernel to fix other flaws.

So many second class citizens, so many exceptions to the rule.

I am going through the source code for Maxima CAS (written in Lisp) and in so many ways, it's a mess. I am not at all disparaging those who have been involved in writing the Maxima CAS system and its source. They have done an incredible job and what they have achieved is remarkable.

However, like any software system of any complexity in any language, it has lots of areas that are difficult to maintain, let alone advance. In that regard, Lisp has not been as advantageous a language as it could have been.

Lisp (as in Common Lisp and its add-ons) is not a simple language and it is not a consistent language (see the CLHS, the Common Lisp HyperSpec).

When I first came across it in the late 1970's, I thought "wow". But its flaws quickly came to the fore.

So, there is no way that it would ever be God's own programming language. Especially since God doesn't need to program; that's just for us very limited mortals.

> then it got broken

Got broken? I think of it more as having failed to obtain/coordinate the resources needed to progress.

What it means to have a healthy language ecosystem has advanced. 1970's Prolog implementations couldn't standardize on a way to read files. 1980's Common Lisp did, but had no community repo. 1990's Perl did, but few languages then had a good test suite, and they were commercial and $$$$. Later languages did, but…

And it's not easy for a language to move on. Prolog was still struggling to create a standard library decades later. Common Lisp and Python had decade-long struggles to create community repos. A story goes that node.js wasn't planning on a community repo, until someone said "don't be python".

The magnitude of software engineering resources has so massively ramped up that old-time progress looks like sleep or death. Every phone now has a UI layout constraint system. We knew it was the right thing, craved it, for years... while the occasional person had it as an intermittent hobby project. That was just the scale of things. Open source barely existed. "Will open source survive?" was a completely unresolved question. Commercial was a much bigger piece of a much smaller pie, but that wasn't sufficient to drive the ecosystem.

The Haskell implementation of Perl 6 failed because the community couldn't manage to fund the one critical person. It was circa 2005, and the social infrastructure needed to fund someone simply wasn't the practiced thing it is now.

And we're still bad at all this. The JavaScript community, for all its massive size, never managed to pick up the prototype-based programming skills of Self and Smalltalk. The... never mind.

It's the usual civilization bootstrap sad tale. Society, government, markets, and our profession, are miserably poor at allocating resources and coordinating effort. So societally-critical tech ends up on the multi-decade glacial-creep hobby-project-and-graduate-student installment plan. Add in pervasively dysfunctional incentives, and... it becomes amazing that we're making such wonderful progress... even if it is so very wretchedly slow and poor compared to what it might be.

So did CL get broken? Or mostly just got left behind? Or is that a kind of broken?

You raise interesting history and it's good to see the perspective you've given.

I don't know if Common Lisp got left behind or just took a completely different path. From my perspective, it got broken with its macro system decisions, its dynamic/static environment decisions and its namespace decisions. It created too many second class citizens within the language, which means you have to know far more than you should to understand any part of the programs you are looking at.

Every choice a language designer makes affects what the language will do in terms of programmer productivity, not only for the original developers of programs using that language, but also for all those who come later when maintaining or extending those programs.

I have come to the conclusion that a language can be a help when writing the original program and become a hindrance when you need to change that program for any reason. It is here that the detailed documentation covering all the design criteria and coding decisions, algorithm choices, etc., becomes more important than the language you may choose.

Both together will enable future generations to build upon what has been done.

All the points that you have highlighted above are important, but the underlying disincentive to provide full and adequately detailed documentation will work against community growth. No less today than in centuries past, knowledge gets hidden away: individuals are unwilling to pass on the critical pieces unless you are part of the pack, or do not think them important enough to write down because they are obviously obvious.

To understand a piece of Lisp code, one has to know what the special forms are and how they interact, what the macros being used are and what code they are generating, and what the various symbols might be hiding in terms of their SPECIALness. These things may help in writing the code, but they work against future programmers in modifying the code. Having had to maintain various code bases that I did not write, in quite a variety of different languages, I have found that "trickily" written code can become a nightmare when bringing about required changes. I have found that Lisp code writers seem to like writing "trickily" written code.
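To make the SPECIALness point concrete, here is a minimal Common Lisp sketch (the names `*base*` and `digits` are hypothetical): whether a variable is lexical or special completely changes what a call means, and nothing at the call site tells you which it is.

```lisp
;; Minimal sketch of action at a distance via a special variable.
;; DEFVAR proclaims *base* special, i.e. dynamically scoped.
(defvar *base* 10)

(defun digits (n)
  ;; Reads whichever binding of *base* is live when DIGITS is called.
  (format nil "~vR" *base* n))

(digits 255)          ; => "255"

(let ((*base* 16))    ; LET rebinds the dynamic binding, not a new lexical one
  (digits 255))       ; => "FF" -- same call, different result
```

A reader who does not know `*base*` is special would expect the `LET` to have no effect on `digits`; that hidden knowledge is exactly the maintenance burden described above.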

Now, that is only one person's perspective, and someone else may find something completely different. That is not a problem, as there are many tens of .... programmers in the world, each with a perspective on how to write good code.

> namespace

Nod. I fuzzily recall being told years ago of ITA Software struggling to even build their own CL code. Reader-defined-symbol load-order conflict hell, as I recall. And that was just a core engine, embedded in a sea of Java.

> second class citizens

I too wish something like Kernel[1] had been pursued. Kernel languages continue to be explored, so perhaps someday. Someday capped by AI/VR/whatever meaning "it might have been nice to have back then, but old-style languages just aren't how we do 'software' anymore".

> detailed documentation covering all the design criteria and coding decisions

As in manufacturing, inadequate docs can have both short- and long-term catastrophic and drag impacts... but our tooling is really bad, high-burden, so we have unhappy tradeoffs to make in practice.

Though, I just saw a pull request go by, adding a nice function to a popular public api. The review requested 'please add a sentence saying what it does.' :)

So, yeah. Capturing design motivation is a thing, and software doesn't seem a leader among industries there.

> enable future generations to build upon what has been done.

Early python had a largely-unused abstraction available, of objects carrying C pointers, so C programs/libraries could be pulled together at runtime. In an alternate timeline, with only slightly different choices, instead of monolithic C libraries, there might have been a rich ecology. :/ The failure to widely adopt multiple dispatch seems another one of these "and thus we doomed those who followed us to pain and toil, and society to the loss of all they might have contributed had they not been thus crippled".

> To understand a piece of Lisp code [...struggle]

This one I don't quite buy. Java's "better for industry to shackle developers to keep them hot swappable", yes, regrettably. But an inherent struggle to read? That's always seemed to me more an instance of the IDE/tooling-vs-language-mismatch argument. "Your community uses too many little files (because it's awkward in my favorite editor)." "Your language shouldn't have permitted unicode for identifiers (because I don't know how to type it, and my email program doesn't like it)." CL in vi, yuck. CL in Lisp Machine emacs... was like vscode or eclipse, for in many ways a nicer language, that ran everything down to metal. Though one can perhaps push this argument too far, as with smalltalk image-based "we don't need no source files" culture. Or it becomes a "with a sufficiently smart AI-complete refactoring IDE, even this code base becomes maintainable".

But "trickily" written code, yes. Or more generally, just crufty. Perhaps that's another of those historical shifts. More elbow room now to prioritize maintenance: performance less of a dominating concern; more development not having the flavor of small-team hackathon/death-march/spike-into-production. And despite the "more eyeballs" open-source argument perhaps being overstated, I'd guess the ratio of readers to writers has increased by an order of magnitude or two or more, at least for popular open source. There are just so very many more programmers. The idea that 'programming languages are for communicating among humans as much as with computers' came from the lisp community. But there's also "enough rope to hang yourself; enough power to shoot yourself in the foot; some people just shouldn't be allowed firearms (or pottery); safety interlocks and guards help you keep your fingers attached".

One perspective on T(est)DD I like, is it allows you to shift around ease of change - to shape the 'change requires more overhead' vs 'change requires less thinking to do safely' tradeoff over your code space. Things nailed down by tests, are harder to change (the tests need updating too), but make surrounded things easier to change, by reducing the need to maintain correctness of transformation, and simplifying debugging of the inevitable failure to do so. It's puzzled me that the TDD community hasn't talked more about test lifecycle - the dance of adding, expanding, updating, and pruning tests. Much CL code and culture predated testing culture. TDD (easy refactoring) plus insanely rich and concise languages (plus powerful tooling) seems a largely unexplored but intriguing area of language design space. Sort of haskell/idris T(ype)DD and T(est)DD, with an IDE able to make even dense APL transparent, for some language with richer type, runtime, and syntax systems.
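The "nailed down by tests" tradeoff above can be sketched in a few lines of Python (the function and test names here are hypothetical, purely for illustration): the test pins observable behavior, which makes the test itself costlier to change but makes the implementation cheap and safe to rework.

```python
# Minimal sketch of a "pinning" test: the assertion nails down observable
# behavior, so the implementation underneath can be refactored freely as
# long as the assertion still holds.

def slugify(title: str) -> str:
    # Internals are free to change (regexes, str methods, a library call);
    # only the pinned behavior below must be preserved.
    return "-".join(title.lower().split())

def test_slugify_pins_behavior():
    # This is the "nailed down" part: updating it has overhead, but it
    # removes the need to re-verify every refactoring by hand.
    assert slugify("Hello World") == "hello-world"
    assert slugify("  Many   Spaces  ") == "many-spaces"

test_slugify_pins_behavior()
```

The test-lifecycle question in the comment above is then: when behavior legitimately changes, who pays the cost of updating or pruning the pins, and when.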

Looking back at CL, and thinking "like , just a bit different", one can miss how much has changed since. Which hides how much change is available and incoming. 1950's programs each had their own languages, because using a "high-level" language was implausibly heavy. No one thinks of using assembly for web dev. Cloud has only started to impact language design. And mostly in a "ok, we'd really have to deal with that, but don't, because everyone has build farms". There's https://github.com/StanfordSNR/gg 'compile the linux kernel cold-cache in a trice for a nickel'. Golang may be the last major language where single-core cold-cache offline compilation performance was a language design priority. Nix would be silly without having internet, but we do, so we can have fun. What it means to have a language and its ecosystem has looked very different in the past, and can look very different in the future. Even before mixing in ML "please apply this behavior spec to this language-or-dsl substrate, validated with this more-conservatively-handled test suite, and keep it under a buck, and be done by the time I finish sneezing". There's so much potential fun. And potential to impact society. I just hope we don't piss away decades getting there.

[1] https://web.cs.wpi.edu/~jshutt/kernel.html