Quick question: why did we stop producing lisp machines, or any other machine more closely related to Lambda calculus model of computation than a Turing machine? Is it merely cultural, or are there technical reasons as to why we stopped producing them competitively?
In short: Moore's Law.
Between 1970 and 2010, if you could design special-purpose hardware that ran 10 times faster than the state of the art, you needed to get it to market in volume in under 3 years.
If you took any longer, the general-purpose CPUs from Intel would by that point be within spitting distance of your superior architecture, at a fraction of the cost.
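The arithmetic behind that 3-year window can be sketched out. This is a back-of-the-envelope model, assuming general-purpose performance doubles roughly every 18 months (one common reading of Moore's Law; the actual cadence varied over the decades):

```python
# Sketch: how quickly a 10x special-purpose hardware advantage erodes
# if commodity CPU performance doubles every ~18 months (an assumption,
# not a precise historical figure).

def general_purpose_speed(months, doubling_period=18):
    """Relative speed of commodity CPUs after `months`, starting at 1.0."""
    return 2 ** (months / doubling_period)

custom_advantage = 10.0  # your special-purpose design's speedup at launch

for months in (0, 18, 36, 54, 60):
    gap = custom_advantage / general_purpose_speed(months)
    print(f"after {months:2d} months, remaining advantage: {gap:.2f}x")
```

Under these assumptions, a 10x head start shrinks to 2.5x after 36 months and is gone entirely by about month 60, which is roughly the squeeze the Lisp machine vendors faced.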
That's what happened to Symbolics: general-purpose PCs could run their software faster than the dedicated machines they designed.
Also, if you make hardware absolutely optimised for Lisp, then all the C, FORTRAN, and COBOL (this was the 1970s!) programmers aren't particularly interested, because it does nothing for their code.
So the Lisp machines were targeting a fairly small segment of the market, one that might have exploded into the mainstream but didn't. That left them with less income to reinvest, so they struggled to maintain their lead over the general-purpose, mass-market hardware vendors.
The early architectures were micro-programmable. For example, the Xerox Lisp machines used the exact same hardware as the Smalltalk machines or the Xerox office systems running Mesa software. See: https://en.wikipedia.org/wiki/Xerox_Star
Symbolics had C, Pascal, and Fortran compilers written in Lisp running on their machines. Others developed an Ada system for them. The drawback for all of those: expensive, and not running that fast. The interesting part: they allowed interactive development in these languages, and one could run some C software a bit more safely thanks to runtime type checking. Examples of non-Lisp programs running on them: TeX and the usual X11 server.
The explosion did not happen during their lifetime for several reasons. The whole package was expensive (RAM and disk were very expensive), it was tied to higher-end hardware with lots of memory, and it was never ported to run on other architectures on the metal (an Alpha machine, itself already expensive, would have been fast enough to run a full Lisp OS, but all one had was an emulator for it; there was not enough manpower to make it run natively). On top of that, the OS was mostly limited to one person, one GUI, and one Lisp per machine; it was no real server OS with (somehow) securely separated programs.
Possible additional point: going into the 80s/90s (the last gasps of "alternative" personal-computing architectures), there were not many widely adopted standards across computing. This is why we all bitched about file incompatibility across systems, etc. But now we have all of this great stuff: format standards, APIs, hell, even TCP/IP is a universal standard that lets all manner of exotic hardware speak to the rest of the world.
I think something like Lisp/Smalltalk/SomethingNewOS machines might have another shot here soon, especially with the loss of gains from Moore's Law.
I'd love to have a true Lisp or Smalltalk OS.