The problem is that both hardware and software are garbage.

Spectre/Meltdown & friends are just the tip of the iceberg. We have layers & layers of indirection/abstraction everywhere. We have hardware that lies and tells you it has certain properties when in reality it doesn't (examples: sector sizes in hard drives/NVMe devices, processors still pretending they behave like a PDP-11), and we have hardware that is flat out broken. We try to fix those issues in software.

But in the software, we have another dump of workarounds, dependencies, and abstractions with a sprinkle of backward compatibility. We are now creating "minimalist" applications with a fraction of the functionality of software from 30 years ago, but using so many layers that the total amount of code needed to make them work is orders of magnitude larger than what we had back then.

I know that most programmers never got to work with systems where it's very, very easy to debug the whole stack and you can learn it in a short period, but it's amazing to have knowledge of EVERY part of the system in your head.

There are some good things going on (like the effort to replace C with something that has similar performance characteristics but without its flaws), but it's not enough.

Here are two things worth watching:

https://www.youtube.com/watch?v=pW-SOdj4Kkk - Jonathan Blow - Preventing the Collapse of Civilization

https://www.youtube.com/watch?v=t9MjGziRw-c - Computers Barely Work - Interview with Linux Legend Greg Kroah-Hartman

> The problem is that both hardware and software are garbage.

I think it's incredible that, in my lifetime, computers went from giant mainframes with dedicated hard-line terminals to always-connected supercomputers in everyone's pocket, worldwide. Furthermore, anyone can use the internet to learn how to program.

Maybe that's garbage compared to some mythical ideal but in terms of impact on the world it's incredible.

> I know that most of the programmers did not work with systems where it's very, very easy to debug the whole stack and you can learn it in a short period but it's amazing when you have knowledge about EVERY part of the system in your head.

Well, you can tell from the above that I was around then. I started programming with a single manual and the Beagle Brothers "Peeks, Pokes and Pointers" cheatsheet[1].

People forget that the software itself had to do much less than it does today. Here's just one angle to judge: security. We did not have a worldwide interconnected network with so many people trying to steal data from it. We all used rsh and packets were flying around in cleartext, no problem. But today, virtually all software has to incorporate TLS.

And far fewer people built that simpler software. EA's first titles were written by one or two people. Now a typical EA title has hundreds of people working on it.

Things will get better than where they are today. In the future, the industry will have to invest more money in "10X" productivity and reliability improvements. I think that will eventually happen as productivity continues to slow on large codebases.

[1] - https://downloads.reactivemicro.com/Apple%20II%20Items/Docum...

And yet, even something as basic as a smart doorbell requires a backend server somewhere (essentially a mainframe), plus all kinds of NAT hole-punching, UPnP, and proprietary push notifications, despite the supercomputer in your pocket technically being able to listen for incoming TCP or UDP packets from the doorbell directly.
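To be fair about what "technically able" means: at the socket level the direct path is trivially small. A hedged sketch (port number made up, both ends simulated on localhost) of a phone-side listener and a doorbell-side sender, assuming no carrier NAT or mobile-OS background restrictions in the way, which are exactly the obstacles in practice:

```python
import socket

# Hypothetical port. In reality, carrier-grade NAT and the mobile OS's
# background-execution limits are what make this impractical,
# not the socket API itself.
PORT = 9999

# "Phone" side: listen for datagrams from the doorbell.
listener = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
listener.bind(("0.0.0.0", PORT))
listener.settimeout(5)

# "Doorbell" side: send a motion alert directly, no backend involved.
sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"motion detected", ("127.0.0.1", PORT))

data, addr = listener.recvfrom(1024)
print(data.decode())  # motion detected
```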

The "supercomputer" processing power is also being wasted on all kinds of malicious and defective-by-design endeavors such as ads, analytics, etc (install any mainstream app and look at the network traffic, 90% of it will be for analytics and can be blocked with no ill effects).

Despite today's supercomputers being 10x faster than the early ones (back in the iPhone 3G days), we have somehow lost the ability to render a non-stuttering 60fps UI, even though modern UIs are less rich and consist mostly of whitespace.

> anyone can use the internet to learn how to program

I think there used to be a "golden age" of this where the resources were all available for free and, at the same time, the stacks were manageable (think basic PHP deployed on shared hosting, or Ruby, or Django), whereas nowadays it is considered "wrong" if you don't use Kubernetes, 10 microservices, and 100+ NPM packages just to serve a "Hello world" page to a browser.
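For contrast, here is a hedged sketch of the "no stack" end of the spectrum: a hello-world page served with nothing but the standard library, one process, one file, no orchestration. (The demo serves and fetches a single request against itself so it terminates; a real deployment would just call `serve_forever()`.)

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class Hello(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"Hello world"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

# Port 0 = let the OS pick a free port; handle exactly one request.
server = HTTPServer(("127.0.0.1", 0), Hello)
threading.Thread(target=server.handle_request).start()

with urllib.request.urlopen(f"http://127.0.0.1:{server.server_port}/") as resp:
    page = resp.read()
print(page.decode())  # Hello world
```

The point isn't that production services should look like this, just that the distance between this and a typical modern "hello world" deployment is the layering being complained about.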

It requires all of those things because users want a doorbell that easily connects to WiFi and comes with a nice phone app, one that includes not only movement alerts but also a miniature social network. And the startup making the doorbell wanted a quick proof of concept to get funding, then to build organically using a flock of less expensive, younger developers.

Alternatively, a company could invest money into building something that looks beautiful to software developers, something you could SSH into. The architecture would be sound because several gray-bearded devs would debate the relative merits of different approaches. It could offer so much more functionality, provided the user is willing to acquire minimal knowledge of Linux. The only problem is that the only people interested in it would be other software developers.

We're building stuff for people, not other geeks. Businesses invest in software as a means to an end, and if the backend is ugly but accomplishes that end, then it's successful.

...what? The doorbell can't talk to the phone because the supercomputer in our pockets is not really ours: its functionality is gimped by the various software and operating-system providers that control it from their respective motherships.

This is accurate, but is not the whole story. I’m running a terminal on iOS right now to ssh into my Linux server.[1] There are terminal emulators on Android too.

I wish there were more GUI apps centered around hybrid cloud/shell use cases. I would like to be able to make GUI widgets that do things in an ssh session on my server. I'm not sure how important it would be to run on the device; it could be a webapp I run on the server itself to trigger scripts. What I find lacking is a UI/UX that centers on touchscreen input, is reconfigurable, and can perform arbitrary commands or events server-side. Anyone know of tools that scratch this itch?
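The "webapp on the server" variant can be sketched in a few lines: a stdlib-only HTTP endpoint that maps widget names to an allowlist of server-side commands (the names and commands below are placeholders, and a real version would need auth and your actual scripts). A touch UI on the phone would just render one button per widget and POST to these endpoints; the demo fires one request against itself so it runs to completion:

```python
import subprocess
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical allowlist: widget name -> command run server-side.
# `echo` stands in for whatever script each widget would trigger.
COMMANDS = {
    "uptime": ["echo", "up 3 days"],
    "disk":   ["echo", "42% used"],
}

class Widgets(BaseHTTPRequestHandler):
    def do_POST(self):
        name = self.path.strip("/")
        if name not in COMMANDS:
            self.send_response(404)
            self.end_headers()
            return
        # Only allowlisted commands run, never user-supplied strings.
        out = subprocess.run(COMMANDS[name], capture_output=True, text=True)
        body = out.stdout.encode()
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), Widgets)
threading.Thread(target=server.handle_request).start()

req = urllib.request.Request(
    f"http://127.0.0.1:{server.server_port}/uptime", data=b"", method="POST")
with urllib.request.urlopen(req) as resp:
    result = resp.read().decode()
print(result.strip())  # up 3 days
```

The allowlist is the design choice that makes this tolerable: the web layer never executes arbitrary input, only named, pre-vetted commands, which is roughly what a reconfigurable widget UI would need anyway.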

[1] https://github.com/ish-app/ish