I've read that doctors now spend as much as 50% of their time documenting their work. Companies such as Epic, which provide the software hospitals use to build databases of patient data, have been big winners in this new world of software-dependent hospitals. But did the doctors become more productive? By almost any measure, they became less productive.

People in tech keep thinking more tech will solve problems, and they keep underestimating the flexibility of the old models. For instance, most large companies used to be run by armies of secretaries, and the senior secretaries functioned as what we would now call "project managers" -- they made calendars, oversaw who was working on what, followed up to keep track of whether work was being done, and kept a close eye on what money was being spent. The crucial thing about having humans oversee such work is that humans can take a flexible approach to the rules: they know when to break them. By contrast, systems that are highly dependent on software tend to be more rigid. Software doesn't know when its rules should be broken.

The flexibility of the old system is constantly underestimated, and the rigidity of the new systems is often misunderstood.

In his book "The Design of Design," Fred Brooks talks about the power of trust, and he contrasts it with situations where everything must first be negotiated and specified in a contract. High-trust systems are flexible and fast, whereas a system where every detail needs to be specified in a contract is slow and rigid. We should stop and ask ourselves: which of these does our favorite Agile methodology resemble? Are we specifying things in needless detail?

I don’t think productivity was ever the goal of this software. It was to have a record that is standard, digital, transferable, etc. Doctors fought it as long as they could because they knew what it meant for them.

I remember some pretty early demos in the early/mid 2000s, when I was doing clinical grunt work in college. I had written some software to make my department's life easier, so I was offered up as the hospital's liaison for the software evaluation. This is when I formed my "never replace a terminal-based app with a GUI-based app and expect productivity gains" theory.

Everyone working in the hospital knew the terminal app: they'd type in some random 3-letter code and a screen would pop up. Then they'd memorize how many tabs apart each field was. Without a mouse, people could just hum along, inputting data at blazing speed once some muscle memory was in place. Everyone had little cheat sheets printed out for the less frequently used commands/codes. When you replace this with a browser/desktop GUI full of selectors, dropdowns, and reactive components, it tends to 1) require mouse usage for most people and 2) lose the ability to do the quick data entry I described. The pretty interface becomes a steady stream of speed bumps that reduce productivity. Since then I've witnessed it in banking and other industries too.

IMHO, this is because the people writing GUIs these days are mostly incompetent, or hamstrung by "web" technologies.

Early GUIs didn't have the problem you describe because they were designed as discovery mechanisms for the underlying functions. That is, the idea was that after clicking File->Save a dozen times you would remember the keyboard accelerator displayed on the right-hand side of the menu. Or, if nothing else, remember that the "F" in File was underlined along with the "S" in Save (or whatever), which would lead people to just press Ctrl-S, or Alt-F, S. Then part of testing was making sure that the Tab key moved appropriately from field to field, etc.

I remember in the 1990s spending a fair amount of time doing keyboard optimization in a "reporting" application I wrote (which also had an early touchscreen) for use by people whose main job wasn't using a computer. Then we would hold "training" classes and watch how they learned to use it.

So, much of this has been lost with modern "GUIs". Even the OS vendors, which should have kept their human interface guidelines updated, did stupid things like _HIDE_ the accelerator keys in Windows unless the user was pressing the Alt key -- which destroys discoverability, because users no longer have the shortcut in their face. Never mind the recent crazy nonsense where links and buttons are basically the same thing, sometimes triggering crazy behaviors like context menus and the like. Or UIs designed so that it's impossible to know whether something is actually a button, because the link text is the same color as the rest of the text on the screen.

I feel like the push to make software accessible (in a new-user, not disability, context) and intuitive has made complexity the enemy. Instead of having software that grows with the user's capability, features are hidden from the top layer of interactivity or just cut entirely.

I was at the post office here in Australia a few years ago and saw the clerk's screen. It was one of those DOS-era full-screen red-and-blue text interfaces, and she was flying through hotkeys and getting things done. People can learn, yet so much software treats them as infants.

And you know there's definitely someone looking at replacing that software with a modern GUI.

For future reference (and anyone following along later), that is an "ncurses" terminal application.

You should see the customized JBHIFI terminal + keyboard.

So should we be doing demos in Bash, with an ncurses or gum workflow before going back to write the proper system in C?

I don't see the problem with this. What is a "gum workflow" in this context?

Neither do I.

It was supposed to read:

An [ncurses or gum] workflow.

Gum being an alternative method of making a Bash UI.

https://github.com/charmbracelet/gum
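
For anyone curious, here's a minimal sketch of what such a gum-based demo of a data-entry flow might look like. This assumes gum is installed; the field names and choices are made up for illustration, but the subcommands used (input, choose, confirm, style) are standard gum:

    #!/usr/bin/env bash
    # Minimal sketch: a keyboard-driven intake form built with gum.
    # The fields are hypothetical; the point is prototyping the flow
    # in the terminal before committing to a "proper" implementation.
    set -euo pipefail

    name=$(gum input --placeholder "Patient name")
    dept=$(gum choose "Radiology" "Cardiology" "Oncology")

    urgent=no
    if gum confirm "Mark as urgent?"; then
      urgent=yes
    fi

    # Echo the record back in a bordered box for confirmation.
    gum style --border normal --padding "1 2" \
      "Name:   $name" \
      "Dept:   $dept" \
      "Urgent: $urgent"

The whole loop stays on the keyboard, which is exactly the property the terminal apps upthread had.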