> Recently, I had a chat with some of my friends about Nushell and why they stuck with traditional shells like bash/zsh or the "new" hotness like fish rather than using Nushell.

Umm... because Nushell is pre-1.0, with major language changes every few weeks?

Living on the bleeding edge isn't everyone's cup of tea, and certainly not with something as fundamental to the system as a shell. Rest assured I will switch the moment 1.0 is released and a stability promise is published.

That being said, yeah – Nushell is the real deal. It's the Unix philosophy, except this time it actually works.

> It's the Unix philosophy, except this time it actually works.

How is Nushell compatible with the Unix philosophy? The core Unix philosophy is about enabling workflows that combine a set of independent tools in novel ways. Nushell implements a monolithic ecosystem with commands that support its fancy features, but using any external command not written for Nushell will be cumbersome at best, and incompatible at worst. This goes against the open philosophy of Unix, which is partly what has allowed it to grow and succeed.

> The core Unix philosophy is about enabling workflows that combine a set of independent tools in novel ways.

You mean like

    ls | rm
deletes the files in a directory?

Uh wait, that doesn't actually work.

Because the Unix tools actually don't implement the Unix philosophy at all.

Because as it turns out, text streams are not a universal interface after all.

Nushell succeeds where the coreutils have failed for decades precisely because its commands are designed to work together. They pass structured data around, which means that pipelines can actually work without requiring weird hacks to make one underspecified text format conform to the other. That is the Unix philosophy: tools that work together.
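Concretely, the same idea in Nushell looks something like this (just a sketch – the exact closure syntax has shifted a few times across pre-1.0 releases):

    # ls emits a table of records; `where` filters on a named column,
    # and the file names survive intact all the way to `rm`
    ls | where size > 10mb | each { |file| rm $file.name }

No parsing, no `xargs`, no worrying about filenames with spaces in them.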

But don't worry, the old Unix tools can be used from Nushell – and they work together just as poorly in Nushell as in every other shell, not one bit worse.

We've had this discussion before[1]. :)

It's a huge leap to assert that Unix tools don't implement the Unix philosophy just because a specific example doesn't work the exact way you expect it to. You could just as well implement a version of `rm` that parses stdin in a specific way by default, but that would only make it work for that specific use case, not in the generic and composable way the tools are meant to be used.

The whole point of Unix is for tools to be independent of each other _and_ of the environment they run in, while giving the user the freedom to compose them in any way they need. What Nushell has built instead is a closed ecosystem of tools that interoperate well with each other, but not with the outside environment. This means that a) the ecosystem is not extensible unless one contributes to Nushell or directly implements Nushell features, and b) external tools will always be second-class citizens that might even be incompatible with Nushell.

To give you an example, how would I use GNU ls within Nushell? Suddenly, I can't use any of the `where` fanciness, and my entire pipeline breaks. I would have to resort to another Nushell-specific helper to integrate the command, which needs to be written in a very generic way to support a wide range of use cases, or switch my entire pipeline to use external commands only, which defeats the purpose of using Nushell to begin with.
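Roughly (a sketch; pre-1.0 syntax keeps moving, so take the details with a grain of salt):

    # builtin ls: records with named columns, so this just works
    ls | where size > 1mb

    # external GNU ls via the ^ prefix: the output is a blob of text,
    # so there is no `size` column for `where` to operate on
    ^ls -l | where size > 1mb

You can rebuild a table from the text with helpers like `lines`, `parse`, or `detect columns`, but at that point you're back to the same fragile text-scraping you'd do in any other shell.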

This is a contrived example, but compound it with the number of CLI tools that exist and have yet to be written, and it becomes a usability and maintenance headache.

So I'm glad that Nushell exists, as it challenges existing notions of what a shell can be, but let's not disparage existing shells or Unix to prove a point. The Unix ecosystem is so widespread and successful today _because_ of these early decisions, and will more than likely continue to exist because of them as well. That doesn't mean that we can't do better, but I'd argue that a monolithic shell with a strict contract between commands is not the way to build a sustainable and future-proof ecosystem.

[1]: https://news.ycombinator.com/item?id=36706617

> That doesn't mean that we can't do better, but I'd argue that a monolithic shell with a strict contract between commands is not the way to build a sustainable and future-proof ecosystem.

I don't think it is far-fetched to imagine that most popular command-line tools will support JSON output within, let's say, 5 years. With that, I think Nushell's value proposition becomes a whole lot stronger. Granted, there will never be a time when all tools integrate well, but I can see a critical mass evolving.

I absolutely think that treating data piped from or to another process as JSON by default, and treating data going back to the terminal as text (by transforming the JSON into tabular data, etc.), is the way to go – also because it's backwards-compatible. So much so that I wanted to write some wrappers for the standard commands that automatically did all this via `jq`. I know commands can detect whether they're being piped to another command or to a terminal, since certain commands automatically skip ANSI coloring when their output is piped elsewhere...
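To spell out the idea, a minimal bash sketch (the wrapper name is made up, and the `jq` transform is deliberately trivial – it only emits an array of filenames, not the full `ls -l` metadata):

    # hypothetical wrapper: plain text on a terminal, JSON when piped
    ls_json() {
        if [ -t 1 ]; then
            # stdout is a terminal: behave like normal ls
            command ls "$@"
        else
            # stdout is a pipe: emit a JSON array of filenames instead
            command ls -1 "$@" | jq -R -s 'split("\n") | map(select(length > 0))'
        fi
    }

`[ -t 1 ]` is the shell-level version of the isatty() check those commands use to decide whether to emit colors.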

> I wanted to write some wrappers for the standard commands that automatically did all this via `jq`.

If you're not already aware of it, you may wish to check out `jc`[0] which describes itself as a "CLI tool and python library that converts the output of popular command-line tools, file-types, and common strings to JSON, YAML, or Dictionaries. This allows piping of output to tools like jq..."
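For example, something along these lines (adapted from the jc docs; the exact field names may differ by version):

    # convert `ls -l` output to JSON with jc, then filter it with jq
    ls -al | jc --ls | jq '.[] | select(.size > 500)'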

The `jc` documentation[1] & parser[2] for `ls` also demonstrate that reliable & cross-platform parsing of even "basic" commands can be non-trivial.

[0] https://github.com/kellyjonbrazil/jc

[1] https://kellyjonbrazil.github.io/jc/docs/parsers/ls

[2] https://github.com/kellyjonbrazil/jc/blob/4cd721be8595db52b6...