The post focuses on the practicality of Haskell and addresses purity, but leaves out the biggest problem: laziness, which is more or less the reason Haskell exists, is the wrong default. In particular, lazy I/O can introduce horrible bugs or outright mess with the execution order of critical code, which in my opinion is an absolute no-go in an industrial language.
> lazy I/O which can introduce horrible bugs or straight up mess with execution order of critical code
I've never seriously programmed in Haskell, so honest question: I understand the first point, but how is the second possible? Isn't the point of passing RealWorld around that you enforce the order of execution through data dependencies rather than expression order? It always seemed like a very elegant (and incredibly impractical :) ) solution to me.
fd <- openFile "/some/path" ReadMode
s  <- hGetContents fd  -- lazy read: nothing is actually read yet
hClose fd
pure (processString s)
Now, processString is getting a string with the file's contents, right? Nope: you have a cons cell that probably contains the first character of the file, and maybe a few more, up to the first page that got read from disk. Eventually, as you're processing that string, you'll hit a point where your pure string processing actually tries to do I/O on a file that isn't open anymore, and your perfectly sane and pure string-processing code will throw an exception. So, that's gross.

That's a real issue that will hit beginners. There's been a lot of work on ergonomic and performant libraries that handle this without issues; I think that right now pipes[0] and conduit[1] are the big ones, but it's a space that people like to play with.
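Short of pulling in a streaming library, the usual workaround is to force the whole string before closing the handle. A minimal, self-contained sketch (it writes its own demo file so it can run anywhere; the pure processing step here is just `map toUpper` as a stand-in):

```haskell
import Control.Exception (evaluate)
import Data.Char (toUpper)
import System.IO

main :: IO ()
main = do
    writeFile "demo.txt" "hello"   -- set up a file to read
    h <- openFile "demo.txt" ReadMode
    s <- hGetContents h            -- lazy: nothing read yet
    _ <- evaluate (length s)       -- force the entire contents in now
    hClose h                       -- safe: s is fully materialized
    putStrLn (map toUpper s)       -- pure processing after close works
```

Without the `evaluate (length s)` line, `hClose` runs before the string is demanded, and `map toUpper s` hits a closed handle, which is exactly the bug described above.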
[0] - https://hackage.haskell.org/package/pipes [1] - https://github.com/snoyberg/conduit