I think John Carmack called it right. It just makes sense when you are dealing with a lot of complicated state, in situations where side effects on that state can cause really hairy issues (threading, memory, real-time behavior, etc.).

So, a lot of languages have borrowed the more popular functional-style mechanisms from each other. Even Java has had lambda functions and streams for a while now, and those seem to be popular features. And of course John Carmack was talking about this in the context of C++. If you are going to do stuff with the elements of a list, you might as well use something called map. Is that functional programming or just common sense?
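
Just to make that concrete, here is roughly what that looks like (using Rust only because it comes up further down the thread; Java streams or C++ ranges express the same idea):

```rust
fn main() {
    let xs = vec![1, 2, 3, 4];
    // map says "transform every element" directly: no index bookkeeping,
    // and the closure doesn't mutate any state outside itself.
    let doubled: Vec<i32> = xs.iter().map(|x| x * 2).collect();
    println!("{:?}", doubled); // [2, 4, 6, 8]
}
```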

So, the short answer is "because it works and adds value". Of course, people tend to get a bit carried away with being pure and pedantic around this topic. And never mind things like monads, which bring out all the armchair philosophers. But some of that stuff is pretty neat and not that complicated.
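
As a rough illustration of the "not that complicated" part, Rust's Option with and_then has the same shape of idea: chaining steps that might not produce a value. This is just a sketch of that flavor, not a claim that it captures everything people mean by a monad.

```rust
// A minimal sketch: Option::and_then chains fallible steps, and a None
// anywhere short-circuits the rest of the chain.
fn parse(s: &str) -> Option<i32> {
    s.trim().parse().ok()
}

fn reciprocal(n: i32) -> Option<f64> {
    if n == 0 { None } else { Some(1.0 / n as f64) }
}

fn main() {
    println!("{:?}", parse(" 4 ").and_then(reciprocal));   // Some(0.25)
    println!("{:?}", parse("zero").and_then(reciprocal));  // None
}
```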

The role of academia is to move the field forward and experiment with different ways of doing things. Not all of those things work well in the real world. E.g. logic programming (Prolog) is cool but ultimately never really caught on. And there have been quite a few dead ends, with whole families of languages never really getting a lot of traction, only for later languages to embrace bits and pieces of them. The influence of other languages on JavaScript, for example, is fairly interesting.

> Not all of those things work well in the real world. E.g. logic programming (Prolog) is cool but ultimately never really caught on.

It does have its niches though. For example, there is a trait solver for Rust called Chalk that uses a Prolog-inspired language because trait bounds basically define a logic:

https://github.com/rust-lang/chalk
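
To give a flavor of the analogy (this is just ordinary Rust standing in for the idea, with a made-up Pretty trait; it is not how Chalk actually represents things): a blanket impl reads a lot like a Prolog rule.

```rust
// A hypothetical Pretty trait, purely for illustration.
trait Pretty {
    fn pretty(&self) -> String;
}

impl Pretty for i32 {
    fn pretty(&self) -> String { self.to_string() }
}

// Reads like the Horn clause "Vec<T>: Pretty :- T: Pretty".
impl<T: Pretty> Pretty for Vec<T> {
    fn pretty(&self) -> String {
        self.iter().map(|x| x.pretty()).collect::<Vec<_>>().join(", ")
    }
}

fn main() {
    // The compiler "proves" Vec<Vec<i32>>: Pretty by chaining the rules above,
    // which is essentially a logic-programming style search.
    let v = vec![vec![1, 2], vec![3]];
    println!("{}", v.pretty()); // 1, 2, 3
}
```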