This is such an ignorant and low-effort post.

The author did not bother to check whether the delay comes from one-time library loading cost or from the actual string formatting.

He is complaining about a single-millisecond difference (which is near the measurement error) and didn't bother to run the whole thing in a loop to amplify the difference into something meaningful (at which point terminal performance can eat away any noticeable difference anyway).
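
For illustration, here's a minimal benchmarking sketch (mine, not the post author's) that addresses both problems: a warm-up pass so one-time initialization doesn't pollute the numbers, and a loop long enough that the difference dwarfs timer noise. Run it as `./bench > /dev/null` to keep terminal rendering out of the measurement:

```cpp
// Sketch only: time each API in a loop, after a warm-up pass.
#include <chrono>
#include <cstdio>
#include <iostream>

template <typename F>
static double time_ms(F&& body, int iterations) {
    auto start = std::chrono::steady_clock::now();
    for (int i = 0; i < iterations; ++i) body(i);
    auto stop = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::milli>(stop - start).count();
}

int main() {
    constexpr int n = 1'000'000;

    // Warm-up: pay any one-time cost (locale setup, buffer allocation,
    // stdio/iostream synchronization) before measuring anything.
    std::printf("%d\n", 0);
    std::cout << 0 << '\n';

    double printf_ms = time_ms([](int i) { std::printf("%d\n", i); }, n);
    double cout_ms   = time_ms([](int i) { std::cout << i << '\n'; }, n);

    // Report on stderr so it survives the `> /dev/null` redirect.
    std::fprintf(stderr, "printf: %.1f ms, cout: %.1f ms over %d lines\n",
                 printf_ms, cout_ms, n);
}
```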

He picked a completely unrealistic metric: console output is usually meant to be read by humans, so it was never optimized for millisecond-level performance.

He doesn't distinguish between the language (C++) and the I/O API (printf() vs. iostreams).
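
To make the distinction concrete, here is the same line printed through three different APIs, all from within C++ (a sketch; `<format>` requires C++20):

```cpp
// Same line, three I/O APIs, one language.
#include <cstdio>    // C's stdio, fully usable from C++
#include <format>    // C++20 formatting, independent of streams
#include <iomanip>
#include <iostream>  // the classic C++ streams API

int main() {
    double pi = 3.14159;
    std::printf("pi = %.2f\n", pi);                    // C-style format string
    std::cout << "pi = " << std::fixed
              << std::setprecision(2) << pi << '\n';   // iostreams manipulators
                                                       // (note: they are sticky)
    std::cout << std::format("pi = {:.2f}\n", pi);     // format string, type-safe
}
```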

The main advantage of C++ is that, if you bother to understand how it works, you can let the compiler and standard library do a lot of the mind-numbing, error-prone work that C developers proudly do by hand (collections, RAII, templates with concepts) at exactly zero runtime overhead. The key is "if you bother to understand how it works". If you randomly pick an arbitrary API and complain that it's worse than a different arbitrary API from C for an arbitrary use case, it only shows your own incompetence.
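
A quick sketch of that point (the `File` wrapper below is hypothetical, not from any library): the destructor and the concept check are enforced at compile time and generate the same machine code a careful C programmer would write by hand.

```cpp
#include <array>
#include <concepts>
#include <cstdio>
#include <ranges>

// Hypothetical RAII wrapper: the compiler inserts the fclose() on every
// exit path, so it can't be forgotten or run twice.
struct File {
    std::FILE* f;
    File(const char* path, const char* mode) : f(std::fopen(path, mode)) {}
    ~File() { if (f) std::fclose(f); }
    File(const File&) = delete;             // rules out accidental double-close
    File& operator=(const File&) = delete;
};

// Concept-constrained template: wrong element types are rejected at compile
// time; the generated loop is the same one you'd write by hand in C.
template <std::ranges::input_range R>
    requires std::integral<std::ranges::range_value_t<R>>
auto sum(const R& r) {
    std::ranges::range_value_t<R> total{};
    for (auto v : r) total += v;
    return total;
}

int main() {
    File log("out.txt", "w");
    if (log.f) std::fprintf(log.f, "sum = %d\n", sum(std::array{1, 2, 3}));
}   // log.f is closed here automatically, on every return path
```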

The author knows C++ (https://github.com/simdjson/simdjson) and writes a lot about his performance experiments. I don't see why he, or anybody else, shouldn't be able to raise such questions and arguments without people (like you) getting angry about it. Is it offensive?

Anyway, he has a point: `cout` is used extensively as a logging mechanism. If you don't see how that "single millisecond" can make a difference, you certainly haven't worked on a relevant system.
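
For what it's worth, here's a quick sketch (mine, numbers will vary) of one place that millisecond hides when streams are used for logging: flushing every line with `std::endl` versus staying buffered with `'\n'`:

```cpp
// Sketch only: the cost of flushing on every log line.
#include <chrono>
#include <fstream>
#include <iostream>

int main() {
    std::ofstream log("app.log");
    constexpr int n = 100'000;

    auto t0 = std::chrono::steady_clock::now();
    for (int i = 0; i < n; ++i) log << "event " << i << std::endl;  // flush each line
    auto t1 = std::chrono::steady_clock::now();
    for (int i = 0; i < n; ++i) log << "event " << i << '\n';       // stay buffered
    auto t2 = std::chrono::steady_clock::now();

    using ms = std::chrono::duration<double, std::milli>;
    std::cerr << "endl: " << ms(t1 - t0).count() << " ms, "
              << "newline: " << ms(t2 - t1).count() << " ms\n";
}
```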