Seems to be better exceptions than what is already there. And fully compliant as far as I can tell. Someone just had to go out there and do it. From doing my own benchmarks on exceptions, I already know you should basically never use them. The bloat and unexplained slowdowns in real programs are just a headache you don't want.

In the end, I still use exceptions, so this gives me hope. I like C++ exceptions as a feature, but I'm not happy about how it's implemented right now.

Out of interest (it's decades since I last used C++) - do C++ exceptions cause an overhead generally, or only if they are thrown?

With the more common modern compilers (Microsoft, GCC, LLVM, ICC, many others) there is zero runtime overhead to programming with exceptions unless they are thrown. This is possible because the unwind information is stored externally to the function code. There is still an additional overhead in terms of space, since the .eh_frame (or equivalent) sections need to be loaded and possibly relocated by the link loader at program startup, although that's minimal work if addresses are in self-relative tables like in the ARM EABI.

The alternative to having the exception-handling code out of line is, of course, handling all exceptional conditions in line, which not only makes the code harder to read but makes the generated object code bigger, and if it fails to fit in a cache line or causes unwholesomely large numbers of branch evaluations (see Spectre and Meltdown), results in slower or less secure code. The real answer to C++ exceptions is the classic C-style programming where all errors are just ignored.

Of course, no amount of reasoning or hard data survives in the face of religious belief, and 80% of programmers out there are part of one cargo cult or another, so carry on with what you were doing. It's probably good enough.

I've had major regressions in performance when a thrown exception was added.

We narrowed it down to the fact that the compiler wasn't free to A) inline the function, or B) reorder operations within the function across the point where the exception was thrown.

Yes, exceptions affect optimizations. The crux is that anything that might throw is a barrier to any other statement with observable side effects before or after it. Neither can be moved to the other side of the potential thrower. However, explicit error handling adds a branch and explicit return instead of a potentially throwing statement, so if the compiler is rightfully pessimistic, no performance is lost. On the other hand, you can make the compiler ignore the potential for exceptions by declaring functions and methods nothrow. This will make the compiler ignore all considerations for potential exceptions in callers and lead to better optimizations in some cases. But sticking nothrow onto things is hard because when a nothrow function does happen to throw, you're in UB land.

> because when a nothrow fuction does happen to throw, you're in UB land

IIRC that was the originally proposed behavior for noexcept, but it was changed at some point to call std::terminate instead.

Yikes, of course it should be noexcept instead of nothrow in my previous post!

However, I can't find any reference to noexcept having been given defined behaviour. Do you have any pointers?

In the current version of the standard on the GitHub repo [0], Section 14.5 Paragraph 5 says:

> Whenever an exception is thrown and the search for a handler (14.4) encounters the outermost block of a function with a non-throwing exception specification, the function std::terminate is called (14.6.1).

I don't know when precisely noexcept was changed to call std::terminate, but I did find N3103 (included in the 2010-08 post-Rapperswil mailing [1]), which argued that the standard should require calling std::terminate to avoid security issues.

[0]: https://github.com/cplusplus/draft [1]: http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2010/n310...