Interesting to see BearSSL mentioned today, because I have just started to replace OpenSSL with BearSSL in my webserver. The reason is that the server and the computer I develop on run different versions of Debian, with different versions of OpenSSL, so the server suddenly stopped working because the program was dynamically linked against a newer OpenSSL that the server didn't have. I tried to link OpenSSL statically, but at that point I thought it was better to move to BearSSL.
How does that help? Unless BearSSL has a perfectly stable API, aren't you just trading one library for another?
If I'm going to link statically, I prefer a smaller library, both for size and security reasons. I was also getting some issues when linking OpenSSL statically and I think (or hope) that will be easier to do with BearSSL.
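For reference, a rough sketch of what the client side looks like with BearSSL, adapted from the project's sample client code. The names tls_send and trust_anchors.h are mine, and the TAs / TAs_NUM trust-anchor array is assumed to be generated separately (e.g. with the brssl command-line tool); error handling and connection close are omitted.

```c
/* Rough BearSSL client sketch (adapted from the sample client).
 * Assumes an already-connected TCP socket and a pre-generated
 * trust-anchor list TAs / TAs_NUM. */
#include <errno.h>
#include <stddef.h>
#include <unistd.h>

#include <bearssl.h>

#include "trust_anchors.h"  /* assumed: defines TAs and TAs_NUM */

/* Low-level read callback over a plain socket descriptor. */
static int
sock_read(void *ctx, unsigned char *buf, size_t len)
{
	for (;;) {
		ssize_t rlen = read(*(int *)ctx, buf, len);
		if (rlen <= 0) {
			if (rlen < 0 && errno == EINTR) {
				continue;
			}
			return -1;
		}
		return (int)rlen;
	}
}

/* Low-level write callback over a plain socket descriptor. */
static int
sock_write(void *ctx, const unsigned char *buf, size_t len)
{
	for (;;) {
		ssize_t wlen = write(*(int *)ctx, buf, len);
		if (wlen <= 0) {
			if (wlen < 0 && errno == EINTR) {
				continue;
			}
			return -1;
		}
		return (int)wlen;
	}
}

/* Send one buffer over TLS on an already-connected socket. */
int
tls_send(int fd, const char *host, const unsigned char *data, size_t len)
{
	br_ssl_client_context sc;
	br_x509_minimal_context xc;
	unsigned char iobuf[BR_SSL_BUFSIZE_BIDI];
	br_sslio_context ioc;

	/* All state lives in caller-provided buffers: no dynamic allocation. */
	br_ssl_client_init_full(&sc, &xc, TAs, TAs_NUM);
	br_ssl_engine_set_buffer(&sc.eng, iobuf, sizeof iobuf, 1);
	br_ssl_client_reset(&sc, host, 0);
	br_sslio_init(&ioc, &sc.eng, sock_read, &fd, sock_write, &fd);

	if (br_sslio_write_all(&ioc, data, len) < 0) {
		return -1;
	}
	return br_sslio_flush(&ioc);
}
```

Since everything is in caller-provided buffers and the whole library builds into one static archive, linking statically is then just a matter of pointing the linker at the libbearssl.a the build produces instead of the shared OpenSSL libraries.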
In this case, the smaller size might not give better security. OpenSSL is one of the most tested products on the planet.
Being "one the most tested products on the planet" (citation needed, btw) did not prevent Heartbleed.
No software can be guaranteed to be bug-free.
And the Heartbleed bug was in an extension (the TLS heartbeat), not in the core software.
> (citation needed, btw)
Not really. But if you insist: OpenSSL is the de facto crypto library [1], at least on the server side. Browsers and OpenSSL carry most of the burden of securing everyday use of the internet. There are billions of testers every day.
> No software can be guaranteed to be bug-free.
Sure, this is done all the time via verification and other formal methods. It's not common in the industry, but it's not unheard of either.
Got any examples of this being used in popular FOSS software?
CompCert [2] is a formally verified optimizing C compiler.
Most formal verification happens in compilers or low-level, mission-critical systems because of the cost.
If you want to write some formally verified C, you can check out Frama-C [3].
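To give a flavour of it, here's a small example I wrote for this comment (not from the Frama-C docs): a C function with an ACSL contract and loop annotations that the WP plugin can try to discharge, e.g. with `frama-c -wp file.c`.

```c
#include <stddef.h>

/* Largest element of a non-empty array, with an ACSL contract
 * stating that the result is an upper bound of, and occurs in,
 * the array. */
/*@ requires n > 0;
    requires \valid_read(a + (0 .. n - 1));
    assigns \nothing;
    ensures \forall integer k; 0 <= k < n ==> \result >= a[k];
    ensures \exists integer k; 0 <= k < n && \result == a[k];
*/
int max_array(const int *a, size_t n)
{
    int m = a[0];
    size_t i;
    /*@ loop invariant 1 <= i <= n;
        loop invariant \forall integer k; 0 <= k < i ==> m >= a[k];
        loop invariant \exists integer k; 0 <= k < i && m == a[k];
        loop assigns i, m;
        loop variant n - i;
    */
    for (i = 1; i < n; i++) {
        if (a[i] > m) {
            m = a[i];
        }
    }
    return m;
}
```

The annotations are turned into proof obligations that automatic provers (Alt-Ergo by default) then try to discharge; anything they can't prove is reported back for you to strengthen the invariants or do an interactive proof.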