The Rustonomicon is a very good resource. It's less of "the unsafe book" and more of "the advanced book", covering lower-level details and advanced concepts that aren't touched on by the book. One gripe I have is that it kind of pushes the idea that unsafe is bad and evil, which is not true at all:

> The Dark Arts of Unsafe Rust

> THE KNOWLEDGE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF UNLEASHING INDESCRIBABLE HORRORS THAT SHATTER YOUR PSYCHE AND SET YOUR MIND ADRIFT IN THE UNKNOWABLY INFINITE COSMOS.

I know it's a joke, and I laughed when I first read it. Unsafe is dangerous and requires care, but unsafe == evil is definitely not true. Jon Gjengset said it well in "Demystifying Unsafe Code" [0]:

> One thing I keep observing about the Rust community is how we're either allergic to unsafe code or we think that unsafe code is totally fine and nothing to worry about, and I wanted to take a little bit of time to talk about what unsafe is, because I think a lot of the community's lack of alignment on this particular topic comes from the fact that unsafe is a bit of a mystery to many of us

Learning about unsafe is useful, and at times necessary; it's not inherently evil, but neither should it be used too freely.

[0]: https://www.youtube.com/watch?v=QAz-maaH0KM

I'm no stranger to unsafe code, but in my professional work I avoid it at almost any cost. In my experience, the common denominator among average engineers is an aversion to the kind of processes that de-risk unsafe code, where average == "I can reliably find at least two engineers at this skill level within the next few months at competitive but not FAANG-level salaries". This is why Rust is so successful in the first place: it's a low-level language disguised as a high-level language (or the other way around, depending on your perspective), where you don't have to worry about fuzzing or Valgrind. That `unsafe` is avoided like the plague is a great feature of the Rust ecosystem and, IMO, an achievement in its own right.

I've actually found the reverse problem to be even more detrimental: people coming from high-level languages like Python get tripped up because they try to go too low level to make up for tradeoffs they no longer have to make. Rust makes it painfully obvious when you wrap a shared value in Rc+RefCell/Arc+Mutex or clone it, so those look like antipatterns to newbies. Even though most of them come from languages where everything is implicitly reference counted and performance is abysmal, they worry about incrementing an atomic integer or acquiring a lock in a program that will only ever run on x64. I can't imagine what they'd come up with using `unsafe` if the community didn't literally refer to it as the dark arts.
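
To make that concrete, here's a minimal sketch of the pattern that spooks newcomers: a hypothetical shared counter behind Arc+Mutex (Rc+RefCell would be the single-threaded analogue). Every refcount bump and every lock acquisition is spelled out in the source, which reads as expensive even though the same bookkeeping happens invisibly on every object in Python:

```rust
use std::sync::{Arc, Mutex};
use std::thread;

fn main() {
    // Shared ownership (Arc) and synchronized mutation (Mutex) are explicit.
    let counter = Arc::new(Mutex::new(0u64));

    let handles: Vec<_> = (0..4)
        .map(|_| {
            // Visible refcount bump; in Python this happens implicitly.
            let counter = Arc::clone(&counter);
            thread::spawn(move || {
                // Visible lock acquisition; also implicit in GIL-style runtimes.
                *counter.lock().unwrap() += 1;
            })
        })
        .collect();

    for handle in handles {
        handle.join().unwrap();
    }

    println!("count = {}", *counter.lock().unwrap());
}
```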

I've found I constantly want to build on top of an existing C or C++ library, so I need unsafe. I'm much less experienced than the people whose code I'm using, and I'm generally fine with "Rust protects against my mistakes; if the library authors screwed up, then bad things happen".

There are tools to automatically generate "safe" C++-Rust bindings (the intent being to restrict unsafety to trusting C++ code to be sound, without the risk of screwing up the bindings): https://github.com/dtolnay/cxx and https://cxx.rs/.
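
For a sense of what that looks like, here's a minimal sketch of a cxx bridge declaration, loosely modeled on the cxx tutorial. The header path, the BlobstoreClient type, and the function signatures are hypothetical, and a real project also needs the C++ implementation plus a build.rs using cxx-build:

```rust
// Hypothetical bridge: declares which C++ items are visible to Rust.
#[cxx::bridge]
mod ffi {
    // `unsafe extern "C++"` is the programmer asserting that these
    // signatures accurately describe the C++ side, so calls from Rust
    // don't need per-call `unsafe` blocks.
    unsafe extern "C++" {
        include!("example/include/blobstore.h"); // assumed header

        type BlobstoreClient;

        fn new_blobstore_client() -> UniquePtr<BlobstoreClient>;
        fn put(self: &BlobstoreClient, data: &[u8]) -> u64;
    }
}

fn main() {
    // UniquePtr<BlobstoreClient> derefs to BlobstoreClient, so the
    // method is called directly; note there is no `unsafe` at the call site.
    let client = ffi::new_blobstore_client();
    let id = client.put(b"hello world");
    println!("stored blob {id}");
}
```

The design point is that the "trusted" surface shrinks to the bridge module itself: if those declarations match the C++ header, the rest of the Rust code stays in safe territory.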

However, creating bindings without typing out `unsafe` was a controversial issue, discussed at https://www.reddit.com/r/rust/comments/ielvxu/the_cxx_debate... .

There are also tools that automatically parse C++ headers and turn them into cxx-style bindings: https://github.com/google/autocxx and https://docs.rs/autocxx/.
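
As a rough illustration of the developer experience (a sketch from memory of the autocxx demo, so treat the specifics as assumptions: the header name, the DoMath function, and the autocxx-build setup in build.rs are all placeholders):

```rust
// Hypothetical autocxx usage: point the macro at a header and name the
// items you want bindings for, instead of writing the bridge by hand.
use autocxx::prelude::*;

include_cpp! {
    #include "input.h"        // assumed header exposing uint32_t DoMath(uint32_t)
    safety!(unsafe_ffi)        // assert the C++ side is sound to call safely
    generate!("DoMath")        // generate bindings for this function
}

fn main() {
    // Generated bindings land in an `ffi` module.
    println!("12 = {}", ffi::DoMath(4));
}
```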