Cargo in terms of ease-of-use but not in terms of scale. You only get the packages that Meson (or the community?) has written build scripts for. By my count that's ~144 packages [1]. That's nowhere near as pervasive as Cargo, given that basically any published Rust project will use Cargo, whereas C/C++ has a mix of CMake & autoconf (largely standard for OSS) & Buck/Bazel for Facebook/Google-influenced (potentially multi-language) codebases. That's also not counting all the IDE-specific or platform-specific workspaces (Visual Studio, Xcode, Gradle NDK builds, etc).

It's a nice thought, but you have to be confident that every 3rd-party project you use is one whose parallel build you'll keep maintaining as it updates. That's the strength of Cargo - it's the way to write Rust code, & the maintenance is done by upstream for everyone.

Additionally, Cargo's strength isn't just its packaging & publishing system. Having a single Rust compiler that everyone uses on all platforms is powerful. C++ might get there eventually with clang, but the ecosystem of libraries would need to support that as a baseline. Currently there are too many compiler- or platform-specific hacks being done to regularize the environment a bit, whereas Rust generally does all of that under the hood for you (e.g. try writing any cross-platform BSD socket code - it's a total shitshow, since Winsock & the various POSIX OSes effectively give you different net stacks). Rust also comes with a very deep standard library for I/O, path manipulation, etc, whereas the C++ equivalents for all of this are still surprisingly lacking even in C++20 (especially on the network side).
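To illustrate the point (my own sketch, not from the comment above): the snippet below round-trips a message over a loopback TCP socket using only `std::net`. The same code compiles unchanged on Windows, Linux, and macOS - Winsock startup, fd-vs-SOCKET handle differences, and close() vs closesocket() are all handled inside the standard library.

```rust
use std::io::{Read, Write};
use std::net::{TcpListener, TcpStream};

// Round-trips a message over a loopback TCP socket using only the
// standard library. No WSAStartup, no #ifdef _WIN32, no manual
// socket()/close() pairing - std::net abstracts all of that.
fn ping_roundtrip(msg: &[u8]) -> std::io::Result<Vec<u8>> {
    // Port 0 asks the OS to pick a free port for us.
    let listener = TcpListener::bind("127.0.0.1:0")?;
    let addr = listener.local_addr()?;

    // connect() completes at the kernel level once the listener's
    // backlog accepts the handshake, so we can write before accept().
    let mut client = TcpStream::connect(addr)?;
    client.write_all(msg)?;

    let (mut server_side, _) = listener.accept()?;
    let mut buf = vec![0u8; msg.len()];
    server_side.read_exact(&mut buf)?;
    Ok(buf)
}

fn main() -> std::io::Result<()> {
    let echoed = ping_roundtrip(b"ping")?;
    assert_eq!(echoed, b"ping");
    println!("round-trip ok");
    Ok(())
}
```

The C/C++ equivalent needs per-platform initialization and teardown before you even get to portable `send`/`recv` calls, which is exactly the kind of regularization hack the paragraph above is talking about.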

[1] Based on manually counting the packages listed here https://wrapdb.mesonbuild.com/

Having multiple compiler implementations is a strength, not a weakness. As long as Rust is not standardized and is documented only by the implementation's code, it will never be a serious language.

What is a serious language? Is being used by companies like Amazon to deliver runtimes that serve a considerable share of the internet not serious?

No, "used for pet project by mismanaged megacorp" is not a sign of seriousness. (I'm sure there's some team inside FAANG that's using Brainfuck in production right now.)

"The Rust development team isn't interested in supporting your obscure embedded architecture" or "sorry, this platform was phased out due to technical debt" is what I mean by not serious.

Look at Python and how any attempts to make alternative interpreter implementations ultimately failed. (And they at least have PEPs.)

I'd hardly call, for example, Firecracker a "pet project by mismanaged megacorp". It's running production workloads - and at AWS scale, I'm fairly certain it's running more work than most of us here have touched in our entire careers. https://github.com/firecracker-microvm/firecracker

> And they at least have PEPs.

Rust's language development has been guided by RFCs essentially since the beginning. There's no comprehensive standard in the sense of "somebody went and wrote down the current state", but it's also not the case that there's no written spec of how things should behave. You could base a new compiler on that, and there are actually efforts underway to give rustc different backends - Cranelift, for example: https://github.com/bytecodealliance/wasmtime/blob/main/crane...

> "The Rust development team isn't interested in supporting your obscure embedded architecture" or "sorry, this platform was phased out due to technical debt" is what I mean by not serious.

So any compiler team that stops supporting an architecture because it's either rare or no longer worth the effort is not serious, in your opinion? Like GCC, which has a decision-making matrix for exactly that (https://gcc.gnu.org/backends.html) and has dropped various architectures along the way?