sounds like the same experience I have with pretty much every language I'm not used to.

I just want to try this C++ project: download, unzip... oh, it's Windows, so it's a .project file. Fine, redo on Windows. Oh, it's three versions of Visual Studio old and wants to upgrade. Okay. Hmm, errors. Try to fix. Now I'm getting linker errors.

Repeat the same with Xcode and any project in C, C++, Obj-C, or Swift.

Okay, how about Ruby? Oh, I have an old Ruby. Hmm, try to install a new Ruby. Seems to run, but it can't find certain gems or something. Oh, and this other Ruby thing I was using is now broken? Why do I have to install this stuff globally? You don't, but there are several magic spells you must execute and runes you must set in the right places. Oh... I think I got that set up, but now I'm getting this new error.

....

I'm sympathetic, it sucks, but it's not unique to JS.

Meanwhile I've been trying to warn the Rust project team, or more specifically the Cargo/crates.io teams, that they're cheerfully skipping down the easy/happy path towards the same mess.

It's inevitable that the conclusion will be the same if the same steps are taken in the same direction. The counter-arguments all rest on special cases: "most of us are from Mozilla and we all get along[1]", "no-one has attacked us yet so no-one ever will[2]", "it's tractable to untangle the dependency hell in your head because there aren't that many crates (yet)", and of course "we all know this history like the back of our hand, because we've all been around since v0.1 alpha."

I'm picturing "10 years later", when some junior developer at MegaCorp Inc desperately tries to figure out why the legacy Rust 2018 version he's forced to use in 2032 can't compile some privately hosted crate that's needed for an enterprise project. It'll probably depend on a specific version of a crate that was yanked because some transitive dependency had to be yanked, because its author was a Russian kid who sold access to that popular crate for $500 to Israeli spies. Of course the patched crate now requires Rust 2030 or whatever, breaking fifty-seven of the two thousand transitive crates being pulled in, so good luck to him with disentangling that mess and making it work.

PS: I wish I was making this story up, but I basically just changed Angular to Rust for a real thing that happened to me, like... a month ago. There is no fundamental difference to how Rust does packages, it's just younger so the mess hasn't accumulated to Angular's levels of madness. Give it time...

[1] Except for that spat with the core team.

[2] That we know of. Ignore the guy spamming popular, single-word crate names and shovelling in 99.9% C/C++ code with the thinnest possible Rust wrapper. Surprise! Your code is now more C++ than Rust.

You've obviously thought about this quite a bit... Do you have any ideas as to how projects could avoid this problem?

I have, but to be honest I've forgotten a lot of the specific debates and their finer points.

I don't have a single, definitive, clear solution -- as pointed out by others -- nobody does. It's not a simple problem.

That doesn't mean that steps can't be taken to improve the situation, perhaps dramatically in some cases.

1) Enforced MFA to publish a crate -- credential theft is semi-regularly seen as an attack vector.

2) Strong links between the "source ref" and the specific crate versions. An example of this done super badly is NuGet. All of the hundreds (thousands?) of Microsoft ASP.NET packages point to the same top-level asp.net or .net framework URLs. E.g.:

https://www.nuget.org/packages/Microsoft.Extensions.Configur...

Links to "https://dot.net" as the Project Website, and "https://github.com/dotnet/runtime" as the repository. This couldn't be more useless. Where is the Git hash for the specific change that "7.0.0-preview.4.22229.4" of this library represents? Who knows...
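For contrast, here's a sketch of what a strong link could look like in published package metadata. The `repository-rev` field is hypothetical; no registry I know of enforces anything like it today (Cargo can pin git *dependencies* by `rev`, but registry-published versions carry no verified commit):

```toml
[package]
name = "some-lib"          # hypothetical package, for illustration only
version = "7.0.0-preview.4"
repository = "https://github.com/example/some-lib"
# Hypothetical field: the exact commit this published version was built
# from, checked by the registry at publish time rather than self-reported.
repository-rev = "<commit hash of the tagged release>"
```

The key property is that the registry verifies the claim at publish time, so "which source produced this binary artifact?" has a mechanical answer instead of a dead link to a monorepo's front page.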

3) Namespaces. They're literally just folders. If you can't code this, don't run a huge public website. This is more important than it sounds, because wildly unrelated codebases might have very similar names, and it's all too easy to accidentally drag in entire "ecosystems" of packages. Think of the Apache Project. It's fine and all if you've "bought in" to the Apache way of doing... everything. But imagine accidentally importing some Google thing, some Netflix thing, some Apache thing, and some Microsoft thing into the same project. Now your 2 KLOC EXE is 574 megabytes and requires 'cc', 'python', and 'pwsh' to build. Awesome.
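To make the "just folders" point concrete, here's a toy sketch (all names made up) of how a namespaced registry would key and store packages so that unrelated projects with similar names can't collide:

```rust
// Toy sketch: a namespaced package identity is a (namespace, name) pair,
// and its storage location is, literally, a folder per namespace.
#[derive(Debug, PartialEq, Eq, Hash)]
struct PackageId {
    namespace: String,
    name: String,
}

fn storage_path(id: &PackageId) -> String {
    // Two packages both called "utils" under different namespaces map to
    // distinct paths; the flat global name land-grab disappears.
    format!("crates/{}/{}", id.namespace, id.name)
}
```

That's the entire technical difficulty of the feature; everything else is policy.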

For example, in ASP.NET projects I only use packages officially published by Microsoft with at least 10M downloads, because anything else is guaranteed to be a disaster in 5-10 years. Ecosystems diverge, wildly, and no single programmer or even a small group could possibly stitch them back together again. Either it's a dead end with no further upgrades, or it's rip-and-replace for an entire stack of deeply integrated things.

4) Publisher-verified crate metadata / tags. You just cannot rely on the authors to be honest. It's not even about attacks, it's also about consistency and quality. All crates should be compiled by the hosting provider in isolated Docker containers or VMs using a special "instrumented build" flag. Every transitive dependency should be indexed. Platform compatibility should be verified. Toolchain version compatibility should be established for both the min and max of the range. Flags like "no-std" or whatever should be automatically checked. CPU and platform compatibility would also be very helpful for a lot of users. The most important tag in the Rust world would be "No unsafe code".

This would stop "soft attacks" such as the guy spamming C++ libraries as Rust crates. Every such crate should have been automatically labelled as: "Requires CC" and "Less than 10% Rust code".
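A toy sketch of that labelling pass, with made-up tag names and thresholds. (Deliberately naive: a real instrumented build would parse the code and inspect the actual build graph, not substring-match and count lines.)

```rust
// Given (file name, file contents) pairs from a crate's source tree,
// emit the registry tags a verified build could attach automatically.
fn tags_for_crate(files: &[(&str, &str)]) -> Vec<String> {
    let mut tags = Vec::new();
    let mut rust_lines = 0usize;
    let mut c_lines = 0usize;
    let mut has_unsafe = false;

    for (name, contents) in files {
        let lines = contents.lines().count();
        if name.ends_with(".rs") {
            rust_lines += lines;
            // Naive: a real check would use the parser, not a substring.
            if contents.contains("unsafe") {
                has_unsafe = true;
            }
        } else if name.ends_with(".c")
            || name.ends_with(".cc")
            || name.ends_with(".cpp")
            || name.ends_with(".h")
        {
            c_lines += lines;
        }
    }

    if c_lines > 0 {
        tags.push("Requires CC".to_string());
    }
    let total = rust_lines + c_lines;
    if total > 0 && rust_lines * 10 < total {
        tags.push("Less than 10% Rust code".to_string());
    }
    if !has_unsafe {
        tags.push("No unsafe code".to_string());
    }
    tags
}
```

Because the registry computes this from the build itself, a thin Rust wrapper over a C++ blob gets labelled no matter what the author writes in the README.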

Similarly, if a crate/package changes its public signature in a breaking way, then the publishing system should enforce the right type of semantic versioning bump.
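The bump rule itself is mechanical and easy to state; here's a toy sketch (enum and function names are made up). Client-side tooling like `cargo-semver-checks` already approximates the detection half; the missing piece is running it server-side as a publish gate:

```rust
// Toy sketch of the enforcement rule a registry could apply at publish
// time, given an automated diff of the crate's public API.
#[derive(Debug, PartialEq)]
enum Bump {
    Major, // breaking change to the public API
    Minor, // new public API, backwards compatible
    Patch, // no public API change
}

fn required_bump(breaking_api_change: bool, new_public_api: bool) -> Bump {
    if breaking_api_change {
        Bump::Major
    } else if new_public_api {
        Bump::Minor
    } else {
        Bump::Patch
    }
}
```

A real check would also have to account for Cargo's pre-1.0 convention, where the leftmost non-zero version component effectively acts as the major version.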

Essentially, what I would like to see is something more akin to a monorepo, but not technically a single repository. That is, a bunch of independent developers doing their own thing, but with a cloud-hosted central set of tooling that helps gain the same benefits as a monorepo.

I'm expecting a lot of arguments along the lines of "that sounds like a lot of work, etc..." Meanwhile Mozilla had a large team for this, millions of dollars of funding, and did not do even 0.1% of what Matt Godbolt did in his spare time...