I am ... a bit confused by Nim. Nim on the surface has a dream set of features: Python-like syntax, strong types, compilation to C, good garbage collection, good metaprogramming...

OTOH there seems to be a rapid accumulation of features. The language wants to implement the "next shiny thing" (e.g. ownership à la Rust), or at least that is how it appears to me from the outside.

For those people using Nim -- how do you cope with the ever increasing scope of the language? Isn't there a risk that things won't be well baked, or will end up abandoned?

I like the philosophy of evolving your language slowly and consolidating before adding newer and newer features.

Do you think the pace of change in Nim is OK?

> Nim on the surface has a dream set of features

The "too good to be true" feeling is for real. The lack of friction when developing in Nim is what makes it so fun to code in (IMO). Metaprogramming in particular is just great in Nim and gives you a lot of scope for doing things that would otherwise require new language features.
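To illustrate the point with a toy sketch of my own (not from any particular library): a looping construct that many languages ship as dedicated syntax is just a short user-level template in Nim.

```nim
# A `times` loop as a plain template; a hypothetical example,
# not part of the stdlib. The indented block is passed in as `body`.
template times(n: int, body: untyped) =
  for _ in 1..n:
    body

var total = 0
3.times:
  total += 2
doAssert total == 6
```

Because templates expand at the call site, constructs like this cost nothing at runtime compared to writing the loop by hand.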

> how do you cope with the ever increasing scope of the language?

I've been using Nim pretty intensively for several years. From my perspective the language design has been very stable, even since before 1.0. I don't feel like the scope is expanding, and I don't think it really needs to either because the core language is built to be expanded with metaprogramming.

None of my code breaks when updating versions (and I do a lot of heavy metaprogramming). Nothing seems to be abandoned, only refined, and there's a big effort towards backwards compatibility, so breaking changes are exceedingly rare. The only one I remember was some time ago, when seq (the variable-sized list type à la C++ vector) became non-nil, so I had to replace `if list != nil` with `if list.len > 0`, a good change IMO.
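For reference, a minimal sketch of the post-change idiom (seqs now default to empty rather than nil):

```nim
var list: seq[int]    # defaults to an empty seq, never nil

# The old idiom was `if list != nil: ...`; emptiness is now
# checked by length instead.
if list.len > 0:
  echo "has elements"

list.add 42
doAssert list.len == 1
```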

Most of the new things have been mechanical, under-the-hood improvements, and ownership is one of those. As I understand it, when using GC'd refs you'll get extra speed for free, better multithreading, and better compile-time checks. No code needs to be changed; when it's considered ready it will replace the previous GC, and you can add extra annotations for performance. It's a refinement of the GC using ownership rules, rather than a whole new set of rules for your program.
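To make the "no code needs to be changed" point concrete, here's a hedged sketch: ordinary ref-based code like the below compiles unchanged whichever memory-management scheme is selected on the command line (e.g. `nim c --gc:orc app.nim` on compilers of that era; newer releases spell the switch `--mm:orc`).

```nim
# Plain ref-object code; nothing here is specific to one GC scheme.
type
  Node = ref object
    value: int
    next: Node

var head = Node(value: 1, next: Node(value: 2))
doAssert head.next.value == 2
```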

The stdlib is very stable (at least in my experience), and again great effort is spent on backwards compatibility. The stdlib's focus is on being small and "essential", so there's a high bar for anything getting into it, which is probably why its APIs rarely change.

So personally, the pace of change has been great: basically my code just gets free speed boosts when it's already fast enough :)

> Metaprogramming in particular is just great in Nim and gives you a lot of scope for doing things that would otherwise require new language features.

Metaprogramming is very alluring to me, but I've been wondering: for people who write code that's super heavy on metaprogramming, doesn't revisiting and debugging that code become much more taxing?

I have written a lot of metaprogramming myself; something like 5/10 of my repos are macro libraries. And I share your concern. Even the manual advises people to use the least "powerful" tool for the job, and most of the time that's plain procs/funcs. Macros are preferred for creating DSLs like https://github.com/treeform/fidget or https://github.com/pragmagic/karax

Otherwise the advice is to use them when they "pull their weight". The reason is that they're a lot harder to write, and even "designing" them right (how they function, what code they generate, etc.) takes a lot of effort. In most cases the simpler "tools" will fit your needs.
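A toy example of that "least powerful tool" ladder (the name `debugLog` is mine, nothing standard): a debug-print helper needs no macro at all, because a template plus the built-in `astToStr` already captures the expression's source text.

```nim
# Expands at the call site; compiled out entirely in release builds.
template debugLog(x: untyped) =
  when not defined(release):
    echo astToStr(x), " = ", x

let answer = 21 * 2
debugLog(answer)   # prints: answer = 42
```

Reaching for a full `macro` here would buy nothing except a harder-to-debug AST transformation.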