This has been my experience as well. I have a couple of Clojure(Script) applications that are approximately 3-5K lines each, and those have been a pleasure to work on. However, my latest project is now pushing past 20K lines, and the mental load has jumped dramatically. There is a much greater need for spec, asserts, type hints, and comments just to keep everything straight.

That's interesting. Do you have any idea why the mental load jumped? Would a static analysis tool working with your type hints help?

Or are there many things to consider at once in the system, rather than many individual things?

The biggest issue I keep running into is the concept of data "shape". I love that Clojure gives you so much freedom, but it can be quite the footgun in a large system: you see that a function expects a map with keys :foo, :bar, and :baz, but what are the values for those keys? Spec helps a little here for primitive values, but for complex nested structures (e.g. {:a [{:b [1 2]} {:c "bar"}]}) it doesn't do much. So, as data moves through the system and the system grows, it has become increasingly difficult to track the mutations to the underlying structures.
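For context, a nested structure like the one above *can* be specced, but it takes one named spec per level, which is part of the overhead being described. A minimal sketch (all spec names here are hypothetical):

```clojure
(require '[clojure.spec.alpha :as s])

;; one spec per level of nesting
(s/def ::b (s/coll-of int? :kind vector?))
(s/def ::c string?)
(s/def ::node (s/keys :opt-un [::b ::c]))      ; maps like {:b [1 2]} or {:c "bar"}
(s/def ::a (s/coll-of ::node :kind vector?))
(s/def ::payload (s/keys :req-un [::a]))

(s/valid? ::payload {:a [{:b [1 2]} {:c "bar"}]})
;; => true
```

Five spec registrations for a two-level structure: workable, but the bookkeeping scales with nesting depth.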

I do think a static analysis tool would be of some help. Some sort of tooling to better handle tree structures would be very handy. I often find myself off by one level in get-in calls on tree data (e.g. (get-in m [:a :b :d]) where m is {:a {:b {:c {:d 1}}}}), which is annoying because the NPE gets thrown three function calls up the stack.
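To illustrate the failure mode: the off-by-one get-in silently returns nil, and the exception only appears wherever that nil is finally used. One hedged workaround (get-in! is a hypothetical helper, not a core function) is to fail fast at the call site:

```clojure
(def m {:a {:b {:c {:d 1}}}})

(get-in m [:a :b :d])     ;; => nil  (off by one; :d lives under :c)
(get-in m [:a :b :c :d])  ;; => 1

;; hypothetical helper: assert at the lookup instead of NPE-ing upstream
(defn get-in! [m ks]
  (let [v (get-in m ks ::missing)]  ; sentinel distinguishes "absent" from a stored nil
    (assert (not= v ::missing) (str "no value at path " ks))
    v))

(get-in! m [:a :b :c :d])  ;; => 1
;; (get-in! m [:a :b :d]) would throw AssertionError here, at the bad lookup
```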

Totally not saying "you're holding it wrong", but maybe once you're more than a couple of levels deep into a nested map it's time to look at an in-memory db like [Datascript](https://github.com/tonsky/datascript)? Actual Datalog queries can replace get-in vectors that grow out of hand, and you also get better-controlled mutations via transactions.
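A minimal sketch of what that looks like (attribute names here are made up; the API calls are standard Datascript):

```clojure
(require '[datascript.core :as d])

(def conn (d/create-conn {}))

;; a transaction replaces ad-hoc assoc-in/update-in on a nested map
(d/transact! conn [{:db/id -1 :item/name "foo" :item/count 2}])

;; a Datalog query replaces a brittle get-in path vector
(d/q '[:find ?count .
       :where
       [?e :item/name "foo"]
       [?e :item/count ?count]]
     @conn)
;; => 2
```

A query on a missing attribute just returns nothing rather than handing you a nil that blows up three frames later, and every change goes through transact!, so there's one place to watch data evolve.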