I can't help but feel this problem is indicative of incidental complexity in how we develop web applications. Not saying the PHP glory days were better, but there's something to be said for removing the layers of abstraction between the data and the presentation. Make the database query using SQL directly, and then inject the results into the HTML template to be delivered to the browser. Obviously, there were many issues here, like how easy it was to leave applications open to SQL injection attacks.

But it has been interesting to see the tide turn back towards server-side rendering, relying on partial DOM replacement for client-side updates. For web apps that don't have a massive number of UI states (unlike, say, a document editor), it seems like people are rethinking the wisdom of thick client-side JavaScript applications, which seem to be one of the main motivators for REST API layers and for the need to efficiently fulfill N+1 queries.

That said, I do remember dealing with the N+1 problem when building Django server-side apps more than a decade ago, before the dominance of client-side apps. I guess it was more the rise of MVC architecture and the active record pattern (https://en.wikipedia.org/wiki/Active_record_pattern) that brought on the N+1 problem than client-side apps did.
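
For anyone who hasn't hit it, the classic active-record-style N+1 looks roughly like this (a minimal ActiveRecord sketch; the Post/Author models are made up for illustration):

    # Hypothetical models: Post belongs_to :author, Author has_many :posts.

    # N+1: one query to load the posts, then one extra query per post
    # when each post's author is touched inside the loop.
    Post.all.each do |post|
      puts post.author.name
    end

    # Eager loading collapses this to two queries (posts, then their authors).
    Post.includes(:author).each do |post|
      puts post.author.name
    end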

There's a Ruby gem called Bullet that identifies N+1 problems and warns developers about them. You can also configure it to fail your tests when one is detected.
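
If I remember the README correctly, wiring it up looks roughly like this (double-check the exact options against the repo linked below):

    # config/environments/development.rb -- warn while developing
    config.after_initialize do
      Bullet.enable        = true
      Bullet.bullet_logger = true   # write detections to log/bullet.log
      Bullet.rails_logger  = true   # also write to the Rails log
    end

    # config/environments/test.rb -- turn detections into test failures
    config.after_initialize do
      Bullet.enable = true
      Bullet.raise  = true          # raise an error when an N+1 query is detected
    end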

I don't know if the approach is possible with every ORM or if it's just leveraging some Ruby perks, but I can't think of a good reason why you wouldn't use the equivalent everywhere.

https://github.com/flyerhzm/bullet