>However, if you instead organize all your data in a format that's sympathetic to line-oriented processing on stdin/stdout, the shell will work with you instead of against you.

Not even that is necessary. Just use structured data formats like JSON. If you're consuming an API that isn't JSON but is still structured, use `rq` to convert it to JSON, then use `jq` to slice and dice the data.
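For example (a sketch; as I read the rq README, lowercase flags select the input format and uppercase the output, so `-tJ` should mean TOML in, JSON out):

```sh
# TOML in, JSON out via rq (flags per my reading of its README), then query with jq
rq -tJ < Cargo.toml | jq -r '.package.name'
```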

dmenu + fzf + jq + curl is my bread and butter in shell scripts.
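A toy example of how they combine (the URL and the `.name` field are hypothetical stand-ins):

```sh
# Fetch a JSON list, flatten to one name per line, pick one interactively.
choice=$(curl -s https://api.example.com/items | jq -r '.[].name' | fzf)
echo "picked: $choice"
```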

However, I still haven't found a way to run a bunch of tasks concurrently. No, xargs and parallel don't cut it. Just give me an opinionated way to do this that is easily inspectable, loggable, and debuggable. Currently I hack together functions in `((job_i++ < max_jobs)) || wait -n` spaghetti.
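Roughly, the whole mess looks like this (a sketch; needs bash >= 4.3 for `wait -n`, and the per-job log files are my own addition for inspectability):

```sh
#!/usr/bin/env bash
# Sketch of the wait -n throttle; tasks come from argv, log paths are illustrative.
max_jobs=4
job_i=0
mkdir -p logs
for task in "$@"; do
  # Once max_jobs jobs are in flight, block until any one exits (bash >= 4.3).
  (( job_i++ < max_jobs )) || wait -n
  # Background each task with its own log so failures stay debuggable.
  "$task" >"logs/job-${job_i}.log" 2>&1 &
done
wait  # collect the stragglers
```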

I'm new to `rq`, and it's not in active development. Any other alternatives? It also seems to do a lot more than convert structured data to JSON.

Not sure what more it's doing... I'm referring to this rq: https://github.com/dflemstr/rq#format-support-status

It converts to/from the listed formats.

There is also `jc` (written in Python), with the added benefit that it converts the output of many common Unix utilities to JSON. So you would not need to hand-parse `ip` output, for example.

https://github.com/kellyjonbrazil/jc#parsers
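A quick sketch (the `--ls` parser is on the linked list; the `filename` and `size` field names are from memory, so double-check them against the docs):

```sh
# List files over 1 MB without hand-parsing ls output.
ls -l /usr/bin | jc --ls | jq -r '.[] | select(.size > 1048576) | .filename'
```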

Also look at `yq` - https://github.com/mikefarah/yq

This is a standalone tool with jq-like syntax that works on YAML as well as JSON, XML, and other file formats.
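For instance (yq v4 syntax; the file and path are made up):

```sh
# Pull a value out of a YAML file with a jq-style path expression
yq '.spec.containers[0].image' deployment.yaml
```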