I've always had trouble getting `for` loops to work predictably, so my common loop pattern is this:
grep -l -r pattern /path/to/files | while read -r x; do echo "$x"; done
or the like. This uses bash `read` to pull in the input one line at a time, and each line is then available in the loop as `$x`. It's pipe friendly and doesn't require futzing around with arrays or the like. (One caveat: in bash the loop body on the right-hand side of a pipe runs in a subshell, so variables you set inside it won't be visible after the loop.)
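To make that caveat concrete, here's a quick sketch with a throwaway `count` variable:

count=0
grep -l -r pattern /path/to/files | while read -r x; do
    count=$((count + 1))
done
echo "$count"    # prints 0 in bash: the loop body ran in a subshell, so the increment was lost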
One place I do use bash for loops is when iterating over args, e.g. if you create a bash function:
function my_func() {
    for arg; do
        echo "$arg"
    done
}
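For example (a hypothetical invocation):

my_func "one two" three

prints:

one two
three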
This'll take a list of arguments and echo each on a separate line. Useful if you need a function that does some operation against a list of files, for example.

Also, bash parameter expansions (https://www.gnu.org/software/bash/manual/html_node/Shell-Par...) can save you a ton of time for various common operations on variables.
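For instance (just a quick sketch with throwaway variables):

f=/path/to/report.txt
echo "${f##*/}"              # report.txt            (strip the directory)
echo "${f%.txt}"             # /path/to/report       (strip the extension)
echo "${f/report/summary}"   # /path/to/summary.txt  (substitute text)
echo "${name:-fallback}"     # prints "fallback" if $name is unset or empty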
`for` loops were exactly the pain point that led me to write my own shell more than 6 years ago.
I can now iterate through structured data (be it JSON, YAML, CSV, `ps` output, log file entries, or whatever) and each item is pulled intelligently, rather than having to consciously consider a tonne of dumb edge cases like "what if my file names have spaces in them?"
e.g.
» open https://api.github.com/repos/lmorg/murex/issues -> foreach issue { out "$issue[number]: $issue[title]" }
380: Fail if variable is missing
379: Backslashes and code comments
378: Improve testing facility documentation
377: v2.4 release
361: Deprecate `swivel-table` and `swivel-datatype`
360: `sort` converts everything to a string
340: `append` and `prepend` should `ReadArrayWithType`
GitHub repo: https://github.com/lmorg/murex

Docs on `foreach`: https://murex.rocks/docs/commands/foreach.html