> You will not need to open an editor, write a long script, and then have an interpreter like Node.js run it; sometimes the bash command line is all you need!
This is a bad attitude to have for data processing, where QA is necessary and the accuracy of the output matters. A 50-LOC scraper with comments and explicitly defined function inputs and outputs is far preferable to an 8-LOC scraper that anyone without bash knowledge will be unable to parse.
And the 8-LOC bash script isn't much of a time savings, as this post demonstrates; you still have to check each function's output manually to figure out which data to parse and to handle edge cases.
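To make the contrast concrete, here is a hypothetical sketch (the function names `extract_prices` and `strip_currency` and the sample HTML are mine, not from the post): the same extraction written as an opaque one-liner and as small named functions whose inputs and outputs can be checked one stage at a time.

```shell
html='<li class="price">$19.99</li><li class="price">$4.50</li>'

# Terse version: works, but every stage is implicit.
echo "$html" | grep -o '\$[0-9.]*' | tr -d '$'

# Documented version: each step has a name, one job, and a checkable output.
extract_prices() {            # stdin: HTML, stdout: one $-prefixed price per line
    grep -o '\$[0-9.]*'
}
strip_currency() {            # stdin: $-prefixed prices, stdout: bare numbers
    tr -d '$'
}
echo "$html" | extract_prices | strip_currency   # prints 19.99 then 4.50
```

The documented version is longer, but QA becomes trivial: run the pipeline up to any `|` and inspect that stage's output in isolation.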
I've used Makefiles to coordinate a zoo of perl/bash scripts before. It's pretty effective.
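A minimal sketch of that pattern, with placeholder scripts of my own invention (`fetch.sh` and `clean.sh` stand in for a real fetch step and a perl cleaner): make tracks the dependencies, so only stages whose inputs changed get rerun.

```shell
mkdir -p demo && cd demo

cat > fetch.sh <<'EOF'
#!/bin/sh
# Pretend to fetch data; in practice this would be curl/wget.
printf 'alice 3\nbob 7\n' > raw.txt
EOF

cat > clean.sh <<'EOF'
#!/bin/sh
# Keep rows whose count exceeds 5.
awk '$2 > 5' raw.txt > clean.txt
EOF
chmod +x fetch.sh clean.sh

cat > Makefile <<'EOF'
all: clean.txt

raw.txt: fetch.sh
	./fetch.sh

clean.txt: raw.txt clean.sh
	./clean.sh
EOF

make            # builds raw.txt, then clean.txt
cat clean.txt   # prints: bob 7
```

Editing `clean.sh` and rerunning `make` rebuilds only `clean.txt`; the fetch step is skipped because `raw.txt` is still up to date.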
I recommend mk (the original make replacement for Plan 9, available for other OSes through Plan9Port [0,1]). It has a somewhat more uniform syntax, and it can check that the dependency graph is well-founded.
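For flavor, the same two-stage pipeline might look like this as an mkfile sketch (the file names are hypothetical; `$target` and `$prereq` are mk's built-in variables, and `:V:` marks a virtual target):

```
all:V: clean.txt

raw.txt: fetch.sh
	./fetch.sh

clean.txt: raw.txt
	awk '$2 > 5' $prereq > $target
```

Using `$prereq` and `$target` instead of make's cryptic `$<` and `$@` is one example of the more uniform syntax.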