I've been burned by so many build tools over the years. I've finally settled (for C/++/asm) on the combination of Make + ccache: I build a _very_ paranoid Makefile that recompiles everything if anything looks like it changed. For instance, every rule that compiles a C/++ file is re-run if _any_ header/inc/template file changes. I let ccache do the precise timestamp/checksum-based analysis. The result is that (for large builds, up to ~10 MLOC) I rarely wait more than a few hundred milliseconds on an incremental build, _and_ I have confidence that I never miscompile.
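Roughly, the shape of that kind of paranoid Makefile looks like this (a sketch with illustrative names and globs, not anyone's real build):

```make
# Paranoid sketch: every object depends on *every* header in the tree,
# plus the Makefile itself, so any header or flag change re-runs every
# compile rule. ccache then short-circuits the compiles whose
# preprocessed input didn't actually change.
CC   := ccache gcc
HDRS := $(shell find . -name '*.h' -o -name '*.inc')
SRCS := $(wildcard *.c)
OBJS := $(SRCS:.c=.o)

app: $(OBJS)
	$(CC) -o $@ $^

%.o: %.c $(HDRS) Makefile
	$(CC) -c -o $@ $<
```

Listing the Makefile itself as a prerequisite also catches flag changes; compiler upgrades become ccache's problem, since it checks the compiler binary as part of its hash (tunable via its compiler_check setting).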
I just wish I had a high-performance, cross-platform replacement for linking (with a deterministic mode for ar), and something equivalent for non-C/++ flows. Writing a deterministic ar is about 20 lines of C code, but then I have to bake that into the tool in awkward ways. For generalized flows, I've looked at fabricate.py as a ccache replacement, but the overhead of spinning up the Python VM always nukes performance.
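By "deterministic ar" I mean zeroing the metadata that ar stamps into each member header (mtime, uid, gid), which is what makes two otherwise-identical archives differ byte-for-byte. A sketch of that as an in-memory post-processing pass, assuming the common System V 60-byte member header layout (field offsets per the ar format; everything else here is illustrative):

```c
#include <stdio.h>
#include <string.h>
#include <stdlib.h>

/* Overwrite a fixed-width ASCII header field with val, space-padded. */
static void set_field(char *field, size_t width, const char *val) {
    memset(field, ' ', width);
    memcpy(field, val, strlen(val));
}

/* Walk an ar archive in buf and zero the non-deterministic metadata
 * fields of every member header. Member data (including long-name and
 * symbol-table members) passes through untouched. Returns 0 on success. */
int normalize_archive(char *buf, size_t len) {
    if (len < 8 || memcmp(buf, "!<arch>\n", 8) != 0)
        return -1;                                /* not an ar archive */
    size_t off = 8;
    while (off + 60 <= len) {
        char *hdr = buf + off;
        set_field(hdr + 16, 12, "0");             /* mtime */
        set_field(hdr + 28, 6,  "0");             /* uid   */
        set_field(hdr + 34, 6,  "0");             /* gid   */
        long size = strtol(hdr + 48, NULL, 10);   /* decimal member size */
        if (size < 0)
            return -1;
        off += 60 + (size_t)size;
        off += off & 1;                           /* members are 2-byte aligned */
    }
    return 0;
}
```

GNU binutils' ar has a D modifier (and ranlib has -D) that does this natively, but that isn't available on every platform's ar, which is exactly the cross-platform pain.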
Do you have some way to verify that your Makefile dependencies conform to your actual source dependencies? Is clang/gcc dependency tracking (-MD/-MMD depfiles) sufficient for your use case? What about upgrading the compiler itself: does your Makefile depend on that? If so, how?
Have you considered tup[0]? Or djb-redo[1]? Both seem infinitely better than Make if you are paranoid. tup even claims to work on Windows, although I have no idea how they do that (or what the slowdown is like). Personally, I'm in the old Unix camp of many small executables, none of which goes over 1M statically linked (a modern "small"), so it's rarely more than 3 seconds to rebuild an executable from scratch.
> (deterministic mode for ar)
Why do you care about ar determinism? Shouldn't it be ld determinism you are worried about?