It amazes me how far out of their way many people will go to avoid using parsing tools.
He avoided using a lexer generator because, in his words, "I'd figured using a lexer generator was more trouble than it was worth." To him it was more convenient to write a lexer manually in C++ (using macros), then later rewrite it completely in a different language (D) as a one-off code generator that emits a trie matcher.
I am amazed that this could be thought of as less work. How can writing two lexers from scratch by hand be less trouble than writing a bunch of regexes in a Flex or Ragel file, especially when the hand-written result was likely slower than what Flex or Ragel would have produced?
To me the interesting question is how much of this allergy to external tools comes down to:
1. the trouble of learning/maintaining something new/different (inherent tool overhead)
2. the tool's design not being as user-friendly as it could be (incidental tool overhead)
3. an irrational belief that the tool will be more work/trouble than it actually is (non-optimal decision making)
If part of the answer is (2), then better-designed tools should lead to greater adoption. And as more convenient tools become better known and more widely used, the learning overhead in (1) should shrink as well.
Everyone uses regexes; using a lexer generator should (in my opinion) be as easy as using a one-off regex. I think the key is to make the lexer generator an embeddable library, the way regex libraries are.
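To sketch what I mean: D's std.regex already gets close for throwaway lexing, since a single alternation of named captures can play the role of a lexer spec. The token names and toy grammar below are my own illustration, not from any existing lexer library:

```d
import std.regex;
import std.stdio;

void main()
{
    // One alternation of named captures acts as a throwaway lexer spec.
    auto tok = regex(`(?P<num>[0-9]+)|(?P<id>[a-zA-Z_]\w*)|(?P<op>[-+*/=])|(?P<ws>\s+)`);

    foreach (m; matchAll("x = 42 + y", tok))
    {
        if (m["ws"].length)
            continue;               // skip whitespace
        else if (m["num"].length)
            writeln("NUM ", m.hit);
        else if (m["id"].length)
            writeln("ID  ", m.hit);
        else
            writeln("OP  ", m.hit);
    }
}
```

An embeddable lexer-generator library could offer essentially this interface, but with longest-match semantics, token types, and better error reporting baked in.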
There is actually a really powerful parser generator for D by Philippe Sigaud, called Pegged (https://github.com/PhilippeSigaud/Pegged). It's much more pleasant to use than something like Boost Spirit.
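For a taste, here is roughly the arithmetic example from Pegged's README; the mixin expands the grammar string into a full parser at compile time (details may differ slightly between Pegged versions):

```d
import std.stdio;
import pegged.grammar;

// The grammar DSL below is expanded into a parser at compile time.
mixin(grammar(`
Arithmetic:
    Term     < Factor (Add / Sub)*
    Add      < "+" Factor
    Sub      < "-" Factor
    Factor   < Primary (Mul / Div)*
    Mul      < "*" Primary
    Div      < "/" Primary
    Primary  < Parens / Number
    Parens   < "(" Term ")"
    Number   <~ [0-9]+
`));

void main()
{
    auto tree = Arithmetic("1 + 2 * 3");
    writeln(tree);  // prints the full parse tree
}
```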
Somewhat relatedly, D's standard library has a compile-time regex engine that compiles patterns down to specialized code during compilation, resulting in some of the fastest regular expression matching in the world.
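That engine is exposed as ctRegex in std.regex. A minimal sketch, assuming a reasonably recent D compiler; the pattern is parsed and specialized while compiling, so no regex compilation happens at run time:

```d
import std.regex;
import std.stdio;

// The pattern is turned into specialized matching code at compile time.
enum date = ctRegex!`(\d{4})-(\d{2})-(\d{2})`;

void main()
{
    auto m = matchFirst("released on 2016-05-09", date);
    if (!m.empty)
        writeln(m[1], "/", m[2], "/", m[3]);  // 2016/05/09
}
```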