The reason compilation time is a problem is that compiler architecture is stuck in the 1970s, at least for systems programming languages. Compilers make you wait while they repeat the same work over and over again, thousands of times. Trillions, if you count the work globally. How many times has the same code been parsed over the years?

In a world where compiler architecture had actually advanced, you would download a precompiled binary alongside the source, identical to what a clean build would have produced, and that binary would be updated incrementally as you type. If you made a change that affected a large part of the binary and couldn't be applied immediately, JIT techniques would let you run and test the program anyway, before compilation finished.
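
To make the idea concrete, here's a toy sketch (in Python, with made-up names; no real compiler works exactly this way) of function-level memoized compilation: only definitions whose content hash changed pay for code generation, so a keystroke-sized edit triggers a keystroke-sized rebuild.

```python
# Toy sketch of incremental compilation by memoization. A production
# system would also fold the transitive dependencies of each definition
# (types, imports, compiler flags) into the hash key.
import hashlib

class IncrementalCompiler:
    def __init__(self):
        self.cache = {}  # content hash -> compiled artifact

    def codegen(self, source):
        # Stand-in for the expensive part of compilation.
        return f"<object code for {len(source)} bytes of source>"

    def compile_function(self, name, source):
        key = hashlib.sha256(source.encode()).hexdigest()
        if key not in self.cache:
            print(f"recompiling {name}")
            self.cache[key] = self.codegen(source)
        return self.cache[key]

    def build(self, module):
        # Unchanged definitions are patched in from the cache;
        # only edited ones pay for codegen.
        return {name: self.compile_function(name, src)
                for name, src in module.items()}

compiler = IncrementalCompiler()
compiler.build({"main": "fn main() { greet(); }",
                "greet": "fn greet() { print(1); }"})
compiler.build({"main": "fn main() { greet(); }",      # unchanged: cache hit
                "greet": "fn greet() { print(2); }"})  # edited: recompiled
```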

There is no fundamental reason why anyone should ever have to wait for a compiler. And if you didn't have to wait, the compiler would be free to spend much more time on optimization, actually improving the final binary.

The zapcc project shows a bit of the potential for improvement in build times, though it's just scratching the surface. https://github.com/yrnkrn/zapcc

In addition to improved local compilation, cloud compilation also changes the constraint space for compiler design. Distributed deterministic compilation like [1] would permit community-level caching. Distribution reduces the importance of peak compute: if prompt compilation requires a hundred cores, that could be OK. Community caching reduces the importance of worst-case performance: if some rich type analysis of a standard library takes days, that might be tolerable if it only needs to happen once. And if optimization decisions can become discrete things-on-the-side, like the execution traces JITs keep today, there's a new opportunity for human-compiler collaboration: "lay out the data in memory like so, when the CPU cache has this shape".

I'm looking forward to the ferment of hobby languages starting to explore this new space. What would you do differently in designing a language if the language community were served by a single giant instance of a compiler? How might that change the economics of language development? PLaaS?
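
To sketch what community-level caching rests on (illustrative names only; this is not gg's actual interface), the cache key has to cover everything that can influence the output: compiler identity, flags, and the fully preprocessed source. Two developers anywhere who produce the same key can share one cached object file.

```python
# Sketch of a content-addressed compilation cache in the spirit of [1].
# Illustrative only: the store here is a dict, where a real system would
# use a shared network service.
import hashlib
import subprocess

def cache_key(compiler, flags, source_path):
    # Preprocess first so the key captures headers and macros,
    # not just the top-level file.
    preprocessed = subprocess.run(
        [compiler, "-E", *flags, source_path],
        capture_output=True, check=True).stdout
    version = subprocess.run(
        [compiler, "--version"], capture_output=True, check=True).stdout
    h = hashlib.sha256()
    for part in (version, " ".join(flags).encode(), preprocessed):
        h.update(part)  # flag order is kept: e.g. -I search order matters
    return h.hexdigest()

def compile_cached(store, compiler, flags, source_path, out_path):
    key = cache_key(compiler, flags, source_path)
    if key not in store:  # miss: any worker, anywhere, can fill the entry
        subprocess.run(
            [compiler, "-c", *flags, source_path, "-o", out_path],
            check=True)
        with open(out_path, "rb") as f:
            store[key] = f.read()
    else:                 # hit: skip the compiler entirely
        with open(out_path, "wb") as f:
            f.write(store[key])
    return key
```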

[1] https://github.com/StanfordSNR/gg Roughly: checksums and argument lists make gcc deterministic enough to cache, and the work is farmed out so `make -j100` runs on AWS Lambda.