I think this is a great idea, and I applaud making a high-availability, high-load site template available to others.

More than once in the past few weeks I have thought that if they had just put some static pages on S3 behind CloudFront, or some other CDN, much pain could have been averted.
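For what it's worth, that setup is only a few CLI calls. A rough sketch with the AWS CLI (bucket name, region, and local directory are placeholders, and you'd still want to wire up TLS/aliases on the distribution):

```shell
# Create a bucket and enable static website hosting (placeholder names)
aws s3 mb s3://example-emergency-site
aws s3 website s3://example-emergency-site --index-document index.html

# Upload the static pages with a short cache TTL so updates propagate quickly
aws s3 sync ./site s3://example-emergency-site \
  --cache-control "public, max-age=300"

# Put CloudFront in front of the bucket's website endpoint
aws cloudfront create-distribution \
  --origin-domain-name example-emergency-site.s3-website-us-east-1.amazonaws.com
```
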

Of course the first thing I did was benchmark the test site to see how their edge network performs. For reference, I'm based in Melbourne, Australia, on a 100 Mbps download / 50 Mbps upload connection:

  $ ab -n 10000 -c 100 https://emergency-site.dev/
  This is ApacheBench, Version 2.3 <$Revision: 1826891 $>
  Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
  Licensed to The Apache Software Foundation, http://www.apache.org/
  
  Benchmarking emergency-site.dev (be patient)
  Completed 1000 requests
  Completed 2000 requests
  Completed 3000 requests
  Completed 4000 requests
  Completed 5000 requests
  Completed 6000 requests
  Completed 7000 requests
  Completed 8000 requests
  Completed 9000 requests
  Completed 10000 requests
  Finished 10000 requests
  
  
  Server Software:        Netlify
  Server Hostname:        emergency-site.dev
  Server Port:            443
  SSL/TLS Protocol:       TLSv1.2,ECDHE-RSA-AES128-GCM-SHA256,2048,128
  TLS Server Name:        emergency-site.dev
  
  Document Path:          /
  Document Length:        4836 bytes
  
  Concurrency Level:      100
  Time taken for tests:   106.534 seconds
  Complete requests:      10000
  Failed requests:        0
  Total transferred:      53220000 bytes
  HTML transferred:       48360000 bytes
  Requests per second:    93.87 [#/sec] (mean)
  Time per request:       1065.345 [ms] (mean)
  Time per request:       10.653 [ms] (mean, across all concurrent requests)
  Transfer rate:          487.85 [Kbytes/sec] received
  
  Connection Times (ms)
                min  mean[+/-sd] median   max
  Connect:      713  808  30.7    803    1828
  Processing:   230  236   4.8    236     443
  Waiting:      230  236   3.9    236     310
  Total:        956 1044  31.7   1039    2067
  
  Percentage of the requests served within a certain time (ms)
    50%   1039
    66%   1047
    75%   1053
    80%   1057
    90%   1070
    95%   1082
    98%   1107
    99%   1168
   100%   2067 (longest request)
I know there are much better ways of testing load/performance. It's just what I had on hand.
> I know there are much better ways of testing load/performance.

such as

https://github.com/giltene/wrk2
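wrk2 is a good suggestion here because it drives a constant request rate (the `-R` flag) and corrects for coordinated omission, so the latency percentiles aren't flattered the way ab's can be. An invocation roughly equivalent to the test above might look like (rate and duration are just illustrative):

```shell
# wrk2's binary is still called wrk; 2 threads, 100 connections,
# 30 seconds, a constant 100 req/s, and a full latency histogram
wrk -t2 -c100 -d30s -R100 --latency https://emergency-site.dev/
```
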