And of course, this talk by Gil Tene is fantastic if you're interested in load testing stats https://www.youtube.com/watch?v=lJ8ydIuPFeU
One example of a 50% cost saving comes from Amazon Machine Images (AMIs). We compared the cheapest official NGINX AMI available on the Amazon Marketplace[0] against an NGINX+Unikraft AMI. We ran the same workload using wrk[1] and then checked the bill at the end of the month: roughly $80 vs $40.
[0]: https://aws.amazon.com/marketplace/pp/prodview-xogyq23b3mfge
The Nim release page:
https://nim-lang.org/blog/2021/10/19/version-160-released.ht...
links to this benchmark:
https://web-frameworks-benchmark.netlify.app/result
where Nim is 2nd with 200k req/s, but that result uses httpbeast:
https://github.com/dom96/httpbeast
which says it would be more useful to use jester:
https://github.com/dom96/jester
Jester gets about 150k req/s.
But when looking at these:
https://www.techempower.com/benchmarks/
drogon, actix, etc. get about 600k req/s.
Redbean also hit about 600k req/s when I tested it like this:
git clone https://github.com/wg/wrk.git
cd wrk
make
./wrk -H 'Accept-Encoding: gzip' -t 12 -c 120 http://127.0.0.1:8080/
When I tested https://caddyserver.com v2, it showed about 800k req/s.
It would be very helpful to know how those benchmarks are actually run, so that I could compare what is actually fastest in the real world, rather than relying on unrealistic code tuned to win benchmarks.
Anyway, from comparing the performance of these tools, Artillery, which runs on NodeJS, is perhaps the worst performer of all the tools I've tested. I don't know if that's because of NodeJS or because Artillery itself isn't a very performant piece of software (it also consumes a lot of memory, btw).
If you want the highest performance, there is one tool that runs circles around all others (including k6), and that is wrk - https://github.com/wg/wrk - a very cool piece of software, although it is lacking in functionality, so it's mostly suitable for simpler load testing like hammering single URLs.
(I don't know how fast wrk2 is, haven't benchmarked it)
At our startup we used it to test traffic from 100,000 users against our Python-based backend and optimized it continuously. We also used wrk with test cases written in Lua scripts. That worked fantastically too.
We did not use Bokeh visualization, but just with Locust we could improve the response time by logging and tuning our SQLAlchemy queries.
Interestingly, there was a previous thread on HN about which tools are used for HTTP perf testing:
>>> - Wrk: https://github.com/wg/wrk - Fastest tool in the universe. About 25x faster than Locust. 3x faster than Jmeter. Scriptable in Lua. Drawbacks are limited output options/reporting and a scripting API that is callback-based, so painful to use for scripting user scenario flows.
JSON library: https://github.com/nlohmann/json
wrk, POSTing a JSON payload via a Lua script: https://github.com/wg/wrk
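A minimal sketch of what such a wrk Lua script can look like; the endpoint, header, and payload below are made-up placeholders, not from the original post:

```shell
# Write a wrk Lua script that switches the request to POST
# with a JSON body (hypothetical payload for illustration).
cat > post.lua <<'EOF'
-- wrk reads these globals before issuing each request
wrk.method = "POST"
wrk.body   = '{"user": "test", "value": 42}'
wrk.headers["Content-Type"] = "application/json"
EOF

# Then point wrk at your own endpoint, e.g.:
# ./wrk -t 4 -c 64 -d 30s -s post.lua http://127.0.0.1:8080/api
```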
* It allows me to focus on latency versus throughput independently
* It has tools for testing performance on pipelined requests (very important in my space)
* It supports scripting complex authentication processes (like magic headers and oauth2)
* It supports scripting fuzzing (random values) for URLs and POST requests
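The fuzzing point above can be sketched with wrk's request() hook, which is called to build each request; the /items/<id> path pattern and the id range are assumptions for illustration:

```shell
# Write a wrk Lua script that randomizes the request path per call
# (hypothetical URL scheme, just to show the technique).
cat > fuzz.lua <<'EOF'
math.randomseed(os.time())

-- wrk invokes request() for every request it sends;
-- wrk.format(method, path) builds the raw HTTP request string
request = function()
  local id = math.random(1, 100000)
  return wrk.format("GET", "/items/" .. id)
end
EOF

# ./wrk -t 4 -c 64 -d 30s -s fuzz.lua http://127.0.0.1:8080
```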
[1]: https://github.com/wg/wrk
For wide-area load testing, I simply buy some advertising since that gives me cheap access to millions of simultaneous users.