( WP w/ FastCGI Cache ) - new install ( VPS - 4vCPU + 8GB RAM ), test hitting --> mywpsite.com/index.php

500 clients per second: Test-500-CPS-A-load-test-by-loader-io by Matt Williams, posted May 20, 2022 at 9:45 AM
1000 clients per second: Test-1000-CPS-A-load-test-by-loader-io
1500 clients per second: Test-1500-CPS-A-load-test-by-loader-io
3000 clients per second: Test-3000-CPS-A-load-test-by-loader-io

That's quite a bit of traffic within 1 minute, and still no timeouts and great response times. I just started using loader.io and it's a fantastic tool to really stress test with.
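The actual FastCGI cache config being tested isn't shown in the thread; a minimal Nginx fastcgi_cache sketch for a WordPress site might look like the following (the cache path, zone name WPCACHE, TTLs and PHP-FPM socket path are all assumptions, not the poster's setup):

```nginx
# Hypothetical sketch only -- zone name, paths and TTLs are assumptions.
fastcgi_cache_path /var/cache/nginx/fcgi levels=1:2 keys_zone=WPCACHE:64m
                   max_size=1g inactive=60m;
fastcgi_cache_key "$scheme$request_method$host$request_uri";

server {
    listen 80;
    server_name mywpsite.com;
    root /home/nginx/domains/mywpsite.com/public;

    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass unix:/run/php-fpm/php-fpm.sock;
        fastcgi_cache WPCACHE;
        fastcgi_cache_valid 200 301 302 10m;
        fastcgi_cache_use_stale error timeout updating;
        # expose HIT/MISS/BYPASS so load-test responses can be verified as cached
        add_header X-FastCGI-Cache $upstream_cache_status;
    }
}
```

Serving index.php from the cache zone rather than PHP-FPM is what makes those sub-10ms response times under thousands of clients per second plausible.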
Wow, 6ms response times! Your VPS must be very close to the loader.io test servers' location. Maybe even in the same datacenter!
The server is in Ashburn, VA. I installed Virtualizor on it to create VMs. Not sure where loader.io tests from. Do you know? I'll have to test my Dallas, TX server here soon. I was quite impressed with the overall results. I would have kept going to see how much it can really push without any timeouts, but I got sidetracked again.
Yeah, it would be. Still, 6ms is crazy fast on any network, let alone remote!

Tested my WordPress blog running Centmin Mod LEMP stack with Cloudflare load balanced Cloudflare Argo Tunnels + full HTML page caching enabled via a custom CF Worker. At 5,000 loader.io users it tripped Cloudflare's layer 7 DDoS protection several times, so I had to temporarily disable layer 7 DDoS protection for a specific DDoS ruleset.

Check out the Cloudflare web, cache and firewall analytics too: the ~1.6m requests figure is the one related to the loader.io test. I log every request in the Cloudflare firewall so I can analyze potential threats and attacks even if they don't trigger Cloudflare protection/firewall rules. Cache analytics shows the ~2.2+ million requests served by Cloudflare CDN caching.

loader.io gzip requested run with 5,000 users = 2,296,939 successful requests with an average 60ms response time, pushing 49.99GB of data at a peak of 50K requests/s.
You mean the Centmin Mod LEMP stack install itself (Centmin Mod LEMP Stack Install Nginx on CentOS)? Or using loader.io (Application Load Testing Tools for API Endpoints with loader.io)?
To override centmin.sh variables or enable AUTOHARDTUNE_NGINXBACKLOG='y', set it in the existing persistent config file at /etc/centminmod/custom_config.inc. If the persistent config file doesn't exist, create it prior to Centmin Mod installation. Example of using the persistent config file: https://blog.centminmod.com/2019/07/15/117/centmin-mod-advanced-customised-installation-guide/
For existing Centmin Mod installs, just set AUTOHARDTUNE_NGINXBACKLOG='y' in the existing persistent config file at /etc/centminmod/custom_config.inc (create the file if it doesn't exist), and then run and exit centmin.sh once for it to apply to the Nginx config.
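The steps above can be sketched as a small shell helper. This is a minimal sketch, not an official Centmin Mod script: the persistent config path is the one from the thread, the helper name and the centmin.sh location are assumptions for illustration.

```shell
# Sketch: enable AUTOHARDTUNE_NGINXBACKLOG='y' in Centmin Mod's persistent
# config file. Default path /etc/centminmod/custom_config.inc is from the
# thread; pass a different path as $1 for testing. Helper name is hypothetical.
enable_autohardtune() {
  local cfg="${1:-/etc/centminmod/custom_config.inc}"
  mkdir -p "$(dirname "$cfg")"
  touch "$cfg"
  # append only if the variable isn't already set, so re-runs are idempotent
  grep -q "^AUTOHARDTUNE_NGINXBACKLOG=" "$cfg" || \
    printf "AUTOHARDTUNE_NGINXBACKLOG='y'\n" >> "$cfg"
}

# After updating the config, run and exit centmin.sh once so it regenerates
# the Nginx config with the hardened backlog setting (path is an assumption):
# enable_autohardtune && /usr/local/src/centminmod/centmin.sh
```

The grep guard matters because centmin.sh sources this file on every run, and a duplicated variable line would just silently shadow the earlier one.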