
Sysadmin wrk - a HTTP benchmarking tool (forked version)

Discussion in 'System Administration' started by eva2000, Jan 25, 2018.

Thread Status:
Not open for further replies.
    eva2000 Administrator Staff Member

    Today I am sharing my forked version of the wrk HTTP load testing tool on Github - you'll find the code and install instructions at https://github.com/centminmod/wrk/tree/centminmod. My forked wrk version adds features that are still outstanding pull requests against the original wrk version's Github repo, such as source IP address binding (code provided by the Nginx folks), breakout latency statistics, and Lua JSON format latency distribution statistics. I also experimented with adding additional Lua scripts from various sources, such as the scripts/setup.lua used in the Nginx zlib vs brotli HTTP compression benchmarks posted here. I am not a Lua coder, so I am just using or including Lua scripts that work.
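
    For anyone wanting to try the fork, the install instructions in the Github repo above are the authoritative reference. As a rough sketch only (assuming the usual wrk build dependencies such as gcc, make and the openssl development headers are already installed), a build would look something like the following - the wrk-cmm binary name is just my own naming convention:
    Code (Text):
    # clone the centminmod branch of the fork and build it
    git clone -b centminmod https://github.com/centminmod/wrk.git
    cd wrk
    make
    # place the resulting binary in your PATH, e.g. named wrk-cmm
    cp wrk /usr/local/bin/wrk-cmm
    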

    Pairing the Webpagetest page load speed testing tool with a benchmarking tool like wrk will allow you to measure before and after optimisation/migration improvements to your web sites :D


    For web server benchmark testing using locally installed tools, I mainly use either Siege, h2load, Locust.io or wrk load testers.
    • Siege benchmark is already installed out of the box with Centmin Mod LEMP installs and can test HTTP/1.1 based HTTP and HTTPS. It doesn't support HTTP/2 HTTPS load testing - see h2load below for that. You can see paste examples here and here. But Siege has a lot of cpu overhead, so it can skew results if you run Siege on the same server as the target web server - limiting how hard you can stress your target server. This is where h2load and wrk may be better suited. Ideally you want the load testing tool on a separate server from the target web server you want to test (example Siege and h2load invocations are sketched after this list).
    • h2load can test HTTP/1.1 and HTTP/2 based HTTPS and HTTP and is part of the nghttp2 HTTP/2 C library tools. I usually use my Ubuntu 17 nghttp2 docker image when I need h2load and the accompanying nghttp2 HTTP/2 tools. Example h2load testing was done in the Caddy vs Centmin Mod Nginx HTTP/2 HTTPS benchmarks here.
    • Locust.io is an open source Python based load testing tool. You can see an example of my benchmarks on my Wordpress7 site.
    • My forked version of the wrk HTTP benchmarking tool for testing HTTP/1.1 HTTPS and HTTP.
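    To give a rough idea of how Siege and h2load are invoked, here are example command lines - a minimal sketch using a placeholder domain.com URL and modest concurrency, so check each tool's own help output for the full option list:
    Code (Text):
    # Siege: 50 concurrent users for 30 seconds against an HTTPS URL
    siege -c 50 -t 30S https://domain.com/
    # h2load: 1000 requests total, 100 clients, 10 concurrent HTTP/2 streams per client
    h2load -n 1000 -c 100 -m 10 https://domain.com/
    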
    For example, for the wrk load testing here, I used the following command:
    Code (Text):
    wrk-cmm -t3 -c200 -d5s --breakout -H 'Accept-Encoding: gzip' -s scripts/setup.lua --latency http://domain.com
    

    where
    • -t3 = 3 threads
    • -c200 = 200 connections (concurrent users)
    • -d5s = 5 second test duration
    • --breakout = list latency breakout stats
    • -H 'Accept-Encoding: gzip' = request the compressed (gzip) version of the page
    • -s scripts/setup.lua = load the setup.lua Lua script
    • --latency = print latency distribution statistics
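
    One of the fork-specific additions is source IP address binding via -b / --bind-ip. A hypothetical example combining it with the latency flags (the 192.168.10.0/24 CIDR range and domain.com are placeholders - use source IPs actually assigned and routable on your load testing server):
    Code (Text):
    wrk-cmm -t3 -c200 -d30s -b 192.168.10.0/24 --breakout --latency https://domain.com
    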
    wrk-cmm forked version help/usage commands
    Code (Text):
    wrk-cmm -v
    wrk 4.1.0-16-g8d0a45d [epoll] Copyright (C) 2012 Will Glozer
    Usage: wrk <options> <url>                        
      Options:                                        
        -c, --connections <N>  Connections to keep open
        -d, --duration    <T>  Duration of test        
        -t, --threads     <N>  Number of threads to use
                                                      
        -b, --bind-ip     <S>  Source IP (or CIDR mask)
                                                      
        -s, --script      <S>  Load Lua script file    
        -H, --header      <H>  Add header to request  
            --latency          Print latency statistics
            --breakout         Print breakout statistics
            --timeout     <T>  Socket/request timeout  
        -v, --version          Print version details  
                                                      
      Numeric arguments may include a SI unit (1k, 1M, 1G)
  Time arguments may include a time unit (2s, 2m, 2h)
    

    Load Testing Notes


    • It's important to note that load testing has to be done carefully, especially if the target site/server is in a shared/VPS environment, as you are affecting your neighbours - and your load testing results are being affected by your neighbours as well. Some web hosts may treat your load testing as a DDOS attack, so don't go overboard on VPS/cloud hosting; dedicated servers, on the other hand, are fine as you have full control and use of your resources. A conservative starting example is shown below.
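
    If you do test against shared or VPS hosted sites, a conservative starting point - purely as an illustrative baseline - would be low thread/connection counts and a short duration, ramped up gradually while watching server load:
    Code (Text):
    wrk-cmm -t2 -c50 -d10s --latency https://domain.com
    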

    Other Links



    Enjoy :D
     