the goal: fast virtual host routing, e.g. a single process on a machine listening on port 80 and proxying requests based on the HTTP Host header to other non-port-80 web processes on the same machine
many people use nginx for this because nginx is currently faster than node for data-heavy applications (see the results below)
the benchmarks use the JS proxies from https://github.com/substack/bouncy/tree/master/bench
and this node backend server: https://github.com/substack/bouncy/blob/master/bench/bench.js#L11-L16
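for reference, a virtual host proxy built on bouncy looks roughly like this (the host names and backend ports are made up for illustration, not the exact setup from the bench scripts linked above):

    // sketch of a bouncy-based virtual host proxy; hosts/ports are illustrative
    var bouncy = require('bouncy');

    var routes = {
        'beep.example.com': 8001,
        'boop.example.com': 8002
    };

    var server = bouncy(function (req, res, bounce) {
        var port = routes[req.headers.host];
        if (port) {
            // hand the connection off to the backend listening on that port
            bounce(port);
        }
        else {
            res.statusCode = 404;
            res.end('no such host\n');
        }
    });
    server.listen(80);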
each result was generated with ab -n 5000 -c 10
the nginx used is v1.4.4 w/ 256 worker connections and 1 worker process
benchmarks were run on a MacBook Air, all against local http servers
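for a sense of what "data-heavy" means here: the HTML transferred figures below work out to 4 MiB per response (20971520000 bytes / 5000 requests = 4194304 bytes), so the backend being proxied serves a large body on every request. a stand-in backend along those lines (a sketch for illustration, not the actual bench.js):

    // stand-in data-heavy backend: serves a 4 MiB body per request,
    // matching the per-request "HTML transferred" size in the ab results below
    var http = require('http');

    var body = Buffer.alloc(4 * 1024 * 1024, 'a'); // 4 MiB payload

    http.createServer(function (req, res) {
        res.writeHead(200, {
            'Content-Type': 'text/html',
            'Content-Length': body.length
        });
        res.end(body);
    }).listen(8001); // port is arbitrary for this sketch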

Time taken for tests: 13.915 seconds
Complete requests: 5000
Failed requests: 0
Write errors: 0
Total transferred: 20971895000 bytes
HTML transferred: 20971520000 bytes
Requests per second: 359.34 [#/sec] (mean)
Time per request: 27.829 [ms] (mean)
Time per request: 2.783 [ms] (mean, across all concurrent requests)
Transfer rate: 1471865.46 [Kbytes/sec] received
Connection Times (ms)
min mean[+/-sd] median max
Connect: 0 1 0.9 1 26
Processing: 10 27 4.3 26 57
Waiting: 0 1 1.4 1 26
Total: 11 28 4.6 27 59
Percentage of the requests served within a certain time (ms)
50% 27
66% 28
75% 29
80% 30
90% 32
95% 35
98% 42
99% 48
100% 59 (longest request)

Time taken for tests: 37.414 seconds
Complete requests: 5000
Failed requests: 0
Write errors: 0
Total transferred: 20972000000 bytes
HTML transferred: 20971520000 bytes
Requests per second: 133.64 [#/sec] (mean)
Time per request: 74.827 [ms] (mean)
Time per request: 7.483 [ms] (mean, across all concurrent requests)
Transfer rate: 547405.79 [Kbytes/sec] received
Connection Times (ms)
min mean[+/-sd] median max
Connect: 0 0 0.3 0 5
Processing: 13 74 23.7 70 469
Waiting: 4 23 9.2 22 109
Total: 13 75 23.7 70 470
Percentage of the requests served within a certain time (ms)
50% 70
66% 73
75% 76
80% 79
90% 90
95% 104
98% 128
99% 146
100% 470 (longest request)

Time taken for tests: 55.968 seconds
Complete requests: 5000
Failed requests: 0
Write errors: 0
Total transferred: 20976089379 bytes
HTML transferred: 20975714304 bytes
Requests per second: 89.34 [#/sec] (mean)
Time per request: 111.936 [ms] (mean)
Time per request: 11.194 [ms] (mean, across all concurrent requests)
Transfer rate: 366003.19 [Kbytes/sec] received
Connection Times (ms)
min mean[+/-sd] median max
Connect: 0 0 0.5 0 22
Processing: 44 111 25.1 107 358
Waiting: 10 31 14.9 28 273
Total: 44 112 25.2 107 358
Percentage of the requests served within a certain time (ms)
50% 107
66% 118
75% 126
80% 132
90% 144
95% 154
98% 166
99% 173
100% 358 (longest request)

Time taken for tests: 74.141 seconds
Complete requests: 5000
Failed requests: 0
Write errors: 0
Total transferred: 20971895000 bytes
HTML transferred: 20971520000 bytes
Requests per second: 67.44 [#/sec] (mean)
Time per request: 148.281 [ms] (mean)
Time per request: 14.828 [ms] (mean, across all concurrent requests)
Transfer rate: 276236.99 [Kbytes/sec] received
Connection Times (ms)
min mean[+/-sd] median max
Connect: 0 0 0.2 0 2
Processing: 72 148 28.7 139 385
Waiting: 24 59 19.8 54 245
Total: 73 148 28.7 139 385
Percentage of the requests served within a certain time (ms)
50% 139
66% 158
75% 170
80% 175
90% 188
95% 197
98% 209
99% 218
100% 385 (longest request)

Time taken for tests: 46.905 seconds
Complete requests: 5000
Failed requests: 0
Write errors: 0
Total transferred: 20971895000 bytes
HTML transferred: 20971520000 bytes
Requests per second: 106.60 [#/sec] (mean)
Time per request: 93.811 [ms] (mean)
Time per request: 9.381 [ms] (mean, across all concurrent requests)
Transfer rate: 436631.96 [Kbytes/sec] received
Connection Times (ms)
min mean[+/-sd] median max
Connect: 0 0 0.1 0 3
Processing: 58 94 26.7 86 534
Waiting: 4 28 23.5 24 466
Total: 58 94 26.8 86 534
Percentage of the requests served within a certain time (ms)
50% 86
66% 92
75% 98
80% 104
90% 124
95% 133
98% 142
99% 153
100% 534 (longest request)
here's the nginx.conf I used: