0. gem install goliath (v0.9.4 as of this writing)
1. gem install mongoid (v2.3.3 as of this writing)
2. create and save the test.rb file (see below)
3. % ruby test.rb -sv
4. % ab -n500 -c10 http://localhost:9000/status
5. note the req/s results (somewhere around 80-90 req/s on my machine)
6. edit the test.rb file and remove the require 'mongoid' line (a sketch of this variant appears after the code below)
7. repeat steps 3 and 4
8. note the req/s results (somewhere around 550 req/s on my machine)

Conclusion: Mongoid is patching something in its initialization that is causing a severe performance bottleneck in Goliath.
$: << '../lib' << 'lib'
require 'goliath'
require 'mongoid'

class HelloWorld < Goliath::API
  use Goliath::Rack::Heartbeat

  def response(env)
    [200, {}, ""]
  end
end
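For step 6, the comparison run uses the same file with only the Mongoid require removed; a minimal sketch of that variant (nothing else changes):

# Step 6 variant of test.rb: identical to the file above, minus require 'mongoid'.
# Benchmarking this version isolates Mongoid's load-time effect on Goliath.
$: << '../lib' << 'lib'
require 'goliath'

class HelloWorld < Goliath::API
  use Goliath::Rack::Heartbeat

  def response(env)
    [200, {}, ""]
  end
end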
adamlwatson commented on Nov 16, 2011 (via email):
Thanks for answering all my questions. I really appreciate it.
The thing I'm puzzling through is how real that 500-600 number you're citing is. The key is that your test isn't really taking advantage of what async frameworks bring to the table. Since all your data is running on a local box, IO latency is negligible, so I would expect your test to perform as fast as a threaded server. In other words, if your requests take 2 ms, you should get about 500 req/s; if they take less, you should get even more.
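To make that arithmetic concrete, here is a rough back-of-the-envelope sketch of the throughput model; the 2 ms service time and 40 ms WAN round trip below are assumed, illustrative numbers, not measurements from this thread.

# Back-of-the-envelope throughput model (illustrative numbers only).
per_request_seconds = 0.002                      # ~2 ms of CPU/framework work per request (assumed)
sequential_rps = 1.0 / per_request_seconds
puts "local, one request at a time: ~#{sequential_rps.round} req/s"    # ~500

wan_latency = 0.040                              # e.g. 40 ms round trip to a remote MongoDB (assumed)
blocking_rps = 1.0 / (per_request_seconds + wan_latency)
puts "blocking driver over the WAN: ~#{blocking_rps.round} req/s"      # latency dominates (~24)

concurrency = 10                                 # ab -c10
# A truly async driver overlaps the waits, so throughput scales with concurrency
# until the single reactor/CPU becomes the bottleneck again.
async_rps = [concurrency / (per_request_seconds + wan_latency),
             1.0 / per_request_seconds].min
puts "async driver over the WAN at -c10: ~#{async_rps.round} req/s"    # ~238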
I've been testing with MongoDB running across a WAN (from my box here at my friend's house to my server at home), and performance takes a hit there. I'm still trying to validate my testing (I just sent a message to the goliath list), so take this with a grain of salt. For example, my performance issues last night turned out to be a bug in apache bench on OS X Lion. ;-)
Sujal
(the short version of what I'm puzzling through is how do I know that MongoDB calls are really async... that's where this all started)
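One way to probe whether the MongoDB calls are really async (a sketch of a general technique, not something from this thread): run a periodic EventMachine timer alongside the suspect call and watch whether its ticks keep arriving while the call waits for its result. The sleep below is a hypothetical stand-in; swapping in a real Mongoid query would show whether the driver blocks the reactor.

# reactor_probe.rb (hypothetical name): does a given call starve the EM reactor?
require 'eventmachine'

suspect_call = lambda { sleep 0.5 }   # stand-in for a call that waits ~0.5 s for a result

EM.run do
  ticks = 0
  EM.add_periodic_timer(0.05) { ticks += 1 }   # ~20 ticks/sec while the reactor is free

  EM.add_timer(0.2) { suspect_call.call }      # fire the suspect call mid-run

  EM.add_timer(1.0) do
    # With nothing blocking, roughly 20 ticks arrive in the second; a call that
    # blocks the reactor for 0.5 s cuts that count roughly in half.
    puts "ticks in 1 second: #{ticks}"
    EM.stop
  end
end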