[nginx] limit requests from search engine crawlers/bots to 1r/m (prevent DDoS)
http {
    # Map bot user agents to the client IP; everyone else gets an empty
    # value, and requests with an empty key are not rate limited.
    map $http_user_agent $limit_bots {
        default '';
        ~*(bing|yandex|msnbot) $binary_remote_addr;
    }

    # One shared zone (10 MB of state), limiting each bot IP to 1 request per minute.
    limit_req_zone $limit_bots zone=bots:10m rate=1r/m;

    server {
        location / {
            # Allow short bursts of up to 5 requests without delay;
            # anything beyond that is rejected.
            limit_req zone=bots burst=5 nodelay;
        }
    }
}
Mahaswami commented Mar 22, 2018

A full bot list is available here:

https://github.com/mitchellkrogza/nginx-ultimate-bad-bot-blocker/blob/master/conf.d/globalblacklist.conf
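Crawler substrings from such a list can simply be added to the map regex. A minimal sketch (the extra names below are illustrative examples of common crawlers, not taken from that blocklist):

map $http_user_agent $limit_bots {
    default '';
    # Extend the alternation with any crawlers you want to throttle.
    ~*(bing|yandex|msnbot|AhrefsBot|SemrushBot|MJ12bot) $binary_remote_addr;
}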

An example of limiting non-bot requests is also available here:

http://alex.mamchenkov.net/2017/05/17/nginx-rate-limit-user-agent-control-bots/
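The idea there is roughly the inverse of the config above: rate-limit ordinary clients by IP while exempting known bots. A minimal sketch of that approach (the zone name, regex, and rate here are placeholders, not taken from the article):

map $http_user_agent $limit_humans {
    default $binary_remote_addr;   # ordinary clients are keyed (and limited) by IP
    ~*(bing|yandex|msnbot) '';     # empty key: these bots are not limited by this zone
}
limit_req_zone $limit_humans zone=humans:10m rate=10r/s;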

Setting the rejection status code to 429 (Too Many Requests) instead of nginx's default 503 may also serve a useful purpose.
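This is done with the limit_req_status directive (available since nginx 1.3.15), set in the same context as limit_req:

server {
    location / {
        limit_req zone=bots burst=5 nodelay;
        # Reject over-limit requests with 429 instead of the default 503.
        limit_req_status 429;
    }
}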
