[nginx] limit requests from search engine crawlers/bots to 1r/m (prevent DDoS)
http {
    # Map bot User-Agents to the client IP; everyone else gets an
    # empty key, which exempts them from this rate limit.
    map $http_user_agent $limit_bots {
        default '';
        ~*(bing|yandex|msnbot) $binary_remote_addr;
    }

    # One shared 10 MB zone, 1 request per minute per matching client IP.
    limit_req_zone $limit_bots zone=bots:10m rate=1r/m;

    server {
        location / {
            # Allow a burst of 5 requests, reject the rest immediately.
            limit_req zone=bots burst=5 nodelay;
        }
    }
}
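As a side note (not part of the original gist), the same ngx_http_limit_req_module also lets you control how rejected requests are logged. A minimal sketch of the location block above, lowering the severity so limited bots don't flood the error log:

    location / {
        limit_req zone=bots burst=5 nodelay;
        # Log rejections at "warn" instead of the default "error";
        # delayed requests are logged one level lower automatically.
        limit_req_log_level warn;
    }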
A full bot list is available here:
https://github.com/mitchellkrogza/nginx-ultimate-bad-bot-blocker/blob/master/conf.d/globalblacklist.conf
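If you want to throttle more crawlers than the three above, the map regex can be extended with tokens taken from a list like the one linked. A sketch, where the extra names (AhrefsBot, SemrushBot, MJ12bot) are illustrative picks rather than a vetted selection:

    map $http_user_agent $limit_bots {
        default '';
        # First matching regex wins; pull the tokens you actually
        # want from the globalblacklist.conf linked above.
        ~*(bing|yandex|msnbot)           $binary_remote_addr;
        ~*(AhrefsBot|SemrushBot|MJ12bot) $binary_remote_addr;
    }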
An example of rate-limiting non-bot requests is also available here:
http://alex.mamchenkov.net/2017/05/17/nginx-rate-limit-user-agent-control-bots/
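In the spirit of that article (a sketch, not the article's exact config), the map can be inverted so known crawlers are exempt and every other client is keyed by IP. The variable name $limit_humans and the 10r/s rate are assumptions for illustration:

    map $http_user_agent $limit_humans {
        # Known crawlers get an empty key and are exempt here;
        # everyone else is limited per client IP.
        ~*(googlebot|bingbot) '';
        default               $binary_remote_addr;
    }
    limit_req_zone $limit_humans zone=humans:10m rate=10r/s;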
Setting the rejection status code to 429 (Too Many Requests) instead of nginx's default 503 may also serve a useful purpose.
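A minimal sketch of that, using the standard limit_req_status directive (available since nginx 1.3.15):

    server {
        location / {
            limit_req zone=bots burst=5 nodelay;
            # Return 429 instead of the default 503 when a request is rejected.
            limit_req_status 429;
        }
    }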