Nginx config to block crawlers via robots.txt
## Here's the way to have Nginx return a robots.txt file that disallows all crawling by bots.
## This is useful for development and private sites.
location = /robots.txt {
    add_header Content-Type text/plain;
    return 200 "User-agent: *\nDisallow: /\n";
}
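
A quick way to verify the block after reloading Nginx (a sketch, assuming the server answers on localhost and the snippet sits inside the relevant server block):

# Test the config and reload, then request the generated robots.txt
nginx -t && nginx -s reload
curl -i http://localhost/robots.txt
# Expected: HTTP/1.1 200 with Content-Type: text/plain and the body
#   User-agent: *
#   Disallow: /

Because the response is produced by `return`, no robots.txt file needs to exist on disk, so the rule can't be overridden by a stray file in the web root.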