# Note (November 2016):
# This config is rather outdated and left here for historical reasons; please refer to prerender.io for the latest setup information.
# Serving static HTML to Googlebot is now considered bad practice, as you should be using the escaped fragment crawling protocol.
server {
    listen 80;
    listen [::]:80;
    server_name yourserver.com;
    root /path/to/your/htdocs;

    error_page 404 /404.html;
    index index.html;

    location ~ /\. {
        deny all;
    }

    location / {
        try_files $uri @prerender;
    }

    location @prerender {
        #proxy_set_header X-Prerender-Token YOUR_TOKEN;

        set $prerender 0;
        if ($http_user_agent ~* "googlebot|yahoo|bingbot|baiduspider|yandex|yeti|yodaobot|gigabot|ia_archiver|facebookexternalhit|twitterbot|developers\.google\.com") {
            set $prerender 1;
        }
        if ($args ~ "_escaped_fragment_|prerender=1") {
            set $prerender 1;
        }
        if ($http_user_agent ~ "Prerender") {
            set $prerender 0;
        }

        if ($prerender = 1) {
            rewrite .* /$scheme://$host$request_uri? break;
            #proxy_pass http://localhost:3000;
            proxy_pass http://service.prerender.io;
        }

        if ($prerender = 0) {
            rewrite .* /index.html break;
        }
    }
}
@dfmcphee Did you get that to work? I'm running into the same problem right now: I have one proxy_pass for my node servers and the real clients requesting the web page, and I want to route the rest to another proxy_pass. Has anyone found a solution for this?
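One way to do this, if I understand the question correctly, is to keep the same $prerender flag as in the gist and simply point the $prerender = 0 branch at your node upstream instead of rewriting to index.html. This is only a minimal sketch: the upstream name node_app, the port 3000, and the shortened bot list are placeholders, so adjust them to your setup.

upstream node_app {
    server 127.0.0.1:3000;
}

server {
    listen 80;
    server_name yourserver.com;

    location / {
        set $prerender 0;
        if ($http_user_agent ~* "googlebot|bingbot|yandex|facebookexternalhit|twitterbot") {
            set $prerender 1;
        }
        if ($http_user_agent ~ "Prerender") {
            set $prerender 0;
        }

        # Bots are proxied to the prerender service, with the original URL
        # encoded into the path the same way the gist above does it.
        if ($prerender = 1) {
            rewrite .* /$scheme://$host$request_uri? break;
            proxy_pass http://service.prerender.io;
        }

        # Everyone else is proxied straight to the normal node backend.
        if ($prerender = 0) {
            proxy_pass http://node_app;
        }
    }
}

Note that proxy_pass inside an if block only works when the target has no URI part, which is why both branches use bare hostnames.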
Hi Brian,
I am running an AngularJS-based website with nginx as the server and without Node.js.
I have used the nginx settings described here: https://gist.github.com/thoop/8165802
I have configured a two-server approach:
The website is running on one server, www.mywebsite.com, on SSL port 443.
Prerender is running on another server, www.myprerenderservice.com, on port 80.
The prerender service is running with PM2 as the process manager.
When testing with the Google crawler, I have also used ?_escaped_fragment_=.
The Google crawler is sometimes able to return the results and sometimes not. When it fails, I kill the Prerender PM2 service, start it again, clear memory on the server, and try again. It then works once but fails again after that. I don't know what is happening. Can you help?
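For a two-server setup like that, the only part of the gist that usually has to change is the proxy_pass target in the $prerender = 1 branch, which should point at your own prerender server instead of service.prerender.io. A minimal sketch, assuming the hostname from your description (the user-agent and _escaped_fragment_ checks are the same as in the gist above):

location @prerender {
    # ... same set $prerender / user-agent / _escaped_fragment_ checks as in the gist above ...

    if ($prerender = 1) {
        rewrite .* /$scheme://$host$request_uri? break;
        # Point at the self-hosted prerender server instead of service.prerender.io
        # (hostname taken from your description; adjust as needed).
        proxy_pass http://www.myprerenderservice.com;
    }

    if ($prerender = 0) {
        rewrite .* /index.html break;
    }
}

If the nginx side is already wired up like this, the intermittent failures you describe may be on the prerender process itself rather than on nginx, given that restarting the PM2 process temporarily fixes it.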
I want to limit prerendering to only some pages. Do you have better settings?
# Only the home page and the info page are forwarded; other pages are not.
if ($document_uri !~ "/index.html|/info.html") {
    set $prerender 0;
}
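A snippet like that can work, but its position matters because nginx evaluates the if blocks in order: the URI check has to come after the checks that set $prerender to 1, otherwise a later check flips the flag back on. A minimal sketch of the placement inside the @prerender location from the gist (the URI pattern is the one from your snippet; verify it matches the URLs bots actually request, and the bot list here is shortened):

location @prerender {
    set $prerender 0;
    if ($http_user_agent ~* "googlebot|bingbot|yandex") {
        set $prerender 1;
    }
    if ($args ~ "_escaped_fragment_|prerender=1") {
        set $prerender 1;
    }

    # Whitelist: only the home page and the info page get prerendered;
    # every other URI falls through to the normal SPA shell.
    if ($document_uri !~ "/index.html|/info.html") {
        set $prerender 0;
    }

    if ($http_user_agent ~ "Prerender") {
        set $prerender 0;
    }

    if ($prerender = 1) {
        rewrite .* /$scheme://$host$request_uri? break;
        proxy_pass http://service.prerender.io;
    }
    if ($prerender = 0) {
        rewrite .* /index.html break;
    }
}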
Is there a way to avoid using prerender.io at all?
Let's say I have generated static files with rendertron and I want to store them in a sub-folder.
How can I tell nginx "if the user agent is googlebot|otherbot, load files from this directory"?
Why would I need a rendertron server running all the time, or prerender.io, if the end result is more or less the same as with static files...?
Finally, I tried something like this:
It seems proxy_pass does not like dynamic values? So I would need to add a resolver:
And deploy a DNS server on my host machine. Am I right?
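On the last point: nginx does need a resolver directive when the proxy_pass target contains variables, but that resolver can be any DNS server you can reach (for example your provider's resolver or a public one such as 8.8.8.8); you do not have to deploy your own DNS server on the host. For the static-snapshot question, one way to avoid proxying entirely is to pick the document root with a map on the user agent, so bots are served out of the snapshot folder and everyone else gets the normal SPA. A minimal sketch, assuming the rendertron output lives in a "snapshots" sub-folder that mirrors your URL structure (paths, folder name, and bot list are placeholders):

map $http_user_agent $spa_root {
    default                                              /path/to/your/htdocs;
    "~*googlebot|bingbot|yandex|baiduspider|twitterbot"  /path/to/your/htdocs/snapshots;
}

server {
    listen 80;
    server_name yourserver.com;

    # Bots get the pre-rendered copies; everyone else gets the live SPA.
    root $spa_root;
    index index.html;

    location / {
        try_files $uri $uri/ /index.html;
    }
}

The trade-off is that the snapshots are only as fresh as the last time you ran rendertron, which is the usual reason people keep a prerender or rendertron service running: it re-renders pages on demand (typically with a cache) instead of relying on a pre-built batch of files.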