You probably don't want Google crawling your development staging app. Here's how to fix that.
$ mkdir config/robots
$ mv public/robots.txt config/robots/robots.production.txt
$ cp config/robots/robots.production.txt config/robots/robots.staging.txt
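Edit the staging copy so it blocks all crawlers. The standard disallow-everything robots.txt is:

```
User-agent: *
Disallow: /
```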
Now edit config/routes.rb to add a route for /robots.txt, and add the controller code.
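The route can be a one-liner; this sketch assumes the action lives on a HomeController, as in the controller code below:

```ruby
# config/routes.rb (Rails 4 style)
Rails.application.routes.draw do
  get "/robots.txt" => "home#robots"
end
```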
Your staging app should be running in the production environment, not the development environment. I'd recommend setting a separate environment variable, e.g. DISABLE_ROBOTS=true, and using the following instead:
class HomeController < ApplicationController
  caches_page :robots

  def robots
    # Serve the restrictive file only when the flag is set on staging.
    robot_type = ENV["DISABLE_ROBOTS"] == "true" ? "staging" : "production"
    robots = File.read(Rails.root.join("config", "robots", "robots.#{robot_type}.txt"))
    render :text => robots, :layout => false, :content_type => "text/plain"
  end
end
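The flag check can be sanity-tested outside Rails; robot_type below is a hypothetical standalone helper mirroring the controller's ternary, not part of the answer's code:

```ruby
# Standalone sketch of the controller's flag check (no Rails needed).
# Note the comparison is against the exact string "true".
def robot_type(env)
  env["DISABLE_ROBOTS"] == "true" ? "staging" : "production"
end

puts robot_type({ "DISABLE_ROBOTS" => "true" }) # staging
puts robot_type({})                             # production
```

Any other value for DISABLE_ROBOTS (including "1") falls through to "production", so set the variable to the literal string "true" on staging.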
In Rails 4, page caching was extracted from the framework, so you'll need the actionpack-page_caching gem for caches_page: https://github.com/rails/actionpack-page_caching
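To pull in the extracted page-caching support, add the gem to your Gemfile and bundle:

```ruby
# Gemfile
gem "actionpack-page_caching"
```

Then run `bundle install` and restart the app.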