The robots.txt file tells search engine crawlers which pages of your application they may crawl. Within it you can allow or disallow individual pages or whole directories of the application.
The robots.txt file lives at the root of your site. The simple, static solution in Rails is to put a robots.txt file into the /public folder of your app, but then you can't set the content of this file dynamically.
If you want a different file for your staging and production servers, or want some dynamic routes in your robots.txt, then you need to generate this file with Rails. Make sure you remove or rename robots.txt in the /public folder first, since static files there are served before the router ever sees the request.
# routes.rb
get '/robots.:format', to: 'home#robots'
# app/controllers/home_controller.rb
def robots
  expires_in 6.hours, public: true # set Cache-Control so crawlers can cache the response
  respond_to :text                 # render the default template for the text format
end
# app/views/home/robots.text.erb (the view must live under the controller's directory)
<% if Rails.env.production? %>
User-Agent: *
Allow: /
Disallow: /admin
Sitemap: http://www.yourdomain.com/sitemap.xml
<% else %>
User-Agent: *
Disallow: /
<% end %>
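To see what the template produces outside of a running Rails app, you can evaluate the same branching logic with plain ERB. This is a standalone sketch: the `production` flag and the `render_robots` helper are illustrative stand-ins for `Rails.env.production?` and the Rails view layer, not part of the app itself.

```ruby
require "erb"

# Same structure as the Rails view, with a plain boolean in place of Rails.env.production?
ROBOTS_TEMPLATE = <<~TEMPLATE
  <% if production %>
  User-Agent: *
  Allow: /
  Disallow: /admin
  <% else %>
  User-Agent: *
  Disallow: /
  <% end %>
TEMPLATE

# trim_mode "<>" drops the newlines left behind by the <% ... %> control lines,
# so the rendered file contains only the robots.txt directives.
def render_robots(production:)
  ERB.new(ROBOTS_TEMPLATE, trim_mode: "<>").result_with_hash(production: production)
end

puts render_robots(production: false)
# => User-Agent: *
#    Disallow: /
```

In the non-production branch everything is disallowed, which keeps crawlers away from your staging environment.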