Dynamic Robots.txt file in Rails

The robots.txt file controls which pages of your application get crawled by search engine bots. Within it, you can allow or disallow individual pages or directories of the application.

The robots.txt file is served from the root of your application. A simple, static solution in Rails is to put a robots.txt file into the /public folder of your app, but then you can't set the file's content dynamically.
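For reference, a minimal static file might look like this (the /admin path and sitemap URL are placeholders):

# public/robots.txt
User-Agent: *
Allow: /
Disallow: /admin
Sitemap: http://www.yourdomain.com/sitemap.xml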

If you want a different file for your staging and production servers, or want dynamic content in your robots.txt, you need to generate the file with Rails. Make sure you remove or rename robots.txt in the /public folder first, since Rails serves static files from /public before it consults your routes.

# config/routes.rb
get '/robots.:format', to: 'home#robots'

# app/controllers/home_controller.rb
class HomeController < ApplicationController
  def robots
    respond_to :text
    expires_in 6.hours, public: true
  end
end
# app/views/home/robots.text.erb
<% if Rails.env.production? %>
User-Agent: *
Allow: /
Disallow: /admin
Sitemap: http://www.yourdomain.com/sitemap.xml
<% else %>
User-Agent: *
Disallow: /
<% end %>
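Since the file is now rendered through ERB, you can also derive values at request time instead of hardcoding them. As a sketch, the Sitemap line in the view above could be built from the request's base URL:

Sitemap: <%= request.base_url %>/sitemap.xml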
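To verify the endpoint, an integration test helps. This is a minimal sketch assuming a standard Rails minitest setup; the file name and assertions are illustrative:

# test/integration/robots_test.rb
require "test_helper"

class RobotsTest < ActionDispatch::IntegrationTest
  test "serves robots.txt as plain text" do
    get "/robots.txt"

    assert_response :success
    # response.media_type is available in Rails 6+; use response.content_type on older versions
    assert_equal "text/plain", response.media_type
    # Both the production and non-production branches of the view emit this line
    assert_includes response.body, "User-Agent: *"
  end
end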