Robots.txt Generator

Create a custom robots.txt file for your website.

What is robots.txt?

A robots.txt file tells search engine crawlers which pages or files they can or can't request from your site.

Common Use Cases:

  • Block admin pages from search results
  • Prevent duplicate content indexing
  • Suggest a crawl delay (honored by some crawlers such as Bing and Yandex; Googlebot ignores Crawl-delay)
  • Specify sitemap location
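
The use cases above can be combined in a single file. A minimal example, using the placeholder domain example.com and hypothetical paths:

```
# Apply to all crawlers
User-agent: *
# Keep admin pages out of search results
Disallow: /admin/
# Avoid indexing duplicate temporary content
Disallow: /tmp/
# Suggest a delay between requests (ignored by Googlebot)
Crawl-delay: 10

# Point crawlers at the sitemap
Sitemap: https://example.com/sitemap.xml
```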

Important Notes:

  • Must be placed in the site's root directory (e.g. https://example.com/robots.txt)
  • Directives and paths are case-sensitive
  • Not a security measure: well-behaved crawlers follow it, but it does not block access to the listed paths
  • Publicly accessible, so anyone can read which paths you list
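
One way to sanity-check your rules before publishing is Python's built-in robots.txt parser. A short sketch, using rules matching the example above (the example.com URLs are placeholders):

```python
from urllib.robotparser import RobotFileParser

# Rules as they would appear in the generated robots.txt file
rules = """\
User-agent: *
Disallow: /admin/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Check whether a generic crawler ("*") may fetch each URL
print(rp.can_fetch("*", "https://example.com/admin/settings"))  # False
print(rp.can_fetch("*", "https://example.com/blog/post-1"))     # True
```

This is the same parser many Python crawlers use, so it reflects how compliant bots will interpret the file.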