Generator
Generate a robots.txt file with custom rules.
Validator
Validate and analyze any robots.txt file.
Build your robots.txt rules below, then copy the output.
Allow
Disallow
User-agent: *
Allow: /
Disallow:
Generate a new robots.txt or validate an existing one in three steps.
Choose generator or validator
Use the Generator to create a new robots.txt file, or the Validator to check an existing one.
Add your rules
In the generator, add user-agents and allow/disallow rules. In the validator, paste your robots.txt content or enter your site URL.
Copy or verify
Copy the generated robots.txt and deploy it to your site root, or review the validation results to fix any issues.
Blocking admin pages
Disallow crawlers from /admin, /login, and other private paths. Note that robots.txt only discourages crawling; use noindex or authentication if a page must stay out of search results entirely.
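A minimal sketch of such rules (the paths are placeholders; adjust to your site):

```
User-agent: *
Disallow: /admin
Disallow: /login
```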
Preventing duplicate content
Block crawlers from fetching URL-parameter variants, print pages, or other duplicate versions of your content.
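For example, rules like the following (paths and parameter names are hypothetical) block parameterized and print variants. The `*` wildcard is not part of the original robots.txt standard but is honored by major search engines:

```
User-agent: *
Disallow: /*?sort=
Disallow: /print/
```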
Protecting staging sites
Add a Disallow: / rule to staging and development sites to prevent them from being indexed.
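The blanket rule for a staging site looks like this:

```
User-agent: *
Disallow: /
```

Remember to remove or relax this rule when the site goes to production.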
Controlling crawl budget
For large sites, block low-value pages from being crawled so search engines spend their crawl budget on important content.
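A sketch of what this might look like, assuming internal search and tag-archive pages are the low-value sections (paths are placeholders):

```
User-agent: *
Disallow: /search/
Disallow: /tag/
```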
Pointing to your sitemap
Add a Sitemap: directive to your robots.txt so search engines can find your sitemap automatically.
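The directive takes an absolute URL and can appear anywhere in the file (example.com is a placeholder):

```
Sitemap: https://example.com/sitemap.xml
```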
Validating existing rules
Test whether a specific URL is blocked or allowed by your current robots.txt rules before making changes.
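The same kind of check can be scripted. A minimal sketch using Python's standard `urllib.robotparser` (the rules and URLs below are made-up examples, not output of this tool):

```python
# Check whether specific URLs are allowed or blocked by a robots.txt
# ruleset, using the standard-library robots.txt parser.
from urllib.robotparser import RobotFileParser

# Example ruleset: block /admin/, allow everything else.
rules = """User-agent: *
Disallow: /admin/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# can_fetch(user_agent, url) returns True if the URL may be crawled.
print(parser.can_fetch("*", "https://example.com/admin/settings"))  # False
print(parser.can_fetch("*", "https://example.com/blog/post"))       # True
```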