Robots.txt Validator

Validate your robots.txt file for syntax errors, warnings, and SEO issues. Paste content or fetch from any domain.

Validation Results

Parsed Rules

User-Agent | Directive | Value

Test a URL

Enter a path and select a user-agent to check if it would be blocked or allowed.
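The same allow/block check can be sketched offline with Python's standard-library urllib.robotparser. Note that Python applies rules in file order rather than Google's longest-match precedence, so results can differ on overlapping Allow/Disallow rules; the robots.txt content below is purely illustrative.

```python
from urllib.robotparser import RobotFileParser

# A small robots.txt used purely for illustration.
robots_txt = """\
User-agent: *
Disallow: /admin/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# can_fetch(user_agent, url) accepts a path or a full URL.
print(rp.can_fetch("Googlebot", "/admin/settings"))  # False (blocked)
print(rp.can_fetch("Googlebot", "/blog/post"))       # True (allowed)
```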


Robots.txt Best Practices

The robots.txt file tells search engine crawlers which pages or sections of your site they can or cannot request. It lives at the root of your domain (e.g. https://example.com/robots.txt) and follows a simple text-based protocol.
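A minimal, well-formed robots.txt illustrating the protocol (the paths and sitemap URL are placeholders):

```text
# Allow all crawlers, but keep them out of internal areas
User-agent: *
Disallow: /admin/
Disallow: /cart/
Allow: /admin/public-docs/

# Stricter rules for one specific crawler
User-agent: ExampleBot
Disallow: /

Sitemap: https://example.com/sitemap.xml
```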

Valid Directives

- User-agent: names the crawler the following rules apply to (* matches all crawlers)
- Disallow: a path prefix the crawler should not request
- Allow: an exception that re-permits a path inside a disallowed section
- Sitemap: the absolute URL of an XML sitemap (may appear anywhere in the file)
- Crawl-delay: seconds to wait between requests (honored by Bing, ignored by Google)

Common Mistakes to Avoid

- Blocking CSS or JavaScript files that search engines need to render your pages
- Using robots.txt to hide sensitive content: the file is publicly readable, and blocked URLs can still be indexed if other sites link to them
- Placing the file anywhere other than the domain root, or serving it with a non-200 status code
- Forgetting that paths are case-sensitive: Disallow: /Admin/ does not block /admin/

Wildcard Patterns

Google and Bing support * (match any sequence) and $ (end of URL) in path patterns. For example, Disallow: /*.pdf$ blocks all PDF files.
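The documented semantics of these wildcards can be approximated by translating each pattern into a regular expression. This is an illustrative sketch, not the matcher search engines actually run; the function names are my own.

```python
import re

def robots_pattern_to_regex(pattern: str) -> re.Pattern:
    # Escape regex metacharacters, then restore robots.txt wildcards:
    # '*' matches any character sequence, trailing '$' anchors the URL end.
    anchored = pattern.endswith("$")
    body = pattern[:-1] if anchored else pattern
    regex = re.escape(body).replace(r"\*", ".*")
    if anchored:
        regex += "$"
    return re.compile(regex)

def is_blocked(path: str, disallow_pattern: str) -> bool:
    # Robots.txt rules match from the start of the URL path.
    return robots_pattern_to_regex(disallow_pattern).match(path) is not None

print(is_blocked("/files/report.pdf", "/*.pdf$"))      # True: ends in .pdf
print(is_blocked("/files/report.pdf?x=1", "/*.pdf$"))  # False: '$' requires the URL to end there
print(is_blocked("/private/page", "/private/"))        # True: plain prefix match
```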
