Robots.txt Tester

Test any site's /robots.txt. See which bots are blocked — including AI crawlers (GPTBot, ClaudeBot, PerplexityBot, Google-Extended). Test specific URLs against specific bots.

Why this matters

For SEO

If Googlebot is blocked from a URL, Google can't crawl it — at best the URL appears in results as a bare link with no title or snippet. If Googlebot is blocked from your CSS/JS, Google can't render the page properly. Many sites accidentally block too much via stale Disallow rules carried over from old site structures.

For AEO (the bigger story)

If GPTBot, ClaudeBot, PerplexityBot, or Google-Extended can't crawl your site, you're invisible to ChatGPT, Claude, Perplexity, and Google AI Overviews. Many sites block AI bots without realizing it kills their AI-engine traffic.

How robots.txt rules work

Each User-agent: block applies to the bots it names, and a bot obeys only the single most-specific block that matches its name: a bot with its own named block ignores the wildcard * block entirely. Within the chosen block, the longest matching rule wins, and Allow beats Disallow when the matches are the same length.
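The rule-precedence logic above can be sketched in a few lines of Python. This is a simplified model of RFC 9309-style matching (supporting the * wildcard and $ end-anchor), not this tool's actual implementation:

```python
import re

def match_len(rule_path, url_path):
    """Length of rule_path if it matches url_path, else -1.
    Supports the '*' wildcard and the '$' end-anchor."""
    pattern = re.escape(rule_path).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"
    return len(rule_path) if re.match(pattern, url_path) else -1

def is_allowed(rules, url_path):
    """rules: list of ('allow' | 'disallow', path) from one User-agent block.
    Longest matching rule wins; Allow beats Disallow at equal length.
    A URL matched by no rule is allowed."""
    best_len, allowed = -1, True
    for kind, path in rules:
        if not path:                # empty Disallow: matches nothing
            continue
        n = match_len(path, url_path)
        if n < 0:
            continue
        if n > best_len or (n == best_len and kind == "allow"):
            best_len, allowed = n, (kind == "allow")
    return allowed

rules = [("disallow", "/wp-admin/"), ("allow", "/wp-admin/admin-ajax.php")]
print(is_allowed(rules, "/wp-admin/options.php"))     # False: Disallow matches
print(is_allowed(rules, "/wp-admin/admin-ajax.php"))  # True: longer Allow wins
print(is_allowed(rules, "/blog/post"))                # True: no rule matches
```

Note that the URL and rule paths are illustrative; real crawlers also percent-decode paths and apply the user-agent group selection before this step.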

Common mistakes

  • Disallow: / on the wildcard * block — blocks everything for everyone
  • Confusing Disallow: with no path (matches nothing, i.e. "allow all") with Disallow: / (blocks everything)
  • Blocking /wp-admin/ without an Allow: /wp-admin/admin-ajax.php rule, breaking crawlers' rendering of front-end AJAX features
  • Forgetting to update robots.txt after a migration or redesign, so staging rules like Disallow: / go live by accident
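Several of these mistakes come down to precedence within a block. As an illustration, this block (paths are the standard WordPress ones, shown as an example) disallows the admin area but still lets crawlers fetch the AJAX endpoint, because the longer Allow rule wins:

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```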