Robots.txt Checker
Paste your robots.txt content below to analyze user-agents, crawl rules, and sitemaps, and to detect common issues.
Enter a URL to fetch its robots.txt
Tips & Best Practices
- Don't block resources: Avoid blocking CSS, JS, or image files — search engines need them to render pages.
- Sitemap reference: Always include a Sitemap directive pointing to your XML sitemap.
- Be specific: Use specific paths rather than broad Disallow rules to avoid accidental blocking.
- Crawl budget: For large sites, use robots.txt strategically to focus crawl budget on important pages.
- Test changes: Always test robots.txt changes before deploying — a stray `Disallow: /` can block crawling of your entire site and eventually drop pages from search results.
- Crawl-delay: Most major search engines ignore Crawl-delay. Use server-side rate limiting instead.
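As a sketch of how these rules behave in practice, Python's standard-library `urllib.robotparser` can parse a robots.txt file and check whether a given path is crawlable. The `example.com` URLs and paths below are hypothetical placeholders:

```python
from urllib import robotparser

# A minimal robots.txt following the tips above: specific Disallow
# paths, no blocked CSS/JS, and a Sitemap directive.
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin/
Disallow: /tmp/
Allow: /

Sitemap: https://example.com/sitemap.xml
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Check individual URLs against the "*" user-agent group.
print(rp.can_fetch("*", "https://example.com/admin/login"))  # False
print(rp.can_fetch("*", "https://example.com/products/"))    # True
```

Testing rules this way before deploying is a quick sanity check that a new `Disallow` line blocks only the paths you intend.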