Robots.txt Analyzer
Fetch, analyze, and test robots.txt rules for any website
About Robots.txt
📄 What is Robots.txt?
- Text file in website root
- Tells crawlers which paths they may crawl
- De facto web standard since 1994 (formalized as RFC 9309 in 2022)
- Not a security mechanism: disallowed pages stay publicly reachable
🔍 Common Directives
- User-agent: Target specific bots
- Disallow: Block paths
- Allow: Permit paths (overrides a broader Disallow)
- Sitemap: Declare sitemap URL
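The directives above combine into a plain-text file like this (paths and sitemap URL are illustrative, not from any real site):

```
# Rules for all crawlers
User-agent: *
Allow: /admin/public/
Disallow: /admin/

# Stricter rules for one specific bot
User-agent: BadBot
Disallow: /

Sitemap: https://example.com/sitemap.xml
```

Each `User-agent` line starts a group; the `Allow`/`Disallow` lines beneath it apply only to that group, while `Sitemap` is a standalone directive.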
⚠️ SEO Tips
- Don't block CSS/JS files
- Check for accidental blocks
- Prefer specific rules over broad wildcards
- Test changes before deploying
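One way to test rules before deploying is Python's standard-library `urllib.robotparser`. The rules and bot name below are illustrative; note that CPython evaluates rules in file order (first match wins), which is why the `Allow` line precedes the broader `Disallow`:

```python
from urllib.robotparser import RobotFileParser

# Illustrative rules; parse() accepts lines directly, so no network fetch
# is needed to test a draft robots.txt.
rules = [
    "User-agent: *",
    "Allow: /admin/public/",
    "Disallow: /admin/",
]

rp = RobotFileParser()
rp.parse(rules)

# Check which URLs a hypothetical crawler ("MyBot") may fetch.
print(rp.can_fetch("MyBot", "https://example.com/admin/public/page"))  # True
print(rp.can_fetch("MyBot", "https://example.com/admin/secret"))       # False
print(rp.can_fetch("MyBot", "https://example.com/blog/post"))          # True
```

For a live site, `rp.set_url("https://example.com/robots.txt")` followed by `rp.read()` fetches and parses the deployed file instead.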