Robots.txt Tester & Analyzer
Test whether specific URLs are blocked by robots.txt and analyze your configuration for common issues
URL Testing
Check if specific pages are accessible to search engine bots
File Analysis
Analyze the entire robots.txt file for common issues and best practices
Security Check
Identify security vulnerabilities in your robots.txt configuration
Website Configuration
Example domains: example.com, wordpress.org, github.com
Default path: /
Testing Options
Common Test Paths: /wp-admin, /login, /search
Test Results
Ready to Test
Enter a website URL and click "Test URL & Analyze" to check robots.txt configuration
Test common paths like /wp-admin, /login, or /search
Check different user-agents for specific bot behavior
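The two checks above can be sketched with Python's standard-library robots.txt parser. The rules and paths below are illustrative, not taken from any real site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content used for illustration
robots_txt = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /login

User-agent: Googlebot
Disallow: /search
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Test common paths against different user-agents
for agent in ("*", "Googlebot"):
    for path in ("/wp-admin/", "/login", "/search", "/public/page.html"):
        verdict = "allowed" if rp.can_fetch(agent, path) else "blocked"
        print(f"{agent:10} {path:20} {verdict}")
```

Note that when a specific user-agent group exists (here, Googlebot), the parser applies only that group, so Googlebot is not bound by the `User-agent: *` rules.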
What is Robots.txt?
Robots.txt is a plain-text file that tells search engine crawlers which pages or files they may or may not request from your site. It must be placed in the root directory of your website (e.g., https://example.com/robots.txt).
Allow example:
User-agent: *
Allow: /public/

Disallow example:
User-agent: *
Disallow: /private/
Common Issues
- Blocking CSS/JS files (affects page rendering)
- Missing sitemap reference
- Incorrect syntax or formatting
- Blocking important content accidentally
- No robots.txt file found
Best Practices
- Place robots.txt in root directory
- Include sitemap location
- Use correct syntax and formatting
- Test with Google Search Console
- Regularly review and update
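Putting those practices together, a minimal robots.txt might look like this (the domain and paths are illustrative):

```
User-agent: *
Disallow: /private/
Allow: /public/

Sitemap: https://example.com/sitemap.xml
```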
