Robots.txt check
Check if a robots.txt file exists and view the most important crawl rules for search engines.
What is a robots.txt file?
A robots.txt file instructs search engines on which pages may or may not be crawled. Errors in this file can have serious SEO consequences.
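For example, a minimal robots.txt that blocks one directory but re-allows a subfolder could look like this (the paths are illustrative):

```text
User-agent: *
Disallow: /admin/
Allow: /admin/public/
```

A single stray `Disallow: /` in such a file would block the entire site for all crawlers, which is exactly the kind of error this check helps you catch.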
What does this tool do?
This tool:
- checks if robots.txt is accessible
- displays the contents of the file
- marks basic rules such as User-agent, Disallow, and Allow
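Such a check can be sketched in a few lines. The function below is an illustrative sketch, not the tool's actual code: it extracts only the basic rule lines the tool highlights (User-agent, Disallow, Allow).

```python
def parse_rules(robots_txt: str) -> list[tuple[str, str]]:
    """Extract (field, value) pairs for the basic robots.txt rules.

    Illustrative sketch: only User-agent, Disallow, and Allow are kept;
    other fields (Sitemap, Crawl-delay, ...) are ignored.
    """
    rules = []
    for line in robots_txt.splitlines():
        # Strip comments ("#" to end of line) and surrounding whitespace.
        line = line.split("#", 1)[0].strip()
        if ":" not in line:
            continue
        field, _, value = line.partition(":")
        field = field.strip().lower()
        if field in ("user-agent", "disallow", "allow"):
            rules.append((field, value.strip()))
    return rules
```

Feeding it a typical file yields the rule pairs in order, so a front end can render them as a simple table.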
This is a basic check, intended for quick validation.
When is this useful?
- checking whether your site is accidentally blocked
- a quick SEO scan
- go-live or migration checks
- debugging indexation problems
Significant limitation
Browsers sometimes block the retrieval of robots.txt due to CORS restrictions. In production, you can resolve this via:
- server-side fetch (PHP)
- proxy endpoint
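A server-side fetch avoids the CORS restriction because the request is made by your server rather than the visitor's browser. The page mentions PHP; the same idea as a hedged Python sketch (the helper names are assumptions, not an existing API) looks like this:

```python
from urllib.parse import urlsplit, urlunsplit
from urllib.request import urlopen

def robots_url(site: str) -> str:
    """Normalize a site URL to its robots.txt location (assumed helper)."""
    # Default to https:// when no scheme is given.
    parts = urlsplit(site if "//" in site else "https://" + site)
    return urlunsplit((parts.scheme or "https", parts.netloc, "/robots.txt", "", ""))

def fetch_robots(site: str, timeout: float = 5.0) -> str:
    """Fetch robots.txt server-side, where CORS does not apply."""
    with urlopen(robots_url(site), timeout=timeout) as resp:
        return resp.read().decode("utf-8", errors="replace")
```

A proxy endpoint works the same way: the browser calls your own origin, and the server performs the cross-origin fetch on its behalf.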
The check itself works correctly; only the in-browser retrieval can be blocked.