Audit robots.txt directives, discover declared sitemaps, and confirm sitemap.xml is reachable.
Fetches and analyzes robots.txt for blocking directives.
Finds sitemaps declared in robots.txt and probes /sitemap.xml.
Scores overall crawl accessibility and identifies issues.
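A minimal sketch of how these steps could be wired together with only the Python standard library; the site URL, the report fields, and the scoring rule are illustrative assumptions, not the tool's actual interface.

```python
"""Sketch of a crawl-accessibility check using only the standard library."""
from urllib import robotparser
from urllib.error import HTTPError, URLError
from urllib.parse import urljoin
from urllib.request import Request, urlopen


def fetch(url: str, method: str = "GET"):
    """Return (status, body) for a URL, or (None, '') if unreachable."""
    try:
        with urlopen(Request(url, method=method), timeout=10) as resp:
            return resp.status, resp.read().decode("utf-8", "replace")
    except HTTPError as err:
        return err.code, ""
    except URLError:
        return None, ""


def audit_crawl_access(site: str, user_agent: str = "*") -> dict:
    report = {}

    # 1. Fetch robots.txt and check for directives that block the root.
    status, body = fetch(urljoin(site, "/robots.txt"))
    report["robots_txt_found"] = status == 200
    rp = robotparser.RobotFileParser()
    rp.parse(body.splitlines() if status == 200 else [])
    report["root_blocked"] = not rp.can_fetch(user_agent, site)

    # 2. Sitemaps declared via "Sitemap:" lines (site_maps() needs Python 3.8+).
    report["declared_sitemaps"] = rp.site_maps() or []

    # 3. Probe the conventional /sitemap.xml location.
    status, _ = fetch(urljoin(site, "/sitemap.xml"), method="HEAD")
    report["sitemap_xml_reachable"] = status == 200

    # 4. A crude illustrative score: fraction of checks that pass.
    checks = [
        report["robots_txt_found"],
        not report["root_blocked"],
        bool(report["declared_sitemaps"]) or report["sitemap_xml_reachable"],
    ]
    report["score"] = sum(checks) / len(checks)
    return report


if __name__ == "__main__":
    print(audit_crawl_access("https://example.com"))
```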
Robots.txt tells search engine crawlers which pages they may crawl. Sitemaps help them discover all of your content. A misconfiguration in either can hurt indexing.
Without a robots.txt file, search engines will assume they can crawl everything. It's still best to serve one for explicit control.
At least one. Large sites may need multiple sitemaps (the protocol caps each file at 50,000 URLs and 50 MB uncompressed), typically tied together by a sitemap index file.
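As a sketch of that case, the snippet below tells a sitemap index apart from a plain urlset using the standard sitemaps.org namespace; the function name and return shape are illustrative assumptions.

```python
import xml.etree.ElementTree as ET

# Namespace defined by the sitemaps.org protocol, in ElementTree's Clark notation.
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"


def classify_sitemap(xml_text: str):
    """Return ('index', child sitemap URLs) or ('urlset', page URLs)."""
    root = ET.fromstring(xml_text)
    locs = [el.text.strip() for el in root.iter(f"{NS}loc") if el.text]
    if root.tag == f"{NS}sitemapindex":
        # A sitemap index references other sitemaps rather than pages.
        return "index", locs
    # A plain urlset lists page URLs directly.
    return "urlset", locs
```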