Build a clean, SEO-friendly robots.txt file with common directives and automatic sitemap linking.
A minimal example (the sitemap URL is a placeholder) that allows all crawlers and links the sitemap:

User-agent: *
Disallow:
Sitemap: https://example.com/sitemap.xml
Verify HTTPS certificates and expiry dates.
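A minimal sketch of the expiry check using Python's stdlib ssl module; the host name, port, and timeout below are illustrative, and the date parsing assumes the "notAfter" format that ssl.SSLSocket.getpeercert() returns:

```python
import ssl
import socket
from datetime import datetime, timezone

def cert_not_after(host: str, port: int = 443) -> str:
    """Fetch the peer certificate and return its 'notAfter' field."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            # getpeercert() returns a dict like {"notAfter": "Jun  1 12:00:00 2026 GMT", ...}
            return tls.getpeercert()["notAfter"]

def days_remaining(not_after: str) -> int:
    """Days until a 'notAfter' timestamp such as 'Jun  1 12:00:00 2026 GMT'."""
    expires = datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z")
    expires = expires.replace(tzinfo=timezone.utc)
    return (expires - datetime.now(timezone.utc)).days
```

Connecting with create_default_context() also verifies the chain and host name, so an invalid certificate fails the check before expiry is even inspected.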
Inspect status codes and response headers.
Validate robots and meta indexing signals.
Confirm canonical tags and robots directives.
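Extracting both signals can be sketched with the stdlib HTML parser; the class and function names here are illustrative, and the sketch assumes single-valued rel and name attributes:

```python
from html.parser import HTMLParser

class IndexSignals(HTMLParser):
    """Collect <link rel="canonical"> and <meta name="robots"> from a page."""
    def __init__(self):
        super().__init__()
        self.canonical = None
        self.robots = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and (a.get("rel") or "").lower() == "canonical":
            self.canonical = a.get("href")
        elif tag == "meta" and (a.get("name") or "").lower() == "robots":
            self.robots = a.get("content")

def extract_signals(html: str):
    """Return (canonical URL, robots directives) or None for each if absent."""
    parser = IndexSignals()
    parser.feed(html)
    return parser.canonical, parser.robots
```

Checking both together matters: a page with a canonical pointing elsewhere plus a noindex directive sends conflicting signals worth flagging.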
Audit sitemap coverage and freshness.
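A freshness pass can be sketched by reading each <lastmod> in the sitemap; the 90-day threshold is an arbitrary choice, and the sketch assumes date-only (YYYY-MM-DD) lastmod values:

```python
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def stale_urls(sitemap_xml: str, max_age_days: int = 90) -> list[str]:
    """Return <loc> values whose <lastmod> is older than max_age_days."""
    root = ET.fromstring(sitemap_xml)
    now = datetime.now(timezone.utc)
    stale = []
    for url in root.findall("sm:url", NS):
        lastmod = url.findtext("sm:lastmod", namespaces=NS)
        if lastmod:
            modified = datetime.fromisoformat(lastmod[:10]).replace(tzinfo=timezone.utc)
            if (now - modified).days > max_age_days:
                stale.append(url.findtext("sm:loc", namespaces=NS))
    return stale
```

Coverage is the complementary check: diff the sitemap's <loc> set against the URLs found by crawling, in both directions.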
Build XML sitemaps from URL lists.