Sitemap Configuration

Help AI crawlers find every important page.

What This Check Does

Your XML sitemap is a roadmap for AI crawlers. It tells them which pages exist, when they were last updated, and how important they are. A well-configured sitemap referenced in robots.txt helps ensure comprehensive AI indexing.
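A minimal sitemap showing these fields might look like the following (the URLs are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-05-01</lastmod>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://example.com/pricing</loc>
    <lastmod>2024-04-18</lastmod>
    <priority>0.8</priority>
  </url>
</urlset>
```

Each `<url>` entry pairs a canonical location with an optional last-modified date and relative priority, which is exactly the information crawlers use to schedule revisits.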

Why It Matters for AI Search

AI search engines have crawl budgets just like traditional ones. A sitemap helps AI crawlers prioritize your most important content and discover pages that might not be reachable through internal links alone. Without a sitemap, AI models may miss critical pages.

How We Evaluate

  1. Check for sitemap.xml at the site root and for a Sitemap directive in robots.txt
  2. Parse the sitemap for valid URLs and count total entries
  3. Verify lastmod dates are present and recent
  4. Check for sitemap index files on large sites
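The steps above can be sketched in Python. This is an illustrative outline, not the scanner's actual implementation: it assumes the robots.txt and sitemap contents have already been fetched, and names like `evaluate_sitemap` are hypothetical.

```python
import re
import xml.etree.ElementTree as ET
from datetime import datetime, timedelta

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemaps_from_robots(robots_txt: str) -> list[str]:
    """Step 1: collect Sitemap: directives declared in robots.txt."""
    return re.findall(r"(?im)^sitemap:\s*(\S+)", robots_txt)

def evaluate_sitemap(xml_text: str, max_age_days: int = 180) -> dict:
    """Steps 2-4: parse URLs, check lastmod freshness, detect index files."""
    root = ET.fromstring(xml_text)
    is_index = root.tag.endswith("sitemapindex")  # step 4: sitemap index file?
    urls = root.findall("sm:url", NS)             # step 2: valid URL entries
    lastmods = [u.findtext("sm:lastmod", default="", namespaces=NS) for u in urls]
    cutoff = datetime.now() - timedelta(days=max_age_days)
    recent = sum(                                  # step 3: recent lastmod dates
        1 for lm in lastmods
        if lm and datetime.fromisoformat(lm[:10]) >= cutoff
    )
    return {
        "is_index": is_index,
        "url_count": len(urls),
        "with_lastmod": sum(1 for lm in lastmods if lm),
        "recent_lastmod": recent,
    }
```

Keeping the parsing pure (strings in, dict out) makes the checks easy to unit-test without network access.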

Optimization Tips

  • Generate and maintain an XML sitemap with all important pages
  • Reference your sitemap in robots.txt with a Sitemap: directive
  • Include lastmod dates that reflect actual content changes
  • Remove noindex pages and redirects from your sitemap
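Referencing the sitemap from robots.txt takes one line. A typical robots.txt implementing the second tip might look like this (the URL is a placeholder; it must be absolute):

```
User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
```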

Test This Check on Your Site

Run a free GEO scan to see how your site performs on Sitemap Configuration.

Run Free Scan