Check if your sitemap exists and is properly configured for AI crawlers. Get insights on coverage, freshness, and optimization.
Sitemaps help AI crawlers discover and understand all your content systematically. A well-configured sitemap ensures nothing gets missed.
Sitemaps ensure AI crawlers find all your pages, including those not easily discovered through internal links.
Last modified dates help AI prioritize recent content and understand when pages are updated.
Priority and change frequency signals guide crawlers to focus on your most important content.
XML format provides machine-readable structure that AI can process efficiently.
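To illustrate that machine-readability, here is a minimal sketch of how a consumer could pull each entry out of a sitemap using Python's standard `xml.etree.ElementTree`. The `parse_sitemap` helper and its return shape are illustrative, not part of any real crawler's API:

```python
import xml.etree.ElementTree as ET

# The sitemaps.org protocol namespace; every element in a sitemap lives here.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def parse_sitemap(xml_text: str) -> list[dict]:
    """Return one dict per <url> entry with whichever fields are present."""
    root = ET.fromstring(xml_text)
    entries = []
    for url in root.findall("sm:url", NS):
        entry = {}
        for field in ("loc", "lastmod", "changefreq", "priority"):
            el = url.find(f"sm:{field}", NS)
            if el is not None and el.text:
                entry[field] = el.text.strip()
        entries.append(entry)
    return entries
```

Because the format is a fixed, namespaced schema, the same few lines work on any conformant sitemap regardless of which site produced it.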
Create a valid XML sitemap with all your important URLs. Place it at yourdomain.com/sitemap.xml.
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<url>
<loc>https://yoursite.com/</loc>
<lastmod>2025-12-20</lastmod>
<changefreq>weekly</changefreq>
<priority>1.0</priority>
</url>
<url>
<loc>https://yoursite.com/about</loc>
<lastmod>2025-12-19</lastmod>
<changefreq>monthly</changefreq>
<priority>0.8</priority>
</url>
</urlset>
Reference your sitemap in robots.txt so crawlers can easily discover it.
User-agent: *
Allow: /
Sitemap: https://yoursite.com/sitemap.xml
💡 Tip: This makes your sitemap discoverable without crawlers having to guess the location.
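Discovery via robots.txt works because the `Sitemap:` directive is trivial to scan for. As a sketch (the `extract_sitemaps` function is hypothetical, for illustration), a crawler that has already fetched the robots.txt body could collect every declared sitemap like this:

```python
def extract_sitemaps(robots_txt: str) -> list[str]:
    """Collect the URL from every Sitemap: directive in a robots.txt body."""
    sitemaps = []
    for line in robots_txt.splitlines():
        # The directive is case-insensitive and may appear multiple times.
        if line.lower().startswith("sitemap:"):
            sitemaps.append(line.split(":", 1)[1].strip())
    return sitemaps
```

Python's standard library also offers `urllib.robotparser.RobotFileParser.site_maps()` for the same job when fetching robots.txt over the network.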
Include lastmod, changefreq, and priority in each URL entry so crawlers can gauge freshness and relative importance.
If you have more than 50,000 URLs (the sitemap protocol's per-file limit), split them across multiple sitemaps and organize those with a sitemap index file.
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<sitemap>
<loc>https://yoursite.com/sitemap-posts.xml</loc>
<lastmod>2025-12-20</lastmod>
</sitemap>
<sitemap>
<loc>https://yoursite.com/sitemap-pages.xml</loc>
<lastmod>2025-12-19</lastmod>
</sitemap>
</sitemapindex>
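Generating the split is mechanical: chunk the URL list at the 50,000-entry limit, write one sitemap per chunk, and list each file in the index. A minimal sketch using Python's standard `xml.etree.ElementTree` (the helper names and the example URLs are illustrative):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
MAX_URLS = 50_000  # per-file limit from the sitemap protocol

def chunk_urls(urls: list[str], size: int = MAX_URLS):
    """Yield successive slices of at most `size` URLs."""
    for i in range(0, len(urls), size):
        yield urls[i:i + size]

def build_sitemap(urls: list[str]) -> str:
    """Serialize one <urlset> sitemap for a single chunk of URLs."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for u in urls:
        ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = u
    return ET.tostring(urlset, encoding="unicode")

def build_index(sitemap_urls: list[str], lastmod: str) -> str:
    """Serialize the <sitemapindex> that points at each chunk's sitemap."""
    index = ET.Element("sitemapindex", xmlns=SITEMAP_NS)
    for s in sitemap_urls:
        sm = ET.SubElement(index, "sitemap")
        ET.SubElement(sm, "loc").text = s
        ET.SubElement(sm, "lastmod").text = lastmod
    return ET.tostring(index, encoding="unicode")
```

With 120,000 URLs this yields three sitemap files plus one small index, and only the index needs to be referenced from robots.txt.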