
Sitemap Generator

Crawl any website and instantly generate a standards-compliant XML sitemap. Set crawl depth, max pages, changefreq, and priority — then download your sitemap.xml in one click.

✅ Sitemap Checklist

  • 🗺️ Submit your sitemap to Google Search Console and Bing Webmaster Tools.
  • 🤖 Add Sitemap: https://yoursite.com/sitemap.xml to your robots.txt.
  • 🔄 Regenerate and re-submit your sitemap whenever you publish new pages.
  • 🚫 Exclude low-value pages like tag archives, login pages, and duplicate content.
  • 📏 A single sitemap can contain at most 50,000 URLs and be at most 50 MB uncompressed.
💡 SEO Tips

  • Priority 1.0 should only be used for your homepage and most critical pages.
  • 📅 lastmod helps search engines prioritise crawling recently updated pages first.
  • 🔗 Canonical URLs only — never include pages with a canonical tag pointing elsewhere.
  • 📦 For large sites, use a sitemap index file to reference multiple sitemaps.
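The sitemap index mentioned in the last tip is itself a small XML file that lists the child sitemaps. A sketch with two child sitemaps (the domain and filenames are placeholders) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://yoursite.com/sitemap-pages.xml</loc>
    <lastmod>2024-01-15</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://yoursite.com/sitemap-posts.xml</loc>
  </sitemap>
</sitemapindex>
```

You submit the index file itself to Search Console; the child sitemaps are discovered through it.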

What is an XML Sitemap?

An XML sitemap is a file that lists all the important pages on your website, helping search engines like Google and Bing discover, crawl, and index your content efficiently. Rather than relying on crawlers to find every page by following links, a sitemap provides a direct, structured list — making it especially important for large sites, new sites with few inbound links, or sites with deep page hierarchies that crawlers might not reach on their own.
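To make the format concrete, here is a minimal sitemap.xml with a single URL entry (the URL and date are placeholders; changefreq and priority are optional fields):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yoursite.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```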

This tool crawls your website from a given starting URL, following internal links up to your specified depth, and produces a fully standards-compliant sitemap.xml file ready to upload to your server and submit to search consoles.
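The crawl-and-generate step can be sketched in a few lines of Python. This is a simplified illustration, not the tool's actual implementation: it runs a depth-limited breadth-first search over an in-memory link graph instead of making live HTTP requests, and the URLs and depth limit are made up for the example.

```python
from collections import deque

def crawl(start, links, max_depth):
    """Breadth-first, depth-limited crawl of an internal-link graph.

    `links` maps each URL to the internal URLs it links to; a real
    crawler would fetch each page and extract <a href> targets instead.
    Returns a dict of URL -> depth at which it was discovered.
    """
    seen = {start: 0}
    queue = deque([start])
    while queue:
        url = queue.popleft()
        depth = seen[url]
        if depth >= max_depth:
            continue  # don't follow links beyond the depth limit
        for target in links.get(url, []):
            if target not in seen:
                seen[target] = depth + 1
                queue.append(target)
    return seen

def to_sitemap(urls):
    """Render crawled URLs as a minimal sitemap.xml string."""
    entries = "\n".join(
        f"  <url><loc>{u}</loc></url>" for u in sorted(urls)
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n</urlset>"
    )
```

A page linked three clicks from the start URL is only included if the depth limit is at least 3, which is why raising the depth setting can surface more pages.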

Why Does Your Website Need a Sitemap?

  • Faster indexing — New pages get discovered and indexed by Google significantly faster when included in a submitted sitemap.
  • Deep page coverage — Pages buried 4–5 clicks from the homepage may never be found by crawlers without a sitemap.
  • lastmod signals — The lastmod field tells search engines which pages have changed recently, prioritising re-crawl of updated content.
  • International sites — Sitemaps can be combined with hreflang annotations to handle multi-language and multi-region sites.
  • Rich results — Image and video sitemaps help Google index media content that might otherwise be missed.

How to Use Your Generated Sitemap

  • Download the sitemap.xml file and upload it to your website's root directory so it's accessible at https://yoursite.com/sitemap.xml.
  • Add the sitemap URL to your robots.txt file: Sitemap: https://yoursite.com/sitemap.xml.
  • Submit the sitemap URL in Google Search Console under Indexing → Sitemaps.
  • Submit the sitemap URL in Bing Webmaster Tools under Sitemaps.
  • Re-generate and re-submit whenever you add, remove, or significantly update pages.
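The robots.txt step above assumes a file at your site root; a minimal robots.txt referencing the sitemap (the domain is a placeholder) would read:

```text
User-agent: *
Allow: /

Sitemap: https://yoursite.com/sitemap.xml
```

The Sitemap directive can appear anywhere in the file and is independent of the User-agent groups.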

Frequently Asked Questions

How many URLs can a single sitemap contain?
The Sitemaps protocol allows up to 50,000 URLs per sitemap file with a maximum uncompressed file size of 50 MB. For sites exceeding this, use a sitemap index file that references multiple individual sitemaps. As a best practice, only include pages you want indexed — exclude utility pages, admin pages, duplicate content, and pages with noindex tags.
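Splitting a large URL list to stay under the 50,000-URL limit is straightforward; a sketch in Python (the limit constant mirrors the protocol, everything else is illustrative):

```python
def chunk_urls(urls, limit=50_000):
    """Split a URL list into sitemap-sized chunks of at most `limit` URLs."""
    return [urls[i:i + limit] for i in range(0, len(urls), limit)]

# Each chunk becomes one sitemap file; all of them are then
# referenced from a single sitemap index file.
```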
Do the changefreq and priority fields still matter?
Google has officially stated that it largely ignores the changefreq and priority fields, preferring its own crawl scheduling algorithms. However, Bing and other search engines may still use them as hints, so it remains good practice to set them accurately. The lastmod field is far more valuable and is actively used by Google to prioritise re-crawling recently updated pages.
Should noindex pages be included in a sitemap?
No — including noindex pages in your sitemap sends conflicting signals to search engines. If a page has a noindex meta tag, it means you don't want it in the index, so don't list it in your sitemap. This tool excludes noindex pages by default; the "Include noindex pages" option exists only so you can audit which pages have conflicting directives.
Why are some of my pages missing from the generated sitemap?
Pages may be missing if they are beyond the crawl depth limit, not linked from any crawled page, blocked by robots.txt or a noindex tag, behind a login/authentication wall, dynamically loaded by JavaScript (this crawler only reads server-rendered HTML), or returned an HTTP error. For comprehensive coverage on large JavaScript-heavy sites, consider a dedicated sitemap plugin for your CMS.
How often should I regenerate my sitemap?
Regenerate and resubmit your sitemap whenever you publish new pages, delete existing pages, significantly update important content, or change your URL structure. For blogs or news sites with frequent publishing, daily regeneration is ideal; for static brochure sites, monthly is usually sufficient. Many CMS platforms (WordPress with Yoast, for example) can auto-generate the sitemap and notify Google when content changes.