SEO Best Practices
Max 50,000 URLs per sitemap file or 50 MB uncompressed. Use sitemap index for larger sites.
Priority 1.0 = homepage only. Use 0.8 for key pages, 0.6 for blog posts, 0.4 for tags.
Changefreq is a hint, not a command — search engines may ignore it. Use "weekly" for blogs.
Submit via Google Search Console → Sitemaps, or add to robots.txt: Sitemap: https://…/sitemap.xml
Only canonical URLs — exclude noindex, paginated duplicates, and redirect chains.
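
The robots.txt directive from the last tip is a single line; a minimal example (yourdomain.com is a placeholder):

```text
# The Sitemap directive may appear anywhere in robots.txt,
# independent of any User-agent group.
Sitemap: https://yourdomain.com/sitemap.xml
```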

How to Create an XML Sitemap in 5 Steps

This generator builds valid XML sitemaps conforming to the sitemaps.org protocol supported by Google, Bing and all major search engines.

Set Your Base URL

Enter your website's root URL (e.g. https://example.com) in the Base URL field. All relative paths you add will be prefixed with this base to produce absolute URLs.
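
Under the hood this is ordinary URL resolution; a minimal sketch using the standard URL API (example.com stands in for your site):

```javascript
// Resolve relative paths against the base URL to produce absolute <loc> values.
const base = "https://example.com";
const paths = ["/about", "blog/post-1", "/products?page=2"];

const absolute = paths.map((p) => new URL(p, base).href);
console.log(absolute);
// A leading "/" resolves from the site root; a path without one
// resolves relative to the base URL's path.
```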

Crawl, Add, or Bulk Import URLs

Use the new Crawl Website feature to auto-discover pages on your site, the Add URL form to enter pages individually, or Bulk Add to paste multiple URLs at once.

Configure Priority, Changefreq and Lastmod

Set a priority from 0.0 to 1.0 for each URL. Set a changefreq value and enable lastmod with today's or a custom date.
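
A single <url> entry produced from these settings looks like the sketch below; urlEntry and escapeXml are illustrative helpers, not this tool's actual internals:

```javascript
// escapeXml handles the five characters XML requires escaping inside <loc>.
function escapeXml(s) {
  return s.replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;")
          .replace(/"/g, "&quot;").replace(/'/g, "&apos;");
}

// Build a single <url> entry; lastmod, changefreq, and priority are optional.
function urlEntry({ loc, lastmod, changefreq, priority }) {
  const lines = [`  <url>`, `    <loc>${escapeXml(loc)}</loc>`];
  if (lastmod) lines.push(`    <lastmod>${lastmod}</lastmod>`);
  if (changefreq) lines.push(`    <changefreq>${changefreq}</changefreq>`);
  if (priority != null) lines.push(`    <priority>${priority.toFixed(1)}</priority>`);
  lines.push(`  </url>`);
  return lines.join("\n");
}

console.log(urlEntry({
  loc: "https://example.com/blog/post-1",
  lastmod: "2024-06-01",
  changefreq: "weekly",
  priority: 0.6,
}));
```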

Review, Sort, Edit and Reorder

Use the URL list table to review all entries, filter by keyword, sort A–Z or by priority, drag rows to reorder, and edit any field inline.

Validate, Download and Submit

Switch to the Validate tab to confirm all checks pass. Then click Download sitemap.xml, upload to your site root, and submit in Google Search Console.

XML Sitemap Protocol Quick Reference

| Parameter | Valid Values | Required? | Notes |
| --- | --- | --- | --- |
| XML namespace | http://www.sitemaps.org/schemas/sitemap/0.9 | ✅ Yes | Must be on the <urlset> root element |
| <loc> | Absolute URL starting with https:// | ✅ Yes | One per <url> entry. Max 2,048 characters. UTF-8 encoded. |
| <lastmod> | W3C Datetime: YYYY-MM-DD or ISO 8601 | ❌ Optional | Only update if content substantially changed. |
| <changefreq> | always, hourly, daily, weekly, monthly, yearly, never | ❌ Optional | A hint only — Google and Bing may ignore it. |
| <priority> | 0.0 – 1.0 (decimal) | ❌ Optional | Default 0.5. Relative to other pages on your site only. |
| Max URLs per file | 50,000 | ✅ Hard limit | Exceed this and use a <sitemapindex> file instead. |
| Max file size | 50 MB uncompressed | ✅ Hard limit | Compress with gzip to reduce to ~5–10% of uncompressed size. |
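
The two hard limits above are why larger sites split URLs across multiple files tied together by a <sitemapindex>. A sketch of that splitting logic, with hypothetical helper names rather than this tool's internals:

```javascript
// Split a URL list into sitemap files of at most maxPerFile entries.
function chunkUrls(urls, maxPerFile = 50000) {
  const chunks = [];
  for (let i = 0; i < urls.length; i += maxPerFile) {
    chunks.push(urls.slice(i, i + maxPerFile));
  }
  return chunks;
}

// Emit a <sitemapindex> referencing one child sitemap per chunk.
function sitemapIndex(baseUrl, chunkCount) {
  const entries = Array.from({ length: chunkCount }, (_, i) =>
    `  <sitemap>\n    <loc>${baseUrl}/sitemap-${i + 1}.xml</loc>\n  </sitemap>`
  ).join("\n");
  return `<?xml version="1.0" encoding="UTF-8"?>\n` +
    `<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n` +
    `${entries}\n</sitemapindex>`;
}

// With 3 URLs and a deliberately tiny limit of 2, two files are produced.
const chunks = chunkUrls(["/a", "/b", "/c"], 2);
console.log(sitemapIndex("https://example.com", chunks.length));
```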

About This XML Sitemap Generator

The TechOreo XML Sitemap Generator is a free, browser-based tool that builds valid XML sitemaps conforming to the sitemaps.org protocol version 0.9. All generation happens entirely client-side: no URLs or sitemap data are ever sent to a server.

New: use the Crawl Website feature to automatically discover all pages on your site without manually entering each URL. The crawler follows internal links up to the depth and URL count you specify.

Tool Features

🕷️ Website crawler — auto-discover all pages
✅ Valid sitemaps.org XML output
📋 Bulk URL import (paste multiple lines)
✏️ Inline editing — change any field in-place
↕️ Drag-and-drop row reordering
🔁 Duplicate URL detection and removal
📑 Sitemap index file generator
🎨 Syntax-highlighted XML live preview
📊 Priority and changefreq distribution charts
🤖 robots.txt Sitemap directive snippet
⬆️ Import existing sitemap.xml files
💾 Auto-saves to browser localStorage

Frequently Asked Questions

Common questions about XML sitemaps, the sitemaps.org protocol, and how to use this generator effectively.

What is an XML sitemap?
An XML sitemap is a structured file that lists the URLs of your website, helping search engine crawlers discover and index your pages more efficiently. The sitemaps.org protocol — supported by Google, Bing, Yahoo and Ask — defines the XML format your sitemap must follow.

How many URLs can a single sitemap file contain?
A single XML sitemap file can contain a maximum of 50,000 URLs and must not exceed 50 MB when uncompressed. If your site has more URLs, use a sitemap index file.

What priority values should I use?
Priority is a decimal value between 0.0 and 1.0 (default: 0.5). Recommended: 1.0 homepage; 0.9 key category pages; 0.8 landing pages; 0.6 blog posts; 0.4 tag/archive pages.

What are the valid changefreq values?
The seven valid values are: always, hourly, daily, weekly, monthly, yearly, and never. These are hints — Google may override them based on its own crawl signals.

How does the website crawler work?
The crawler uses allorigins.win as a CORS proxy to fetch each page, then extracts all internal links using the browser's built-in HTML parser. It follows links up to the depth limit you set, skipping external URLs, file downloads, and any patterns you specify to exclude. It works best on publicly accessible sites without Cloudflare or bot protection.
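
The link-extraction step can be approximated in a few lines. This sketch swaps the tool's DOMParser for a simple href regex, so it is a rough stand-in rather than the crawler's actual code:

```javascript
// Extract same-origin page links from fetched HTML — the core of one crawl step.
function extractInternalLinks(html, pageUrl) {
  const origin = new URL(pageUrl).origin;
  const links = new Set();
  for (const match of html.matchAll(/href\s*=\s*["']([^"'#]+)["']/gi)) {
    let resolved;
    try {
      resolved = new URL(match[1], pageUrl); // resolve relative hrefs
    } catch {
      continue; // skip malformed hrefs
    }
    // Keep only internal pages; drop external hosts and non-HTTP schemes.
    if (resolved.origin === origin && /^https?:$/.test(resolved.protocol)) {
      links.add(resolved.origin + resolved.pathname);
    }
  }
  return [...links];
}

const sample = `<a href="/about">About</a>
  <a href="blog/post-1">Post</a>
  <a href="https://other.example.org/x">External</a>
  <a href="mailto:hi@example.com">Mail</a>`;
console.log(extractInternalLinks(sample, "https://example.com/"));
```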

How do I submit my sitemap to Google?
Upload sitemap.xml to your site root, then: (1) Go to Google Search Console → Indexing → Sitemaps and submit the URL. (2) Add Sitemap: https://yourdomain.com/sitemap.xml to your robots.txt file. Both methods are recommended.

Which URLs should I exclude from a sitemap?
Exclude: pages with a noindex tag; redirect URLs; pages blocked by robots.txt; paginated pages beyond page 1; duplicate/thin-content pages; admin, login, cart pages; and any URL returning a non-200 HTTP status code.