robots.txt & Sitemap Validator - Validate Robots Rules & Sitemap XML Online

Free online robots.txt and sitemap.xml validator. Parse and validate robots.txt directives, test URL access rules, and verify sitemap XML structure, URLs, and metadata.

No issues found — robots.txt is valid!

Parsed Rules (5)

  User-agent   Type       Path
  *            disallow   /admin/
  *            disallow   /private/
  *            allow      /public/
  Googlebot    allow      /
  Googlebot    disallow   /no-google/

Declared Sitemaps

  • https://example.com/sitemap.xml

Test URL Access

Blocked — Matched "disallow: /admin/" for User-agent: *

What Is a robots.txt & Sitemap Validator?

A robots.txt & Sitemap Validator is a dual-purpose SEO tool that helps webmasters and developers ensure their robots.txt and sitemap.xml files are correctly formatted and follow search engine standards. The robots.txt validator parses directives like User-agent, Allow, Disallow, Sitemap, and Crawl-delay, checks for syntax errors and common mistakes, and lets you test whether specific URLs would be allowed or blocked for a given crawler. The sitemap XML validator verifies XML structure, checks for required tags like urlset, url, and loc, validates optional metadata such as lastmod, changefreq, and priority, and reports any formatting issues. Both files are critical for controlling how search engines crawl and index your website.
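The URL-access check described above can be reproduced with Python's standard-library robots.txt parser. A minimal sketch, using hypothetical rules that mirror the sample parsed earlier (this is not the tool's own implementation):

```python
# Reproduce a robots.txt URL-access check with the standard library.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /admin/
Allow: /public/

User-agent: Googlebot
Disallow: /no-google/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("*", "/admin/settings"))      # blocked for all agents
print(parser.can_fetch("Googlebot", "/no-google/x")) # blocked for Googlebot
print(parser.can_fetch("*", "/public/page"))         # allowed
```

Note that `urllib.robotparser` applies rules in file order rather than by longest prefix, so results can differ from Google's matching on overlapping Allow/Disallow rules.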

How to Use the robots.txt & Sitemap Validator

  1. Select the 'robots.txt Validator' or 'Sitemap XML Validator' tab depending on which file you want to validate.
  2. For robots.txt: Paste your robots.txt content into the text area. The tool instantly parses directives, displays validation results (errors, warnings), and shows parsed rules in a structured table.
  3. Use the 'Test URL Access' section to enter a URL path and User-agent to check whether a specific URL would be allowed or blocked by your robots.txt rules.
  4. For sitemap XML: Paste your sitemap.xml content into the text area. The tool validates XML structure, checks required and optional tags, validates URL formats and date formats, and displays all parsed entries.
  5. Review any validation issues highlighted with red (errors), yellow (warnings), or blue (info) indicators, and fix them in your actual files.

Common Use Cases

  • Pre-deployment SEO Check — Validate your robots.txt and sitemap.xml before deploying to production to ensure search engines can properly crawl and index your site without encountering errors.
  • Debugging Crawl Issues — When search engines aren't indexing certain pages, use the URL tester to check if your robots.txt rules are accidentally blocking important content.
  • Sitemap Compliance — Verify that your sitemap.xml follows the Sitemaps protocol specification with valid URLs, correct date formats, and proper priority values before submitting to Google Search Console.
  • Migration Validation — After a site migration or redesign, validate that updated robots.txt rules and sitemap entries correctly reflect your new URL structure.

FAQ

What directives are supported in robots.txt validation?
The validator checks User-agent, Allow, Disallow, Sitemap, Crawl-delay, and Host directives. It reports errors for missing colons, empty values, paths not starting with '/', and rules appearing before a User-agent declaration.
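For reference, a small robots.txt exercising each supported directive might look like this (the paths and values are illustrative, not recommendations):

```
User-agent: *
Disallow: /admin/
Allow: /public/
Crawl-delay: 10

User-agent: Googlebot
Disallow: /no-google/

Sitemap: https://example.com/sitemap.xml
Host: example.com
```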
How does the URL test feature work?
Enter a URL path (e.g., /admin/settings) and a User-agent (e.g., Googlebot or *). The tool matches the path against your parsed Allow and Disallow rules using longest-prefix matching, just like search engine crawlers do.
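The longest-prefix rule can be sketched in a few lines. The rule list below is a hypothetical parse result; real crawlers also handle `*` wildcards and `$` anchors, which are omitted here for brevity:

```python
# Pick the matching rule with the longest path prefix; its type decides access.
def check_access(path, rules):
    """Return (allowed, matched_rule) using longest-prefix matching."""
    best = None
    for rule_type, rule_path in rules:
        if path.startswith(rule_path):
            if best is None or len(rule_path) > len(best[1]):
                best = (rule_type, rule_path)
    if best is None:
        return True, None  # no rule matched: allowed by default
    return best[0] == "allow", best

rules = [("disallow", "/admin/"), ("allow", "/admin/public/")]
print(check_access("/admin/settings", rules))  # blocked by /admin/
print(check_access("/admin/public/x", rules))  # allowed: longer prefix wins
```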
What sitemap tags are validated?
The tool checks for the required urlset root element, url entries with loc tags, and validates optional tags including lastmod (W3C Datetime format), changefreq (always, hourly, daily, weekly, monthly, yearly, never), and priority (0.0 to 1.0).
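These checks can be approximated with the standard library alone. A minimal sketch under the Sitemaps-protocol namespace; the sample document and the simplified `lastmod` regex are assumptions, not the validator's exact logic:

```python
# Validate a sitemap's required and optional tags with xml.etree.
import re
import xml.etree.ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
CHANGEFREQS = {"always", "hourly", "daily", "weekly", "monthly", "yearly", "never"}
# Simplified W3C Datetime check: YYYY-MM-DD with an optional time component.
LASTMOD_RE = re.compile(
    r"^\d{4}-\d{2}-\d{2}(T\d{2}:\d{2}(:\d{2}(\.\d+)?)?(Z|[+-]\d{2}:\d{2}))?$"
)

def validate_sitemap(xml_text):
    issues = []
    root = ET.fromstring(xml_text)
    if root.tag != NS + "urlset":
        issues.append("root element must be <urlset>")
    for url in root.findall(NS + "url"):
        if not url.findtext(NS + "loc"):
            issues.append("<url> entry missing <loc>")
        lastmod = url.findtext(NS + "lastmod")
        if lastmod and not LASTMOD_RE.match(lastmod):
            issues.append(f"invalid lastmod: {lastmod}")
        freq = url.findtext(NS + "changefreq")
        if freq and freq not in CHANGEFREQS:
            issues.append(f"invalid changefreq: {freq}")
        prio = url.findtext(NS + "priority")
        if prio and not 0.0 <= float(prio) <= 1.0:
            issues.append(f"priority out of range: {prio}")
    return issues

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>"""
print(validate_sitemap(sample))  # empty list when everything checks out
```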
Does this tool fetch files from my website?
No. This tool only validates content you paste into the text areas. It does not make any HTTP requests or access your server. All processing happens locally in your browser for maximum privacy and speed.
Does it support Sitemap Index files?
Yes. If you paste a sitemap index file (using sitemapindex instead of urlset), the validator detects it automatically, parses the referenced sitemap locations, and displays them in the entries table.
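The index-detection step amounts to checking the root element's name. A minimal sketch with a hypothetical index document (not the validator's own code):

```python
# Detect a sitemap index by its root tag and collect referenced locations.
import xml.etree.ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def parse_entries(xml_text):
    root = ET.fromstring(xml_text)
    if root.tag == NS + "sitemapindex":
        kind, tag = "index", NS + "sitemap"  # entries point at other sitemaps
    else:
        kind, tag = "urlset", NS + "url"     # entries are page URLs
    locs = [e.findtext(NS + "loc") for e in root.findall(tag)]
    return kind, locs

index = """<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://example.com/sitemap-posts.xml</loc></sitemap>
  <sitemap><loc>https://example.com/sitemap-pages.xml</loc></sitemap>
</sitemapindex>"""
print(parse_entries(index))
```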

Related Tools