The Sitemaps protocol allows a webmaster to inform search engines about URLs on a website that are available for crawling. A Sitemap is an XML file that lists the URLs for a site. It allows webmasters to include additional information about each URL: when it was last updated, how often it changes, and how important it is in relation to other URLs in the site. This allows search engines to crawl the site more intelligently. Sitemaps are a URL inclusion protocol and complement robots.txt, a URL exclusion protocol.
Sitemaps are particularly beneficial on websites where some areas of the site are not available through the browsable interface, or where webmasters use rich Ajax or Flash content that search engines do not normally process.
The webmaster can generate a Sitemap containing all accessible URLs on the site and submit it to search engines. Since Google, MSN, Yahoo!, and Ask now support the same protocol, a single Sitemap lets the major search engines receive up-to-date page information.
Sitemaps supplement and do not replace the existing crawl-based mechanisms that search engines already use to discover URLs. By submitting Sitemaps to a search engine, a webmaster is only helping that engine's crawlers to do a better job of crawling their site(s). Using this protocol does not guarantee that web pages will be included in search indexes, nor does it influence the way that pages are ranked in search results.
The Sitemaps protocol is based on ideas from "Crawler-friendly Web Servers".
The Sitemap Protocol format consists of XML tags. The file itself must be UTF-8 encoded. (Sitemaps can also be just a plain text list of URLs. They can also be compressed in .gz format.)
A sample Sitemap that contains just one URL and uses all optional tags is shown below:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://w3c-at.de</loc>
    <lastmod>2006-11-18</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
If a Sitemap is submitted directly to a search engine (pinged), the engine returns status information and any processing errors. The details involved with submission vary between search engines. The location of the Sitemap can also be included in the robots.txt file by adding the following line to robots.txt:

Sitemap: <sitemap_location>

where <sitemap_location> is the complete URL of the Sitemap, such as http://www.example.com/sitemap.xml.
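As a sketch of how a ping is formed, the snippet below builds Google's submission URL for a hypothetical Sitemap location (the site URL is an assumption for illustration); other engines accept a similarly formed request at their own submission URLs.

```python
import urllib.parse
import urllib.request

# Hypothetical Sitemap location; replace with the site's actual Sitemap URL.
sitemap_url = "http://www.example.com/sitemap.xml"

# The Sitemap URL travels as a query parameter, so it must be percent-encoded.
ping_url = ("http://www.google.com/webmasters/tools/ping?sitemap="
            + urllib.parse.quote(sitemap_url, safe=""))

print(ping_url)
# → http://www.google.com/webmasters/tools/ping?sitemap=http%3A%2F%2Fwww.example.com%2Fsitemap.xml

# Issuing the request performs the ping; the engine responds with status information:
# urllib.request.urlopen(ping_url)
```

Percent-encoding the parameter matters because the Sitemap URL itself contains `:` and `/` characters that would otherwise be misread as part of the ping URL.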
The following table lists the Sitemap submission URLs and help pages for several major search engines:

Search engine: Google
Submission URL: http://www.google.com/webmasters/tools/ping?sitemap=
Help page: "How do I resubmit my Sitemap once it has changed?"

Search engine: Yahoo!
Help page: "Does Yahoo! support Sitemaps?"

Search engine: Ask.com
Help page: "Q: Does Ask.com support sitemaps?"

Search engine: Live Search
Help page: "Webmaster Tools (beta)"

As with all XML files, any data values (including URLs) must use entity escape codes for the following characters: ampersand (&), single quote ('), double quote ("), less than (<), and greater than (>).
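As an illustration of that escaping rule, the sketch below uses Python's standard `xml.sax.saxutils.escape` on a hypothetical URL containing all five characters; `escape` handles `&`, `<`, and `>` by default, so the two quote characters are passed explicitly.

```python
from xml.sax.saxutils import escape

# Hypothetical URL containing all five characters the protocol requires escaping.
raw = "http://www.example.com/q?a=1&b='2'&c=\"3\"&d=<4>"

escaped = escape(raw, {"'": "&apos;", '"': "&quot;"})
print(escaped)
# → http://www.example.com/q?a=1&amp;b=&apos;2&apos;&amp;c=&quot;3&quot;&amp;d=&lt;4&gt;
```

The escaped form is what belongs inside a <loc> element; the unescaped ampersand in particular would otherwise make the XML file invalid.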
Sitemap files have a limit of 50,000 URLs and 10 megabytes per Sitemap. Sitemaps can be compressed using gzip, reducing bandwidth consumption. Multiple Sitemap files are supported, with a Sitemap index file serving as an entry point for up to 1,000 Sitemaps.
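A Sitemap index file is itself a small XML file that lists the locations of the individual Sitemaps. A minimal sketch, assuming two hypothetical gzip-compressed Sitemap files, might look like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap1.xml.gz</loc>
    <lastmod>2006-11-18</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap2.xml.gz</loc>
  </sitemap>
</sitemapindex>
```

Only the index file needs to be submitted; the crawler follows each <loc> entry to fetch the individual Sitemaps.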