The Google Sitemaps Resource
The Google Sitemaps Info Center provides you with a full-blown roundup of Google's Sitemaps program from a site owner's perspective. Your Webmasters get everything they need to make your Google Sitemaps implementation a success: free tools, tutorials and guides, source code examples, search engine optimization (SEO) tips, and search engine (SE) knowledge.
Track Google Sitemaps Updates in your RSS-Reader: Web Feed
About Google Sitemaps
Google Sitemaps, introduced in June 2005 as an experiment in Web crawling and now an integrated part of Google's crawling infrastructure, allows Web sites to alert SE crawlers to fresh content. Targeted, collaborative crawling keeps search results fresh and enhances a well-structured Web site's search engine visibility. Participation in the Google Sitemaps program shortens the time to index, but does not guarantee instant rankings. The goal of the program is to discover the "hidden Web" and to reduce the time between publishing updated or new content and the first fetch by a search engine crawler.
Google Sitemaps can enhance a Web site's search engine coverage, but just adding XML sitemaps will not improve its crawlability. This guide explains how a smart Web site architecture can ensure close to perfect crawlability, and the role of Google Sitemaps in a Webmaster's toolset.
Supporting crawlers in indexing a web site
Detecting search engine spiders, tracking and analyzing their behavior
Preventing search engine crawlers from fetching particular files and directories
Telling search engine spiders how to index and cache a particular page
Preventing search engines from interpreting a link as a vote for the link target
How to make cluttered page areas like blocks with ads unsearchable. The class name robots-nocontent can be applied to everything not related to the page's main content.
Leading search engine bots to the content they shall index
If you can't avoid query strings in URLs, keep them short
Educating Googlebot and (hopefully, in the future) other crawlers too
Webmaster's toolset to support and control search engine spiders
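The robots.txt exclusions mentioned above can be tested before a crawler ever hits the site. The following is a minimal sketch using Python's standard-library robots.txt parser; the rules and URLs are hypothetical examples, not taken from any real site:

```python
from urllib import robotparser

# A hypothetical robots.txt that keeps crawlers out of /cgi-bin/ and /ads/
# while leaving the rest of the site crawlable.
ROBOTS_TXT = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /ads/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Googlebot may fetch regular content pages ...
print(rp.can_fetch("Googlebot", "http://www.example.com/articles/sitemaps.html"))  # True
# ... but not the scripts and ad blocks excluded above.
print(rp.can_fetch("Googlebot", "http://www.example.com/cgi-bin/search.cgi"))      # False
```

Running such a check against every URL you plan to submit helps catch conflicts between your sitemap and your robots.txt exclusions early.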
Google Sitemaps Tutorial
This guide to Google Sitemaps explains the Google Sitemaps protocol and everything else one needs to know about the program, for example crawler stats and problem reports. Besides links to free third-party tools provided by various software vendors to create, edit, display, validate and submit XML sitemaps, this tutorial comes with source code examples for a database-driven Google Sitemaps implementation.
What is a Google SiteMap and how can this service work for you?
How the communication between Google and the web site owner works
How to populate the sitemap.xml file honestly, while still getting the most value out of its submission to Google
Make your sitemap.xml file dynamic to ensure it delivers your recent changes whenever it is fetched by Google's bot
How to submit your sitemap to Google and how to inform Googlebot about new and changed content
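A dynamic, database-driven sitemap plus the ping to Googlebot can be sketched in a few lines of Python. The page records, URLs and file name below are hypothetical placeholders; the 0.84 namespace and the ping endpoint reflect the Google Sitemaps program as documented at the time:

```python
from datetime import date
from urllib.parse import quote
from xml.sax.saxutils import escape

# Hypothetical page records as they might come from a CMS database:
# (URL, last modification date, change frequency, priority).
pages = [
    ("http://www.example.com/", date(2006, 1, 15), "daily", "1.0"),
    ("http://www.example.com/sitemaps-guide.html", date(2006, 1, 10), "weekly", "0.8"),
]

def build_sitemap(pages):
    """Render a minimal sitemap.xml per the Google Sitemaps protocol."""
    out = ['<?xml version="1.0" encoding="UTF-8"?>',
           '<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">']
    for loc, lastmod, changefreq, priority in pages:
        out.append("  <url>")
        out.append("    <loc>%s</loc>" % escape(loc))
        out.append("    <lastmod>%s</lastmod>" % lastmod.isoformat())
        out.append("    <changefreq>%s</changefreq>" % changefreq)
        out.append("    <priority>%s</priority>" % priority)
        out.append("  </url>")
    out.append("</urlset>")
    return "\n".join(out)

xml = build_sitemap(pages)

# After publishing the file, notify Google that it changed; fetching this URL
# (e.g. with urllib.request.urlopen) performs the ping.
ping_url = ("http://www.google.com/webmasters/sitemaps/ping?sitemap="
            + quote("http://www.example.com/sitemap.xml", safe=""))
```

Because the XML is rendered from the database on every request, the sitemap always reflects the latest content without any manual maintenance.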
Google's Sitemaps program provides detailed crawler reports which make it easy to quickly fix issues like broken links, conflicts with robots.txt exclusions, and so on. There is even a great robots.txt validator.
Ensure your Web site is in good shape before you submit a sitemap
About site owners influencing Google's rankings via sitemaps and other fictions
Read here what webmasters have to say about their experiences with Google Sitemap, starting with the first announcement
Web sites without an underlying database, and many smaller sites, can't use the dynamic approach outlined here to fully automate the Google Sitemaps channel, so here you go ...
If you prefer to buy your Google Sitemaps implementation, instead of bothering with many technical details...
Google Sitemaps Frequently Asked Questions
The Google Sitemaps FAQ provides in-depth knowledge on Google Sitemaps-related questions. It covers popular topics, like Google's time to index or the indexing and ranking problems of new Web sites, with comprehensive articles.
What Google Sitemaps is all about: targeted crawling keeps Google's search results fresh, and Webmasters happy.
No, Google Sitemaps are just (mass) URL submissions. If you feel comfortable using Google's add-URL form to submit your URLs, you can submit an XML sitemap too. In any case, pay attention to what you're submitting: junk submissions from your site can affect the search engine placements of clean pages.
No, Google Sitemaps is a robots inclusion protocol lacking any syntax for deletions. Remove deleted URLs from the XML file, and ensure your server responds with a 404 or 410 status to Googlebot.
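The status handling for vanished pages can be sketched as a tiny routing rule; the URL sets and paths below are hypothetical, and in a real setup the removed-URLs list would come from the same database that feeds sitemap.xml:

```python
# Hypothetical set of URLs that were deliberately removed from the site.
REMOVED_URLS = {"/old-press-release.html", "/discontinued-product.html"}

def status_for(path, known_paths):
    """Pick the HTTP status a crawler should see for a requested path:
    410 Gone for deliberately removed pages, 404 for unknown URLs,
    and 200 for pages that still exist."""
    if path in REMOVED_URLS:
        return 410
    if path not in known_paths:
        return 404
    return 200

known = {"/", "/sitemaps-guide.html"}
print(status_for("/old-press-release.html", known))  # 410
print(status_for("/no-such-page.html", known))       # 404
print(status_for("/sitemaps-guide.html", known))     # 200
```

Answering 410 rather than 404 for pages you removed on purpose gives the crawler an unambiguous "gone for good" signal.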
The official answer is no, PageRank adjustments are unrelated to Google Sitemaps. The unofficial answer is: it depends; in some cases Google Sitemaps submissions can affect PageRank calculations.
No, Google Sitemaps enhance a site's crawlability, which is a good thing as part of a long-term SEO strategy, but even perfectly crawlable sites can get 'sandboxed'. To escape Google's probation period (a.k.a. 'sandbox') you must tweak other factors.
It depends. Before you look for a tool, work out a sitemap strategy suitable for your Web site. Choosing a tool sets the procedure to create and maintain the sitemaps in stone, so make a good decision in the first place.
Breaking down the anatomy of the Google Sitemaps submission process leads to a timetable, and answers a lot of related questions.
The time to index, measured from the Google Sitemaps submission, may range from a few hours to never. For Web sites in good shape, Google's time to index usually doesn't exceed two days.
If your particular question wasn't answered, submit it here.
Google Sitemaps Knowledge Base
The Google Sitemaps KB (BETA) is a project of the Google Sitemaps Group (Google's Sitemaps forum). The searchable knowledge base gathers articles from various contributors answering frequently posted questions, along with guides on particular tasks related to Google Sitemaps. With help from experienced Sitemaps experts and Google's Sitemaps team, the knowledge base is on its way to becoming a comprehensive help system for Google Sitemaps users.
Browse the index or use the site search to find the topic related to your Google Sitemaps question. If the short answer isn't enough, follow the links to more detailed information.
Google's Sitemaps Team, interviewed in January 2006, provides great insights and information about the Sitemaps program, crawling and indexing in general, handling of vanished pages (404 vs. 410) and the URL removal tool, and valuable assistance on many frequently asked questions. Matt Cutts chimed in and stated '...It's definitely a good idea to join Sitemaps so that you can be on the ground floor and watch as Sitemaps improves'. This interview is a must-read for Webmasters.
Declined submissions and error messages like URL not under Sitemap path are often caused by incorrect use of multiple server names. Why are example.com and www.example.com different when both serve the same content? How can you avoid the confusion?
I have a site hosted under two domains, example.com and example.net, can I have a Google Sitemap on both servers, or should I consolidate the site's various addresses? I want to consolidate several brands with Web sites hosted on separate domains and sub-domains on my main site, now I need a checklist.
I've moved my site to a new domain. Can I submit a Sitemap to tell Google to index the new site rather than the old site? What else should I do to ensure a smooth move?
By default, XML sitemaps are not human readable: the browser just renders raw XML markup and data. To make your sitemap look like a normal HTML page, you just have to add one line to your Google Sitemaps file, that's all.
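The one line in question is an xml-stylesheet processing instruction pointing at an XSL stylesheet you host alongside the sitemap; sitemap.xsl below is a hypothetical file name standing in for whatever stylesheet you provide:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="sitemap.xsl"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
  ...
</urlset>
```

Browsers apply the stylesheet when rendering the file, while crawlers ignore the processing instruction and read the XML data as before.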
My Google Sitemap submission was accepted but I can't find (all) my Web pages with a site:example.com search. Why doesn't Google index my Web site (completely)?
Googlebot downloads my XML sitemap frequently, but didn't crawl and index all pages of my Web site yet. What can I do to improve my search engine visibility?
The free Python Google sitemap generator can be used to create Google Sitemaps in XML format by walking the file system on the web server and scanning access logs. It requires Python version 2.2 (or compatible newer versions) installed on your server.
How blog networks, (free) hosting services, and communities where each user publishes on a different sub-domain, can provide a centralized Google Sitemap service.
Although Google dominates search, I'd like to get the most out of my efforts to create and maintain a Google Sitemap. Are there other search engines which accept mass URL submissions via XML sitemaps? Yes.
Should I include all URLs in my sitemap, even feeds, images, videos and other Web objects without META data? Will a Google Sitemap help to get framed pages indexed? Should I submit thin pages via Google Sitemaps or would this hurt?
Google Sitemaps XML Validation
Validating the XML structure of a Google XML Sitemap (or of the Google Sitemaps index file interlinking segmented XML sitemaps on huge Web sites) before its submission is a good idea. Our free online tool validates and (re-)submits uncompressed XML sitemaps stored on any Web server.
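A first-pass check — well-formedness only, not full schema validity — can be done with Python's standard-library XML parser. This is a minimal sketch; the sample sitemap and URL are hypothetical:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.google.com/schemas/sitemap/0.84}"

def check_sitemap(xml_text):
    """Return the list of submitted URLs if the XML is well-formed,
    or None if parsing fails."""
    try:
        root = ET.fromstring(xml_text)
    except ET.ParseError:
        return None
    return [loc.text for loc in root.iter(SITEMAP_NS + "loc")]

good = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
  <url><loc>http://www.example.com/</loc></url>
</urlset>"""

print(check_sitemap(good))        # ['http://www.example.com/']
print(check_sitemap("<urlset>"))  # None (unclosed tag)
```

Catching a broken file this way before submission saves a round trip through Google's error reports.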
Google Sitemaps Generation & Maintenance
There are hundreds of great Sitemaps tools out there (links), each supporting different business processes, or types and sizes of Web sites. To decide which Google Sitemaps tools make sense for your Web site, please refer to the FAQ item How to choose a Google Sitemaps tool.
Simple Sitemaps is a free tool generating dynamic Google XML Sitemaps and HTML site maps, as well as an RSS site feed, from one set of page data.
There are great Google XML Sitemap generators out there, and many Content Management Systems come with a built-in HTML site map generator. So why the heck do we provide yet another free sitemap tool?
License agreement (choose free or paid) and download link.
A few simple steps to your dynamic site maps. Plainly explained with samples.
Check in here for new features and bugfixes.