Google has introduced a new beta product called Google Sitemaps. Google Sitemaps is a way for Webmasters to tell a Google robot directly that their website’s content has changed and needs to be re-indexed.
According to Google, “Google Sitemaps is an experiment in web crawling. Using Sitemaps to inform and direct our crawlers, we hope to expand our coverage of the web and improve the time to inclusion in our index. By placing a Sitemap-formatted file on your webserver, you enable our crawlers to find out what pages are present and which have recently changed, and to crawl your site accordingly.”
As part of Google Sitemaps, Google offers its Sitemap Generator, which formats a website’s URLs into an XML file that the robot can easily read. Webmasters who know scripting can also generate customized sitemaps of their own.
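To give a sense of what such a file contains, here is a minimal sketch of building a Sitemap-style XML file by hand in Python. The `<urlset>`, `<url>`, `<loc>`, and `<lastmod>` element names follow the published Sitemap protocol; the namespace version and the example URL and date are placeholders, not output from Google’s own generator.

```python
# Sketch: assemble a minimal Sitemap-format XML file.
# Element names follow the Sitemap protocol; URLs/dates are made up.
from xml.sax.saxutils import escape

def build_sitemap(entries):
    """entries: list of (url, lastmod) tuples -> sitemap XML string."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">']
    for url, lastmod in entries:
        lines.append('  <url>')
        lines.append('    <loc>%s</loc>' % escape(url))      # page address
        lines.append('    <lastmod>%s</lastmod>' % lastmod)  # last change date
        lines.append('  </url>')
    lines.append('</urlset>')
    return '\n'.join(lines)

print(build_sitemap([('http://www.example.com/', '2005-06-03')]))
```

The resulting file is simply uploaded to the webserver, where Google’s crawler can fetch it and see which pages exist and when each one last changed.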
SEOs and others have known for a couple of years now the value of placing a sitemap on a website. It has been one way to encourage the robots to crawl a site frequently and to give them a one-stop shop for all of the information they need.
Google has taken this one step further by introducing its own Google Sitemaps, which spoon-feeds its robot the information it needs in the format it requires. Impatient Webmasters who keep close track of how often Googlebot crawls their sites will welcome Google Sitemaps with open arms.
What will happen after we re-name the planet Google?