Google Sitemap Update Frequency and Sitemap Stats

Posted on 02/03/09 2 Comments

This post is a bit off our normal topics; however, it should be of interest to just about everyone who administers a web site.

Google sitemaps have an ‘Update Frequency’ (changefreq) parameter that tells Googlebot how often a web page is updated. For example, a blog homepage may be updated as often as ‘Daily’, while the individual blog post entries are essentially ‘Never’ updated. Setting the Update Frequency of the links in a Google sitemap correctly is critical for search engine optimization (SEO).
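As an illustration, here is a minimal sketch in Python of what those entries look like inside a sitemap file; the example.com URLs are hypothetical, and a real generator would pull its pages from the site's database:

```python
# Minimal sketch: building sitemap <url> entries with a changefreq value per page.
# The URLs below are hypothetical placeholders.
pages = [
    ("https://example.com/", "daily"),             # homepage changes often
    ("https://example.com/post/12345", "never"),   # an old post that never changes
]

entries = []
for loc, changefreq in pages:
    entries.append(
        "  <url>\n"
        f"    <loc>{loc}</loc>\n"
        f"    <changefreq>{changefreq}</changefreq>\n"
        "  </url>"
    )

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    + "\n".join(entries)
    + "\n</urlset>"
)
print(sitemap)
```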

Recently, we experienced an issue on a large forum with a Google sitemap of around 350K links that grows by around 15K links month over month. The problem was that the Sitemap Stats in Google Webmaster Tools showed that the number of links indexed by Google would not converge with the number of links in the sitemap. Making matters worse, the number of indexed links would drop significantly each month, and the ratio of indexed links to the size of the sitemap would often fall below 50 percent. This resulted in decreased web traffic due to fewer Google search referrals.

After much research, I noticed that the Update Frequency in the Google sitemap for the forum posts was set to ‘Monthly’. This parameter was originally set to ‘Monthly’ to encourage Googlebot to crawl the site more frequently. Unfortunately, the unintended side effect for a large site with hundreds of thousands of posts, like a large online forum, was that the number of indexed links would never converge with the number of links in the sitemap.

Google advises that these numbers will never fully converge, for various reasons. However, for a forum with few images, a 40 to 60 percent sitemap convergence ratio is clearly unacceptable.

A few weeks ago, I changed the Update Frequency for the individual forum posts to ‘Never’, and the results have been encouraging. The ratio of indexed links to the size of the sitemap is currently nearly 80 percent and continues to increase weekly. This is a significant improvement.

So, for all you webmasters out there with a large number of pages, instructing Googlebot to crawl less frequently can result in more traffic to your site. At first this seems counter-intuitive, because the first impression is that the more Googlebot crawls, the better. For established sites with a large sitemap, keep the Update Frequency for individual posts that do not change significantly at ‘Never’ or, at most, ‘Yearly’. Based on our experience, I recommend setting it to ‘Never’ for large sites (blogs, forums) with hundreds of thousands of posts (pages) that rarely, if ever, change.
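As a rough sketch of that policy (the page-type names below are hypothetical, not from any particular forum or blog platform), the changefreq choice reduces to a simple rule at sitemap-generation time:

```python
def changefreq_for(page_type):
    # Index-style pages that gain new links regularly are worth frequent crawls;
    # archived posts that rarely change are marked 'never' (or at most 'yearly').
    if page_type in ("homepage", "forum_index"):
        return "daily"
    if page_type == "category":
        return "weekly"
    return "never"  # individual posts that do not change

print(changefreq_for("forum_index"))  # daily
print(changefreq_for("post"))         # never
```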

Enjoy your increase in web traffic!


2 Comments

  1. Amarjeet says:
    Wednesday, December 16, 2009 at 9:44am

    Hi Tim,
I have read your article; it's quite an interesting one and also helpful for creating sitemaps. But here you have mentioned forums and blogs especially. I want to know about the change frequency for shopping websites, in which we enter new products many times.

    Thanks

  2. Tim Bass says:
    Saturday, December 19, 2009 at 2:43pm

Entering new products on most SEO-optimized web sites results in new URLs, because the dynamic parameters are rewritten using rewrite rules.

    These new product URLs do not change often either.

Please don’t confuse change frequency for existing pages with adding new pages. New products should create new pages, and if you use dynamic parameters, you should make sure you rewrite those URLs to be SEO-friendly.
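For illustration, here is a minimal sketch of that idea (the function names and URL pattern are hypothetical, not from any specific shopping platform): a dynamic product URL is rewritten into a stable, SEO-friendly one that, once published, never changes.

```python
import re

def slugify(name):
    # Lowercase and replace non-alphanumerics with hyphens:
    # "Blue Widget 2.0" -> "blue-widget-2-0"
    return re.sub(r"[^a-z0-9]+", "-", name.lower()).strip("-")

def seo_product_url(product_id, product_name):
    # Rewrite a dynamic URL like /product.php?id=1234 into a stable path.
    # Once published, this URL does not change, so its changefreq can stay 'never'.
    return f"/products/{product_id}/{slugify(product_name)}"

print(seo_product_url(1234, "Blue Widget 2.0"))  # /products/1234/blue-widget-2-0
```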

    Cheers.