Proper Sitemap Usage

by Rob Sullivan
http://www.enquiro.com

Sitemaps are great things. Not only do they provide searchers with a way to easily navigate your site, but they also let search engine crawlers find all your content.

As such, sitemaps have become a regular tool in the SEO arsenal. Today, however, I came across a forum discussing sitemaps. There were many questions about the proper way to use sitemaps, such as 'Should I limit my sitemap to 100 links as Google suggests? And if so, what do I do for sites with more than 100 pages?' and 'Why use a sitemap? Why not just use Google's XML Sitemap submission service?' to name just a few.

So I thought for this article I'd address a couple of these issues and show you what I've found.
First, let's address the question of the number of links per sitemap page. It is true that Google's webmaster guidelines say:

"Keep the links on a given page to a reasonable number (fewer than 100)."

And while you don't want to do anything to get Google upset with you, I'm sure that if you have 103 links on your page you won't get into trouble.

In fact, I have sites with many hundreds of links on a page that are getting crawled just fine. So while Google recommends fewer than 100, that doesn't mean you can't go over 100.

Of course there are a host of other issues relating to why you shouldn't use more than 100 - usability being the most important - but if your sitemap is intended to be spider food then make it that: a links page that links to many of your other site pages.

I should warn you, however, that in most cases I haven't gone over 400 links per page. But at the same time I can also tell you that all 400 links do get followed eventually.

Now before you go saying "400 links! That page must look terrible!" I must tell you that this isn't the case.

For example, let's say your business has dealings in every city across the US. And for each city you have a page, or pages. How do you get the crawlers to find that deep content? Sitemap.

Of course you wouldn't link to every single page from that one sitemap page. That could generate something in the order of 7,000-8,000 links (depending on the scope of the site).

So what we've done in the past is create multiple sitemap pages. Each sitemap page is themed.

The first level sitemap page lists all the states, and each state links to a state sitemap page.

From there, each state page has a list of all the cities for which you have pages. It is on these state pages that the number of cities can (and likely will) exceed 100 links. But that's OK. If you are dead set on keeping to 100 links, then I have another suggestion.
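Before we get to that, here's a minimal sketch of what these themed sitemap pages might look like (the file names and URLs are hypothetical, not taken from any particular site):

<!-- sitemap.html: first-level sitemap listing every state -->
<ul>
  <li><a href="/sitemap/california.html">California</a></li>
  <li><a href="/sitemap/texas.html">Texas</a></li>
  <!-- ...one link per state... -->
</ul>

<!-- sitemap/california.html: state-level sitemap listing that state's city pages -->
<ul>
  <li><a href="/california/los-angeles.html">Los Angeles</a></li>
  <li><a href="/california/san-diego.html">San Diego</a></li>
  <!-- ...one link per city page; this list can run well past 100 links... -->
</ul>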

What we've done with still other sites is to list only the top cities on each state page.

For example, on your 'California' page you could list the top 50-80 cities with direct links to those cities, and then link to a 'more' page. That 'more' page could work a few different ways. It could be an alphabetized list of city pages, starting with 'A' and stopping at 100 links (so you might get cities starting with A, B and C on this page), with a link to the next page of links (D, E and F, for example). You would continue this process until you have created links to every single city page. These pages would also interlink with each other.

Or the 'more' page could simply be a list of all the California cities for which you have pages. But, as I said above, you should split it up across a few pages, limiting the number of links to 300-400 per page.
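As a rough sketch of how those interlinked 'more' pages might fit together (again, the URLs are hypothetical):

<!-- sitemap/california-more-1.html: city pages A-C, capped at about 100 links -->
<ul>
  <li><a href="/california/anaheim.html">Anaheim</a></li>
  <li><a href="/california/bakersfield.html">Bakersfield</a></li>
  <!-- ...more city links... -->
</ul>
<p>
  More California cities:
  <a href="/sitemap/california-more-2.html">D-F</a> |
  <a href="/sitemap/california-more-3.html">G-K</a>
</p>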

As you can imagine, this sort of linking could create dozens of sitemap pages. And this is fine from the search engine point of view.

What's more, if you don't want your site's visitors landing on one of these pages, you could insert a meta tag on these pages like this:

<meta name="robots" content="noindex,follow">

This tells the search engine crawlers that it's fine to follow all the links on this page, but please don't add the page to your index.

I must tell you that we've done this type of navigation on a few different sites now and it is quite effective.

In one case, this type of navigation is supplementary. That is, a visitor could navigate the site this way, but it's not the preferred method (from the site owner's point of view). In another example, this is the primary mode of navigation. That means a site visitor is forced to browse through the site by picking a state, then a city, to get to the page they want to see.

In both cases it is very functional, as I've outlined here. Further, the total number of indexed pages for both sites has gone up dramatically.

In the case where this is the primary navigation, the total site pages went from a few hundred to a few hundred thousand. The whole site has been indexed despite some of the sitemap pages exceeding 100 links. Further, the site now ranks highly for most cities for which they have product. This has also led to a dramatic increase in traffic and conversions.

For the other site - the one where this is secondary navigation - we've seen indexed pages rise from a few thousand to almost 10 million! This site also ranks highly in most cities for which it has product. In addition we've seen visitor traffic more than triple over 2 years.

And all this was due in large part to a search engine crawlable series of sitemap pages which in many cases had more than 100 links per page.

Which leads into the next question: "Why go through all this trouble when I could just submit a Google XML sitemap and be done with it?"

Well, I've addressed this in a previous article. In it I said that if a crawler is having problems indexing your site, such a sitemap won't add much benefit to you.

Sure, your pages may get indexed, but they likely won't rank highly. Therefore you should look at the cause of the problem - why aren't the pages getting found on their own? Answer this question first, and then see if crawling improves.

Even then, I'd still recommend your own site-based sitemap, because there's one other thing we know: crawlers like to find content on their own. That means giving them a sitemap and letting them find what they can, rather than force-feeding them a list of URLs through the XML system.
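For reference, the XML route means handing Google a file along these lines - a minimal sketch in the standard sitemap protocol format, with a hypothetical URL:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/california/los-angeles.html</loc>
    <lastmod>2005-11-28</lastmod>
  </url>
  <!-- ...one url entry for every page you want the engine to know about... -->
</urlset>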


Rob Sullivan
Head Organic Search Strategist
Enquiro Full Service Search Engine Marketing

Copyright 2003 - 2005 - Searchengineposition Inc.

