Understanding Google's supplemental index and results

by Michael Bloch
http://www.tamingthebeast.net

Have you run a search on Google for pages on your own site and noticed "Supplemental Result" next to the URL listing?

What is a supplemental result? Is it bad? How do you get out of Google's supplemental index?
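
A quick way to check is to ask Google to list the pages it has indexed for your domain and scan the results for the "Supplemental Result" label. Run a query along these lines (substituting your own domain):

site:www.yoursite.com

... and Google will return the pages it has indexed for that site, supplemental or otherwise.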

For Google's brief explanation of the supplemental index, read this entry in their Webmaster FAQ.

How the supplemental index works
Supplemental results don't necessarily indicate you've done something wrong, and being listed there is not a penalty as such. In a nutshell, the main factor determining whether a page URL sits in Google's main web index or in the supplemental index is PageRank: Google's measure of that page's popularity.

Pages that are listed as supplemental may still rank highly for multi-word search phrases, but are unlikely to rank highly for competitive or short phrases. If the pages returned as supplemental aren't what you feel to be important pages on your site, don't sweat it too much; it isn't necessarily the harbinger of site-wide doom.

Think of the supplemental index as Google's main index saying "hmm, not sure about this page, we won't totally slam or penalize it, but we'll stick it in the supplemental index for a while and see what happens with it".

New sites and supplemental results
As an example, when I launched a site late last year and it began to be listed in Google's search index, many pages were coming up as supplemental. By the time the site had started appearing in Google results there were over a hundred pages of content, and only a handful of those pages were linked directly from the home page. The pages that went supplemental were those that were not linked from the home page.

Over a couple of months, these supplementals started disappearing as others began to link to the site and to specific pages, and the site began to establish some authority. Where 90% of pages were once supplemental, the figure is now around 10%.

Even well established sites, particularly regularly updated blogs, are likely to have supplemental pages for the same reasons as the above. I just ran a few searches on some of the most popular blogs in certain industries and came across dozens, sometimes hundreds of supplemental entries for all the blogs I checked. Taming the Beast.net also has its share of supplemental results.

Escaping the supplemental index
It's not really hard to get pages back into the main index if your site is established and clean. The easiest ways I've found to get pages out of the supplemental index are to do one or more of the following:

a) Create a link to the supplemental page from a page on your site that's ranking well, or from the home page. The home page probably works best.

b) Create a "best of" page on your site listing all the pages you deem to be the most important and link to that page from every other page. This will also benefit your visitors, not just search engines.

c) Create a site map and link to that from every page, including your home page (a rough sketch of such a page appears after this list). This option is a little unwieldy if your site consists of thousands of pages.

d) Try to attract *quality* inbound links from other sites to that particular page.

e) If content is a little thin on the page, add more quality material to it.
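
To illustrate options (b) and (c), a "best of" or site map page doesn't need to be anything fancy - just a plain HTML page of descriptive links to the pages you want crawled, linked from a consistent spot in your site template. The file name and URLs below are placeholders; substitute your own pages:

<!-- sitemap.html, linked from every page on the site -->
<h1>Site map</h1>
<ul>
<li><a href="/articles/supplemental-index.html">Understanding the supplemental index</a></li>
<li><a href="/articles/quality-inbound-links.html">Attracting quality inbound links</a></li>
<li><a href="/contact.html">Contact us</a></li>
</ul>

Because every page links to it, the idea is that this page accumulates a little PageRank of its own and passes some of it along to each page it lists.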

In most cases, it really is that simple. In summary, if a page is buried with few or no inbound or internal links pointing to it, the chances of that page winding up in the supplemental index are increased.

In addition to the above, some of the points below are also worth considering if you're having supplemental issues.

Suddenly supplemental
But what if you have pages that have ranked solidly for quite a while, the internal linking structure hasn't changed, you have inbound links to the pages - and they go supplemental anyway?

You'll need to think big picture in this scenario. While being listed in the supplemental index isn't a penalty, it can indicate that some inbound links have lost authority and link love power, particularly links pointing to that specific page.

Let's say you have a page that's buried on your site with few internal links to it, but Site A, which is an authority, has linked to that page. That's enough to have it ranking well in many cases. If Site A then buries the page the link is on, removes the link or loses authority itself, that can be enough to push your page into the supplemental index.

The same might occur if you've been participating rather blindly in link exchanges. Once the link exchange network has been discovered by a search engine, sometimes a penalty is applied, but sometimes the links within that network are simply disempowered. Read more on linking to bad neighborhoods.

Sitewide supplementals
If you have an established site that winds up totally supplemental, there are a few things to look for:

a) Are you using unique meta tags for each page, particularly title tags? If not, this could be a contributing factor (see the sketch after this list). Using the same tags is like saying to the googlebot "this page is the same as every other page on my site".

b) Duplicated content. Does your site have a substantial proportion of duplicated content, with little surrounding it to distinguish it from the site the content originally came from? Again, this can cause supplemental issues. When using content from other sites, add editorial comments of your own about the article, provide extra resources, etc.

c) Little on-page content. This becomes a particular issue when your site template is complex and contains a great deal of text of its own. If your pages/posts only consist of a hundred words or so, the ratio of unique content to repetitive template elements can be very low. I suggest ensuring each page has at least 250 words - that can also encourage you to provide more value to your readers.

d) Incredibly long URL strings. Google is a lot better at handling long URLs created by dynamic applications than it used to be, but it can still choke on them from time to time, and the risk increases the longer the string is. Most popular applications have mods available for making URL strings shorter; find one for your application by searching for:

applicationname search engine friendly mod

... via your favorite search engine. Also check your application's back end - there may already be a URL shortening feature that can be activated by ticking a box. (A rough rewrite sketch appears after this list.)

e) Over-indulgence in link exchanges. Keep what's relevant and quality, burn the rest and work on getting more quality inbound links. The best way to achieve this is to produce good quality content that people will want to link to.
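
On point (a), "unique tags" simply means the title and description of each page describe that particular page rather than the site as a whole. A rough sketch, with hypothetical page details filled in:

<!-- page: /articles/supplemental-index.html -->
<title>Understanding Google's supplemental index - Taming the Beast</title>
<meta name="description" content="What supplemental results are and how to get pages out of Google's supplemental index.">

<!-- page: /articles/quality-inbound-links.html -->
<title>Attracting quality inbound links - Taming the Beast</title>
<meta name="description" content="Practical ways to earn quality inbound links without resorting to link exchange networks.">

On point (d), if you can't find a ready-made mod and your site runs on Apache, much the same effect can usually be achieved with a mod_rewrite rule or two in a .htaccess file. This is only a sketch - the script name and parameters are made up and would need to match your own application:

RewriteEngine On
# serve the short URL /articles/123/ from the application's long dynamic URL
RewriteRule ^articles/([0-9]+)/?$ index.php?section=articles&id=$1 [L,QSA]

The shorter URLs are then the ones you link to internally.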

Another supplemental tip
As I was watching one of my sites tussle with the supplemental index, it did spring to mind that some pages really deserved to be in there, as they had little value to the visitor - for example, some pages offering registration, generated by the application the site is based on and which I don't utilize.

Some blog, forum and CMS packages can generate all sorts of pages that you really don't want or need. Instead of leaving this type of crud in there for Google to sort out, it makes better sense to use the robots exclusion tag or a robots.txt file to tell search engine spiders "nothing to see here, move along". You want spiders to be focused on your good stuff, not the offcuts.
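
Assuming the unwanted pages live under a common path such as /register/ (adjust to suit your own application), the two approaches look like this. Either a couple of lines in the robots.txt file in your site's root directory:

User-agent: *
Disallow: /register/

... or a robots meta tag in the head section of each page you want kept out:

<meta name="robots" content="noindex, nofollow">

The robots.txt route stops spiders from requesting those pages at all, while the meta tag lets the page be crawled but tells the engine not to include it in its index.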

Supplemental index history
When Google originally implemented the Supplemental Index in 2005, it saw the introduction of two types of Google spiders, aka googlebots - the "main index" bot and the "supplemental" bot. These bots are the critters that Google releases to crawl across the web, indexing pages. The "main" bot passing pages to the supplemental bot under certain conditions is probably the simplest way to describe the process.

In the early stages of the Supplemental googlebot, it didn't visit affected pages all that often, creating a sort of supplemental "devil's playground", whereby if you did make changes such as those listed above, it would take a very long time for any positive effects to become apparent. The Supplemental googlebot now visits more often; I've found I've been able to fix pages that have gone supplemental within weeks instead of months.

Just a further note on anything Google related - or any search engine for that matter - there will always be aberrations to the "rules".

Search engines aren't infallible; they screw up just like anything else created by humans. One site may break all the rules and rank highly, whereas another may follow all the rules and rank poorly. Search engine optimization is part science, part magic and part good luck.


Michael Bloch
Taming the Beast
http://www.tamingthebeast.net
Tutorials, web content, tools and software.
Web Marketing, Internet Development & Ecommerce Resources

