15 Google spam filters and how to avoid them - Part 5

by Admin


27 Mar
Search Engines



Google tries to keep its search results as clean as possible. For that reason, it uses a variety of spam filters in its ranking algorithm that try to remove low quality web sites.

If your web site gets caught by one of these filters, it will be very difficult to get high Google rankings. In this article series, we're taking a look at the 15 most common Google spam filters and how you can get around them.

Duplicate content, incorrect use of the robots.txt file and Google bowling can all cause your web site's rankings on Google to drop. This article will help you find a solution.
Duplicate content, incorrect use of robots.txt and Google bowling

  • The duplicate content filter is applied to web pages that contain content that has already been indexed on other web pages. This can happen if you have multiple versions of the same page on your web site or if you use content from other sites on your own web pages.

    If the content is already available on another page, it will be difficult to get high rankings for your own web page. If the same content is available on multiple pages, Google will pick only one of them for the search results. Having the same page more than once on your web site might also look like a spamming attempt.

  • Incorrect use of the robots.txt file is not exactly a Google spam filter but it basically has the same effect. While a robots.txt file can help you to direct search engine spiders to the right pages, it can also lock search engines out of your web site if you use it incorrectly. A basic example of the robots exclusion protocol is shown after this list.

  • Google bowling means that competitors use spammy SEO techniques to push your web site out of the search results. These people set up doorway pages with JavaScript redirects, engage in blog spamming, referral spamming and so on.

    Although your competitor set up these spam pages that redirect to your web site, Google might think that you are responsible for these spamming attempts and downgrade your web site. Google claims that external factors cannot influence your rankings on Google. However, some "black hat" SEOs offer services that are meant to harm the rankings of your competitors.
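
As mentioned in the robots.txt item above, here is a minimal sketch of the robots exclusion protocol. The file must be named robots.txt and placed in the root directory of your domain. The directory names used here (/print/ and /cgi-bin/) are hypothetical examples, not paths from this article:

    # These rules apply to all search engine spiders
    User-agent: *
    # Do not crawl the printer-friendly duplicates of our pages
    Disallow: /print/
    # Do not crawl server scripts
    Disallow: /cgi-bin/

Each Disallow line tells compliant spiders to stay away from all URLs that begin with that path. An empty Disallow value allows everything.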

How to get around these filters

If you have multiple versions of the same page on your web site (print version, online version, WAP version, etc.), make sure that search engines index only one of them.

You can exclude specific web pages from indexing by using a robots.txt file or the Meta Robots tag. IBP's web site optimization editor allows you to quickly add Meta Robots tags to your web pages.
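
For reference, this is what a standard Meta Robots tag looks like in the <head> section of a page that should stay out of the index. The page title is a made-up example; whether you add the tag by hand or with a tool makes no difference to the search engines:

    <head>
    <title>Print version of this page</title>
    <!-- Tell all spiders: do not index this page and do not follow its links -->
    <meta name="robots" content="noindex, nofollow">
    </head>

If you want the page kept out of the search results but still want its links followed, use "noindex, follow" instead.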

Double-check the contents of your robots.txt file to make sure that you don't exclude search engines by mistake.
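
A single character in robots.txt can make the difference between excluding one directory and excluding your whole web site. A classic mistake (again a hypothetical example) looks like this:

    User-agent: *
    # WRONG: "/" matches every URL, so this locks all spiders out of the entire site
    Disallow: /

when what was intended was:

    User-agent: *
    # RIGHT: this keeps spiders out of the /print/ directory only
    Disallow: /print/

Because "Disallow: /" matches every URL on your domain, that line deserves special attention.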

If your web site has been hit by Google bowling, the only thing you can do is file a reinclusion request with Google.

The best way to get high rankings on Google and other major search engines is to use white-hat SEO methods: Optimize the content of your web pages and get high quality inbound links.

Copyright by Axandra.com
Web site promotion software


Related links:

15 Google spam filters and how to avoid them - Part 1
15 Google spam filters and how to avoid them - Part 2
15 Google spam filters and how to avoid them - Part 3
15 Google spam filters and how to avoid them - Part 4
15 Google spam filters and how to avoid them - Part 5

