Google doesn't like duplicate content. The reason is that the top 10 search results should offer users a choice of different web pages.
Google's new patent application on near-duplicate content describes how Google tries to keep redundant content out of its result pages.
What will it take to beat the Google Habit? There are billions of dollars hanging on the answer to that question. My last two columns looked at the nature of habits and how they can lead to an advantage for incumbents by 'locking in' customers or users.
Before we look at some possible answers, it's important to understand how and why previous attempts at breaking habits have fallen short in an area where far more academic work has been done: health care (Verplanken & Wood, 2006 - PDF file).
Search engine spiders are becoming more intelligent. Some years ago it was possible to fool search engines with a simple meta keywords tag, but search engines now have a deeper understanding of the content of a web page.
Yahoo recently published a patent application that gives some insight into how Yahoo finds and evaluates keyword phrases on web pages.
Now that there seems to be some sort of union in Yahoo's future, blessed or otherwise, I felt the urge to pass along some advice to whoever the happy couple might be. For, in all this talk about the impending nuptials, the clear objective is to survive and compete in the business of attracting the attention of prospects online.
I offer this advice on behalf of users, because frankly, I think that's the only perspective you should be interested in. I'll explain why.
Three weeks ago, we informed you about Google's new position 6 penalty. At that time, it was unclear why Google assigned this penalty to some websites.
The theories were that Google considered usage data when calculating the rankings and that Google had a better understanding of word and phrase relationships.
A discussion in an online webmaster forum indicates that Google might have invented a new ranking penalty for websites that rank well for popular search terms.
Having high rankings on search engines is a great thing. However, it's also important that your web pages are displayed with an attractive description in the search results. If the description is not appealing to web surfers, they might not click the link.
How do Google, Yahoo and MSN/Live create the descriptions and snippets that are used in the search results?
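One thing all three engines have in common is that a well-written meta description tag is a strong candidate for the snippet they display. As a hedged illustration (the page title and wording here are invented for the example), a hand-written description might look like this:

```html
<head>
  <title>Blue Widgets - Acme Widget Shop</title>
  <!-- Hypothetical example: search engines may use this text as the
       snippet shown under your listing in the results -->
  <meta name="description"
        content="Compare our full range of blue widgets, with reviews,
                 prices and free shipping on orders over $50.">
</head>
```

If the engine judges the description relevant to the query, it may show it verbatim; otherwise it typically assembles a snippet from page text around the searched keywords.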
Yesterday, I had the tremendous privilege of moderating a Webinar with our Search 2010 Panel: Marissa Mayer from Google, Larry Cornett from Yahoo, Justin Osmer from Microsoft, Daniel Read from Ask, Jakob Nielsen from the Nielsen Norman Group, Chris Sherman from Search Engine Land and Greg Sterling from Sterling Market Intelligence. It was a great conversation, and the full one-hour Webinar is now available.
Last week, Yahoo announced that they now support the X-Robots-Tag in the HTTP header. This header lets you influence how Google and Yahoo index your web pages, even for file types where you can't embed a robots meta tag.
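For example, on an Apache server you could send the header for documents that have no HTML to put a meta tag in. This is a minimal sketch, assuming mod_headers is enabled; the file pattern is just an illustration:

```apache
# Hypothetical snippet for .htaccess or httpd.conf:
# ask search engines not to index or archive any PDF files
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex, noarchive"
</FilesMatch>
```

The values are the same ones you would use in a robots meta tag (noindex, nofollow, noarchive, and so on), so existing robots directives carry over directly.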