Getting listed on Google's first result page is the goal of many webmasters. Unfortunately, many webmasters still use the wrong tactics to improve their rankings.
It's not possible to get top 10 rankings on Google by focusing on a single strategy. Getting listed on Google requires you to work on all elements of your website. Here are the top 5 factors that influence your website's position in Google's search results.
If you’re interested in the academic side of search, chances are you’ve come across the work of Marti Hearst, a professor at Berkeley’s School of Information. Her work covers everything from search engines and search interfaces to social technology, information visualization, and web usability. Her book, Search User Interfaces, came out late last year and is a great resource for anyone interested in the information seeking process and how users interact with search and search results. Better yet, the text of the entire book is available free online.
You have a beautiful website with great products, strong guarantees, comprehensive pages, and great customer service. Unfortunately, Google and other search engines won't give your website high rankings.
There are several reasons why search engines do not list websites even though they look great and offer quality content:
The official Bing blog recently had a post about web spam. According to Bing's definition, web spam is "unwanted web content that uses overtly manipulative techniques in an effort to fraudulently attain undeservingly high ranking in search engines."
Okay, I admit it. Bing is starting to show some glimmering signs of promise. But I still have concerns. Big concerns.
I had the chance to chat with Stefan Weitz recently about where Microsoft wanted to take Bing and it’s hard not to get swept up in Stefan’s evangelism. Microsoft is trying to do some very impressive things with search: parse the ambiguity out of our language, stitch together disparate fragments of content into a whole that’s useful to the user and present all this in a results format that informs and assists without requiring extensive tweaking on the part of the user.
Yesterday, I blogged about a great webinar in my post B2B Social Best Practices in the Marketing Cloud, where the discussion centered around B2B companies and social media strategies. Everyone keeps saying that 2010 is the year for social. The fact is that some organizations have been developing social strategies for years. Social is not new, but it is a hot topic for many B2B companies as they shuffle budget over to develop social strategies.
Google has announced a major change in the way that they handle search results by including synonyms for some words that may be used in queries. How does this affect the position of your web pages in Google's search results?
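As a rough illustration of the idea (not Google's actual implementation, which is far more sophisticated), synonym handling can be thought of as query expansion against a thesaurus. The synonym dictionary below is purely hypothetical:

```python
# Hypothetical synonym dictionary for illustration only; Google's real
# synonym data is mined automatically and is not public.
SYNONYMS = {
    "photo": {"photograph", "picture"},
    "car": {"automobile", "auto"},
}

def expand_query(query):
    """Return the set of query variants with each term's synonyms substituted."""
    terms = query.lower().split()
    variants = {" ".join(terms)}
    for i, term in enumerate(terms):
        for synonym in SYNONYMS.get(term, ()):
            variant = terms[:i] + [synonym] + terms[i + 1:]
            variants.add(" ".join(variant))
    return variants

print(sorted(expand_query("car photo")))
```

A search engine can then match documents against any of the expanded variants, which is why a page using "automobile" may now rank for a query containing "car".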
Last month, Google started to display real-time results in addition to the regular top 10 results on their search result pages. The real-time results are meant to give web searchers access to breaking news items as quickly as possible.
This month, Google was granted a patent titled "Duplicate document detection in a web crawler system". The patent explains how the search engine's content filter can work with a duplicate content server.
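To make the idea concrete, here is a toy sketch of what a duplicate content server might do. This is an illustration of the general technique (fingerprinting normalized page text so a crawler can skip copies), not the mechanism described in the patent, and all names here are invented:

```python
import hashlib

class DuplicateContentServer:
    """Toy duplicate-content check: fingerprint normalized text and
    remember the first URL seen for each fingerprint."""

    def __init__(self):
        self.seen = {}  # fingerprint -> canonical URL

    @staticmethod
    def fingerprint(text):
        # Collapse whitespace and case so trivially different copies match.
        normalized = " ".join(text.lower().split())
        return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

    def check(self, url, text):
        """Return the canonical URL if this content is a duplicate, else None."""
        fp = self.fingerprint(text)
        if fp in self.seen:
            return self.seen[fp]
        self.seen[fp] = url
        return None

server = DuplicateContentServer()
print(server.check("http://a.example/page", "Hello   World"))  # None (first copy)
print(server.check("http://b.example/copy", "hello world"))    # http://a.example/page
```

In a real crawler, exact hashing would miss near-duplicates; techniques like shingling or similarity hashing are used instead, but the filter-plus-lookup-server division of labor is the same.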
To say Google has been busy would be an understatement – there have been over 30 new search innovations since October. Keeping itself on pace, today Google announced the launch of real-time search, an innovation we know has been coming for a while now as Google tries to compete with the freshness of Twitter content. The new real-time search function, called “latest results”, incorporates feeds from news, blogs, FriendFeed, Twitter, Facebook, and Myspace. It shows up in the Google search results page similarly to how news, blog, image, or video results are currently displayed.