The Lowest Bounce Rate I Have Ever Seen

I own many websites, most of which I use for search engine experiments to understand the algorithms better and better. For a long time I paid less attention to user experience, focusing instead on improving the link profile and authority of those websites and on changing page elements to achieve better rankings. Recently I decided, for one of these websites, to focus mainly on user experience (it was not easy; I spent many hours on it). That was the main but not the only focus, as I still did link building and made my page elements search engine friendly, but with this site in particular the main goal was HOW TO GIVE MY USERS WHAT THEY ARE LOOKING FOR. Amazingly, I got great traffic, the lowest bounce rate I have ever seen, and more $$$$:

I am not going to comment on this screenshot as it speaks for itself. One thing I do want to mention is that none of these keywords are brand based. I gave search engines what they want, I gave my users what they are looking for, and as a result everyone is winning:

  • Search engines kept their users happy and coming back because they delivered relevant results.
  • Users won as they found exactly what they were looking for.
  • I won as I got thousands of visits from search engines that I was able to monetize.

For everyone reading this post there are a few takeaways:

  1. You do not need to cheat search engines; you can still win by following their terms of service.
  2. Relevancy is key. Do not be greedy in targeting your keywords. If you are a dentist you do not need to target a keyword like “toothpaste”; even if it seems relevant to your business, it is not relevant to the intent of users typing that phrase into search engines.
  3. User experience is VERY IMPORTANT. What applies to the offline world applies to the online world: a happy client offline is a good referral, and a happy user online is a returning user and a good referral/advocate for your website and your brand.

Finally, I want to mention that these takeaways also apply to paid search. I still have a good laugh with my friend and SEO guru Paul Teitelman whenever we remember how he achieved a 50% click-through rate with one of his paid campaigns by being so relevant in choosing his keywords, writing his ads to speak to those keywords, and building his site to give users what they want.

Posted in Google Analytics, Usability

Is Your Page Fully Optimized?

Search engine optimization in the early days (1996-1998) was purely on-page optimization. With the emergence of the basic, meta-tag-driven search engines of that time, all people needed to do in order to rank well was make sure their keyword was repeated several times in meta tags and page content. Keyword density was a very popular term back then (a hot SEO topic was: shall I make my keyword density 2% or 5%?).
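To make the 2%-versus-5% debate concrete, here is a minimal sketch of how a keyword density number could be computed. The formula (matching single-word occurrences against the total word count) and the sample page text are my own illustration; real tools of the era varied in how they counted phrases and tags.

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Percent of words in `text` that equal `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

page = "Cheap flights to Paris. Book cheap flights today, cheap flights guaranteed."
print(round(keyword_density(page, "cheap"), 1))  # → 27.3
```

A page written like that sample, with more than a quarter of its words being the target keyword, is exactly the kind of stuffing that ranked well in 1997 and gets filtered out today.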

Google came later with PageRank and the inbound-link concept as a way to score and rank pages. Sure enough, it did not take webmasters long to adapt to the new system. Link building, or what is known as off-page optimization, became the ultimate goal for most webmasters. The link model proved a huge success for Google; at some point it was possible to rank a blank page for competitive keywords using only links.

The link rush really distracted, and still distracts, webmasters from paying more attention to their on-page optimization. Most SEO companies and webmasters spend a short one-time effort on optimizing their pages and then move on to start a continuous, long-running link building campaign. Links are still more important than on-page factors, but resource- and money-wise links are not easy to acquire, especially with Google applying lots of filters to links and its spam team becoming very efficient at weeding out spammy low-quality links. On-page factors definitely deserve more attention, especially for hyper-competitive keywords.

What I am recommending here is not only one-time basic on-page optimization where you:

  • Add your keywords to your title tags and meta tags
  • Make sure all your content is crawlable and readable by search engines
  • Make sure there is an adequate amount of keyword-rich content on the page to enable search engines to analyse and theme the page
  • Make sure your internal navigation passes PageRank to the pages competing for the most competitive keywords

On-page optimization can go further: try different things on your page and track the impact of your changes on search engine results. With Google introducing Caffeine, which almost immediately translates crawling data into scoring data, there is no need to wait long to see the results of your changes. Things worth testing:

  • Take some links from your navigation or your page, then add them back
  • Change the structure of your page
  • Rewrite the content on the page
  • Increase or decrease keyword density
  • Change internal linking

Test on-page factors exactly the way you do conversion optimization (A/B testing): find the best ranking formula, keep it, then try to replicate it for other pages on the site.
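The testing loop above can be as simple as a log of which variant was live and the rank you observed for it. This is a hypothetical sketch (the page, dates, and rank numbers are invented); the only real requirement is changing one thing at a time and reverting to confirm the effect.

```python
from datetime import date

# Hypothetical rank log: one on-page variant live at a time, rank observed later.
rank_log = [
    {"page": "/widgets", "variant": "A (original)",        "date": date(2010, 6, 1),  "rank": 14},
    {"page": "/widgets", "variant": "B (rewritten title)", "date": date(2010, 6, 15), "rank": 9},
    {"page": "/widgets", "variant": "A (original)",        "date": date(2010, 7, 1),  "rank": 13},  # revert to confirm
]

def best_variant(log):
    """Return the variant with the best (lowest) average observed rank."""
    ranks_by_variant = {}
    for row in log:
        ranks_by_variant.setdefault(row["variant"], []).append(row["rank"])
    return min(ranks_by_variant, key=lambda v: sum(ranks_by_variant[v]) / len(ranks_by_variant[v]))

print(best_variant(rank_log))  # → B (rewritten title)
```

Reverting to the original and watching the rank slip back is the on-page equivalent of a control group: it tells you the gain came from your change, not from a ranking fluctuation.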

Finally, remember to revisit your on-page optimization once in a while.

Posted in Links, SEO

Why Does Google Still Give So Much Attention to Anchor Text?

Search engines are supposed to search millions of documents for queries typed by users and return results in no time. This simple task is more complicated than it looks: search engines cannot search the live web for each query; they can only search their own database, hosted on fast servers, in order to return results as quickly as users expect.

In order to build a searchable database, search engines need to crawl the web on a daily basis and turn unstructured data into structured data. A simple example: crawl web pages, then save each title tag in a column called TITLE-TAGS and each meta keywords tag in a column called META-KEYWORDS. That is probably how early search engines started. With the resource and technology limitations of the time, the easiest way to build a search engine was to crawl the web, save a few tags from each page without any content analysis, and make them available for search. Imagine how easy it would be to manipulate such a search engine :) All you needed to do was stuff your title tag and keywords tag with lots of keywords, wait to get indexed, and voila, you were #1 for terms you now dream of hitting #100 for.
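The crawl-tags-into-columns idea can be sketched in a few lines. This is my own illustration of the concept, not any real engine's pipeline; the column names just echo the TITLE-TAGS / META-KEYWORDS example above, and the parsing uses Python's standard-library HTMLParser.

```python
from html.parser import HTMLParser

class TagExtractor(HTMLParser):
    """Pull the <title> text and meta keywords out of a page, the way an
    early engine might fill its TITLE-TAGS and META-KEYWORDS columns."""
    def __init__(self):
        super().__init__()
        self.record = {"TITLE-TAGS": "", "META-KEYWORDS": ""}
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "meta":
            a = dict(attrs)
            if a.get("name", "").lower() == "keywords":
                self.record["META-KEYWORDS"] = a.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.record["TITLE-TAGS"] += data

page_html = ('<html><head><title>Cheap Flights</title>'
             '<meta name="keywords" content="cheap flights, airfare"></head></html>')
parser = TagExtractor()
parser.feed(page_html)
print(parser.record)  # → {'TITLE-TAGS': 'Cheap Flights', 'META-KEYWORDS': 'cheap flights, airfare'}
```

Notice that nothing in this "index" depends on the page's actual content, which is exactly why stuffing those two tags was enough to rank.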

It did not take spammers long to figure out how to rank well in the first generation of search engines, and the quality of search results started to drop. Search engines tried (not that hard) to weed out spam by either improving their content analysis or using trusted sources of data (Yahoo relied on the Yahoo directory at some point for that), but a trusted source at that time meant human-edited, which could not keep up with the growth of the web.

Later on, Google’s founders said there must be something better than that to control search quality, and they came up with two great concepts: 1) PageRank, which means not all pages should be scored the same; a page’s score must be based on how many links point back to it (internally or externally, but more value goes to external links). 2) The best way to check a document’s relevancy is not to check what the owner of the document says about it (title tags and keyword tags) but what other people say about it (the anchor text of external links).
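The first concept can be shown with a toy example. This is a minimal power-iteration sketch of the published PageRank idea on a made-up three-page web, not Google's actual implementation; the 0.85 damping factor is the value from the original paper, and the page names are hypothetical.

```python
def pagerank(links, damping=0.85, iters=50):
    """Tiny PageRank by power iteration.
    `links` maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            if not outs:  # dangling page: spread its rank evenly
                for q in pages:
                    new[q] += damping * rank[p] / n
            else:  # each outgoing link passes an equal share of p's rank
                for q in outs:
                    new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

# Three pages: A and C both link to B, B links back to A only.
ranks = pagerank({"A": ["B"], "B": ["A"], "C": ["B"]})
print(max(ranks, key=ranks.get))  # → B (it collects the most inbound links)
```

Page C, with no inbound links at all, ends up with the minimum score, which is the whole point: the score comes from what others say about you, not from what you say about yourself.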

Great concepts!!! All the webmasters, website owners, and web 2.0 users turned into a quality assurance force working for Google!! No need for human editors at all; just get lots of powerful machines to crawl the web on a regular basis and analyse pages and links.

That worked really well for Google and made it stand out from the search engine crowd, as the quality of its results increased significantly.

Many people may have figured the system out, but it is not nearly as manipulable as the title tag and keyword tag approach was.

Lots of people in the SEO community keep asking why Google still relies on anchor text to measure relevancy. The answer is simply that there is no alternative YET. Google needs to analyse billions of documents and score them using machines without any human intervention, and it is hard to do that without using link/anchor data.

Remember also that Google has a web spam team whose main focus is fighting link spam; doesn’t that tell you anything? Simply put, for the near future Google has no alternative to links/anchors, so the solution was to create a web spam team that constantly goes after link traders and keeps reminding/scaring the SEO community not to manipulate links. Google must also be given big credit for the several enhancements made to the PageRank formula along with many other quality control improvements.

Finally, in this post I did not try to encourage buying links; it is more about explaining why Google still relies on links/anchors and will for a long time.

Posted in Search Engines

Give Google What It Wants

The ultimate goal of Google as a search engine is to return relevant results to users so they come back to use Google in the future; the more users come to Google, the more ad money it cashes in. Anything you do that goes against this goal will be against Google’s quality (money making) guidelines. Here are a few actions you can take to give Google what it wants:

  1. Be relevant: Relevancy is key for your business. Set each page on your site to target a few keywords that are highly relevant to your content; do not go broad. Users who land on your page must easily find the information they are looking for. Are you a dentist? Then do not target plumbing, it is totally irrelevant; do not even target “toothpaste”, because people looking for toothpaste are not looking for a dentist.
  2. Quality content: Quality content is key to keeping both users and search engines happy. Make sure to focus more on plain text content, as it is easy for search engines to crawl and index. Write content with keywords in mind.
  3. Usability: A usable site will keep users on your site longer, which results in a better conversion rate and at the same time reduces the traffic bounced back to search engines (bounced traffic is a low quality/relevancy signal for search engines).
  4. Links, links, links: Unfortunately, no matter how relevant you are or how much quality content you have on your site, you still need inbound links to get search engines’ love. The web is very noisy; it would be very hard for search engines, especially Google, to evaluate billions of sites without counting links and using them as votes to measure quality.
Posted in Google, Links, SEO

Hello world!

Hello world!! Usually the first thing I do when I install WordPress for a client is delete this post; however, for the first time ever, maybe I am going to keep it to say hello.

Posted in Uncategorized