Google Updates the Search Quality Raters Guidelines

Google just refreshed the search quality raters document, which at some point in the past was leaked and later made available by Google itself. This document still gets a lot of attention from the SEO community, especially after Google introduced the E-A-T concept (Expertise, Authoritativeness, Trustworthiness). Unfortunately, many people in the industry do not read carefully what is behind this document:

  • This document was created by the engineers who write the algorithms. Their main objective is to understand the impact of their algorithm changes on the quality of the results, something like: "if we give more weight to online reviews, does that improve the results or not?" When they ask raters to use E-A-T to evaluate the results, the engineers' objective is not to use E-A-T as is in the algorithm (many parts of E-A-T cannot be programmed into an algorithm) but to adjust the parts of the algorithm that can lead to better E-A-T results. What are those parts? That is the question SEOs need to answer.
  • There is a lot of data available to Google now, and the part that scares me is Google Chrome data. Last week I was playing with their Chrome User Experience Report through BigQuery, and I was surprised that users' session speed data is publicly available for almost every domain on the web. I know Google keeps saying it does not use this data in the ranking algorithm, but really? If Google decides to use Chrome data along with the other data available to it, it will have a bulletproof algorithm. What I am saying here is that there is no shortage of ranking signals for Google: the rating document mentions a few, and everyone started using them as the only ranking factors, forgetting about the other 200 if not more.
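For readers who want to poke at that Chrome User Experience Report data themselves, here is a minimal sketch of the kind of standard-SQL query the public BigQuery dataset accepts. The table name (`chrome-ux-report.all.202001`), the field names, and the `https://example.com` origin are assumptions based on the dataset's documented schema; check the current schema before running it against BigQuery.

```python
# Sketch: building a query for the public Chrome UX Report (CrUX)
# dataset on BigQuery. Table and field names are assumptions taken
# from the dataset's documented schema; verify before running.

def crux_fcp_query(origin: str,
                   table: str = "chrome-ux-report.all.202001") -> str:
    """Build a standard-SQL query that returns the first-contentful-paint
    histogram for a single origin in the CrUX dataset."""
    return f"""
        SELECT
          bin.start AS fcp_ms,
          SUM(bin.density) AS density
        FROM `{table}`,
          UNNEST(first_contentful_paint.histogram.bin) AS bin
        WHERE origin = '{origin}'
        GROUP BY fcp_ms
        ORDER BY fcp_ms
    """

query = crux_fcp_query("https://example.com")
# To actually execute it, pass `query` to the google-cloud-bigquery
# client, e.g. bigquery.Client().query(query).result(), which requires
# a Google Cloud project and credentials.
```

The point of the sketch is simply that per-origin speed histograms sit in a public dataset anyone can query; no special access to Google's internal data is needed.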

What I like most about the quality raters document are the ideas it gives webmasters for improving user experience. In my opinion, that is where webmasters need to spend their time and energy; if they do, they will stay ahead of all the algorithms.

Google is all about scaling:

Scaling an algorithm requires metrics that a machine can understand easily and quickly, and anchor text is a good example. Yes, misusing it can earn you a penalty, but when it is used right, its power is still the same as it was when I started doing SEO in 2004. We need to remember that very well when trying to reverse engineer the algorithm.
