Monthly Archives

May 2019

News

Browsers' Pro-Privacy Enhancements and Their Impact on Marketers

Privacy is becoming a major concern for online users, and companies are taking action to address this concern and improve users' privacy. Apple is a leader when it comes to pro-privacy enhancements, and Firefox and Google Chrome seem to be following in its footsteps recently.

What Is Apple Safari Doing to Address Privacy Concerns?

In 2017 Apple started its ITP (Intelligent Tracking Prevention) project. With ITP 1.1, Safari began restricting third-party cookies (partitioning them, or purging them when certain conditions were met), which caused big problems for cross-site trackers like Google and Facebook. Google and other providers got around this by storing their cookies as first-party cookies and using variables in the URL to pass their tracking token to the next website.
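
To make that workaround concrete, here is a minimal sketch of URL decoration, assuming a hypothetical first-party cookie name (visitor_id) and query parameter (vid); this illustrates the technique, not Google's actual code:

```ts
// Minimal sketch of URL "decoration": read a first-party tracking token
// from a cookie and append it to outbound links so the destination site
// can restore the same token. The cookie name and parameter name are
// hypothetical, not the names any real tracker uses.
function getCookie(name: string): string | undefined {
  const match = document.cookie.match(new RegExp(`(?:^|; )${name}=([^;]*)`));
  return match ? decodeURIComponent(match[1]) : undefined;
}

function decorateOutboundLinks(): void {
  const token = getCookie('visitor_id');
  if (!token) return;
  for (const a of document.querySelectorAll<HTMLAnchorElement>('a[href^="http"]')) {
    const url = new URL(a.href);
    url.searchParams.set('vid', token); // pass the token to the next website
    a.href = url.toString();
  }
}

decorateOutboundLinks();
```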

ITP 1.1 kept developing into ITP 2.0 and ITP 2.1, and finally ITP 2.2. The change that will make the biggest difference for marketers started with ITP 2.1, where Safari began restricting some first-party cookies (keeping them only seven days from the last visit). ITP 2.2 is supposed to remove tracking variables from the URL (URL decoration) and reduce the lifespan of first-party cookies to one day. This will be much harder for cross-site trackers to get around; even if technologies like Google Analytics find a workaround, Safari may shut it down quickly if it is not happy with it.
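
One workaround the analytics community discussed for the ITP 2.1 cap is setting the cookie from the server instead of from JavaScript, since the seven-day limit applied to cookies written via document.cookie. A minimal Node/Express sketch of that idea, with a hypothetical endpoint path and cookie name:

```ts
// Minimal sketch of the server-set cookie workaround for ITP 2.1:
// cookies issued in an HTTP response header were not subject to the
// 7-day cap Safari applied to document.cookie writes. The endpoint
// path and cookie name here are hypothetical.
import express from 'express';

const app = express();

app.get('/refresh-cookie', (req, res) => {
  // Re-issue the client ID sent by the page as an HTTP (server-set)
  // cookie with the full two-year lifetime analytics scripts request.
  const clientId = String(req.query.cid ?? '');
  if (clientId) {
    res.cookie('server_cid', clientId, {
      maxAge: 2 * 365 * 24 * 60 * 60 * 1000, // two years, in milliseconds
      sameSite: 'lax',
    });
  }
  res.sendStatus(204);
});

app.listen(3000);
```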

How Is ITP 2.2 Going to Impact Marketers?

  • New-user data in Google Analytics will be affected: if a user comes back to the website two days after their last visit, they will be treated as a new user
  • Attribution in Google Analytics will be affected: if a new user arrives from a paid click and returns two days later through an organic click, the initial medium will be recorded as organic. This will paralyze the whole attribution model in Google Analytics
  • Google remarketing ads will lose the ability to track users for more than one day
  • Disabling URL decoration will cause issues for cross-domain tracking in Google Analytics, as this feature relies solely on URL decoration to pass the CID (client ID) to the next domain (see the sketch after this list)
  • Disabling URL decoration will also cause big issues for affiliate marketing platforms like Commission Junction, where cross-site tracking and URL decoration are the foundation of conversion tracking
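
To illustrate the cross-domain point, here is a minimal sketch of the idea behind the Google Analytics linker: the CID stored in the first-party _ga cookie is appended to the link's query string so the next domain can adopt it. This is simplified for illustration (the real linker value also carries a checksum and browser fingerprint), not Google's actual code:

```ts
// Minimal sketch of why cross-domain tracking depends on URL decoration:
// the client ID (CID) from the first-party _ga cookie is appended to the
// outbound link so the next domain can adopt the same CID. Simplified;
// the real GA linker parameter also includes a checksum/fingerprint.
function passCidToNextDomain(link: HTMLAnchorElement, cid: string): void {
  const url = new URL(link.href);
  url.searchParams.set('_ga', cid); // the decoration ITP 2.2 is expected to strip
  link.href = url.toString();
}

// Usage: decorate every link pointing at the sibling domain
// (domain and CID value below are illustrative).
document.querySelectorAll<HTMLAnchorElement>('a[href*="other-domain.example"]')
  .forEach((a) => passCidToNextDomain(a, '123456789.987654321'));
```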

Safari's market share is almost 15% worldwide, but not all versions of Safari will have ITP 2.2 enabled, so the short-term impact will not be that significant. I have done some year-over-year analysis on metrics like new users, where I expected to see a spike in new users coming from Safari, but I could not detect any difference yet.

I tested my websites on Safari 12.1.1, and the lifespan of the _ga cookie was only 7 days:

This cookie is supposed to have a two-year lifespan (the screenshot below was taken from Google Chrome on a Mac).

So for now only ITP 2.1 seems to be live, not ITP 2.2 (the _ga cookie will have a one-day lifespan once ITP 2.2 goes live).
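
If you want to reproduce this check yourself, a minimal sketch is to write a test cookie from JavaScript with the same two-year expiry that analytics.js requests for _ga, then look at the stored expiry in Safari's Web Inspector (Storage tab); the cookie name here is made up:

```ts
// Minimal sketch: request a two-year cookie from JavaScript, the way
// analytics.js sets _ga. Under ITP 2.1, Safari caps script-written
// cookies to 7 days no matter what expiry is requested here, so the
// expiry shown in Web Inspector reveals which ITP version is active.
const TWO_YEARS_SECONDS = 2 * 365 * 24 * 60 * 60;
document.cookie = `itp_probe=1; max-age=${TWO_YEARS_SECONDS}; path=/`;
```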

What About Chrome and Firefox?

Firefox has announced its own anti-cross-site-tracking policy. It is supposed to roll out this summer, but it is still not clear how strict it will be.

Google Chrome has announced a cookie-control plan without many details (I do not see it causing any disruptions to Google Ads or Google Analytics).

I will keep updating this post with any new privacy and cookie policy changes from the major browsers.


News

Google Updates the Search Quality Raters Guidelines

Google just refreshed the search quality raters document, which at some point in the past was leaked and then made officially available by Google. This document still gets a lot of attention from the SEO community, especially since Google introduced the E-A-T concept (Expertise, Authoritativeness, Trustworthiness). Unfortunately, many people in the industry misread what is behind this document:

  • This document was created by the engineers who write the algorithms. Their main objective is to understand the impact of their algorithm changes on the quality of the results; for example, if we give more weight to online reviews, does that improve the results or not? When they ask raters to use E-A-T to evaluate the results, the engineers' objective is not to use E-A-T as-is in the algorithm (many parts of E-A-T cannot be programmed into an algorithm) but to adjust the parts of the algorithm that can lead to better E-A-T results. What are those parts? That is the question SEOs need to answer
  • Google now has a huge amount of data available, and the part that scares me is Google Chrome data. Last week I was playing with the Chrome User Experience Report through BigQuery, and I was surprised that speed data from real user sessions is publicly available for almost every domain on the web (see the sketch after this list). I know Google keeps saying it does not use this data in the ranking algorithm, but really? If Google decides to use Chrome data along with the other data available to it, it will have a bulletproof algorithm. My point is that there is no shortage of ranking signals for Google: the rating document mentions a few, and everyone started using them as the only ranking factors, forgetting about the other 200 or more
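
For anyone who wants to look at that Chrome data themselves, here is a minimal sketch of querying the public Chrome UX Report dataset from Node with the @google-cloud/bigquery client. The origin is a placeholder, it assumes Google Cloud credentials are already configured, and CrUX tables are named by YYYYMM:

```ts
// Minimal sketch: pull the first-contentful-paint histogram for a single
// origin from the public Chrome UX Report (CrUX) dataset in BigQuery.
// The origin below is a placeholder; swap in any domain you want to check.
import { BigQuery } from '@google-cloud/bigquery';

async function main(): Promise<void> {
  const bigquery = new BigQuery();
  const query = `
    SELECT bin.start AS fcp_ms, SUM(bin.density) AS density
    FROM \`chrome-ux-report.all.201904\`,
      UNNEST(first_contentful_paint.histogram.bin) AS bin
    WHERE origin = 'https://example.com'
    GROUP BY fcp_ms
    ORDER BY fcp_ms`;
  const [rows] = await bigquery.query({ query });
  for (const row of rows) {
    // Each bin's density is the share of page loads whose FCP fell in it.
    console.log(`FCP bin starting at ${row.fcp_ms} ms: ${(row.density * 100).toFixed(2)}%`);
  }
}

main().catch(console.error);
```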

What I like most about the quality raters document is the ideas it gives webmasters for improving user experience. In my opinion, that is where webmasters should spend their time and energy; do that, and they will stay ahead of all the algorithms.

Google is all about scaling:

Scaling an algorithm requires metrics that a machine can evaluate easily and quickly, and anchor text is a good example. Yes, it can earn you a penalty if you misuse it, but when used right its power is still the same as it was when I started doing SEO in 2004. We need to remember that when trying to reverse-engineer the algorithm.


News

Googlebot Is Now Evergreen With The Latest Chromium

Google has made a major update to Googlebot, making it evergreen (always running the latest version of Chromium). Compared to the previous version, Googlebot now supports 1,000+ new features, like:

  • ES6 and newer JavaScript features
  • IntersectionObserver for lazy-loading (see the sketch after this list)
  • Web Components v1 APIs
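
As a quick illustration of one of those features, here is a minimal lazy-loading sketch with IntersectionObserver, assuming images are marked up as placeholders with a data-src attribute:

```ts
// Minimal sketch of lazy-loading with IntersectionObserver, one of the
// APIs the evergreen Googlebot can now execute. Assumes markup like
// <img data-src="real-image.jpg"> as a placeholder.
const observer = new IntersectionObserver((entries, obs) => {
  for (const entry of entries) {
    if (!entry.isIntersecting) continue;
    const img = entry.target as HTMLImageElement;
    if (img.dataset.src) img.src = img.dataset.src; // swap in the real image
    obs.unobserve(img); // each image only needs to be loaded once
  }
});

document.querySelectorAll<HTMLImageElement>('img[data-src]')
  .forEach((img) => observer.observe(img));
```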

This sounds great, but the biggest question for webmasters is still unanswered: do we still need dynamic rendering when we use advanced JavaScript frameworks like Angular? I think the answer is still yes, or at least most of the time. Assuming the Google Mobile-Friendly Test is now up to date with the most recent Googlebot version, we still need to test the website there and find out whether Google can render it properly; based on that, we can decide whether or not to provide dynamic rendering.
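
For context, dynamic rendering means serving bots a pre-rendered HTML snapshot while regular visitors get the client-side app. A minimal Node/Express sketch of the idea, where the pre-render service URL and the bot list are illustrative assumptions rather than any specific vendor's API:

```ts
// Minimal sketch of dynamic rendering: requests from known bot user
// agents are proxied to a pre-rendering service that returns fully
// rendered HTML, while normal browsers get the client-side JS app.
// PRERENDER_URL and the bot pattern are illustrative assumptions.
import express from 'express';

const app = express();
const PRERENDER_URL = 'http://localhost:3001/render'; // hypothetical renderer
const BOT_PATTERN = /googlebot|bingbot|yandex|baiduspider/i;

app.use(async (req, res, next) => {
  if (!BOT_PATTERN.test(req.headers['user-agent'] ?? '')) return next();
  // Fetch a rendered snapshot of the requested page for the bot.
  const target =
    `${PRERENDER_URL}?url=${encodeURIComponent(`https://example.com${req.originalUrl}`)}`;
  const snapshot = await fetch(target); // global fetch (Node 18+)
  res.status(snapshot.status).send(await snapshot.text());
});

app.use(express.static('dist')); // the normal client-side app for everyone else

app.listen(3000);
```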

News

Google Is Launching CallJoy, a Small-Business Phone Technology

Google is launching CallJoy, an easy-to-use phone technology for small businesses inspired by Google Duplex, for $39/month.


Call tracking is vital for a small business in order to understand which marketing channels are working and to improve customer service. This system should be able to cover all call-tracking needs, like:

  • Logging calls in an online dashboard
  • Call recording
  • Phone number with local area code
  • Unlimited call recording & transcripts
  • Textback functionality
  • Analytics & insights
  • Spam blocker