All Posts By

Wisam Abdulaziz

Google Tag Manager Tracking & Analytics

Using GTM to Track Shopify Conversions with Facebook and Bing

Shopify has a very nice integration with Google Analytics e-commerce tracking; all it takes to enable it is adding the property ID to Shopify and checking the enhanced e-commerce box, then going to Google Analytics and enabling e-commerce tracking.

Integrating other codes like the Bing or Facebook conversion code is not as easy. You can rely on the thank you page to trigger the pixels for those platforms, but that will not be enough to provide very valuable data like the value of the transaction. Thankfully, Shopify pushes all transaction variables into the source code of the thank you page using a JavaScript variable, Shopify.checkout, that has a JSON format:

That means you can add any value from that transaction as a JavaScript variable in Google Tag Manager, something like below:
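A minimal sketch of such a GTM Custom JavaScript Variable could look like this (the function is named here only so it can be run standalone; in GTM you paste just the anonymous function, and total_price is one of the documented Shopify.checkout fields):

```javascript
// Hypothetical GTM Custom JavaScript Variable: returns the order total
// from Shopify's checkout object, or undefined off the thank-you page.
var getCheckoutTotal = function () {
  if (typeof Shopify !== 'undefined' && Shopify.checkout) {
    return Shopify.checkout.total_price;
  }
  return undefined; // Shopify.checkout only exists on the thank-you page
};
```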

You can read more about the available checkout variables here

Once you have this variable available in GTM you can use it as a trigger if you want (e.g. when the value is > 0), and you can also use its value in any pixel tracking code.


How to Forecast Natural Organic Traffic Growth

Forecasting organic SEO traffic to fit into a digital marketing plan is a very difficult task; there are many variables controlling organic traffic, and many of them are not under our control. Let us go through the most popular forces behind organic traffic growth and try to evaluate their impact on our forecast:

Population and internet user growth of the target country:

Most countries grow their population by a certain percentage annually; here in Canada we grow by 1.4% annually, so all things equal, organic traffic should grow by this percentage. Internet user growth (internet penetration) can also contribute to organic traffic growth (note that there is some overlap between population growth and internet user growth). In Canada our internet user base grew by 1.5% last year; for most advanced countries this number will be low, but it can be much higher for developing countries like India, where it reached 18% last year.

Do some research and some estimation to come up with a number you feel comfortable with. If this number is 2.9% and your current annual organic traffic is 1,000,000 visits, you can expect roughly 29,000 extra visits next year through population and internet user growth.
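As a worked example (the 1,000,000 annual visits and the 2.9% combined rate are assumptions for illustration):

```javascript
// Worked example of the population/internet-user-growth item.
var lastYearVisits = 1000000;   // assumed annual organic traffic
var combinedGrowthRate = 0.029; // population + internet user growth
var extraVisits = Math.round(lastYearVisits * combinedGrowthRate);
// extraVisits -> 29000 extra visits next year
```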

Popularity growth (brand awareness) :

Brand popularity can grow in many different ways:

  • Off-line advertisement (e.g. old media like TV, radio and newspapers).
  • Online advertisement (e.g. Google and Facebook ads).
  • Word of mouth (great products can generate a lot of word of mouth popularity).

First-time advertisers in traditional media like TV can expect a large increase in their branded traffic (you can track that number in Google Search Console by monitoring branded keyword impressions and clicks). If branded traffic is 10,000/month, a good ad campaign can take it to 20,000/month, which is a 100% YOY increase in branded search. The annual traffic growth as a result of that = (branded traffic growth / last year's total organic traffic); let us assume this number is 2%.

Please note that an increase in brand awareness can also improve the overall organic traffic for all keywords as people tend to click more on brands that they know in the SERPs (higher CTR is expected in this case).

Authority growth  (link profile):

A link profile's strength is one of the most important ranking factors with all major search engines. If the website you are working on has:

  • A weak link profile.
  • Large number of quality unique pages.

Then you know that with a good link building campaign you can possibly double this traffic. On the other hand, if you are working on a website with a strong link profile, a good link building campaign will have a smaller impact (in the single-digit range at best).

Please note that brand popularity growth should help with link growth as more people will be talking about the brand on social media websites, forums and blogs.

I am going to assume that link building can provide a 10% YOY traffic increase.

Content growth:

Websites grow their content by adding more pages on a regular basis. It is very difficult to grow marketing pages (pages where products and services are sold) at a high rate; that is where a blog can be a good tool to maintain high content growth. First check the average traffic on posts that are 1+ month old and were published in the previous year, then multiply that number by the number of posts you are planning to publish next year to get the traffic gain forecast.
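The calculation above can be sketched as follows; the 120 visits per mature post and 50 planned posts are assumed figures, not numbers from this post:

```javascript
// Content-growth forecast: average monthly traffic of last year's mature
// posts (1+ month old) multiplied by the number of posts planned next year.
function contentGrowthForecast(avgVisitsPerMaturePost, plannedPosts) {
  return avgVisitsPerMaturePost * plannedPosts;
}

var forecast = contentGrowthForecast(120, 50); // 6000 extra monthly visits
```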

Content enhancement:

There are three types of content that you need to optimize and enhance on a regular basis:

  • Pages that are losing traffic YOY; this could be expiring/outdated content that needs to be refreshed, or content that is facing a lot of competition and needs enhancement.
  • Content that is not getting any organic traction (it could be thin or low-quality content); this content needs to be enhanced, or otherwise removed or redirected.
  • Content that ranks at the bottom of the first page or the top of the second page in the SERPs; with some extra work this content can gain a few more spots in ranking, which can make a big difference in the amount of traffic it generates.

It is very difficult to estimate the traffic growth that content enhancement will generate without going back to historical data and calculating the growth, so for this one you need to wait for year 2 to come up with the right amount. For year one you can:

  • Assume you are going to restore the lost traffic for pages that are losing traffic.
  • Use search volume on the keywords you are optimizing for and assume you can double their CTR.

The tally:

Once you have the percentages or numbers from the five items above, just add them up to find the organic traffic increase forecast for next year.
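The tally can be sketched as below; the first three rates are the example figures used in this post, while the content figure is my own assumption:

```javascript
// Sum the growth estimates and apply them to last year's organic traffic.
function forecastOrganicGrowth(lastYearTraffic, growthRates) {
  var totalRate = growthRates.reduce(function (sum, r) { return sum + r; }, 0);
  return Math.round(lastYearTraffic * totalRate);
}

var extraVisits = forecastOrganicGrowth(1000000, [
  0.029, // population and internet user growth
  0.02,  // brand popularity growth
  0.10,  // link building (authority growth)
  0.05   // content growth + enhancement (assumed)
]);
// extraVisits -> 199000 extra visits forecast for next year
```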

Disclaimers you need to include in your forecast when you submit it to your manager for review:

Make sure your manager understands that this is an estimate (educated guess) and there are many variables that can throw this plan off like:

  • Popularity decline as a result of a brand crisis.
  • Algorithmic update (Google does multiple updates every year that can affect rankings positively or negatively).
  • SERPs layout update (Google keeps changing SERPs above the fold area by adding featured snippets or more ads which can negatively affect CTR and traffic).
  • Technical issues (the website can go down for a while, or the technical team can make a mistake that causes traffic loss, like blocking Google bots from crawling the website).
  • Production delays (executing the SEO plan is not guaranteed especially in a tight resource situation).

Another disclaimer you need to include: the website will go through a traffic loss situation if your SEO plan is not approved. The risks of stopping or slowing SEO efforts include:

  • Website/content stagnation, which is a bad signal to send to search engines and can cause a reduction in authority (if you are the authority in your space you are expected to produce and refresh content on a regular basis).
  • Expiring/outdated content (e.g. legal-related content will lose ranking YOY if it stays unchanged).
Link Building

How to Evaluate the Quality of a Link Quickly Without DA or PA

If you are starting a link building campaign, the first question that jumps to mind is how to evaluate the quality, and eventually the impact, of each link on your ranking; the right evaluation will help you decide how much effort is worth putting into acquiring that link.

Back in the day Google used to provide a metric called PageRank (a metric that reflects a website's popularity on the web based on the number of quality links pointing back to it). It was publicly available to everyone and ranged from 0 to 10; obviously, a 10 refers to a very strong website, which makes it a high-quality link for SEOs.


PageRank used to be available as a green bar in Google's toolbar. Google stopped supporting that feature in 2016, leaving SEOs running in the dark looking for quick ways to evaluate the quality of their links. Many companies found an opportunity in that and started crawling the web to gather link data and sell it as a service.

On a side note, PageRank is still part of Google's ranking algorithm, but it is no longer available to the public.

PageRank alternatives that came to fill the gap left by Google's now-hidden PageRank:

Since Google stopped showing PageRank publicly, many third-party link crawlers started offering their own version of it:

  • MOZ offers DA (domain authority) and PA (page authority), which became very popular metrics among link traders.


  • Majestic’s link quality metrics are TF (trust flow) and CF (citation flow)


  • Ahrefs link quality metrics are DR (domain rating) and UR (URL rating)


  • SEMrush’s link quality metrics are Domain Score, Trust Score and Authority Score.


But the big question facing link builders is: can I trust those metrics? The metrics above work well for highly trusted websites, but they are easy for spammers to manipulate; there are many websites with a DA of 50 that are not even indexed by Google. So what is really the best PageRank alternative?

The best PageRank alternative, which is even better than PageRank, is organic traffic data:

The answer is simply organic traffic and the number of keywords the domain ranks for in Google. If Google is happy with a website's:

  • Content
  • Link profile
  • Authority
  • Technical elements like website speed

They will simply rank it for more keywords and send it more traffic. Thankfully, this data is available for any domain using the same tools that score the link profile (Moz, SEMrush and Ahrefs); my favorite tool for that is SEMrush:


The analysis above shows that this domain gets 209 desktop visits/month and ranks for 527 keywords; not really that good, but better than a DA 100 domain with 0 traffic.

Interestingly enough, link suppliers like TheHoth finally got that point and started offering links based on traffic (I am not encouraging buying links here):


You can see that links from domains with a minimum of high traffic (1,000 visits/month) cost twice as much as links from domains with a minimum DA of 10.


Link metrics not offered by Google are fine for some competition analysis, but when it comes to link acquisition, traffic and the number of organic keywords should be the two metrics used to evaluate the value of a link.






Hosting Technical SEO

How To Block Link Crawlers Like Majestic, Ahrefs, Moz and SEMrush

The web has a lot of crawlers; some of them are good and vital for your website, such as Googlebot, while others can be harmful, like email harvesting crawlers and content scrapers. Link crawlers fall short of harmful but far from useful: they do not benefit your website, and they are not harmful in the sense of scraping content, but they can consume your server resources with no benefit.

For SEOs who adopt black hat tactics like PBNs (private blog networks), those crawlers are a nightmare and, if left open, can expose the network to competitors, which in most cases will lead to a spam report causing the whole network to be de-indexed, plus a manual action applied to the money site if not total deindexation.

The most popular link crawlers are Majestic, Ahrefs, Moz and SEMrush. Please note that their crawlers' user-agents will not match their brand names and can change in the future, so it is very important to keep an up-to-date list of the user-agents used by those crawlers. I will list different ways to block them below:


Using robots.txt:

You can add a few lines to your robots.txt file to disallow the most popular link crawlers:

User-agent: Rogerbot
User-agent: Exabot
User-agent: MJ12bot
User-agent: Dotbot
User-agent: Gigabot
User-agent: AhrefsBot
User-agent: SemrushBot
User-agent: SemrushBot-SA
Disallow: /

The method above will be very effective assuming:

  • You trust those crawlers to obey the directives in the robots.txt file.
  • The crawlers do not keep changing their user-agent names.
  • The companies that operate those crawlers do not use third party crawling services that come under different user-agents.


Using .htaccess:

The issue with this method is that it requires an Apache-based hosting provider; if your host supports .htaccess you can use the code below to block the most popular link crawlers:

<IfModule mod_rewrite.c>
RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} (ahrefsbot|mj12bot|rogerbot|exabot|dotbot|gigabot|semrush) [NC]
RewriteRule .* - [F,L]
</IfModule>

This method is better than robots.txt, as the crawlers have no choice but to obey, assuming they are not changing their user-agents or using third-party crawlers.

Using PHP:

If your website is built with PHP, like WordPress, you can add the code below to your header.php to block all link crawlers:

<?php
$badAgents = array('rogerbot', 'mj12bot', 'ahrefsbot', 'semrush', 'dotbot', 'gigabot', 'archive.org_bot');
foreach ($badAgents as $blacklist) {
    if (preg_match("/$blacklist/", strtolower($_SERVER['HTTP_USER_AGENT']))) {
        // Deny the request and stop rendering the page.
        header('HTTP/1.0 403 Forbidden');
        exit;
    }
}
?>

This method is good if your server doesn't support .htaccess. If you are using this method, you also need to make sure you block the RSS feed feature in WordPress; you can do that by adding the code below to the functions.php file in your theme folder:

function wpb_disable_feed() {
    wp_die( __('No feed available, please visit our <a href="'. get_bloginfo('url') .'">homepage</a>!') );
}
add_action('do_feed', 'wpb_disable_feed', 1);
add_action('do_feed_rdf', 'wpb_disable_feed', 1);
add_action('do_feed_rss', 'wpb_disable_feed', 1);
add_action('do_feed_rss2', 'wpb_disable_feed', 1);
add_action('do_feed_atom', 'wpb_disable_feed', 1);
add_action('do_feed_rss2_comments', 'wpb_disable_feed', 1);
add_action('do_feed_atom_comments', 'wpb_disable_feed', 1);

Aggressive blocking (for PBN users):

If you are a regular webmaster who is willing to save some server resources by blocking link crawlers, applying any of the methods above should suffice; however, if you are a webmaster who wants to leave those crawlers no chance to sneak in, you need to apply harsher measures.

Robots.txt will be as below:

User-agent: *
Disallow: /
User-agent: Googlebot
Allow: /

This will allow only Googlebot to crawl the website, assuming the crawlers obey robots.txt directives. You can also allow other agents used by major search engines like Bing.

If you are using WordPress you can hide the links from all user-agents except Google using the code below in functions.php:

add_filter( 'the_content', 'link_remove_filter' );
function link_remove_filter( $content ) {
    if ( !preg_match("/google/", strtolower($_SERVER['HTTP_USER_AGENT'])) && !preg_match('/\.googlebot|google\.com$/i', gethostbyaddr($_SERVER['REMOTE_ADDR'])) ) {
        // Strip anchor tags, keeping the anchor text, for non-Google visitors.
        $content = preg_replace('#<a.*?>(.*?)</a>#is', '\1', $content);
    }
    return $content;
}

This code will allow only Google to see the links; it also verifies that the IP address belongs to Google and is not faked.

Also make sure to block RSS using the code listed in the previous step. The code above will not be affected by those crawlers changing their user-agents or coming in under different agent names.


Tracking & Analytics

Integrating Google Ads Lead Form With CRMs

Google has announced the rollout of Lead Form Extensions, which enable advertisers to capture form submissions directly from the ad without sending users to a landing page. The submissions are stored in Google Ads' database and can be downloaded as a CSV file. There is also an option to integrate with CRMs using a webhook; this is how Google explained the integration:

A webhook is an API that enables you to send lead data to your CRM system in real-time. To set it up, you will need to add a webhook URL and key to your lead form extension. You may need to do some configuration within your CRM system to generate the URL and key.

The URL is the delivery path: after the user submits a lead form, an HTTP POST request is sent to the configured URL, allowing the lead data to go directly into the CRM system. The key is used for validating the leads sent.

The explanation above is confusing for marketers and, at best, unclear for developers. What is the webhook URL and where do you find it? This program (Google Ads lead form extensions) is in beta now, and I am not sure any CRM supports it at the moment; even Salesforce doesn't have a webhook URL for it yet.

How to generate a webhook for the lead form extensions?

The best resource that can help with that is the developer guide provided by Google. Any time the form is submitted, Google sends a JSON POST request to the webhook URL added in the lead form extension, something like below:

{
  "lead_id": "lead_id1",
  "form_id": "form_id1",
  "user_column_data": [
    { "column_name": "Full Name", "string_value": "John Doe" },
    { "column_name": "User Phone", "string_value": "12345678" },
    { "column_name": "User Email", "string_value": "" }
  ],
  "google_key": "secret"
}

First you need to decide the best way to capture this data; I am going to use PHP, so my webhook endpoint will run the code below:

<?php
$json = file_get_contents('php://input');
$form_data = json_decode($json);
$lead_id = $form_data->lead_id;
$form_id = $form_data->form_id;
$fullname = $form_data->user_column_data[0]->string_value;
$phone = $form_data->user_column_data[1]->string_value;
$email = $form_data->user_column_data[2]->string_value;

At this point all the values are available in the code and ready to be pushed to any CRM or a local database, most popular CRMs like SalesForce and Hubspot have APIs with PHP libraries (they have also libraries available for the most popular programming languages) making it easy to push the data to those CRMs.

Additional values like campaign and keyword could be available also in the fields and worth saving. Before pushing any values to your CRM make sure to create fields to match all the values you will be saving.


SEO Tracking & Analytics

How to Diagnose Organic Traffic Loss

Traffic loss is one of the most common issues SEO specialists face. Traffic cannot go up forever, and every website at some point will face a traffic loss situation that needs to be diagnosed. To help you diagnose it, I will first take you through the most common reasons for losing organic traffic; understanding those reasons and learning how to monitor them will make diagnosing traffic loss an easier task.

Most popular reasons to lose traffic:

1- Ranking loss:

Position #1 in the SERP can enjoy a 30% or higher click-through rate; CTR then drops for every lower position, and position #10 may get 5% or less. If a website loses ranking for highly searched keywords, overall organic traffic will go down.
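To illustrate the CTR math (the 30% and 5% CTRs are the rough figures used above, not measured data):

```javascript
// Estimated monthly visits from one keyword = search volume x CTR.
function estimatedVisits(searchVolume, ctr) {
  return Math.round(searchVolume * ctr);
}

// A keyword with 10,000 searches/month dropping from #1 to #10:
var atPositionOne = estimatedVisits(10000, 0.30); // ~3000 visits/month
var atPositionTen = estimatedVisits(10000, 0.05); // ~500 visits/month
```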

The most popular reasons for ranking loss are:

  • Algorithm updates (search engines like Google run multiple updates every year).
  • Losing authority (e.g. Losing a lot of quality inbound links) or slow link growth.
  • Losing popularity (e.g. Less social signals and lower branded searches).
  • Website stagnation (no new content or no content refreshment).
  • CMS change or content change (e.g. website redesign which can include CMS change, URL change and content change).
  • Increase in competition, competitors could be providing better content and promoting their website more, so they get higher ranking.
  • Technical issues with search engines like crawlability, indexability, downtime, slow loading and manual actions.

2- SERP layout Change:

Google keeps changing the SERP layout, putting more ads at the top sometimes or featured snippets which can affect CTR while keeping the same ranking.


In the example above, even if a website ranks #1 it will still be below the fold, which can bring CTR down significantly; we are no longer going to enjoy a 30% CTR at number one, and could be receiving only 10% CTR with the new layout.

3- Trending change and user behaviour:

Human needs and behaviours change over the years. Products and brands get disrupted, and that can change search volume. A product like the mini DVD has been disrupted by smartphones and tablets, which brought its search volume close to zero:


Another example could be online dating, the need for dating did not decline nor the need for online dating, but social media websites like Facebook are becoming a go-to destination for people that are looking for dating, which brought down the interest for the keyword online dating:


Step by step traffic loss audit:

Now that you know the most popular reasons to lose traffic, it is time to run some analysis on key metrics that can help us to evaluate a website against each traffic loss reason.

1- Identify which keywords are losing traffic using GSC:

Any traffic loss will eventually be linked to ranking and search volume for the keywords driving organic traffic. When a website is losing organic traffic, the direct cause is either that some keywords are getting less traffic or that fewer keywords are getting traffic, so the main focus of any traffic loss audit should be identifying which keywords are losing traffic. Thankfully, GSC makes this analysis easier: use the comparison feature, choose the two time spans you want to analyse, and identify which keywords are causing the loss:


Once you identify the keywords you need to assess what is causing them to lose traffic by running them against the reasons of losing ranking explained above.

2- Check the brand name in Google trends to make sure that the brand is not losing any popularity.

Run the check for the last 5 years in the target audience's country.

3- Link profile analysis:

Check the domain using Majestic SEO or Ahrefs to see if there is any recent link loss; the screenshot below is taken from Majestic:


Link loss will lead to lower authority which in most cases will cause ranking loss.

4- Check index growth in Google Search Console:

This will help to identify any deindexation issue:


Losing more indexed pages means losing ranking for any keywords those pages are ranking for. Deindexation could be a result of:

  • Technical errors in the website (e.g. server errors or very slow load time).
  • Duplicate or thin content.
  • Issues with setting up the canonical tags.
  • Algorithm updates that affect crawling standards.
  • Losing authority (e.g. the link profile is getting weaker).


Organic traffic loss can happen to any website, and diagnosing the situation and finding the reasons behind it is neither quick nor easy in most cases. Sometimes you get lucky and find that a technical issue on the site caused it; the webmaster can fix it quickly and things go back to normal in a few weeks. In most other cases, though, recovery may not even be possible or can take a very long time. The key with traffic loss is finding out about it early and reacting to it immediately.



Google Analytics

Track Form Abandonment Using Google Analytics Funnel Visualization

My previous post was about CRO, and in it I included form optimization as an important step toward better CRO. Form optimization cannot be done without collecting data about users' interaction with the form, and form abandonment is probably the most important metric for that.

What is form abandonment:

Form abandonment is the event where people start filling in a form (completing at least one field) but do not click the submit button; they have the intent to fill in the form but, possibly due to some hurdles, do not complete it. Possible hurdles:

  • Long form (too many fields)
  • Technical issues (the form is not working on some devices or browsers)
  • The form has personal questions that users are not willing to fill
  • The form is broken or the Captcha is too difficult.

There are some online services like Hotjar that can track form abandonment, but unfortunately Hotjar does not work easily with every form, not to mention that the data lives outside Google Analytics, which means you have another platform to work with and monitor. You can see the form tracking chart offered by Hotjar below; it gives you the time users spend in each field and the abandonment rate for each field.


In this post I will provide a step by step tutorial on how to do form abandonment tracking using Google Analytics. I will assume you have a simple form on your website like the one below:


The source code of the form will look like this:


I will assume that the submission leads to a thank you page. Please be aware that this tutorial requires a good knowledge of Google Analytics and Google Tag Manager (GTM).

The method I will be explaining utilizes the funnel visualisation feature in Google Analytics, which was not initially designed for this; it is designed more for tracking multi-page funnels. Considering that a high level of accuracy is not required here, the method below should provide good insight into where and why people abandon your form.

Step #1:

The first step is pushing an event to the data layer, along with the field name, when a user fills in a field. You can do that by adding the JavaScript code below as a custom HTML tag in your Google Tag Manager:

<script>
(function($) {
  $(document).ready(function() {
    $('form :input').blur(function() {
      if ($(this).val().length > 0 && !$(this).hasClass('completed')) {
        var virtualp;
        switch ($(this).attr('name')) {
          case 'first_name':
            virtualp = 'first-name';
            break;
          case 'last_name':
            virtualp = 'last-name';
            break;
          case 'phone_number':
            virtualp = 'phone';
            break;
          case 'email_address':
            virtualp = 'email';
            break;
          case 'comments':
            virtualp = 'comments';
            break;
          default:
            virtualp = 'unknown';
        }
        // Mark the field so the event fires only once per field.
        $(this).addClass('completed');
        dataLayer.push({
          'eventCategory': 'Form - ' + $(this).closest('form').attr('name'),
          'eventAction': 'completed',
          'fieldLabel': virtualp,
          'event': 'gaEvent'
        });
      }
    });
  });
})(jQuery);
</script>

Please note that:

  • The code above assumes that jQuery is already installed on the website.
  • You need to change the cases to match your field names.
  • You can change virtualp (which is going to be the virtual page URL) to anything else; just make sure the page does not already exist on your website.

Step #2:

Add a datalayer variable to track the field name:


Step #3:

Create a virtual page view to track every form field filling as a pageview:


The firing rule for the tag above will be as below:


After that publish your GTM container.

Step #4:

In Google Analytics create a goal that tracks the thank you page, with funnel steps that reflect the virtual page names you set in step #1:


When you have enough data you should be able to see a funnel visualisation as below:


The funnel above should tell you which field is causing the highest abandonment rate so you can either remove or change it.

Step #5 – Bonus – Track scrolling down as a page view:

Some forms are placed below the fold, or they are multi-step forms where people need to scroll down to see the other parts. For those forms you can track scrolling down to the form's location as a virtual page view (see the tag and the trigger below and use them in GTM):
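As a rough sketch of how such a scroll check could work (the element id, virtual page path and event name below are my assumptions; the check is written as a plain function, with the browser wiring shown in comments):

```javascript
var formTracked = false;

// Push a virtual pageview the first time the page has been scrolled
// down to the form's position; subsequent calls do nothing.
function trackFormScroll(scrollY, formOffsetTop, dataLayer) {
  if (!formTracked && scrollY >= formOffsetTop) {
    formTracked = true;
    dataLayer.push({
      event: 'gaVirtualPageview',          // assumed trigger event name
      virtualPage: '/virtual/form-visible' // assumed virtual page path
    });
  }
  return formTracked;
}

// Browser wiring (assumes the form has id="contact-form"):
// window.addEventListener('scroll', function () {
//   trackFormScroll(window.scrollY,
//     document.getElementById('contact-form').offsetTop, window.dataLayer);
// });
```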



Form abandonment tracking is an important part of conversion rate optimization; do it with whatever tool you feel comfortable with. There are other things you can do from a CRO perspective when it comes to forms:

  • An exit-intent pop-up window that gives users a compelling offer to stay on the page and fill in the form.
  • Tracking filled fields even if people did not click submit, and using that data to better understand user behaviour.
UX and CRO

Essential CRO Audit That Doesn’t Need A-B testing


With traffic of all types (SEO, PPC and referrals) becoming more expensive and difficult to get, conversion rate optimization is becoming more important than ever. Doubling website traffic is becoming way more expensive than doubling conversion rate, and both of them can lead to the same objective which is more sales or leads.

CRO is defined as: “Conversion rate optimization seeks to increase the percentage of website visitors that take a specific action (often submitting a web form, making a purchase, signing up for a trial, etc.)”, this could be done by applying human behavior principles to website elements and methodically testing alternate versions of a page or process to find which version generates the best results. In doing so, businesses are able to generate more leads or sales without spending more money on website traffic, that will result in increasing their marketing return on investment and overall profitability.

Human Behavior:

It is very important to understand that a big part of CRO is understanding human behaviour and which signals or triggers lead humans to take certain actions. The Fogg behaviour model is one of the most popular models for understanding how humans take action; see the chart below:


The model in a nutshell suggests that a human's likelihood of taking a certain action relies heavily on their motivation and on the difficulty of taking that action.

Core motivations:

  • Pleasure / pain
  • Hope / fear
  • Social acceptance / rejection

Ability Variables:

  • Time: does it take a long time? Is it worth my time?
  • Money: can I afford it? Is it worth it?
  • Physical effort: do I need to leave my house to do it?
  • Non-routine: can I do this my usual way?

So when optimizing a website for a better conversion rate make sure any improvements you make can either:

  • Increase users motivation to take an action.
  • Make it easy for users to take an action.

There is another model that you can have a look at called the Lift Model.

Conversion Tracking Setup:

“If you can not measure it, you can not improve it.” Deciding on the right KPIs and setting up goals (and possibly events) to track them is vital for CRO, and having analytics software (e.g. Google Analytics) installed on the website is a must in order to set up goals to track the KPIs. Try to track as many actions as possible; tracking is not retroactive in Google Analytics, and it is better to have more than less. Divide the goals into hard and soft goals.
The KPIs that could be measured are almost infinite, here’s a list of the most popular KPIs to track on a website:

  • Form submissions.
  • Cart abandonment rate.
  • Checkout conversions.
  • Traffic to sales pages.
  • Unique visitors.
  • Returning visitors.
  • Scroll depth.
  • Time on page.
  • Phone calls.
  • Funnels.
  • Form dropoff.

Many of the KPIs above are tracked by default in Google Analytics, however there are many others that need to be manually set up.

Have a KPI discussion with the client and understand their business model, based on that create a list of hard and soft goals that need to be tracked.

How to start a CRO audit:

A CRO audit relies heavily on studying user behaviour then making changes to web assets then finding if those changes improved conversion rate, the best tools that can help with that are:

  • Mouse tracking (could be done using Hotjar)
  • User journey tracking (Could be done using Google Analytics)
  • A-B testing (could be done using Google Optimizer)

The three tools above are neither easy nor cheap to run. If you are facing a low-budget situation, the only way to do CRO is what I call a “CRO Fundamentals Audit”: simply run the website against a checklist of CRO fundamentals >> evaluate the website against each of them >> make recommendations.

In this post I will provide you with a step by step guide to check the CRO fundamentals and create an essential CRO audit.

Essential CRO Audit – Step by step:

This guide assumes that Google Analytics is installed with a long enough history of goal tracking (soft goals and hard goals). Based on the most popular human behaviour models, I created a checklist of 14 items that influence users’ ability and motivation, to be checked and evaluated:

  1. Site speed.
  2. Conventions and standards of web design.
  3. First impression (production quality and relevancy).
  4. Page layout and clarity of headlines (people scan, they do not read), sub-headlines and bullet points.
  5. Navigation and website hierarchy clarity, breadcrumbs and internal search.
  6. Images analysis (consistent with the website message and offering).
  7. CTA and distraction.
  8. Trust elements and security (legitimacy of the company and trust).
  9. Remove fear (e.g. 30-day money-back guarantee).
  10. User generated reviews and testimonials (social proof).
  11. Users analysis.
  12. Sense of reciprocity (free content).
  13. Forms.
  14. Maximizing user value (exit-intent popup windows, chat and chatbots).

Step 1 – Site speed:

There is no shortage of studies showing that faster websites enjoy better conversion rates (in some cases 3X better than slow websites) and lower bounce rates. You can read this post to find which tools to use to evaluate website speed. A speed index of 3 seconds or less is considered good.
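As a rough sketch of automating this check, the Navigation Timing API can flag pages that blow past the 3-second mark. Note this measures full load time, not a true Speed Index (which needs lab tools like Lighthouse); the 3000 ms threshold is just the rule of thumb above:

```javascript
// Flag a page as slow if it took longer than a threshold to load.
// Load time is only a proxy for Speed Index, but it is cheap to collect
// from real visitors.
function isSlowLoad(navigationStartMs, loadEventEndMs, thresholdMs) {
  return (loadEventEndMs - navigationStartMs) > thresholdMs;
}

// Browser-only wiring using the Navigation Timing API.
if (typeof performance !== 'undefined' && performance.timing) {
  var t = performance.timing;
  if (isSlowLoad(t.navigationStart, t.loadEventEnd, 3000)) {
    console.log('Page load exceeded 3 seconds - consider optimizing.');
  }
}
```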

Step 2 – Standards of web design:

Standards and conventions of web design are a set of best practices followed by the majority of websites on the web. These best practices gradually become guidelines that most web designers follow, knowing that they align with visitors’ expectations. Examples of those standards:

  • Logo in the header (top left corner).
  • Main navigation is in the header.
  • Value proposition is at the top (above the fold).
  • Call to action is located above the fold.
  • Search box is in the header.
  • Contact us page is included in the upper navigation.
  • The design should fit any device especially mobile devices.

In this step just make sure the design follows the standards of web design as much as possible.

Step 3 – First impression:

Visual appeal can be assessed within 50 ms, which suggests that web designers have about 50 ms to make a good first impression. This first impression depends on many factors:

  • Structure
  • Colors
  • Spacing
  • Symmetry
  • Amount of text, fonts, and more

It is very difficult to assess your own website in an unbiased way; you can set up a 5-second test asking a question like “What is your first impression of this website?”.

It is also very important to understand users’ entry points to each landing page and make sure the messaging on the landing page aligns with that entry point. Here are a few different scenarios/examples:

  • If you are running a TV ad you need to make sure users that are coming from this ad land on a page that is consistent with the messaging in the ad.
  • Running a PPC campaign with a broad match can drive traffic from keywords that do not match the messaging on the page.
  • Google can rank pages organically for keywords that do not really match the messaging on the page, those pages need to be adjusted or new pages need to be created.

Step 4 – Page Layout and clarity of headlines:

People scan (they do not read) in most cases; they need to find information quickly with less effort. A good page layout (with the characteristics below) can help a lot with that:

  • Clear, strong headings/headlines must be used and placed at the top of the page; a strong headline on the homepage is critical to confirm that the user is in the right place.
  • Break down walls of text into separate paragraphs, lists, or segments with additional headings.

Step 5 – Navigation and website hierarchy:

At any point in time a user should be able to tell their location (which page they are on) and how to go back to the previous page; a breadcrumb can be very helpful for that. The upper navigation should be well thought out and provide users with quick access to the most important information/products/solutions the site offers. Site search is recommended (placed at the top in a clear box, with a magnifying glass icon, autocomplete, and common spelling-error support); it helps in situations where a user cannot find the information they are looking for through internal linking or the navigation system.

Step 6 – Images analysis:

“A picture is worth a thousand words.” Images can improve conversion rate and support the story told by the text content; images are also a big part of the brand identity. A few points to consider while evaluating whether images can improve conversions:

  • They are clear and tell a story.
  • Any text inside the image is easy to read.
  • They support the product or the service offered on the website.
  • They evoke emotions.
  • They help clarify a confusing concept.
  • They are original and look professional and high quality (no stock photos).

Background images can work better with a colourful overlay that matches brand colours, which reduces the brightness and complexity of the image; for text that sits on top of a background image, use a solid background container.

Step 7 – CTA and distraction:

Key points when designing and placing a CTA:

  • Make it visible (above the fold or close to triggers)
  • Make it clear that it is clickable (frame it with a shade) and include actionable text like “buy now” or “read more”.
  • Respect the attention ratio: do not place too many CTAs on a page and have them compete with each other.

Step 8 – Trust elements and security:

Unless you are a big brand, there is a good chance that first-time visitors do not know you yet, so you need to increase their trust level and decrease the anxiety that could be preventing them from taking the actions you want them to take. Here are a few elements you can include on the website to address that:

  • Media mentions (i.e. as seen on CNN, you can include a logo with a link to reference your media mention).
  • Awards and achievements.
  • Partnerships.
  • Case studies and surveys.
  • Client lists with their logos.
  • Security badges (work well for shopping carts).

Step 9 – Remove fear:

Remove fear by using guarantees/security seals: survey your customers for their pre-purchase concerns, then formulate guarantees that pre-empt those concerns. Possible messages that can help remove fear:

  • 30-day product guarantee.
  • Same day shipping.
  • Lowest price guaranteed.
  • Largest range of products.
  • No contracts, cancel anytime.
  • Include service or product prices clearly where possible (in other words, avoid the “request a quote” button).

Step 10 – User generated reviews and testimonial (social proof):

Social proof is an important part of CRO. Customers buy products that make them feel good about themselves, products that change them and make them better; social proof helps customers make a decision, feel confident about their choice, and feel part of something bigger. Popular forms of social proof:

  • Testimonials.
  • User generated reviews.
  • Social media widgets.
  • Data/numbers “X customers served” “Y projects completed”.

Step 11 – Users analysis:

Understanding the website demographics and user personas can help a lot with:

  • Improving messaging.
  • Identifying needed shifts.
  • Flagging technology needs.
  • Adjusting PPC campaign audience settings.

Step 12 – Sense of reciprocity:

Creating a sense of reciprocity is a good way to improve conversion rate; it can be done in many different ways:

  • Free ebook.
  • Free consultation.
  • Free tutorial.
  • First month free no credit card required.

Step 13 – Forms:

Forms should:

  • Be as short as possible; long forms can be split into multiple pages.
  • Have clear field labels that explain what information goes in each field.
  • Support autofill.
  • Work well on mobile devices.
  • Show a clear message on failed validation explaining how to correct the field.

It is recommended to track the form drop-off count for each field using tools like Hotjar and to tweak the form based on the collected data.
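As a sketch of what field-level drop-off tracking looks like under the hood (assuming gtag.js; the form_dropoff event name is a placeholder), you can record the last field a visitor touched and report it when they leave without submitting:

```javascript
// Build the label identifying where a visitor dropped off, e.g. "signup:email".
function fieldLabel(fieldName, formId) {
  return formId + ':' + fieldName;
}

// Browser-only wiring: remember the last focused field and report it on exit.
if (typeof document !== 'undefined') {
  var lastField = '';
  document.addEventListener('focusin', function (e) {
    if (e.target.name) {
      lastField = fieldLabel(e.target.name, e.target.form ? e.target.form.id : 'unknown');
    }
  });
  window.addEventListener('beforeunload', function () {
    if (lastField && typeof gtag === 'function') {
      gtag('event', 'form_dropoff', { event_label: lastField });
    }
  });
}
```

Dedicated tools like Hotjar aggregate this per field automatically; the sketch just shows the principle.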

Step 14 – Maximizing user value:

Offering users more help through online chat and chatbots is proven to improve conversion rate. Intercepting users who are exiting the website with a popup window that gives them an offer or asks them to subscribe to the newsletter can also improve conversion rate.
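A minimal exit-intent sketch: the cursor leaving through the top of the viewport usually means the visitor is heading for the address bar or the close button. The popup itself is left as a placeholder:

```javascript
// Exit intent: the pointer left through the top of the viewport and
// did not move onto another element on the page.
function isExitIntent(clientY, relatedTarget) {
  return clientY <= 0 && !relatedTarget;
}

// Browser-only wiring: show the offer at most once per page view.
if (typeof document !== 'undefined') {
  var shown = false;
  document.addEventListener('mouseout', function (e) {
    if (!shown && isExitIntent(e.clientY, e.relatedTarget)) {
      shown = true;
      // Placeholder: replace with your popup/newsletter modal.
      console.log('Show exit-intent offer');
    }
  });
}
```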


The 14 checkpoints above should help you conduct an essential CRO audit without doing any A-B testing; if resources are limited, this type of audit will still be helpful until you have enough resources to invest in a continuous A-B testing program.



Tracking & Analytics

How To Build Your Own Call Tracking System Using Twilio

This post is an extension of my previous post explaining how to connect Twilio to Google Sheets and Google Data Studio. The missing part in that post was how to assign different phone numbers to different mediums or sources of website traffic, and that is what I will cover in this post.

Step 1:

Decide which mediums or sources you want to track. The most basic tracking, if you have a low budget, covers three mediums: organic, paid, and others. A more advanced setup would be:

  • Organic Google
  • Organic Bing
  • Paid Google
  • Paid Bing
  • Referral
  • Direct

You can also consider tracking email campaigns and offline campaigns like radio. Obviously, the more mediums/sources you want to track, the more phone numbers you need.

Step 2:

Assuming you are going after basic tracking (organic, paid, and others), you need to buy three phone numbers. I will assume they are:

  • 111-111-1111 for organic traffic
  • 222-222-2222 for paid traffic
  • 333-333-3333 for others

Make sure to redirect those numbers to your own phone number; I will call that the default phone number, and it is 000-000-0000.

Step 3:

We need to save the source, medium, and term for every website visitor in a cookie. You can do that using UTMZ-replicator: just add it to your website using Google Tag Manager as a custom HTML tag and let it fire on every pageview.


Step 4:

Make sure the phone number on your website is contained in a unique class, something like below:

<a class="default-phone-number" href="tel:000-000-0000">000-000-0000</a>

Step 5:

Add the JavaScript code below as a tag in Google Tag Manager and let it fire after the UTMZ tag in step 3:

var findnum = [
  { "number": "111-111-1111", "medium": "organic" },
  { "number": "222-222-2222", "medium": "cpc" },
  { "number": "333-333-3333", "medium": "others" }
];

// Return the phone number assigned to a given medium
function searchMD(findnum, medium) {
  for (var i = 0, L = findnum.length; i < L; i++) {
    if (findnum[i].medium === medium) return findnum[i].number;
  }
  return '';
}

// Read the __utmzz cookie written by the UTMZ replicator tag
var ga_source = '', ga_campaign = '', ga_medium = '', ga_term = '', ga_content = '', gc = '';
var c_name = "__utmzz";
if (document.cookie.length > 0) {
  var c_start = document.cookie.indexOf(c_name + "=");
  if (c_start != -1) {
    c_start = c_start + c_name.length + 1;
    var c_end = document.cookie.indexOf(";", c_start);
    if (c_end == -1) c_end = document.cookie.length;
    gc = unescape(document.cookie.substring(c_start, c_end));
  }
}

// Extract the source/medium/term/content values from the cookie
if (gc != "") {
  var y = gc.split('|');
  for (var i = 0; i < y.length; i++) {
    if (y[i].indexOf('utmcsr=') >= 0) ga_source = y[i].substring(y[i].indexOf('=') + 1);
    if (y[i].indexOf('utmccn=') >= 0) ga_campaign = y[i].substring(y[i].indexOf('=') + 1);
    if (y[i].indexOf('utmcmd=') >= 0) ga_medium = y[i].substring(y[i].indexOf('=') + 1);
    if (y[i].indexOf('utmctr=') >= 0) ga_term = y[i].substring(y[i].indexOf('=') + 1);
    if (y[i].indexOf('utmcct=') >= 0) ga_content = y[i].substring(y[i].indexOf('=') + 1);
  }
}

// Anything that is not organic or cpc is bucketed as "others"
if (!((ga_medium == "organic") || (ga_medium == "cpc"))) ga_medium = "others";

// Swap the displayed number and the tel: link to match the medium
document.getElementsByClassName('default-phone-number')[0].innerHTML = searchMD(findnum, ga_medium);
document.getElementsByClassName('default-phone-number')[0].href = "tel:" + searchMD(findnum, ga_medium);


At this point you can be sure that each phone number will show only for its corresponding medium, and you can start building your dashboard by following the post here.

Tracking & Analytics

Connect Twilio to Google Sheets and Google Data Studio

Call tracking is a vital service for any marketer, and when it comes to call tracking, Twilio is the largest telephony infrastructure provider in the market that can help with that. However, Twilio is geared more towards developers; nontechnical users cannot do much with Twilio and have to go for plug-and-play services like DialogTech and CallRail.

In this post I will provide Twilio’s nontechnical users with a solution: create a friendly dashboard using Google Sheets, which can eventually be used with Google Data Studio for better visualization.

Step 1:

Make sure you have a Twilio account with at least one active phone number that redirects to your own phone number. Twilio has a nice guide here that explains how to find and buy a phone number; after buying the number you can use Twilio’s built-in webhook to redirect it to your own phone number (see below).


Just add your phone number to the end of the webhook’s URL:

You can buy multiple phone numbers and use each of them to track a different medium, like organic, paid, and others. In this tutorial you can find how to rotate phone numbers based on incoming traffic to your website; if you are using the numbers on dedicated landing pages or other mediums like offline, number rotation will not be required as each number will serve only one medium.

Step 2:

Find your Twilio API credentials by clicking the gear icon at the top >> Settings >> copy the ACCOUNT SID and AUTH TOKEN.


Step 3:

Replace your ACCOUNT SID and AUTH TOKEN in the code below, which we will be using in Google Sheets:

function myFunction() {
  var ACCOUNT_SID = "*********************************";
  var ACCOUNT_TOKEN = "*********************************";
  var findmedium = [
    { "number": "+1**********", "medium": "organic" },
    { "number": "+1**********", "medium": "cpc" },
    { "number": "+1**********", "medium": "others" }
  ];
  // you do not need to edit anything below this line

  // Return the medium assigned to a given phone number
  function searchMD(findmedium, num) {
    for (var i = 0, L = findmedium.length; i < L; i++) {
      if (findmedium[i].number === num) return findmedium[i].medium;
    }
    return '';
  }

  // Return 1 if a call SID is already logged (column 6), to avoid duplicates
  function search(xyz) {
    var sheet = SpreadsheetApp.getActiveSpreadsheet().getActiveSheet();
    for (var i = 2; i < 12; i++) {
      if (sheet.getRange(i, 6).getValue() == xyz) return 1;
    }
    return 0;
  }

  var numberToRetrieve = 10;
  var options = {
    "method": "get",
    "headers": {
      "Authorization": "Basic " + Utilities.base64Encode(ACCOUNT_SID + ":" + ACCOUNT_TOKEN)
    }
  };
  var url = "https://api.twilio.com/2010-04-01/Accounts/" + ACCOUNT_SID + "/Calls.json?PageSize=" + numberToRetrieve;
  var response = UrlFetchApp.fetch(url, options);

  var theSheet = SpreadsheetApp.getActiveSheet();
  var startColumn = 1;
  var theRow = 2;
  var dataAll = JSON.parse(response.getContentText());

  for (var i = dataAll.calls.length - 1; i >= 0; i--) {
    var found = search(dataAll.calls[i].sid);
    var calldirection = dataAll.calls[i].direction;
    var rowDate = dataAll.calls[i].date_created;
    var theDate = new Date(rowDate);

    if (!found && calldirection == "inbound") {
      theSheet.insertRowBefore(theRow); // newest call always lands at the top
      var theColumn = startColumn;

      if (isNaN(theDate.valueOf())) {
        theSheet.getRange(theRow, theColumn).setValue('Not a valid date-time');
      } else {
        theSheet.getRange(theRow, theColumn).setValue(theDate);
      }

      var callduration = dataAll.calls[i].duration / 60; // minutes
      callduration = +callduration.toFixed(2);

      theSheet.getRange(theRow, ++theColumn).setValue(dataAll.calls[i].to);
      theSheet.getRange(theRow, ++theColumn).setValue(dataAll.calls[i].from);
      theSheet.getRange(theRow, ++theColumn).setValue(callduration);
      theSheet.getRange(theRow, ++theColumn).setValue(searchMD(findmedium, dataAll.calls[i].to));
      theSheet.getRange(theRow, ++theColumn).setValue(dataAll.calls[i].sid);
    }
  }
}

If you are using multiple phone numbers for different mediums, replace the values in the findmedium array; otherwise just leave the code as is.

Step 4:

Create a new sheet in Google Sheets and name the first 6 columns in the top row as below (feel free to change the names to any others, keeping the same order):


Step 5:

In Google Sheets go to Tools >> Script editor >> delete any existing code and replace it with the code from step 3.


Step 6:

In the Google Sheets script editor, click the clock icon to create a trigger >> click + Add Trigger (bottom right corner) >> Time-driven >> Hour timer >> Every hour >> Save.


Once you do that, in a few hours the data will start to show up in your sheet as below:


Having the call data in Google Sheets will give you a lot of flexibility to manipulate and extract data; for more visualization, read step 7, which explains how to create a simple report in Google Data Studio.
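As an example of the kind of manipulation this enables, here is a small sketch (plain JavaScript; the date and duration field names are assumed to match the sheet columns) that rolls up call durations by day, the same roll-up a daily time series chart is based on:

```javascript
// Aggregate call durations (in minutes) by calendar day.
// Each call record is assumed to carry an ISO date string and a duration.
function totalDurationByDay(calls) {
  var totals = {};
  calls.forEach(function (c) {
    var day = c.date.slice(0, 10); // "YYYY-MM-DD"
    totals[day] = (totals[day] || 0) + c.duration;
  });
  return totals;
}
```

The same logic can be done with a pivot table or SUMIF in the sheet itself; the function form is handy if you extend the Apps Script.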

Step 7:

Connect the sheet you created to Google Data Studio by going to Blank report >> Create data source >> Create new data source >> Google Sheets >> choose the spreadsheet created in step 4 >> click Connect.


You should be able to see an empty Google Data Studio report with a grid; now go to Insert >> Time series >> Add to report.

If everything works fine you should see a report like the one below (a daily total call duration report):


There is a lot you can do to customize this report but I am not going to go through all of that, watch this video here to learn more.

Finally, I want to say that this post was inspired by a great post from Twilio, How to Receive SMS Messages into Google Sheets with Apps Script and Twilio, and part of its code was used in my post.

Please feel free to comment if you have any questions.