All Posts By

Wisam Abdulaziz

Technical SEO

Website Redesign and Migration Checklist to Prevent Ranking, Traffic and Conversion Loss

The web evolves at a fast pace, and websites need to keep evolving to meet ever-growing user expectations. That includes upgrading the infrastructure (i.e. hosting and CMS) and improving the visual design of the website.

Unfortunately, website redesign is not a risk-free process. When launching a new website there is a risk of:

  • Losing organic search engine traffic
  • A lower conversion rate
  • Data loss due to mistakes migrating the tracking pixels
  • Poor user experience manifested in metrics like bounce rate

It is strongly recommended to conduct a pre-launch audit before launching a new website. The list below includes the items that, when changed, have the highest impact on organic search specifically and digital marketing performance in general:

  1. URL changes
  2. Tags changes (title tags, description tags, ALT and Hx tags)
  3. Content changes and content removal
  4. Structured data changes (including schema, canonical and social mark-ups)
  5. Design changes (mainly above the fold area)
  6. Usability, UX, accessibility (WCAG) and CRO
  7. Crawlability and indexability
  8. Navigation menu changes
  9. Website speed and performance
  10. Tracking changes

I will take you through each of these items and describe in detail how to check whether they changed in a way that can negatively affect search engine traffic or user experience.

One rule to remember while doing a website redesign: "minimize the variables". Where possible, try to keep things the same (i.e. content, title tags, description tags); that will make traffic loss diagnosis easier if the new website happens to underperform the old one.

URL Changes:

The whole search engine ranking system is built around the URL; it is used as an index to rank content and assess authority (mainly backlinks). Changing a URL without a proper 301 redirect, even with the content staying the same, can cause a significant traffic loss. Work closely with the web developer to see if it is possible to keep the URLs unchanged. If they have to change, map the old URLs to the new ones and ask the developer to implement a 301 redirect from each old URL to its new counterpart.
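A minimal sketch of such an old-to-new URL map (the paths and the Express-style usage are hypothetical examples, not a prescribed implementation):

```javascript
// Sketch: map old URLs to new ones so each old path can be answered
// with a 301 redirect. The paths below are hypothetical examples.
const redirectMap = {
  '/old-services.html': '/services',
  '/about-us.php': '/about',
};

// Return the new URL for an old path, or null when no redirect applies.
function getRedirectTarget(path) {
  return redirectMap[path] || null;
}

// In an Express app, the map could drive the 301 like so:
// app.use((req, res, next) => {
//   const target = getRedirectTarget(req.path);
//   if (target) return res.redirect(301, target);
//   next();
// });
```

Keeping the map in one place also doubles as the audit document for verifying every old URL after launch.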

Tags changes (title tags, description tags, ALT and Hx tags)

Keyword-rich title tags and H1 tags can help content rank better for the targeted keywords. Changing them in a way that makes them less relevant to the targeted keywords can cause a traffic loss.

Title tags and description tags are in most cases used by search engines in the SERP. They should be written like an ad, in a way that attracts people to click on the listing; changing them may bring CTR down.

The safest bet in a transitional stage is to keep them the same, which will limit the variables so diagnosing a potential traffic loss becomes easier.

Content changes and content removal

Removing text content from pages, or changing content in a way that makes it less relevant to targeted keywords, will have a negative effect on ranking. The same applies to other forms of content like photos and videos.

Structured data changes

Structured data like Schema is a very helpful tool that lets search engines understand the content on your pages better and faster. You can embed specific types of content like addresses, reviews, recipes and more in structured data code; in some cases that code will be used by search engines to add extra features to the SERP (e.g. the 5 stars you sometimes see in the SERP).

It is very common for developers to drop this code while redesigning a website, so make sure to check it on the staging website using a schema testing tool.

Design changes

Design and layout changes can cause issues for a website's ranking and user experience. Pay close attention to the above-the-fold area and make sure:

  • The content above the fold is not changing, especially content related to the targeted keyword
  • Call-to-action elements placed above the fold are not removed

Usability, UX, accessibility (WCAG) and CRO

Design changes (layout and elements repositioning) can impact many usability elements such as forms, payment process, and funnels. Usability elements may not affect search engines a lot but they will have a significant impact on user experience and eventually conversion rate.

A soft launch is a good tool to assess usability and conversion rate changes: you can start with 25% exposure of the new design and monitor the KPIs between the old and the new website (this almost becomes an A/B test).

An accessibility (WCAG) check-up can be done using a tool like WAVE.

Crawlability and indexability

Search engines crawl and index mainly text content presented in plain HTML and, to a lesser degree, text content rendered by JavaScript. Make sure to go through the crawlability and indexability checklist below with the developer:

  • Make sure there are no directives in robots.txt (like Disallow) that prevent search engines from crawling the site
  • Check the source code for noindex tags
  • Make sure the website has a dynamic site map submitted to Google Search Console
  • Make sure text content is not included in images
  • If you are using a JavaScript framework like Angular, avoid single-page applications where the URL doesn't change when users move between pages.
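The robots.txt item in the checklist above can be sketched as a simple scan for a site-wide Disallow rule (a minimal illustration, not a full robots.txt parser):

```javascript
// Sketch: scan robots.txt content for a Disallow rule that would block
// crawling of the entire site. A minimal check only; real parsers also
// honour user-agent groups and path prefixes.
function hasBlockingDisallow(robotsTxt) {
  return robotsTxt
    .split('\n')
    .map((line) => line.trim().toLowerCase())
    .some((line) => line === 'disallow: /');
}
```

Running a check like this against the staging robots.txt before launch catches the classic mistake of carrying a staging-only `Disallow: /` into production.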

The best tool to check crawlability and indexability is the Google Mobile-Friendly Test: check the HTML source code section and make sure all the text content that you want search engines to see is visible there.

Navigation menu changes

The upper navigation of the website has two important roles to play: one is SEO and the other is user experience.

From an SEO perspective, search engines give more weight to pages that are linked from the upper navigation, as those pages are linked site-wide, and the anchor text used in the upper navigation gives search engines some context regarding the destination page. Removing important pages from the upper navigation or changing the anchor text may have a negative impact on ranking.

From a UX standpoint, links in the upper nav should be organized in a way that makes key pages easy for users to access, while at the same time helping the website achieve its business objectives.

Website speed and performance

Website speed and performance, measured by Core Web Vitals, can be negatively impacted by changing the CMS or the design. Measuring Core Web Vitals and making sure performance at least stays the same, if not improving, is key to maintaining good rankings (as Core Web Vitals are now ranking factors) and a good user experience (keeping and improving conversion rates).

Tracking changes

It is very normal for a website to have a lot of tracking pixels (i.e. Google Analytics, Facebook, LinkedIn and more). Sometimes they are hardcoded in the source code, and other times they are managed using tools like Google Tag Manager. Removing any of those pixels/tags will cause data flow interruptions, so it is very important to migrate all of them to the new website.

Sometimes even keeping those pixels may not be enough: firing rules can depend on URLs and HTML element names, so changing those will also interrupt tracking.

Tracking tools like Google Analytics also have settings to track goals and events that will be impacted by changing URLs and HTML code.

It is very important to make sure there are no data/tracking interruptions when moving to the new website.


A website redesign or upgrade will be unavoidable at some point; what can be avoided with good preparation is:

  • Traffic loss
  • User experience deterioration
  • Tracking and data flow interruption

This post focuses mainly on the pre-launch audit, but a post-launch audit is also very important. The most important checkpoints for that are:

  • Google Search Console traffic, impressions and average ranking loss
  • Google Search Console coverage section (errors, indexability and crawlability metrics)
  • Google Analytics conversions loss and bounce rate decline

Google Analytics Tracking & Analytics

How to Set Up a User ID (UID) Property in Google Analytics

Attracting all types of web traffic is becoming more expensive every day (that includes organic, paid and social), so maximizing the value of every visit is key for any business to maintain high profitability (the same marketing budget with better ROI will significantly impact the bottom line). Performance marketing is becoming a popular term; as marketers we hear every day about metrics like:

  • Cost per click
  • Cost per lead
  • Cost per acquisition
  • Revenue per channel for ecommerce
  • Revenue per channel for off-line sales

Performance marketing is not possible without tracking metrics like the ones above. Google Analytics without any customization is able to provide some of them, like CPC and cost per lead, but it fails to report on cost per acquisition and revenue per channel for off-line sales. To track those metrics and close the marketing loop, we need to communicate data back from the CRM to Google Analytics. Using GCLID, CID and UID plus the Measurement Protocol is the way to accomplish that.

Closing the marketing loop in Google Analytics:

A few years ago I published a post explaining how to communicate back from SalesForce to GA when a sale is closed using CID. Today I want to explain how to implement a CRM integration using UID. The main advantages of UID over CID are:

  • UID can track users across multiple devices, while CID cannot
  • UID does not expire like CID; it is permanent

On the other hand, UID has one big limitation: it requires users to be registered with your website (username and password) and to use the website while logged in to their accounts (it works well with shopping carts that have recurring customers). In this post I will explain how to close the marketing loop in GA using a UID setup.

Step 1 - Create a GA UID view:

Create a new GA view with UID enabled (you can keep your old views, but you need to start a new UID view that has no data). To do that in your GA account: click Admin >> navigate to the property in which you want to implement User-ID >> in the PROPERTY column, click Tracking Info >> User-ID. Read more here on how to complete the process.

Step 2 - Push UID value as a datalayer variable

In my example I will assume that you are using Google Tag Manager. Once a user successfully logs in to their account, the dataLayer event below must be pushed in the source code, right after the GTM code:

          window.dataLayer = window.dataLayer || [];
          window.dataLayer.push({
            event: 'login_verified',
            userId: crm_unique_id // the user's unique ID from your CRM
          });

It is very important that the userId value is the identifier saved in your CRM, so you can use the same value to push data back to Google Analytics using the Measurement Protocol.

The code above must be pushed on every page a user visits after logging in, not only the first page they visit after logging in.

Step 3 - Capture UID in GTM

A new datalayer variable must be created in GTM:

The userId variable then needs to be added to the Google Analytics Settings variable in GTM as a custom field:


By completing this you should start seeing some data in the UID view you created in GA.

Step 4 - Pushing data from the CRM to GA

When a new sale happens in the CRM, the Measurement Protocol must be used to push the data back from the CRM to GA. We need to build a dynamic URL and use the POST method to push the data to GA (the URL below will push a sale to GA):
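As a hedged sketch (the property ID, user ID and transaction values here are hypothetical), a Universal Analytics Measurement Protocol payload for a sale can be assembled like this and sent via POST to https://www.google-analytics.com/collect:

```javascript
// Sketch: build a Measurement Protocol (v1) transaction hit keyed by
// User ID. All values below are hypothetical examples.
function buildTransactionPayload({ propertyId, userId, transactionId, revenue }) {
  const params = new URLSearchParams({
    v: '1',               // protocol version
    tid: propertyId,      // GA property ID, e.g. 'UA-XXXXXXX-1'
    uid: String(userId),  // the same CRM user ID pushed to the dataLayer
    t: 'transaction',     // hit type
    ti: transactionId,    // transaction ID from the CRM
    tr: String(revenue),  // transaction revenue
  });
  return params.toString();
}
```

Because the `uid` matches the value pushed to the dataLayer at login, GA can stitch the off-line sale back to the user's earlier sessions in the UID view.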

Check the table below to understand the values that were used in the example above, use the hit builder to test your calls.

Step 5 - Verify the data flow in GA

Go to Audience, then User Explorer, find a user ID that you pushed data back to using the Measurement Protocol, and make sure you can see the data there (something like below):



What happens to users that do not login, can I see them in the UID view in GA?

No, you will not be able to see those users at all; you will need to use the CID view.

What happens in a session where a user visits a few pages without logging in and then logs in afterwards?

Google Analytics will connect the dots for you and append the previous pages to the session when the user logs in. There could be some delays for that to happen and become visible in GA (possibly 24 hours sometimes).

UX and CRO

How to Optimize User Journey to Reduce Bounce Rate and Improve Conversions

The three user journeys of organic traffic

When optimizing organic traffic we need to look at three different user journeys and optimize each separately, or at least keep each in mind while optimizing the others. The three user journeys to optimize organic traffic for are:

  • From the search engine box where a keyword is typed and the listing is clicked in the SERP
  • From the SERP to the landing page
  • Checkout journey (mainly for shopping carts) or the conversion funnel for lead generation
  • SERP — measured by CTR in GSC; possible improvements: better description tags and title tags, better ranking, featured snippet optimization, brand recognition/marketing
  • Landing page — measured by bounce rate; possible improvements: better content relevant to the query, better layout, professional design, better CTAs, better information architecture, better usability (especially website speed)
  • Conversion funnel — measured by conversion rate; possible improvements: funnel analysis and optimization, forms optimization, better CTAs, better usability (especially website speed)

I will be using the Google Store demo account for most examples. I will explain each step using the order in the table, but when you start optimizing your website, do it backwards:

  • Test and optimize the conversion funnel
  • Test and optimize the landing page
  • SERP optimization

The priority goes to utilizing existing traffic and fixing any major issues with the conversion funnel before trying to bring more visitors to the website.

SERP journey optimization:

Optimizing for the SERP starts by checking the Google Search Console performance dashboard and finding keywords with high impressions and low CTR:


The next step is looking up those keywords on Google to check the ranking and the snippets Google is serving in the SERP (for my example I am checking the keyword: improve LCP wordpress):


As you can see, the title and the description in the SERP for "improve lcp wordpress" are neither relevant nor enticing for users to click. My post about LCP is not centered around WordPress, so it is not possible to optimize the title and the description to target this term; re-writing the post or creating a new post could be required in this case.

There is no quick solution that fits all cases; we just need to go through a list of keywords with low CTR, look them up and, based on the results, come up with recommendations. The most popular fixes for poor CTR are:

  • On-page and off-page SEO to improve ranking (better ranking means better CTR)
  • Optimize the title tags and description tags to be more relevant to terms with high impressions (there is no guarantee that Google will use the exact title and description tags)
  • Optimize for featured snippets when applicable. It is very common to get a low CTR for keywords that trigger a featured snippet in the SERP, as most of the clicks go to the position-zero listing, and many people do not click on anything at all if the answer to their question is included in the featured snippet.
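Finding those low-CTR candidates from a GSC performance export can be sketched like this (the thresholds and field names are arbitrary examples):

```javascript
// Sketch: flag queries with high impressions but low CTR from a
// GSC performance export. Thresholds are arbitrary example values.
function findLowCtrKeywords(rows, minImpressions = 1000, maxCtr = 0.02) {
  return rows
    .filter((r) => r.impressions >= minImpressions &&
                   r.clicks / r.impressions < maxCtr)
    .map((r) => r.query);
}
```

The resulting list is the work queue: look each query up in the SERP and decide between a ranking fix, a tag rewrite, or featured snippet optimization.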

Landing page user journey optimization:

Organic search users bounce from a website when:

  • They find the information they are looking for very quickly
  • They cannot find the information they are looking for quickly (e.g. the landing page was not relevant to what they searched for)
  • They have a bad user experience (could be the design, load time or technical difficulties viewing the page on their device)

How to analyze the landing page user journey:

Optimizing for a landing page user journey starts in Google Analytics by identifying pages with poor performance metrics like:

  • High bounce rate
  • Low conversion rate
  • Low revenue
  • Low page/session
  • Low session duration

A quick view of landing page performance reveals a poorly performing page, and a quick look at that page makes the reason for the poor performance very clear:


It is a 404 page. The fix in this case could be redirecting it to another page with similar products, or adding a better call to action that guides users to a relevant page. Let us take another page like this one, which also happens to have a high bounce rate with no conversions. This page sells dino game T-shirts but not a dino game; checking what keywords this page is ranking for will give us an answer to that low performance:


There is no easy solution here as the Google Store doesn't sell games; maybe a link could be added to their dinosaur game page.

The right approach to analyzing the landing page user journey will enable identifying problems and finding solutions for them.

Conversion funnel user journey optimization:

Conversion journey starts when a user:

  • Adds a product to their shopping cart
  • Visits a page that contains a lead generation form
  • Clicks on any link or button that leads to the two actions above

Tracking this journey for ecommerce websites is possible if enhanced ecommerce tracking is installed in Google Analytics.

Checkpoints to optimize the checkout user journey:

Call to action and message analysis:

The landing page must be optimized for users to take the next step we want them to take. There are two key elements for that:

  • The message on the page
  • Call to actions

Adding a call to action like "buy now" on the product page (see below) could be a quick win:


Shopping behaviour analysis:


There is a high drop-off rate between all sessions and sessions with product views, and a high drop-off between sessions with product views and sessions with checkout. The changes that can help reduce the drop-off rate are:

  • Better CTAs on product pages (especially for mobile users)
  • Relevant products section
  • Improving information architecture especially the upper navigation
  • Improving the internal search engine

Checkout behaviour analysis:


The drop-off rate from checkout to sessions is not very high. Optimizing the checkout journey can be done in different ways:

  • Provide users with as many payment options as possible
  • Make prices and shipping cost clear every step of the way
  • Reduce the form requirements as much as possible
  • Allow checkout without creating an account
  • Make sure the checkout process works well on mobile phones and cross browsers

Final note:

User journey optimization is not a one-time set-and-forget process; continuous effort must be spent on a regular basis monitoring and improving all three user journeys.

Google Analytics

Why Direct Traffic Is High In Google Analytics

I was hesitant to write a new post about this topic as it is already covered well elsewhere; in this post I will focus more on diagnosis and recommendations than on explaining the reasons behind direct traffic.


What is direct traffic in Google Analytics:

The popular conception about direct traffic is that it is type-in traffic, where people type the website address directly into the browser or visit it from a bookmark, but that is not always the case. All web tracking software needs the referring URL to identify the traffic source; when it comes in empty, the visit is recognized as direct traffic.

Take a Google search referral as an example: when Google Analytics detects a referring URL like that, it compares it to its existing list of known sources and sees that it is listed as a search engine. The q=seo parameter confirms that it is a search URL, not one of Google's other properties. The other checkpoint is whether the landing page has a GCLID in the URL, which indicates paid search; GA also checks for any UTM variables, which force GA to override the source. If there is no GCLID and no UTMs, then GA will categorize the medium as organic and the source as google.
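The classification logic described above can be sketched roughly as follows (a heavily simplified illustration, not GA's actual implementation):

```javascript
// Sketch of GA-style source classification, heavily simplified:
// GCLID and UTM parameters win, then a known search-engine referrer,
// then any other referrer; an empty referrer falls through to direct.
function classifySource(landingPageUrl, referrerUrl) {
  const landing = new URL(landingPageUrl);
  if (landing.searchParams.has('gclid')) {
    return { source: 'google', medium: 'cpc' }; // paid search
  }
  if (landing.searchParams.has('utm_source')) {
    return {
      source: landing.searchParams.get('utm_source'),
      medium: landing.searchParams.get('utm_medium') || '(not set)',
    };
  }
  if (!referrerUrl) return { source: '(direct)', medium: '(none)' };
  const referrer = new URL(referrerUrl);
  if (referrer.hostname.includes('google') && referrer.searchParams.has('q')) {
    return { source: 'google', medium: 'organic' }; // search URL, not a Google property
  }
  return { source: referrer.hostname, medium: 'referral' };
}
```

The last branch is the one that matters for this post: with no referrer and no campaign parameters, there is simply nothing left to classify, so the visit lands in direct.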

As you can see, the referring URL is vital for GA to track any traffic; when the referring URL is missing, the source of the visit will be categorized as direct.

What causes the referring URL to be missing (in other words what causes direct traffic in GA):

The most popular reasons for direct traffic in GA are:

  • Excluded referrals in Google Analytics without a proper cross domain tracking
  • Type-in traffic (when someone types a website’s URL into their browser, it’s direct traffic) and browser bookmarks
  • Traffic via desktop e-mail clients like Outlook
  • Traffic from apps and desktop software
  • Fake direct traffic from spam bots
  • Referral traffic from a secure (HTTPS) site to a non-secure (HTTP) site
  • Non-web documents like PDF or DOC
  • Traffic from incorrectly tagged campaigns (mainly wrong UTM tracking)
  • Traffic from links that do not send a referrer (links with the rel="noreferrer" attribute)
  • Traffic from browsers that block referrals using add-ons or due to firewall settings

I do not want to spend a lot of time explaining what causes direct traffic in GA, especially since many of the reasons above are beyond a webmaster's control. Direct traffic will always exist in GA, and it is normal to see 10%-20% of traffic as direct; there is no point fighting it. All webmasters need to do is inspect the problem and make sure they have done everything from their end to prevent it.

Diagnosing direct traffic in GA:

Most traffic sources in GA have another tracking point, so the first thing to check is whether GA traffic matches the numbers reported by the other platforms:

  • Install Google Search Console and make sure that the organic traffic in a specific time frame matches the number of sessions in Google Analytics for the same time frame
  • If you are running Google Ads make sure the number of clicks in Google Ads' dashboard matches the sessions in GA
  • Same thing applies to Facebook ads or any other platforms

If the numbers are close, that is a good signal that the most important traffic sources are tracked properly. In the next step I will explain:

How to reduce direct traffic in Google Analytics:

You do not need to worry about every reason for direct traffic, especially the ones you cannot control, like type-in traffic, no-referrer links, etc. Direct traffic will always be there; just work on the things that are under your control:

  • Make sure GA settings are correct and the code is installed on every single page of the website
  • Check the referral exclusion section in GA and make sure all excluded domains have cross domain tracking installed.
  • Make sure all pages are served over HTTPS with no errors; broken HTTPS can stop browsers from passing the referral path, showing the traffic as direct
  • Tag email campaigns and other marketing channels (especially app advertisements) with UTM parameters
  • Filter out bot traffic in GA if there is any
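The UTM tagging item above can be illustrated with a small URL builder (the parameter values are hypothetical examples):

```javascript
// Sketch: tag a campaign URL with UTM parameters so email and app
// traffic is attributed correctly instead of falling into direct.
function addUtmParams(url, { source, medium, campaign }) {
  const tagged = new URL(url);
  tagged.searchParams.set('utm_source', source);
  tagged.searchParams.set('utm_medium', medium);
  tagged.searchParams.set('utm_campaign', campaign);
  return tagged.toString();
}
```

Generating campaign links programmatically like this avoids the typos in hand-written UTM strings that cause "incorrectly tagged campaign" traffic.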

Finally, just remember: it is almost impossible to reduce direct traffic to zero, and 10%-20% is normal. Just do your part and keep monitoring and diagnosing any spikes in direct traffic.

Google Analytics

Google Analytics Page Filters Explained

I was checking which page filters in Google Analytics can be used to track the user journey on a website, so I can use those filters in conversion rate optimization. I ran the two sessions below and tracked the results in Google Analytics:

First session (flat URL structure):


Second session (deep URL structure):




I will check the results of each filter and mark it as useful or useless to study the user journey and do conversion rate optimization:

Landing page (useful): will show the first page visited in a new session :


Destination page (useless): all pages visited in a session will be destination pages:


Exit page (useful): last page visited in the session:


Next page path (useless): it seems to be just telling the potential path for the next page on the same folder level:


Second page (useless): not sure what this is; it seems to include the second page up in the hierarchy of the site:


Previous page path (very useful): this is showing the previous page that the user visited before getting to the current page:


Page depth (useful): it is an indicator to the number of pages visited by a user (even in multiple sessions):


Page path level 1 (useless): this filter seems to be analyzing the folder levels and deciding the page path based on that:


Page path level 2 (useless): similar to level one, but it shows the second level in the folder structure:


Page path level 3 (useless): similar to level one and 2:


Search destination page (useful): a page that is clicked on after a user conducted an internal search:


Start page (useless): a page where a user started an internal search:


SEO and Tracking for Online Stores

Ecommerce SEO has a lot of similarity with regular website SEO; however, there are some areas that need more focus or a different approach compared to content websites. Below is a list of the SEO steps that need to be done for an ecommerce site:

  1. Keyword research
  2. Faceted navigation
  3. Directory structure and information architecture
  4. Top navigation and IA
  5. Content creation
  6. Dynamic on-page optimization
  7. Sitemap creation
  8. Internal linking
  9. Internal search engine (site search)
  10. Product reviews
  11. Google Merchant Centre feed
  12. Schema installation
  13. GSC (Google Search Console) monitoring
  14. Core Web Vitals
  15. Tracking
  16. Conversion rate optimization CRO

I will go through those steps one by one with examples, and recommend the right approach that will work well for online stores. For my examples I will assume that I am trying to optimize an online store that sells mainly computers.

Keyword research:

The tools used for online store keyword research are similar to the ones used for regular websites; the main difference for ecommerce is the large number of targeted keywords and their variations. The three tools below will be very helpful:

For an online computer store, here is an example of how the AHREFS tool can help with keyword variations:


Faceted navigation:

“Faceted navigation” refers to how ecommerce websites allow visitors to filter and sort results based on product attributes (i.e. colour, size, model, brand and features). The tricky decision when it comes to faceted navigation is: should the result pages be served as dedicated pages and made available for search engines to crawl? Or should they be kept under disallowed pages or run by Ajax (no dedicated pages), in other words kept hidden from search engines?

People search for keywords with a pattern like "laptop + colour"; in this case, creating multiple pages (a page for each colour) for the same laptop can lead to duplicate content, so maybe this pattern should be kept available only through a filter or internal search, not for search engines to crawl. There are other attributes, like monitor size, where many laptop features change with the attribute, and it makes more sense to create a dedicated page for each "laptop + monitor size" and make those pages available for search engines to crawl.

A lot of keyword and product research will be required to decide which levels of faceted navigation should be served with dedicated, crawlable URLs and which should stay disallowed or behind an Ajax filter.

Directory structure and information architecture:

The keyword research along with looking at Amazon or other competitors should be very helpful to design the directory structure and the information architecture.


The directory structure will be designed based on the product inventory, and the URLs must logically follow the directory structure; this will also make building the upper navigation and the breadcrumbs easier:

  • (category page)
  • (subcategory/brand page)
  • (faceted navigation category page)
  • (product page)

Top navigation:

The top navigation (located in the header or on the side) is a very important component from both a user experience and a search engine standpoint. Pages that are linked from the top navigation gain more weight with search engines, as they are internally linked from all pages on the website; users will also navigate the website more easily when provided with a well designed and structured top navigation.


Top navigation must be adjusted on a regular basis to follow search volume and user behaviour (especially site search).

Content creation:

Most large ecommerce websites use the manufacturer's product description, which can lead to duplicate content issues. The solution to duplicate content is writing a unique product description for every product, which can be expensive for large websites. Good alternatives include:

  • Using product features in the database to create some unique information
  • Providing a relevant products section
  • Providing user-generated product reviews

Category-level pages should have some text content at the top where possible.

Dynamic on-page optimization:

The keyword research may reveal that searchers include words with certain patterns/modifiers, like online, store, on sale or free shipping, in their searches. If that is applicable to your store, those words can be added to the title tag and the description tag of all pages they apply to (this can be done programmatically for all pages).
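A minimal sketch of this kind of programmatic tagging (the modifier logic, product names and store name are hypothetical examples):

```javascript
// Sketch: programmatically build title tags by appending a high-intent
// modifier discovered in keyword research. Values are hypothetical.
function buildTitleTag(product, storeName = 'Example Store') {
  const modifier = product.onSale ? 'On Sale' : 'Free Shipping';
  return `${product.name} - ${modifier} | ${storeName}`;
}
```

A template like this lets thousands of product pages pick up the modifier automatically, instead of editing title tags one by one.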

Sitemap creation:

The sitemap is very important due to the high number of pages (products) and the changing availability of those pages (products will be added/removed on a regular basis). It will be difficult for search engines to stay up to date with all the changes happening on the website using crawling alone; a sitemap will improve discoverability and crawlability.

Multiple sitemaps must be added to track each level separately (brand level, category level and product level), or even more granular than that:

  • categories-sitemap.xml
  • faceted-sitemap.xml
  • products-sitemap.xml
  • etc
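Generating one of those sitemap files can be sketched like this (the URLs are hypothetical examples; the output would be written to, e.g., products-sitemap.xml):

```javascript
// Sketch: generate a product-level XML sitemap from a list of URLs,
// following the sitemaps.org protocol. URLs are hypothetical examples.
function buildSitemap(urls) {
  const entries = urls
    .map((u) => `  <url><loc>${u}</loc></url>`)
    .join('\n');
  return `<?xml version="1.0" encoding="UTF-8"?>\n` +
    `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n` +
    `${entries}\n</urlset>`;
}
```

Regenerating the file whenever products are added or removed keeps search engines in sync without waiting for a recrawl.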

Internal linking:

Internal linking helps users navigate the website; it also helps search engines crawl and distribute authority to internal pages (especially pages that are not linked from the upper navigation). The most popular ways to provide internal linking for an ecommerce website are:

  • Navigation menus
  • Breadcrumbs
  • Pagination
  • Similar products, featured products and "customers who purchased this product also bought" sections

Internal search engine (site search):

An internal search engine will be very helpful for users, best practices to follow for an internal search engine:

  • It must be placed at the top of every page of the website
  • It must be tracked using Google Analytics
  • It must be optimized on a regular basis based on user behaviour data in Google Analytics

If the CMS does not have a sophisticated internal search engine and your website is indexed well by Google, consider using Google Programmable Search Engine.

The results of tracking the internal search in Google Analytics can help to optimize the information architecture and the upper navigation.

Product reviews:

It is very important to collect customer reviews after every successful transaction. This can be done using the Google Customer Reviews program or Google partners; doing so will enable the seller ratings extension that can be used in Google Ads. A review badge can also be used on the website, which can provide enhanced SERPs for some keywords.

A review form also could be added to the product page to allow all users (not only buyers) to provide products feedback.

Google Merchant Centre:

Google Merchant Centre is a vital tool for running shopping ads with Google. Google recently announced that they will start serving free product listings in their product search for websites that have a product feed in GMC.

A key component of creating a GMC account is a product feed; most popular online shopping carts provide a GMC-compatible feed either natively or through a plugin. GMC can now also build the feed from product schema when it is installed correctly on the website.


<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "image": "",
  "name": "Example Test",
  "description": "This is just a boring example",
  "offers": {
    "@type": "Offer",
    "priceCurrency": "USD",
    "price": "199.99"
  }
}
</script>
Schema installation

As discussed in the GMC section, schema installation is very important for ecommerce websites. Here is a list of other schema types that apply to ecommerce websites:

GSC monitoring:

Google Search Console provides a lot of valuable feedback for any website, and even more for ecommerce websites. It helps with monitoring:

  • Structured data (Schema) issues
  • Mobile issues
  • Duplicate content
  • Indexability, discoverability and crawlability reports (index coverage)
  • Crawling stats
  • Sitemaps analysis
  • Security issues

Core Web Vitals:

"Web Vitals is an initiative by Google to provide unified guidance for quality signals that are essential to delivering a great user experience on the web."


Core Web Vitals can be measured using Google's PageSpeed Insights tool, and GSC also provides a section to assess Core Web Vitals. Generally, a fast website that is easy for users to interact with will pass the Core Web Vitals assessment.

A good Core Web Vitals score should also help with conversion rate.



It is strongly recommended to set up tracking for all KPIs; an advanced Google Analytics setup will be required for that. Tracking should include:

  • Enhanced ecommerce tracking, which tracks the check-out process step by step and creates a funnel for it
  • Event tracking where applicable
  • Goal tracking with funnels where applicable
  • Form abandonment tracking (field by field)
  • Feed tracking using the UTM system
  • Internal search tracking
  • Website heatmaps
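Feed tracking with the UTM system simply means appending UTM parameters to every feed URL so Google Analytics can attribute the sessions. A small sketch, with hypothetical parameter values:

```python
# Sketch: tag a product-feed URL with UTM parameters so sessions from
# the feed can be attributed in Google Analytics. Values are examples.
from urllib.parse import urlencode

def add_utm(url, source, medium, campaign):
    params = urlencode({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    separator = "&" if "?" in url else "?"
    return f"{url}{separator}{params}"

tagged = add_utm("https://www.example.com/product/123",
                 "google", "feed", "shopping_feed")
print(tagged)
```

The same pattern applies to any non-organic traffic source you want to separate in reports.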

Conversion rate optimization

With the right tracking in place, CRO becomes an easier task. Studying user journeys and removing friction (especially in the check-out process) helps the website better utilize existing traffic, which in most cases has a better ROI than acquiring new traffic.


SEO fundamentals for ecommerce websites are similar to those for content websites; however, considering the size and complexity of ecommerce websites, a different approach may be required for some SEO steps.

Core Web Vitals Website Speed

How To Get Your Website Ready For Page Experience Core Web Vitals Update

Google has been tweaking their algorithm in the last few years toward user experience elements such as:

  • Website speed
  • Mobile friendliness
  • Secure connections (HTTPS)

In early 2020 the Google Chrome team announced Core Web Vitals; following that, Google announced that Core Web Vitals will be added to their algorithm as ranking factors, alongside the older user experience elements, in what SEOs call the page experience update.

Core Web Vitals measure real-world dimensions of web usability, such as load time, interactivity, and the stability of content as it loads. The three Core Web Vitals announced by Google for 2020 are:

  1. Loading: Largest Contentful Paint (LCP)
  2. Interactivity: First Input Delay (FID)
  3. Visual Stability: Cumulative Layout Shift (CLS)
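Each of the three metrics has published "good" and "poor" thresholds (2.5 s and 4.0 s for LCP, 100 ms and 300 ms for FID, 0.1 and 0.25 for CLS). A small sketch that classifies a measurement against them:

```python
# Sketch: classify the three Core Web Vitals against Google's published
# good / needs-improvement / poor thresholds.
THRESHOLDS = {
    "LCP": (2.5, 4.0),   # seconds
    "FID": (100, 300),   # milliseconds
    "CLS": (0.1, 0.25),  # unitless score
}

def classify(metric, value):
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= poor:
        return "needs improvement"
    return "poor"

print(classify("LCP", 2.1))  # good
print(classify("FID", 180))  # needs improvement
print(classify("CLS", 0.3))  # poor
```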

Due to COVID-19, Google has delayed launching the new algorithm update that includes Core Web Vitals (the page experience update), and will provide at least six months' notice before it rolls out.

How to evaluate Core Web Vitals for a website:

There are many tools that can help with that:

It is very important to understand the difference between lab data and field data before starting to optimize for Core Web Vitals.

From Google: "The field data is a historical report about how a particular URL has performed, and represents anonymized performance data from users in the real-world on a variety of devices and network conditions. The lab data is based on a simulated load of a page on a single device and fixed set of network conditions. As a result, the values may differ."

                          | Field Data                                 | Origin Summary                             | Lab Data
Internet connection speed | Different connection speeds for real users | Different connection speeds for real users | Latency: 150 ms; throughput: 1.6 Mbps down / 750 Kbps up
Machine hardware          | Different CPU, RAM, GPU per real user      | Different CPU, RAM, GPU per real user      | Mid-tier mobile (4x slowdown of a high-tier desktop)
Results time              | Average of last 28 days                    | Average of last 28 days                    | Current
Location                  | Real-world users' locations                | Real-world users' locations                | California
Result for                | Tested URL                                 | Whole domain                               | Tested URL

1- Loading - Largest Contentful Paint - LCP

"Largest Contentful Paint (LCP): measures loading performance. To provide a good user experience, LCP should occur within 2.5 seconds of when the page first starts loading."

The Largest Contentful Paint (LCP) metric reports the render time of the largest image or text block visible within the viewport (the hero element). Website speed used to be monitored using metrics like Speed Index or First Contentful Paint (FCP); those metrics are still useful for monitoring speed, but the new focus should be on improving LCP.

The best tool to measure and optimize LCP is Chrome Developer Tools: the timestamp represents LCP, and the highlighted area marks the hero element whose load completion determines LCP.


It is very important to run the LCP test using different internet speed modes and CPU speed modes, to better understand the LCP value for different users based on their internet connection and machine power. The same applies to the other Core Web Vitals and performance metrics.


How to optimize for LCP?

Optimizing for LCP is optimizing for speed; check my post on how to optimize for LCP. A good start is reviewing the recommendations provided by Lighthouse:


2- First Input Delay (FID)

"FID measures the time from when a user first interacts with a page (i.e. when they click a link, tap on a button, or use a custom, JavaScript-powered control) to the time when the browser is actually able to begin processing event handlers in response to that interaction."

Not all users will interact with your site every time they visit and the timing of the interaction is different, and not all interactions are relevant to FID. This means some users will have no FID values, some users will have low FID values, and some users will probably have high FID values.

FID is a metric that can only be measured in the field, as it requires a real user to interact with your page. You can measure FID only with the PageSpeed Insights tool, which uses the Chrome User Experience Report; read here to better understand the difference between lab data and field data. The good news is that there is a lab metric, Total Blocking Time, that can help diagnose and improve FID.

The Total Blocking Time (TBT) metric measures the total amount of time between First Contentful Paint (FCP) and Time to Interactive (TTI) during which the main thread was blocked long enough to prevent input responsiveness. No task should exceed 50 ms; Chrome Developer Tools reports Total Blocking Time along with the tasks that exceed the 50 ms mark:
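The arithmetic behind TBT is simple: for each long task, only the portion beyond the 50 ms budget counts. A sketch with made-up task durations:

```python
# Sketch: Total Blocking Time sums, for each main-thread task, the
# portion that exceeds the 50 ms budget. Durations are made-up examples.
def total_blocking_time(task_durations_ms, budget_ms=50):
    return sum(max(0, d - budget_ms) for d in task_durations_ms)

# Three tasks: 30 ms contributes nothing; 70 ms and 120 ms contribute
# their excess over 50 ms.
print(total_blocking_time([30, 70, 120]))  # (70-50) + (120-50) = 90
```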


How to optimize for FID:

Optimizing for LCP covers many elements that also improve FID; the most common causes of high FID to work on are:

  • Unnecessary JavaScript loading, parsing, or execution. While analyzing your code in the Performance panel, you might discover that the main thread is doing work that isn't really necessary to load the page. Reducing JavaScript payloads with code splitting, removing unused code, or efficiently loading third-party JavaScript should improve your TBT score.
  • Inefficient JavaScript statements. For example, after analyzing your code in the Performance panel, suppose you see a call to document.querySelectorAll('a') that returns 2000 nodes. Refactoring your code to use a more specific selector that only returns 10 nodes should improve your TBT score.

3- Cumulative Layout Shift (CLS):

"CLS measures visual stability. Have you ever been reading an article online when something suddenly changes on the page? Without warning, the text moves, and you've lost your place. Or even worse: you're about to tap a link or a button, but in the instant before your finger lands—BOOM—the link moves, and you end up clicking something else!"

This may be the easiest Core Web Vital to diagnose: by looking at the thumbnail tracker in Chrome Developer Tools and enabling Layout Shift Regions, it should be easy to find what is causing a high CLS:


How to optimize for CLS?

CLS diagnosis and optimization may be the easiest among all Core Web Vitals; in most cases CLS issues are visible to the human eye, and with some CSS and JS work, keeping the page stable while it loads should be achievable. Check the Experience section in Chrome Developer Tools, find which elements are moving, and see if you can:

  • Make them stable on the page
  • Reserve a placeholder for them from the beginning while waiting for them to load.

CLS is measured over the entire life cycle of the page (even if users keep it open for days), so the actual user experience value may differ from Lighthouse or Chrome Developer Tools (which measure only one trace).
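For intuition, each layout shift's score is its impact fraction (share of the viewport affected) multiplied by its distance fraction (how far it moved). A rough sketch: the real metric groups shifts into session windows and takes the worst window, but a plain sum illustrates the arithmetic:

```python
# Sketch: a single layout shift's score is impact fraction x distance
# fraction. Real CLS takes the worst session window of shifts; a plain
# sum is used here for illustration. Values are made-up examples.
def shift_score(impact_fraction, distance_fraction):
    return impact_fraction * distance_fraction

def cumulative_layout_shift(shifts):
    return sum(shift_score(i, d) for i, d in shifts)

# Two shifts: one affects 50% of the viewport and moves it 25% of its
# height; another affects 20% and moves it 10%.
print(round(cumulative_layout_shift([(0.5, 0.25), (0.2, 0.1)]), 3))
```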

The Marathon of a Good Core Web Vitals Score:

Keeping a good Core Web Vitals score is a marathon that requires constant website improvements; many variables can turn a score that used to pass into a failing one:

  • Website technology changes (hosting and CMS) and content changes.
  • User technology changes (the types of machines and internet connections they use)
  • Google changing their Core Web Vitals criteria and scoring

Monitoring the Core Web Vitals section in Google Search Console will be key to find any Core Web Vitals issues and start working on them as soon as they are detected.

Core Web Vitals improvement is not only a search engine best practice; it is also a user experience element that can lead to a better conversion rate.

Accessibility News

How To Comply With Website Accessibility Standards In Ontario WCAG


New public websites and significantly refreshed websites were required to be compliant with WCAG 2.0 Level A by January 1, 2014.

Furthermore, by January 1, 2021, all public websites and web content posted after January 1, 2012, must meet WCAG 2.0 Level AA (check this tool to find out if your website is compliant).

All large (50+ employees) private and non-profit organizations, and all public-sector organizations are required to make their websites accessible.

I am not sure what this is going to mean from a legal standpoint, or what actions the government will take against non-complying websites; since 2014 thousands of websites have been built and refreshed without fully complying with WCAG 2.0 A, and I have not heard of any penalties or legal issues.

I have contacted the government several times in the last few months trying to learn more about the tools they will use to audit websites and what the penalties are, but I did not get any answers.

Options to comply with WCAG

1- Check with your web developer

Most web developers have started to design websites with accessibility in mind. Check whether your web developer has the capability to make the whole website compliant with WCAG 2.0 AA, and expect to pay an additional cost for that.

If you are redesigning/refreshing your website ask your web developer to include WCAG compliance in the SOW.

2- Request an audit and pass it to your web developer for implementation

There are many audit providers including AODA

Audit providers do not offer implementation in most cases; the implementation must be done by the web developer, which means more cost: you pay the audit provider to conduct the audit plus the web developer to implement the recommendations in the audit.

The other issue with audits is that they rarely cover the whole website; providers take a sample of pages (the home page and other random pages) and use them in the audit.

3- Automated third-party accessibility services

There are many automated accessibility providers that can make websites compatible with WCAG 2.0 AA. There will be a monthly fee to use those services, and the cost will depend on the size of the website (number of pages, and traffic in some cases); for a small website, expect to pay $29+/month:

Please note that automated solutions may not offer a 100% compliance guarantee, but they are still far better than leaving the website without any accessibility optimization (if you do not have the resources to do it yourself).

4- Automated accessibility software/plugins

There are many automated accessibility software packages that can make websites compatible with WCAG 2.0 AA (especially if you are using WordPress).

Pros and cons of every method:

                            | 1- Web Dev A-Z                                      | 2- Audit + Web Dev                                  | 3- Service                  | 4- Software
Cost                        | High                                                | Very high                                           | Low                         | Lowest
100% WCAG 2.0 AA compliance | Guaranteed                                          | Guaranteed                                          | No guarantee                | No guarantee
Implementation              | Slow and difficult                                  | Very slow and difficult                             | Quick and easy              | Quick and easy
Longevity                   | Must be updated on major content or website changes | Must be updated on major content or website changes | No future updates required  | Software updates required (sometimes automated)

Tools to check accessibility compliance:

There are many tools that can help you test your website's compliance with WCAG; I will include three of them, which happen to have a Google Chrome extension (which normally makes things easier):

I also recommend using screen readers or screen reader simulators (this is what people with impaired vision use to browse the web):

Using screen readers will help you discover accessibility issues that cannot be caught by automated accessibility tools.

Core Web Vitals

How to Optimize For Largest Contentful Paint (LCP)

LCP (Largest Contentful Paint) is one of the most important Core Web Vitals; it is the only Core Web Vital that relates 100% to speed, and hence it is the most difficult one to diagnose and optimize for. LCP can be measured using Google PageSpeed Insights, which is powered by Lighthouse.

LCP should occur within 2.5 seconds; if 75% of page loads (regardless of which pages) on a website achieve that number, LCP is marked as passing the assessment. Only elements within the user's viewport (above the fold) are used to calculate LCP.
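The 75th-percentile rule can be sketched as a quick check. Sample values are made up, and the nearest-rank percentile below is an approximation of how CrUX aggregates real-user data:

```python
# Sketch: a page passes the LCP assessment when the 75th percentile of
# its real-user loads is at or under 2.5 s. Sample values are made up.
def passes_lcp(lcp_samples_s, threshold_s=2.5, percentile=0.75):
    ordered = sorted(lcp_samples_s)
    # Index of the 75th-percentile sample (nearest-rank method).
    idx = max(0, int(percentile * len(ordered)) - 1)
    return ordered[idx] <= threshold_s

print(passes_lcp([1.8, 2.1, 2.4, 3.9]))  # 75th percentile is 2.4 s -> True
```

Note that one slow outlier (the 3.9 s load above) does not fail the assessment as long as 75% of loads stay under the threshold.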


Google has provided a great tutorial on how to optimize for LCP; in this post I will provide a checklist and actionable directions that you can implement yourself or take to the web developer working on improving Core Web Vitals (mainly LCP).

15 improvements that you can make to improve LCP:

  1. Adequate resources on the server
  2. Enable browser caching
  3. Enable GZIP (text compression)
  4. Enable server caching like OPcache or reverse proxy
  5. Keep your software up-to-date (CMS, Plugins, operating system like Ubuntu, control panel like Cpanel, PHP, MySQL and Apache with HTTP2 module)
  6. Install a server side HTML caching for your CMS
  7. Setup browser side caching using service worker
  8. Minify JavaScript and CSS
  9. Compress images and use the right format for them
  10. Resize images to fit the required dimensions in the style sheet
  11. Remove or defer files and codes that block critical rendering path for above the fold content
  12. Inline critical CSS and critical JavaScript files where possible
  13. Lazy load images below the fold
  14. Use CDN for resources like images and JavaScript files or for the whole website
  15. Fast DNS server

I will provide more details on how to improve each of the items above; most of my examples will work best for WordPress hosted on Linux with Apache as the web server.

Adequate resources on the server:

Do not be cheap when it comes to web hosting; having more resources than your website needs is better than lacking resources. Start with a VPS, possibly 2 cores, 4 GB of RAM and an SSD, and keep an eye on your resource usage: your average CPU load must stay at one or below, and your RAM usage must average less than 50%. Add more resources if you cannot maintain those numbers.

Enable browser caching:

This setup can be done in most cases through the server's configuration; Apache has the mod_expires module, which lets you do that by adding the code below to your .htaccess file:

<FilesMatch "\.(webm|ogg|mp4|ico|pdf|flv|jpg|jpeg|png|gif|webp|js|css|swf|x-html|xml|woff|woff2|otf|ttf|svg|eot)(\.gz)?$">
<IfModule mod_expires.c>
AddType application/font-woff2 .woff2
AddType application/x-font-opentype .otf
ExpiresActive On
ExpiresDefault A0
ExpiresByType video/webm A10368000
ExpiresByType video/ogg A10368000
ExpiresByType video/mp4 A10368000
ExpiresByType image/webp A10368000
ExpiresByType image/gif A10368000
ExpiresByType image/png A10368000
ExpiresByType image/jpg A10368000
ExpiresByType image/jpeg A10368000
ExpiresByType image/ico A10368000
ExpiresByType image/svg+xml A10368000
ExpiresByType text/css A10368000
ExpiresByType text/javascript A10368000
ExpiresByType application/javascript A10368000
ExpiresByType application/x-javascript A10368000
ExpiresByType application/font-woff2 A10368000
ExpiresByType application/x-font-opentype A10368000
ExpiresByType application/x-font-truetype A10368000
</IfModule>
<IfModule mod_headers.c>
# The Expires header takes a date, not a max-age; use Cache-Control instead
Header set Cache-Control "max-age=10368000, public"
Header unset ETag
Header set Connection keep-alive
</IfModule>
FileETag None
</FilesMatch>

Enable GZIP (text compression):

This setup can be done in most cases through the server configuration; Apache has the mod_deflate module, which lets you do that by adding the code below to your .htaccess file:

<IfModule mod_deflate.c>
AddType x-font/woff .woff
AddType x-font/ttf .ttf
AddOutputFilterByType DEFLATE image/svg+xml
AddOutputFilterByType DEFLATE text/plain
AddOutputFilterByType DEFLATE text/html
AddOutputFilterByType DEFLATE text/xml
AddOutputFilterByType DEFLATE text/css
AddOutputFilterByType DEFLATE text/javascript
AddOutputFilterByType DEFLATE application/xml
AddOutputFilterByType DEFLATE application/xhtml+xml
AddOutputFilterByType DEFLATE application/rss+xml
AddOutputFilterByType DEFLATE application/javascript
AddOutputFilterByType DEFLATE application/x-javascript
AddOutputFilterByType DEFLATE application/x-font-ttf
AddOutputFilterByType DEFLATE x-font/ttf
AddOutputFilterByType DEFLATE font/opentype font/ttf font/eot font/otf
</IfModule>
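To see why text compression matters, a quick sketch using Python's gzip module on a repetitive HTML payload (the same kind of saving mod_deflate provides for HTML, CSS and JS; the markup is a made-up example):

```python
# Sketch: measure how much GZIP shrinks a repetitive text payload.
import gzip

html = ("<div class='product'><span>Example product</span></div>\n" * 200).encode()
compressed = gzip.compress(html)

print(len(html), len(compressed))
print(f"saved {100 * (1 - len(compressed) / len(html)):.0f}%")
```

HTML, CSS and JavaScript are highly repetitive, which is why GZIP routinely saves well over half the bytes on the wire.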

Enable server caching like OPcache or reverse proxy:

Server caching keeps frequently requested work in the server's memory so pages can be served quickly to users; PHP has OPcache, which caches compiled script bytecode, and a reverse proxy can cache whole responses.

Keep your software up-to-date:

There is a lot of technology involved in running a website:

  • The server and its software (Operating system like Linux, web server like Apache with HTTP2 module, programming language like PHP, database engine like MySQL, control panel like Cpanel and more)
  • CMS, like WordPress.
  • CMS add-ons like Plugins.

Keeping all this software up-to-date is vital for both performance and security.

Install a server side HTML caching for your CMS:

For a CMS like WordPress to generate a page, there are several PHP calls to the database, followed by an HTML version of the page passed by the web server (Apache) to the browser to parse and render; this is a long journey with many variables. Caching a copy of every page of the website as static HTML saves the server a lot of work and lets it pass a static HTML file straight to the browser for rendering, which is a big time saver. A plugin like WP Fastest Cache or WP Rocket can take care of that for WordPress.

Setup browser side caching using service worker:

The Service Worker API comes with a Cache interface that lets you create stores of responses keyed by request. A service worker can even enable the website to work fully offline after the first load.

Minify JavaScript and CSS:

Spaces and comments can inflate CSS and JavaScript files significantly; minifying those files, manually or with a plugin if you use WordPress, will improve performance. For WordPress, a plugin like WP Fastest Cache can do the minification.
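To illustrate what minification does, here is a deliberately naive CSS minifier that strips comments and collapses whitespace. Real minifiers are far more thorough; this is only a sketch of the idea, on a made-up stylesheet:

```python
# Sketch: a naive CSS minifier that drops comments and collapses
# whitespace. Real tools do much more (shortening values, merging rules).
import re

def minify_css(css):
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)  # drop comments
    css = re.sub(r"\s+", " ", css)                   # collapse whitespace
    css = re.sub(r"\s*([{}:;,])\s*", r"\1", css)     # trim around punctuation
    return css.strip()

source = """
/* main button style */
.button {
    color: #fff;
    background: #0a7;
}
"""
print(minify_css(source))  # .button{color:#fff;background:#0a7;}
```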

Compress images and use the right format for them:

Images come in different formats for different purposes (JPEG, PNG, GIF, SVG), and each format also has variants, like JPEG 2000; choosing the right format and resolution can be a huge size saver. For WordPress, plugins like ShortPixel make image compression an easy process.

Consider using WebP; it can reduce image size by around 30% compared to JPEG or PNG.

Resize images to fit the required dimensions in the style sheet:

The image dimensions should always match the space required on the page; a 1000×1000-pixel image in a 100×100-pixel space adds a lot of unnecessary bytes to the total page size.
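A back-of-the-envelope sketch of the waste, using a rough decoded-size estimate of width × height × 3 bytes for RGB (compressed transfer sizes vary, but the decoded memory cost scales the same way):

```python
# Sketch: rough decoded-size estimate (width x height x 3 bytes for RGB)
# showing why serving a 1000x1000 image into a 100x100 slot is wasteful.
def decoded_bytes(width, height, bytes_per_pixel=3):
    return width * height * bytes_per_pixel

oversized = decoded_bytes(1000, 1000)
right_size = decoded_bytes(100, 100)
print(oversized // right_size)  # the oversized image decodes to 100x the data
```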

Remove or defer files and codes that block critical rendering path for above the fold content:

Files that block the critical rendering path can be identified using the Coverage tab in Chrome Developer Tools:

Coverage section chrome developer tools

Files that are not used for critical rendering should be deferred. If you have multiple JavaScript files that are partially used for the critical rendering path, you can combine them into two files: one with all the code required for the critical rendering path (critical.js/critical.css) and another with the code that is not (non-critical.js/non-critical.css); the second file must be deferred.

Consider also preloading files that are required for critical rendering path:

<link rel="preload" as="script" href="critical.js">

<link rel="preload" href="critical.woff2" as="font" type="font/woff2" crossorigin>

<link rel="preload" href="critical.css" as="style">

<link rel="preload" href="bg-image-narrow.png" as="image" media="(max-width: 600px)">


You can defer JavaScript and CSS files using the code below:

<script src="demo_async.js" async></script>

<link rel="preload" href="styles.css" as="style" onload="this.onload=null;this.rel='stylesheet'">
<noscript><link rel="stylesheet" href="styles.css"></noscript>

You can also consider inlining the CSS and JS code required for the critical rendering path (read more below).

Inline critical CSS and critical JavaScript files where possible:

Web page rendering is blocked by the browser until included resources like JS and CSS are fully loaded; keeping CSS or JS code that the critical rendering path requires in external files adds an extra step to the rendering process (extra requests to the server). Inlining the code required for the critical rendering path and putting the rest in deferred files will improve website speed.

Lazy load images below the fold:

Images below the fold contribute to page load time but are not seen by users during the initial page load; lazy loading those images reduces load time without negatively affecting user experience.

WordPress 5.4 will natively support lazy loading; for other CMSs you can use a lazy load library, or you can use the browser's native lazy load attribute:

<img src="image.png" loading="lazy" alt="…" width="200" height="200">

Use CDN for resources like images and JavaScript files or for the whole website:

CDN has a lot of features that can help with speed:

  • Provides files to users from data centres close to their physical location, which improves website speed
  • Reduces server load, as most files are served from the CDN provider's servers (normally powerful servers with load balancing)
  • Firewall with DDoS protection
  • Caching content as static HTML (server-side caching)

For WordPress, the Jetpack plugin can provide image and JS file caching; you can also use providers like Cloudflare, which offer full website caching.

Fast DNS server

DNS is the system that connects a readable domain name, which humans can read and remember, to an IP (Internet Protocol) address, which a machine can understand and handle. This process takes less than 300 ms in most cases, but the difference between a slow DNS and a fast DNS is worth it considering how cheap good DNS providers are.



Be aware that optimizing for speed is not a one-and-done process; there is always something to improve. Keep monitoring your Core Web Vitals using Google Search Console and make sure your website maintains a good score.

Core Web Vitals Website Speed

Lab Data vs Field Data vs Origin Summary in Core Web Vitals

PageSpeed Insights, along with Google Search Console, is the best tool when it comes to assessing a website's performance against Core Web Vitals (LCP Largest Contentful Paint, FID First Input Delay and CLS Cumulative Layout Shift). However, the terminology used in those tools can sometimes be confusing.

One of the most popular questions about Core Web Vitals assessment is: what is the difference between lab data, field data and origin summary? In this post I will try to answer that question.

core web vitals

Lab data:

Things you need to know about lab data:

  • Lab data is powered by Lighthouse technology that simulates mobile throttling with a lower speed. Read more here.
  • Lighthouse in most cases uses a slower CPU than the average CPU available to users: "Lighthouse applies CPU throttling to emulate a mid-tier mobile device even when run on far more powerful desktop hardware".
  • Lighthouse in most cases uses one location (USA).
  • Lab data is generated only for the URL tested in the Google PageSpeed Insights tool.
  • Lab data is live data; it reflects the speed at the time the test was run.
  • Some user interaction metrics like FID are not visible in lab data.

When you run Lighthouse in Chrome Developer Tools, at the end of the report you can see the settings used in that experiment:

lighthouse test

Looking at the settings above, we can easily see how Lighthouse uses a slower internet connection and throttles CPU power. Lighthouse does that so the test covers online users who use different types of computers and connect to the internet at different speeds.

Field data:

Field data is generated by real, everyday Google Chrome users, with:

  • Different computers/phones (different resources like CPU, RAM and GPU).
  • Different internet speed.
  • Different locations.
  • Field data is generated only for the URL tested in PageSpeed Insights.
  • Field data is based on the Chrome User Experience Report (CrUX), so it is not live data; it is the average user data collected over the last 28 days for the tested URL.
  • Google uses the 75th percentile value of all page views to the tested page to produce the score: if at least 75 percent of page views to the tested page meet the "good" threshold, the page is classified as having "good" performance for that metric.

So if most of your users come from a population with high internet speeds and powerful devices, it is normal to see field data better than lab data.

On the other hand, if your server is overloaded at the time you run the test, it is normal to see lab data recording higher numbers than field data.

Origin summary:

This is very similar to field data, but it represents the average performance of all pages on your website/domain.

Google uses the 75th percentile value of all page views to that site/domain to produce the score: if at least 75 percent of page views to the tested site meet the "good" threshold, the site is classified as having "good" performance for that metric.

The default Core Web Vitals charts you see in Google Search Console represent the history of the origin summary:


Which data set should you care about most?

Google will use the field data (CrUX) and origin summary data to judge your pages/website; that is also the data available in Google Search Console.

What I recommend monitoring in the PageSpeed Insights report is the origin summary data, which is most likely the data that will be used by future Google algorithm updates. It is fine to use lab data while working on speed improvements, as you need instant feedback.

                          | Field Data                                 | Origin Summary                             | Lab Data
Internet connection speed | Different connection speeds for real users | Different connection speeds for real users | Latency: 150 ms; throughput: 1.6 Mbps down / 750 Kbps up
Machine hardware          | Different CPU, RAM, GPU per real user      | Different CPU, RAM, GPU per real user      | Mid-tier mobile (4x slowdown of a high-tier desktop)
Results time              | Average of last 28 days                    | Average of last 28 days                    | Current
Location                  | Real-world users' locations                | Real-world users' locations                | California
Result for                | Tested URL                                 | Whole domain                               | Tested URL

Why do I not see field data or origin summary data?

Since those two data sets are actual user data coming from the Chrome User Experience Report (CrUX), it is normal for low-traffic websites not to see data in one or both of them.

In a situation like that, it is recommended to adjust Chrome Developer Tools to match the most popular connection speed used by your users and run a lab test using those settings: