UX and CRO

User Experience, Conversion Rate and CTR Optimization

Conversion rate optimization (CRO), UX and CTR optimization can now be considered part of SEO. SEOs are supposed to optimize websites to score well on known ranking factors (Google uses 200+ of them, but only a few are known to SEOs), but do search engines have any ranking factors related to CTR, CRO and UX? In the early days search engines did not use any user experience related ranking factors, but that has changed in recent years. The ones Google has admitted to using as ranking factors (with a small impact on results) are:

  • Website speed
  • Secure websites (HTTPS)
  • Mobile-first indexing (Google wants to understand the user experience on mobile phones, especially in the above-the-fold area)

Let us also remember that Googlebot has evolved a lot; it can now crawl, parse and render pages with a better understanding of the layout (e.g. what content is above the fold, what is visible to users on the first load and what is not). Understanding the user experience on websites is becoming a high priority for Google, especially in the mobile era, where the limited screen space makes it difficult for developers to create a good user experience. The other looming question is whether Google will use the user experience data available to it as ranking factors in the future, if it is not already doing so. Google owns:

  • The most popular browser on the web (Chrome)
  • The most popular email service (Gmail)
  • Cookies on almost every website on the web (via Google Analytics, YouTube and more)

Google can go beyond link profile and on-page SEO analysis to find quality websites; there are plenty of user experience signals available to it now that it could use as ranking factors.

To be clear, I am not asking you to be a UX expert; you just need enough knowledge to cover the basics. There are four types of UX optimization you can do, each with a different level of difficulty and time requirement. I will walk you through them one by one, and you can choose what to apply to each project depending on its needs and the available budget:

  • Web design standards and conventions of the web
  • Experimental (A/B tests and multivariate testing)
  • Data driven
  • User feedback driven

Web design standards and conventions of the web

People visit 5-7 websites per day on average, and most websites follow very similar standards that users have come to expect when they visit a new website. Some of them are:

  • A logo in the upper left corner of the page that links back to the home page
  • An upper navigation menu in the header of the page, which in most cases includes a contact page
  • A slider or a page-wide image at the top, which is becoming popular in modern design
  • A phone number placed in the header and the footer of the website, and social media profiles linked from the header or footer
  • A design that adapts to all device sizes (responsive design)
  • A call to action in the header area of the website (read more, get a quote, etc.)
  • A loud and clear message above the fold (marketing message, company vision, summary of the service)

By checking a website's design against these conventions you will be able to come up with a few recommendations to improve the user experience.

Experimental: A/B testing and multivariate testing

You can serve users two different copies of a page using technologies like Google Optimize: either versions where the differences are small (multivariate testing) or two significantly different pages (A/B testing). Once you have enough data, you can identify the winning version (the one with the lower bounce rate and better conversion rate) and use it on the website.
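
To make the idea concrete, here is a minimal client-side split test sketch (a generic illustration, not the Google Optimize API; the cookie name, headline text and dataLayer event are placeholders, and the script assumes it runs near the end of the body, after the h1 exists):

  <script>
    // read the assigned variant from a cookie, or assign one at random
    var variant = document.cookie.indexOf('ab_variant=') > -1
      ? document.cookie.split('ab_variant=')[1].split(';')[0]
      : (Math.random() < 0.5 ? 'A' : 'B');
    document.cookie = 'ab_variant=' + variant + '; path=/; max-age=2592000';
    // variant B gets a different headline; everything else stays the same
    if (variant === 'B') {
      document.querySelector('h1').textContent = 'Get a Free Quote in 24 Hours';
    }
    // record the variant so conversions can be segmented later (e.g. in GTM/GA)
    window.dataLayer = window.dataLayer || [];
    window.dataLayer.push({ event: 'ab_test_view', abVariant: variant });
  </script>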

Landing page services like Unbounce and Instapage offer similar testing features now.

Please note that you need a significant amount of traffic and conversions for those tests to give you trustworthy results, which is not available to small websites with only a few conversions per week. For small websites, follow conversion rate best practices (e.g. a prominent call to action) and the conventions of the web when doing CRO.

Data driven:

Technology can provide us with many user behaviour signals without users even noticing (we still have to respect users' privacy). A few examples:

  • Mouse movement tracking: you can use Crazy Egg, Hotjar or a free heat map service; once you have enough data you should be able to restructure the page to make it more effective for users
  • GA (Google Analytics): a good starting point is analyzing pages with a high bounce rate; in most cases this is a signal that users are not finding what they are looking for quickly. Find the top keywords that drive traffic to those pages and make sure users get their questions answered quickly, without extra actions like scrolling down or clicking on a tab or a button
  • Landing pages with a low conversion rate should be analyzed and re-optimized; a low conversion rate could be the result of a poorly structured page, or a signal of low trust due to a lack of social proof, so adding partner and award badges or client and user testimonials can help
  • Slow loading pages, which you can find in GA, present an opportunity for improvement; improving page speed will have a positive impact on conversion rate
  • Low CTR (click-through rate) in the SERP (search engine results page): this data is available in GSC (Google Search Console). A low CTR could be a signal that your content is not relevant enough for that search query or that the snippet used by search engines is not good enough. Always make sure title tags and description tags entice users to click, and if Google is picking the SERP snippet from another section of the page, optimize that text to make it more relevant to the target keyword and more attractive to click (see the sketch after this list)
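
As a rough illustration of the CTR point above, here is a small JavaScript sketch that flags queries whose CTR in a GSC export sits well below a typical CTR for their average position (the benchmark percentages are illustrative assumptions, not Google's numbers):

  // rows: data exported from GSC, e.g. { query: 'web hosting', position: 3, ctr: 0.01 }
  var benchmarkCtr = { 1: 0.25, 2: 0.12, 3: 0.08, 4: 0.06, 5: 0.05 };
  function lowCtrQueries(rows) {
    return rows.filter(function (row) {
      var expected = benchmarkCtr[Math.round(row.position)] || 0.03;
      return row.ctr < expected / 2; // less than half the expected CTR for that position
    });
  }
  // lowCtrQueries([{ query: 'web hosting', position: 3, ctr: 0.01 }]) -> flagged for snippet rework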

There are many other data sources where you can find user behaviour signals that you can use to optimize your website for a better user experience and, eventually, better rankings in search engines.

User feedback based:

Surveying website users with general questions about what they think of the website or what changes they would like to see is a very effective usability improvement tactic. This can be done using services like five second testing, where you upload a screenshot of a website and ask people to look at it for five seconds before answering a few questions like:

  • What services does this website offer?
  • What phrases or words do you remember from the website?
  • What do you think about the design of the website?

There are many solutions for running a survey on a website, depending on the CMS you are using; solutions that rely on embedded JavaScript code can work on almost any website, and Getsitecontrol is one example.

Trust:

I tried my best not to tap into marketing while talking about user experience and conversion rate, but that is really difficult: you can create the best landing page on the web and it will still convert worse than a poorly designed landing page if users do not trust it. Think about a bank like Royal Bank offering a wealth management service versus Joe the financial adviser offering the same service; even if Joe has a very well optimized landing page and offers the same service at a lower price, Royal Bank will have a better chance to convert. Here are a few tips to help overcome that:

  • Be a brand: create a nice logo and use it consistently, design a professional website, pick colours and fonts for your brand and use them on all marketing material, and advertise on different channels, especially community events
  • Use social proof signals like user testimonials, case studies, stats and award or mention badges (e.g. "as seen in the Toronto Star")
  • Offer quick communication options like online chat

Finally, even if search engines never use more user experience signals as ranking factors, providing a good user experience should always be a priority. User experience can even affect other areas of SEO like attracting links: do people link to websites where they had a bad user experience? The answer is no.

Next step: tracking and analytics

Technical SEO

Technical SEO

Technical SEO refers to the optimization work done on a website's technical infrastructure (HTML source code, server-side code, hosting, assets and more) to make it search engine friendly.

When it comes to technical SEO, there are a few buzzwords that you always need to keep in mind:

  • Crawlability
  • Renderability
  • Indexability (readability)

How search engines work (mainly Google):

When search engines find a new web page, they send their crawlers (software that behaves like a browser installed on a powerful computer) to read the source code of that page and save it to the search engine's storage servers to be parsed and processed later.

The second step is parsing that source code and possibly saving missing resources like images, CSS and other dependencies. If the crawler decides that the content in the source code is readable without rendering (which happens when the content is included in plain HTML or a simple JavaScript format), it will start processing it by turning that source code into structured data (think of an Excel sheet with columns and rows) that will eventually become a database. Considering the size of the web, searching raw files and returning results quickly is impossible, but searching a database and returning results in less than a second is possible. In order to turn web content into a searchable database, search engines need to isolate text from code; if they find content with Schema markup (structured data), turning it into a database becomes a lot easier, and HTML elements like title tags and description tags are also easy to process.

If the crawler decides that this is a JavaScript-heavy page that needs to be rendered to extract the text content, a more advanced crawler will be sent later (sometimes after a few days) to render the page and get it ready for processing. Watch the video below to understand how Google's crawlers work.

Renderability was not a thing when search engines started; it is a very resource intensive and slow process, but with more websites using advanced JavaScript frameworks like Angular (where the source code sometimes contains no text content at all), search engines started to see a need to fully render the page in order to capture the content. The other benefit of fully rendering the page is understanding the location and state of the content on the page (hidden content, above the fold or below the fold, etc.).

Currently the only search engine that does rendering well is Google. They send a basic crawler at the beginning (it can understand simple JavaScript but not frameworks like Angular), then come back later with another crawler that can fully render the page. Check this tool to see how a web page renders with Googlebot.

Once processing is complete, a functional copy of the page is stored on the search engine's servers and eventually made available to the public via the cache:URL command. At this point the page is fully indexed and able to rank for whatever keywords the search engine decides it is relevant to, based on the quality of the content and the authority of the website (in other words, the ranking algorithm).

Crawlability Optimization (speed and mobile friendliness):

The first step in making a website crawlable is providing access points for the crawlers, like:

  • Site map
  • Internal links from other pages
  • External Links
  • RSS feed
  • URL submission to Google or Bing Search Console, or using their indexing API where available (only offered for a few industries)

Discuss with your webmaster what happens when new content is added to the website and make sure there is at least one access point available for the crawler to find that page (ideally a sitemap entry plus one or more internal links from prominent pages). A modern CMS like WordPress will provide access points automatically when you add a new post, but not when you add a new page; that is where you need to manually modify the information architecture to include a link to that page.
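
For reference, a minimal sitemap.xml with a single entry looks like this (the URL and date are placeholders):

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>https://www.example.com/new-service-page/</loc>
      <lastmod>2019-03-01</lastmod>
    </url>
  </urlset>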

Each piece of content available on a website must have its own dedicated URL, and this URL must be clean, not fragmented with characters like #. A Single Page Application (SPA) is an example of a situation you need to avoid, where the whole website operates on a single URL (normally the root domain, with the rest as fragment URLs); in this case technologies like AJAX (a mix of HTML and JavaScript) load content from the database to answer any new page request (#anotherpage). Users will not see any issue with that, but search engines will not be able to crawl the website, since they use dedicated URLs as the keys that define pages in their index, and those keys are not available here because search engines ignore fragment URLs entirely (see the sketch below).
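
A rough sketch of the difference: instead of hash fragments, the History API can give each view its own clean, crawlable path (loadView() is a hypothetical function standing in for whatever renders the content, and the server still has to return real content for those paths, which is where pre-rendering, covered below, comes in):

  // fragment URLs like example.com/#services are ignored by search engines;
  // pushing a real path such as /services keeps one dedicated URL per view
  document.querySelectorAll('a[data-view]').forEach(function (link) {
    link.addEventListener('click', function (event) {
      event.preventDefault();
      history.pushState({}, '', link.getAttribute('href')); // e.g. href="/services"
      loadView(link.dataset.view); // hypothetical renderer for that view
    });
  });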

Search engines have a limit on the number of pages they can crawl from a website in a session and in total; they call it the crawl budget. Most websites (those with fewer than a million pages and without a lot of new content added every day) do not need to worry about that. If you have a large website, you need to make sure new content is crawled and indexed quickly; providing strong internal links to new pages and pushing them to the sitemap quickly helps a lot.

URL management: many websites use parameters in URLs for different reasons; many eCommerce websites use them to provide pages for the same product in different colours or sizes (faceted navigation). Sometimes search engines are able to index those pages and end up with a nearly infinite number of pages to crawl, which can create a crawling issue as well as a duplicate content issue. Search engines provide webmasters with different tools to control crawlability and indexability by excluding pages from crawling; ideally, if the website is structured well, there will be little need to use any of the tools below:

  • Robots.txt: a file located at the root of your website where you can provide rules and directions to search engines on how to crawl the website; you can disallow search engines from crawling a folder, a pattern, a file or a file type (see the sample robots.txt after this list)
  • Canonical tags: <link rel="canonical" href="https://www.wisamabdulaziz.com/" /> placed in page B's header to tell search engines that the page with the original content is page A, located at that canonical URL. Canonical tags are a good alternative to 301 redirects because they do not need any server-side coding, which makes them easier to implement
  • Redirects: I mean here server-side redirects (301, for example), which tell search engines that page A has moved to page B; they should be used only when the content of page A has actually moved to page B. They can also be used when there is more than one page with very similar content
  • Meta refresh: <meta http-equiv="refresh" content="0;URL='http://newpage.example.com/'" /> normally located in the header area, it directs browsers to redirect users to another page. Search engines honour meta refresh, and when the waiting time is 0 they treat it like a 301 redirect
  • Noindex tags: <meta name="robots" content="noindex"> should be placed in the header of any page that you do not want search engines to index
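
As an example of the first item in this list, a simple robots.txt might look like this (the paths are placeholders; the Sitemap line is optional but useful):

  User-agent: *
  Disallow: /search/
  Disallow: /*?sessionid=
  Allow: /wp-content/uploads/
  Sitemap: https://www.example.com/sitemap.xml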

The final thing to optimize for crawlability is website speed, which is also a ranking factor. A few quick steps you can take to build a fast website:

  • Use a fast host and always make sure you have extra resources with your host; if your shared host is not doing the job, upgrade to a VPS or a dedicated server, it is a worthwhile investment
  • Use a reliable fast CMS like WordPress
  • Cache your dynamic pages (WordPress posts for example) into an HTML format
  • Compress and optimize images
  • Minify CSS and JavaScript

To test your website you can use the speed testing tools listed here.
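
To illustrate a few of the tips above on the page itself, minified assets, deferred JavaScript and properly sized, compressed images all reduce the work the browser has to do (the file names below are placeholders):

  <link rel="stylesheet" href="/assets/styles.min.css">
  <script src="/assets/app.min.js" defer></script>
  <img src="/images/team-800.jpg"
       srcset="/images/team-400.jpg 400w, /images/team-800.jpg 800w"
       sizes="(max-width: 600px) 400px, 800px"
       alt="Our support team" width="800" height="533">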

Googlebot for Mobile:

With mobile users surpassing desktop users a few years ago, mobile friendly websites have become more important to search engines and web developers. Search engines like Google have created a mobile crawler to better understand how a website will look for mobile users, and when they find a website ready for mobile users they set the mobile crawler as the default crawler for that website (what they call mobile-first). There are a few steps you can take to make sure your website is mobile friendly (for both users and crawlers):

  • Use responsive design for your website (see the snippet after this list)
  • Keep important content above the fold
  • Make sure your responsive website is error free; you can use GSC for that, or Google's Mobile-Friendly Test
  • Keep the mobile version as fast as possible; if you cannot do that for technical or design reasons, consider using Accelerated Mobile Pages (AMP)
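
A minimal responsive setup looks like this (the breakpoint and class name are illustrative):

  <meta name="viewport" content="width=device-width, initial-scale=1">
  <style>
    .sidebar { float: right; width: 30%; }
    @media (max-width: 768px) {
      /* stack the layout on small screens so the main content stays near the top */
      .sidebar { float: none; width: 100%; }
    }
  </style>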

Renderability Optimization:

The best optimization you can do for renderability is to remove the need for search engines to render your website with a second (advanced) crawl. If your website is built on an advanced JavaScript framework like Angular, it is strongly recommended to give crawlers like Googlebot a pre-rendered HTML copy of each page, while regular users still get the Angular version of the website (this practice is called dynamic rendering). This can be done using a built-in feature like Angular Universal or a third-party solution like Prerender.io.
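
A rough sketch of dynamic rendering with a Node/Express server (this is not the Prerender.io middleware itself; it assumes pre-rendered HTML snapshots already exist in a ./prerendered folder, and the bot list is simplified):

  const express = require('express');
  const path = require('path');
  const app = express();
  const BOTS = /googlebot|bingbot|yandex|baiduspider/i;

  app.get('*', (req, res) => {
    if (BOTS.test(req.headers['user-agent'] || '')) {
      // crawlers get a static HTML snapshot of the requested page
      const page = req.path.slice(1).replace(/\//g, '_') || 'index';
      res.sendFile(path.join(__dirname, 'prerendered', page + '.html'));
    } else {
      // regular users still get the JavaScript (e.g. Angular) version
      res.sendFile(path.join(__dirname, 'dist', 'index.html'));
    }
  });

  app.listen(3000);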

Do not block resources (CSS, JavaScript and images) that are needed to render the website. Back in the day, webmasters used to block those resources with robots.txt to reduce the load on the server or for security purposes. Inspect your website using Google Search Console, and if you see any blocked resources that Google needs to render the website, discuss with your web developer how you can safely allow those resources to be crawled.

Indexability and Readability (Schema)

When a website is optimized well for crawlability and renderability, indexability is almost automatically taken care of. The key point for indexability is providing a page with a dedicated, clean URL that returns unique content with substance and loads fast, so search engines can crawl it and store it on their servers.

Content that can cause indexability issues:

  • Thin content, as it may not be kept in the index
  • Duplicate content
  • Text embedded in images, SWF files, video files or complex JavaScript files; this type of content will not make its way into the index and will not be searchable in Google

Content that can help search engines with parsing and indexability:

  • Structured data, mainly Schema, can help search engines turn content into a searchable database almost without any processing; eventually that will help your website earn rich results in the SERP (an example is the five-star review snippet Google adds for some websites, see the example below)
  • Using HTML markup to organize content (i.e. <h2>, <strong>, <ol>, <li>, <p>) makes it easier for search engines to index your content and show it, when applicable, in featured snippets like the answer box
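
As mentioned in the first item above, here is what a simple Schema (JSON-LD) block could look like for a local business with reviews (the business name and numbers are placeholders):

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Dental Clinic",
    "telephone": "+1-416-555-0123",
    "aggregateRating": {
      "@type": "AggregateRating",
      "ratingValue": "4.8",
      "reviewCount": "112"
    }
  }
  </script>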

Monitoring and error fixing:

Continuous monitoring of a website's crawlability and indexability is key to avoiding any situation where part of the website becomes uncrawlable (it could be your webmaster adding a noindex tag to every page on the website). There are different tools that can help you with that:

  • Google Search Console (GSC): after verifying your website with GSC, Google will start providing you with feedback about your website's health. The index coverage report is the most important section of the dashboard to keep an eye on for crawlability and indexability issues, and Google will send messages through the message centre (there is an option to forward them to your email) for serious crawlability issues
  • Crawling tools: SEMrush, Ahrefs, OnCrawl and Screaming Frog can be helpful for finding errors
  • Monitor 404 errors in Google Analytics and GSC; make sure to customize your 404 error pages and add the words "404 not found" to the title tag so it becomes easier to find 404 error pages using Google Analytics (see the example after this list)
  • Monitor indexability: check whether the number of indexed pages in GSC makes sense for the size of your website (it should not be too big or too small compared to the actual number of unique pages on your website)
  • Monitor renderability using the URL Inspection tool in GSC; make sure Google can render the page as closely as possible to how users see it, and pay attention to blocked resources that are required to render the website (the URL Inspection tool will notify you about them)
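
For the 404 point above, a custom 404 template can be as simple as the fragment below (the site name is a placeholder); the "404 not found" text in the title makes those visits easy to isolate in a GA page title report:

  <!-- in the head of the custom 404 template -->
  <title>404 Not Found | Example.com</title>
  <!-- in the body -->
  <h1>Sorry, we could not find that page</h1>
  <p>Try the <a href="/">home page</a> or use the site search.</p>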

 

Index coverage monitoring and analysis definitely needs to be a service you offer to your clients as an SEO specialist; a monthly or quarterly GSC audit is strongly recommended.

Next step: user experience and conversion rate optimization

 

Off Page SEO

Off Page SEO and Link Building

Off page SEO refers to all the activities you do that do not require making any changes to your website or your content but can still help with ranking. The most popular off page SEO activity is link building. Before explaining more about link building, I want you to be aware that, historically, the harshest penalties Google has applied to websites were all driven by link building activities that go against Google's guidelines. I have been through this myself: I relied on artificial link building (mainly article syndication with keyword-rich anchor text) and ended up with manual actions for some websites in March 2012, then got hit again by the Penguin update in April 2012. So take your time to learn about link building before acquiring any links or hiring someone to do that for you.

Why are inbound links the backbone of Google's ranking algorithm?

Think about how marketing happens in the offline world. You can say anything you want about yourself in a resume, but at the end of the day the employer will call some references to verify it. When we hire a business or buy a product we try to find other people who have used them before and hear about their experience (in other words, reviews). We watch celebrities doing TV ads all the time, and as marketers we know those ads work well because people like and trust those public figures, so an endorsement from them has a bigger impact. Ask yourself: if you had the choice between listing the Prime Minister of Canada or a 15-year-old child as a reference on your resume, which would you choose?

Let us take the above analogy to the web. You can say anything on your website and do perfect on page SEO, but so can a thousand other websites, so how can search engines rank a thousand or more well-optimized websites for a keyword like "web hosting"? In the early days of search engines, Yahoo could not figure out a solution and ended up showing spammy, low quality results, which opened the door for Google to come up with PageRank (more on it here): a score from 0 to 10 for every web page in Google's index, based on the PageRank and number of the pages linking back to (voting for) that page. They gave PR a big weight in their algorithm, and the result was a huge improvement in the quality of their results.
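
For reference, this is the formula published in the original PageRank paper, where d is a damping factor (usually set to 0.85), T1...Tn are the pages linking to page A, and C(T) is the number of outbound links on page T:

  PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )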

What makes a quality link: 

When I started doing SEO in 2004, and until 2010, the answer was clear and easy: Google's PageRank. You simply needed to get as many links as possible from pages with a high PageRank (hopefully 10 if you could). In 2010 Google started to delay PageRank updates, then stopped updating it, and finally made PR invisible to all webmasters. That does not mean it is no longer used by Google; it is still a big part of their algorithm, but webmasters cannot see it anymore. Many third party tools saw an opportunity and started building their own versions of PageRank (DA and PA by Moz, TF and CF by Majestic, UR and DR by Ahrefs, Authority Score by SEMrush), but none of them has successfully replaced PR. People new to the SEO industry and in-house managers swear by some of them, like DA, and link traders use them a lot to sell links too.

So what public metrics can you safely use to evaluate the quality of a web page, and how do you do link prospecting? The answer is not very difficult, considering that there are two metrics based on Google's own data that we can use to judge the quality of a link, plus a couple of other considerations:

  • The ranking of that page for keywords relevant to our target keywords, which you can check simply by searching Google for those keywords. This is a manual process that can take a long time, but it works very well
  • The organic traffic of the root domain, which you can find using services like SEMrush and Ahrefs. If you are wondering how they get those numbers without Google Analytics access to every website on the web: they estimate it with a good level of accuracy. They have the rankings of billions of keywords in their databases, including search volumes, and they know the first 100 websites that rank for each of those keywords, so by applying some database queries and some educated guesswork (for example, #1 gets 25% of clicks, #2 gets 12%, etc.) they can predict monthly organic traffic (see the sketch after this list). For link prospecting purposes this number does not have to be accurate; as long as the domain has 500+ organic visits per month according to SEMrush, for example, the link is worth it
  • The relevancy of the link is a helping factor too, and not only for search engines: relevant links can generate useful traffic that turns into business. If you are providing link building services you need to pay extra attention to relevancy, as many customers cannot differentiate link building from branding and public relations, and they will not be happy to see a link from an irrelevant or poorly designed website
  • A good link (from an SEO perspective) should not carry a "nofollow" attribute, which was created by search engines to signal that no authority or link juice should be passed to the destination URL: <a href="https://www.targeturl.com" rel="nofollow">click here</a>
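
A simplified sketch of the educated guesswork described in the second item above (the click-through percentages are illustrative assumptions, not SEMrush's actual model):

  // rankings: [{ keyword, position, searchVolume }] pulled from a rank database
  var ctrByPosition = { 1: 0.25, 2: 0.12, 3: 0.08, 4: 0.06, 5: 0.05 };
  function estimateMonthlyOrganicTraffic(rankings) {
    return rankings.reduce(function (total, r) {
      return total + r.searchVolume * (ctrByPosition[r.position] || 0.02);
    }, 0);
  }
  // estimateMonthlyOrganicTraffic([{ keyword: 'web hosting toronto', position: 2, searchVolume: 1900 }]) ≈ 228 visits/month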

Link profile anatomy:

It is very important to analyze the link profile of any new website you are working on to decide whether you need a link building campaign or not. Tools like SEMrush, Majestic and Ahrefs can help you a lot with that; they can also help you conduct a link profile competition analysis to find the link gap with competitors. Analyzing the anchor text is also very helpful for sizing that gap, and in some cases anchor text analysis can reveal link building activities that are against Google's guidelines. If you find these for your client, you need to learn more about the link building campaign that led to those links, and if those links caused big damage you need to make that clear to your client and lower their expectations about the success of your own link building campaign.

Then you need to understand the types of links your client is getting and how they were acquired; this will help you find opportunities and build your link building strategy.

Natural links:

If you are an active business in North America you will naturally attract links from business profile pages like Yellow Pages and business directories. Other business avenues you are registered with, like the chamber of commerce, will link back to you as well. Large businesses have partners that link back to them; they are also in the news on a regular basis (for good or bad), which attracts more links; people talk about them on forums and link back to them; they sponsor lots of events and they do digital PR.

This is where you hope to see the majority of the links in a client's link profile. If you are working with a big brand you will sometimes see millions of links coming in organically without anyone on the client's side trying to build them; for a client like that you can focus more on on page SEO if the competition analysis shows there is no big gap in the link profile.

Editorial links:

Editorial links are links you gain by having content on your website that is good enough for bloggers and news websites to link back to. This type of link is very valuable, as it comes in most cases from content rich pages and is normally contextual, with a keyword rich anchor text (depending on the content you have).

Examples of content that can attract editorial links:

  • Original research and surveys
  • Case studies, guides, white papers and infographics
  • Interviews
  • Funny or Entertaining original materials
  • Tools and calculators
  • Blog posts if you are the authority in your space

Please note that in some cases, especially when you do not have your own user base (subscribers or followers), you need to do some content marketing to make interested people aware of this content. Here are a few content marketing activities you can do:

  • Blogger outreach (mainly relevant blogs) and influencer outreach in your space; there are many tools and search queries to find them, and a nice email explaining how this content can benefit their users goes a long way
  • Social media marketing: promote your content on your social media accounts and give it a boost using paid social media advertising like Facebook; the level of targeting you can reach with Facebook is unprecedented and can put your content quickly in front of the right people, and native advertising with websites like Taboola can be an option too
  • Share it on relevant forums where possible, without too much self promotion; check the guidelines of each forum, as sometimes sharing third party content, even if it is very useful, is not allowed

Manually created links (not editorial, but also not paid)

Manually created links are not always against Google's guidelines, but in some cases they are: when you artificially build a dofollow link you control the anchor text, and if you use a keyword rich anchor text that exactly matches your target keyword, you could be violating Google's guidelines.

Popular forms of manually acquired links in 2019:

  1. Digital press releases
  2. Guest posts
  3. Forum signatures
  4. Embeddable widgets and infographics

With artificial link building you can easily drift into the gray or black area and violate Google's guidelines; two rules can keep you safe from doing that:

  • Make sure that manually created links do not contribute a large percentage of your link profile
  • Avoid using keyword rich anchor text while you are doing link building

Paid links:

Exchanging links for money is totally against Google's guidelines, yet many people are still doing it, and lists of websites selling links based on DA still circulate the web. I do not recommend buying links; however, if you decide to do it in the future, keep the same quality standards you use for outreach: the website should have organic traffic, add value to users and be relevant to your website.

Machine generated links (automated):

These links are generated by software and are totally against Google's guidelines. The software targets websites that allow user generated content (mainly forums and blogs) and submits auto generated content with links. This was a big thing back in the day, but it has almost zero impact on ranking as of 2019, and the chance of seeing a client with this type of link is very low now, unless a competitor is attempting negative SEO, which also has a very low chance of working.

XRumer, SEnuke and ScrapeBox are popular tools used for auto generated links; their main targets are blog comments, forums and social media profiles.

 

Branding and reputation signals:

Google has never admitted to using branding signals as a ranking factor, but that is hard to believe. Let us check what branding signals could be available to Google:

  • Mentions, whether of the brand name or the domain name, without a link
  • Links with the brand as the anchor text
  • The search volume of the brand name

All the signals above are almost foolproof for Google. Even if Google is not using them for now, they are very good comparison points for competition analysis, especially brand name search volume: if your competitor's brand name gets 10 times more searches than yours, they are doing a better job at marketing and will most likely rank better in search engines for different reasons, one of them being that they attract more natural links, as people tend to talk about and link to big brands more.

Social signals as a ranking factor:

I believe Google when they say they do not use social signals like likes and shares in their ranking algorithm, but I know for sure that when content gets good exposure on social media it tends to attract more links. In fact, it is very common for content marketers to boost their content on social media outlets like Facebook, hoping that users with the power of linking (bloggers, for example) will find it and link back to it.

Online reputation and customer reviews, GMB (Google My Business):

When you do any local search, like city + dentist, you will see three local results above the organic results, which we call the 3-pack. The 3-pack includes business information like location and phone number, and it also includes reviews. Google has not admitted to using reviews as a ranking factor, but it is hard to believe that Google would have no problem ranking a one star business in the 3-pack, or even on the first page organically.

 

The link building process (mainly for blogger outreach)

Find out whether your customer needs links or not, at least at the beginning; you can do a quick link profile comparison using Majestic or Ahrefs to see if there is a link gap. In most cases, especially when you are working with big brands, link building will not be your low hanging fruit; you will be better off starting with on page optimization and a technical SEO audit.

Explore your linkable assets: building links to the company's home page is not easy and will not look natural, while building linkable assets like case studies or infographics can open the door for editorial links and make your blogger outreach easier. If you find ready-to-go linkable assets you can start your outreach immediately; otherwise you need to build them first.

Start link prospecting using tools like Pitchbox, BuzzStream or Google (search for keyword + blog), then build your list of blogs to reach out to.

Write a personal, friendly email explaining how the blogger's users will benefit from linking to this content.

Next step: interactive on page optimization and content maintenance

Tracking & Analytics

Tracking and Analytics

"If you can not measure it, you can not improve it" results measurement is vital in any marketing campaign, luckily when it comes to online marketing there is almost nothing that could not be measured. There are many metrics that SEOs need to keep eye on for different reasons:

  • They are ranking factors (e.g. website speed)
  • They are success signals (e.g. more organic traffic)
  • They are alarming signals that need to be investigated (e.g. ranking with Google is declining or GSC sending error notifications)
  • They present opportunities (e.g. keywords that rank on the second page can easily make their way to the first page with some optimization)

There are a lot of SEO KPIs to track, so before starting any project you need to sit down with the project's stakeholders and discuss which KPIs need to be tracked. Having said that, there are many standard (universal) SEO KPIs that most SEOs track, and I will focus on them in the next section.

Organic traffic tracking metrics:

The metrics below can be tracked using Google Analytics (GA), Google Search Console (GSC) and Google My Business (GMB). Please note that GSC and GMB do not save a long history of data, so you need to make sure you have a solution that stores the data permanently.

  • Number of organic landing pages (GA)
  • Head (top) keywords ranking (GSC or third party tools)
  • Average position/ranking for all keywords (GSC)
  • Number of indexed pages (GSC)
  • Pages with errors (GSC and GA)
  • CTR (click through rate) (GSC)
  • Bounce rate (GA)
  • Time on site (GA)
  • Pages per session (GA)
  • Number of keywords ranking on the first page (GSC with some processing, see the sketch after this list)
  • Pages that attract zero traffic (GSC + site crawling data)
  • Pages and keywords that are losing traffic (GSC)
  • Google My Business (GMB): impressions, directions, phone calls and clicks
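
The "some processing" mentioned above can be as simple as filtering a GSC performance export by average position:

  // rows: [{ query: 'seo toronto', position: 8.2 }, ...] exported from GSC
  function firstPageKeywordCount(rows) {
    return rows.filter(function (row) { return row.position <= 10; }).length;
  }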

It is very important to understand the average values of the metrics above for your industry and to set objectives/forecasts for every new project, with a clear plan to achieve those forecasts.

Website performance, UX, CRO and ROI

There are a lot of metrics you can create and track that help with UX and CRO.

For UX, many of them were mentioned above:

  • Bounce rate
  • Time on site
  • Number of pages per session
  • SERP CTR
  • Event tracking (e.g. video play duration and links clicking)

For CRO:

  • Goal tracking (e.g. tracking form submissions)
  • Ecommerce tracking for websites that sell online products
  • Connecting CRMs to Google analytics to track sales (e.g. connect SalesForce to Google Analytics)
  • Audiences building and tracking
  • Phone calls tracking by medium/source
  • Clickable phone number event tracking (this is for mobile users who tap to call a phone number, see the example after this list)
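
A minimal click-to-call tracking example for the last item above (it assumes gtag.js is already installed on the page; the phone number and labels are placeholders):

  <a href="tel:+14165550123"
     onclick="gtag('event', 'click_to_call', { event_category: 'phone', event_label: 'header' });">
    (416) 555-0123
  </a>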

It is very important to report on the financial success of the SEO campaign. One way to do that is ROI (return on investment), which is simply ((Monthly Earnings - Monthly SEO Retainer) / Monthly SEO Retainer) x 100; some people go with revenue instead of earnings. While doing this you need to consider the customer lifetime value: for a client that offers web hosting, where customers rent servers and keep them almost for life, the monthly earnings or revenue will not reflect the actual value of the customer. If they are selling $200/month dedicated servers at a 30% profit, then the earnings from one sale should be 0.3 x 200 x (an average customer lifespan of, say, 36 months) = $2,160. So if you are paid $1,080/month for your SEO services and you are able to generate one sale for this client, the ROI of the SEO services is 100% in this case.
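
A quick check of the arithmetic above as a small helper (using the same numbers as the example):

  function seoRoi(earnings, retainer) {
    return ((earnings - retainer) / retainer) * 100; // ROI as a percentage
  }
  var lifetimeEarningsPerSale = 0.3 * 200 * 36; // $2,160 for one hosting sale
  seoRoi(lifetimeEarningsPerSale, 1080);        // 100 (%)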

Another way to justify ROI, if you do not have access to the client's financial data, is to go by the click value. If you have a personal injury lawyer client, where a click through Google Search Ads can cost $100 or more for some keywords, being able to drive 50 clicks per month from relevant keywords can justify an SEO spend of $2,000 or more.

Competition analysis:

Picking the right tools for competition analysis is key to coming up with meaningful insights. To keep the comparison apples to apples, you cannot use GSC to measure your own traffic and then use SEMrush to measure competitors' traffic; just use the same tool for everyone.

We are very lucky to live in an era where tools like SEMrush and Ahrefs provide very useful link graph and traffic data for almost any website on the web. The key metrics to track while doing competition analysis are:

  • Referring domains
  • Inbound links
  • Organic traffic
  • Ranking for highly searched keywords
  • Number of keywords on the first page
  • Number of indexed pages

All of them are available in SEMrush, with the exception of the number of indexed pages, which you can track using Google's site:domain.com operator.

The most popular tools used by SEOs for tracking key metrics:

  • GMB (Google My Business)
  • GSC (Google Search Console)
  • GA (Google Analytics)
  • GTM (Google Tag Manager)
  • SEMrush
  • Ahrefs
  • Google Data Studio

Offering tracking services:

With tools like GA and GTM, tracking is becoming a service of its own that SEOs can provide to their clients. Here are a few areas where clients need help:

  • GA optimization: Google recommends creating multiple views in each property, and each view needs its own filters and goals
  • Mapping events and goals: after a long time using the same view, with many parties adding their own filters, goals, audiences and customizations, someone will need to map all of it and do some cleanup
  • Many clients end up with hundreds of tags and triggers in their GTM container, where it becomes very difficult to navigate and understand, so it needs some mapping and cleanup as well
  • Form submissions and other complex event tracking need GTM and GA experts to implement them (see the sketch after this list)
  • Ecommerce tracking is normally difficult for many webmasters to implement on their own unless it is natively supported by the shopping cart software (e.g. Shopify)
  • Creating GA and GDS custom dashboards
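
For the form tracking item above, a typical pattern is to push an event to the GTM dataLayer when the form is submitted and then fire a GA tag on that trigger (the form id and event name below are placeholders):

  <script>
    document.getElementById('quote-form').addEventListener('submit', function () {
      window.dataLayer = window.dataLayer || [];
      window.dataLayer.push({ event: 'form_submission', formName: 'quote-form' });
    });
  </script>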

Gaining advanced knowledge of GA, GSC and GTM is vital for your SEO career; make sure you check the available certificates here, which can help you gain and validate the tracking knowledge you need.

On Page SEO

Winning Content and Professional Design

Content creation might not be the specialty of SEOs (link building used to be able to get mediocre content to rank), but the updates Google has made in recent years have turned content creation into the backbone of SEO. Nowadays, gaining high rankings by relying on traditional on page and off page SEO tactics without winning content is not possible, or at least you will be facing a headwind; the opposite also applies: winning content is a tailwind and requires less SEO effort.

What is winning content?

When you are trying to rank for a specific keyword, winning content is content that answers the search query quickly with a good user experience. There are two ways to figure out what winning content looks like:

  • Put yourself in the user's shoes and ask yourself whether this content gives you the best answer for what you searched for
  • Google the keyword you are creating content for and visit the top 5 or 10 results to see what Google is favoring

Content comes in different formats that go beyond text: images, videos and applications. Let us take the keyword "mortgage amortization schedule"; winning content for this keyword would be a page that has:

  • A web app (a mortgage amortization calculator using JavaScript, see the sketch after this list)
  • A chart showing the payment schedule (image format)
  • Advice about the best amortization rate and payment schedule (text content)
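
A bare-bones version of the web app part could start from the standard amortization formula (simple monthly compounding; a real Canadian mortgage calculator would also handle semi-annual compounding):

  // principal in dollars, annualRate as a decimal (e.g. 0.035), term in years
  function monthlyPayment(principal, annualRate, years) {
    var r = annualRate / 12; // monthly interest rate
    var n = years * 12;      // number of payments
    return principal * r / (1 - Math.pow(1 + r, -n));
  }
  monthlyPayment(300000, 0.035, 25); // ≈ $1,502/month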

E-A-T (Expertise, Authoritativeness, Trustworthiness) and content

This term is taken from Google's quality rater guidelines, where human raters are asked to evaluate a website's expertise in the topic, its authority and its trustworthiness. It is easy to conclude that achieving a good E-A-T score is impossible without creating good content.

Please note that E-A-T is not a ranking factor, but historically Google has always started a process with humans and then built an algorithm to replicate it. A good example is the Penguin update launched in 2012: Google started issuing manual actions through human reviewers before making that process machine driven with the Penguin update.

The process of building winning content:

Before creating any page you need to decide its purpose; finding one or two target keywords is a good start. Keywords are questions users ask search engines, and a search engine's main interest is putting the best answer at the top (that is how they keep users happy and coming back). If your content is not the best, or among the best, at answering the user's question, then you are eventually doomed to fail; even if it is working fine now, it will not be after the next algorithm update.

Content creation considerations:

  • How much text do you need to answer the user's question: long form or short form?
  • Do you need media content like videos, audio, images, infographics and apps?
  • The page that contains the content has to be part of a professionally designed website that provides a good user experience
  • The content must be well formatted and easy to read
  • Add new images and be sure to optimize those images with alt tags and other image optimization tactics.
  • Do you need to include case studies and data?
  • Do you need to optimize for enhanced SERPs where possible?

Why is winning content a tailwind?

  • Winning content doesn't mean your page will automatically rank at the top of the SERPs, but it gives SEOs peace of mind that their efforts have a better chance of working, since it removes a major ranking impediment (the lack of winning content)
  • Winning content also has a better chance of being shared on social media and eventually attracting more links
  • Winning content gets a website noticed as an authority in its space, which eventually improves branding and reputation

Professional Design:

Hosting great content on a poorly designed website with a bad user experience will cause multiple issues:

  • Fewer people will share a poorly designed website
  • It reduces the trustworthiness of the content
  • It will negatively affect the user experience, which will be reflected in bounce rate, time on site and pages per session

Conclusion:

Before taking on any SEO project, make sure the website is professionally designed (good looking, easy to use, with a nice logo) and make sure it has winning content for the target keywords; otherwise you will be facing a headwind with your SEO efforts, with a good chance of missing the SEO campaign's objectives.

If you feel good about the website and the content, let us move to the next step: keyword research and content mapping.