Technical SEO

Website Redesign and Migration Checklist to Prevent Ranking, Traffic and Conversion Loss

The web evolves at a fast pace, and websites need to keep evolving to meet ever-growing user expectations. That includes upgrading the infrastructure (i.e. hosting and CMS) and improving the visual design of the website.

Unfortunately, a website redesign is not a risk-free process. When launching a new website there is a risk of:

  • Losing organic search engine traffic
  • A drop in conversion rate
  • Data loss due to mistakes in migrating the tracking pixels
  • Poor user experience, manifested in metrics like bounce rate

It is strongly recommended to conduct a pre-launch audit before launching a new website. The list below includes the items with the highest impact on organic search specifically, and digital marketing performance in general, when they are changed:

  1. URL changes
  2. Tags changes (title tags, description tags, ALT and Hx tags)
  3. Content changes and content removal
  4. Structured data changes (that includes schema, canonical and social mark-ups)
  5. Design changes (mainly above the fold area)
  6. Usability, UX, accessibility (WCAG) and CRO
  7. Crawlability and indexability
  8. Navigation menu changes
  9. Website speed and performance
  10. Tracking changes

I will take you through each of these items and describe in detail how to check whether they changed in a way that can negatively affect search engine traffic or user experience.

One rule to remember while doing a website redesign: "minimize the variables". Where possible, try to keep things the same (i.e. content, title tags, description tags); that will make traffic loss diagnosis easier if the new website happens to underperform the old one.

URL Changes:

The whole search engine ranking system is built around the URL; it is used as an index to rank content and assess authority (mainly backlinks). Changing a URL, even with the content staying the same, without a proper 301 redirect can cause a significant traffic loss. Work closely with the web developer to see if it is possible to keep the URLs unchanged. If they have to be changed, map the old URLs to the new ones, and ask the developer to implement a 301 redirect solution from the old URLs to the new ones.
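If the site runs on Apache, a minimal sketch of that redirect mapping in .htaccess could look like this (the paths below are hypothetical examples, not taken from any real site):

```apache
# Hypothetical old-to-new URL mapping: one permanent (301) rule per changed URL
Redirect 301 /old-services.html /services/
Redirect 301 /blog/old-post /blog/new-post/
```

Whatever the implementation, verify that each old URL returns a single 301 hop straight to its final destination, not a chain of redirects.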

Tags changes (title tags, description tags, ALT and Hx tags)

Keyword-rich title tags and H1 tags can help the content rank better for the targeted keywords; changing them in a way that makes them less relevant to the targeted keywords can cause a traffic loss.

Title tags and description tags are used in most cases by search engines in the SERP. They should be written like an ad, in a way that attracts people to click on the listing in the SERP; changing them may bring CTR down.
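For reference, these are the tags to compare between the old site and the staging site (the values below are hypothetical placeholders):

```html
<head>
  <!-- Compare these between the old and staging site; keep them identical where possible -->
  <title>Blue Widgets | Example Store</title>
  <meta name="description" content="Shop durable blue widgets with free shipping.">
</head>
<body>
  <h1>Blue Widgets</h1>
  <img src="widget.jpg" alt="Blue widget on a white background">
</body>
```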

The safest bet in a transitional stage is to keep them the same, which will reduce the variables so diagnosing a potential traffic loss becomes easier.

Content changes and content removal

Removing text content from pages, or changing content in a way that makes it less relevant to targeted keywords, will have a negative effect on ranking. The same applies to other forms of content like photos and videos.

Structured data changes

Structured data like Schema is a very helpful tool for search engines to understand the content on your pages better and quicker. You can embed specific types of content like an address, reviews, recipes and more in structured data code, and in some cases search engines will use that code to add extra features to the SERP (e.g. the 5 stars you sometimes see in the SERP).

It is very common for developers to drop that code while redesigning a website, so make sure to check it on the staging website using a schema testing tool.
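As a quick sketch, review mark-up like the following (hypothetical values) is exactly the kind of code worth diffing between the old and staging sites, since dropping it silently removes the star rating from the SERP:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Blue Widget",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.8",
    "reviewCount": "132"
  }
}
```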

Design changes

It is possible for design and layout changes to cause issues for a website's ranking and user experience. Pay a lot of attention to the above-the-fold area and make sure:

  • The content above the fold is not changing, especially the content that is related to the targeted keyword
  • The call-to-action elements placed above the fold are not being removed

Usability, UX, accessibility (WCAG) and CRO

Design changes (layout and elements repositioning) can impact many usability elements such as forms, payment process, and funnels. Usability elements may not affect search engines a lot but they will have a significant impact on user experience and eventually conversion rate.

A soft launch is a good tool to assess usability and conversion rate changes; you can start with a 25% exposure of the new design and monitor the KPIs between the old and the new website (this almost becomes an A/B test).

An accessibility (WCAG) check-up can be done using a tool like WAVE.

Crawlability and indexability

Search engines crawl and index mainly text content that is presented in plain HTML code, and to a lesser degree text content that is presented in JavaScript. Make sure to go through the crawlability and indexability checklist below with the developer:

  • Make sure there are no directives in robots.txt (like disallow) that prevent search engines from crawling the site
  • Check the source code for noindex tags
  • Make sure the website has a dynamic site map submitted to Google Search Console
  • Make sure text content is not included in images
  • If you are using a JavaScript framework like Angular, avoid single page applications where the URL doesn't change when users move between pages.

The best tool to check for crawlability and indexability is the Google Mobile-Friendly Test; check the HTML source code section and make sure all the text content that you want search engines to see is visible there.
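The robots.txt rules can also be sanity-checked programmatically. A minimal sketch using Python's standard library (the rules and URLs below are hypothetical; a blanket "Disallow: /" is a common staging-site leftover):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that accidentally blocks everything --
# often a leftover from the staging environment
robots_txt = """User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Every crawler, including Googlebot, is now locked out of the site:
print(parser.can_fetch("Googlebot", "https://example.com/products/"))  # prints False
```

Running this against the staging site's actual robots.txt before launch catches the mistake before search engines do.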

Navigation menu changes

The upper navigation of the website has two important roles to play, one of them is SEO and the other one is user experience.

From an SEO perspective, search engines give more weight to pages that are linked from the upper navigation, as those pages are linked site-wide, and the anchor text used in the upper navigation gives search engines some context regarding the destination page. Removing important pages from the upper navigation or changing the anchor text may have a negative impact on ranking.

From a UX standpoint, links in the upper nav should be organized in a way that makes key pages easy to access for users, at the same time helping the website to achieve its business objectives.

Website speed and performance

Website speed and performance, measured by Core Web Vitals, can be negatively impacted by changing the CMS or the design. Measuring Core Web Vitals and making sure performance is at least staying the same, if not improving, will be key to maintaining a good ranking (as Core Web Vitals are ranking factors now) and keeping a good user experience (maintaining and improving conversion rate).

Tracking changes

It is very normal for a website to have a lot of tracking pixels (i.e. Google Analytics, Facebook, LinkedIn and more). Sometimes they are hardcoded in the source code and other times they are managed using tools like Google Tag Manager. Removing any of those pixels/tags will cause a data flow interruption, so it is very important to migrate all of these pixels to the new website.

Sometimes even keeping those pixels may not be enough: firing rules can depend on URLs and HTML element names, so changing those will also interrupt tracking.

Also, tracking tools like Google Analytics have settings to track goals and events that will be impacted by changing URLs and HTML code.

It is very important to make sure there are no data/tracking interruptions when moving to the new website.


A website redesign or upgrade will be unavoidable at some point; what can be avoided with good preparation is:

  • Traffic loss
  • User experience deterioration
  • Tracking and data flow interruption

This post focuses mainly on the pre-launch audit, but a post-launch audit is also very important; the most important checkpoints for that are:

  • Google Search Console traffic, impressions and average ranking loss
  • Google Search Console coverage section (errors, indexability and crawlability metrics)
  • Google Analytics conversions loss and bounce rate decline
Hosting Technical SEO

How To Block Link Crawlers like Majestic, Ahrefs, Moz and SEMrush

The web has a lot of web crawlers. Some of them are good and vital for your website, such as Googlebot; others can be harmful, like email harvesting crawlers and content scrapers. Link crawlers fall short of harmful but far from useful: they are not useful for your website, and they are not harmful in the sense of trying to scrape content, but they could be consuming your server resources with no benefit.

For SEOs that adopt black hat tactics like PBNs (private blog networks), those crawlers are a nightmare and can expose the network to competitors if left open. In most cases that will lead to a spam report causing the whole network to be de-indexed, plus a manual action applied to the money site, if not a total deindexation.

The most popular link crawlers are Majestic, Ahrefs, Moz and SEMrush. Please note that their crawlers' user-agents will not always match their brand name and can change in the future, so it is very important to keep an up-to-date list of the user-agents used by those crawlers. I will list below different ways to block them:


Using robots.txt:

You can add a few lines to your robots.txt file to disallow the most popular link crawlers:

User-agent: Rogerbot
User-agent: Exabot
User-agent: MJ12bot
User-agent: Dotbot
User-agent: Gigabot
User-agent: AhrefsBot
User-agent: SemrushBot
User-agent: SemrushBot-SA
Disallow: /

The method above will be very effective assuming:

  • You trust those crawlers to obey the directives in the robots.txt file.
  • The crawlers do not keep changing their user-agent names.
  • The companies that operate those crawlers do not use third party crawling services that come under different user-agents.


Using .htaccess:

The issue with this method is that it requires your hosting provider to be Apache based. If your host supports .htaccess, you can use the code below to block the most popular link crawlers:

<IfModule mod_rewrite.c>
RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} (ahrefsbot|mj12bot|rogerbot|exabot|dotbot|gigabot|semrush) [NC]
RewriteRule .* - [F,L]
</IfModule>

This method is better than robots.txt, as the crawlers have no choice but to obey, assuming they are not changing their user-agents or using third party crawlers.

Using PHP:

If your website is built with PHP, like WordPress, you can add the code below to your header.php to block all link crawlers:

$badAgents = array('rogerbot', 'mj12bot', 'ahrefsbot', 'semrush', 'dotbot', 'gigabot', 'archive.org_bot');
foreach ($badAgents as $badAgent) {
    if (preg_match("/$badAgent/", strtolower($_SERVER['HTTP_USER_AGENT']))) {
        header('HTTP/1.1 403 Forbidden'); // block the crawler (the original snippet left the if-body empty)
        exit;
    }
}

This method is good if your server doesn't support .htaccess. If you are using this method, you also need to block the RSS feed feature in WordPress; you can do that by adding the code below to the functions.php file in your theme folder:

function wpb_disable_feed() {
    wp_die( __('No feed available, please visit our <a href="'. get_bloginfo('url') .'">homepage</a>!') );
}
add_action('do_feed_xml', 'wpb_disable_feed', 1);
add_action('do_feed', 'wpb_disable_feed', 1);
add_action('do_feed_rdf', 'wpb_disable_feed', 1);
add_action('do_feed_rss', 'wpb_disable_feed', 1);
add_action('do_feed_rss2', 'wpb_disable_feed', 1);
add_action('do_feed_atom', 'wpb_disable_feed', 1);
add_action('do_feed_rss2_comments', 'wpb_disable_feed', 1);
add_action('do_feed_atom_comments', 'wpb_disable_feed', 1);

Aggressive blocking (for PBN users):

If you are a regular webmaster who is willing to save some server resources by blocking link crawlers, applying any of the methods above should suffice; however, if you are a webmaster who wants to leave no chance for those crawlers to sneak in, you need to apply harsher measures.

Robots.txt will be as below:

User-agent: *
Disallow: /
User-agent: Googlebot
Allow: /

This will allow only Googlebot to crawl the website, assuming the crawlers obey the robots.txt directives. You can also allow other agents used by major search engines like Bing.

If you are using WordPress you can hide the links from all user-agents except Google using the code below in functions.php:

add_filter( 'the_content', 'link_remove_filter' );
function link_remove_filter( $content ) {
    // Strip anchor tags unless the visitor is verified Googlebot (user-agent plus reverse DNS check)
    if ( !preg_match('/google/', strtolower($_SERVER['HTTP_USER_AGENT'])) && !preg_match('/\.googlebot|google\.com$/i', gethostbyaddr($_SERVER['REMOTE_ADDR'])) ) {
        $content = preg_replace('#<a.*?>(.*?)</a>#is', '\1', $content);
    }
    return $content;
}
This code will allow only Google to see the links; it also verifies that the IP address belongs to Google and is not spoofed.

Make sure also to block RSS using the code listed in the previous step. The code above will not be affected by those crawlers changing their user-agents or coming in under different agent names.


Technical SEO

Technical SEO

Technical SEO refers to the optimization work done on a website's technical infrastructure (HTML source code, server side code, hosting, assets and more) to make it search engine friendly.

When it comes to technical SEO there are a few buzzwords that you always need to keep in mind:

  • Crawlability
  • Renderability
  • Indexability (Readability)

How do search engines work (mainly Google):

When search engines find a new web page they send their crawlers (software that behaves like a browser, installed on a powerful computer) to read the source code of that web page and save it to the search engine's storage servers, to be parsed and processed later.

The second step is parsing that source code, and possibly saving missing resources like images, CSS and other dependencies. If the crawler decides that the content in the source code is readable without rendering (which happens when the content is included in plain HTML or a simple JavaScript format), it will start processing it by turning that source code into structured data (think of an Excel sheet: columns and rows) that will eventually be turned into a database. Considering the size of the web, searching files and returning results quickly is impossible, but searching a database and returning results in less than a second is a possibility. In order for search engines to turn web content into a searchable database they need to isolate text from code; if they find content with Schema code (structured data), turning the content into a database entry is a lot easier, and HTML elements like title tags and description tags are also easy to process.
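As a toy illustration (not Google's actual pipeline), once text is isolated from the markup, pages can be inverted into a keyword-to-URLs index; a single dictionary lookup then replaces scanning every document, which is what makes sub-second search possible:

```python
from collections import defaultdict

# Hypothetical pages, reduced to URL -> extracted text
pages = {
    "https://example.com/a": "fresh roasted coffee beans",
    "https://example.com/b": "coffee brewing guide",
}

# Build the inverted index: keyword -> set of URLs containing it
index = defaultdict(set)
for url, text in pages.items():
    for word in text.split():
        index[word].add(url)

# Looking up a keyword is now a single dictionary read:
print(sorted(index["coffee"]))  # ['https://example.com/a', 'https://example.com/b']
```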

If the crawler decides that this is a JavaScript-heavy page that needs to be rendered to find the text content, a more advanced crawler will be sent later (sometimes within a few days) to render the page and get it ready for processing.

Renderability was not a thing when search engines started; it is a very resource intensive and slow process. But with more websites using advanced JavaScript frameworks like Angular (where the source code doesn't have any text content in some cases), search engines started to see a need to fully render the page in order to capture the content. The other benefit of fully rendering the page is understanding the location and the state of the content on the page (hidden content, above the fold or below the fold, etc.).

Currently the only search engine that does rendering well is Google. They send a basic crawler at the beginning (it can understand simple JavaScript but not frameworks like Angular), then they come back later with another crawler that can fully render the page; check this tool to see how a web page renders for Googlebot.

Once processing is completed, a functional copy of the page will be stored on the search engine's servers and eventually made available to the public through the cache:URL command. At this point the page will be fully indexed and able to rank for whatever keywords the search engine decides it is relevant to, based on the quality of the content and the authority of the website (in other words, the ranking algorithm).

Crawlability Optimization (Speed and Mobile Friendliness):

The first step in making a website crawlable is providing access points to the crawlers, like:

  • Site map
  • Internal links from other pages
  • External Links
  • RSS feed
  • URL submission to Google or Bing Search Console, or using their indexing API if it is available (only available for a few industries)

Discuss with your webmaster what happens when new content is added to the website, and make sure there is one or more access points available for the crawler to find that page (ideally the sitemap plus one or more internal links from prominent pages). A modern CMS like WordPress will provide access points automatically when adding a new post, but not when you add a new page; that is where you need to manually modify the information architecture to include a link to that page.

Each piece of content available on a website must have its own dedicated URL, and this URL must be clean and not fragmented using characters like #. A Single Page Application (SPA) is an example of a situation you need to avoid, where the whole website operates based on a single URL (normally the root domain, with the rest being fragment URLs). In this case technologies like AJAX (a mix of HTML and JavaScript) are used to load content from the database to answer any new page request (#anotherpage). Users will not see any issue with that, but search engines will not be able to crawl the website, since they use dedicated URLs as keys to define a page in their index, and that key is not available here because they totally ignore fragment URLs.
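The fragment problem is easy to demonstrate: everything after the # never reaches the server, so a fragment "page" has no URL of its own. A small sketch with Python's standard library (the URL is hypothetical):

```python
from urllib.parse import urlsplit

# A fragment-based SPA "page" -- the part after # is client-side only
url = "https://example.com/#anotherpage"
parts = urlsplit(url)

print(parts.path)      # prints '/' -- the server (and the crawler's index key) only sees the root
print(parts.fragment)  # prints 'anotherpage' -- handled entirely by client-side JavaScript
```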

Search engines have a limit on the number of pages they can crawl from a website, per session and in total; they call it the crawl budget. Most websites (with fewer than a million pages and not a lot of new content added every day) do not need to worry about that. If you have a large website, you need to make sure that new content is getting crawled and indexed quickly; providing strong internal links for new pages and pushing them to the sitemap quickly can help a lot with that.

URL management: Many websites use parameters in URLs for different reasons; many eCommerce websites use parameters to provide pages for the same product in different colours or sizes (faceted navigation). Sometimes search engines will be able to index those pages and end up with an infinite number of pages to crawl, which can create a crawling issue and also a duplicate content issue. Search engines provide webmasters with different tools to control crawlability and indexability by excluding pages from crawling; ideally, if the website is structured well, there will be less need to use any of the tools below to influence crawlability:

  • Robots.txt, a file located at the root of your website where you can provide rules and directions to search engines on how to crawl the website; you can disallow search engines from crawling a folder, a pattern, a file or a file type.
  • Canonical tags, <link rel="canonical" href="" />; you can place one in page B's header to tell search engines that the page with the original content is page A, located at that canonical URL. Using canonical tags is a good alternative to 301s as they do not need any server side coding, which makes them easier to implement.
  • Redirects; I mean here the server side redirects (301 for example), which are used to tell search engines that page A was moved to page B. This should be used only when the content on page A was moved to page B. It can also be used when there is more than one page with very similar content.
  • Meta refresh, <meta http-equiv="refresh" content="0;URL=''" />, normally located in the header area; it directs browsers to redirect users to another page. Search engines listen to meta refresh, and when the waiting time is 0 they treat it like a 301 redirect.
  • Noindex tags, <meta name="robots" content="noindex">; they should be placed in the header of a page that you do not want search engines to index.
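Putting two of these together, a parameter-driven duplicate page might carry mark-up like this in its head (the URL is a hypothetical example; use one signal or the other, not both at once):

```html
<head>
  <!-- Either: point search engines at the preferred version of this page -->
  <link rel="canonical" href="https://example.com/widgets/">
  <!-- Or: keep the page out of the index entirely -->
  <meta name="robots" content="noindex">
</head>
```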

The final thing to optimize for crawlability is website speed, which is also a ranking factor. A few quick steps you can take to have a fast website:

  • Use a fast host, always make sure you have extra resources with your host, if your shared host is not doing the job just upgrade to a VPS or a dedicated server it is a worthwhile investment
  • Use a reliable fast CMS like WordPress
  • Cache your dynamic pages (WordPress posts for example) into an HTML format
  • Compress and optimize images
  • Minify CSS and JavaScript

To test your website you can use different speed testing tools listed here

Google Bot For Mobile:

With mobile users surpassing desktop users a few years ago, mobile friendly websites are becoming more important to search engines and web developers. Search engines like Google have created a mobile crawler to better understand how the website is going to look for mobile users; when they find a website ready for mobile users, they set their mobile crawler as the default crawler for this website (what they call mobile-first). There are a few steps you can take to make sure your website is mobile friendly (for both users and crawlers):

  • Use responsive design for your website
  • Keep important content above the fold
  • Make sure your responsive website is error free; you can use GSC for that, or the Google Mobile-Friendly Test
  • Keep the mobile version as fast as possible, if you can not do that for technical or design reasons consider using Accelerated Mobile Pages (AMP)

Renderability Optimization:

The best SEO optimization you can do for renderability is to remove the need for search engines to render your website with a second (advanced) crawl. If your website is built on an advanced JavaScript platform like Angular, it is strongly recommended to give crawlers like Googlebot a pre-rendered HTML copy of each page, while regular users still get the Angular version of the website; this practice is called dynamic rendering. It can be done using a built-in feature with Angular Universal or using third party solutions.

Do not block resources (CSS, JavaScript and images) that are needed to render the website. Back in the day, webmasters used to block those resources using robots.txt to reduce the load on the server or for security purposes. Inspect your website using Google Search Console, and if you see any blocked resources that are needed for Google to render the website, discuss with your web developer how you can safely allow those resources to be crawled.

Indexability and Readability (Schema):

When a website is optimized well for crawlability and renderability, indexability will almost take care of itself. The key point for indexability is providing a page with a dedicated clean URL that returns unique content with substance and loads fast, so search engines can crawl it and store it on their servers.

Content that can cause indexability issues:

  • Thin content, as it may not be kept in the index
  • Duplicate content
  • Text content in images, text in SWF files, text in video files and text in complex JavaScript files; this type of content will not make its way to the index and will not be searchable in Google

Content that can help search engine in parsing and indexability:

  • Structured data, mainly Schema, can help search engines turn content into a searchable database almost without any processing; eventually that will help your website get rich results in the SERP (an example is the five-star review snippet that Google adds for some websites)
  • Using HTML markup to organize content (i.e. <h2>, <strong>, <ol>, <li>, <p>) will make it easier for search engines to index your content and show it when applicable in their featured snippets like the answer box.
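For instance, a step-by-step answer marked up with plain HTML elements (the content here is a hypothetical example) is easy for search engines to parse and reuse in featured snippets:

```html
<h2>How to change a flat tire</h2>
<ol>
  <li><strong>Loosen</strong> the lug nuts before lifting the car.</li>
  <li>Raise the car with a jack.</li>
  <li>Swap the flat tire for the spare and tighten the nuts.</li>
</ol>
```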

Monitoring and errors fixing:

Continuous monitoring of a website's crawlability and indexability is key to avoiding any situation where part of the website becomes uncrawlable (it could be your webmaster adding a noindex tag to every page on the website). There are different tools that can help you with that:

  • Google Search Console (GSC): after verifying your website with GSC, Google will start providing you with feedback regarding your website's health with Google. The index coverage report is the most important section in the dashboard to keep an eye on to find out about crawlability and indexability issues. Google will send messages through the message centre (there is an option to forward them to your email) for serious crawlability issues
  • Crawling tools: SEMrush, Ahrefs, Oncrawl and Screaming Frog can be helpful for finding errors
  • Monitor 404 errors in Google Analytics and GSC; make sure to customize your 404 error pages and add the words "404 not found" to the title tag, so it becomes easier to find 404 error pages using Google Analytics
  • Monitor indexability: check if the number of indexed pages in GSC makes sense based on the size of your website (it should not be too big or too small compared to the actual number of unique pages you have on your website)
  • Monitor renderability using the URL inspection tool in GSC; make sure Google can render the pages as close as possible to how users see them, and pay attention to blocked resources that are required to render the website (the URL inspection tool will notify you about them)


Index coverage monitoring and analysis definitely needs to be a service that you offer to your clients as an SEO specialist, a GSC monthly or quarterly audit is strongly recommended.

Next step: user experience and conversion rate optimization


Technical SEO Tracking & Analytics

Daily Reading to Stay up-to-date With the SEO Industry Changes

SEO is a very dynamic industry that changes almost on a daily basis; there is something new to read or learn about every day. It is strongly recommended to read, watch videos or listen to podcasts for an hour or more every day to stay on top of all changes related to SEO specifically and internet marketing in general. I will list below the websites (mostly blogs) that I check on a daily basis for news and updates. The links below will take you directly to their RSS feed pages so you can add them to any reader of your choice (I personally use a reader that has a Google Chrome plugin, a phone app and a web-based interface); recently both Firefox and Chrome added built-in RSS reading features.


The websites in my subscription list are:

  1. SEO Round Table
  2. Search Engine Land
  3. Search Engine Journal
  4. MOZ Blog
  5. Search Engine Watch
  6. Ahrefs Blog
  7. SEMrush Blog
  8. PPC Hero
  9. Distilled
  10. Webmaster World
  11. Bright Local
  12. Google Ads
  13. Google Analytics
  14. Simo Ahava
  15. WebKit  (the technology behind Safari browser)
  16. Chromium Blog
  17. The Mozilla Blog



Technical SEO Tracking & Analytics Uncategorized UX and CRO

Best Tools to Evaluate Website Speed

When it comes to measuring website speed I recommend:

1- Testing the home page plus a few other key pages from the website

2- Testing at different times of the day and different days of the week

3- Testing using more than one tool (at least two)

Web Page Test:

Available from multiple locations, on multiple devices and using different internet speeds; provides a speed index (the time required for the site to visually load for users even if there is still processing going on in the background).

This tool provides insights on how to speed up the website, a report, and a video showing the load progress.

Google PageSpeed Insights:

The tool has become more valuable after adding Lighthouse data and Google Chrome data (not available for every website). Be aware that the score is not speed; speed is measured in seconds only, and seeing a very low score does not matter a lot if your web page loads in 3 seconds or less.

This tool provides insights on how to speed up the website, a report, a video showing the load progress and an industry comparison.


Available from multiple locations, on multiple devices and using different internet speeds, this tool can track speed history (a paid feature), which is handy for evaluating the website speed throughout the whole day or the whole week.

This tool provides insights on how to speed up the website, a report, and a video showing the load progress.

Test my Site By Google

This tool is designed to analyze the speed of mobile websites on a low speed connection (3G); it provides insights on how to speed up the website, a report and an industry comparison.

Google Developer Tools (Advanced)

This is a built-in feature of Google Chrome; it has the ability to change connection speed and device, and to disable/enable cache.


Google Analytics (the numbers there are not very reliable)

GA provides average page load time in seconds. I did not find it that reliable, possibly because it averages numbers from different users, and it works based on code load completion, which is not always a reflection of the actual page load.

Off Page SEO On Page SEO Technical SEO Uncategorized UX and CRO

SEO Training Courses and Conferences

The SEO certificates post includes a lot of learning resources, as all the certificates require you to go through some training before you can take the exam. If you want a faster route, in a case where you are applying for a job that needs some SEO knowledge (not an SEO specialist job), you can find many online resources that cover the SEO fundamentals and give you a good jump start in your SEO knowledge.

Search Engine Optimization Starter Guide (by Google): this should be the first document you read; it covers the basics of on-page and technical SEO.

Google Quality Guidelines: this is a very important one to read, especially if you are planning to be aggressive in your link building efforts.

Google Quality Raters Guidelines: Google uses quality raters (humans) to evaluate their search results so their engineers can improve them. What we have learned about Google's algorithm throughout the years is that it will always try to replicate human quality judgment; reading this document will give you an idea of where Google's algorithm is going in the future.

SEO Learning Center (by Moz): similar to Google's starter guide.

Google Best Practices (mainly for ads): this is Google's best practice document for ads; quality guidelines for ads apply in most cases to SEO as well, which makes this document worth reading even for SEO specialists.

Conferences to attend:

Going to conferences to learn SEO is not going to give you the best ROI; however, going there to network and meet new people is the investment you should be looking for.

Another benefit of going to those conferences is the status and credibility they give you with your clients (especially the big ones). Major search engines like Google send speakers to many of those conferences, so you will have a chance to hear it from the horse's mouth; then you can back your SEO recommendations by telling your clients, for example, "I heard Google say this at SMX Advanced".

Technical SEO Tracking & Analytics

Enhance Your SEO Resume with Those Internet Marketing Certificates


In this post I am going to cover which certifications can help you land your next SEO job (I will add another post for training courses). Most of the certificates below have online training sections that you need to go through before taking the exam; if you pass, you will be granted a digital certificate that you can print and hang in your office, and you will also get a web page that you can add to your LinkedIn profile. With the education system falling behind when it comes to digital marketing, the certificates below will give you an instant advantage with any potential employer.

Google Analytics Academy

  • Google Analytics for Beginners
  • Advanced Google Analytics
  • Google Analytics for Power Users
  • Getting Started With Google Analytics 360
  • Introduction to Data Studio
  • Google Tag Manager Fundamentals

Google Mobile Sites certification

Google Partners

  • Google Ads Display Certification
  • Google Ads Mobile Certification
  • Google Ads Search Certification
  • Google Ads Video Certification
  • Google Shopping Certification
  • Digital Sales Certification
  • Google Ads Fundamentals

Facebook Core Competencies Exam

Facebook Planning Exam

SEMrush course SEMrush Academy

Hubspot inbound certification

MOZ Academy  (the essential SEO certificate is a good start)

If you are asking why you need paid search or marketing automation certificates when you will be applying for an SEO job: for most companies, SEO is one piece of a whole marketing landscape that in most cases includes social media marketing, PPC, SEO and marketing automation. SEO specialists work closely with the other digital marketing channels and need at least a basic understanding of how those channels work. The other benefit of having PPC certificates is that smaller companies tend to hire one in-house marketing individual to manage all their digital marketing channels; being familiar with the PPC channels will increase the number of jobs you can apply to.

SEO certificates are not a must for gaining SEO knowledge; all the information you need to learn SEO is available online. However, they can help you with three things:

  • As the education system has not caught up yet, they will give you some credibility and increase your chances of landing an SEO job
  • They will streamline your learning curve and test your knowledge
  • They will increase your commitment to learning SEO and pursuing it as a career, especially if you pay for some of those courses

Tools That You Need To Master If You Want to Be an SEO Expert

There are some tools and platforms that you need to master if you want to be an SEO expert. Some of these tools are used for monitoring and tracking; others make your work more efficient. Some of them are required for the SEO specialist role, and employers ask for them in the job posting.

Google Search Console, or GSC (Bing also has its own equivalent):

This tool has been growing for years and has become the most important tool for SEO specialists. Why is GSC that important?

  1. This is the only place where you can see which keywords receive impressions and clicks, their CTR, and where they rank
  2. The message centre is a great communication tool that Google uses to tell webmasters about issues and improvements for their websites
  3. The index coverage and crawlability information contains very valuable insights that help webmasters understand how Google crawls and indexes their websites
  4. A sample of backlinks is available in GSC

A dedicated post about Google Search Console is coming soon; I will make sure to link to it from this post.

Google Analytics (GA):

"If you cannot measure it, you cannot improve it." GA is the go-to tool when it comes to tracking traffic sources and users' interaction with a website. Key things you need to know how to do in GA:

  1.  Set up goals and track them per source
  2.  Understand and analyze bounce rate, time on site and pages per session
  3.  Analyze traffic by medium/source
  4.  Attribution models
  5.  Reporting on conversions

It is strongly recommended to become Google Analytics certified by passing the Google Analytics Individual Qualification (IQ) test.

Google My Business:

Things you need to learn:

  • How to submit and edit a business
  • How to monitor user interactions like reviews and phone calls
  • How to tag the website URL with UTM parameters so you can identify GMB traffic in GA
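To make that last point concrete, here is a minimal sketch of UTM tagging using only Python's standard library. The source/medium/campaign values (`google`, `gmb`, `gmb-listing`) are a common convention I am assuming here, not an official GMB requirement; use whatever naming scheme your GA reports follow:

```python
from urllib.parse import urlencode, urlsplit, urlunsplit

def tag_gmb_url(url, campaign="gmb-listing"):
    """Append UTM parameters to a business website URL.

    The parameter values below are a common convention for GMB
    listings, not a requirement; adjust them to your own naming scheme.
    """
    parts = urlsplit(url)
    utm = urlencode({
        "utm_source": "google",
        "utm_medium": "gmb",
        "utm_campaign": campaign,
    })
    # Preserve any query string the URL already has.
    query = f"{parts.query}&{utm}" if parts.query else utm
    return urlunsplit((parts.scheme, parts.netloc, parts.path, query, parts.fragment))

print(tag_gmb_url("https://example.com/"))
# → https://example.com/?utm_source=google&utm_medium=gmb&utm_campaign=gmb-listing
```

Paste the resulting URL into the GMB website field, and those sessions will show up in GA under the `google / gmb` source/medium instead of being lumped in with organic search.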

Link analysis tools:

These tools enable you to analyze the link profile of any website. Their most popular use is creating competitive analysis reports that help you understand the authority gap between your website and its competitors. The most popular ones are:

All of the tools above are paid; buying just one of them can suffice.

Keyword research Tools:

Learning about the client's business and then finding keywords relevant to that business is the starting point of any SEO project. There are many tools that can help with that, and all of them use Google's database:

  • SEMrush
  • Ahrefs
  • MOZ
  • Google Keyword planner

Website speed tools:

Website speed is a ranking factor for Google; more importantly, it can improve user experience and eventually increase conversion rate. It is important to monitor website speed on a regular basis, and the best tools for that are:

  1.  WebPageTest
  2.  GTmetrix (I like their speed-monitoring service)
  3.  Google PageSpeed Insights
  4.  Think with Google speed test
  5.  Google Analytics (the numbers there are not very reliable)

Mobile friendliness tools:

Mobile users surpassed desktop users a long time ago, and Google is following that trend by focusing more on mobile users; Googlebot desktop is being replaced by Google's mobile crawler for most mobile-ready websites. Having a mobile-friendly website that is fast and provides a good user experience is key to SEO success. Tools that can help with improving mobile friendliness are:

Structured data tools:

Major search engines use structured data (Schema.org is the most popular vocabulary) to gain a better understanding of content structure, as structured data provides content in a database-friendly format (almost ready to save to a database without processing). Once structured data is added to a website, there are many tools that can help you preview it and test it for errors:
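As a quick illustration of what "database-friendly" means here, the sketch below builds a minimal Schema.org `LocalBusiness` block as JSON-LD using Python's standard `json` module. The business name, URL and address are placeholders, and the property set shown is just a small subset of what Schema.org defines:

```python
import json

# Hypothetical LocalBusiness markup; all values are placeholders.
local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Bakery",
    "url": "https://example.com/",
    "telephone": "+1-555-0100",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Springfield",
    },
}

# This string would be embedded in the page inside
# <script type="application/ld+json"> ... </script>
json_ld = json.dumps(local_business, indent=2)
print(json_ld)
```

Because the markup is plain JSON with typed objects, a search engine can read the business name, phone number and address directly instead of inferring them from the page copy, which is exactly why the testing tools in this section can validate it field by field.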

Crawlability tools:

The first step any search engine takes is crawling the web; if content is not crawlable or reachable (via a link or a sitemap), search engines will not find it, index it or rank it. The best tools for finding crawlability issues:
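Before reaching for a full crawling tool, you can already sanity-check the basics with Python's standard library. The sketch below parses a hypothetical robots.txt (the domain and paths are placeholders) and checks whether a generic crawler is allowed to fetch specific URLs:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt; in practice you would point the parser at
# the live file with set_url("https://example.com/robots.txt") + read().
robots_txt = """\
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Check whether a generic crawler may fetch specific URLs.
print(parser.can_fetch("*", "https://example.com/products/"))   # → True
print(parser.can_fetch("*", "https://example.com/admin/login")) # → False
```

A check like this catches one of the most common migration mistakes: a stray `Disallow` rule silently blocking sections of the site you expect search engines to crawl.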


Gaining experience and becoming comfortable with the tools listed on this page will help you in many ways:

  • Your next job interview if you are interviewing for an SEO job
  • Monitoring key metrics like ranking, traffic, links, errors and more
  • Generating client reports
  • Doing keyword and content analysis
  • Providing clients with useful insights and actionable recommendations