
How To Comply With Website Accessibility Standards In Ontario (WCAG)

New public websites and significantly refreshed websites were required to be compliant with WCAG 2.0 Level A by January 1, 2014.

Furthermore, by January 1, 2021, all public websites and web content posted after January 1, 2012, must meet WCAG 2.0 Level AA (check this tool to find out if your website is compliant).

All large (50+ employees) private and non-profit organizations, and all public-sector organizations are required to make their websites accessible.

I am not sure what this is going to mean from a legal standpoint, or what actions the government will take against non-compliant websites. Since 2014, thousands of websites have been built and refreshed without fully complying with WCAG 2.0 Level A, and I have not heard of any penalties or legal issues.

I have contacted the government several times in the last few months trying to learn more about the tools they will use to audit websites and what the penalties are, but I did not get any answers.

Options to comply with WCAG

1- Check with your web developer

Most web developers have started to design websites with accessibility in mind. Check whether your web developer is capable of making the whole website compliant with WCAG 2.0 AA, and expect to pay an additional cost for that.

If you are redesigning/refreshing your website, ask your web developer to include WCAG compliance in the SOW (statement of work).

2- Request an audit and pass it to your web developer for implementation

There are many audit providers that offer AODA/WCAG audits.

In most cases, audit providers do not offer implementation; the implementation must be done by the web developer, which means more cost. You will be paying the audit provider to conduct the audit plus the web developer to implement the recommendations in the audit.

The other issue with audits is that they rarely cover the whole website; they will take a sample of pages (the home page and other random pages) and use them in the audit.

3- Automated third-party accessibility services

There are many automated accessibility providers that can make websites compatible with WCAG 2.0 AA. There is a monthly fee to use those services, and the cost will depend on the size of the website (number of pages, and in some cases traffic); for a small website, expect to pay $29+/month.

Please note that automated solutions may not offer a 100% compliance guarantee, but they are still far better than leaving the website without any accessibility optimization (in case you do not have the resources to do it properly).
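
For illustration, these services are typically integrated by dropping a single script tag into the page that loads the provider's widget. A minimal sketch is below; the provider URL and site ID are hypothetical placeholders, and each vendor supplies its own snippet:

```ts
// Minimal sketch of how an automated accessibility service is typically
// embedded. The URL and site ID are hypothetical placeholders.
const widget = document.createElement("script");
widget.src = "https://cdn.example-accessibility.com/widget.js"; // hypothetical
widget.async = true;
widget.dataset.siteId = "YOUR-SITE-ID"; // hypothetical account identifier
document.head.appendChild(widget);
// Once loaded, the widget scans the DOM and applies runtime fixes
// (alt text, contrast adjustments, keyboard navigation helpers).
```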

4- Automated accessibility software/plugins

There are many automated accessibility software packages and plugins that can make websites compatible with WCAG 2.0 AA (especially if you are using WordPress).

Pros and cons of each method:

| | 1- Web Dev A-Z | 2- Audit + Web Dev | 3- Service | 4- Software |
| --- | --- | --- | --- | --- |
| Cost | High | Very high | Low | Lowest |
| 100% WCAG 2.0 AA compliance | Guaranteed | Guaranteed | No guarantee | No guarantee |
| Implementation | Slow and difficult | Very slow and difficult | Quick and easy | Quick and easy |
| Longevity | Must be updated when major changes to content or the website happen | Must be updated when major changes to content or the website happen | No future updates required | Software updates will be required (sometimes automated) |

Tools to check accessibility compliance:

There are many tools that can help you test your website's compliance with WCAG; I will include 3 of them, which happen to have a Google Chrome extension (which normally makes things easier):

I also recommend testing with screen readers or screen reader simulators (these are what people with impaired vision use to browse the web).

Using screen readers will help you discover accessibility issues that cannot be caught by automated tools.
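
If you prefer to script these checks, one widely used open-source engine, axe-core, can be run automatically against any page. A minimal sketch, assuming Node.js with the puppeteer and @axe-core/puppeteer packages (not something the tools above require; just one way to automate the same kind of check):

```ts
// Minimal sketch of an automated WCAG check using the open-source axe-core
// engine via Puppeteer (assumes: npm install puppeteer @axe-core/puppeteer).
import puppeteer from "puppeteer";
import { AxePuppeteer } from "@axe-core/puppeteer";

async function auditPage(url: string): Promise<void> {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url);

  // Run axe-core against the page, restricted to WCAG 2.0 A/AA rules.
  const results = await new AxePuppeteer(page)
    .withTags(["wcag2a", "wcag2aa"])
    .analyze();

  for (const violation of results.violations) {
    console.log(
      `${violation.id}: ${violation.help} (${violation.nodes.length} nodes)`
    );
  }
  await browser.close();
}

auditPage("https://example.com").catch(console.error);
```

Like the Chrome extensions above, this only catches machine-detectable issues; screen reader testing is still needed for the rest.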


Browsers Pro-privacy Enhancements and Their Impact on Marketers

Privacy is becoming a major concern for online users, and companies are taking action to address this concern and improve users' privacy. Apple is a leader when it comes to pro-privacy enhancements; Firefox and Google Chrome seem to be following in its footsteps recently.

What Apple Safari is doing to address privacy concerns:

In 2017 Apple started its ITP (Intelligent Tracking Prevention) project. With ITP 1.1, Safari started to be strict with third-party cookies (partitioning or purging them when certain conditions were met), which caused big issues for cross-site trackers like Google and Facebook. Google and other providers got around that by storing their cookies as first-party cookies and using variables in the URL to pass their tracking token to the next website.

ITP 1.1 kept developing to become ITP 2.0, ITP 2.1, and finally ITP 2.2. The biggest change for marketers started with ITP 2.1, where Safari began being strict with some first-party cookies (keeping them for only 7 days from the last visit). ITP 2.2 is supposed to remove tracking variables from the URL (URL decoration) and reduce those first-party cookies' lifespan to one day. This will be more difficult for cross-site trackers to get around; even if technologies like GA find a way around it, Safari may shut it down quickly if they are not happy with it.

How is ITP2.2 Going to Impact Marketers?

  • New user data in Google Analytics will be affected: if a user comes back to the website more than 2 days after their last visit, they will be treated as a new user
  • Attribution in Google Analytics will be impacted: if a new user comes from a paid click and 2 days later returns through an organic click, the initial medium will be set to organic. This will paralyze the whole attribution model in Google Analytics
  • Google Remarketing Ads will lose the ability to track users for more than one day
  • Disabling URL decoration will cause issues for cross-domain tracking in Google Analytics, as this feature relies solely on URL decoration to pass the client ID (CID) to the next domain (see the sketch after this list)
  • Disabling URL decoration will also cause big issues for affiliate marketing platforms like Commission Junction, where cross-site tracking and URL decoration are the foundations of their conversion tracking
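To make "URL decoration" concrete, here is a simplified sketch of how a cross-domain linker passes the client ID: read the first-party _ga cookie and append it to outbound links so the destination domain can restore it. Google Analytics' real linker works along these lines but adds integrity checks; this is illustrative only:

```ts
// Simplified sketch of cross-domain tracking via URL decoration.
// Reads the _ga client-ID cookie and appends it to a link to another domain,
// so the destination site can keep the same client ID. GA's real linker
// works similarly but adds a fingerprint/timestamp for integrity.
function getCookie(name: string): string | undefined {
  return document.cookie
    .split("; ")
    .find((c) => c.startsWith(name + "="))
    ?.split("=")[1];
}

function decorateUrl(destination: string): string {
  const clientId = getCookie("_ga");
  if (!clientId) return destination;
  const url = new URL(destination);
  url.searchParams.set("_ga", clientId); // the "decoration" ITP 2.2 targets
  return url.toString();
}

// Example output: https://othersite.com/?_ga=GA1.2.123456789.1560000000
console.log(decorateUrl("https://othersite.com/"));
```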

Safari's market share is almost 15% worldwide, but not all versions of Safari will have ITP 2.2 enabled, so the short-term impact will not be that significant. I have done some year-over-year analysis on metrics like new users, where I expected to see a spike in new users coming from Safari, but I have not noticed any difference yet.

I tried my websites on Safari 12.1.1, and the lifespan of the _ga cookie was only 7 days.

This cookie is supposed to have a two-year lifespan (which is what Google Chrome on a Mac still shows).

So for now, only ITP 2.1 seems to be live, not ITP 2.2 (the _ga cookie will have a one-day lifespan when ITP 2.2 goes live).
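
Worth noting: the ITP 2.1 7-day cap applies to cookies written from JavaScript via document.cookie. A common mitigation at the time was to reissue the _ga cookie server-side with an HTTP Set-Cookie header, which was not subject to the same cap. A minimal sketch, assuming a Node.js/Express server (not something the post prescribes):

```ts
// Minimal sketch of a common ITP 2.1 mitigation: reissue the _ga cookie
// from the server (HTTP Set-Cookie), since the 7-day cap applies to
// cookies written client-side via document.cookie. Assumes Express.
import express from "express";

const app = express();

app.use((req, res, next) => {
  const ga = req.headers.cookie
    ?.split("; ")
    .find((c) => c.startsWith("_ga="))
    ?.slice("_ga=".length);
  if (ga) {
    // Re-set the same value as an HTTP cookie with a 2-year lifetime.
    res.cookie("_ga", ga, {
      maxAge: 2 * 365 * 24 * 60 * 60 * 1000,
      sameSite: "lax",
    });
  }
  next();
});

app.listen(3000);
```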

What About Chrome and Firefox?

Firefox has announced its own anti-cross-site-tracking policy; it is supposed to roll out this summer, but it is still not clear how strict it is going to be.

Google Chrome has announced a cookie control plan without a lot of details (I do not see it causing any interruptions to Google Ads or Google Analytics).

I will keep updating this post with any new privacy and cookie policy changes by any major browser.


Google Updates The Search Quality Raters Guidelines

Google just refreshed the search quality raters document, which at some point in the past was leaked and then made available by Google. This document still gets a lot of attention from the SEO community, especially after Google presented the E-A-T concept (expertise, authoritativeness, trustworthiness). Unfortunately, many people in the industry do not read well what is behind this document:

  • This document is created by the engineers who write the algorithms. Their main objective is to understand the impact of their algorithm changes on the quality of the results: for example, if we give more weight to online reviews, does that improve the results or not? When they ask raters to use E-A-T to evaluate the results, the engineers' objective is not to use E-A-T as-is in the algorithm (many parts of E-A-T cannot be programmed into an algorithm) but to adjust the parts of the algorithm that can lead to better E-A-T results. What are those parts? That is the question SEOs need to answer
  • There is a lot of data available to Google now, and the part that scares me is Google Chrome data. I was playing last week with their user experience report through BigQuery, and I was surprised that real users' session speed data is publicly available for almost every domain on the web (see the query sketch after this list). I know Google keeps saying they do not use this data in the ranking algorithm, but really? If Google decides to use Google Chrome data along with some of the other data available to them, they will have a bulletproof algorithm. What I am saying here is that there is no shortage of ranking signals for Google; the rating document mentioned a few, and everyone started to use them as the only ranking factors, forgetting about the other 200, if not more, ranking factors
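
For example, pulling a site's first-contentful-paint distribution out of the public Chrome UX Report dataset takes only a few lines. A sketch assuming the @google-cloud/bigquery Node client with configured credentials; the monthly table suffix changes, so verify the current one:

```ts
// Sketch: query the public Chrome UX Report dataset in BigQuery for the
// first-contentful-paint distribution of a single origin. Assumes
// @google-cloud/bigquery is installed and credentials are configured;
// the table suffix (YYYYMM) changes monthly.
import { BigQuery } from "@google-cloud/bigquery";

async function fcpHistogram(origin: string): Promise<void> {
  const bigquery = new BigQuery();
  const query = `
    SELECT bin.start AS fcp_ms_start, SUM(bin.density) AS density
    FROM \`chrome-ux-report.all.201905\`,
      UNNEST(first_contentful_paint.histogram.bin) AS bin
    WHERE origin = @origin
    GROUP BY bin.start
    ORDER BY bin.start`;
  const [rows] = await bigquery.query({ query, params: { origin } });
  rows.forEach((r) => console.log(r.fcp_ms_start, r.density));
}

fcpHistogram("https://example.com").catch(console.error);
```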

What I like most about the quality raters document is the ideas it gives webmasters for improving user experience. In my opinion, that is where webmasters need to spend their time and energy; if they do that, they will stay ahead of all algorithms.

Google is all about scaling:

Scaling an algorithm needs metrics that a machine can understand easily and quickly; anchor text is a good example of that. Yes, it can cause you a penalty if you misuse it, but when it is used right, its power is still the same as it was when I started doing SEO in 2004. We need to remember that very well when trying to reverse-engineer the algorithm.


Googlebot Is Now Evergreen With The Latest Chromium

Google has made a major update to Googlebot, making it evergreen (always running the latest version of Chromium). Compared to the previous version, Googlebot now supports 1000+ new features, like:

  • ES6 and newer JavaScript features
  • IntersectionObserver for lazy-loading (see the sketch after this list)
  • Web Components v1 APIs
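
As an example of what the new Googlebot can now execute, here is a minimal IntersectionObserver-based lazy-loading pattern:

```ts
// Minimal IntersectionObserver-based lazy loading: images carry their real
// URL in data-src, and src is filled in only when they near the viewport.
const observer = new IntersectionObserver(
  (entries, obs) => {
    for (const entry of entries) {
      if (!entry.isIntersecting) continue;
      const img = entry.target as HTMLImageElement;
      img.src = img.dataset.src ?? "";
      obs.unobserve(img); // load once, then stop watching
    }
  },
  { rootMargin: "200px" } // start loading shortly before the image scrolls in
);

document.querySelectorAll<HTMLImageElement>("img[data-src]").forEach((img) => {
  observer.observe(img);
});
```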

This sounds great, but the biggest question is still unanswered for webmasters: do we still need to do dynamic rendering when we use advanced JavaScript frameworks like Angular? I think the answer is still yes, or at least most of the time. Assuming the Google Mobile-Friendly Test is now up to date with the most recent Googlebot version, we still need to test the website there and find out whether Google is able to render it properly; based on that, we can decide whether or not to provide dynamic rendering.
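
When dynamic rendering is still needed, it usually means detecting bot user agents and serving them prerendered HTML. A minimal sketch, assuming Express and a prerendering service (such as a self-hosted Rendertron instance) at a hypothetical URL:

```ts
// Minimal sketch of dynamic rendering: requests from known bots are proxied
// to a prerendering service (e.g., a self-hosted Rendertron instance); all
// other requests get the normal JavaScript app. The service URL is a
// hypothetical placeholder. Assumes Node 18+ (global fetch) with Express.
import express from "express";

const BOT_PATTERN = /googlebot|bingbot|yandex|baiduspider/i;
const RENDER_SERVICE = "https://render.example.com/render"; // hypothetical

const app = express();

app.use(async (req, res, next) => {
  const ua = req.headers["user-agent"] ?? "";
  if (!BOT_PATTERN.test(ua)) return next(); // humans get the normal SPA
  try {
    const target = `${req.protocol}://${req.get("host")}${req.originalUrl}`;
    const rendered = await fetch(
      `${RENDER_SERVICE}/${encodeURIComponent(target)}`
    );
    res.status(rendered.status).send(await rendered.text());
  } catch (err) {
    next(err);
  }
});

app.use(express.static("dist")); // the regular client-side app
app.listen(3000);
```

Either way, checking the page in the Mobile-Friendly Test first tells you whether you need this layer at all.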


Google Is Launching CallJoy, a Small-Business Phone Technology

Google is launching CallJoy, an easy-to-use phone technology for small businesses, for $39/month, inspired by Google Duplex.


Call tracking is vital for a small business in order to understand which marketing channels are working and to improve customer service. This system should be able to cover all call tracking needs, like:

  • Logging calls in an online dashboard
  • Phone number with a local area code
  • Unlimited call recording & transcripts
  • Text-back functionality
  • Analytics & insights
  • Spam blocker

Thank You, Google: Case Studies to Sell Schema to Clients

Google has been encouraging website owners to apply structured data to their content for a long time (Schema.org is one structured data format backed by Google and other search engines). Today they published a new reminder to stress how important structured data is; they even took it a step further by providing a few case studies in their post, and they have also added more to their case studies gallery.
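
For reference, structured data is usually added as a JSON-LD script tag. A minimal sketch that injects Schema.org Article markup at runtime, with placeholder values (emitting the same JSON server-side works just as well):

```ts
// Minimal sketch: inject Schema.org Article markup as a JSON-LD script tag.
// All values below are placeholders; the same JSON can be emitted
// server-side in the page template instead.
const articleSchema = {
  "@context": "https://schema.org",
  "@type": "Article",
  headline: "Thank You, Google: Case Studies to Sell Schema to Clients",
  datePublished: "2019-01-01", // placeholder date
  author: { "@type": "Person", name: "Author Name" }, // placeholder
};

const tag = document.createElement("script");
tag.type = "application/ld+json";
tag.textContent = JSON.stringify(articleSchema);
document.head.appendChild(tag);
```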
