Hosting

HostPapa Hosting Service Review: NO BACKUPS, BE CAREFUL

I have been hosting a few sites with HostPapa for more than two years now, and I have two accounts with them (one I signed up for a few months ago). Things were just fine, considering I do not pay that much. I have a server monitoring service that pings my sites every 5 minutes, and HostPapa goes down almost every day for about 5 minutes; I turned a blind eye to this because of the cheap cost.

BUT today the server that hosts my second account was hacked. When I visited my site, my Norton Antivirus alerted me about malware, and a few minutes later the whole site went down. I contacted them, only to find out 8 hours later that they still had not solved the issue! Even worse, they asked me if I have a backup, as they do not back up sites!!! I jumped to their home page to check, and sure enough backups are not listed among their services. REALLY? I have been building and hosting sites for 10 years and have tried many different hosts, but the backup feature has always been there; I do not even look for this feature when I shop for hosting, as it is a basic built-in feature with most hosts.

HostPapa, you suck. I do not want to call you scammers, as you never mentioned a backup option, but what is the probability that people back up their websites on a regular basis unless they use a backup service (not many people do)? And even if some people back up their files, what about emails, domains and other settings? What if you have an SSL certificate installed?

If you are not web hosting savvy and do not understand how important a backup is, my advice is to avoid any web hosting company that does not offer daily, or at least weekly, backups; otherwise you are taking a high risk of losing all or part of your data. If you are looking for a reliable HostPapa alternative, I highly recommend HostGator.
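For anyone who wants a starting point, here is a minimal backup sketch in Python, assuming a typical Linux host; the /backups and /home/user/public_html paths are placeholders, and mysqldump is assumed to read its credentials from ~/.my.cnf. Run it daily from cron and copy the archives off the server, because a backup sitting on the hacked machine would not have helped here either.

#!/usr/bin/env python3
# Minimal daily backup sketch: archive the web root and dump MySQL.
# All paths below are placeholders; adjust them for your own host.
import datetime
import shutil
import subprocess

stamp = datetime.date.today().isoformat()

# Archive the site files (the web root location is an assumption)
shutil.make_archive(f"/backups/site-{stamp}", "gztar", "/home/user/public_html")

# Dump every database; mysqldump is assumed to read credentials from ~/.my.cnf
with open(f"/backups/db-{stamp}.sql", "w") as out:
    subprocess.run(["mysqldump", "--all-databases"], stdout=out, check=True)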

Web Hosting

Iweb Managed Dedicated Server Hosting Review

Iweb is a Canadian-based hosting provider that offers a wide array of hosting services, varying from shared hosting for a couple of dollars a month to dedicated servers for a few hundred dollars a day.
What I am writing here is not a review, a complaint or a scam-type post (like "Iweb sucks"); it is simply my experience with Iweb as a client for more than a year, with two dedicated servers and two backup servers.

After this long time dealing with Iweb, I have found that Iweb is not a managed hosting service AT ALL. Iweb has structured its hardware and staff to get you a server connected to the internet and working, with control panel access; after that point they do not care, or they DO NOT KNOW how to help you. So before I continue with my story, and in case you do not have time to read this long post: DO NOT consider using Iweb unless you are a Linux expert or you have an in-house server admin and a remote backup.

Here is exactly what happened with Iweb:
–    Two weeks ago the whole server crashed. Iweb does not have any free monitoring service; they charge $75/month for that. What I do not understand here is why they do not use a simple method, pinging the server IP to check whether it responds, instead of charging that amount of money for what they call a very advanced monitoring system. Funny enough, in your customer hub you can see that the server has no activity, so they could simply develop a system that e-mails you when your server has had no activity for longer than an hour (see the sketch after this list).

–    It took me 20 hours to find out about the crash, as it happened on a Saturday night. I opened a ticket, and after that the response was fine. It took Iweb 30 hours to bring the server back and restore the backup (I have fewer than 20 sites on that server, using less than 10 GB of space). So far so good, even if it took them too long (not to mention that 5 admins worked on my ticket, and for each new one I had to explain where we were, as the ticket became too long and there did not seem to be much coordination between admins).
–    The server came back fine, but the MySQL service did not show up in the cPanel service status, and database connections did not work at all. The reason was that they had copied the MySQL folder from the old drives without setting proper permissions (I learned that later). They partially fixed my issue, but there were still lots of tables that did not work. I asked them to help with the tables and to check why MySQL was not in the cPanel service status, but their responses now became VERY SLOW, and here is my point: it seems those guys have instructions that once the outage stage is over and the server is back, even if it is not working properly, the ticket priority gets reduced. At this stage the response time grew to 5+ hours per reply. I could not wait, so I restored one of the cPanel backups from the old disk to another server myself, which took me an hour (they did not even bother to try this solution). Everything was then fine, other than the MySQL service still not showing in the cPanel service status (they closed the ticket after a few hours without even checking with me whether this was solved). After that the server looked to be working fine, but I started to get messages about failures in cPanel updates. I opened a ticket and got a late response, after almost 12 hours, saying they had forced the update. A few days later there was another message about a failed cPanel update; I opened a ticket and never got any response to that one.

–    I forgot to mention that I am on their basic management level, I think it is level 5, but after this long experience I believe it does not matter. It is not about the management level (a higher level only gets you more free support time); it is about the customer service culture at Iweb, as I even offered them more money for their support. Another funny thing is that I was sold a 20 GB backup solution without being asked about my requirements, and after 13 months I was surprised to find that no one had ever set it up. When I asked them to set it up, they said my hard drive is much bigger than that and I would need a backup partition, which requires re-installing the server. For communication of this type, not related to a server outage, their response time is 24 to 72 hours, if they respond at all.
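To show how simple the monitoring idea above really is, here is a minimal sketch of the ping-and-email check; the URL and e-mail addresses are placeholders, a mail server on localhost is assumed, and you would run it every few minutes from cron.

#!/usr/bin/env python3
# Minimal uptime check: request the home page and e-mail an alert on failure.
# URL and addresses are placeholders; a local mail server is assumed.
import smtplib
import urllib.request
from email.message import EmailMessage

URL = "http://www.example.com/"

def site_is_up(url, timeout=10):
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except Exception:
        return False

if not site_is_up(URL):
    msg = EmailMessage()
    msg["Subject"] = "DOWN: " + URL
    msg["From"] = "monitor@example.com"
    msg["To"] = "admin@example.com"
    msg.set_content(URL + " did not return HTTP 200.")
    smtplib.SMTP("localhost").send_message(msg)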

Two weeks after the initial crash:
After the crash I signed up with a service that sends an SMS alert when your site goes down (it costs me $50/year for 3 sites, and they ping my sites every 10 minutes). Last Friday I woke up to a few SMS messages on my mobile telling me that my sites were down again (no surprise; I had been trying to tell Iweb for two weeks that something was wrong with the server). I checked the home page and saw a database connection error, so the good news was that it was not a server crash. I restarted MySQL using WHM, which did not help either. I checked phpMyAdmin and saw the tables showing the message IN USE, so I opened a ticket with Iweb. I will post their first response here; I did not want to post anything from my tickets, but you have to see this to appreciate the level of experience their system admins have:

Greeting sir,

There’s many reason why this could happen,

I have try a few time to recover you table but without success. Still a
few table stuck in ‘In Use’. Please refer the the screen shot attached.

You can review this blog post to find out more information about this issue:
http://wordpress.org/support/topic/blog-reverted-to-install-screen

Unfortunately we’re not able to recover your databases.

You may want to restore from a previous backup if you have any.

If you have any question do not hesitate to contact us,

Best Regards.

I have basic experience in Linux and hosting and very good programming experience, and this answer did not make any sense. I got back to the guy saying that it is not only WordPress but all the other databases too, and that this had also happened before, after restoring the server, but got no response. This time I got the lesson: WE DON'T CARE, your server is up (the point where their management stops). I decided then that I could not continue with those guys. I moved all accounts to another server, in some cases using the very old backup on my old disk and losing some data; that took me 8 hours. After I finished, I decided to go back and see what had happened, since there was no stress now that the sites were back up. I did some research on Google about this issue and found these two commands:
chown -R mysql:mysql *         # run inside the MySQL data directory (usually /var/lib/mysql); fixes the ownership lost in the copy

mysqlcheck --all-databases -r  # repair all databases (the tables stuck "In Use")

It took me 10 minutes of searching Google and a few minutes for the server to execute the two commands above, and voila, all my databases are fine now. It is too late, as all the sites are now on another server, but I cannot believe the level of knowledge Iweb admins have. It might just be a strategy to ignore any ticket that is not related to a server crash or software failure; Iweb is one of the cheapest dedicated server providers, but that is surely affecting their service quality.
Finally, remember when you are buying a server from Iweb that you are renting a box only; it is not managed hosting, for sure.

Link Building, SEO, SEO Services

The Right SEO Process

I am listing in this post what I consider the right SEO process for small and medium-sized businesses. This process assumes that you have a specific target, ranking for a list of high-traffic keywords, and a limited budget; in most cases the home page will be enough to target all your keywords, so you may not even need to change your site structure:
  1. Keyword research, to find the highest-search-volume keywords relevant to your business (usually using the Google keyword tool with [exact match] checked).
  2. Once you approve the keywords, if the home page is not enough to target all of them, content mapping must be implemented: matching those keywords to your site's content. The home page must be used to target the most competitive keywords; a few internal pages may also be needed.
  3. ONE-TIME on-page changes for the pages that will be used to target those keywords (a good SEO company must implement the on-page recommendations if there are no resources available on your end to do it).
  4. A link building campaign for those pages; quality unpaid links are what a good SEO company provides.
Expectations:

  1. Ranking for your targeted keywords, which should drive more search engine traffic to the site (this may take 3-6 months depending on your site's age and benchmark rankings).
  2. An improved link profile and site authority, which will bring more long-tail traffic (all ships rise with the tide); the number of keywords driving traffic to the site will also increase.
  3. An increase in the number of inbound links (the amount of link growth will vary depending on which link exploration tool you use).
Reporting:

  1. A benchmarking report must be provided at the beginning of the project, including current rankings for the targeted keywords, the number of pages indexed by search engines and the number of inbound links.
  2. A detailed link report will be provided on demand.
  3. A ranking and traffic report will be sent on a monthly basis (Google Analytics access will be required).
  4. Monthly link growth report.
Google Filter, Google Penalty, Google Spam Team

Staying in the Search Engines' Safe Zone

It is very important for SEO professionals to understand how search engines, especially Google, work. For all major search engines, the results are decided by machines using complicated algorithms; there is almost no human intervention in the results, even though Google has a team of human quality assurance editors.

Search engines' spam-fighting algorithms rely heavily on signals that can be detected and evaluated by machines, followed either by taking action against the site in question automatically (for very clear spam signals) or by turning a human editor's attention to that site for a manual audit. It is very similar to how tax departments use signals to trigger audits (along with random audits): a signal could be a person who has been making less than $15,000 for a long time buying a $1,000,000 house with a 40% down payment.

What could be spam signals for search engines:

  • A website that has not changed its content in the last few years suddenly getting thousands of links in a few days from low-quality sources
  • A site that has been deserted for years expanding by 10,000 pages in a few days, with the expansion not followed by any inbound links or social media attention
  • 40 outbound links in your blogroll, all with keyword-rich anchor text and not very related to your website, while your website is still new with no more than 10 posts, or has lots of posts with duplicate content
  • 90% of your links coming with the same anchor text, and always dofollow
  • A 30% or higher click-through rate in your AdSense account

MACHINES like PATTERNS and hate RANDOMIZATION. Whatever you do, always ask yourself: if a programmer at the Googleplex decided to write software to track websites doing what you are doing, would that be possible? If it is possible, how many websites would be caught by that software? If the answer is millions, do not worry about it (e.g. how many websites tweeted twice last week? That is a pattern, but millions of sites did it).
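To make that test concrete, here is a hypothetical sketch of just one signal from the list above, anchor text concentration; the 0.8 threshold and the sample link profile are made up for illustration, and a real engine would combine many signals before acting.

# Hypothetical spam signal: one anchor text dominating a link profile.
# The threshold and sample data are invented for illustration only.
from collections import Counter

def anchor_concentration(anchors):
    # Share of inbound links held by the single most common anchor text
    counts = Counter(a.lower().strip() for a in anchors)
    return max(counts.values()) / len(anchors)

anchors = ["cheap blue widgets"] * 90 + ["Example.com", "click here"] * 5
if anchor_concentration(anchors) > 0.8:  # 90% identical anchors here
    print("pattern detected: profile looks machine-built, queue for review")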

Understanding HTML and some programming basics (IF this DO that ELSE ... OR ... AND) will help you make better judgments.

Unfortunately, some of the red flags above might appear even while you are doing clean link building, and that will not protect you from actions taken by machines (you would pass a human review in this case, if one happens); those machines are simply programmed to take action based on the analysis they make of your website and link profile. It works like removing weeds from a large garden: pulling out weeds manually (human editing) causes no harm to the other plants, while using chemicals (machine filtering) is faster but might kill good plants too.
In order to stay in the safe zone and avoid any search engine filter, RANDOMIZATION must always be your friend while doing on-page and off-page optimization; avoid patterns or any other easily detected techniques.

Google Analytics, Usability

The Lowest Bounce Rate I Have Ever Seen

I own lots of websites; most of them I use for search engine experiments, in order to understand the algorithms better and better. For a long time I gave less attention to user experience, focusing more on improving the link profile and authority of those websites, along with changing page elements to achieve better rankings.
Recently I decided, for one of these websites, to focus mainly on user experience (it was not easy; I spent lots of hours on it). That was the main but not the only focus, as I still did link building and made my page elements search engine friendly, but with this site specifically the main goal was HOW TO GIVE MY USERS WHAT THEY ARE LOOKING FOR. Amazingly, I got great traffic, the lowest bounce rate I have ever seen, and more $$$$:

I am not going to comment on this screenshot, as it speaks for itself; the one thing I want to mention is that none of these keywords are brand based.
I gave search engines what they want and I gave my users what they are looking for, and as a result everyone wins:

  • Search engines win: they keep their users happy and coming back, because they delivered relevant results.
  • Users win, as they found exactly what they were looking for.
  • I win, as I got thousands of visits from search engines that I was able to monetize.

For everyone reading this post, there are a few takeaways:

  1. You do not need to cheat search engines; you can still win by following their terms of service.
  2. Relevancy is key. Do not be greedy in targeting your keywords: if you are a dentist, you do not need to target a keyword like “tooth paste”; even if it seems relevant to your business, it is not relevant to the intent of the users typing that phrase into search engines.
  3. User experience is VERY IMPORTANT. What applies to the offline world applies to the online world: a happy offline client is a good referral, and a happy online user is a recurring user and a good referral/advocate for your website and your brand.

Finally, I want to mention that these takeaways also apply to paid search. I still have a good laugh with my friend and SEO guru Paul Teitelman whenever we remember how he achieved a 50% click-through rate on one of his paid campaigns by being extremely relevant in choosing his keywords, writing his ads to speak to those keywords, and building his site to give users what they want.

Links, SEO

Is Your Page Fully Optimized?

Search engine optimization in the early days (1996-1998) was purely on-page optimization. With the emergence of basic meta search engines at that time, all people needed to do in order to rank well was make sure their keyword was repeated several times in the meta tags and page content. Keyword density was a very popular term back then (a hot SEO topic was: shall I make my keyword density 2% or 5%?).
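For reference, here is a minimal sketch of how that hotly debated metric is computed: occurrences of the keyword phrase as a share of the total words on the page. The sample page text is made up.

# Minimal keyword density calculation: phrase occurrences over total words.
import re

def keyword_density(text, keyword):
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase = keyword.lower().split()
    hits = sum(words[i:i + len(phrase)] == phrase
               for i in range(len(words) - len(phrase) + 1))
    return 100.0 * hits * len(phrase) / len(words) if words else 0.0

page = "Blue widgets on sale. Our blue widgets are the best blue widgets."
print(round(keyword_density(page, "blue widgets"), 1))  # 50.0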

Google came later with PageRank and the concept of inbound links as a way to score and rank pages. Sure enough, it did not take webmasters long to adapt to the new system, and link building, or what is known as off-page optimization, became the ultimate goal for most webmasters. The link model proved a huge success for Google, even though at some point it was possible to rank a blank page for competitive keywords using links alone.

The link rush really distracted, and still distracts, webmasters from paying more attention to their on-page optimization. Most SEO companies and webmasters spend a short, one-time effort on optimizing their pages and then move on to a continuous, long-running link building campaign. Links are still more important than on-page factors, but in terms of resources and money links are not easy to acquire, especially with Google applying lots of filters to links and its spam team becoming very efficient at weeding out spammy, low-quality links; on-page factors definitely deserve more attention, especially for hyper-competitive keywords.

What I am recommending here is not just the one-time basic on-page optimization where you:

  • Add your keywords to your title tags and meta tags
  • Make sure all your content is crawlable and readable by search engines
  • Make sure there is an adequate amount of keyword-rich content on the page to enable search engines to analyse and theme it
  • Make sure your internal navigation passes PageRank to the pages competing for the most competitive keywords

On-page optimization can go further, by trying different things on your page and tracking the impact of your changes on search engine results. With Google introducing Caffeine, which translates crawling data into scoring data almost immediately, there is no need to wait a long time to see the result of your changes. Things worth testing:

  • Take some links from your navigation or your page, then add them back
  • Change the structure of your page
  • Rewrite the content on the page
  • Increase or decrease keyword density
  • Change internal linking

Test on-page factors exactly the way you do conversion optimization (A/B testing): find the best ranking formula and keep it, then try to replicate it for the other pages on the site.
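A minimal sketch of that A/B idea: log each on-page variation together with the rank you observe afterwards, then compare the averages. The file name, page and rank values here are hypothetical; the rank numbers would come from whatever rank tracker you use.

# Hypothetical change-vs-rank log for on-page tests.
import csv
import statistics
from datetime import date

LOG = "onpage_tests.csv"  # columns: date, page, change, rank

def log_check(page, change, rank):
    # Append one observation; repeated runs accumulate the test history
    with open(LOG, "a", newline="") as f:
        csv.writer(f).writerow([date.today().isoformat(), page, change, rank])

def average_rank(page, change):
    # Average observed rank for one variation of one page
    with open(LOG, newline="") as f:
        rows = csv.DictReader(f, fieldnames=["date", "page", "change", "rank"])
        ranks = [int(r["rank"]) for r in rows
                 if r["page"] == page and r["change"] == change]
    return statistics.mean(ranks) if ranks else None

log_check("/widgets.html", "baseline", 14)      # rank before the change
log_check("/widgets.html", "rewrote-title", 9)  # rank after the rewrite
print(average_rank("/widgets.html", "rewrote-title"))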

Finally, remember to revisit your on-page optimization once in a while.

Search Engines

Why Does Google Still Give So Much Attention to Anchor Text?

Search engines are supposed to search millions of documents for the queries users type and return results in no time. This simple task is more complicated than it looks: search engines cannot search the web instantly for each query and return results; they can only search their own database, hosted on fast servers, in order to return results as quickly as users expect.

In order for search engines to build a searchable database, they need to crawl the web on a daily basis and turn unstructured data into structured data. A simple example of that is crawling web pages and then saving the title tags in a column called TITLE-TAGS and the meta keywords tag in a column called META-KEYWORDS. That is probably how early search engines started: with the resource and technology limitations of the time, the easiest way to build a search engine was to crawl the web, save a few tags from each page without any content analysis, and make them available for search. Imagine how easy it would be to manipulate this search engine 🙂 all you need to do is stuff your title tag and keywords tag with lots of keywords, wait to get indexed, and voila, you are #1 for terms you can now only dream of hitting #100 for.
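Here is a toy sketch of that first-generation design, with a made-up page: store only a title and a keywords column per URL and search nothing else. The code makes it obvious why stuffing those two tags was all it took to rank.

# Toy first-generation index: keep only title and meta keywords per URL.
import re

index = {}  # url -> {"title": ..., "keywords": ...}

def add_page(url, html):
    title = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
    keywords = re.search(r'<meta\s+name="keywords"\s+content="(.*?)"', html, re.I)
    index[url] = {"title": title.group(1) if title else "",
                  "keywords": keywords.group(1) if keywords else ""}

def search(query):
    # Match the query against the two stored columns only
    q = query.lower()
    return [url for url, cols in index.items()
            if q in cols["title"].lower() or q in cols["keywords"].lower()]

add_page("http://example.com",
         '<html><head><title>Blue Widgets</title>'
         '<meta name="keywords" content="blue widgets, widgets"></head></html>')
print(search("blue widgets"))  # ['http://example.com']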

It did not take spammers long to figure out how to rank well in the first generation of search engines, and the quality of search results started to drop. Search engines tried (not that hard) to weed out spam by either improving their content analysis or using trusted sources of data (Yahoo relied on the Yahoo directory at some point for that), but a trusted source at that time meant a human-edited one, which could not keep up with the growth of the web.

Later on, Google's founders said there must be something better than that to control search quality, and they came up with two great concepts:
1- PageRank, meaning that not all pages should be scored the same: a page's score must be based on how many links point back to it (internal or external, with more value going to external links).
2- The best way to check a document's relevancy is not to look at what the owner of the document says about it (title tags and keyword tags) but at what other people say about it (the anchor text of external links).
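The first concept is easy to see in miniature. Here is a minimal sketch of the PageRank iteration on a made-up three-page link graph, using the well-known 0.85 damping factor: each page's score is built from the scores of the pages linking to it.

# Minimal PageRank iteration on a toy link graph.
links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}  # page -> pages it links to
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}
d = 0.85  # damping factor

for _ in range(50):  # iterate until the scores settle
    rank = {p: (1 - d) / len(pages) +
               d * sum(rank[q] / len(links[q]) for q in pages if p in links[q])
            for p in pages}

print({p: round(r, 3) for p, r in rank.items()})  # "c" ends up highest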

Great concepts!!! The whole population of webmasters, website owners and web 2.0 users turned into a quality assurance force working for Google!! No need for human editors at all; just get lots of powerful machines to crawl the web on a regular basis and analyse pages and links.

That worked really well for Google and made it stand out from the search engine crowd, as the quality of its results increased significantly.

Many people may have figured the system out, but it is nowhere near as easy to manipulate as the title tags and keyword tags were.

Lots of people in the SEO community keep asking why Google is still relying on anchor text to measure relevancy. The answer is simply that there is no alternative YET: Google needs to analyse billions of documents and score them using machines, without any human intervention, and it is hard to do that without using link/anchor data.

Remember also that Google has a web spam team whose main focus is fighting link spam; doesn't that tell you anything? Simply put, in the near future Google has no alternative to links and anchors, so the solution was to create a web spam team that keeps going after link traders and keeps reminding (and scaring) the SEO community not to manipulate links.
Google must also be given big credit for the several enhancements that have been made to the PageRank formula, along with many other quality control improvements.

Finally, in this post I am not trying to encourage buying links; it is more about explaining why Google is still relying on links and anchors, and will be for a long time.

Google, Links, SEO

Give Google What It Wants

The ultimate goal of Google as a search engine is to return relevant results to users so they come back to use Google in the future; the more users come to Google, the more ad money it cashes in. Anything you do that works against this goal is against Google's quality (money making) guidelines.
Here are a few actions you can take to give Google what it wants:

  1. Be relevant: Relevancy is key for your business. Set each page on your site to target a few keywords that are highly relevant to your content; do not go broad. Users who land on your page must easily find the information they are looking for. Are you a dentist? Then do not target plumbing, it is totally irrelevant; do not even target “tooth paste”, because people looking for toothpaste are not looking for a dentist.
  2. Quality content: Quality content is the key to keeping both users and search engines happy. Make sure to focus more on plain text content, as it is easy for search engines to crawl and index, and write your content with your keywords in mind.
  3. Usability: A usable site keeps users on your site longer, which results in a better conversion rate and, at the same time, reduces the traffic bounced back to search engines (bounced traffic is a low quality/relevancy signal for search engines).
  4. Links, links, links: Unfortunately, no matter how relevant you are and how much quality content you have on your site, you still need inbound links to get search engines' love. The web is very noisy, and it was very hard for search engines, especially Google, to evaluate billions of sites without counting links and using them as votes to measure quality.