
Top 10 Hidden SEO Dangers


By Adrian Drysdale

These hidden dangers could be costing you rankings in Google. We often spend most of our time strategizing about what can help us increase our rankings, but we rarely pay attention to the hidden dangers happening behind the scenes. This post explains how to identify these holes and offers possible solutions.

Slow Loading Pages

This can be very hard to identify because your home and work internet speeds bias your view. You could be on a very fast 100 Mbps connection and think everything loads super-fast, but the average Canadian internet speed is around 16 Mbps. You have to factor this in when judging whether your pages load in 5 seconds or under. The general rule of thumb is that if your website takes more than 5 seconds to load, users will start bouncing, and if a page takes 10 seconds or longer, studies suggest you have most likely lost half of your users.

The best way to identify the average loading speed per page is in Google Analytics: click “Behaviour”, then “Site Speed”, then “Speed Suggestions”. See the image below for visual instructions.

[Image: how to check site speed in Google Analytics]

Slow site speed solution: be sure to use Google's free PageSpeed Insights tool at https://developers.google.com/speed/pagespeed/insights/ . It will give you an insightful breakdown of the different issues causing your site to load slowly and offer possible solutions.
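If you would rather check this from a script, the same PageSpeed data is exposed through an API. The sketch below is a minimal Python example, assuming the v5 PageSpeed Insights endpoint and its JSON layout; the page being tested is a placeholder, and an API key may be required for anything beyond occasional use.

```python
# A minimal sketch (Python 3, standard library only) that asks the
# PageSpeed Insights API for a mobile performance score. The endpoint and
# JSON fields reflect the v5 API; an API key may be required for anything
# beyond occasional use. The page being tested is a placeholder.
import json
import urllib.parse
import urllib.request

PAGE = "https://www.example.com/"  # hypothetical page to test

params = urllib.parse.urlencode({"url": PAGE, "strategy": "mobile"})
endpoint = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed?" + params

with urllib.request.urlopen(endpoint) as response:
    report = json.load(response)

# The performance score comes back as a fraction between 0 and 1.
score = report["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Mobile performance score for {PAGE}: {score * 100:.0f}/100")
```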

If you have a high-traffic website, it may be time to upgrade from shared hosting to a Virtual Private Server or a dedicated server. While shared hosting plans are low cost, they don't guarantee that the server's resources will be in your favour, particularly if another website sharing your server suddenly goes viral and gets a ton of traffic. In that instance the shared resources will be tipped toward the other person's website to accommodate their traffic, leaving yours struggling.

Failing to Monitor 404 Pages

Every successful webmaster should be monitoring for 404 errors. There is nothing inherently wrong with a 404 Page Not Found error, so don't panic! But what happens when these errors occur behind the scenes and you have no idea they're happening? You can't fix something if you don't know it's broken in the first place, right?

The good news is that there is an easy way to set up alerts in your Google Analytics account so that you are notified when users hit these pages. Here is an easy-to-follow guide from Google on setting this alert feature up: https://plus.google.com/u/0/+GoogleAnalytics/posts/1UUqv3jAn1a
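If you prefer something you can run yourself, a crude check is to request each URL you care about and flag anything that returns a 404. The Python sketch below assumes you maintain your own list of URLs (for example, exported from your sitemap); the addresses shown are placeholders.

```python
# A rough sketch of a 404 check you could run on a schedule: request each
# URL from a list you maintain (for example, exported from your sitemap)
# and flag anything that comes back 404. The URLs are placeholders.
import urllib.error
import urllib.request

urls_to_check = [
    "https://www.example.com/",
    "https://www.example.com/old-page/",
]

for url in urls_to_check:
    try:
        with urllib.request.urlopen(url) as response:
            status = response.getcode()
    except urllib.error.HTTPError as err:
        status = err.code
    if status == 404:
        print(f"404 Not Found: {url}")
```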

Local Business Pages

The biggest hidden danger I have found while working in Guelph and chatting with local business owners who want to rank for local queries is that a large percentage never update their phone number and address citations, then wonder why they aren't showing up in local search. A business will often change its number or address without searching the web for all references to the old details so they can be updated. For example, your Facebook page might list one phone number while your website's contact page lists another. When there are too many inconsistencies like this, Google will drop you from the Google Maps business listings. You can only appear in these listings, and rank higher in them, if Google knows with authority that your details are consistent. Do yourself a favour and update your address and phone number everywhere the moment they change.
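A rough way to audit this yourself is to fetch each page where your details are cited and confirm the current phone number actually appears. The Python sketch below is illustrative only; the number and pages are placeholders, and some services (Facebook included) may block simple scripted requests.

```python
# A quick-and-dirty citation check: fetch each page where your business
# details appear and confirm the current phone number is on it. The
# number and pages below are placeholders, and some sites may block
# simple scripted requests.
import urllib.request

CURRENT_PHONE = "(226) 500-0000"  # hypothetical current number
citation_pages = [
    "https://www.example.com/contact/",
    "https://www.example.com/about/",
]

for page in citation_pages:
    html = urllib.request.urlopen(page).read().decode("utf-8", errors="ignore")
    if CURRENT_PHONE not in html:
        print(f"Phone number missing or out of date on: {page}")
```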

Site Content is Never Updated

Google freshness is a real thing, ladies and gentlemen. The analogy I like to give clients: imagine two news websites. The first updates its content once a year, the second once a day. If someone is searching for the latest news, which website do you think Google will give ranking preference to? It makes sense that the regularly updated website will likely deliver a better experience to the user, and for that reason Google factors freshness into its ranking algorithms. Update content regularly. If your website has exactly the same content as it did this time last year, it's time to rethink your content strategy and stop wondering why you're failing to pull rankings.

Unresponsive Website

Does your website adapt well for mobile searchers? Do users have to pinch and zoom every time they want to navigate your site and read the text? If you answered yes, then you don't have a responsive (mobile-friendly) website. You can use a free tool from Google to check whether your website is mobile friendly: https://www.google.ca/webmasters/tools/mobile-friendly/

Why is this important, and how does it affect your website's rankings? The simple answer is that if your site isn't responsive, you are going to struggle to rank for mobile search queries.
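If you want a quick scripted sanity check alongside Google's tool, one rough proxy is whether the page declares a viewport meta tag, which responsive sites almost always do. The Python sketch below uses a placeholder URL and is no substitute for the mobile-friendly test linked above.

```python
# A very rough proxy for "responsive": does the page declare a viewport
# meta tag? Responsive sites almost always do. The URL is a placeholder,
# and Google's mobile-friendly test remains the authoritative check.
import re
import urllib.request

PAGE = "https://www.example.com/"  # placeholder URL

html = urllib.request.urlopen(PAGE).read().decode("utf-8", errors="ignore")
has_viewport = re.search(r'<meta[^>]+name=["\']viewport["\']', html, re.IGNORECASE)

if has_viewport:
    print("Viewport meta tag found")
else:
    print("No viewport meta tag found - the page is probably not responsive")
```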

Unsecured Website

In 2014 Google announced that it would start giving slight preferential treatment to secure websites. In simple terms, if your website is served over HTTPS then you have a secure site, which means you tick a ranking signal that feeds into your overall ranking in Google. At the time of writing, this is still only a very small signal, but over the next two to three years it will play a more significant role. Securing a website with an SSL certificate, which can be purchased through your hosting provider, isn't always a simple thing. For this reason Google is giving us all some extra time to kick into gear and secure our sites before it ramps up this ranking signal.

For more information on securing your website with HTTPS, please read Google's official documentation on this topic: https://support.google.com/webmasters/answer/6073543?hl=en
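A quick way to confirm your setup is to check that the plain-HTTP version of your site redirects to HTTPS and that the certificate validates. The Python sketch below does just that; the domain is a placeholder, and urllib verifies certificates by default.

```python
# A small sketch that checks whether the plain-HTTP version of a site
# redirects to HTTPS. urllib follows redirects and verifies certificates
# by default, so an invalid certificate raises an error here. The domain
# is a placeholder.
import urllib.request

DOMAIN = "www.example.com"  # placeholder domain

with urllib.request.urlopen(f"http://{DOMAIN}/") as response:
    final_url = response.geturl()

if final_url.startswith("https://"):
    print(f"{DOMAIN} redirects to HTTPS and the certificate checked out")
else:
    print(f"{DOMAIN} is still being served over plain HTTP")
```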

Content Hidden Behind Scripts

If a large portion of your text is hidden behind a script that requires the user to “click to reveal”, this content won't be seen in the same light as plain text that is readily available on the page without any additional clicks.

Makes sense, right? Imagine having 20 words visible on a page but 2,000+ locked behind JavaScript. Can Google honestly look at that page and tell the searcher that the page they are about to click has a large amount of text relevant to their query? Probably not. And what are the chances that the JavaScript is compatible with every user's device and browser? Not great. Click-to-reveal scripts are notoriously buggy and should be treated as an additional feature on top of the plain text that is readily available on the page.
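One rough way to see what a crawler gets without running any JavaScript is to count the words present in the raw HTML response. The Python sketch below does that with the standard library; the URL is a placeholder, and the count is only a proxy for what Google actually indexes.

```python
# Counts the words present in the raw HTML response, i.e. what a fetch
# sees before any JavaScript runs. If this number is far below what a
# visitor sees after clicking "reveal" widgets, much of your copy is not
# in the initial page. The URL is a placeholder.
from html.parser import HTMLParser
import urllib.request

class TextCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.words = 0
        self._skip = 0  # depth inside <script>/<style> blocks

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.words += len(data.split())

PAGE = "https://www.example.com/"  # placeholder
html = urllib.request.urlopen(PAGE).read().decode("utf-8", errors="ignore")

counter = TextCounter()
counter.feed(html)
print(f"Words in the initial HTML (no JavaScript executed): {counter.words}")
```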

Ad to Content Ratio

Have 100 words on a page alongside 30 ads? Ads, including affiliate ads, are fine and don't get punished by Google so long as you are adding value. But if a page has a ton of ads with hardly any content, featured text, or anything else that would enhance the user's experience, you run the risk of being deemed “low quality” in the eyes of Google, which isn't good.

This is particularly true when running affiliate advertisements. You have to ensure that when you present an affiliate ad, you are enhancing the user's experience. Ask yourself why Google would want to rank you above the original source of the ad. If I am promoting an affiliate ad for a local dentist in Guelph, for example, why would Google not rank the actual dentist's website ahead of mine? The simple answer is that it will, unless you have added something to the experience. Maybe the website you are advertising doesn't answer a lot of common questions; if you answer those questions and surround the ad with that content, you have a good chance of pulling rank. Don't make the mistake of throwing up some ads with a line or two of text and then scratching your head wondering why the page is failing to rank.

The same can be said for the amount of code on a web page. If the page you are trying to rank in Google has 3,000 lines of code but only one paragraph of text, you have to ask yourself what is happening behind the scenes that might not be necessary. I mention this because the Toronto Star and Guelph Mercury websites, which might link to 60+ news items on a single page, run between 3,000 and 5,000 lines of code. Why does your page with a few sentences have as much code as a major news site? What isn't running efficiently that should be? Simple pages like these shouldn't require 80+ requests to load; refer to my first point regarding website speed.
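If you want to put a number on this, a back-of-the-envelope text-to-code ratio (visible text length divided by total HTML length) is easy to compute. The Python sketch below is illustrative only; the URL is a placeholder and the tag stripping is deliberately crude.

```python
# A back-of-the-envelope text-to-code ratio: visible text length divided
# by total HTML length. The tag stripping is deliberately crude and the
# URL is a placeholder; this is only meant to flag obviously bloated pages.
import re
import urllib.request

PAGE = "https://www.example.com/thin-page/"  # placeholder

html = urllib.request.urlopen(PAGE).read().decode("utf-8", errors="ignore")

# Drop scripts and styles first, then strip the remaining tags.
stripped = re.sub(r"(?is)<(script|style).*?</\1>", " ", html)
text = re.sub(r"(?s)<[^>]+>", " ", stripped)
text = " ".join(text.split())

ratio = len(text) / max(len(html), 1)
print(f"Text-to-code ratio: {ratio:.1%} ({len(text)} text chars of {len(html)} total)")
```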

Two Versions of the Same Site

We have all seen examples of this before. A company wants to rank for the same keywords in different areas, so, for example, a florist in Guelph decides to replicate the existing site but swap all the keywords to Kitchener. They then buy a new domain for the Kitchener site and wonder why it isn't ranking. This falls into the same category for me as people who buy domains with keywords in them, simply redirect them to their existing website, and wonder why the new domains are failing to show up in Google.

Google refers to two near-identical sites owned by the same person as “mirrored websites”. Google is on the hunt to penalize these, and has been for years, particularly since the Hummingbird and Panda algorithm updates. Are the domains owned by the same person and hosted on the same server? Exactly: Google will have no trouble finding these patterns. It has algorithms to stop the same person ranking over and over again and occupying the entire first page of results, and that is why mirrored websites are taken very seriously. If you want to rank for multiple areas, create multiple pages on the same website. Make the content unique and the overall experience good, so that anyone coming from any of these areas is highly engaged and doesn't click the back button the moment they land on your site.

Keyword Stuffing Pages

This might have worked as recently as five years ago, but today it only works if the user has a good experience on your website and doesn't bounce. Let me put it this way: if you have an over-stuffed article that doesn't really make for a good read but mentions your desired keywords and phrases 20 times, AND users love it, then great, keep stuffing those pages! Chances are that isn't the case. Stuffing keywords into articles at the drop of a hat, and putting keywords in places they shouldn't go just to push your keyword density up, gives users a bad overall experience. The end result, 90% of the time, is that they click the back button and bounce in droves. Google monitors this, and for good reason.
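If you suspect a page is over-stuffed, a quick density calculation will tell you how often a phrase appears relative to the total word count. The Python sketch below uses placeholder copy; there is no official threshold, but anything creeping into the high single digits usually reads as stuffed.

```python
# A quick keyword-density check for a block of copy. The copy and phrase
# below are placeholders; there is no official threshold, but a density
# this high would normally read as stuffed.
import re

copy = (
    "Looking for flowers in Guelph? Our Guelph flower shop delivers "
    "fresh flowers across Guelph every day."
)
phrase = "guelph"

words = re.findall(r"[a-z']+", copy.lower())
count = words.count(phrase)
density = count / len(words) * 100

print(f'"{phrase}" appears {count} times in {len(words)} words ({density:.1f}%)')
```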

I hope this blog post helps. I'm going to try to write more in the coming months when I have some spare time. Please remember to like, share, and comment if you found this interesting.


