10 Web Design Mistakes That Affect SEO

Lekan Akinyemi and Caroline Wabara - SEO experts in Lagos

As SEO specialists in Nigeria, Caroline Wabara and I have been approached by several clients to help audit their websites. Most of them want to find out why they are still not being found on Google by their customers, even two years or more after their websites were designed.

From our experience analyzing lots of such websites, here are some of the most common web design mistakes that affect SEO.

Using Dummy Content

Many web designers are fond of using dummy content while waiting for the final content for a page to be ready. In most cases, this dummy content gets picked up by search engines and hurts the SEO performance of the site.

Below is a snapshot taken from one of the websites we recently audited.

Can you spot the dummy content on the page? This site has been live for over three months and the owners are wondering why it is not ranking, yet it has over 100 pages of dummy content.

To know if there are still remnants of dummy content on your website or blog, do the following:

How To Check Indexed Pages Of a Website
Go to Google.com and enter the string below into the search bar, replacing yoursite.com with your own domain:

site:yoursite.com

Here is a snapshot of the result for DMM.

From the above image, one can see that DMM has 221 pages indexed in Google!!

Now, compare the number of pages indexed with the actual number of pages on your website or blog. The two numbers should roughly match; if the indexed count is higher, there is a good chance your site still has dummy content indexed in Google.

You can simply move from one search result page to another to find those dummy pages and delete them from your blog or website. Google and other search engines want to show only sites that offer value to searchers, so if your site is full of dummy content, why should Google show it in Search?

Not Blocking Bots While Designing

It is generally advisable to block Googlebot and other search crawlers while a website or blog is still being designed. Bots crawl websites to discover their content before listing it in search results, so blocking them means your site will not be discoverable in search engines. Sounds scary, huh?

Because hundreds or thousands of changes are made to a site during the design stage, it is advisable to restrict these bots' access until the work is done.

This also gives you the chance to use dummy content while the final content is being prepared. So if you are a web designer who builds on a live server, make sure you block bots until you are done with the design.

To block bots on WordPress, do the following:

  • Log in to your WordPress dashboard.
  • Navigate to Settings >> Reading.

In the Search Engine Visibility row, tick "Discourage search engines…" as shown below.

how to block bots on WordPress
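If you are wondering what that checkbox actually does: it typically makes WordPress print a robots meta tag in the head of every page, roughly like the line below (the exact output can vary between WordPress versions):

<!-- what WordPress typically adds when the box is ticked (may vary by version) -->
<meta name="robots" content="noindex" />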

For other CMSs and custom builds, find your robots.txt file and change it to:

User-agent: *
Disallow: /
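Once the design is finished, remember to lift this restriction again, otherwise the site will stay invisible in search. An open robots.txt looks roughly like this (an empty Disallow line means nothing is blocked):

User-agent: *
Disallow: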

Putting Content In JavaScript and Flash

Most search bots can't read content embedded in JavaScript or Flash, which means that content won't contribute to the ranking of your pages in search engines.

So always tell your developers and designers not to put vital content in Flash or JavaScript, as search engine crawlers may not be able to read that script-generated content at all.
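As a quick illustration (the page section and wording below are made up), the first snippet keeps the text in plain HTML where any crawler can read it, while the second only produces the same text by running a script:

<!-- Crawler-friendly: the text sits directly in the HTML -->
<div id="services">
  <h2>Our Services</h2>
  <p>We design and optimise websites for small businesses.</p>
</div>

<!-- Risky: the text only exists after the browser runs the script -->
<div id="services"></div>
<script>
  document.getElementById('services').innerHTML =
    '<h2>Our Services</h2><p>We design and optimise websites for small businesses.</p>';
</script>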

Using Multiple H1s

In web design, H1s make text bigger and bolder, and as such, many designers like putting lots of them on their pages.

In SEO, the H1 helps inform search bots what the page is all about. You therefore shouldn't have more than one H1 on each of your pages; having several can hurt your site's performance in search results.
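For illustration (the headings below are invented), a page should be structured more like the first block than the second:

<!-- Recommended: one H1 for the page topic, H2s for the sub-sections -->
<h1>Affordable Web Design in Lagos</h1>
<h2>Our Design Process</h2>
<h2>Pricing</h2>

<!-- Avoid: several H1s competing to describe the same page -->
<h1>Affordable Web Design in Lagos</h1>
<h1>Our Design Process</h1>
<h1>Pricing</h1>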

You can use the diagnosis feature of SEOQuake to check the number of H1s on your pages.

Another method is to open any of your website pages in Chrome, press CTRL + U to view the source code, then press CTRL + F and search for "<h1" to see how many appear on the page.

Not Using Canonicals

Most eCommerce websites have products that can be accessed via several URLs. This often confuses search engines, because they don't know which of those URLs they should show.

For example, the homepage of most OpenCart websites can be accessed via at least two different URLs. To a search engine, this looks like two or more different pages with the same content, and that can attract a duplicate-content penalty.

It might be hard to prevent your pages from being accessible via several URLs, but it is always advisable to ask your designer or developer to add a rel="canonical" tag to those pages.

This is what a rel="canonical" tag looks like:

<link rel="canonical" href="https://DMMNigeria.com/" />

where https://DMMNigeria.com/ is the actual page you want the crawlers to show.

The tag above needs to be added to the head of your duplicate pages. A canonical URL lets you tell search engines that several similar URLs are actually one and the same page.
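For example, a duplicate URL such as DMMNigeria.com/index.php (an illustrative path, not necessarily a real page) would carry the canonical tag in its head like this:

<head>
  <!-- other head tags: title, meta description, stylesheets, etc. -->
  <link rel="canonical" href="https://DMMNigeria.com/" />
</head>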

Duplicate content, whether across different websites (inter) or within the same website (intra), can affect SEO, and intra-site duplicate content in particular can kill your site in search. To find out if you currently have pages with duplicate content that can be accessed from several URLs, do the following:

  • Log in to your Google Search Console.
  • Navigate to Search Appearance >> HTML Improvements.
  • Most of the duplicates will show up in the Duplicate title tags section.

Not Renaming Images

I see my clients make this mistake all the time, e.g. a furniture shop uploading images of a dining set labeled DSC_12444.jpeg or IMG_005.jpeg.

Most web crawlers can't see what an image contains; instead, they rely on the image's file name and ALT text to work that out.

So when sending images to your website designer as part of your web content, make sure they have descriptive names. For example, rename DSC_12444.jpeg to something like 3-seater-dining-set.jpeg, as shown below.
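Here is a quick before-and-after sketch (the file name and ALT text are illustrative):

<!-- Before: the file name tells the crawler nothing about the image -->
<img src="DSC_12444.jpeg" alt="">

<!-- After: a descriptive file name plus ALT text -->
<img src="3-seater-dining-set.jpeg" alt="Brown 3-seater dining set">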

Using Very Large Images

Excessively big images (in terms of file size) hurt user experience and slow down the load time of a website, and Google is known to consider site speed a ranking factor. Ever wondered why they have a PageSpeed Insights tool?

High-resolution images make websites look beautiful, and some developers won't bother compressing them to make them web friendly.

Make sure your images are less than 1MB before sending them to your designer. On WordPress, you can also use an image-compression plugin such as Smush to reduce the size of your images.

Everyone loves fast websites!

Pop-ups

Pop-ups, especially mobile pop-ups, are also bad for SEO because they hurt the user experience. Avoid them as much as possible so your visitors don't leave as soon as they land on your website.

Insufficient Content

Content, they say, is king. Write as much as you reasonably can on each page of your website; having too little content can make Google give you a red or yellow card.

Also prevent pages with little or no value from being crawled by bots; this helps ensure your crawl budget is spent on your high-value content. But what qualifies as too little content?

There are several schools of thought on this. To be safe, make sure each page has at least 300 words of copy.
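On the crawl-budget point above: if your site has whole sections you already know add little value, you can keep bots out of them with a few extra robots.txt rules. The paths below are placeholders; use the ones that actually apply to your own site:

User-agent: *
Disallow: /tag/
Disallow: /search/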

Conclusion

Thanks for taking the time to read through all of these mistakes.

Whether you are a web developer or a website owner, ensure you fix all the mistakes listed above if you want your website to do well in search engines.

Over to you! Which of the mistakes listed above is your site guilty of?

