From a purely technical perspective (of course), is your web site a little dirty? Would it benefit from a good spring clean? Would a digital Kim and Aggie be horrified by the dust and cobwebs in your HTML?

Let’s discuss the cleanliness of your web site and the potential negative implications for SEO. Depending on the size and nature of your web site, you may have a wealth of content-rich pages, and many of these may be at risk of falling by the wayside. If at any point in the lifetime of the web site you have restructured internal linking or adjusted URL rewrites, then it may be time to see if you can clean anything up. If so, you may have some unknown 404 ‘page not found’ errors, or you may be left with duplicates of an original page sitting on old URLs.
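As a rough sketch of how you might hunt down those unknown 404s, here is a minimal audit helper. The status-fetching function is injected so the logic can be run against a live site or a test fixture; the URLs below are hypothetical, and in practice `fetch_status` would wrap something like `urllib.request.urlopen` (catching `HTTPError` to read the code):

```python
from typing import Callable, Iterable

def find_dead_pages(urls: Iterable[str],
                    fetch_status: Callable[[str], int]) -> list[str]:
    """Return the URLs that respond with a 404 status.

    fetch_status is injected so the audit logic can be exercised
    without hitting a live server (e.g. wrap urllib.request there).
    """
    return [url for url in urls if fetch_status(url) == 404]
```

Feed it the URLs from your old sitemap or server logs and you have a starting list of pages to redirect or remove.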

Warning! Large block of text approaching... it’s worth the read ;)

By modern-day standards, those SEO-savvy folks amongst us should now look upon table-heavy site designs as a thing of the past. A distant dream that nightmares may once have been made of! As mentioned in previous posts, we really need to look towards fresh, clean and concise coding standards using HTML5 and CSS3 where feasible.
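For illustration, swapping a table-based layout for semantic HTML5 might look something like this (the element names are standard HTML5; the placeholder content is made up):

```html
<!-- Old: layout forced through nested tables -->
<!-- <table><tr><td>Menu</td><td>Content</td></tr></table> -->

<!-- New: semantic HTML5 elements, styled with CSS3 -->
<header>...site banner...</header>
<nav>...main menu...</nav>
<main>
  <article>...content-rich page copy...</article>
</main>
<footer>...contact details...</footer>
```

The semantic elements tell spiders what each region of the page actually is, rather than burying everything in anonymous table cells.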

As many of you reading this may know, duplicate content can be a real issue and take away valuable credit from your web site. If you have replaced an old page with fresh content on another page but the old page remains indexed, consider re-routing the old page with a 301 redirect. This may pass ‘juice’ towards the new page, particularly if the content is relevant and related. If the old page and new page are identical but have different URLs, then you definitely need to be considering the previous point!
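On Apache, a 301 redirect can be as simple as a one-liner in your config or .htaccess file (this assumes mod_alias is enabled, and the paths here are hypothetical):

```apache
# Permanently re-route the old, indexed URL to its replacement
# (substitute your own paths and domain)
Redirect 301 /old-page.html http://www.example.com/new-page.html
```

The 301 status tells search engines the move is permanent, which is what encourages them to transfer credit to the new URL.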

So… how do you know when to adorn your web site with the digital duster? Well… you are going to have to perform some kind of site audit (or employ somebody to do one for you ;) ). Removing unneeded pages, fixing page errors or applying well-placed redirects could facilitate a marked rise in the SERPs. Within a thorough audit you should also look for the potential existence of infinite loops within any existing redirects. The last thing you need is for visitors to become frustrated by a site that doesn’t function properly. The VERY last thing you want is for Google to effectively become bored of trying to find the page it is supposed to be landing on, and thus penalise you.

Prime candidates for the issues discussed so far are web sites that have undergone a migration in recent times. By this we mean anything from relocation to an alternative server to IP and design modifications. Are you sure that the entire web site displays and behaves the same on the new server? Are any dynamic elements not what they used to be? How aware are you of the performance in its entirety since moving or changing the web site?

In an ideal world we should all undergo performance benchmarking and log web site snapshots. This allows us to compare and contrast at different points in time. If you clean up your web site to remove errors or improve the flow of internal linking, how much has this affected your site in the SERPs? If you moved from one server to another, how much faster or slower do the pages load? The cleaner and faster the code of your web site, the better for visitors and spiders alike.
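A simple load-time snapshot can be sketched as below. The fetch function is injected (in practice it might wrap `urllib.request.urlopen`) so that snapshots taken before and after a clean-up or server move are directly comparable; the URLs are hypothetical:

```python
import time
from typing import Callable

def benchmark(urls: list[str], fetch: Callable[[str], object],
              rounds: int = 3) -> dict[str, float]:
    """Return the average fetch time (in seconds) per URL.

    `fetch` is injected (e.g. a urllib.request.urlopen wrapper)
    so snapshots before and after a change are comparable.
    """
    results = {}
    for url in urls:
        start = time.perf_counter()
        for _ in range(rounds):
            fetch(url)
        results[url] = (time.perf_counter() - start) / rounds
    return results
```

Log the returned dictionary with a date stamp each time, and the before/after comparison becomes a two-line diff rather than guesswork.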

Finally, take a little time to check your sitemaps are up to date and tidy, and then also consider whether your inbound links could be tidied up or improved. If you have changed URLs, page content or even the domain name for the web site, you may be able to contact webmasters to update link anchor text to reflect this. Yes, this is a painstaking affair, but as in many other walks of life, you get out what you put in.
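Checking a sitemap needn't be manual either. This sketch pulls every `<loc>` URL out of a standard XML sitemap (the namespace is the standard sitemaps.org one; the sample URLs are made up), ready to be cross-checked against your live pages and redirects:

```python
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text: str) -> list[str]:
    """Extract every <loc> URL from a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall(".//sm:loc", NS)]

sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc></url>
  <url><loc>http://www.example.com/old-page.html</loc></url>
</urlset>"""

# Any URL listed here that now 404s or redirects is a candidate
# for removal from the sitemap.
urls = sitemap_urls(sample)
```

Run the extracted list through the 404 and redirect checks above and your sitemap stays an honest map of the site, rather than a museum of its past.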