How to technically optimise your website SEO

To appear on search engine results pages (SERPs), you cannot rely on keywords and meta tags alone. There is a much more technical side to search engine optimisation, and it can't be ignored.

A quick note before you start though: set up analytics! After all, there is little point in implementing SEO on your website if you have no way to track and measure its effectiveness.

[We recommend Google Analytics, but you could also use Bing Webmaster Tools or any other analytics tool. Once you have set up your analytics tools correctly, you are ready to follow our tips to ensure the technical elements of your website are optimised for search engines.]

5 technical tips to achieve great SEO

1.  Avoid cloaking

Cloaking is when you show one version of a page to users but a different version to search engines. Search engines want to see what the user will see, and anything hidden is often treated as suspicious. If you are cloaking content on your website, you are likely to face penalties.

If you are unsure if your website is using cloaking, there are lots of free SEO cloaking checker tools. We would recommend you check your URL as part of your routine website reviews to ensure no cloaking is occurring.
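If you'd rather script a quick check yourself, a minimal sketch is to fetch the same page twice, once identifying as a search engine bot and once as a normal browser, and compare what comes back. The URL and the similarity threshold below are illustrative assumptions, not part of any specific tool:

```python
import urllib.request

# Hypothetical example URL -- replace with a page from your own site.
PAGE_URL = "https://example.com/"

# Two User-Agent strings: one mimicking Googlebot, one mimicking a browser.
BOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

def fetch(url, user_agent):
    """Fetch a page, identifying ourselves with the given User-Agent."""
    request = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(request) as response:
        return response.read().decode("utf-8", errors="replace")

def looks_cloaked(bot_html, browser_html, threshold=0.9):
    """Rough heuristic: flag the page if the two versions share too few lines."""
    bot_lines = set(bot_html.splitlines())
    browser_lines = set(browser_html.splitlines())
    if not bot_lines and not browser_lines:
        return False
    shared = len(bot_lines & browser_lines)
    total = len(bot_lines | browser_lines)
    return shared / total < threshold

if __name__ == "__main__":
    bot_version = fetch(PAGE_URL, BOT_UA)
    browser_version = fetch(PAGE_URL, BROWSER_UA)
    print("Possible cloaking!" if looks_cloaked(bot_version, browser_version)
          else "Pages match.")
```

Note that dynamic pages legitimately vary between requests (timestamps, session IDs), so treat a mismatch as a prompt to investigate, not proof of cloaking.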

2. Check your robots.txt

The first thing a search engine looks for when arriving at your site is a robots.txt file in your root directory. This file enables you to "disallow" the crawling of certain pages, such as:

• Private directories you don’t want the public to find

• Temporary or auto-generated pages

• Advertising

• Under-construction pages

All websites should have a robots.txt file, even if it is left blank. You must be careful what you enter into this file though. If you enter the wrong details, you could end up accidentally telling search engines not to index your entire site! That is useful while you are still developing your website, but once live, make sure you check your robots.txt file.
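As an illustration, a robots.txt along these lines would block the kinds of pages listed above. The directory names are placeholders assumed for the example:

```
# Hypothetical robots.txt -- directory names are placeholders.
User-agent: *
Disallow: /private/
Disallow: /tmp/
Disallow: /ads/
Disallow: /under-construction/

# Careful: "Disallow: /" on its own blocks your ENTIRE site.
```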

3. Have good site architecture

A well-structured, thought-out website will not only rank better in search engine results, it will also provide a better UX (user experience) to visitors. To help search engines understand the structure of your site, you should use sitemaps.

There are two types of sitemaps, HTML and XML. The most vital type is an XML sitemap. This will act as a starting point for search engine spiders to understand the structure of your website content.

It is worth bearing in mind that an XML sitemap can only hold up to 50,000 URLs, and the clearer you can make things for search engines, the better. Split your XML sitemaps into separate ones for each type of content: video, images, articles, etc.

DON’T FORGET: You should also add your sitemap locations to your robots.txt file. This tells search engine spiders where to check!
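For illustration, a bare-bones XML sitemap looks like this; the domain and URL are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/articles/my-first-post</loc>
    <lastmod>2021-06-01</lastmod>
  </url>
</urlset>
```

The matching robots.txt entry would then be one `Sitemap:` line per file, e.g. `Sitemap: https://example.com/sitemap-articles.xml`.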

4. Keep your authority with redirects

When you make a page redundant because you have created a new version or a page covering a slightly different topic, you can use a redirect to preserve some of the previous link's authority. It also means that when users visit an old link, they won't land on a 404 error page but on the new page. Win-win!

There are two types of redirect codes:

301: a permanent redirect

Use this code to tell search engines that a page has permanently moved to a new URL. This will transfer most of the old page’s authority to the new one.

302: a temporary redirect

Use this code to tell search engines that the redirect is only temporary. This won't transfer the authority of the original page, and if the 302 redirect stays in place too long, you will usually lose some of your website traffic.
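On an Apache server, for example, both redirect types can be declared with one line each in your .htaccess file. The paths here are hypothetical:

```apache
# 301: the old guide has permanently moved to a new URL.
Redirect 301 /old-guide /new-guide

# 302: the offer page is only temporarily standing in.
Redirect 302 /offer /summer-sale
```

Other servers (nginx, IIS) and most CMS platforms offer equivalent settings.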

5. Eliminate 404 and crawl errors

When a search engine indexes your website, it sends out bots called spiders that crawl your website looking at its content. Problems occur when the spiders crawl your website and find 404 errors. You can check for crawl errors using Google Search Console or a similar tool.

To eliminate 404 and crawl errors, ensure you update your redirects every time you change page URLs on your website.

You can also reduce 404 errors from incoming traffic by identifying any broken backlinks and ensuring these links are redirected to the new, relevant page URL.
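If you want to audit a list of your own URLs between full crawls, a minimal sketch in Python (standard library only) is to request each one and flag anything answering with a 4xx or 5xx status. The URLs below are placeholders:

```python
import urllib.error
import urllib.request

def url_status(url):
    """Return the HTTP status code for a URL (error responses included)."""
    try:
        request = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(request) as response:
            return response.status
    except urllib.error.HTTPError as err:
        # 404s and other error statuses arrive as exceptions.
        return err.code

def find_broken(urls, status_fn=url_status):
    """Return the URLs that respond with a 4xx or 5xx status."""
    return [url for url in urls if status_fn(url) >= 400]

if __name__ == "__main__":
    # Hypothetical pages -- replace with the URLs you want to audit.
    pages = ["https://example.com/", "https://example.com/old-page"]
    for url in find_broken(pages):
        print("Broken:", url)
```

For anything beyond a handful of URLs, a dedicated crawler or your search console's coverage report will be faster, but a script like this is handy for spot checks.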

When someone does land on a 404 page, make sure it is a custom one. A custom 404 page makes it less likely a visitor will leave immediately, and instead gives you the opportunity to invite them to explore your site.

Now you have the technical aspects of SEO nailed, next week we will be sharing industry knowledge on optimising your content for search engines.

If you need help to implement the technical elements of SEO, then we would be more than happy to help. Give us a call on 01536 560 435.