
Backend Maintenance

Don't forget the essentials


The purpose of this article is to show you some of the things you can do behind the scenes to make sure that only the pages you want indexed get indexed, that pages are redirected appropriately, and that 404 errors are handled correctly with the visitor in mind.


Once you have built your website you’ll be very eager to release it. But there is still some work to do behind the scenes to ensure that Google does not see your homepage as two different sites and that your visitors only see the pages you intended them to see. Take a look at the sections below and make sure you have created a .htaccess and a robots.txt file.

Creating a robots.txt file

Every site should have a robots.txt file, even if it is empty. When bots come to crawl your site they look for this file, and if it is missing your server will report an error. The main purpose of the robots.txt file is to tell crawl bots what they are not allowed to crawl, which you do by excluding certain files and directories. To start creating a basic robots.txt file, follow the steps below.

  • Create a file called robots.txt in the root of your website
    At this stage either upload the file as is, or exclude the paths that you don’t want crawled. Use the example code below but change the paths to the folders that you want excluded.

User-Agent: *

# Files/Folders to ignore

Disallow: /admin/
Disallow: /assets/

  • Once you have entered the paths to the folders that you want excluded from being crawled, save the file and upload it.
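If you want to sanity-check your rules before uploading the file, Python’s standard-library urllib.robotparser can evaluate a robots.txt locally. A minimal sketch, using the example rules above:

```python
from urllib.robotparser import RobotFileParser

# The rules from the example robots.txt above
rules = """User-Agent: *
Disallow: /admin/
Disallow: /assets/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A well-behaved crawler may fetch ordinary pages,
# but not anything under the disallowed folders.
print(parser.can_fetch("*", "/index.php"))        # True
print(parser.can_fetch("*", "/admin/login.php"))  # False
```

This is the same check that compliant crawlers perform before fetching a URL, so it is a quick way to catch a typo in a Disallow line before it goes live.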

That’s it! Although you have excluded the files in the robots.txt file, sometimes a page will still get indexed. To prevent this from happening, add the following line to the <head> section of the individual pages that you don’t want indexed.

  • <meta name="robots" content="noindex">
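To verify that the tag actually made it into a page, you can scan the HTML for it. A minimal sketch using Python’s standard html.parser module (the sample page string is just an illustration):

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collects the content of any <meta name="robots"> tags in a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.directives.append(attrs.get("content", ""))

page = '<html><head><meta name="robots" content="noindex"></head><body></body></html>'
finder = RobotsMetaFinder()
finder.feed(page)
print("noindex" in finder.directives)  # True
```

Running this over each page you meant to exclude confirms the tag is present before the next crawl comes round.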

Redirecting the non-www domain to the WWW version in .htaccess

Warning: if you don’t understand the .htaccess file then don’t touch it. One mistake in this file can break your entire website. Carry on at your own risk!

The .htaccess file is very useful for redirecting pages from one location to another. The code below tells Google that the non-www and www versions of your domain are one and the same. If you don’t put these lines in your .htaccess file, any PageRank coming to the main domain will be split between the two, and that is something you don’t want to happen! Change the domain in the example below to match your website and place the lines in your .htaccess file.

RewriteEngine On
RewriteBase /
# Replace www.example.com with your own domain
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [L,R=301]

Please note that once you use the lines above, your local testing server will redirect you to your main website. It would be a good idea to comment out the above on your local machine when working offline.
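The decision the rewrite rule makes can be sketched in Python, which is a handy way to reason about which requests get redirected. This is only an illustration of the logic, and www.example.com and the local host names are placeholders:

```python
CANONICAL_HOST = "www.example.com"            # placeholder; use your own domain
LOCAL_HOSTS = {"localhost", "127.0.0.1"}      # skip the redirect when testing offline

def canonical_url(host, path):
    """Mirror the .htaccess rule: send non-www requests to the www host
    with a 301, and leave the canonical host (and local testing) alone."""
    if host == CANONICAL_HOST or host in LOCAL_HOSTS:
        return None  # no redirect needed
    return (301, f"http://{CANONICAL_HOST}{path}")

print(canonical_url("example.com", "/about.php"))      # (301, 'http://www.example.com/about.php')
print(canonical_url("www.example.com", "/about.php"))  # None
```

Note that the real rule has no localhost exception, which is exactly why the paragraph above suggests commenting it out when working offline.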

Redirecting files in .htaccess

If you want to redirect a page on your website from one folder to another, you can also do this with the .htaccess file. Redirects done this way should pass any PageRank that the original page has on to the new one. Be careful when moving pages from one location to another, especially if you have external back links pointing at them: you will always need to keep the 301 redirect in the .htaccess file, otherwise people will end up at your 404 error page and you will lose any back-linking power to those pages, and possibly rankings.

So to redirect a file in .htaccess, follow the steps below.

  • Find the name of the file you want to relocate and substitute your own paths into the example below (www.example.com is a placeholder for your domain).
  • Redirect 301 /folder1/file.php http://www.example.com/folder2/file.php

Please note that you don’t include the domain name in the first part of the redirect, only in the destination.

If you have some really old content on your site and there are spaces between the words in the filename, you need to use the redirectmatch command, as shown below, with (.) standing in for each space (the destination URL is a placeholder; substitute your own).

  • RedirectMatch 301 /magazine/history(.)of(.)british(.)government.pdf http://www.example.com/magazine/history-of-british-government.pdf
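To see why (.) covers the spaces, you can try the pattern against the old URL with Python’s re module, which uses the same kind of regular expressions as Apache. The filename is the one from the example above:

```python
import re

# Each (.) matches any single character, which covers
# the spaces in the old filename.
pattern = r"/magazine/history(.)of(.)british(.)government\.pdf"
old_url = "/magazine/history of british government.pdf"

match = re.fullmatch(pattern, old_url)
print(match is not None)  # True
print(match.groups())     # (' ', ' ', ' ')
```

Because a dot matches any character, the same pattern would also match hyphens or underscores in those positions, which is usually harmless for a one-off redirect of a legacy file.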

Creating a 404 error page

Every now and again you’ll remove a page from your website and forget to redirect it somewhere. What happens in this situation is that your web server will return a 404 page not found error and the visitor will be given a nasty white page that is not very helpful at all.

So, what you need to do is make your visitor’s experience as pleasant as possible and guide them in the right direction. To start creating your 404 error page, follow the steps below.

  • Create a page called 404.php and base it on the template that you use to construct one of your article pages.
  • Place a nice message in here apologising for the issue and give them some options for where to go next. You may even decide to put a small html sitemap in here to point the visitor in the right direction.
  • When you have created your page, upload it to the root of your website so that your 404 page resides at /404.php.

Now, to let your web server know that you have an error page uploaded to handle 404 page-not-found errors, you need to drop the following line of code into your .htaccess file.

  • ErrorDocument 404 /404.php

Done! Just remember to test your site straight after any .htaccess update to make sure your site is still working. One error in a .htaccess file will bring your website to its knees.

Now, to test that your 404 page works, type a made-up path after your domain name and press enter. Your custom-built 404 page should now appear!

Creating an XML sitemap

Every website needs to have a sitemap. This is so you can submit it to Google Webmaster Tools and let Google know of any pages that it may not have found whilst crawling. Creating an XML sitemap by hand could take a while, but there are plenty of applications on the internet that can do it for you. One such site that will create a sitemap for you is:

The website above will produce a free sitemap for you, but only a maximum of 500 pages will be listed. All you need to do is type in your website’s address and click on the start button. After about a minute, you will be presented with a few sitemap download options. Download the sitemap.xml and urllist.txt files and upload them to the root of your website.
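If you would rather build the file yourself, the sitemap format is simple XML. A minimal sketch in Python with the standard xml.etree library (the page URLs are placeholders for your own):

```python
import xml.etree.ElementTree as ET

# Placeholder page list; substitute your own URLs.
pages = [
    "http://www.example.com/",
    "http://www.example.com/articles/backend-maintenance.php",
]

# The sitemap protocol namespace, required on the root element.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)
for page in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page

xml = ET.tostring(urlset, encoding="unicode")
print(xml)
```

Save the output as sitemap.xml in the root of your website and it can be submitted just like a generated one; each <url> entry may also carry optional tags such as <lastmod>.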
