This is a copy of the technical report we have sent to Seasoned Pioneers. You will see it details each issue and lists the actions necessary at the end of each point. We have categorised the actions as high, medium or low priority – some are essential, while others can wait and be scheduled in later. Typically we send this report out, give Seasoned Pioneers an opportunity to read it, then answer any questions and guide them through the necessary actions.

Technical Search Engine Recommendations Report

Search Engine Optimisation Report
November 2012

1. Spidering & Search Engine Saturation

To appear in a search engine's results, the search engine must first spider your website. This spidering is done by the search engine sending out code that crawls each of your website pages for information; it gathers this information and saves it as a cached copy of the page. This cached page is then stored on the search engine database, also known as a data centre. It is worth knowing that most search engines have more than one data centre, and each data centre may hold its own collection of cached copies of your website pages. This information is merged and, together with a collection of other data, determines where your pages will appear in the search engine for the phrases that people search for.

We can see which pages the different search engines have cached by using a simple site: command. If you type site: followed by your domain (for example, site:yourdomain.com) into a search engine, you can view which pages that search engine has cached.

It is important that your website is included not only on Google but the other top search engines as well. Studies show that Google is the main search engine in the UK and US but Yahoo and BING are also used by a large number of people.

Your website is included in the data centres of the 3 main search engines. Google has cached 2,450 of your pages. Yahoo has cached 325 pages and BING has cached 1,790 pages.


At the moment you have an XML sitemap, which is a list of all the pages on your website, and this has been submitted to Google Webmaster Central. However, we would suggest setting priorities for important pages: pages with high priority values are likely to be indexed faster and crawled more often, and the priorities help search engines decide which URL to show if multiple pages from the same website rank for a search query.
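For illustration, priorities are set with the <priority> element against each URL in the sitemap – the URLs below are hypothetical examples, and the values run from 0.0 to 1.0:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Homepage: highest priority, so it is crawled most often -->
  <url>
    <loc>http://www.example.com/</loc>
    <priority>1.0</priority>
  </url>
  <!-- A less important page: lower priority -->
  <url>
    <loc>http://www.example.com/terms.aspx</loc>
    <priority>0.3</priority>
  </url>
</urlset>
```

Pages with no priority set are treated as a middling 0.5 by default.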


At the moment you have a Google Webmaster Account and have linked your sitemap to this account which is great. This should also be done for BING. If you have already done this then please send us the login details. If not then there is a link at the end of this section to a tutorial on how to do this. Alternatively we can do this for you.

Also as mentioned above, as we start optimising pages we will start altering the priorities of these optimised pages to be higher than non-optimised pages.

Further Advice and Information

For those who have not set up Google and BING Webmaster Tools, please follow these links to our tutorials, which will show you how to do this:

Setting Up BING Webmaster Tools

Setting Up Google Webmaster Tools

Creating an XML Sitemap (less than 500 pages)

This is a high priority task.

2. Robots.txt

Web Robots (also known as Spiders) are programs that traverse the Web automatically. Search engines use them to index web content, spammers use them to scan for email addresses, and they have many other uses.

A robots.txt file can be added to your website to control how these spiders traverse your website. This can be an extremely useful file and can stop spiders from accessing pages that you do not want them to visit/cache. We use this in SEO to stop spiders from visiting pages that will not be useful in search engine listings. A good example of a page that we would not want to include in search engines is a login page or a ‘terms and conditions’ page.

We have looked at your website and you have a robots.txt file, although it is set to allow access to everything on the site. We recommend adding pages to this file that, as mentioned above, spiders do not need to visit – for example, any login pages could be disallowed. Once the XML sitemap has been added, this file can also be amended to show spiders where the sitemap is located, using the simple Sitemap: directive.

Here is what we recommend adding as your robots.txt:

User-agent: *
Allow: /
Disallow: /cart.aspx
Disallow: /search.aspx
Disallow: /Login.aspx
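Once the sitemap is in place, showing spiders its location takes a single extra line in the same file – the URL here is a hypothetical example:

```
Sitemap: http://www.example.com/sitemap.xml
```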


Further Advice and Information

If you have an e-commerce website then you should look through it for pages that will not be useful in search engine listings, or for people to land on when searching. Once you have found these, simply add each one to your robots.txt file with a Disallow line, as we have for the pages above, for example: Disallow: /cart.aspx

If you don’t have an e-commerce website then there might be other pages that you don’t think will be relevant to search engines, such as Customer Login pages.

This is a medium priority task. However, if your robots.txt were using the Disallow: directive to block pages that would be good for search engines, then this would be a high priority task.

3. Server Location

The server location of your website is where your website is actually hosted (stored) in the world.

Search engines sometimes use the hosting location as a signal for where traffic is wanted. For example, if you have a country-neutral domain (.com, .info, .net etc.) and are hosted in Germany, then Google will assume that is where you want rankings, and your rankings will be higher in Germany than in other places.

This may be the wrong assumption: if you are a UK company, you will want your rankings to be as high as possible in the UK. In that case you will need to override the hosting signal by informing the search engines (via their webmaster tools) that the UK is where you want your rankings. There is a free lesson on our blog which goes into this targeting in more detail: Geo Targeting.

If the site has a country specific domain – as it does here – then the search engines will take that as the main signal rather than the hosting location, so no geo-targeting is necessary.
We still check this because the hosting location can also have an impact on site access speed, which may be important for high transaction rate e-commerce sites, for example.
This site is hosted in the UK and has a country specific domain, so no further action is required.


If you wish to check your own hosting location, there are a few online tools to help.
We recommend checking with more than one tool to ensure they show the same results and there is no confusion.


There are no actions at the moment if you want to target UK traffic. However, if you want international traffic then please read the notes below.

Further Advice and Information

If you have a country specific domain then you are targeting traffic from that country and should ideally have your website hosted in that country if possible. However, if you have a non-country specific domain name such as a .com, .net etc. and you want to target traffic from a specific country, you should use the geo-targeting tools in Google Webmaster Tools. Also, if you have a non-country specific domain and don't want country specific traffic, then you should set your geo-targeting to Unlisted.

This is a medium priority task. However, if you are targeting the UK but want to target the world, or vice versa, the priority of this task may change.

4. Text to Code Ratio

Search engine spiders mainly focus on the text of a website page, so it is important for each page on your website to include a reasonable amount of on-page text with as little code as possible. Although there have been advances in search engines and they can read certain parts of Flash files and JavaScript, it is better not to have excessive amounts of code on your pages; where possible, code should be stored in external files. We can see that you have done this in certain areas, but we recommend that, where possible, more code is made external to improve the text to code ratio.
Pages such as the homepage and category pages should have a minimum of 300 words of text, whereas product pages can have less because they usually target less competitive phrases, although this isn't always the case. This 300 word figure is just a guideline – really, the more text the better, depending on how competitive the keyphrases you wish to capture are. More on this later.
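As a rough illustration of how such a ratio can be measured, here is a minimal Python sketch that strips the markup from a page and compares the remaining text to the total page size (the HTML string is a made-up example):

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects the text found between the tags of an HTML document."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)

def text_to_code_ratio(html):
    """Return text characters as a fraction of the total page size."""
    parser = TextExtractor()
    parser.feed(html)
    text = "".join(parser.chunks).strip()
    return len(text) / len(html)

page = "<html><body><h1>Spices</h1><p>Hand-blended seasonings.</p></body></html>"
print(round(text_to_code_ratio(page), 2))
```

Real tools weigh many more factors, but a low fraction here is a quick hint that a page is mostly markup rather than readable text.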


When we decide on the phrases that we should use for the different pages of your website we can recommend how much text should be on each page and how we think this text should be written. This will be included in the SEO Blueprint that we will be sending you soon.

Further Advice and Information

You should ensure that there is room for text on all of the pages of your website. This text should be visible, and there should be no attempts to hide it. There will be more information on SEO copywriting and text to code ratio in the SEO Blueprint, which will be published 4th December.

5. Server Side Issues

If there is a server side issue with your website then this can cause a lot of problems in search engines. There is a wide range of possible server side issues, from 404 errors, which you have probably seen, to incorrectly set up 302 or 301 redirects.


Your homepage is returning an HTTP/1.1 200 OK response. This is good, because it shows that your website is running well.


As mentioned above, this is running well and there are no actions necessary here.

Further Advice and Information

A result of anything other than a 200 code is not ideal. One acceptable code to see is a 301, which tells the search engines that the page has permanently moved. There are a lot of codes, but here are a few common ones:

500 – This normally means there is a problem on the server, and you should contact your web developer or hosting company to fix it.

4xx – There are a few of these, such as 404, which you have probably all heard of, and 403. These are big problems and mean the page cannot be accessed by anyone – human or spider.

302 – This code tells search engines that the content has temporarily moved. That is okay if you are rebuilding a website, but a lot of web development companies accidentally use it where a 301 should be used. Normally a 301 should be used instead.

As mentioned above, there are a lot of HTTP codes, so we have only run through the key ones. If the code isn't a 200 or 301 then there is normally a problem.
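The rules of thumb above can be summarised in a small helper function – a simplified sketch in Python, not a complete treatment of every HTTP code:

```python
def diagnose(status):
    """Map an HTTP status code to the rough advice given above."""
    if status == 200:
        return "OK - the page is being served normally"
    if status == 301:
        return "Permanent redirect - fine, tells search engines the page has moved"
    if status == 302:
        return "Temporary redirect - usually a 301 should be used instead"
    if 400 <= status <= 499:
        return "Client error (e.g. 403/404) - the page cannot be accessed"
    if 500 <= status <= 599:
        return "Server error - contact your developer or hosting company"
    return "Unusual code - investigate"

print(diagnose(404))
```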

There are a few tools online that will check these response codes for you.

This is a high priority task if action is required.

6. Search Engine Friendly URLs

At the moment your category and product URLs aren't search engine friendly, and we recommend changing this. For example, a URL such as /seasoning_detail.aspx?ID=156 should be rewritten to something more descriptive, which will improve search engine listings and relevance for visitors. When you make these changes, please ensure that 301 redirects are in place.


If possible, all of these URLs should be re-written to be search engine friendly, and each current URL should 301 redirect to its new URL to ensure the search engines know that the page has moved. The best way to do this is to add the following code to the website's .htaccess file:
redirect 301 /seasoning_detail.aspx?ID=156
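One caveat worth knowing: Apache's redirect (mod_alias) directive matches only the URL path, not the query string, so a dynamic URL like the one above usually has to be redirected with mod_rewrite instead. A sketch, where /new-spice-page/ is a made-up placeholder destination:

```apacheconf
RewriteEngine on
# Match the old dynamic URL by its query string...
RewriteCond %{QUERY_STRING} ^ID=156$
# ...and 301 it to the new search engine friendly URL.
# The trailing ? drops the old query string from the destination.
RewriteRule ^seasoning_detail\.aspx$ /new-spice-page/? [R=301,L]
```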

Further Advice and Information

If your site contains URLs that are not search engine friendly and include parameters such as the ID=156 above, then you should look into altering these to include meaningful names – and don't forget to 301 redirect each current URL to its new URL.

This is a medium priority task.

6.1 Spaces in URLs

Some of your URLs appear to have spaces within them, which is not good for search engines, so we recommend altering these to use hyphens instead – 301 redirects can then be set up from the spaced pages to the new pages.

In such URLs each space appears as %20 (%20 is the URL encoding for a space), and each %20 should be replaced by a hyphen. This should be fixed while making the URLs search engine friendly, as mentioned above.


This will probably be built into your system, so we or your developer will have to add some code to replace spaces with hyphens. We will then have to 301 redirect each old page to its new page, and when the old URL contains spaces the 301 rule changes slightly:

redirect 301 “/recipe_detail.aspx?RecipeID=60&Name=Lamb Khouzi”

Notice above that the current URL has quotation marks around it, and the %20 code is replaced with a literal space.

Further Advice and Information

If you have spaces in your filenames then these files should be renamed, with redirects to filenames that have hyphens in place of the spaces. This can become complicated in e-commerce systems, because you have to add coding rules, but if your website is static (does not rely on a database) then this can be a simple task of renaming files and adding redirects.

Note: This also applies to other files such as .pdfs.
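The renaming rule itself is straightforward – for illustration, a small Python sketch that converts a spaced (or %20-encoded) name into its hyphenated form:

```python
import re

def hyphenate(filename):
    """Replace spaces (or their %20 URL encoding) with single hyphens."""
    filename = filename.replace("%20", " ")
    return re.sub(r"\s+", "-", filename.strip())

print(hyphenate("Lamb Khouzi.pdf"))        # Lamb-Khouzi.pdf
print(hyphenate("recipe%20detail.aspx"))   # recipe-detail.aspx
```

In a dynamic system the same rule would be applied wherever the URL or filename is generated, rather than by renaming files one at a time.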

This is a high priority task.

7. Crawl Errors

Crawl errors are errors that spiders and human visitors may experience on your website. Normally these are 404 errors caused by incorrect links to your pages. Some of these may be internal links (links on your own website to other pages of your website) and others may be incoming links (links from other websites).

If there are crawl errors from other websites to your website all we can do is contact that webmaster and ask them if they can either amend or remove the links.

If these are internal links then this is something that is normally easy to fix because we can amend or remove these links ourselves.

There are a number of ways to find these errors but the best way is to use the Google Webmaster Tools. According to Google Webmaster Tools your website has the following crawl errors:

Server errors: You have 6 ‘500’ errors. These occur when a search engine can’t access the content of a URL because the server had an internal error while trying to process the request. These errors tend to be with the server itself, not with the request.

Access Denied Errors: Your website has one error of this type. It usually occurs because your server either requires a login to access the page, or is blocking search engines from accessing your site.

Not Found Errors: Your website has 209 pages that cannot be found by spiders. To fix these, please visit this tab in the Google Webmaster Tools account; Google will show you which pages link to them in the Linked From column. Please either remove the links or amend them so that they point to a page that has no errors.

To view these errors, please log in to your Google Webmaster Tools > click on your website > then click on Crawl Errors. Once you start correcting these errors you can mark them as fixed in Google Webmaster Tools.


Here we need to work through these errors one by one and either confirm they are no longer there or, if they are, fix them – normally by either fixing the links or removing them. Once an error is fixed, or we find that it no longer exists, we need to click the Mark As Fixed button in Google Webmaster Tools.

Further Advice and Information

If you find that you have errors in Google webmaster tools you should follow the above recommendations. This should be re-visited at least every 2 weeks to ensure your website is running smoothly.

This is a high priority task.

8. W3C Compliance

W3C is the World Wide Web Consortium, an international community where member organizations, a full-time staff, and the public work together to develop Web standards. Its mission is to develop protocols and guidelines that ensure long-term growth for the Web.

Although making your website W3C compliant is not an SEO priority we recommend making your website W3C compliant if possible.

At the moment your website has validation errors. You can run your pages through the W3C markup validator to find and fix these errors.



These errors are easy to fix, and a lot of them will be resolved once section 6 of this report is complete.

Further Advice and Information

Most websites have errors, and ideally they should be fixed. However, if you are using off the shelf software such as WordPress, Drupal or Joomla then chances are you will not be able to fix these errors, because they are built into the software.

This is a low priority task.

9. Page Speed

Page speed is the amount of time it takes your website pages to load in browsers. In the past this was measured in seconds, but now, thanks to Google, there is a tool that not only states whether a page takes too long to load but also tells us why. This tool gives each page a score out of 100, and we recommend that every page on your website scores at least 80. At the moment your homepage has a score of 82, which shows that it currently follows best practices, so no work needs to be done here.



Your website is running well, which is good. Once all of the high priority tasks have been completed you could look into improving the speed further, but 82 is a very good score.

Further Advice and Information

If your website is scoring below 80, or if you notice that pages take more than a couple of seconds to load, then you should look into improving your website speed. Google's page speed tool, mentioned above, will show you your score and give you recommendations on how to improve it.

This is a low priority task if your speed is above 80, medium if it’s 70-80 and high if it’s below 70.

10. Domain Names & Affiliates

If you own any other domains that redirect or point to this domain, we recommend ensuring that 301 server side redirects are set up (see 10.1 for details on how to do this). However, you may want to consult with us before you do this (or give us the details and we can do it for you). Also, if you have any affiliate websites that sell or promote your services, please ensure that they do not use the same content. Other domain names, affiliates and basically any link on any other website can cause duplicate content issues. We also recommend not duplicating content on other websites when link building.

Note: There is information on how to set up domain 301 redirects in the Appendix.


If you own any other domain names, you should let us know what they are and why you own them, and then we can discuss what to do with them.

Further Advice and Information

If you own multiple domain names then you should ensure that you are not duplicating content across them. Where possible you should 301 redirect these domains to your main domain. However, if a domain has unique content that is relevant to the website we are optimising, then you should link from it to your main site using relevant anchor text.

This is a low priority task unless you have duplicate content, then this is a high priority task.

11. Multiple Homepage Links

All pages on a website should have only one URL. This helps reduce duplicated content, which is not good for search engines, and helps ensure that each page retains as much ‘search engine equity’ as possible.

The homepage can be a special case because, in its own right, it is technically a duplicated page: every website's homepage can be accessed through at least 2 URLs. On your website your homepage can be accessed through both the root URL (/) and /default.aspx.

For the reason above it is important to only link to one of these URLs, and this should be the root URL. At the moment there are links on your website to /default.aspx – please change these links to point to the root URL.


All links to your homepage should be changed to point to the root URL rather than the default.aspx page. This is a simple task of replacing all relative links such as <a href="default.aspx">Home</a> with:
<a href="/">Home</a>

Further Advice and Information

If you notice that, when you browse through your website and click links to your homepage, the URL in the address bar doesn't just show your domain name, then you should look at following the above action. The URL could be anything, but the most common are default filenames such as default.aspx or index.html.

This is a high priority task.

12. DMOZ Listing

DMOZ is a vertical search engine – basically, a Web directory. It is currently the most important Web directory on the Web because it is the one that Google uses, so it is important to try to have listings in the directory if possible. After searching, we have found that you have listings in three areas, which is good news.

1. Home: Cooking: Herbs and Spices
2. Regional: Europe: United Kingdom: England: Merseyside: Liverpool: Business and Economy: Shopping
3. Shopping: Food: Seasonings: Spices


There is no action required here.

Further Advice and Information

If you don’t have a listing on DMOZ then you should try to get one. This is done by visiting the directory and browsing through to the category where you would expect your website to be. Click on the Suggest URL link in the top right menu, then fill in the form truthfully to add your website.

This is a low priority task.

13. Link Building

The number and quality of back-links into a website is a crucial factor in determining ranking positions, particularly with Google, but increasingly with all other major search engines. Properly created, these links will also drive direct traffic from other websites.

As time goes by your inbound links will grow naturally as the site becomes discovered and people link to your unique and interesting content. However, there are things which can be done to pro-actively increase inbound links from other relevant websites.

The Google PageRank indicator is a reflection of the number and quality of the back-links into your site. This is not the only factor, but it is still quite an important one. Right now your Google PageRank is 4 (PageRank is a number from 0 to 10 that Google assigns to sites – the higher the number, the better you will tend to rank when searches are carried out).

This PageRank will improve as the back-links increase. (Then, all other things being equal, you will rank better than competitors.)

This could involve contacting suppliers, customers etc… for links. It is recommended that you work with us before moving forward with this task. In addition we will conduct some background activity over a period of time to systematically add other links.

According to Google you currently have 3,900 links from 638 domains.

You can also visit Open Site Explorer, type in your website and check how many links you have in its database, but Google is the best place to look for your own links (we will use Open Site Explorer in the future to look at your competitors' links). At the moment Open Site Explorer shows 912 links from 155 different domains.



At the moment there are no actions here, but there will be once the on-site SEO is in place. Once the on-page SEO work is done we will run a ranking check on the most important keyphrases – if the rankings aren't satisfactory for some of these phrases, additional link building work may be required.

Further Advice and Information

You should check your links through both of the above tools before you start making changes and improving your website, so you have some initial data to compare against as you make progress.

This is a low priority task to start with, but it may become increasingly important once the on-site SEO is in place.


10.1. Setting Up A 301 Redirect

When creating a 301 redirect from one domain name to another, you need access to the server of the domain that you want to redirect away from – so you need FTP access for the domain you are redirecting.

1. To create a .htaccess file, open Notepad, then name and save the file as .htaccess (there is no extension).

2. If you already have a .htaccess file on your server, download it to your desktop for editing.

3. Place this code in your .htaccess file:

Options +FollowSymLinks
RewriteEngine on
# Replace www.example.com with the domain you are redirecting to
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]


4. If the .htaccess file already has lines of code in it, skip a line, then add the above code.

5. Save the .htaccess file

6. Upload this file to the root folder of your server.

7. Test it by typing the old address into your browser. You should be immediately taken to the new location.

Please find all other relevant posts here:-

Post 1: SEO exposed and Interview with Seasoned Pioneers

Post 2: SEO Process and Objectives

Post 3: Technical Analysis and Audit 

Post 4: Keyphrase and Traffic Analysis

Post 5: SEO Blueprint report

Post 6: Implementation programme

Post 7: SEO Results