
Wednesday, 28 January 2015

NOFOLLOW DOES NOT MEAN DO NOT CRAWL!

By Strictly-Software

I have heard it said by "SEO Experts" and other people that to prevent excess crawling of a site you can add rel="nofollow" to your links and this will stop GoogleBOT from crawling those links.

Whilst on the surface this does seem to make logical sense (the attribute value does say "nofollow", not "follow if you want"), it isn't true. BOTS can ignore the nofollow and still crawl the links if they want to.

The nofollow attribute value is not meant for blocking access to pages or preventing your content from being indexed or viewed by search engines. Instead, the nofollow attribute is used to stop crawlers like GoogleBOT from letting any "link juice" from the linking page leak out to the pages it links to.
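For example, a link marked up like this (the URL is just a placeholder) passes no "link juice" to the page it points to, yet a crawler may still choose to follow it and index that page:

<a href="http://www.example.com/some-page" rel="nofollow" title="An example page">An example page</a>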

As you should know Google still uses PageRank, even though it is far less used than in years gone by. In the old days it was their prime way of calculating where a page was displayed in their index and how one page was related to another in terms of site authority.

The original formula used to calculate Page Rank is below.

PR(A) = (1-d) + d (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn))


An explanation for it can be found here: Page Rank Algorithm Explained. In the formula, PR(A) is the Page Rank of page A, T1 to Tn are the pages that link to A, C(T) is the number of outbound links on page T and d is a damping factor, usually set to 0.85.
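As a rough, purely hypothetical illustration of the figures: if page A is linked to from a single page T1 with PR(T1) = 6 that carries 3 outbound links, then with the usual damping factor d = 0.85 you get:

PR(A) = (1 - 0.85) + 0.85 * (6/3) = 0.15 + 1.7 = 1.85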

The perfect but totally unrealistic scenario is to have another site with a very high Page Rank value, e.g. 10 (the scale goes from 0 to 10), and to have that site's high PR page (e.g. their homepage) contain a single link that goes to your site - without a nofollow value in the rel attribute of the link.

This tells the search engine, e.g. GoogleBOT, that this high ranking site THINKS your site is more important than itself in the great scheme of the World Wide Web.

Think of a pyramid with your site/page ideally at the top with lots of high PR pages and sites all pointing to it, passing their link juice upwards to your site. If your page then doesn't have any links on it at all then no link juice you have obtained from inbound links will be "leaked out".

The more links there are on a page the less PR value is given to each link and the less "worthy" your site becomes in theory.

So it should be noted that the nofollow attribute value isn't meant for blocking access to content or preventing content from being indexed by GoogleBOT and other search engines.



Instead, the nofollow attribute is used by sites to stop search engine BOTS like GoogleBOT from passing "authority" and PR value to the page being linked to.

Therefore GoogleBOT and others could still crawl any link with rel="nofollow" on it.

It just means no Page Rank value is passed to the page being linked to.

Thursday, 12 September 2013

SEO - Search Engine Optimization

My two cents worth about Search Engine Optimisation - SEO

Originally Posted - 2009
UPDATED - 12th Sep 2013

SEO is big bucks at the moment and it seems to be one of those areas of the web full of snake oil salesmen and "SEO experts" who will promise the number 1 position on Google, Bing and Yahoo for $$$ per month.

It is one of those areas that I didn't really pay much attention to when I started web developing, mainly because I was not the person paying for the site and relying on leads coming from the web. However, as I have worked on more and more sites over the years it has become blatantly apparent to me that SEO comes in two forms from a development or sales point of view.

There are the forms of SEO which are basically good web development practice and will come about naturally from having a good site structure, making the site usable and readable, and helping in terms of accessibility.

Then there are the forms which people try to bolt onto a site afterwards, either as an afterthought or because an SEO expert has charged the site owner lots of money, promised the impossible, and wants to use some dubious link-sharing schemes that are believed to work.


Cover the SEO Basics when developing the site


It's a lot harder to just "add some Search Engine Optimization" in once a site has been developed, especially if you are developing generic systems that have to work for numerous clients.

I am not an SEO expert and I don't claim to be, otherwise I would be charging you lots of money for this advice and making promises that are impossible to keep. However, following these basic tips will only help your site's SEO.

Make sure all links have title attributes on them and contain worthy content rather than words like "click here". The content within the anchor tags matters when those bots come a crawling in the dead of night.

You should also make sure all images have ALT attributes on them as well as titles, and make sure the content of the two differs. As far as I know Googlebot will rate ALT content higher than title content but it cannot hurt to have both.
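As a rough example (the URL and wording are just placeholders), a descriptive link and image might be marked up like this rather than with "click here":

<a href="http://www.example.com/horse-racing-tips" title="Free daily horse racing tips">Free horse racing tips</a>
<img src="/images/racing.jpg" alt="Horse racing tips and form guides" title="Daily horse racing tips from our experts" />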

Make sure you make use of header tags to mark out the important sections of your site and try to use descriptive wording rather than "Section 1" etc.

Also, as I'm sure you have noticed if you have read my blogs before, I wrap keywords and keyword-rich sentences in strong tags.

I know that Google will also rank emphasised content or content marked as strong over normal content. So as well as helping those readers who skim read to view just the important parts, it also tells Google which words are important in my article.

Write decent content and don't just fill up your pages with visible or non-visible spammy keywords.

In the old days keyword density mattered when ranking content. This was calculated by removing all the noise words and other guff (CSS, JavaScript etc) and then working out what percentage of the overall page content was made up of relevant keywords.

Nowadays the bots are a lot cleverer and will penalise content that does this as it looks like spam.

Also it's good for your users to have readable content, and you shouldn't remove words between keywords as it makes the text harder to read and you will lose out on the longer 3, 4, 5 word indexable search terms (called long-tail in the SEO world).

Saying this though, it's always good to remove filler from your pages, for example by putting your CSS and JavaScript code into external files when possible and removing large commented out sections of HTML.

You should also aim to put your most important content at the top of the page so it's the first thing crawled.

Try moving main menus and other content that can be positioned by CSS to the bottom of the file. This is so that social media sites and other BOTS that take the "first image" on an article and use it in their own social snippets don't accidentally use an advertiser's banner instead of your logo or main article picture.

The same thing goes for links. If you have important links but they are in the footer such as links to site-indexes then try getting them higher up the HTML source.

I have seen Google recommend that 100 links per page is the maximum to have. Therefore having a homepage with your most important links at the bottom of the HTML source but 200+ links above them, e.g. links to searches, even if not all of them are visible, can be harmful.

If you are using a tabbed interface to switch between tabs of links then the links will still be in the source code, but if they are loaded in by JavaScript on demand then that's no good at all, as a lot of crawlers don't run JavaScript.

Items such as ISAPI URL rewriting are very good for SEO, plus they give nicer URLs for sites to display.

For example, using a site I have just worked on, http://jobs.professionalpassport.com/companies/perfect-placement-uk-ltd is a much nicer URL to view a particular company profile than the underlying real URL, which could also be accessed as http://jobs.professionalpassport.com/jobboard/cands/compview.asp?c=6101

If you can access that page by both links and you don't want to be penalised for duplicate content then you should specify which link you want to be indexed by specifying your canonical link. You should also use your Robots.txt file to specify that the non re-written URLs are not to be indexed e.g.

Disallow: /jobboard/cands/compview.asp
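The canonical hint itself is just a LINK tag in the HEAD of the page pointing at the version you want indexed, e.g. using the example URL above:

<link rel="canonical" href="http://jobs.professionalpassport.com/companies/perfect-placement-uk-ltd" />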

META tags such as the keywords tag are not considered as important as they once were, and having good keyword-rich content in the main section of the page is the way to go rather than filling up that META tag with hundreds of keywords.

The META Description will still be used to help describe your page on search results pages, and the page TITLE tag is very important for describing your page's content to the user and BOT.

However some people are still living in the 90's and seem to think that stuffing their META Keywords with spam is the ultimate SEO trick when in reality that tag is probably ignored by most crawlers nowadays.
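The tags that are still worth spending time on therefore look something like this (the wording here is only an illustration):

<title>Perfect Placement UK Ltd - Company Profile and Latest Jobs</title>
<meta name="description" content="View the company profile and latest vacancies from Perfect Placement UK Ltd." />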

Set up a Sitemap straight away containing your site's pages ranked by their importance, how often they change, last modified date etc. The sooner you do this the quicker your site will be getting indexed and gaining site authority. It doesn't matter if it's not 100% ready yet; the sooner it's in the indexes the better.

Whilst you can submit this through Google's Webmaster Tools or Microsoft's Bing equivalent, you don't actually need to use their tools; as long as you use a Sitemap directive in your robots.txt file BOTS will find it e.g.

Sitemap: http://www.strictly-software.com/sitemap_110908.xml
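The sitemap file itself is plain XML; a minimal sketch with a single URL entry (the values are just placeholders) looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
 <url>
  <loc>http://www.strictly-software.com/</loc>
  <lastmod>2013-09-12</lastmod>
  <changefreq>weekly</changefreq>
  <priority>1.0</priority>
 </url>
</urlset>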

You can also use tools such as the wonderful SEOBook Toolbar, an add-on for Firefox which combines numerous free online SEO tools into one helpful toolbar. It lets you see your Page Rank and compare your site to competitors on various keywords across the major search engines.

Also, using a text browser such as Lynx to see how your site would look to a crawler such as Yahoo or Google is a good trick, as it will skip all the styling and JavaScript.
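If you have Lynx installed, dumping a page as plain text gives you a quick crawler's-eye view of it, e.g.:

lynx -dump http://www.strictly-software.com/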

There are many other good practices which are basic "musts" in this day and age, and the major search engines are moving more and more towards social media when it comes to indexing sites and seeing how popular they are.

You should set up a Twitter account and make sure each article is published to it as well as engaging with your followers.

A Facebook Fan page is also a good method of getting people to view snippets of your content and then find your site through the world's most popular social media website.

Making your website friendly for people viewing it on tablets or smart phones is also good advice as more and more people are using these devices to view Internet content.

The Other form of SEO, Black Magic Optimization

The other form of Search engine optimization is what I would call "black magic SEO" and it comes in the form of SEO specialists that will charge you lots of money and make impossible claims about getting you to the number one spot in Google for your major keywords and so on.

The problem with SEO is that no-one knows exactly how Google and the others calculate their rankings so no-one can promise anything regarding search engine positioning.

There is Google's Page Rank, which is used in relation to other forms of analysis, and it basically means that if a site with a high PR links to your site and your site does not link back to it, then it tells Google that your site has higher site authority than the linking site.

If your site only links out to other sites but doesn't have any links coming in from high page ranked relevant sites then you are unlikely to get a high page rank yourself. This is just one of the ways which Google will use to determine how high to place you in the rankings when a search is carried out.

Having lots of links coming in from sites that have nothing whatsoever to do with your site may help drive traffic but will probably not help your PR. Therefore engaging in all these link exchange systems is probably worth jack nipple, as unless the content that links to your site is relevant or related in some way it's just seen as a link for a link's sake, i.e. spam.

Some "SEO specialists" promote special schemes which have automated 3 way linking between sites enrolled on the scheme.

They know that just having two unrelated sites link to each other basically negates the Page Rank benefit, so they try to hide this by having your site A link to site B, which in turn links to site C, which then links back to you.

The problem is obviously getting relevant sites linking to you rather than every Tom, Dick and Harry.

Also, advertising on other sites purely to get indexed links from that site to yours to increase PR may not work, because most of the large advert management systems output banner adverts using JavaScript. Although the advert will appear on the site and drive traffic when people click it, you will not get the benefit of an indexed link, as when the crawlers come to index the page containing the advert the banner image and any link to your site won't be there.

Anyone who claims that they can get you to the top spot in Google is someone to avoid!

The fact is that Google and the others are constantly changing the way they rank and what they penalise for so something that may seem dubious that works currently could actually harm you down the line.

For example, in the old days people would put hidden links on white backgrounds or position them out of sight so that the crawlers would hit them but the users wouldn't see them. This worked for a while until Google and the others cracked down and penalised for it.

Putting any form of content up specifically for a crawler is seen as dubious and you will be penalised for doing it.

Google and Bing want to crawl the content that a normal user would see, and they have actually been known to mask their own identity (IP and User-Agent) when crawling your site so that they can check whether this is the case or not.

My advice would be to stick to the basics, don't pay anybody who makes any kind of promise about result ranking and avoid like the plague any scheme that is "unbeatable" and promises unrivalled PR within only a month or two.

Monday, 20 May 2013

Some clever code for SEO that won't annoy your users

Highlighting words for SEO, turning them off for the users

You might notice in the right side bar I have two options under the settings tab "Un-Bold" and "Re-Bold".

If you try them out you will see what the options do: basically un-bolding any STRONG or BOLD tags, or re-bolding them again.

The reason is simple. Bolding important words either in STRONG or BOLD tags is good for SEO. Having content in H1 - H6 tags is even better and so are links - especially if they go to relevant and related content.

I don't claim to be the first person to start bolding important keywords and long-tail sentences for SEO purposes, but I was one of the first to catch on to how great the benefits for SEO were.

Too much bolding and it looks like spam; too little and you might not get much benefit. You also have 2 areas to cater for.

1. The search engine crawlers (Googlebot, BingBot, Yandex etc etc) who see the original source code of the page. When they do they will just see words wrapped in normal STRONG and BOLD tags (view the source for yourself).

2. However, if a user doesn't like the format and mix of bolded and non-bolded wording then they can use the settings to add a class to all STRONG and BOLD tags that basically takes away the font-weight of the element. You would only see this in the generated source code. Running the "Re-Bold" function after the first "Un-Bold" will just remove the class that took away the font-weight in the first place, returning the element to its normal bolded state.

Therefore the code is aimed for both BOTS and users and you can see a simple test page on my main site here: example to unbold and rebold with jQuery.

I have used jQuery for this only because it was simple to write however it wouldn't be too hard to rewrite with plain old JavaScript.

Another extension I have lost since updating this blog format, but which would be easy to add, is the use of a JavaScript created cookie to store the user's last preference so that they don't have to keep clicking the "un-bold" option when they visit the site.
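A rough sketch of that idea (the cookie name and function names are just made up for illustration) could look something like this:

function savePreference(unbolded)
{
 // remember the choice for 30 days in a cookie called "unbold"
 var expires = new Date();
 expires.setDate(expires.getDate() + 30);
 document.cookie = "unbold=" + (unbolded ? "1" : "0") + "; expires=" + expires.toUTCString() + "; path=/";
}

function loadPreference()
{
 // on page load re-apply the last choice by calling the unbold() function from the example below
 if (document.cookie.indexOf("unbold=1") != -1) {
  unbold();
 }
}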

As Blogger won't let you add server side code to the blog you will need to do it all with JavaScript, but with the new Blogger layout (which I love by the way - unlike Google+) it is easy to add JavaScript (external and internal) plus CSS sections and link blocks to control the actions of your functions.

An example of the code is below and hopefully you can see how easy it is to use.

First I load in the latest version of jQuery from Google.

Then I use selectors to ensure I am only targeting the main content part of the page before I add or remove classes to STRONG or BOLD tags.

<style type="text/css">
.unbold{
 font-weight:normal;
}
</style>

<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js"></script>

<script>
// add the "unbold" class to every STRONG and B tag inside the post content
// so the CSS above removes their font-weight
function unbold()
{
 $(".entry-content").each(function(){
  $("strong",this).addClass("unbold");
  $("b",this).addClass("unbold");
 });
}

// remove the class again so the tags return to their normal bold styling
function bold()
{
 $(".entry-content").each(function(){
  $("strong",this).removeClass("unbold");
  $("b",this).removeClass("unbold");
 });
}
</script>

So not only are you benefiting from SEO tweaks but you are letting your users turn it off if they feel it's a bit too much. Hey Presto!

Tuesday, 30 October 2012

New version of the SEO Twitter Hunter Application

Introducing version 1.0.4 of the Twitter Hashtag Hunter Application

I have just released the latest version of the popular Windows application that is used by SEO experts and tweeters, in combination with my Strictly Tweetbot Wordpress plugin, to find new @accounts and #hashtags to follow and use.

Version 1.0.4 of the Twitter HashTag Hunter application has the following features:
  • A progress bar to keep you informed of the application's progress during scanning.
  • More detailed error reporting, including handling the fail whale, e.g. the 503 service unavailable error.
  • More HTTP status code errors including 400, 404, 403 and the 420 Twitter Scan Rate exceeded limit.
  • Clickable URLs that open the relevant Twitter account or Hash Tag search in your browser.
  • Multiple checks to find an account's follower numbers, to try and future-proof the application in case Twitter change their code again.
  • A new settings tab that controls your HTTP request behaviour.
  • The ability to add proxy server details, e.g. IP address and port number, to scan with.
  • The ability to change your user-agent as well as a random user-agent switcher that picks between multiple agent strings for each HTTP request when a blank user-agent is provided.
  • An HTTP time-out setting to control how long to wait for a response from the API.
  • A setting to specify a wait period in-between scans to prevent rate exceeded errors.
  • A setting to specify a wait period when a "Twitter scan rate exceeded" error does occur.
  • Extra error messages to explain the result of the scan and any problems with the requests or settings.
The main new feature of 1.0.4 is the new settings panel that controls your scanning behaviour. This allows you to scan through a proxy server, specify a user-agent, and set delay periods both in-between scans and after the "Twitter Scan Rate exceeded limit" error, which occurs if you scan too much.

Changing the Scanner Settings

For Search Engine Optimisation (SEO) experts, or just site owners wanting to find out who they should be following and which #hashtags they should be using in their tweets, this application is a cheap and useful tool that helps get your social media campaign off the ground by utilising Twitter's Search API.

You can download the application from the main website www.strictly-software.com.

Thursday, 6 October 2011

Twitter Hash Tag Scanner SEO Application

Introducing the first version of my Twitter Hash Tag Scanner Application

The Strictly HashTag Hunter is a Windows form application that allows you to find the most relevant HashTags and Twitter accounts for a variety of specified search terms and keywords.

This application is ideal for people who have just created a Twitter account and want to analyse their own site specific keywords to find #HashTags they should be following or using themselves.

For instance you might have a blog or site that uses an AutoBlogging tool like my Strictly TweetBot Wordpress Plugin and you might want to set up various AutoTweets with relevant HashTags that are related to certain keywords and content snippets.

This tool also helps you find the most important Twitter accounts that you should be following, as it analyses the people using the keywords or sentences that you enter on Twitter at that point in time, to find the most popular HashTags related to those words as well as the accounts that are using them the most.

Obviously the time of day you run your scan will affect the results, as different people Tweet at different times of the day, but you will see from the results which Twitter accounts have the most followers and are therefore worth following from your own account.

The primary aim of this tool is to help you save time trying to work out which #HashTags to use for your own Tweets as well as working out which @accounts to follow for your own Twitter account.

The Strictly Twitter Hash Tag Hunter is built as a Windows application that runs on your own desktop and hooks into Twitter's API to obtain the results. It is perfect for SEO and social media analysts as well as people with a new Twitter account who don't know which hash tags and accounts they should be following to make an impact on the social scene.



Screen 1 shows how you enter one or more search terms that you want to find information for. These terms can be anything but if you are looking to utilise this tool with my Strictly TweetBot Wordpress Plugin then you should be looking to find the #HashTags and @Accounts to follow on Twitter related to the key terms your website is based on.

For example if you were running a site about Horse Racing and wanted to find out which Twitter @Accounts to follow and which #HashTags to use in your Tweets you would enter a number of search terms like so:

Horse Racing
Kempton Park
fromthestables.com
Free Racing Tips 
Twitter HashTag Hunter Start up screen
Enter each keyword or search term on its own line.

Once you have entered each term on its own line you click the "Search" button and the Scanner gets to work analysing your keywords and finding related Twitter information.

For each search term and keyword it will scan the Twitter API for those words looking for the most popular #hashtags that are related to those keywords.

It will also find the Twitter accounts that make the most use of these terms before ordering the accounts by the number of followers each account has and the hash tags by the number of times they are referenced by those accounts.

On completing the Scan

Screen 2 shows the most popular hash tags found for the search terms that were entered.
Twitter HashTag Completion Screen
The most popular hash tags found for the entered search terms and keywords.

Screen 3 shows the most followed Twitter accounts that used the terms you searched for.
Twitter HashTag Completion Screen
The most followed Twitter accounts for the entered search terms and keywords.

Following Accounts or Hash Tags

Once the Twitter Scan has completed and you have looked at the results you can simply click on the Account or Twitter Link column value to open up the desired URL in your default browser.

Screen 4 shows you selecting the desired Account you want to examine on Twitter.

Selecting a Twitter Account
Selecting an account to examine


Screen 5 shows the http://twitter.com page opening in your browser where you can decide whether or not the account or hash tag is worth following.

Following the selected Twitter Account
Viewing the account in Twitter and following them


If you are already logged into Twitter at the time then it's just a simple matter of clicking the "Follow" button in the top right of the screen and your own Twitter account will now be following the account you opened.


About the Twitter Hash Tag Scanner Application

The application is a multi-threaded standalone executable Windows application and it has been built with users and Twitter in mind so that the Twitter API is not overloaded and abused and that you can continue to get all the information you need from their service.

A progress bar keeps you updated with the amount of scans it has carried out as well as the number of accounts and hashtags it has already found matching your search terms.

If for whatever reason Twitter blocks your requests (for example if you were hammering their API with dozens of search terms in one scan) then the application will slow down the amount of the requests it makes and increase the delays between requests. It also has some built in methods for bypassing certain blocking methods as well as the ability to access the information from other sources.

I am hoping to expand this tool over the years and I have had great feedback from both novice users who have found it very useful in deciding who to follow when they first start to use Twitter as well as SEO experts who utilise social media and Twitter all the time for marketing purposes.

As an introductory offer I am offering this application for the price of a small donation of only £10.00, and you can buy it from my application order page.

Saturday, 12 December 2009

Setting up a new site for SEO

Creating a new system for Search Engine Optimisation

When you are creating a new website the amount of time from conception to roll out can be anything up to a few months. I am talking here about custom built systems not Wordpress or blogger type sites that can be knocked up instantly.

You have the design work to be done, the back end to be developed, debugging and testing, and any number of changes to accommodate. This is usually an iterative process, and once it's complete and the site is live the owner will then concentrate on SEO by looking into viable search terms to target, trying to get back-links to the system and creating an online presence for the site.

The problem with this approach is that the SEO marketing is left until last, and as we know the major search engines can be quite slow to react to changes within a site; Google will only update the Page Rank for your site every few months. Therefore a good idea would be to move the SEO right to the front of your process so that you gain the benefits an online presence brings with time before the actual roll out of your site.

How would you go about this? How can you build up an online presence when there is nothing to present to the online world? Well, here are some ideas that you can take on board.

1. Get your domain purchased and ready as soon as possible.

2. Don't worry about features and functionality at this point as you will be doing this over the next few months as your site is being created. However, once you have access to your webserver and your domain is ready, put up a holding page.
This should be a pure HTML page that loads fast and contains lots of text describing your site and the features it is going to provide. Make sure the content explains that the system is coming soon and not ready yet. The content should not be a list of spammy keywords but it should contain various configurations of the search terms you want to target.

3. Make sure your holding page is laid out correctly with H1 and H2 tags, paragraphs containing your targeted keywords wrapped in strong or em tags, and links to other static pages. Do not use too much highlighted text but concentrate it on your key search terms.

4. If possible create static versions of the main pages that you are expecting to use on your finished system and link to them from your home page. For example if you are a catalogue site put up a static results page with a number of product items on it. Target your main sales items, the ones that you are hoping to sell the most of once the main site is ready. If you are going to gain SEO points for site age then you want those points to be related to items that will benefit you in future.

5. Make sure it's clear that your site isn't ready yet and if possible have an email link or form where potential customers can put their names down on a mailing list to be informed when the site goes live. Or, if you are a shop and have the ability to take orders offline, ask them to contact you by email or phone if they see something they like. This way you can drum up interest before your site is ready.
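As a rough illustration of points 2 to 5, a minimal holding page might be structured something like this (the wording, addresses and product are only placeholders):

<html>
<head>
 <title>Custom Widgets - Online Widget Shop Coming Soon</title>
 <meta name="description" content="Custom widgets for sale online. Our full widget catalogue is coming soon." />
</head>
<body>
 <h1>Custom Widgets - Online Widget Shop Coming Soon</h1>
 <p>We will soon be selling <strong>custom widgets online</strong> at great prices.</p>
 <h2>Join our mailing list</h2>
 <p>Email <a href="mailto:sales@example.com" title="Contact us about custom widgets">sales@example.com</a> to be told as soon as the site goes live.</p>
</body>
</html>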

6. Once you have a temporary website up you can concentrate on getting your real system ready without losing out on all the benefits of SEO that just having a site online gains. For example, search engines such as Google will not even consider putting brand new sites at the top of the rankings for certain search terms until a set period of time has passed. Known as the Sandbox, this is to prevent spam sites being created with huge numbers of backlinks and taking all the top spots.
You may still be able to find your site in the index by searching for your domain or company name, but other search terms which you are trying to target may not give you the desired result. Therefore having a temporary site up and running until your main site is ready is one way to get over this time delay, as you really don't want your new site to suffer the sandbox on release date.

7. Search engines also see the length of time a site has been around as an important factor in calculating a site's authority. Remember site authority is important: when you get backlinks from sites with a higher authority than yours it helps boost your site's authority and also increases your Page Rank. Therefore you should aim to get backlinks in from sites with high authority but not link back, as this may cancel out the benefit.

8. Remember Page Rank (PR) only gives an indication of the number of sites that link to yours. It doesn't say anything about the quality of your site in terms of content or the quality of the links that are pointing to you.
In the old days Page Rank was important as it was seen as a way of allowing Internet users to rank the sites on the web, as they would only be linking to good sites, or so the idea went. However, as with most ideas, it was soon worked out that link farms and link exchanges could boost a site's PR, which is why so many people still think that getting thousands of inbound links from anywhere is a good thing.
I have no idea whether Google still considers PR on its own a useful tool, but I have heard from many people that they don't. Because any site could get thousands of inbound links from link exchanges I am pretty certain that Google does not consider it that important in ranking, otherwise all the top spots for most search terms would be taken up by spammy sites that offer nothing but ebooks, get rich quick schemes or other MLM BS.

9. Back links are important as Google still looks for sites that have good site authority that link to your site. If you are a new site selling custom widgets then you should aim to get backlinks from other widget selling sites or sites that relate to widgets in some way.
Having lots of backlinks from sites that have nothing to do with widgets won't help. Your PR may go up to a certain level but it doesn't tell Google that you are considered a relevant site in your field. Your site authority will go up when you have lots of sites in the same field as yours linking to you.

10. Backlinks are also important as they help get your pages indexed quickly by search engines. If you create a new page but have no inbound links to it then it may not get indexed even if the page is linked to internally or listed in your sitemap. I have found the quickest way to get a page indexed is by having some good quality backlinks to it from external sites.

11. If you are going to pay for advertising on other sites then you need to decide what you hope to get out of the advert, as most advertising systems that run over multiple sites use JavaScript and iframes to load the advert content. Therefore if you are hoping to get backlinks to your site you are going to be disappointed unless the adverts are loaded server side so that search engine spiders can crawl them. If you are just trying to get brand awareness or drive traffic through clicks then JavaScript loaded adverts are fine.

12. When you do use images wrapped in anchor tags on your site, or adverts on other sites that take this format, you should make use of all the available attributes. With an image wrapped in an anchor you have 3 possible areas to add search engine friendly content: the title attribute on the anchor tag and the title and ALT attributes on the image tag. Make sure you put good search terms in all 3 places and try to vary the terms so that you are making the most of these opportunities. Do not just put the name of your site in the attributes e.g.

title = "Strictly Software"

but add search terms e.g

title = "Technical advice and free scripts from Strictly Software"

and then alternate it on the other two

title = "Strictly Software technical blog and downloads for developers"

ALT = "Technical development from Strictly Software, free downloads and great code"

13. The ALT attribute is used for when the image cannot be loaded, such as in text browsers. However, because text browsers and image-free browsers are very rare, you can probably add more text than the recommended short description of the image that the ALT attribute is meant for.

14. Use commas in your search terms and try to get multiple sentence variations out of one piece of text without resorting to a list of keywords. Search bots are quite clever and can spot spam a mile away nowadays, so you should make your search terms readable but also combine 2 or more search terms. For example:

title = "free online tools from Strictly Software the best technical blog for downloads and technical advice for free"

Now as a human reading this you might think it's a bit unreadable as it's quite long; however, the bots that crawl your site won't be able to tell that it's slightly wordy, only that it's not a list of spam keywords. Once your page has been indexed you can then search on the following variations to find the page.

Free online tools
Free online tools from Strictly Software
tools from Strictly Software
Strictly Software the best technical blog
Strictly Software the best technical blog for downloads
technical blog
technical blog for downloads
downloads and technical advice
downloads and technical advice for free
technical advice for free

Obviously this is just a quick example but you see what I mean. Make the most of your title and ALT attributes.
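Putting points 12 to 14 together, an image wrapped in an anchor might be marked up something like this (the URLs and dimensions are just placeholders):

<a href="http://www.strictly-software.com/" title="Technical advice and free scripts from Strictly Software">
 <img src="/images/logo.gif" width="200" height="60"
  title="Strictly Software technical blog and downloads for developers"
  alt="Technical development from Strictly Software, free downloads and great code" />
</a>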

15. Good content is the most important thing on a site. If you are writing a blog then short one paragraph articles are no good as:
  1. They don't look like authoritative examples of the topic you are writing about
  2. They don't offer you the option to highlight the various search terms and combinations that you are targeting without it looking like spam.
  3. They don't offer the reader much and therefore you are less likely to get natural inbound links.

16. META tags are not as important as SEO applications downloaded from the web would like you to think. Most search engines ignore the keywords META tag nowadays and the description tag is hardly used anymore on search engine results pages unless no other search term specific content can be found in the article. Page titles are useful for usability and offer another area to enter search terms specific to the page but none of these are as important as good quality content.

17. Make sure your page content is split out appropriately with H1 tags to denote the topic and then H2 tags for sub-headers and so on. Search engines see content within headers, anchor text and strong and em tags as more important than the other content, as you have specially marked it out for the user's attention, and therefore the search engine will also pay more attention to it. Do not wrap all your content in headers or strong tags as this will be seen as spam; keep it to under 10% of the overall content.

18. Put all your important links towards the top of the page and try to keep the number of links on a page under 100. I have seen pages that contain 250+ links where the most important links, e.g. to the site index, were at the bottom in the footer and did not get indexed.

19. If you have lots of CSS or SCRIPT content try to put it in external files and reference them as far down the page as is possible so that your main content is the first thing a spider comes across. Try to put all your CSS references above any SCRIPT references, as the browser will stop rendering the page to load in external script, which may cause a nasty effect. This is one thing I hate about Google's AdSense adverts, as you have to insert the SCRIPT at the place in the HTML where you want the advert to display, and depending on how slow Google is to load the content it can cause a horrible delay. I have tried hacking about with this myself but as of yet don't have a way round it, so if anyone does please let me know!
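A rough sketch of that layout (the file names are just placeholders), with the stylesheet referenced in the HEAD, the main content first in the BODY and the script reference at the bottom:

<head>
 <title>Page title containing your search terms</title>
 <link rel="stylesheet" type="text/css" href="/css/site.css" />
</head>
<body>
 <h1>Main article content comes first</h1>
 <p>Article text here...</p>
 <!-- menus and footer links positioned by CSS can sit lower down the source -->
 <script type="text/javascript" src="/js/site.js"></script>
</body>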

20. Make sure all images have width and height attributes on them. This is so that the browser does not have to wait for the image to load before it can work out the actual size of the image and display it. With a specified height and width it can render the whole page and put aside the correct space for the image before it has loaded. As page load speed is one of the things that Google now concentrates on when determining ranking, it makes sense to make your pages load as fast as possible.

So there are some tips to utilize on your temporary site as well as the completed site. If you can make the most of the time that it takes to get your site finished by having a search engine optimised temporary site up and running and gaining SEO points just by being accessible then you won't have to try so hard once your real site is ready.

There are lots more things you can try and you should download the various versions of all the Search Engine Optimiser tools that are available on the web. Most of these tools do basic content analysis by determining how much of the content is specific to various search terms as well as looking for missing or non-specific META and other tags. However good SEO is something that is learnt over time, and once you know the key aspects to look for it's just a case of trial and error. Check out your competitor sites and see how they have optimised their pages, who is linking to them and what search terms they rank highly on, then try to target variations of terms that users search on but are not already saturated by existing sites. As with most things in life it's mostly trial and error.


Sunday, 2 August 2009

Implementing SEO Strategy too late in the day

Think about Search Engine Optimisation early in the day

In my 10+ years of web development I have noticed many mistakes that customers make in terms of creating an application that meets their requirements as well as being cost effective and delivered on time. Most of the mistakes and problems can be traced back one way or another to not having a detailed specification document that is not only signed off by both parties but also kept to 100%, with any deviation treated as new work to be costed and developed.

Loose specs lead to misunderstandings from both parties, where the customer expects one thing and the developer is building another. Even with a signed off, stringently kept spec there is also the problem of customers not understanding the technical limitations or boundaries of the system they are buying into. An example I want to give is in relation to SEO, which is usually treated as an afterthought by the customer rather than as a key component during the specification stage. I work for a recruitment software development company and have built a system that is now running 200+ jobboards. I have noticed that what usually happens is that the customer is under some illusion that the 7 or 10k they have spent on a site has bought them a custom built system that is flexible enough to allow any future development requirement they may wish to have down the line. Now this may be the fault of the sales person promising the earth for peanuts or it may not, but in reality they have bought an off the shelf generic product that can be delivered quickly and comparatively cheaply exactly because it hasn't been custom built for their requirements.

One of the ways this problem manifests itself is in Search Engine Optimisation, as the customer will usually wait a couple of months after the site goes live before realising that they are not top of Google for the word "jobs" and asking why. They then discover numerous SEO specialists who offer to get them to the top of Google for certain terms, and invest lots of money in audits and optimisation reports only to find out that we cannot implement everything they want because they are using a system with a shared codebase. Yes, our system has numerous inbuilt features for SEO that can be turned on and off, but asking for and expecting specific development that has been recommended by a 3rd party after development has completed can cause unneeded stress and tension, especially when the customer is told no due to system limitations.

What the customer should do is think about and investigate the possibilities of SEO before the specification document has been signed off rather than months after the site has gone live. This way any limitations of the system can be discussed so the customer is made aware that spending money with a 3rd party who is also unaware of the system limitations is probably a waste of £££. Also, any good ideas they may have regarding SEO requirements can be planned out and possibly developed during the main development phase rather than thrown in later as an afterthought. Even if it's a generic system, good ideas are good ideas, and a benefit for one customer will be a benefit to others, as the development house can make money by reselling the feature as an add-on to existing customers.

I am not saying 3rd party SEO consultants don't do a good job, but potential customers of sites need to be aware of what they are buying and what is and is not possible before they spend money with any 3rd party. There can be nothing worse than spending money with a consultant only to find out that their recommendations cannot be implemented, or that if they are implemented it will cost even more money for the extra development. So take my advice and think about SEO before, not after, development, as not only will it save time and money but having good SEO from the off will mean your site gains better positioning in the search engines sooner rather than later.

Further Reading:

Please read this article of mine about techniques for good and bad search engine optimisation.