Wednesday 12 December 2012

HackBar Not Showing in Firefox

How to display the HackBar in Firefox

My preferred browser for everyday browsing is Google Chrome. I moved to Chrome for its speed and simplicity, and because, like most people, when I became disillusioned with IE and moved to FireFox I installed so many plugins that it became slow to load and went through periods of hanging and high CPU and memory usage.

Therefore I have decided to use FireFox for certain development and debugging tasks when I want to use plugins, such as the ColorZilla colour picker, the Web Developer Toolbar, the Modify Header plugin or any number of others I have installed. I keep Chrome plugin free so it stays fast for browsing.

I did try Chrome out with plugins when it first started supporting them, but I soon decided I didn't want to turn Chrome into another FireFox by overloading it with plugin checking at start up, which is what happened to FireFox.

So Chrome is plugin free and fast, while FireFox is useful and handy, full of plugins and tools, and good for development. Although IE 9 is fast and probably just as good nowadays, I am guessing most developers won't go back to it after it took Microsoft numerous versions just to standardise their code and CSS after years of complaining from the masses.

One of the plugins I use on FireFox a lot is the HackBar.

Not only is it useful for loading up long URLs and quickly Base64 encoding or decoding strings, as well as URL encoding and decoding, but it has numerous other features if you want to test your site for XSS or SQL injection vulnerabilities.

However I mostly find I use it for quickly encoding and decoding strings, and sometimes I find myself putting really long ones in the box and extending it so much that the viewport becomes unusable. I then find myself disabling the add-on.

However, on more than a few occasions now, when I come to re-enable the HackBar by right clicking in the menu bar and ticking the option for HackBar, or by using the "View > HackBar" menu option, it doesn't display as expected and hunting around for it is of no use at all.

You can try disabling it in the Add-ons section or even un-installing and re-installing it but even then it might not appear.

However I have found that a quick hit of the F9 key will make it show. Simples!

So if you are ever having issues trying to make toolbars or plugins show that you have de-activated try the F9 key first before anything else.

Saturday 24 November 2012

Strictly AutoTags 2.0.8 Released

Strictly Auto Tags 2.0.8 Released

I have just released a new version of Strictly AutoTags (or Strictly Auto Tags) which includes extra SEO options that let you not only wrap certain tags in <strong> tags but also link them to their related tag page.

This feature is called Deep Linking and the aim is that you can let BOTS find extra content related to your article quickly from the post.

You can control the number of words per tag that get deep-linked as well as limiting the deep-linking to tags with a minimum number of posts, e.g. only deep-link tags with 10 or more posts associated with them.

The benefit of deep-linking when you save a post, rather than on output as other SEO plugins do, is that you speed up the display of the article.

Remember you only save a post once or a few times, whilst it can be displayed thousands of times (with or without a caching plugin).

There are also options to clean out the SEO changes, e.g. remove the deep-linking and bolding when you save a post, so that every edit gets a fresh AutoTag scan.

If you want to check out a website that has been using the plugin for some time, then take a look at the tags that are displayed for each article.

If you like the plugin and make a donation then let me know and I will add a backlink from this site to yours so that you get some free advertising and some PR / Link Juice from Google.

Check out the main Wordpress plugin repository to download the latest version or visit the plugin page on my main site.

Strictly TweetBOT Version 1.1.2 Released

Version 1.1.2 of the Wordpress Plugin - Strictly TweetBot released

New features and bug fixes include:

  • Fixed bug with some words in the TweetShrink function.
  • Added new short versions of words for the TweetShrink function.
  • Improved the performance of the loop that controls the tweets by getting the permalink for the new post once rather than on every loop iteration as before.
  • Added an option to make an HTTP request to the new post before tweeting, so that if caching plugins are installed and their caching options are enabled correctly the page gets cached before any Twitter rush of BOTS visiting the post. I have seen at least 50+ concurrent requests (a number growing all the time) hit any link in a new Tweet as soon as it is posted.
  • Added an option to use a different user-agent when making this HTTP request so that you can easily identify the request in your access log files for debugging etc.
  • Added a new check to the "Config Test" button which tests the HTTP cache request by requesting the last article posted and returning its status code.
Caching new posts before any Twitter Rush

The main feature of this release is the ability to fire off an HTTP request to the post being Tweeted about before any Tweets are sent.

The idea is that if you are using any of the many caching systems out there (properly configured, obviously - and every caching system is different) the first page request will come from the TweetBot and hopefully cause that page to be cached by the system you are using, e.g. WP Super Cache, W3 Total Cache, Apache caching modules etc.

The aim is that by caching the post before any Tweets are sent, the Twitter Rush that occurs when your link hits Twitter does not overload the page with 50+ BOTS all hitting it concurrently and competing to be the first visit that causes it to be cached for future visitors.

Obviously if you are logged in as admin when writing your post then this might not cause a cache in some systems. For example WP Super Cache has an option to prevent logged in users from seeing cached pages. Therefore if you are logged in when you write a post this automated HTTP request might not cache the page if you have set the option on NOT to serve cached pages for logged in users.

However if you are auto-blogging and importing posts from feeds using WP Robot, WP-O-Matic or any other system then you will not be logged in and this HTTP request should create a cached page for BOTS to hit.

As I said, it all depends on the caching system you are using and how you configure it.

Remember every caching system is different and it all depends on the options you configure in the admin area of your particular caching system. So make sure it is configured correctly to allow the first visit to any post to be cached.

So version 1.1.2 of Strictly-Tweetbot is now live and you can download it from my main site or from the Wordpress plugin repository whenever you want.

Tuesday 30 October 2012

New version of the SEO Twitter Hunter Application

Introducing  version 1.0.4 of the Twitter Hashtag Hunter Application

I have just released the latest version of the popular Windows application used by SEO experts and tweeters, in combination with my Strictly Tweetbot Wordpress plugin, to find new @accounts and #hashtags to follow and use.

Version 1.0.4 of the Twitter HashTag Hunter application has the following features:
  • A progress bar to keep you informed of the application's progress during scanning.
  • More detailed error reporting, including handling the fail whale, e.g. the 503 service unavailable error.
  • More HTTP status code errors including 400, 404, 403 and the 420 Twitter Scan Rate exceeded limit.
  • Clickable URLs that open the relevant Twitter account or hash tag search in your browser.
  • Multiple checks to find an account's follower numbers, to future proof the application in case Twitter change their code again.
  • A new settings tab that controls your HTTP request behaviour.
  • The ability to add proxy server details, e.g. IP address and port number, to scan with.
  • The ability to change your user-agent, plus a random user-agent switcher that picks between multiple agent strings for each HTTP request when a blank user-agent is provided.
  • An HTTP time-out setting to control how long to wait for a response from the API.
  • A setting to specify a wait period in-between scans to prevent rate exceeded errors.
  • A setting to specify a wait period when a "Twitter scan rate exceeded" error does occur.
  • Extra error messages to explain the result of the scan and any problems with the requests or settings.
The main new feature of 1.0.4 is the settings panel that controls your scanning behaviour. It allows you to scan through a proxy server, specify a user-agent, set delay periods in-between scans, and handle the "Twitter Scan Rate exceeded limit" error which occurs if you scan too much.
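The random user-agent switcher mentioned in the list above works on a simple rule: use the configured agent if one is supplied, otherwise pick a random one per request. A sketch (the application is a Windows program, so this JavaScript and its names are purely illustrative):

```javascript
// If the user supplied an agent string, use it; when the setting is blank,
// rotate randomly through a pool of agent strings on each HTTP request.
function pickUserAgent(configuredAgent, pool) {
  if (configuredAgent && configuredAgent.trim() !== "") {
    return configuredAgent;
  }
  return pool[Math.floor(Math.random() * pool.length)];
}
```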

Changing the Scanner Settings

For Search Engine Optimisation (SEO) experts, or just site owners wanting to find out who they should be following and which #hashtags they should be using in their tweets, this application is a cheap and useful tool that helps get your social media campaign off the ground by utilising Twitter's Search API.

You can download the application from the main website.

Monday 22 October 2012

Fixing Postie the Wordpress Plugin for XSS Attacks that don't exist

Fixing the "Possible XSS attack - ignoring email" error message in Postie

As you may know if you read my earlier post on fixing the Wordpress plugin Postie, when it wouldn't let me pass multiple categories in their various formats in the subject line, a new version of Postie has come out since.

However I have regularly been noticing that emails that should be appearing on my Wordpress site when posted by email through Postie haven't been.

Today I looked into why, and when I ran the manual process to load emails by pressing the "Run Postie" button I was met with an error message that said:

possible XSS attack - ignoring email

I looked into the code and searched for the error message, which is on line 38 of the file postie_getmail.php. It gets displayed when a regular expression that is supposed to detect XSS attacks matches the email.

The code is below

if (preg_match("/.*(script|onload|meta|base64).*/is", $email)) {
 echo "possible XSS attack - ignoring email\n";
}

I tested this was the problem by running the script manually in the config area of Postie and outputting the full email before the regular expression test.

As the email is base64 encoded (well, mine is anyway) the full headers are shown at the top of the encoded email, e.g.:

Received: from smtp-relay-2.myrelay (smtp-relay-2.myrelay [])
by (Postfix) with ESMTP id 8497724009C
for ; Mon, 22 Oct 2012 05:49:32 +0000 (UTC)
Received: from xxxxxxx (unknown [])
by smtp-relay-2.myrelay (Postfix) with ESMTP id 9E3B495733
for ; Mon, 22 Oct 2012 06:45:30 +0100 (BST)
MIME-Version: 1.0
Date: 22 Oct 2012 06:46:28 +0100
Subject: Subject: [My Subject Category1] [My Subject Category2] Title of Email
 October 2012
Content-Type: text/html; charset=utf-8
Content-Transfer-Encoding: base64
Message-Id: <20121022054530.9E3B495733@smtp-relay-2.myrelay>


I've just shown part of the message, which is base64 encoded.

As you can see, the regular expression searches for each of those strings as a bare word rather than in a scripting context, e.g. base64 rather than base64("PHA+PHN0cm9");

The word base64 legitimately appears in the headers of the email, e.g.:

Content-Transfer-Encoding: base64

Therefore the regular expression matches the email and Postie displays the "possible XSS attack - ignoring email" error message.

Therefore just doing a basic string search for these words:

script, onload, meta and base64

will mean that you could find yourself having emails deleted and content not appearing on your Wordpress site when you expect it to, all due to this regular expression throwing up false positives for XSS attacks when none really exist.

Also these words could legitimately appear in your HTML content for any number of reasons, and not just because they are used in the email headers, so a better regular expression is required to check for XSS attacks.

How to fix this problem

You could either remove the word base64 from the regular expression or you could delete the whole test for the XSS attack.

However I opted to keep a test for XSS attacks but make it check more thoroughly for proper usage of the functions rather than simple occurrences of the words.

The regular expression is more complicated but it covers more XSS attack vectors and I have tested it myself on my own site and it works fine.

You can replace the code that is causing the problem with the code below.

if (preg_match("/<script|%3Cscript|<\/script|%3C\/script|base64\(|<meta|onload\s?=|eval\(|document\.|\.createElement|\.cookie/is", $email)) {
      echo "possible XSS attack - ignoring email\n";
}

Not only does this mean that it won't fall over when the words are mentioned in headers, it actually looks for the correct usage of the hack and not just the word appearing. E.g. instead of looking for "script" it will look for:

<script, %3Cscript, </script and %3C/script

This covers not only the basic <script but also URL-encoded brackets, which are another common XSS attack vector. You could include other forms of encoding such as UTF-8, but it all depends on how complicated you want to make the test.

As you may know if you have read my blog for a long time, hackers are always changing their methods, and I have come across clever two stage SQL injection attacks which involve embedding an encoded hack string first and then a second attack whose role is to unencode the first injection and roll out the hack.

The same can be done in XSS attacks, e.g encoding your hack so it's not visible and then decoding it before using an eval statement to execute it. However I am keeping things simple for now.

I have also added some more well known XSS attack vectors such as:

eval( , document. , .createElement and .cookie 

As you can see I have prefixed or suffixed them all with a bracket or dot, which is how they would be used in JavaScript or PHP.

Notice however that I haven't prefixed createElement and cookie with the word document. This is because it is all too easy to do something like this:

var q=document,c=q.cookie;alert(c)

Which stores the document object in a variable called "q" and then uses that to access the cookie information. If you have a console just run that piece of JavaScript and you will see all your cookie information.

This regular expression still also tests for:

<script, base64(, <meta, onload= and onload =

but as you can see I have prefixed the words with angled brackets or suffixed them with rounded brackets, dots or equal signs (with or without a space).
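The same idea can be exercised in JavaScript as well (a sketch of the pattern, not Postie's actual PHP line; the function name is mine):

```javascript
// Tightened XSS check: match the attack *forms* of each word
// (tags, function calls, property access) rather than the bare word.
const xssPattern = /<script|%3Cscript|base64\(|<meta|onload\s?=|eval\(|document\.|\.createElement|\.cookie/i;

function looksLikeXss(email) {
  return xssPattern.test(email);
}
```

Note that a plain base64 email header no longer triggers it, while real attack strings still do.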

This has solved the problem for me and kept the XSS attack defence in place. However, if you are passing HTML emails containing JavaScript to your site, just beware that if you use any of these functions they might be flagged up.

I have tested each XSS attack vector but let me know of any problems with the regular expression.

Also I have removed the .* before and after the original test as it's not required. It just uses up memory, as it looks for any character that may or may not be there, and the longer the string being searched the more memory it eats up.

I have updated my own version of this file and everything has gone onto the site fine since I have done so.

If anyone else is having problems with disappearing posts driven by Postie then this might be the cause.

You can see my Wordpress response here: Wordpress - Postie Deletes Email but doesn't post

Monday 15 October 2012

Twitter changes their format for status feeds

Tweets not showing up? Twitter changes their feed format for status feeds

You may have noticed over the last couple of weeks that the usual Tweets from my Twitter account @strictlytweets have not been showing in the right hand side bar.

This is because Twitter have changed the URL that you must request to get those tweets from to their new API, which uses a secure URL (i.e. https instead of http).

Now, if you have a callback function, it is called in the same way, with the name of the function to be called passed as a parameter in the URL.

I am not sure whether Twitter have changed the name of their standard function which formats Tweets into links from twitterCallback to twitterCallback2, but if you take a look at the one Blogger uses you can see they are calling it twitterCallback2.

You can see their script here:

Therefore if you have a special function that you need to call to format your tweets in your own style, cache them on your own server or scroll them, you will still need to add the name of your callback function to the script source and make sure it accepts a JSON array of Tweet objects.

You will also need to reference this script before calling the new script URL that gets the statuses.

If you look at this blog you can see that the first script loads in Blogger's own formatting of the Tweets through a call to blogger.js, and the second one gets the statuses from the new Twitter feed URL as JSON.

<script type="text/javascript" src=""></script>
<script type="text/javascript" src=""></script>

Twitter are constantly changing and, like most sites, are moving to secure URLs (https) and using JSON instead of XML/RSS to return data to the user.

So if you have a Twitter account and want to show your own Tweets on your site and it hasn't been working lately this is the reason why.

Just switch over to the new https format and you will be okay, as long as you also implement some form of caching to handle the rate limit, which is only 150 requests per hour per IP address, NOT per domain. If you authenticate each request with OAuth you can make 350 requests per hour.

As my IP address is also shared by a lot of other sites that use this feature, the Tweets may still not appear until I implement some form of caching (which I haven't got round to doing), but the logic would be something like this:
  • Make the request to the new API URL.
  • Format the HTML using your callback function.
  • Make an AJAX request to a server side script (PHP/ASP/.NET etc) that stores the formatted HTML as a text/html file.
  • If you get back a 400 status code (Bad Request), which is the code Twitter send when you exceed their rate limit, display this cached HTML file instead.

You can read more about rate limits here.

Monday 24 September 2012

Using String Builders to speed up string concatenation

Using String Builders to speed up string concatenation

If you are using a modern, fully featured language then a string builder object like the one in C# is the standard tool for building up strings without the overhead of concatenation, which can be a performance killer.

The reason is simple.

When you do this

a = "hello I am Rob";

a = a + " and I would like to say thank you";

a = a + " and good night";

A lot of languages have to make a copy of the string built so far and store it in memory before creating the new string.

This means that the longer the string gets the more memory is used up as two copies have to be held at the same time before being joined together.

I have actually seen ASP classic sites crash with out of memory errors caused by people using string concatenation to build up large RSS feeds.

The reason I am mentioning this is because of a comment I received about my popular HTML Encoder object, which handles double encoding plus numerical and entity encoding and decoding of partially and fully encoded strings.

I have updated the numEncode function after the comment from Alex Oss to use a simple string builder which in JavaScript is very simple.

You just have an empty array, push the new strings into it (at the end of the array) and then join it together at the end to get the full string out. You can see the new function below.

// Numerically encodes all unicode characters
numEncode : function(s){
 if(this.isEmpty(s)) return "";

 var a = [],
  l = s.length;
 for (var i=0; i<l; i++){
  var c = s.charAt(i);
  if (c < " " || c > "~"){
   a.push("&#" + c.charCodeAt() + ";"); // numeric value of code point
  } else {
   a.push(c);
  }
 }
 return a.join("");
}

You can download the latest version of my HTML Encoder Script for JavaScript here.

However in old languages like ASP classic you are stuck with either string concatenation or making your own string builder class.

I have made one which can be downloaded from my main website ASP String Builder Class.

You will notice that it ReDim's the array in chunks of 128 (which can be changed) and once 128 elements have been used it then ReDim's by another large chunk.

A counter is kept so we know how many elements we have actually added, and once we want to return the whole string we can either just RTRIM it (if we are joining with a blank space) or ReDim it back down to the right array size before joining it together.

This is just an example of how a string builder class is used, and you could make a similar one in JavaScript that lets you access specific elements, move to the previous or next slot, update specific slots and set the delimiter like this ASP version does.

Most modern languages have a string builder class, but if you are using older languages or scripting languages like PHP or ASP classic, then adding strings to an array before joining them together is the way to go for performance's sake.

Friday 31 August 2012

New Version of the Twitter Hash Tag Hunter Application

Twitter HashTag Hunter Version 1.0.3 Released

Version 1.0.3 of the popular Twitter HashTag Hunter application has just been released.

This version has had an update to keep on top of the ever changing Twitter API and fixes a bug that was returning a follower count of 0 for some people.

You can find more out about the application over on my main site: Twitter HashTag Hunter Application.

Any existing users who want a new copy please email me with your reference/invoice number or details that prove that you bought the application previously and I will email you a new version.

Wednesday 29 August 2012

Shrinking an MS SQL Database MDF file after a TRUNCATE or big DELETE

Shrinking an MS SQL Database MDF file after a TRUNCATE or big DELETE

I am not a DBA and we don't have a dedicated one at our company. Therefore when I found a database file with a huge table containing 80 million rows of data taking up 35 GB, I needed to remove it.

I re-planned my usage of the table and decided that it would only keep a day's worth of data instead of every day going back 3 years, and re-jigged some code about.

I then set up a nightly job with a TRUNCATE TABLE statement to remove the data quickly at midnight each day.

However, doing this alone does not reduce the size of the database file (MDF), and you will have reserved and unused space that you may want to reclaim.

The size of the database file before running any clean ups was 35 GB, and as I was using simple recovery mode the size of the transaction log file was negligible. Therefore, after much research, I went through the following routine to reduce the size of the database file.

A lot of people will warn you not to shrink a database file as it will only grow again and cause disk fragmentation, which is correct. However, if you have just removed a large table like I had, then this is one case where a DBCC SHRINKFILE command is useful.

Therefore this is the approach I followed this morning.

I first ran this SQL to find out the current size of tables and indexes plus any reserved and unused space within each database table. It also lists the tables in order of reserved size so you can see which has the most space to be reclaimed and whether or not it matches the table you thought was causing problems - in my case it did.

CREATE TABLE #t (name SYSNAME, rows CHAR(11), reserved VARCHAR(18), 
data VARCHAR(18), index_size VARCHAR(18), unused VARCHAR(18))

DECLARE @Sizes TABLE(name SYSNAME, rows int, reserved int, Data int, index_size int, unused int)

-- ensure correct values are being returned by using @updateusage
EXEC sp_msforeachtable 'INSERT INTO #t EXEC sp_spaceused ''?'', @updateusage = N''TRUE'';'

-- strip the trailing ' KB' from each value so we can work with integers
INSERT INTO @Sizes (name, rows, Data, reserved, index_size, unused)
SELECT Name, Rows, CAST(SUBSTRING(Data, 1, LEN(Data)-3) as INT), CAST(SUBSTRING(Reserved, 1, LEN(Reserved)-3) as INT),
  CAST(SUBSTRING(Index_Size, 1, LEN(Index_Size)-3) as INT), CAST(SUBSTRING(Unused, 1, LEN(Unused)-3) as INT)
FROM #t

-- list tables in order of reserved size
SELECT name, rows, Data, reserved, index_size, unused
FROM @Sizes
ORDER BY reserved DESC

-- totals for the whole database
SELECT CAST(SUM(Data) as varchar)+' KB' as 'Data Size',
  CAST(SUM(Reserved) as varchar)+' KB' as 'Reserved Size',
  CAST(SUM(Index_Size) as varchar)+' KB' as 'Index Size',
  CAST(SUM(Unused) as varchar)+' KB' as 'Unused Size'
FROM @Sizes

DROP TABLE #t

Then I ran this script, which I found online, to see if there was space that could be reclaimed.

SELECT name ,size/128.0 - CAST(FILEPROPERTY(name, 'SpaceUsed') AS int)/128.0 AS AvailableSpaceInMB
FROM sys.database_files;

This showed me that there was indeed a good 30 GB or so that could be removed.

Once I had worked out the new size in megabytes that I wanted to reduce the MDF file of my database to, I ceased all activity on the database to help speed up the process and prevent any errors.

I then ran the following command: DBCC SHRINKFILE (details here)

To find the logical file name that you need to shrink either an MDF (database) or LDF (log file) run this command.

sp_helpdb [database name]

Once everything was ready I ran the following command to shrink the database file to 6 GB (you have to pass the value in as MB), which left some room for the database to grow into.

DBCC SHRINKFILE([logical file name], 6144);

After running the command I did not see an immediate difference when I checked the properties of the database in MS Management Studio.

Therefore I ran a job to re-index my tables and rebuild statistics.

If you don't already have a nightly/weekly MS Agent job set up to do this for you then the following script of mine might be useful for you: ReIndex or ReOrganize your MS SQL indexes.

This script will allow you to set a percentage, e.g. 30%, and if an index is fragmented over that percentage then a full REBUILD of the index is carried out; otherwise a REORGANIZE is performed.

After the rebuilding is done a call to EXEC sp_updatestats is made to update the systems statistics.

If you are going to use this SQL code check that the parameters within the stored procedure are to your liking and turn the DEBUG flag on if you require it.

The script will show you the fragmentation level before and after the script has run so that you can see how much difference it has made.

I found that after doing this when I went to database > properties > files I could now see the new sizes.

Logical Name | File Type | Filegroup | Initial Size (MB) | Autogrowth | Path\File Name
MyDBRows | Data | PRIMARY | 6144 | By 1 MB, unrestricted growth | C:\Program Files (x86)\Microsoft SQL Server\MSSQL.1\MSSQL\DATA\MyDB.mdf
MyDB_log | Log | Not Applicable | 7377 | By 10 Percent, restricted growth | C:\Program Files (x86)\Microsoft SQL Server\MSSQL.1\MSSQL\Data_logs\MyDB_log.ldf

This clearly shows that my database MDF file has been reduced from 35 GB to 6 GB.

If anyone else has any comments or notes on best practice I would be glad to hear them.

Sunday 19 August 2012

PayPal and Instant Payment Notification (IPN) Messages

Blocking bad bots and issues with PayPal's IPN service

If you are someone like me who likes to ban as much bad BOT traffic as possible then you might have added a rule in your .htaccess or httpd.ini file that blocks blank user-agents. Something like this.

# Block blank or very short user-agents. If they cannot be bothered to tell me who they are or provide gibberish then they are not welcome!                                        
RewriteCond %{HTTP_USER_AGENT} ^(?:-?|[a-z1-9\-\_]{1,10})$ [NC]
RewriteRule .* - [F,L]

This is because many bad bots won't supply a user-agent for the following reasons:
  1. They think websites check for user-agent strings, and by not supplying one they will slip under the radar.
  2. The people crawling are using simple methods, e.g. they have just grabbed bits of code off the web that don't actually set a user-agent and used them as is.
  3. They are using simple HTTP functions like file_get_contents(''); to get the HTTP response, without passing in any context data such as user-agent, timeouts, headers etc.
  4. They are making requests using CURL or WGet from places like WebMin to test Cron jobs or run crawl scripts, without supplying the user-agent parameters which are available.
Obviously it's always best to supply a user-agent as it identifies (or masks) your real code in any log file and prevents you getting blocked by people who ban blank user-agent strings.
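You can check what that rewrite condition actually matches with an equivalent JavaScript regex (a sketch; mod_rewrite uses PCRE, but this pattern behaves the same way, with the i flag standing in for [NC]):

```javascript
// Equivalent of the .htaccess condition above: blocks blank agents,
// a lone "-", and short gibberish of 1-10 letters/digits/dashes/underscores.
const blockedAgent = /^(?:-?|[a-z1-9\-_]{1,10})$/i;

function isBlocked(userAgent) {
  return blockedAgent.test(userAgent);
}
```

Note that a blank string matches the first alternative, which is exactly why agent-less requests such as PayPal's IPN posts get a 403 under this rule.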

However there come times when even big companies like PayPal send information without user-agent details.

One instance of this is the IPN notification messages they send (if set up) on sites that are using PayPal as their payment gateway.

When a customer makes a payment or cancels a subscription, PayPal sends a POST request to a special page on the website, which then checks the validity of the request to make sure it comes from PayPal. If it does, it sends back the data along with a special key so that PayPal can confirm they made the request and the website owner can be sure it's not a spoofer trying to get free items.

Once this handshake is done the relevant IPN data such as payment information can be obtained from the PayPal IPN Response.

However today I received an email in my inbox that said.

Dear Rob Reid,

Please check your server that handles PayPal Instant Payment Notifications (IPN). Instant Payment Notifications sent to the following URL(s) are failing:

If you do not recognize this URL, you may be using a service provider that is using IPN on your behalf. Please contact your service provider with the above information. If this problem continues, IPNs may be disabled for your account. 

Thank you for your prompt attention to this issue.

Yours sincerely, 


After checking my PayPal account and the IPN history page I could see that a message was stuck in the queue and being re-sent every so often:

18/08/2012 09:01 BST
Transaction made

Therefore I checked my website logs for any instance of this request and found that they were all returning 403 Forbidden status codes.

 - - [18/Aug/2012:08:54:47 +0000] "POST /?myIPN_paypal_handler HTTP/1.0" 403 417 "-" "-" 0/55005
 - - [18/Aug/2012:09:27:11 +0000] "POST /?myIPN_paypal_handler HTTP/1.0" 403 417 "-" "-" 0/55010

As you can see - no user-agent!

Therefore I changed my .htaccess file to allow blank user-agents and re-sent the IPN message from the PayPal console, and lo and behold it was allowed through.

Obviously if you are not using PayPal and their IPN system, or any other payment service, then you might want to keep blocking blank agents due to the amount of bandwidth you will save (a lot, going by my own stats).

Otherwise, check your payment system's own requests to your site to ensure you are not blocking it from doing anything important, like sending payment information to your website.

Just be warned in case you get a similar email from PayPal.

Saturday 28 July 2012

CTRL + ALT + DEL on Remote Desktop

Accessing the Task Manager through Remote Desktop - CTRL + ALT + DEL

Today I tried to access my work PC from home through Remote Desktop over a VPN.

However when I logged in the page was totally black.

No menus, no taskbar, nothing to press nothing to see.

Was the PC out of memory and unable to show me anything, or just bust? I needed the task manager on that PC to see what was going on.

To send CTRL + ALT + DEL over a remote desktop connection to get to the task manager, see running processes and open or close programs, you need to be able to send that command over the connection - and if you don't know the keystrokes it's not obvious.

There are two ways to access this.
1. If you can see the taskbar at the bottom of your remote screen just right click on it and select Task Manager to open it up.
2. If you cannot see anything use the following combination of keys: CTRL + ALT + END. This will bring up the same screen as if you had hit CTRL + ALT + DEL on the computer.

You can choose to shut down, reboot or show the task manager.

Why it's those keystrokes I don't know as it's not intuitive but it works and it's good to remember and write down if you forget.

Re-Starting Windows Explorer Remotely

If Windows Explorer has crashed or is killing your remote computer's memory or CPU you can kill the process in the Task Manager and restart it without rebooting.

Just follow this.

1. Kill the runaway explorer.exe process.
  2. Wait a few seconds.
  3. In task manager go to the "Applications" tab.
  4. In the bottom right corner select the "New Task" button.
  5. In the dialogue that opens up type "explorer" in the "open" text box. This will open a new instance of Windows Explorer.
Remember that killing processes is not advisable, as it means the normal shut down routines of that application have not run, which could leave files on your disk, keys in your registry and so on - so only do it if you really need to.

A clean up of your registry with a tool like CCleaner afterwards and regular de-fragmentation of your disk is advisable if you regularly use this method as it will keep your PC clean and fast.

Friday 27 July 2012

SQL Server - Find time of last database backup

Find the last time your Database was backed up

Quick one - wanting to find out the last time your DB was backed up?

This SQL Statement will show all database backup times on a server.

SELECT sdb.Name AS DatabaseName, 
       MAX(bus.backup_finish_date) AS LastBackUpTime
FROM sys.sysdatabases sdb
LEFT JOIN msdb.dbo.backupset bus 
    ON bus.database_name = sdb.Name
GROUP BY sdb.Name
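
A variant of the same query (my own addition, not from the original post) sorts the results so that stale backups - and databases that have never been backed up, which show a NULL LastBackUpTime thanks to the LEFT JOIN - appear at the top:

```sql
SELECT sdb.Name AS DatabaseName,
       MAX(bus.backup_finish_date) AS LastBackUpTime
FROM   sys.sysdatabases sdb
LEFT JOIN msdb.dbo.backupset bus
    ON bus.database_name = sdb.Name
GROUP BY sdb.Name
ORDER BY LastBackUpTime -- NULLs (never backed up) sort first in SQL Server
```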

Also in SQL 2008 there are a load of good built-in reports which you can access by:

  • Selecting the database
  • Selecting Reports 
  • Selecting Standard Reports 
  • Selecting the report you want, such as "Disk Usage" 
This shows you the disk usage and log file usage for your database in a nice pie chart as well as numerical terms.

If you didn't know about it, it's a good little feature.

Tuesday 24 July 2012

Use Blogger to create blogs quickly and automate the content easily

Using Google's Blogger over Wordpress.com

Whereas Blogger used to be pretty basic, since the new format of Blogger arrived on the free blog scene it has been much easier and quicker to set up a blog than with other free CMS systems, and I would recommend the new system over the free hosted Wordpress.com site any day.

Obviously having your own server and your own code running on it is the best way to go, but if you are trying to get a blog up quickly or want to automate it by sending posts by email then I would say Blogger is better than Wordpress.com because of its new simplistic design and ease of use, and for a number of other reasons.

Whereas in Wordpress you have to use a plugin (which you cannot use on their free hosted version) to post articles by email to the site, e.g Postie, in Blogger it is built in.

As Blogger only uses tags and not categories it is a shame they don't either have an Automatic Tagging widget like my own or allow you to post tags in the subject line of the email you send.

This is possible in the Wordpress plugin Postie (available for the non hosted version)  which needs a fix to get it working properly: Fixing Postie to use Categories

But as we are talking about the free blogging versions, they both have downsides on the automated posting front.

I have a number of horse racing sites that are on my own server and also use blogger e.g:

However before I created UK Horse Racing Star I tried to create a Wordpress free blog on their setup page to use their free domain e.g something like:

The registration process went through okay including double email confirmation but when I tried to login it either allowed me to login but not post any articles or change settings OR it just wouldn't let me login in the first place.

During sign up it didn't say "This user name or URL is unacceptable" or that "this goes against our Terms of Service" (for whatever reason), and it really should have done if certain words are unacceptable to their Terms. If they know a word is banned from their URL's then why not say so at registration time?

On one day I tried creating 4 sites all with different names and they were all blocked with no reason given either by email (when I tried complaining) or by their system. Not good service at all.

If you go to Wordpress.com now and try to create a blog with the name ukhorseracingstar it will say "Sorry that site already exists" - and it does, created by me along with 3 other variations, although I cannot use any of them!

Therefore I tried the new blogger system.

I know this blog is using the old layout but I haven't got round to changing the styling -  I'm a developer not a designer or CSS guru!

However with the new blogger template and layout tools it was so easy to create a new site I had UK Horse Racing Star up in minutes with widgets down the side for links, RSS Feeds of racing results, banners, text and scripts.

It really was a very simple process to re-size uploaded images, change the background colour, add custom CSS and script.

Therefore I don't know if it's against Wordpress rules to have sites related to horse racing but if you have your own copy of the software you can do what you want with it (GPL and all that) so I have a Wordpress site hosted on my own server which is almost ready to rumble.

Also as Wordpress.com won't even let you show Google AdSense on their free blog system, or it seems let you create sites with certain words in the name like "horse" or "racing" (and whatever else is in their ban list), I would recommend Blogger for free blog hosting because you can show adverts (on non adult-themed sites - e.g not gambling or porn sites - I know from my own warning by Google over this, so be warned).

Their new system is so simple and easy to use you can get a good looking site up in minutes and it took me less than 20 minutes to create both of these blogs: 

Whereas Wordpress.com's free blogging system really is a very cut down version of their downloadable software - you cannot add your own images, CSS, scripts, widgets or plugins etc - on Blogger you can do all of this and more. Hopefully they will extend their widget selection in the future to add more features and drag n drop functionality to the system.

So there is my two pence on the Blogger versus Wordpress debate.

Wednesday 4 July 2012

Quickly Randomise a number in SQL Server

How to get a true random number in SQL Server

We all know that the RAND() function in SQL Server doesn't generate a true random number and people have come up with many ways to randomise the "seed" that is passed into the RAND() function to make it more "random".

However a quick way in SQL 2005 + to get a random number or randomise a recordset is to use the NEWID() function which generates a GUID.

To randomise a recordset, e.g to show a random sub-select of 5 out of 100 records, you would just do this:

SELECT TOP 5 *
FROM   MyTable
ORDER  BY NEWID()

You will see that on each execution a different recordset is returned.

How To Generate a single Random Number between 0 and 9 with NEWID()

If you want a single number, e.g to do a simple test like:

IF @No < 5 
  Do X 
ELSE
  Do Y

Then you could use the NEWID() function in conjunction with some casting and the varbinary function.

For example this code below will return a random number between 0 and 9.

DECLARE @RandomNumber INT

-- roll a dice to get a number between 0 and 9
SELECT @RandomNumber = RIGHT(CAST(ABS(CAST(CAST(NEWID() AS VARBINARY) AS INT)) AS VARCHAR(20)), 1)

SELECT @RandomNumber
I noticed that if you used LEFT([code],1) instead of RIGHT([code],1) you would get a large sample of 1's in your result set, which would skew the results, whereas the RIGHT function gives a more random sample.

Obviously you could change the second parameter of the RIGHT function to get more digits, e.g use RIGHT([CODE],2) to get a random number between 0 and 99.

There you go - a simple way to randomise a recordset or get a single random number.
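
As an aside, a shorter idiom (not from the original post, but common T-SQL) gets the same sort of result by taking a checksum of the GUID and using the modulo operator:

```sql
-- random integer between 0 and 9
SELECT ABS(CHECKSUM(NEWID())) % 10
-- random integer between 0 and 99
SELECT ABS(CHECKSUM(NEWID())) % 100
```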

Sunday 17 June 2012

Flash Crash Problems with Firefox 13.0.1 and Adobe Flash 11.3.300.257

Flash Crashing constantly in Firefox 13.0.1

All day I have been experiencing problems with Firefox and Flash.

Flash is notoriously buggy and crashes a lot, which is why HTML5 offers so much hope for those of us wanting to watch movies online, on our phones and devices, YouTube-style, with the VIDEO tag and a wrapper iFrame.

I don't have a fix but I just wanted to put it out there that Flash version 11.3.300.257 is causing constant problems when used with Firefox 13.0.1.

A movie will start playing and then hang or the screen will go black before the "flash crash" message appears.

I am trying to watch some comedy central with another header plugin and up until today I have had no problems at all.

I don't know if this a problem with Flash or Firefox or a combination of both but it is really starting to piss me off. I have re-installed both Firefox and Flash multiple times.

I don't know why Adobe cannot get their act together with Flash, as browser makers are working their asses off to accommodate it despite its notorious bugs, high CPU usage and memory leaks on devices, when before they were just going to ditch Flash altogether and wait for HTML5 to spread.

If anyone else is having the same problem with Flash crashing let me know.

Saturday 16 June 2012

Forgotten your SSH Password?

How to quickly recover your SSH password

If you are like me your memory sometimes fails you, and today I tried to log in to my server over SSH with Putty and couldn't remember my password. After about five attempts with all the ones I could remember I decided to give up in case I blocked myself.

Luckily I had added my IP to AllowHosts so I wouldn't be blocked by the DenyHosts settings, which contain a long, long list of IP addresses that have tried hacking my server, but it was obviously doing my head in.

I then thought of a quick way of recovering it.

I could access WebMin easily in Chrome as my password was stored and auto-filled, but in Firefox - which has the Web Developer toolbar - it wasn't. And as copying passwords out of the form doesn't work, I couldn't use the Web Developer toolbar to show the password with a single button click.

Therefore in Chrome with the auto-filled password for my login page I did the following.

  • Hit F12 - this opens the developers console
  • Choose the Console tab
  • In the console write the necessary code to change the type of the input from "password" to "text"
  • If the password input doesn't have an ID, as in VirtualMin, then you can just access the first password field in the form e.g:

document.forms[0].pass.type = 'text';

And as if by magic your password field will change into a text input and you can retrieve your password.
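
If the password field has no usable name or ID at all, a more generic version of the same trick (my own sketch, not from the original post) flips every password input on the page. It is written as a function over a document-like object purely so the logic can be exercised outside a browser; in the Chrome console you would simply call it with the real document:

```javascript
// Change every password input in the given document into a plain text input
// so the browser reveals the auto-filled value. Returns how many fields changed.
function revealPasswords(doc) {
  var inputs = doc.querySelectorAll("input[type=password]");
  var count = 0;
  for (var i = 0; i < inputs.length; i++) {
    inputs[i].type = "text";
    count++;
  }
  return count;
}
// in the browser console: revealPasswords(document);
```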

And if you're in any doubt about why you should lock your computer when you are away from your desk then this should be a reminder to you!

Tuesday 12 June 2012

New unobtrusive EU Cookie Compliance Code

Some people have said that my JavaScript lightbox method for EU Cookie compliance is a bit off putting for users as it takes them away from the site if they disagree.

Also at the last minute the EU has allowed compliance with their new cookie privacy rules through a much easier method.

This allows sites to show users that cookies are being used, with a link to the site's cookie policy and a button to hide the message in future. If they continue to use the site they have basically agreed to the use of cookies.

You may have noticed on sites like the BBC or Channel 4 etc they will show a little piece of text at the top of the page that slides out and tells the user that cookies are being used with a link to their cookie policy. They also have a button to set a cookie so the message isn't shown again.

This method is a lot less intrusive and doesn't take the user away from the site as there is no "agree" or "disagree" button.

Therefore for those of you not wanting to make use of my EU Cookie compliance code I have created a newer version that follows this new format which can be seen here.

It is up to you to create a cookie policy document that details how cookies can be viewed and disabled through various browsers and toolbars etc. Plus the text and styling is up to you but the example html page shows this new unobtrusive method in action.

The code makes use of jQuery to handle the animations so load the library in from your usual repository, e.g Google's CDN or jQuery.com. The actual JavaScript for the html test page is below.

// run immediately - place in footer
var EUCookie = {
 confirmed : false,

 Confirm : function(e){   
  var self = this;
  // create cookie (stored for a year - change the number of days to suit)
  self.CreateCookie("EUCookie", 1, 365);
  // slide back in the cookie bar
  $("#cookieWarning").animate({
    height: 0
  }, 300, function(){
   $("#cookieWarning").css("display", "none");
  });
  return false;
 },

 CheckEUCookie : function(){
  var self = this,
   val = self.ReadCookie("EUCookie");
  // if our cookie has been set
  if(val != null && val == 1){   
   self.confirmed = true;
  }
  return self.confirmed;
 },

 CreateCookie : function(name,value,days) {
  var expires = "";
  if (days){
   var date = new Date();
   date.setTime(date.getTime() + (days*24*60*60*1000));
   expires = "; expires="+date.toGMTString();
  }
  document.cookie = name+"="+escape(value)+expires+"; path=/";
 },

 ReadCookie : function(name){
  var nameEQ = name + "=";
  var ca = document.cookie.split(';');
  for(var i=0;i < ca.length;i++) {
   var c = ca[i];
   while (c.charAt(0)==' ') c = c.substring(1,c.length);
   if (c.indexOf(nameEQ) == 0){
    var r = unescape(c.substring(nameEQ.length,c.length));    
    return r;
   }
  }
  return null;
 }
};

// add click event - the button id is an assumption, use whatever element closes your cookie bar
$("#closeCookieWarning").click(function(e){
 return EUCookie.Confirm(e);
});

// if no cookie set show form
if(EUCookie.CheckEUCookie()){
 // cookie already set so hide box
 $("#cookieWarning").css("display", "none");
}else{
 // Show the form - set to zilch then slide out to the appropriate height
 document.getElementById("cookieWarning").style.display = "none";
 document.getElementById("cookieWarning").style.height = 0;
 $("#cookieWarning").animate({
    height: 28
  }, 500).css("display", "block");    
}
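
For reference, the only markup the script really expects is a bar with the id "cookieWarning"; everything else in this sketch (the close link's id, the text, the policy link URL) is an assumption you should adapt to your own page:

```html
<div id="cookieWarning">
  This site uses cookies - see our <a href="/cookie-policy">cookie policy</a> for details.
  <!-- the close link id is an assumption - wire it up to whatever element hides the bar -->
  <a id="closeCookieWarning" href="#">Continue</a>
</div>
```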

To view the code in action you can visit the demo page and download the html page and tweak it to your heart's desire.

Let me know what you think.

Saturday 2 June 2012

The Strictly iPhone Console - Debugging on iPhones

The Strictly iPhone Console - Increasing real debugging without simulators

Yesterday I blogged about debugging on iPhones and how useful the Debugger console was which showed up JavaScript errors and console messages without the need for an agent switcher. However I complained about the lack of ability to view the generated source code on the page when viewing on an iPhone.

However I remembered sometime back I blogged about some bookmarklets I was using that enabled me to view the generated source on older browsers like IE 6 and tonight I knocked together a little script that enables you to have basic "onscreen" debugging functionality when using an iPhone.

The code is some basic JavaScript that adds a DIV area to the bottom of the current page with two links at the top that allow you to view the generated and raw source code for the page. These open up in new windows and  if you have problems with them opening you might need to enable the popup window functionality in your iPhone Safari settings first.

Underneath is a basic console which, due to the lack of scrollbar functionality on DIVs with overflow:auto or overflow:scroll on the iPhone, I have created with a readonly textarea. If you need to scroll down the console you should use the two finger drag gesture to move the content within the textarea up or down.

I have also overwritten the window.console object so that the console.log function pipes out messages to this console if it's being used.
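
The piping idea can be sketched in isolation (my own simplified version, not the exact bookmarklet code - in the bookmarklet the sink is the textarea's value rather than a callback):

```javascript
// Wrap console.log so every message is also passed to a sink function
// (on the iPhone the sink appends to the on-screen textarea console).
function pipeConsole(sink) {
  var original = console.log;
  console.log = function (msg) {
    sink(msg);                    // send to our on-screen console
    original.call(console, msg);  // still log normally
  };
  return function restore() { console.log = original; };
}

// usage: collect messages in an array instead of a textarea
var lines = [];
var restore = pipeConsole(function (m) { lines.push(m); });
console.log("hello");
restore();
```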

To Install the Strictly iPhone Console

1. On your iPhone visit this page and copy the JavaScript code from the area below into your clipboard.

2. Bookmark any page on your phone and then go into your bookmarks and edit it. Change the name to "Strictly Console" before pasting the copied source code in as the URL location for the bookmark.

3. Test that the code is working by going to a webpage on your iPhone and once the page has loaded open your bookmarks and select the "Strictly Console" bookmark. The console should appear at the bottom of the page. At the top of the console will be two links in a grey background "View Source" and "View Generated Source". Underneath will be the console area.

Clicking on either of those links will open up a new page with either the original source code or the generated source code from the current page. If nothing happens when you click the link check your Safari settings so that you allow Pop Ups. When you click the link it will ask you whether you want to open the pop up or not. Choosing Yes will show you the source in a new window.

4. To pipe debug out to the Strictly iPhone console window just use the standard console.log('hello'); function to do so.

As with all code that relies on DOM elements, make sure you check for the existence of the console before trying to access it, e.g a simple debug function could look like this:

function ShowDebug(m){
 if(typeof(console) != "undefined" && typeof(console.log) != "undefined"){
  console.log(m);
 }
}

If you have problems copying and pasting from here (due to the crummy formatting of HTML in blogger) then you can download the compressed script from this location: iphoneconsolebookmark.js

var%20log=_d.createElement("div");log.setAttribute("id","logger");"visible";"block";"2147483647";"relative";"#fff";"1px%20solid";"98%";"0";"0";"3300px";"5px";"left";_d.getElementsByTagName("body")[0].appendChild(log);var%20link=_d.createElement("div");"100%";"lightgray";"navy";"bold";link.innerHTML="%3Ca%20onmouseout='\"navy\";'%20onmouseover='\"blue\";'%20style='text-decoration:none;'%20href='#'%20onclick='console.RawSource();return%20false;'%3EView%20Source%3C/a%3E%20|%20%3Ca%20onmouseout='\"navy\";'%20onmouseover='\"blue\";'%20style='text-decoration:none;'%20href='#'%20onclick='console.GenSource();return%20false;'%3EView%20Generated%20Source%3C/a%3E";log.appendChild(link);var txt=_d.createElement("textarea");txt.setAttribute("id","logwindow");txt.setAttribute("readonly","readonly");"100%";"300px";log.appendChild(txt);

Test the Strictly Debug Console now

You can try the Strictly iPhone debug console out on this page by clicking the following button which will add a new debug message to the console on each click. The console should already have 15 messages inside it which were added when the page loaded. Remember the console will be at the very bottom of the screen so scroll right down to see it.

Obviously this is a very basic iPhone console and nothing like Firebug or Chrome's inbuilt console, but it could be easily expanded with a little work - by loading in the jQuery iPhone library you could easily create a popup DOM inspector, initialised by a long tap down event, to show the current element's styling and positioning. Let me know what you think and if you amend it to add more features let me know so I can update the code here.

To read why a proper debugging console ON the device is required rather than a user-agent switcher then read this article I wrote about debugging on iPhones.

C# Betfair API Code for identifying WIN only markets

Identifying WIN only markets on the BETFAIR API

Updated 02-JUN-2012 

As it's Derby day again I ran into the same problem as last year with this article, except that this time the wrong market identified as WIN ONLY was the FAV SP market - one in which you bet on the favourite's starting price. Therefore I have updated the BOT code for identifying a market from the Betfair API from HorseName, Racedatetime and Course alone.

If you don't know I developed the website which allows members to access UK horse trainer information everyday about their runners.

As a side line I have also developed my own AutoBOT which uses the BETFAIR Free API to place bets automatically using my own ranking system which I have developed. You can follow my tips on Twitter at @HorseRaceInfo.

One of the problems I have come across during the development of my AutoBOT is that if you have the name of the Horse, Course and time of the race and want to find the Market ID that Betfair uses to identify each race there is a costly mistake that can occur due to all the various markets that are available.

I really got hooked on racing (not betting but actually watching horse racing) when I had a bet on Workforce in the 2010 Derby.

It came from the back of the field to storm past everyone else and won the Derby in record course time and in astonishing style.

Watching him apply the same tactics in the Prix de l'Arc de Triomphe to become the champion of Europe that same year installed the racing bug and then watching Frankel win the 2000 guineas this year in such amazing style has ensured that something I used to have no interest in watching whatsoever has become a TV channel turner.

Therefore when Frankel won the St James's Palace Stakes this year at Royal Ascot I was happy knowing that the AutoBOT I had written had placed a WIN bet on the horse early enough to get a decent price (for what was on offer for an almost 100% guaranteed win).

However when I found out that I had actually lost the bet my BOT had placed, I spent more than a few minutes scratching my head and cursing the PC I was sat in front of. I wanted to know why, and it turned out that the market my application had put the bet on was a special WIN market in which the winner had to win by at least 4 clear lengths. Because Frankel had won by less than a length I had lost the bet.

I was annoyed.

I was quite pissed off actually, and when I looked into it I found that placing a WIN only bet on the main WIN market in Betfair is quite a pain in the arse if you don't know the Market ID upfront, as there is nothing in the compressed data given to you to identify that the market is the main WIN market and not some special market such as the one my bet was lost in.

Instead all you can do is run through a series of negative tests to ensure that the market is not a PLACE market, a Reverse Forecast or a Horse A versus Horse B market.

In fact since then I have found that there are so many possible markets it can be quite a nightmare to get the right one if you don't already have the Market ID.

For example today at 15:50 there was a race at Ascot, the Betfair Summer Double First Leg International Stakes that actually had alongside the usual markets a FIVE TO BE PLACED and TEN TO BE PLACED market. This was in a race with 23 runners!

The prices were obviously minimal and you would have had to put down a tenner to win 70p on the favourite Hawkeythenoo, but it meant that my original code to identify the main WIN market required updating as it was returning these new market ID's instead of the one that I wanted.

I have outputted the code for my Betfair API Unpack class below and this is just the part of my AutoBOT that returns a Market ID when provided with the compressed string of data that Betfair provides along with the Course name, the market type (WIN or PLACE) and the Race Date and Time.

You will see that I am using LINQ to filter out my data and I am using a custom function in my WHERE clause to return a match. It is this function that is the key as it has to check all the possible Betfair Market types to rule them out when looking for the main WIN market.

If you don't use C# then LINQ is one of the cool tools that makes it such a great language as it enables you to apply SQL like queries to any type of object that extends IEnumerable.

Obviously if you don't bet or don't use Betfair you might be wondering why the heck this should interest you, and you would be right, apart from this bit of code being a nice example of how to use LINQ to return a custom list that can be iterated through like any array or list of objects.

Remember: Betfair may introduce even more markets in the future and if anyone knows of any markets I have missed then please let me know as I don't want to lose any more money by accident because of some weird market Betfair decides to trade on.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;

namespace BetfairUnpack
{
    // This is my object that holds data about the Betfair market
    public class MarketDataType
    {
        public int marketId;
        public string marketName;
        public string marketType;
        public string marketStatus;
        public DateTime eventDate;
        public string menuPath;
        public string eventHeirachy;
        public int betDelay;
        public int exchangeId;
        public string countryCode;
        public DateTime lastRefresh;
        public int noOfRunners;
        public int noOfWinners;
        public double totalAmountMatched;
        public bool bspMarket;
        public bool turningInPlay;
    }

    public class UnpackMarket
    {
        // Use my own class and make a list object we can loop through like an array
        public List<MarketDataType> marketData;

        private string BaseDateVal = "1/1/1970";
        private string ColonCode = "&%^@"; // The substitute code for "\:"
        private int DaylightSavings = 3600000;

        // This method unpacks a compressed string and returns the correct MarketID filtering by the Course, Date and Market type
        public UnpackMarket(string MarketString, string racecourse, DateTime racedatetime, string marketType)
        {
            string[] Mdata;

            // Betfair uses its own format and we need to split on a colon
            Mdata = MarketString.Replace(@"\:", ColonCode).Split(':');

            // get our date and time
            DateTime BaseDate = Convert.ToDateTime(BaseDateVal);

            // if we are not currently in daylight savings then set that property to 0 so we get the correct time
            // I have had instances where the correct market is not returned due to Daylight Savings Time
            if (!DateTime.Now.IsDaylightSavingTime())
            {
                DaylightSavings = 0;
            }

            // Use LINQ on our IEnumerable object to query our list of markets filtering by our custom function MatchMarket
            IEnumerable<MarketDataType> queryMarkets =
                from m in Mdata
                where !String.IsNullOrEmpty(m)
                let field = m.Split('~')
                where (MatchMarket(field[5], BaseDate.AddMilliseconds(DaylightSavings + Convert.ToDouble(field[4])), field[1], racecourse, racedatetime, marketType))
                select new MarketDataType()
                {
                    marketId = Convert.ToInt32(field[0]),
                    marketName = field[1].Replace(ColonCode, ":"),
                    marketType = field[2],
                    marketStatus = field[3],
                    eventDate = BaseDate.AddMilliseconds(DaylightSavings + Convert.ToDouble(field[4])),
                    menuPath = field[5].Replace(ColonCode, ":"),
                    eventHeirachy = field[6],
                    betDelay = Convert.ToInt32(field[7]),
                    exchangeId = Convert.ToInt32(field[8]),
                    countryCode = field[9],
                    lastRefresh = BaseDate.AddMilliseconds(DaylightSavings + Convert.ToDouble(field[10])),
                    noOfRunners = Convert.ToInt32(field[11]),
                    noOfWinners = Convert.ToInt32(field[12]),
                    totalAmountMatched = Convert.ToDouble(field[13]),
                    bspMarket = (field[14] == "Y"),
                    turningInPlay = (field[15] == "Y")
                };

            // convert into a nice easy to iterate list
            marketData = queryMarkets.ToList();
        }

        // return a Market if the values provided match
        private bool MatchMarket(string menuPath, DateTime eventDate, string marketName, string racecourse, DateTime racedatetime, string marketType)
        {
            bool success = false;

            // do some cleaning as Betfair's format isn't the prettiest!
            menuPath = menuPath.Replace(ColonCode, ":");
            marketName = marketName.Trim();

            // does the path contain the market abbreviation - we keep a list of Courses and their Betfair abbreviation code
            if (menuPath.Contains(racecourse))
            {
                // check the date is also in the string
                string day = racedatetime.Day.ToString();
                string month = racedatetime.ToString("MMM");

                // we don't want 15:00 matching 17:15:00 so add :00 to the end of our time
                string time = racedatetime.ToString("HH:mm:ss");

                if (menuPath.Contains(day) && menuPath.Contains(month) && eventDate.ToString().Contains(time))
                {
                    // if no bet type supplied return all types
                    if (String.IsNullOrEmpty(marketType))
                    {
                        success = true;
                    }
                    else if (marketType == "PLACE")
                    {
                        // place bet so look for the standard To Be Placed market (change if you want specific markets e.g the 10 place market = 10 TBP)
                        if (marketName.Contains("To Be Placed"))
                            return true;
                        else
                            return false;
                    }
                    // we can only identify the main WIN market by ruling out all other possibilities - if Betfair adds new markets then this
                    // can cost us some severe money!
                    else if (marketType == "WIN")
                    {
                        // rule out all the various PLACE markets which seem to go up to ten horses! Just look for TBP e.g 10 TBP or 5 TBP
                        if (marketName.Contains("To Be Placed") || marketName.Contains("Place Market") || marketName.Contains(" TBP"))
                            return false;
                        // ignore forecast & reverse forecast and horseA v horseB markets
                        else if (marketName.Contains("forecast") || marketName.Contains("reverse") || marketName.Contains(" v ") || marketName.Contains("without ") || marketName.Contains("winning stall") || marketName.Contains(" vs ") || marketName.Contains(" rfc ") || marketName.Contains(" fc ") || marketName.Contains("less than") || marketName.Contains("more than") || marketName.Contains("lengths") || marketName.Contains("winning dist") || marketName.Contains("top jockey") || marketName.Contains("dist") || marketName.Contains("finish") || marketName.Contains("isp %") || marketName.Contains("irish") || marketName.Contains("french") || marketName.Contains("welsh") || marketName.Contains("australian") || marketName.Contains("italian") || marketName.Contains("winbsp") || marketName.Contains("fav sp") || marketName.Contains("the field"))
                            return false;
                        else
                            return true;
                    }
                    else
                    {
                        // I cannot match anything!
                        return false;
                    }
                }
            }

            return success;
        }
    }
}

Wednesday 30 May 2012

Overcoming the Demand 5 TV Catchup Performance Problem

Solving the Demand 5, Flash Movie problem - and other Catch Up TV website issues

If you are from the UK then you may often like to catch up on missed TV programmes by watching the online TV catch up services such as BBC iPlayer, Channel Four's 4OD and Channel 5's TV Catch Up service called Demand 5.

The past few nights I have been trying to catch up on a number of programmes shown on Channel 5 as for some reason Five Star has stopped showing the latest episode of Burn Notice on a Sunday afternoon and I like to watch other shows (when they are available) like NCIS and The Mentalist etc so I try to use Channel 5's catch up website Demand 5.

I really wish they would buy the rights to show programmes for longer so that you could watch one for more than just 7 days - but then it's not called catch up TV for nothing, and most TV programmes are made in the USA.

However when I use the Demand 5 website to watch a programme I experience a very annoying problem which affects every browser and computer I have tried, including Chrome, FireFox, Safari and IE 8 on a Sony Vaio Windows XP 32 bit Dual Core, and IE 9 on a Dell Windows 7 64 bit Quad Core.

The Demand 5 catch-up TV problem is this: the pre-show adverts play fine, but after anything between a few seconds and a couple of minutes of the actual show, the programme would either just freeze up and not start playing again at all.

Or it would play for a while, stop for a few seconds as if it was buffering more content, then play for another few seconds - and so on. Totally unwatchable.

I couldn't even move the skip bar backwards or forward and the pause / play button just didn't work at all.

A refresh would reload the page, show the pre-programme adverts again, and then the problem would re-occur. It was all very annoying.

After a number of seemingly ignored comments to Demand 5 complaining about them not fixing the problem and after reading on almost every show on the site that a large number of other people have had similar problems I tried solving the issue myself.

If you go to any show on Demand 5 and read the comments at the bottom you will see complaints along the following lines.

"Really poor transmission of episode 23. Can't get more than 20 seconds of coherent dialogue and then it freezes"
"Impossible to watch because of the poor service from Demand Five's website. Even worse than Karen's experience."
"Why is demand 5 not working? It hasn't worked for several weeks. You don't get this with iplayer!!!"
"doesnt seem to be able to play anything"
"wow the longest i got to watch before it froze was 50sec. i did give up 5min in though. I'm with sue off to iplayer which works."
And those are just a snapshot of the many comments on a couple of shows I wanted to watch today, all complaining about the Demand 5 website video freezing or not playing at all.

Demand 5 really need to fix this problem if they want to keep visitors coming to the site.

As my own comments and complaints went unanswered, I tried to fix the Demand Five freezing problem myself - mainly because I really wanted to watch the latest episode of NCIS.

I wasn't going to step through all their custom JavaScript, and I have no idea what they are doing server side, so proper debugging was out of the question. What you can do in situations like this, however, is turn everything else off and then back on again, one item at a time, to see whether any part of the browser or website is causing the problem.

I didn't know whether they were constantly running client side code that rewrites the DOM - a trick many sites use to beat the plugins designed to hide adverts, images and Flash - but I could see whether turning off those parts of the site helped, as they may be running code that constantly re-inserts adverts into the HTML to replace any removed by plugins like AdBlocker.

If they were doing something like this (and I don't know if they are) then it's possible that a race condition might occur between Demand 5's own advertising code and any plugins or virus checkers (many of which have built-in anti-banner options) that constantly scan and rebuild the DOM. Only proper debugging would prove whether that is happening.

Debugging the Demand 5 website Performance Problem

So to get a fix for the Demand 5 problem as soon as possible, the first things I did were all the normal things developers try when met with similar issues. If you find similar problems on other sites, or with plugins for websites, you should always try this list first to rule them out as causes of the problem.
  • Disabling all cookies - the Web Developer Toolbar in Firefox is great for this, and if you are not signing into the website there is no need for cookies to be enabled anyway.
  • Disabling Java - most websites don't even use Java applets any more, so it's not really needed until you find a site that actually makes use of it.
  • Disabling JavaScript - most websites that show TV content actually require JavaScript to be enabled for their site to work at all, even though a basic Flash OBJECT or VIDEO element outputted onto a page with server side code doesn't require it to play a movie. Companies like Demand 5 or iPlayer don't do it that way, however, because it would mean their content could easily be stolen, or watched overseas by non-UK viewers, so JavaScript is required to load in the adverts and programming in chunks.
  • Turning off all Flash tracking by disabling Flash cookies. You do this by right-clicking on a Flash movie, clicking the Global Settings option and then choosing "Block all sites from storing information on this computer". Unless you really want another way for advertisers and websites to track your online movements there is little need for this option to be enabled. You may want to check your video and microphone options as well.

Even after all this the Demand 5 Flash videos were still having problems playing. The options I had just disabled are worth turning off anyway to prevent online tracking by advertisers, but they didn't fix the Demand 5 TV catch-up problem.

Flash is a well known CPU killer on websites and I have seen whole PCs crash due to one too many Flash objects on a web page being left open too long. Just open a page with a few Flash movies and watch the CPU usage rise in your task manager if you want proof - Chrome seems particularly bad for this. Despite that, many sites continue to try and write their whole website in Flash, which is a BAD IDEA!

However I did notice that the page that held the movie also contained a large number of other banner adverts using both Flash and animated GIFs, and I wondered whether some sort of problem was occurring due to all the techniques advertisers now use to try and overcome advert blockers.

Basically, websites use a timer to re-insert banners that have been removed from the HTML DOM by advert-blocking plugins like AdBlocker - which in turn use a timer to remove any adverts re-inserted this way. Both sets of code do the opposite of each other every split second - not good for performance!

Because both the advertiser and the blocker use the same methods to constantly scan and modify the DOM, it basically turns into one big performance nightmare in which your DOM is continually being re-written on the fly. The less client code that runs the better - KISS (Keep It Simple, Stupid) applies for a multitude of reasons.
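To make the tug-of-war concrete, here is a minimal sketch of the pattern described above - assuming, as I do, that both sides run on timers. The "DOM" here is just an array of banner names so the churn can be counted; the tick functions are illustrative, not Demand 5's or any real plugin's actual code.

```javascript
// Hypothetical simulation of the advertiser vs advert-blocker race.
// Each "tick" represents one timer firing: the blocker strips every
// banner it finds, the advertiser puts them all back. The page never
// reaches a steady state, it just burns CPU rewriting the same nodes.
const banners = ["ad-1", "ad-2", "ad-3"];
let dom = [...banners];   // stand-in for the page's banner elements
let rewrites = 0;         // how many times the "DOM" was modified

function blockerTick() {
  // Advert blocker: remove every banner it can see
  if (dom.length > 0) { dom = []; rewrites++; }
}

function advertiserTick() {
  // Advertiser script: re-insert anything that went missing
  if (dom.length < banners.length) { dom = [...banners]; rewrites++; }
}

// 50 alternating timer firings - the DOM is rewritten on every single one
for (let i = 0; i < 50; i++) {
  blockerTick();
  advertiserTick();
}

console.log(rewrites); // 100 rewrites for 50 cycles - all wasted work
```

In a real page each rewrite also triggers layout and paint work in the browser, which is why this kind of fight can starve a Flash player of CPU time.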

The Fix for the Demand 5 website Performance Problem

I opened Firefox and went to install a plugin I use on my other computer called Flash Blocker but I noticed a new plugin had been released called Image and Flash Blocker 0.7. I thought I would try it out.

At first I thought something hadn't loaded as I couldn't see any options under my Tools menu however you need to use the context menu (right click the mouse button) to use the add-on and see the various options.

Choose "Image and Flash Blocker" from the context menu and then select "Images off, Flash off" and hey presto - most of the imagery, banner adverts, Flash banners and movies disappear. In fact most of the Demand 5 page is blank without these features enabled.

The flash movies are replaced with a little red circle with a white "F" (for Flash) in the middle and if you want that particular flash movie to appear you just click it.

I turned every image and Flash movie off on the latest episode of the show I wanted to watch (e.g. Burn Notice, NCIS, Archer, The Mentalist etc.), and then clicked the main movie screen to turn the video back on.

When the adverts had finished playing the TV show played without a single stall, stoppage or flicker. Hey presto problem solved! Big slap on the back for moi.

Remember - too much whizz bangery can cause performance issues

Remember: images, applets, ActiveX objects, Flash and all the other fancy whizz-bangery that comes with HTML 5, CSS 3 and modern JavaScript libraries are all well and good, but more often than not they can cause a massive performance overhead on your computer. The best tactic is to turn it all off and then only turn on what you need, once you know you need it and the client's browser supports it.
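That "only turn it on when the browser supports it" tactic can be sketched with a simple capability check. This is a generic feature-detection example, not anything Demand 5 actually does; the `enableVideoPlayer` call is a hypothetical loader name.

```javascript
// Feature detection sketch: probe for HTML 5 video support before
// switching the heavyweight enhancement on, and fall back gracefully
// anywhere the capability (or the DOM itself) is missing.
function canUseHtml5Video() {
  // Outside a browser there is no DOM at all, so report "not supported"
  if (typeof document === "undefined") return false;
  const v = document.createElement("video");
  // canPlayType returns "", "maybe" or "probably"; "" is falsy
  return !!(v.canPlayType && v.canPlayType("video/mp4"));
}

if (canUseHtml5Video()) {
  // enableVideoPlayer();  // hypothetical: load the player only on demand
} else {
  // leave the fancy feature off - e.g. show a plain download link instead
}

console.log(canUseHtml5Video()); // false when no DOM is available
```

The same shape works for any whizz-bangery: test first, enable second, and the browsers (or users) that can't cope never pay the cost.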

In the case of fixing Demand 5 catch-up TV it was definitely a case of the less the better, as it seemed too much was going on behind the scenes to allow their TV shows to play smoothly. Exactly what is happening I don't know without access to their source code, but this is definitely a workaround for the Demand 5 performance problem that works!

Hopefully this blog article will help others overcome the same problem. I have tried writing comments with a link to this article on most of the shows I watch, as they all contain similar complaints, but for some reason they don't like the fact that I put a link in my comment or try to help people solve their own technical issues.

So if you watch Demand 5 and have performance problems, remember: turn off images and Flash, then turn on the only Flash movie you need - the TV programme you are trying to watch - and enjoy it without any flickering or stopping every 20 seconds.