Wednesday, 12 December 2012

HackBar Not Showing in Firefox

How to display the HackBar in Firefox

My preferred browser for general surfing is Google Chrome. I moved to Chrome for its speed and simplicity, and because, like most people, when I got disillusioned with IE and moved to FireFox I installed so many plugins that it became slow to load and went through periods of hanging and high CPU and memory usage.

Therefore I have decided to use FireFox for certain development and debugging tasks where I want to use plugins, such as the ColorZilla colour picker, the Web Developer Toolbar, the Modify Headers plugin or any number of others I have installed. I then keep Chrome plugin free so it stays fast for browsing.

I did try Chrome out with plugins when support for them first arrived, but I soon decided I didn't want to turn Chrome into another FireFox by overloading it with plugin checks at start up, which is what has happened with FireFox.

So Chrome is plugin free and fast, while FireFox is useful and handy, full of plugins and tools and good for development. Although IE 9 is fast and probably just as good nowadays, I am guessing most developers won't go back to it after it took Microsoft numerous versions just to standardise their markup and CSS handling after years of complaining from the masses.

One of the plugins I use on FireFox a lot is the HackBar.

Not only is it useful for loading up long URLs and quickly Base64 encoding or decoding strings, as well as URL encoding and decoding, but it has numerous other features if you want to test your site for XSS or SQL injection vulnerabilities.

However I usually find I use it mostly for quickly encoding and decoding strings, and sometimes I have to put really long ones in the box, extending it so much that the viewport becomes unusable. I then find myself disabling the add-on.

However, on more than a few occasions now, when I come to re-enable the HackBar by right clicking in the menu bar and ticking the option for HackBar, or by using the "View > HackBar" menu option, I find that it doesn't display as expected and hunting around for it is of no use at all.

You can try disabling it in the Add-ons section or even un-installing and re-installing it but even then it might not appear.

However I have found that a quick hit of the F9 key will make it show. Simples!

So if you are ever having issues making toolbars or plugins show after you have de-activated them, try the F9 key first before anything else.

Saturday, 24 November 2012

Strictly AutoTags 2.0.8 Released

Strictly Auto Tags 2.0.8 Released


I have just released a new version of Strictly AutoTags (or Strictly Auto Tags) which includes extra SEO options that enable you not only to wrap certain tags in <strong> tags but also to link them to their related tag page.

This feature is called Deep Linking and the aim is that you can let BOTS find extra content related to your article quickly from the post.

You can control the number of words per tag that get deep-linked, as well as limiting the deep-linking to tags with a minimum number of posts, e.g. only deep-link tags with 10 or more posts associated with them.

The benefit of deep-linking when a post is saved, rather than when it is output as other SEO plugins do, is that you speed up the display of the article.

Remember you only save a post one or a few times whilst it can be displayed thousands of times (with or without a caching plugin).

There are also options to clean out the SEO additions, e.g. removing the deep-linking and bolding when you save a post, so that every edit gets a fresh AutoTag scan.
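The save-time deep-linking idea can be sketched in JavaScript (the plugin itself is PHP inside Wordpress; the function and option names below are invented purely for illustration):

```javascript
// Sketch of the deep-linking idea: when a post is saved, wrap the first
// occurrence of each qualifying tag in a bold link to its tag page.
// All names here are illustrative, not the plugin's actual API.
function deepLinkTags(content, tags, options) {
  var minPosts = options.minPosts || 10; // only deep-link tags with enough posts
  var maxLinks = options.maxLinks || 5;  // cap the number of links per post
  var linked = 0;
  tags.forEach(function (tag) {
    if (linked >= maxLinks || tag.postCount < minPosts) { return; }
    // match the tag name as a whole word, first occurrence only
    var re = new RegExp("\\b(" + tag.name + ")\\b");
    if (re.test(content)) {
      content = content.replace(re,
        '<strong><a href="/tag/' + tag.slug + '/">$1</a></strong>');
      linked++;
    }
  });
  return content;
}
```

Because this runs at save time, the link markup is stored with the post and costs nothing extra on display.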

If you want to check out a website that has been using the plugin for some time then check out www.ukhorseracingtipster.com and see the tags that are displayed for each article.

If you like the plugin and make a donation then let me know and I will add a backlink from this site to yours so that you get some free advertising and some PR / Link Juice from Google.

Check out the main Wordpress plugin repository to download the latest version or visit the plugin page on my main site.

Strictly TweetBOT Version 1.1.2 Released

Version 1.1.2 of the Wordpress Plugin - Strictly TweetBot released

New features and bug fixes include:

  • Fixed bug with some words in the TweetShrink function.
  • Added new short versions of words for the TweetShrink function.
  • I have improved the performance of the loop that controls the tweets by getting the permalink for the new post once, rather than on every loop iteration as before.
  • I have added an option to make an HTTP request to the new post before tweeting, so that if caching plugins are installed and configured correctly the page will be cached before any Twitter rush from BOTS visiting the post. I have seen that 50+ concurrent requests (a number growing all the time) can hit any link in a new Twitter post as soon as the Tweet is posted.
  • I have added an option to use a different user-agent when making this HTTP request so that you can easily identify the request in your access log files for debugging etc.
  • I have also added a new test to the "Config Test" button which runs the HTTP cache test against the last article posted and returns a status code for it.
Caching new posts before any Twitter Rush

The main feature of this release is the ability to fire off an HTTP request to the post being Tweeted about before any Tweets are sent.

The idea is that if you are using any of a number of caching systems (properly configured, obviously - every caching system is different), the first page request will come from the TweetBot and hopefully cause that page to be cached by whichever system you use, e.g. WP Super Cache, W3 Total Cache, Apache caching modules etc.

The aim is that by caching the post before any Tweets are sent, any Twitter Rush that occurs when your link hits Twitter does not overload the site with 50+ BOTS all concurrently hitting the page and competing to be the first visit that causes the page to be cached for future visitors.

Obviously if you are logged in as admin when writing your post then this might not create a cached copy in some systems. For example WP Super Cache has an option NOT to serve cached pages to logged in users, so if that option is set this automated HTTP request might not cache the page.

However if you are auto-blogging and importing posts from feeds using WP Robot, WP-O-Matic or any other system then you will not be logged in and this HTTP request should create a cached page for BOTS to hit.

As I said, it all depends on the caching system you are using and how you configure it.

Remember, every caching system is different and it all depends on the options you configure in the admin area of your particular caching system, so make sure it is configured correctly to allow the first visit to any post to be cached.
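The cache-priming step itself is simple. Here is a sketch in JavaScript (the plugin is PHP and these names are invented) with the HTTP call injected so the logic can be tested with a stub; the custom user-agent is what makes the priming request easy to spot in your access logs:

```javascript
// Sketch of the "prime the cache before tweeting" idea. httpGet is injected
// so any HTTP client (or a test stub) can be used. The default user-agent
// string here is made up for illustration.
function primeCacheBeforeTweet(postUrl, httpGet, userAgent) {
  var response = httpGet(postUrl, {
    headers: { "User-Agent": userAgent || "StrictlyTweetBot-CachePrimer" }
  });
  // Return the status code so a config test can report success or failure
  return response.status;
}
```

A caching plugin that is set up to cache on first visit should now have the page ready before the Twitter Rush arrives.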

So version 1.1.2 of Strictly-Tweetbot is now live and you can download it from my main site or from the Wordpress plugin repository whenever you want.

Tuesday, 30 October 2012

New version of the SEO Twitter Hunter Application

Introducing  version 1.0.4 of the Twitter Hashtag Hunter Application

I have just released the latest version of the popular Windows application that is used by SEO experts and tweeters, in combination with my Strictly Tweetbot Wordpress plugin, to find new @accounts and #hashtags to follow and use.

Version 1.0.4 of the Twitter HashTag Hunter application has the following features:
  • A progress bar to keep you informed of the application's progress as it scans.
  • More detailed error reporting, including handling the fail whale, e.g. the 503 service unavailable error.
  • More HTTP status code errors, including 400, 404, 403 and the 420 "Twitter scan rate exceeded" limit.
  • Clickable URLs that open the relevant Twitter account or hash tag search in your browser.
  • Multiple checks to find an account's follower numbers, to try to future proof the application in case Twitter change their code again.
  • A new settings tab that controls your HTTP request behaviour.
  • The ability to add proxy server details, e.g. IP address and port number, to scan with.
  • The ability to change your user-agent, as well as a random user-agent switcher that picks between multiple agent strings for each HTTP request when a blank user-agent is provided.
  • An HTTP time-out setting to control how long to wait for a response from the API.
  • A setting to specify a wait period in-between scans to prevent rate exceeded errors.
  • A setting to specify a wait period when a "Twitter scan rate exceeded" error does occur.
  • Extra error messages to explain the result of the scan and any problems with the requests or settings.
The main new feature of 1.0.4 is the settings panel that controls your scanning behaviour. It allows you to scan through a proxy server, specify a user-agent, and set delay periods both in-between scans and after the "Twitter Scan Rate exceeded" error which occurs if you scan too much.
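As a sketch, the random user-agent switcher described above might look like this in JavaScript (the application itself is a Windows program; the names and pool contents here are invented for illustration):

```javascript
// If a user-agent has been configured, use it; if the setting is blank,
// pick one at random from a pool so successive scans don't all look identical.
function chooseUserAgent(configured, pool) {
  if (configured && configured.trim() !== "") { return configured; }
  return pool[Math.floor(Math.random() * pool.length)];
}
```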

Changing the Scanner Settings

For Search Engine Optimisation (SEO) experts, or just site owners wanting to find out who they should be following and which #hashtags they should be using in their tweets, this application is a cheap and useful tool that helps get your social media campaign off the ground by utilising Twitter's Search API.

You can download the application from the main website www.strictly-software.com.

Monday, 22 October 2012

Fixing Postie the Wordpress Plugin for XSS Attacks that don't exist

Fixing the "Possible XSS attack - ignoring email" error message in Postie

As you may know if you read my earlier post on fixing the Wordpress plugin Postie, when it wouldn't let me pass multiple categories in their various formats in the subject line, a new version of Postie has come out since then.

However I have regularly been noticing that emails that should be appearing on my Wordpress site when posted by email using Postie haven't been.

Today I looked into why, and when I ran the manual process to load emails by pressing the "Run Postie" button I was met with an error message that said:

possible XSS attack - ignoring email

I looked into the code and searched for the error message, which is on line 38 of the file postie_getmail.php. It gets displayed when a regular expression that is supposed to detect XSS attacks matches the email.

The code is below

if (preg_match("/.*(script|onload|meta|base64).*/is", $email)) {
 echo "possible XSS attack - ignoring email\n";
 continue;
}

I tested this was the problem by running the script manually in the config area of Postie and outputting the full email before the regular expression test.

As the email is base64 encoded (well, mine is anyway) the full headers are shown at the top of the encoded email, e.g:

Return-Path:
X-Original-To: xx12autopost230.sitename@domain-name.com
Delivered-To: xx12autopost230.sitename@domain-name.com
Received: from smtp-relay-2.myrelay (smtp-relay-2.myrelay [111.11.3.197])
by domain-name.com (Postfix) with ESMTP id 8497724009C
for ; Mon, 22 Oct 2012 05:49:32 +0000 (UTC)
Received: from xxxxxxx (unknown [11.1.1.1])
by smtp-relay-2.myrelay (Postfix) with ESMTP id 9E3B495733
for ; Mon, 22 Oct 2012 06:45:30 +0100 (BST)
MIME-Version: 1.0
From: admin@sitename.com
To: xx12autopost230.sitename@domain-name.com
Date: 22 Oct 2012 06:46:28 +0100
Subject: Subject: [My Subject Category1] [My Subject Category2] Title of Email
 October 2012
Content-Type: text/html; charset=utf-8
Content-Transfer-Encoding: base64
Message-Id: <20121022054530 .9e3b495733=".9e3b495733" smtp-relay-2.myrelay="smtp-relay-2.myrelay">

PHA+PHN0cm9uZz5ZZXN0ZXJkYXlzIG1lbWJlcnMgaGFkIGFjY2VzcyB0byA0NSB0aXBzIGFj
cm9zcyA4IGRpZmZlcmVudCBzeXN0ZW1zLjwvc3Ryb25nPjwvcD48cD5JZiB5b3UgaGFkIHBs
YWNlZCBhIEJldGZhaXIgbWluaW11bSBiZXQgb2YgJnBvdW5kOzIuMDAgb24gZWFjaCBiZXQg
PHN0cm9uZz5vbiBFVkVSWSBzeXN0ZW08L3N0cm9uZz4gKExheXMsIFBsYWNlIERvdWJsZXMs
IFdpbnMgZXRjKSB0aGF0IGhhZCBhbiBTUCBsZXNzIHRoYW4gMTkvMSBhdCB0aGUgdGltZSBJ

I've just shown a bit of the message which is base64 encoded.

As you can see, if you do a search for one of the strings the pattern looks for, it appears as a plain word and not in a script context, e.g. base64 rather than base64("PHA+PHN0cm9");

The word base64 appears in the headers of the email e.g:

Content-Transfer-Encoding: base64

Therefore the regular expression matches and Postie displays the "possible XSS attack - ignoring email" error message.

Therefore just doing a basic string search for these words:

script, onload, meta and base64

Will mean that you could find yourself having emails deleted and content not appearing on your Wordpress site when you expect it to, all due to a regular expression that pops up false positives for XSS attacks when none really exist.

Also, these words could legitimately appear in your HTML content for any number of reasons, not just because they are used in the email headers, so a better regular expression is required to check for XSS attacks.

How to fix this problem

You could either remove the word base64 from the regular expression or you could delete the whole test for the XSS attack.

However I went for keeping a test for XSS attacks but making it check more thoroughly for proper usage of the dangerous constructs rather than simple occurrences of the words.

The regular expression is more complicated but it covers more XSS attack vectors and I have tested it myself on my own site and it works fine.

You can replace the code that is causing the problem with the code below.


if(preg_match("@((%3C|<)/?script|<meta|document\.|\.cookie|\.createElement|onload\s*=|(eval|base64)\()@is",$email)){
      echo "possible XSS attack - ignoring email\n";
      continue;
}

Not only does this mean that it won't fall over when the words are mentioned in headers, it actually looks for the correct usage of the hack and not just the word appearing. E.g. instead of looking for "script" it will look for:

<script, %3Cscript, </script and %3C/script

This covers not only the basic <script but also the URL-encoded bracket %3C, which is another common XSS attack vector. You could include other forms of encoding such as UTF-8, but it all depends on how complicated you want to make the test.
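You can see the difference by porting both patterns to JavaScript (its regex semantics are close enough to PCRE for these simple cases) and running them against a harmless mail header:

```javascript
// The original test: matches the bare words anywhere, even in mail headers.
var naive = /(script|onload|meta|base64)/i;

// The replacement: only matches the words in an attack context,
// e.g. "<script", "base64(" or "onload=".
var stricter = /((%3C|<)\/?script|<meta|document\.|\.cookie|\.createElement|onload\s*=|(eval|base64)\()/i;

var header = "Content-Transfer-Encoding: base64";
naive.test(header);    // true  - false positive, the email gets thrown away
stricter.test(header); // false - the harmless header is left alone

stricter.test("<script>alert(1)</script>"); // true - a real attack is still caught
```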

As you may know if you have read my blog for a long time, hackers are always changing their methods, and I have come across clever two stage SQL injection attacks which involve embedding an encoded hack string first and then a second attack whose role is to decode the first injection and roll out the hack.

The same can be done in XSS attacks, e.g. encoding your hack so it's not visible and then decoding it before using an eval statement to execute it. However, I am keeping things simple for now.

I have also added some more well known XSS attack vectors such as:

eval( , document. , .createElement and .cookie 

As you can see, I have prefixed or suffixed them all with a bracket or dot, which is how they would be used in JavaScript or PHP.

Notice however that I haven't prefixed createElement and cookie with the word document. This is because it is all too easy to do something like this:

var q=document,c=q.cookie;alert(c)

Which stores the document object in a variable called "q" and then uses that to access the cookie information. If you have a console just run that piece of JavaScript and you will see all your cookie information.

This regular expression still also tests for:

<script, base64(, <meta, onload= and onload =

but as you can see I have prefixed the words with angled brackets or suffixed them with rounded brackets, dots or equal signs (with or without a space).

This has solved the problem for me and kept the XSS attack defence in place. However, if you are passing HTML emails containing JavaScript to your site, just be aware that if you use any of these constructs they might be flagged up.

I have tested each XSS attack vector but let me know of any problems with the regular expression.

I have also removed the .* before and after the original test as it's not required; it just uses up memory looking for characters that may or may not be there, and the longer the string being searched the more memory it eats up.

I have updated my own version of this file and everything has gone onto the site fine since I have done so.

If anyone else is having problems with disappearing posts driven by Postie then this might be the cause.

You can see my Wordpress response here: Wordpress - Postie Deletes Email but doesn't post

Monday, 15 October 2012

Twitter changes their format for status feeds

Tweets not showing up? Twitter changes their feed format for status feeds

You may have noticed over the last couple of weeks that the usual Tweets from my Twitter account @strictlytweets have not been showing in the right hand side bar.

This is because Twitter have changed the URL that you must request to get those tweets from:

http://twitter.com/statuses/user_timeline/strictlytweets.json?count=10

To their new API, which uses a secure URL (https instead of http):

https://api.twitter.com/1/statuses/user_timeline.json?screen_name=strictlytweets&count=10


Now if you have a callback function it is called in the same way as before, with a parameter passed in the URL for the name of the function to be called.

Although I am not sure whether Twitter have changed the name of their standard function that formats Tweets into links from twitterCallback to twitterCallback2, if you take a look at the script Blogger uses you can see it calls twitterCallback2.

You can see their script here:

http://twitter.com/javascripts/blogger.js

Therefore if you have a special function that you need to call to format your tweets in your own style, cache them on your own server or scroll them, you will still need to add the name of your callback function to the script source and make sure it accepts the JSON array of Tweet objects.

You will also need to reference this script before calling the new script URL that gets the statuses.

If you look at this blog you can see that the first script loads in Blogger's own formatting of the Tweets through a call to blogger.js, and the second one gets the status feed from the new Twitter URL format using JSON.


<script type="text/javascript" src="http://twitter.com/javascripts/blogger.js"></script>
<script type="text/javascript" src="https://api.twitter.com/1/statuses/user_timeline.json?screen_name=strictlytweets&callback=twitterCallback2&count=6"></script>
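If you do write your own callback, it just needs to accept the array of tweet objects and build the HTML itself. A minimal sketch (assuming each tweet object carries its text in a `text` property, as the version 1 statuses feed does; the linkifying here is deliberately simplified):

```javascript
// A minimal twitterCallback2-style formatter: takes the JSON array of
// tweets and returns an HTML list, turning any bare URLs into links.
function formatTweets(tweets) {
  var items = tweets.map(function (tweet) {
    var html = tweet.text.replace(/(https?:\/\/\S+)/g, '<a href="$1">$1</a>');
    return "<li>" + html + "</li>";
  });
  return "<ul>" + items.join("") + "</ul>";
}
```

You would then pass `callback=formatTweets` in the script URL instead of `callback=twitterCallback2`.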


Twitter are constantly changing, and like most sites they are moving to secure URLs (https) and to returning data as JSON instead of XML/RSS.

So if you have a Twitter account and want to show your own Tweets on your site and it hasn't been working lately this is the reason why.

Just switch over to the new https format and you will be okay, as long as you also implement some form of caching to handle the rate limit, which is only 150 requests per hour per IP address, NOT per domain. If you authenticate each request with OAuth you can make 350 requests per hour.

As my IP address is also shared by a lot of other sites that use this feature the Tweets may still not appear until I come up with a form of caching (which I haven't got round to doing) but the logic would be something like this.
  • Make the request to the new API URL.
  • Format the HTML using your callback function.
  • Make an AJAX request to a server side script (PHP/ASP/.NET etc) that stores the formatted HTML as a text/html file.
  • If you get back a 400 status code (Bad Request), which is the code Twitter sends you when you exceed their rate limit, then display this stored HTML file instead.
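That fallback logic might look something like this sketch (the feed fetch, the cache and the formatting callback are all injected here, since the real implementation depends on your server side setup):

```javascript
// Sketch of the rate-limit fallback: try the live feed, cache the formatted
// HTML on success, and serve the cached copy when Twitter returns 400.
function getTweetsHtml(fetchFeed, cache, format) {
  var res = fetchFeed(); // e.g. {status: 200, tweets: [...]} or {status: 400}
  if (res.status === 200) {
    var html = format(res.tweets); // your callback formats the tweets
    cache.save(html);              // store for the next rate-limited request
    return html;
  }
  // 400 Bad Request is what Twitter sends when the rate limit is exceeded
  return cache.load() || "";
}
```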


You can read more about rate limits here.

Monday, 24 September 2012

Using String Builders to speed up string concatenation


If you are using a modern language then a string builder object, like the one in C#, is a standard tool for building up strings without the overhead of concatenation, which can be a performance killer.

The reason is simple.

When you do this

a = "hello I am Rob";

a = a + " and I would like to say thank you";

a = a + " and good night";


A lot of languages have to make a copy of the string built so far and hold it in memory before creating the new, combined string.

This means that the longer the string gets the more memory is used up as two copies have to be held at the same time before being joined together.

I have actually seen ASP classic sites crash with out of memory errors caused by people using string concatenation to build up large RSS feeds.

The reason I am mentioning this is because of a comment I received about my popular HTML Encoder object, which handles double encoding, numerical and entity encoding, and decoding of partial and fully encoded strings.

I have updated the numEncode function after the comment from Alex Oss to use a simple string builder which in JavaScript is very simple.

You just start with an empty array, push the new strings onto the end of it, and then join it all together at the end to get the full string out. You can see the new function below.


// Numerically encodes all characters outside the printable ASCII range
numEncode : function(s){ 
 if(this.isEmpty(s)) return ""; 

 var a = [],
  l = s.length; 
 
 for (var i=0; i<l; i++){ 
  var c = s.charAt(i); 
  if (c < " " || c > "~"){ 
   a.push("&#"); 
   a.push(c.charCodeAt(0)); // numeric value of the code point 
   a.push(";"); 
  }else{ 
   a.push(c); 
  } 
 } 
 
 return a.join("");  
}, 

You can download the latest version of my HTML Encoder Script for JavaScript here.

However in older languages like ASP classic you are stuck with either string concatenation or making your own string builder class.

I have made one which can be downloaded from my main website ASP String Builder Class.

You will notice that it ReDim's the array in chunks of 128 (which can be changed) and once 128 elements have been used it then ReDim's by another large chunk.

A counter is kept so we know how many elements we have actually added, and when we want to return the whole string we can either just RTrim it (if we are joining with a blank space) or ReDim the array back down to the right size before joining it together.

This is just an example of how a string builder class is used and you could make a similar one in JavaScript that lets you access specific elements, the previous or next slot, update specific slots and set the delimiter like this ASP version.
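As a sketch, a JavaScript version along those lines might look like this (chunked ReDim'ing isn't needed in JavaScript because arrays grow by themselves, so this just mirrors the shape of the ASP class):

```javascript
// A simple string builder mirroring the ASP class: append parts to an
// array, allow specific slots to be updated, and only join once at the end.
function StringBuilder(delimiter) {
  this.parts = [];
  this.delimiter = delimiter || "";
}
StringBuilder.prototype.append = function (s) {
  this.parts.push(s);
  return this; // allow chaining: sb.append("a").append("b")
};
StringBuilder.prototype.set = function (i, s) {
  this.parts[i] = s; // update a specific slot, like the ASP version allows
  return this;
};
StringBuilder.prototype.toString = function () {
  return this.parts.join(this.delimiter);
};
```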

Most modern languages have a String Builder class, but if you are using older languages or scripting languages like PHP or ASP classic then adding strings to an array before joining them together is the way to go for performance's sake.