Thursday, 20 February 2014

Quickly Generate A Random Number Between 1 and 9

By Strictly-Software

As you should know by now, the RAND() function in SQL has the downside that it will return the same value if the same seed is provided to it.

E.g. if I call this code multiple times I get the same number each time:

SELECT RAND(1) AS [RandomNumber]

This always returns 0.713591993212924 for me.

Therefore when people use RAND() they often end up coming up with various ways to generate a unique seed value as well.

One of the quickest ways I have found to return a random number is by using the NEWID() function which generates a random GUID.

If I wanted to return a random record from a recordset I would use the NEWID() function in my ORDER BY statement. 

I use this a lot in my Jobboard code when I want to return a random banner from a banner group like so:


SELECT TOP 1 BannerPK, BannerURL, RedirectURL
FROM   BANNERS
ORDER BY NEWID()


However if I wanted to return a single number between 1 and 9 I could use the following code.

This basically casts the GUID to a VARBINARY value, then to an INT (taking the ABS to keep it positive), then converts it to a VARCHAR before taking the first character with a LEFT statement.

I have found that this gives a larger proportion of 1's than any other number, but if you want a quick way in SQL to generate a single digit (or more, if you extend the characters returned with a string function such as LEFT(val,2), or use a SUBSTRING or RIGHT) then this code could do the trick.


DECLARE @RANDNO INT
SELECT @RANDNO = CAST(LEFT(CAST(ABS(CAST(CAST(NEWID() AS VARBINARY) AS INT)) AS VARCHAR),1) AS INT)
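
The bias towards 1's makes sense when you think about it: around half of all positive 32-bit integers start with a 1, and the LEFT() is just taking the first digit. If you want a more even spread, a common alternative (a quick sketch, not from the original code) is to hash the GUID with CHECKSUM and take a modulus:

DECLARE @RANDNO INT
-- CHECKSUM(NEWID()) produces a pseudo-random INT, ABS() makes it positive,
-- % 9 maps it to 0-8 and + 1 shifts the range to 1-9
SELECT @RANDNO = ABS(CHECKSUM(NEWID())) % 9 + 1
SELECT @RANDNO AS [RandomNumber]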

Tuesday, 18 February 2014

A WordPress Performance Killer - Automatic OPTIMIZE or REPAIR on big Databases

By Strictly-Software

I used to use the WordPress plugin WP-DBManager. I still do for the automatic backups (when they work); I have found that on one site the scheduling just doesn't work, so I have to run them by hand. There is no difference between that site and another where it works fine, but hey, that's WordPress!

However, one thing I found, especially once I passed 20k+ articles, is that my scheduled OPTIMIZE / REPAIR jobs would totally kill my sites every Friday for an hour or so.

It took me a while to find the cause, but when I used the Server Monitor in Navicat and saw that both my databases were doing a REPAIR BY SORT at the same time, I soon realised that my scheduled WP-DBManager job to REPAIR / OPTIMIZE every Friday was the culprit.

With MyISAM tables (the default storage engine for WordPress), an OPTIMIZE or REPAIR job will take the whole table out of action while the process is carried out. The table is locked while the sorting and repairing is done, which means no other users or processes can access it.

Unlike MS SQL, which lets you de-fragment a table's index whilst connections to it carry on working, MyISAM just locks up until the job is complete.
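
For comparison, a minimal T-SQL sketch of an online defrag in MS SQL (the table name here is just an example):

-- REORGANIZE always runs online, so queries against the table carry on working
ALTER INDEX ALL ON dbo.MyBigTable REORGANIZE;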

Symptoms


  • Your website hangs for ages or times out. If you are using Cloudflare you might be lucky enough to see cached pages, or else a 503 Service Unavailable error.
  • Running a TOP command will show a MySQL process consuming 90-100% of your CPU.
  • Checking your Server Monitor in Navicat shows a SORT going on with a long execution time; the more pages and posts you have, the longer it will take (you can also see this with the query below).
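
If you don't have Navicat to hand you can see the same thing straight from a MySQL console; a quick sketch:

-- lists every MySQL thread with its state and how long it has been running;
-- a long-running "Repair by sorting" state is the giveaway
SHOW FULL PROCESSLIST;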


Risks


  • If you have a scheduled job to REPAIR / OPTIMIZE your tables, for example through a plugin like WP-DBManager, then you could suffer this problem.
  • The type and size of the table matters. If it is using MyISAM it will be locked until the job is over.
  • The bigger the table, e.g. wp_posts, the longer it will take.
  • The more records there are, the longer it will take. If the table is used a lot then the site will be crushed until the job is over.


Solutions


  • I turned off the automatic OPTIMIZE and REPAIR jobs on WP-DBManager.
  • If I have scheduled downtime then I make use of it whilst the webserver is offline to carry out DB maintenance.
  • If you are clever enough you could create copies of your existing tables with new names, let your system carry on using the old tables whilst you OPTIMIZE / REPAIR the new ones, and then swap them back once finished (see the sketch after this list). A SYNONYM would be perfect for this (I don't know if MySQL supports SYNONYMs like MS SQL does), but you could write SQL to create a table, insert the records, repair it and then re-name it. Maybe a plugin could do this (ideas for developers? I would be willing to write one for ££).
  • I now use my own Strictly System Check plugin instead to handle automatic REPAIRs / OPTIMIZEs, as it will only carry them out if the system is already suffering. My plugin checks the server load, the time it took to load a page, the number of connections and so on. If the plugin determines that the system is having trouble you can set it to OPTIMIZE or REPAIR (after a check to see if one is needed). This way you are only fixing something that needs it rather than causing havoc for no reason.
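
As far as I know MySQL doesn't have SYNONYMs, but RENAME TABLE can swap tables in a single atomic statement, so a rough sketch of the copy-and-swap idea (table names are illustrative) could look like this:

-- build a copy with the same structure and data as the live table
CREATE TABLE wp_posts_copy LIKE wp_posts;
INSERT INTO wp_posts_copy SELECT * FROM wp_posts;

-- optimise the copy while the live table carries on serving traffic
OPTIMIZE TABLE wp_posts_copy;

-- atomically swap the optimised copy into place
-- (note: writes made to wp_posts during the copy would be lost, so this suits read-heavy tables)
RENAME TABLE wp_posts TO wp_posts_old, wp_posts_copy TO wp_posts;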


It is something to be on the lookout for. If your site suddenly acts slow with high MySQL CPU use then a scheduled REPAIR might be the cause.

If this happens it is best to turn off Apache, let the REPAIR do its job and then turn the webserver back on.
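
With the webserver off you can then run the maintenance directly rather than waiting for a plugin schedule; a minimal sketch against the usual WordPress tables:

-- check first, and only REPAIR / OPTIMIZE the tables that actually report a problem
CHECK TABLE wp_posts, wp_postmeta;
REPAIR TABLE wp_posts;
OPTIMIZE TABLE wp_posts;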


Thursday, 6 February 2014

Introducing a new version of Strictly AutoTags 2.9.1

As you should know by now I don't really like open source development.

When you have put literally hundreds of hours into a plugin, as I have with Strictly AutoTags, it is a bit dismaying to visit the WordPress site and see that you have 180,000+ downloads (not millions, but more than a few thousand), pretty good ratings, and good reviews on SEO sites and other places Google returns from a search on the plugin's name, yet hardly anyone has bothered to donate any money. Just imagine if every one of them had donated a single £1 for my hard work.

Believe it or not it takes a lot of my spare time to keep a plugin like this maintained and working: keeping it up to date with changes WordPress makes to its core code, as well as adding new features that often involve complex regular expressions and logic to get right.

Is £1 a lot to ask from someone who finds a plugin such as Strictly AutoTags useful?

I mean I find it very useful on my own sites.

Not only does it automatically add relevant tags to news stories but it does so without having to rely on third-party APIs that contain lookup databases of possible tags.

If a news story about me came out tomorrow and I was mentioned a few times in the article the plugin would pick my name up and add it to the sites list of tags without having to wait for someone else to decide I am important enough to be worthy of such a tag.

I have a couple of systems that automatically import news feeds into the site and then tweet out the contents to various Twitter accounts using another one of my plugins, Strictly TweetBot.

It analyses the content of the article and can change the format of the tweet, or the account it sends it to, depending on what it finds. Having Strictly AutoTags sit in the middle of this process not only ensures I don't have to manually edit each article to find relevant tags, it also provides them to Strictly TweetBot to be used as #HashTags when I tweet about the news story.

Also being able to set the plugin up to discover certain words and then tag the article with a totally different word is very useful for news-based sites.

If I have a news story about Edward Snowden or the NSA PRISM program I want to tag the article with the word "Privacy" or "Whistle-blower". The paid-for version of the plugin allows such logic.

So, because of the lack of donations, I have split my code in two and kept a paid-for version of my plugin for those people who actually find tools like this useful.

I don't really like the idea of having to keep two lots of code up to date, but if a £1 donation is too much for every one of the 150,000 people who have used it then I have to sell the better version to get some sort of recompense. Otherwise I might as well stop putting the code online and just use it on my own sites.



Latest Features of 2.9.1 Include:


  • The ability to remove basic formatting from the post. Ideal for auto-bloggers who import from feeds containing formatting tags that could cause formatting issues on your site.
  • Better regular expressions, more checks and the ability to bulk re-tag and re-link, with all key attributes (alt, title, src, href, data-blah) including shortcodes e.g. [youtube url] saved before re-formatting to keep the original HTML intact.
  • New functions to find tricky words like al-Nusra Front, World War II or 2,000 Guineas.
  • Bulk update to remove all HTML the plugin adds.
  • Option to turn the plugin into Clean Edit Mode so you can edit individual articles to remove any HTML you don't like.
  • New functions to turn text like http://www.strictly-software.com and www.strictly-software.com into clickable links e.g strictly-software.com.
  • Set a minimum length of characters for a tag to be used.
  • Set the maximum number of words allowed in a tag.
  • Set equivalent words for tags e.g search for Edward Snowden, NSA or PRISM and use the tag Internet Privacy.
  • Bold and deeplink commonly used tags to create great SEO for Googlebot and BING to crawl.
  • Set the minimum number of tags that must be stored against articles before they can be used in any deep linking.
  • Set the maximum number of tags in an article to deeplink.
  • Set the maximum number of tags to place against an article.
  • Create a list of case sensitive and case insensitive "noise" words to ignore and never use as tags.
  • Use the new finished_doing_tagging hook which is available for other plugin authors to use so that actions only occur after tagging is done on a post. For example my other plugin Strictly Tweetbot uses this hook so that tags are available to be used as #hashtags in tweets.
  • All content already inside special HTML attributes (title, src, href, alt and the new data-blah or data-blah-blah attributes) and WordPress shortcodes e.g. [youtube url] is stored in memory to prevent bolding or deeplinking within other tags or attributes.

You can buy the latest edition of the plugin OR download the latest free version from my own webpage at www.strictly-software.com



Wednesday, 15 January 2014

Nightmare with Wordpress, Crashed Corrupt Table and Fixing The Problem

By Strictly Software

This has been a real nightmare day for me!

You can read all about it on the WordPress forum here: Suddenly Templates Not Working (not that I actually got any help from anyone!).

I first noticed I had a problem on one of my Wordpress sites when I went to one of my pages that uses a custom page template and saw that it was totally blank. It should have been showing a feed with a Twitter widget in the sidebar and special content in the middle.

I edited the page and saw that the template was set to "Default Template". I set it to the correct template and tried saving it only for the page to reload with the Default Template still selected!

I turned on the WP_DEBUG constant and was met with a load of MySQL errors that all indicated that my wp_postmeta table was corrupt, e.g.

WordPress database error: [Incorrect file format 'wp_postmeta']
INSERT INTO wp_postmeta (post_id,meta_key,meta_value) VALUES (7381,'_edit_last','1')

I then ran a REPAIR (FULL) on the table, only for it to come back almost immediately with error messages such as:

.wp_postmeta check Error Incorrect file format 'wp_postmeta'
.wp_postmeta check error Corrupt

If a REPAIR or OPTIMIZE wouldn't fix it I knew I was in the shit!
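
For reference, the raw MySQL equivalents of those attempts look something like this (a sketch; EXTENDED is the slow but thorough option):

-- check the table, then try a full repair that rebuilds the indexes row by row
CHECK TABLE wp_postmeta EXTENDED;
REPAIR TABLE wp_postmeta EXTENDED;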

I tried searching the web but I couldn't find anything of use.

I was using the WP-DBManager plugin to create backups for me on this site but for some reason it had stopped working after my recent upgrade to WP 3.8. By the way, the amount of problems I have had since upgrading to this version would fill a book by now!

Anyway, the last manual database backup was done just before Christmas so it wasn't too far back, and in any case the table only holds SEO META, flags set by plugins and mostly guff that you don't really need anyway.

All my tags, categories, pictures and so on were still on the site so it wasn't too major a problem, except I hate BUGS! Especially ones I can't fix!

Therefore I created a new empty table with the same structure, indexes and settings as the existing wp_postmeta table and called it wp_postmetaNEW.

I turned off Apache and then dropped the old table. I renamed my new table to wp_postmeta (I could have truncated the old one instead, by the way) and checked it was working.
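
In raw SQL that recreate-and-swap is something like the following sketch (assuming MySQL can still read the corrupt table's definition; otherwise you have to script out the CREATE TABLE by hand, as I ended up doing later):

-- clone the structure, indexes and options of the existing table
CREATE TABLE wp_postmetaNEW LIKE wp_postmeta;

-- ditch the corrupt original and move the clean copy into its place
DROP TABLE wp_postmeta;
RENAME TABLE wp_postmetaNEW TO wp_postmeta;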

I then restored my old backup to a new database on the server called RESTORE.

I checked that the wp_postmeta table in the backup wasn't corrupt first and then ran a statement (whilst logged into Navicat as root) to copy across all the meta data I did have from the old database to my new table, e.g.

INSERT INTO MYDB.wp_postmeta
(post_id,meta_key,meta_value)
SELECT post_id,meta_key,meta_value
FROM RESTORE.wp_postmeta

This took a long time!

Especially for only 94,310 records.

I was waiting over 30 minutes with the server monitor ticking away saying it was repairing the table.

Then I noticed in my Putty console, via a TOP command, that no MySQL process was actually running!

This made me think the process had crashed and Navicat wasn't telling me the truth.

By the way, I have had this happen many a time whilst doing REPAIRs: waiting for the Navicat window to give me the results, only to notice that no MySQL process is running and that the actual job finished ages ago.

So I killed the process and checked how many rows were in the table - none. I did another REPAIR and it said the table was CRASHED again!

By this time I was so pissed off and wanting to go home that I just truncated the table and turned Apache back on.

From looking at the old RESTORE wp_postmeta table it was mostly guff, flags and custom SEO METAs anyway. Therefore I thought fuck it, I'm not wasting any more time on this. However, before I could go, the site froze up again and the Server Monitor was telling me a REPAIR BY SORT was going on.

However, yet again, running a TOP on the server's console showed a 0.00 load and no MySQL processes!

What was going on I have no idea, but it seems truncating a corrupt / crashed table didn't help. So I had to re-create the table, indexes, options, PK, auto-increment amount etc by hand.

Then I turned Apache back on and everything was blisteringly fast. I waited a little while to ensure an automatic REPAIR wouldn't start again, then checked all my pages were using the correct templates (filling in Yoast's SEO boxes at the same time) and left.

So far so good: it is all working and I don't seem to have lost any data apart from 90,000 rows of guff.

As I don't add my images in by hand (a lot of these blogs are pretty much automated) they are not in the wp_postmeta table. Instead they are just referenced from their local or remote location, which was lucky for me.

The key lesson here is: always have up-to-date backups and don't rely on WordPress plugins to do it for you.

You can see my debate over WP-DBManager not working on the WordPress forum here if you want. It was only a manual backup I had done through Navicat that would have saved me (if the insert had worked), as the plugin's backups were so out of date.

Anyway that was my "fix" for my problem.

Listen, learn and take notes from my mistakes so you don't make the same ones!

Friday, 10 January 2014

An ASP Classic Security Tool

By Strictly-Software

The following script is one I wrote a long time ago but one I found quite useful the other day when I was working on an ASP Classic site.

Download the script from my main website, testsecurity.txt, re-name it to .asp and copy it to your website's folder. You can then run it as an ASP page from your website.

The form is pretty simple and just asks for your ADO connection string so it can connect to your database using your website's login, or another if you so wish.

It runs a number of tests and flags any "potential" problems up in red.

These may or may not be issues but could be areas of concern. For example you may need your website to have access to system objects or to run certain extended stored procedures, but in many cases you shouldn't or wouldn't.

The security holes the tool looks at are:

  • The roles that the user is a member of, such as db_datareader, db_datawriter or db_ddladmin.
  • It runs some SELECT statements to see if it can access system tables using old and new versions e.g INFORMATION_SCHEMA.TABLES or SYS.OBJECTS.
  • Tests whether the user has direct write permission to your database. Obviously I use EXECUTE permission, with all my CUD (Create, Update, Delete) statements held inside stored procedures, but on older ASP systems you are likely to find people running direct CUD operations from the client code against the database (just like many PHP developers still do).
  • I test whether I can directly UPDATE a table inside your DB. I try to find a table using the system objects (which in itself could be a security hole) and then update a column in that table to its existing value so I don't overwrite anything. This tells me whether you potentially have a hole by allowing the website direct UPDATE permission. Imagine if your own sanitisation functions failed or an SQL injection attack succeeded; this could potentially bring your whole system down!
  • I test for the ability to run certain potentially dangerous extended stored procedures such as xp_cmdshell (which can run anything a command prompt could), xp_fileexist (which tells a user whether or not a file exists on the server) and xp_regwrite (which allows you to write to the registry with SQL). Obviously the first and last could cause drastic problems if a hacker could use them against you!
  • I also test whether this ASP page can itself write and delete files (a simple empty text file) inside the folder where the ASP file is located. This could be a security hole, as most websites only allow certain folders to be written to; being able to write files wherever the code is located could be dangerous, as a hacker could write a file that deletes other files or runs code. I also check whether I can write files to other places on the network, which is again another security hole. Sites should only allow the writing of files in certain places that can be virus checked and are known to be safe due to the permissions on the folder.

You might find this security test file that I quickly knocked up one day useful if you ever have to work with ASP Classic sites and want to quickly check a variety of potential security holes in the database, website and file system with one form submission.
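
To give you a flavour of the sort of checks involved, here is a hedged T-SQL sketch (not the tool's actual code) using built-in functions:

-- role membership: 1 = member, 0 = not a member, NULL = not a valid role
SELECT IS_MEMBER('db_datareader') AS IsDataReader,
       IS_MEMBER('db_datawriter') AS IsDataWriter,
       IS_MEMBER('db_ddladmin')   AS IsDdlAdmin;

-- does the current connection have blanket UPDATE permission on the database?
SELECT HAS_PERMS_BY_NAME(DB_NAME(), 'DATABASE', 'UPDATE') AS HasDirectUpdate;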

You can download the file from my main site testsecurity.txt.

Just remember to change the file extension to .ASP before running it!

Thursday, 9 January 2014

Using C# to download a remote GZIP file and then decompress it

By Strictly-Software

Recently I had the task of writing a program in C# that would obtain a list of proxies from various sources and then run code to ensure that they were working so I could then use the useful ones.

In this project I had a window that showed the proxy IP address, the country it came from and whether it was Anonymous, High Anonymous, Transparent etc.

I also had a check button which, once the proxy list had loaded, would run a test against each IP address and port number to see how long it took to do the following:
  • Ping the proxy if possible e.g 408 ms
  • HTTP ping the proxy by using the details to request one of a randomly selected number of fast loading pages e.g www.google.com, www.bing.com etc.
Personally I think the HTTP ping is more important when dealing with proxies than a normal PING.

A simple ping to an IP address could respond very quickly or not at all, but when you are using proxies to request HTML pages you want to know how long it takes to return such a page.

Anyway the whole point of the exercise was that I needed to have a list of countries that I could check the IP addresses against.

Luckily the great site http://geolite.maxmind.com has a free GeoIP.dat.gz file that you can download and use that is pretty accurate (though not as accurate as the paid-for version). However, the free version was good enough for what I needed.

The issue was that the .dat file came as a GZIP file, and once I had downloaded it I needed to decompress it. This isn't the normal .zip decompression, but in .NET 4.5 it is pretty easy to accomplish.

I have shown a basic example of the class at the bottom of the page, but the most important part is the method which does the GZIP decompression.


/// <summary>
/// Decompress a gzipped file. To compress instead we could just use the CompressionMode.Compress parameter.
/// </summary>
/// <param name="fileToDecompress">The .gz file to decompress</param>
public static void Decompress(FileInfo fileToDecompress)
{
    using (FileStream originalFileStream = fileToDecompress.OpenRead())
    {
        // knock the .gz extension off to get the output file name e.g GeoIP.dat.gz => GeoIP.dat
        string currentFileName = fileToDecompress.FullName;
        string newFileName = currentFileName.Remove(currentFileName.Length - fileToDecompress.Extension.Length);

        using (FileStream decompressedFileStream = File.Create(newFileName))
        {
            using (GZipStream decompressionStream = new GZipStream(originalFileStream, CompressionMode.Decompress))
            {
                decompressionStream.CopyTo(decompressedFileStream);
            }
        }
    }
}

The three libraries you will need to accomplish all this, apart from anything else you intend to do, are:

using System.Net;
using System.IO;
using System.IO.Compression;

System.Net is required for the WebClient class to do its work downloading the remote file to our computer, and System.IO is required for checking that files and folders exist.

The last one, System.IO.Compression, is the most important as it's the library that lets us decompress the file.

You might have to add this in as a reference in Visual Studio. Just go to: Project > Add Reference > Framework > and tick the box next to System.IO.Compression.

Also note that I am using .NET 4.5 on a Windows 7 64 bit machine. In Windows 7, for security's sake (I presume), most applications that need to read and write files, or download and hold data of some sort, do so in the newer C:\ProgramData folder.

You will notice that this directory is full of well known names like Microsoft, Skype, Sun, Apple and many other software producers that need somewhere to log data in a safe place.

In the old days people could just write programs that saved files all over the place, which obviously wasn't safe, especially if you were the admin of the computer and hit a button on a program that you thought was going to do one thing but was actually adding or deleting files all over your computer's hard drive.

Anyway the whole code is below. Make of it what you will but it's pretty simple and I found it very useful.


using System;
using System.Linq;
using System.Text;
// we need this to download the file from the web
using System.Net;
// these are the two we need to do our decompression job
using System.IO;
using System.IO.Compression;

public class SetUp
{
    // this will hold any error message in case we get one; public so the calling code can check it
    public string ErrorMessage = "";

    // the folder we can write to, set up in the constructor
    private string dataFolder = "";

    // the uncompressed and compressed file names (taken from the download URL)
    private string GeoLiteCityDataFile = "GeoIP.dat";
    private string ZippedGeoLiteCityDataFile = "GeoIP.dat.gz";

    /// <summary>
    /// Ensure our special folder in C:\ProgramData\ exists e.g C:\ProgramData\MyProgram
    /// Then check the file we need to relate countries to IP addresses exists, and if it doesn't, download it from http://geolite.maxmind.com
    /// and copy it to this folder. Then we decompress it, as it's a gzip file e.g GeoIP.dat.gz, so we end up with the uncompressed GeoIP.dat file to work with
    /// </summary>
    public SetUp()
    {
        // ensure a folder we can write to exists - named after my program
        dataFolder = Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData) + @"\MyProgramName";

        if (!Directory.Exists(dataFolder))
        {
            try
            {
                // The folder doesn't exist so try and create it now
                Directory.CreateDirectory(dataFolder);
            }
            catch (Exception ex)
            {
                // set a global error message we can return to the calling object
                this.ErrorMessage = "The data folder could not be created: " + ex.Message.ToString();

                return;
            }
        }

        // we have a folder but do we have an uncompressed .dat file?

        // set up the paths
        // first the path of the uncompressed .dat file in case we already have it
        string geoCityDataPath = dataFolder + @"\" + this.GeoLiteCityDataFile;

        // then the path of the .dat.gz compressed file in case we need to uncompress
        string zipFilePath = dataFolder + @"\" + this.ZippedGeoLiteCityDataFile;

        // check for an uncompressed data file
        if (!File.Exists(geoCityDataPath))
        {
            try
            {
                // we don't have a file so download it from the website and copy it to our folder
                // we could schedule this behaviour to get the latest file by checking the dates or just doing a download once a week/month
                WebClient webClient = new WebClient();
                webClient.DownloadFile("http://geolite.maxmind.com/download/geoip/database/GeoLiteCountry/GeoIP.dat.gz", zipFilePath);

                // now we have our file create a FileInfo object from it to pass to our gzip decompress method
                FileInfo gzFileInfo = new FileInfo(zipFilePath);

                // Call the method to decompress the gzip file (it's static, so no "this." needed)
                Decompress(gzFileInfo);
            }
            catch (Exception ex)
            {
                // set a global error message we can return to the calling object
                this.ErrorMessage = "The GeoIP.dat.gz file could not be downloaded or decompressed: " + ex.Message.ToString();

                return;
            }
        }
    }

    /// <summary>
    /// Decompress a gzipped file. To compress instead we could just use the CompressionMode.Compress parameter.
    /// </summary>
    /// <param name="fileToDecompress">The .gz file to decompress</param>
    public static void Decompress(FileInfo fileToDecompress)
    {
        using (FileStream originalFileStream = fileToDecompress.OpenRead())
        {
            // knock the .gz extension off to get the output file name e.g GeoIP.dat.gz => GeoIP.dat
            string currentFileName = fileToDecompress.FullName;
            string newFileName = currentFileName.Remove(currentFileName.Length - fileToDecompress.Extension.Length);

            using (FileStream decompressedFileStream = File.Create(newFileName))
            {
                using (GZipStream decompressionStream = new GZipStream(originalFileStream, CompressionMode.Decompress))
                {
                    decompressionStream.CopyTo(decompressedFileStream);
                }
            }
        }
    }
}

Tuesday, 7 January 2014

SQL 2012 Bug - Incorrect Syntax Need BEGIN Statements

We recently moved database servers from SQL 2008 to SQL 2012.

Everything seemed to have gone fine until we found that one of our logger database tables had grown to a massive size holding millions of rows and containing many more days records than it should have.

As this was over the Christmas period, no-one really noticed until we got back from holiday. However, a look in the log files from the MS Agent job showed that the part of the stored procedure that deletes old records was failing and the proc was then bombing out, which meant none of the older historical records were being deleted.

The error message in the log file was "Incorrect Syntax near begi" and then just a load of failed statements.

There are some Microsoft knowledge base articles about this issue that say the problem occurs when:
  • The statement contains an IF condition.
  • The IF condition does not contain BEGIN and END statements.
  • The IF condition is followed by a BEGIN TRY block.
  • The IF block is recompiled when you run the query. 

You can read more about the problem on Microsoft's own sites:

https://connect.microsoft.com/SQLServer/feedback/details/752276/incorrect-syntax-near-begi-in-sql-2012

and

http://support.microsoft.com/kb/2829372

They actually provide this example code, which is supposed to replicate the error in MS SQL 2012; however, when I ran it in a query window it worked fine for me.


DECLARE @i INT

IF object_id('tempdb..#temptable') IS NOT NULL
DROP TABLE #temptable

CREATE TABLE #temptable (id INT)
INSERT INTO #temptable VALUES (1),(2),(3);

IF year(getdate())=2012 SELECT @i=(SELECT COUNT(*) AS nr FROM #temptable);
BEGIN TRY
SELECT 'message'
END TRY

BEGIN CATCH
SELECT ERROR_MESSAGE()
END CATCH

Therefore I thought that maybe the problem was not the actual statements themselves but something to do with stored procedures CONTAINING such statements.

The stored procedures that were bombing out were full of IF statements sitting near a BEGIN TRY without an END in between. For example, we put in DEBUG statements for logging so we could debug the code in the after-midnight hours, and we had code like this all over the place:

DECLARE @i INT

IF @DEBUG = 1
  PRINT 'About to run code to delete old historical records'

BEGIN TRY
  ...code
END TRY
BEGIN CATCH
 .. code
END CATCH


As our DBA didn't want to install a "dodgy" hotfix, as he called it, even though SP1 had already been installed (which didn't fix the issue), to test this theory out we made sure all of our IF statements were wrapped in BEGIN ... END blocks like this:

DECLARE @i INT

IF @DEBUG = 1
  BEGIN
       PRINT 'About to run code to delete old historical records'
  END

BEGIN TRY
  ...code
END TRY
BEGIN CATCH
 .. code
END CATCH


So we put the new code into our proc and let it run that night from MS Agent.

Guess what?

When we ran the MS Agent job that night that fired off these stored procedures the whole thing worked!

I don't know if it is necessarily a stored-procedure problem or not, but this simple fix to our own code seemed to solve the issue for us without having to apply the hotfix.

How Microsoft have gotten so far into their rollout of SQL 2012 with such a problem in their code I have no idea, but it seems to be such an annoying bug that a lot of people are complaining about it.

Hopefully this solution may fix it all for you as well!