Wednesday, 15 January 2014

Nightmare with Wordpress, Crashed Corrupt Table and Fixing The Problem


By Strictly Software

This has been a real nightmare day for me!

You can read all about it on the Wordpress forum here: Suddenly Templates Not Working. (not that I actually got any help from anyone!)

I first noticed I had a problem on one of my Wordpress sites when I went to one of my pages that uses a custom page template and saw that it was totally blank. It should have been showing a feed with a Twitter widget in the sidebar and special content in the middle.

I edited the page and saw that the template was set to "Default Template". I set it to the correct template and tried saving it only for the page to reload with the Default Template still selected!

I turned on the WP_DEBUG constant and was met with a load of MySQL errors that all indicated my wp_postmeta table was corrupt, e.g.

WordPress database error: [Incorrect file format 'wp_postmeta']
INSERT INTO wp_postmeta (post_id,meta_key,meta_value) VALUES (7381,'_edit_last','1')

I then ran a REPAIR (FULL) on the table, only for it to come back almost immediately with error messages such as:

.wp_postmeta check Error Incorrect file format 'wp_postmeta'
.wp_postmeta check error Corrupt

If a REPAIR or OPTIMIZE wouldn't fix it I knew I was in the shit!
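For reference, the repair attempts boil down to the standard MySQL statements below (a sketch - Navicat just runs these for you; EXTENDED is the slower, more thorough repair):

```sql
-- try a quick repair first, then the slower extended version
REPAIR TABLE wp_postmeta;
REPAIR TABLE wp_postmeta EXTENDED;

-- OPTIMIZE rebuilds the table and its indexes, which can also clear minor corruption
OPTIMIZE TABLE wp_postmeta;
```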

I tried searching the web but I couldn't find anything of use.

I was using the WP-DBManager plugin to create backups for me on this site but for some reason it had stopped working after my recent upgrade to WP 3.8. By the way, the amount of problems I have had since upgrading to this version would fill a book by now!

Anyway the last manual database backup was done just before Christmas so it wasn't too far out of date, and the table only holds SEO meta data, flags set by plugins and mostly guff that you don't really need anyway.

All my tags, categories, pictures and so on were still on the site so it wasn't too major a problem - except I hate BUGS! Especially ones I can't fix!

Therefore I created a new empty table with the same structure, indexes and settings as the existing wp_postmeta table and called it wp_postmetaNEW.

I turned off APACHE and then dropped the old table. I renamed my new table to wp_postmeta (I could have truncated the old one by the way) and checked it was working.
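In plain MySQL the swap looks roughly like this (a sketch - I actually built the new table through Navicat, but CREATE TABLE ... LIKE only needs the table definition, not the corrupt data, so it should still work):

```sql
-- clone the structure, indexes and settings of the existing table
CREATE TABLE wp_postmetaNEW LIKE wp_postmeta;

-- with Apache off, drop the corrupt table and swap in the new one
DROP TABLE wp_postmeta;
RENAME TABLE wp_postmetaNEW TO wp_postmeta;
```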

I then restored my old backup to a new database on the server called RESTORE.

I checked the wp_postmeta table wasn't corrupt first and then ran a statement (whilst logged in as root in Navicat) to copy across all the meta data I did have from the old database to my new table, e.g.

INSERT INTO MYDB.wp_postmeta
(post_id,meta_key,meta_value)
SELECT post_id,meta_key,meta_value
FROM RESTORE.wp_postmeta

This took a long time!

Especially for only 94,310 records.

I was waiting over 30 minutes with the server monitor ticking away saying it was repairing the table.

Then I noticed in my Putty console with a TOP command that no MYSQL process was actually running!

This made me think the process had crashed and NAVICAT wasn't telling me the truth.

By the way I have had this happen many a time whilst doing REPAIRS, waiting for the NAVICAT window to give me the results when I notice no MYSQL process is running and that the actual job had finished ages ago.

So I killed the process and checked how many rows were in the table - none. I did another REPAIR and it said the table was CRASHED again!
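In hindsight, copying the rows over in batches keyed on the primary key might have avoided one huge statement that could crash half way through, e.g. (a sketch assuming the standard wp_postmeta meta_id primary key):

```sql
-- copy 10,000 rows at a time; repeat with the next meta_id range
-- until no more rows are copied across
INSERT INTO MYDB.wp_postmeta (post_id, meta_key, meta_value)
SELECT post_id, meta_key, meta_value
FROM RESTORE.wp_postmeta
WHERE meta_id BETWEEN 1 AND 10000;
```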

By this time I was so pissed off and wanting to go home I just truncated the table and turned APACHE back on.

From looking at the old RESTORE wp_postmeta table it was mostly guff, flags and custom SEO metas anyway. Therefore I thought fuck it, I'm not wasting any more time on this. However before I could go the site froze up again and the Server Monitor was telling me a REPAIR BY SORT was going on.

However yet again on the servers console running a TOP showed 0.00 load and no MySQL processes!

What was going on I have no idea, but it seems truncating a corrupt / crashed table didn't help. So I had to manually re-create the table, indexes, options, primary key, auto-increment value etc by hand.
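If you ever have to re-create wp_postmeta by hand, you can pull the exact definition from a healthy copy with SHOW CREATE TABLE rather than typing it from memory. For a standard WordPress install it looks roughly like this (a sketch - check your own backup's output as your version may differ):

```sql
-- get the real DDL from the restored copy
SHOW CREATE TABLE RESTORE.wp_postmeta;

-- which for a default WordPress install returns something like:
CREATE TABLE wp_postmeta (
  meta_id BIGINT(20) UNSIGNED NOT NULL AUTO_INCREMENT,
  post_id BIGINT(20) UNSIGNED NOT NULL DEFAULT '0',
  meta_key VARCHAR(255) DEFAULT NULL,
  meta_value LONGTEXT,
  PRIMARY KEY (meta_id),
  KEY post_id (post_id),
  KEY meta_key (meta_key)
);
```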

Then I turned APACHE back on and everything was blistering fast. I waited a little bit to ensure an automatic REPAIR wouldn't start again, then made sure all my pages were using the correct templates, filling in Yoast's SEO boxes at the same time, and then left. So far so good: it is all working and I don't seem to have lost any data apart from 90,000 rows of guff.

As I don't add my images in by hand (a lot of these blogs are pretty much automated) they are not in the wp_postmeta table. Instead they are just referenced from their local or foreign location which was lucky for me.

The key to this lesson is - always have up to date backups and don't rely on Wordpress Plugins to do it for you. 

You can see my debate over WP-DBManager not working on Wordpress here if you want. It was only a manual backup I did through NAVICAT that would have saved me, had the insert worked, as the plugin's backups were so out of date.

Anyway that was my "fix" for my problem.

Listen, learn and take notes from my mistakes so you don't make the same ones!

Friday, 10 January 2014

An ASP Classic Security Tool


By Strictly-Software

The following script is one I wrote a long time ago but one I found quite useful the other day when I was working on an ASP Classic site.

Download the script from my main website, testsecurity.txt, re-name it to .asp and copy it to your website's folder. You can then run it as an ASP page from your website.

The form is pretty simple and just asks for your ADO connection string so it can connect to your database using your website's login, or another if you so wish.

It runs a number of tests and flags any "potential" problems up in red.

These may or may not be real issues but could be areas of concern. For example you may need your website to have access to system objects or to run certain extended stored procedures, but in many cases you shouldn't or wouldn't.

The security holes the tool looks at are:

  • The roles that the user is a member of such as data_reader, data_writer or ddl_admin etc.
  • It runs some select statements to see if it can access system tables using old and new versions e.g. INFORMATION_SCHEMA.TABLES or SYS.OBJECTS.
  • Tests whether the user has direct write permission to your database. Obviously I use EXECUTE permission on all my CUD (Create,Update,Delete) statements which are held inside stored procedures but on older ASP systems you are likely to find people doing direct CUD operations (just like many PHP developers still do) from the client code to the database.
  • I test whether I can directly UPDATE a table inside your DB. I try and find a table using the system objects, which in itself could be a security hole, and then update a column in a table to its existing value so I don't overwrite anything. However this tells me whether you potentially have a hole by allowing the website direct UPDATE permission. Imagine if your own sanitisation functions failed or an SQL Injection attack succeeded. This could potentially bring your whole system down!
  • I test for the ability to run certain potentially dangerous extended stored procedures such as xp_cmdshell (can run anything a command prompt could), xp_fileexist which tells a user whether or not a file exists on the server and also xp_regwrite which allows you to write to the registry with SQL. Obviously the first and last functions could cause drastic problems if a hacker could use them against you!
  • I also test whether this ASP page can itself write and delete files (a simple text file with nothing in it) inside the folder that the ASP file is located. Obviously this could be a security hole as most websites only allow certain folders to be able to be written to. Being able to write files wherever the code is located could be dangerous as a hacker could write a file to delete other files or run code. I also check to see if I can write files to other places on the network - again another security hole. Sites should only allow the writing of files in certain places that can be virus checked and are known to be safe due to the permissions on the folder.
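To give you an idea of what the page fires at the database, the checks boil down to T-SQL along these lines (a sketch - the real script wraps each test so one failure doesn't stop the rest; role and procedure names are the standard SQL Server ones):

```sql
-- which fixed database roles is the website login in? 1 = yes, 0 = no
SELECT IS_MEMBER('db_datareader') AS IsDataReader,
       IS_MEMBER('db_datawriter') AS IsDataWriter,
       IS_MEMBER('db_ddladmin')   AS IsDDLAdmin;

-- can we read the system objects, old and new style?
SELECT TOP 1 TABLE_NAME FROM INFORMATION_SCHEMA.TABLES;
SELECT TOP 1 name FROM sys.objects WHERE type = 'U';

-- can we run potentially dangerous extended stored procedures?
EXEC master..xp_fileexist 'C:\Windows\win.ini';
EXEC master..xp_cmdshell 'dir C:\';
```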
You might find this security test file that I quickly knocked up one day useful if you are ever having to work with ASP Classic sites and want to quickly check a variety of potential security holes in the database, website and file system with one form submission.

You can download the file from my main site testsecurity.txt.

Just remember to change the file extension to .ASP before running it!

Thursday, 9 January 2014

Download and Decompress a GZIP file with C Sharp

Using C# to download a remote GZIP file and then decompress it

By Strictly-Software

Recently I had the task of writing a program in C# that would obtain a list of proxies from various sources and then run code to ensure that they were working so I could then use the useful ones.

In this project I had a window that showed the proxy IP address, Country it came from and whether it was Anonymous, High Anonymous, Transparent etc.

I also had a check button which once the proxy list had loaded would run a test against each IP address and Port No to see the time that it took to do the following:
  • Ping the proxy if possible e.g. 408 ms
  • HTTP ping the proxy by using the details to request one of a randomly selected number of fast-loading pages e.g. www.google.com, www.bing.com etc.
Personally I think the HTTP ping is more important when dealing with proxies than a normal PING.

A simple ping to an IP address could respond very quickly or not at all but when you are using Proxies in computing to request HTML pages you want to know how fast it takes to return such a page.

Anyway the whole point of the exercise was that I needed to have a list of countries that I could check the IP addresses against.

Luckily the great site http://geolite.maxmind.com has a free GeoIP.dat.gz file that you can download and use which is pretty accurate (though not as accurate as the paid-for version). However the free version was good enough for what I needed.

The issue was that the .dat file came as a GZipped file and once I had downloaded it I needed to decompress it. This wasn't the normal .zip decompress but in .NET 4.5 it is pretty easy to accomplish.

I have shown you a basic example of the class at the bottom of the page but the most important function is the method which does the Gzip decompression.


/// <summary>
/// Decompress a gzipped file. To compress we can just use the CompressionMode.Compress parameter instead.
/// </summary>
/// <param name="fileToDecompress">The .gz file to decompress</param>
public static void Decompress(FileInfo fileToDecompress)
{
    using (FileStream originalFileStream = fileToDecompress.OpenRead())
    {
        string currentFileName = fileToDecompress.FullName;

        // strip the .gz extension to get the output file name e.g GeoIP.dat.gz => GeoIP.dat
        string newFileName = currentFileName.Remove(currentFileName.Length - fileToDecompress.Extension.Length);

        using (FileStream decompressedFileStream = File.Create(newFileName))
        {
            using (GZipStream decompressionStream = new GZipStream(originalFileStream, CompressionMode.Decompress))
            {
                // copy the decompressed bytes straight into the new file
                decompressionStream.CopyTo(decompressedFileStream);
            }
        }
    }
}
The three libraries you will need to accomplish all this apart from anything else you intend to do will be:

using System.Net;
using System.IO;
using System.IO.Compression;

System.Net is required for the WebClient class to do its work downloading the remote file to our computer, and System.IO is required for checking that files and folders exist.

The last one, System.IO.Compression, is the most important as it's the library that lets us decompress the file.

You might have to add this in as a reference in Visual Studio. Just go to: Project > Add Reference > Framework > and tick the box next to System.IO.Compression.

Also note that I am using .NET 4.5 on a Windows 7 64-bit machine. In Windows 7, for security's sake (I presume), most applications that need to read and write files, or download and hold data of some sort, do so in the newer C:\ProgramData folder.

You will notice that this directory is full of well-known names like Microsoft, Skype, Sun, Apple and many other software producers that need somewhere to log data in a safe place.

In the old days people could just write programs that saved files all over the place which obviously wasn't safe. Especially if you were the admin of the computer and hit a button on a program that you thought was going to do one thing but was actually adding or deleting files all over your computer's hard drive.

Anyway the whole code is below. Make of it what you will but it's pretty simple and I found it very useful.


using System;
using System.Linq;
using System.Text;
// we need this to download the file from the web
using System.Net;
// these are the two we need to do our decompression job
using System.IO;
using System.IO.Compression;

// this will hold any error message in case we get one and need to return it to the calling program
private string ErrorMessage = "";

// the folder under C:\ProgramData we will write to - set in SetUp
private string dataFolder = "";

// the data file names as per the download URL - the uncompressed .dat file and the .dat.gz file as downloaded
private string GeoLiteCityDataFile = "GeoIP.dat";
private string ZippedGeoLiteCityDataFile = "GeoIP.dat.gz";

/// <summary>
/// Ensure our special folder in C:\ProgramData\ exists e.g C:\ProgramData\MyProgram
/// Then check the file we need to get countries related to IPs from http://geolite.maxmind.com exists and if it doesn't download it
/// and copy it to this folder. Then we need to decompress it as it's a gzip file e.g GeoIP.dat.gz so we get the uncompressed GeoIP.dat file to work with
/// </summary>
public void SetUp()
{
    // ensure a folder we can write to exists - named after my program
    dataFolder = Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData) + @"\MyProgramName";

    if (!Directory.Exists(dataFolder))
    {
        try
        {
            // The folder doesn't exist so try and create it now
            Directory.CreateDirectory(dataFolder);
        }
        catch (Exception ex)
        {
            // set a global error message we can return to the calling object
            this.ErrorMessage = "The data folder could not be created: " + ex.Message;

            return;
        }
    }

    // we have a folder but do we have an uncompressed .dat file?

    // set up the paths
    // first the path of the uncompressed .dat file in case we already have it
    string geoCityDataPath = dataFolder + @"\" + this.GeoLiteCityDataFile;

    // then the path of the .dat.gz compressed file in case we need to uncompress
    string zipFilePath = dataFolder + @"\" + this.ZippedGeoLiteCityDataFile;

    // check for an uncompressed data file
    if (!File.Exists(geoCityDataPath))
    {
        try
        {
            // we don't have a file so download it from the website and copy it to our folder
            // we could schedule this behaviour to get the latest file by checking the dates or just doing a download once a week/month
            WebClient webClient = new WebClient();
            webClient.DownloadFile("http://geolite.maxmind.com/download/geoip/database/GeoLiteCountry/GeoIP.dat.gz", zipFilePath);

            // now we have our file create a FileInfo object from it to pass to our gzip decompress method
            FileInfo gzFileInfo = new FileInfo(zipFilePath);

            // Call the static method to decompress the gzip file
            Decompress(gzFileInfo);
        }
        catch (Exception ex)
        {
            // set a global error message we can return to the calling object
            this.ErrorMessage = "The GeoIP.dat.gz file could not be downloaded or decompressed: " + ex.Message;

            return;
        }
    }
}


/// <summary>
/// Decompress a gzipped file. To compress we can just use the CompressionMode.Compress parameter instead.
/// </summary>
/// <param name="fileToDecompress">The .gz file to decompress</param>
public static void Decompress(FileInfo fileToDecompress)
{
    using (FileStream originalFileStream = fileToDecompress.OpenRead())
    {
        string currentFileName = fileToDecompress.FullName;

        // strip the .gz extension to get the output file name e.g GeoIP.dat.gz => GeoIP.dat
        string newFileName = currentFileName.Remove(currentFileName.Length - fileToDecompress.Extension.Length);

        using (FileStream decompressedFileStream = File.Create(newFileName))
        {
            using (GZipStream decompressionStream = new GZipStream(originalFileStream, CompressionMode.Decompress))
            {
                // copy the decompressed bytes straight into the new file
                decompressionStream.CopyTo(decompressedFileStream);
            }
        }
    }
}

Tuesday, 7 January 2014

SQL 2012 Bug - Incorrect Syntax Need BEGIN Statements


We recently moved Database servers from 2008 to 2012.

Everything seemed to have gone fine until we found that one of our logger database tables had grown to a massive size holding millions of rows and containing many more days records than it should have.

As this was over the Christmas period no-one really noticed until we got back from holiday. However a look in the log files from the MS Agent job showed that the part of the stored procedure that deletes old records was failing and then the proc was bombing out, which meant none of the older historical records were being deleted.

The error message in the log file was "Incorrect Syntax near begi" and then just a load of failed statements.

There are some Microsoft knowledge base articles about this issue that say the problem is due to:
  • The statement contains an IF condition.
  • The IF condition does not contain BEGIN and END statements.
  • The IF condition is followed by a BEGIN TRY block.
  • The IF block is recompiled when you run the query. 

You can read more about the problem on Microsoft's own sites:

https://connect.microsoft.com/SQLServer/feedback/details/752276/incorrect-syntax-near-begi-in-sql-2012

and

http://support.microsoft.com/kb/2829372

They actually provide this example code which is supposed to replicate the error in MS SQL 2012; however, when I ran it in a query window it worked fine for me.


DECLARE @i INT

IF object_id('tempdb..#temptable') IS NOT NULL
DROP TABLE #temptable

CREATE TABLE #temptable (id INT)
INSERT INTO #temptable VALUES (1),(2),(3);

IF year(getdate())=2012 SELECT @i=(SELECT COUNT(*) AS nr FROM #temptable);
BEGIN TRY
SELECT 'message'
END TRY

BEGIN CATCH
SELECT ERROR_MESSAGE()
END CATCH

Therefore I then thought that maybe the problem was not the actual queries themselves but something to do with stored procedures CONTAINING such statements.

The stored procedures that were bombing out were full of IF statements near a BEGIN TRY without an END before it. For example we put in DEBUG statements for logging and being able to debug the code in the midnight hours, so we have code like this all over the place:

DECLARE @i INT

IF @DEBUG = 1
  PRINT 'About to run code to delete old historical records'

BEGIN TRY
  ...code
END TRY
BEGIN CATCH
 .. code
END CATCH


As our DBA didn't want to install a "dodgy" hotfix, as he called it (even though SP1 had been installed, which didn't fix the issue), to test this theory out we made sure all of our IF statements were wrapped in BEGIN ... END blocks like this.

DECLARE @i INT

IF @DEBUG = 1
  BEGIN
       PRINT 'About to run code to delete old historical records'
  END

BEGIN TRY
  ...code
END TRY
BEGIN CATCH
 .. code
END CATCH


So we put the new code into our proc and let it run that night from MS Agent.

Guess what?

When we ran the MS Agent job that night that fired off these stored procedures the whole thing worked!

I don't know if it is necessarily a stored procedure problem or not, but this simple fix to our own code solved the problem for us without having to apply the hotfix.

How Microsoft have gotten so far into their roll out of SQL 2012 with such a problem in their code I have no idea but it seems to be such an annoying bug that a lot of people are complaining about it.

Hopefully this solution may fix it all for you as well!

Saturday, 4 January 2014

Latest Version of Strictly AutoTags 2.8.9 - Donation Only


By Strictly-Software

I have just released the latest version of my highly popular Strictly AutoTags plugin.

Strictly AutoTags 2.8.9 is a paid for version which means it has features in it that are only available to those people who pay me £40 first.

The new paid-for version includes the following features:

Strictly AutoTags - Version 2.8.9
  • Updated storage to handle new data-description and data-pin-desc attributes so they don't get tagged inside images by mistake. The same goes for shortcodes now handled by the Jetpack plugin so that [youtube http://www.youtube.com/watch?v=USbkB6rVbpc] doesn't get accidentally tagged either if one of the words within the [shortcode] contained a word e.g [customField sales]
  • I have added a new tag equivalent mark-up language using a simple method that enables you to match instances of certain words BUT use a different tag. E.g. if you want the words Snowden, NSA and Prism to add the tag Police State to your article you would use this markup language [Snowden, NSA, PRISM]=[Police State]
  • To then add the tag terrorism to the words al-Qaeda, bin-Laden, 9.11, WTC and Taliban you would add this [al-Qaeda, bin-Laden, 9.11, WTC, Taliban]=[Terrorism] To put them together you use a pipe so adding these two would mean an input box filled like this [Snowden, NSA, PRISM]=[Police State]|[al-Qaeda, bin-Laden, 9.11, WTC, Taliban]=[Terrorism]
  • I've added an option to set the minimum number of letters a tag must have before it is used as a tag. This applies to stored tags or newly found ones. This allows you to skip tags of one or two letters long even if they are ACRONYMS and you don't want to add them to the NOISE word list.
  • I've added an option to tell the system whether or not to convert plain text links like the wording www.msnbc.com into a proper link msnbc.com. This is obviously quite difficult due to all the shortened links you can now get, the lack of protocols e.g. //twitter.com/strictlytweet and soon UTF-8 characters in URLs.
  • I've created a new name function to match names like al-Qaeda or al-Nusra Front or even words with commas in like 1,000 Guineas which would be tagged as 1000 Guineas due to the comma being used as a separator in WordPress.
  • I've added a new finished_doing_tagging hook into the plugin so other plugins can act once the tagging is finished. This is most useful when using my AutoTag plugin in conjunction with Strictly TweetBot, which allows you to use categories and tags as #hashtags in any tweets that are automatically sent out. Without this hook you can run into problems when you have a large number of tags to search through, as it's possible tweets will get sent with their default #hashtags instead of relevant post tags like #Snowden or #NSA on an article about the Police State.
  • In my latest paid version I have included a new function which will allow you to clean out any HTML added by my own tagging process when it bolds and deep-links tags. I use specific classes on my HTML for anchors and spans so I know how to remove them with regular expressions. Obviously the more articles you have the longer this clean-up process will take. You could obviously clean articles out yourself with a simple UPDATE MySQL statement and a RegExp replace function call but this saves you from having to.
There is a free version up on the Wordpress site, the latest being version 2.8.8, but this has far fewer features.


Buy Now


When I bring out new paid only versions I may add a few of the older paid for features into the free versions.

However the whole reason I am doing this is that although over 180,000 people have downloaded my plugin from Wordpress (and who knows how many from my own website) and obviously think it's useful, I have had hardly any donations.

It seems hardly worth my time putting the code out on the web for others when I can just use it myself and gain all the SEO benefits it has brought me (which it has!).

Benefits such as:        
  • Deep-linking a specified number of the most important words in an article to their "tag listings" page where crawlers and users can find lots more content related to that word / tag.
  • Also the deep-linking is done on INPUT not OUTPUT. Which means it's faster to load pages as the formatting is already done. Do something once on INPUT not a THOUSAND times on OUTPUT, especially not when thousands of BOTS and HACKERS come crawling using up CPU and bandwidth!
  • Hit highlighting, bolding or using links around important words that are key and related to an article is a good way of telling SERP crawlers which words are the most important for an article. Automatically doing this without waiting for 3rd party APIs to add new tags relevant to the latest news stories (people's names, buildings, companies etc) is a lot faster.

The money I have been donated so far by a few people, usually for adding new features to the plugin, has certainly not been enough to warrant the amount of time and effort I have spent on this plugin so far - and that's A LOT! Therefore I have brought out a paid version as the main version, which will always have the latest features in it.

Buy Now

Believe it or not a £2-£5 donation may be a small sign of gratitude but as this is proper coding and "Open Source", I thought the idea was to take other people's code and "change", "edit" or "extend" it. Maybe I was wrong but that's what I did with my first sitemap plugin when all the others were giving me problems. I literally learnt Wordpress AND PHP at the same time just so I could make a fast performing plugin that worked with 50,000+ articles.

Also as I get paid £650 a day contracting a fiver just isn't worth a days work I'm afraid.

Also let me remind those people who still cannot read English:
  • I don't work for free.
  • I don't add new features into an open source plugin just because you want them. It takes time and effort!
  • I don't drop everything to help you configure a plugin which has a simple "Test Config" button in the admin panel that tests everything is set up for you and if not tells you what the problems are and what you need to do to fix them. I do this to save support requests believe it or not.
  • I also have a "shit list" for tight wads who have stitched me up in the past. Either by promising me constant work, large payments later on, or payments after the work has been done. If they fail to live up to their side of the agreement or are just plain rude they get their name and comment / email put on my very own "Shit List". You can read some of them here: http://blog.strictly-software.com/2011/10/naming-and-shaming-of-programming.html.
The latest is quite funny, not just because of the broken English but because the person seemed to want me to drop everything and fix something that isn't even broken ASAP!

Not only do I not even support or release new versions of the plugin in question "Strictly Google Sitemap" but the problem the person is talking about is a simple case of him not hitting the "Save Settings" button first before hitting the "Build Sitemap" button - DOH!

Have a laugh as you read his (partially corrected by me) broken English request trying to get me to help him out.

hi dear rob i hope you are fine

i have a very bad probelm
my site hosting has limit and i can't build full sitemap with google xml sitemap by arne

i installed your plugin i see very bad problem
when i installed i went in setting of your sitemap
i changed setting when i clicked in manual building sitemap, 
sitemap build but with default settings

dear rob i don't have time to waste!

If only I could build in a Doofus Config Test button to check the user has basic computer skills first before allowing them to use the plugin - that would be a money spinner!

I do like the "dear rob I don't have time to waste!" bit at the end. In the real email he left off the end "e" of the word "waste" but I thought I would add it on to help you understand him more easily.

Yes of course, you have no time to waste so I will just drop everything and fix something for you for free. I think not!

This "Tight Wad" list has worked quite well so far and I have had quite a few people contact me and pay me just to take their name off it.

I suppose they have done a Google search for their own name when they were bored one day only to find themselves on my blog in an article about tight arses who talk the talk but can't walk the walk.

Oh well maybe they should have read the specific part of the FAQ page on the WordPress plugin page that states how they should go about debugging any problem and to prevent themselves ending up on my "tightwad list". 

It's quite simple and it is in the FAQ section. From the WordPress site's FAQ section about the plugin:

I have an error.
If you have any error messages installing the plugin then please try the following to rule out conflicts with other plugins:
  • Disable all other plugins and then try to re-activate the Strictly Google Sitemap plugin as some caching plugins can cause issues.
  • If that worked, re-enable the plugins one by one to find the plugin causing the problem. Decide which plugin you want to use.
  • If that didn't work check you have the latest version of the plugin software (from WordPress) and the latest version of WordPress installed
  • Check you have Javascript and Cookies enabled.
  • If you can code turn on the DEBUG constant and debug the code to find the problem otherwise contact me and offer me some money to fix the issue :)
  • Please remember that you get what you pay for so you cannot expect 24 hour support for a free product. Please bear that in mind if you decide to email me a donation button is on my site and on the plugin admin page.
  • If you don't want to pay for support then ask a question on the message board and hope someone answers.
Or maybe they should have read the "Support" page that clearly states I don't support this Sitemap plugin anymore. The main reason being that I got cancer at the beginning of 2012.
              
Everything moves so fast it is hard to keep up and build business models on top of 3rd party code - just look at Twitter and their culling of all the DM message responder applications that put a myriad of companies out of business.

However despite these other SEO features not working the core Sitemap functionality still works (on non MU sites) and I use it on all of my own sites.

However if you want support you must donate. Otherwise you go onto the list: http://blog.strictly-software.com/2011/10/naming-and-shaming-of-programming.html

I just haven't got time to answer questions in broken English that make no sense, as if I "must" help them for some reason. It's 3rd party, open source and free code - sort it out yourself or pay someone to do it for you! Simples!

So there you go: a brand new donation-only version of AutoTags and some reasons why you shouldn't piss me off :)

If you're nice I help - I always like to help others. If I didn't I wouldn't be putting all this helpful technical information on the web on my blog would I?

Just remember I have a full time job, a few companies I am partners in or run on my own, contract work and then when I get spare time I do a bit of coding for the Open Source community. 

However if you know my feelings on Open Source you will know I'm not exactly a fan.

Here is a direct quote from an article venting my spleen on the topic.

If you can't code and have put a stupid $50 bid on Rentacoder.com to develop a whole website by next week and expect me to do all your work for you just because you found a piece of my code that does 90% of what you need and you require the other 10% doing for free then you can take a running jump.
People like you are the reason good coders work 50 hour weeks for tech companies and then spent all their free time trying to come up with something that might make some money if the world of programming actually behaved like any other kind of market place.
From the point of view of actually making real money I cannot think of a worse idea for anyone than open source coding. Off the top of my head I cannot think of any other business model that acts in a similar way and leaves the person who produced the goods with bog all and the customer with all their hearts desire at a cost of zilch!
So there you go basically if you want good quality products pay for them.

Otherwise if you get your code for free from unknown sources don't be surprised if they infect your computer with viruses, steal your bank account details, use your mailbox to send spam and put crap code into your system.


Buy Now