Friday, 27 February 2015

What Do You Want To See In The Next Version Of Strictly AutoTags


By Strictly-Software

Hello, this post is to ask users of the paid-for version of my plugin, Strictly AutoTags, what features they would like to see in the next version.

What would make you spend another £40 to get a new version of the code?

Here are some ideas I am thinking about, but I welcome comments. You are the buyers, so you know better than I do what would make it better or which bugs need fixing.

My Thoughts On Options

One option is to add the title of the article into any links converted from plain text into clickable links, which is good for SEO.

Also an "Auto Categorisation" feature. So not only Auto Tagging but Auto Categorisation.

This would be done through boxes that you could add to the admin form one by one.

One box would hold a set of words, another the number of those words that need to be matched, and a third the category to apply.

So the logic would be that if the specified number of words is found, e.g. 5 words related to terrorism such as al-Qaeda, ISIL, ISIS, Jihadi John or Islamic State, then the category "Terrorism" is added to the post.
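A rough sketch of how such a rule might work (all names, data and thresholds here are hypothetical, not the plugin's actual code):

```python
# Sketch of the proposed auto-categorisation rule: if at least
# `min_matches` of a rule's trigger words appear in a post, the
# mapped category is applied. Names and example data are made up.
import re

def categorise(post_text, rules):
    """rules: list of (trigger_words, min_matches, category) tuples."""
    lowered = post_text.lower()
    categories = []
    for words, min_matches, category in rules:
        # Count how many distinct trigger words appear in the post
        hits = sum(1 for w in words
                   if re.search(r"\b" + re.escape(w.lower()) + r"\b", lowered))
        if hits >= min_matches:
            categories.append(category)
    return categories

rules = [(["terrorism", "al-Qaeda", "ISIL", "ISIS", "Jihadi John",
           "Islamic State"], 2, "Terrorism")]

post = "ISIS, also known as Islamic State, claimed responsibility today."
print(categorise(post, rules))  # -> ['Terrorism']
```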

A "limit" to the number of tags you can have. Once this limit is reached, no new tags are added, OR you could specify that the auto-clean function is run, so that tags used by fewer than 2 posts are removed and tagging can continue.
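As a sketch of that tag-limit idea (the function and data structures are mine, purely for illustration):

```python
# Hypothetical sketch of the proposed tag-limit behaviour: once the
# site-wide tag count hits `limit`, either reject new tags outright
# or first purge tags used by fewer than `min_posts` posts and then
# carry on tagging. `tag_counts` maps tag name -> post count.

def add_tag(tag_counts, new_tag, limit, auto_clean=False, min_posts=2):
    if new_tag in tag_counts:
        tag_counts[new_tag] += 1
        return True
    if len(tag_counts) >= limit:
        if not auto_clean:
            return False  # limit reached, new tag rejected
        # Auto-clean: remove tags used by fewer than min_posts posts
        for tag in [t for t, n in tag_counts.items() if n < min_posts]:
            del tag_counts[tag]
        if len(tag_counts) >= limit:
            return False  # still full even after the clean-up
    tag_counts[new_tag] = 1
    return True

tags = {"politics": 5, "sport": 1, "tech": 3}
print(add_tag(tags, "racing", limit=3))                   # False: full
print(add_tag(tags, "racing", limit=3, auto_clean=True))  # True: "sport" purged
```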

A similar option for when the tagging stops working for no apparent reason. This may just be me, as no-one else has complained about it, but it does happen on my sites with 25,000+ tags, usually after an update to WordPress or the Jetpack plugin code.

Personally a re-save of my admin settings or a clean out of old tags seems to fix this for me.

So: a CRON job feature that runs regularly, checks for recent untagged posts and re-tags those articles.

Or an auto memory manager feature, a bit like my Strictly Sitemap plugin, which checks the amount of memory used on the last posting and sets the memory to the right size for each post, e.g. the last size + 10%, or, if usage went down, the last size used. Or it could let you specify an amount in the admin panel.

Tell me which feature you like or would like to see in the next Strictly-Software AutoTags plugin!

You can let me know here or on my Facebook page at https://www.facebook.com/strictlysoftware

Monday, 9 February 2015

Speeding up Chrome can KILL IT!


By Strictly-Software

Lately I have been really disappointed with the performance of my preferred browser Chrome.

I moved from FireFox to Chrome when the number of plugins on FireFox made it too slow to work with; however, that was when Chrome was a clean, fast browser. Now it has just as many plugins available to install as FireFox, and its performance has deteriorated steadily over the past few versions.

I am a developer, so having 20+ tabs open in my browser is not unusual; however, when they all hang for no reason with "resolving host" messages in the status bar, something is wrong.

I have even removed all the plugins I had installed, deciding to leave that to FireFox for when I need to test different agents, hacking and so on. However, even with a plain install the performance has been crap lately!

Therefore I looked on the web for tips on speeding up Chrome and found this article:

http://digiwonk.wonderhowto.com/how-to/10-speed-hacks-thatll-make-google-chrome-blazing-fast-your-computer-0155989/

It basically tells you some tricks to speed up Chrome by modifying some settings by going to chrome://flags/ in your address bar.

There is a warning at the top of the page that says:

WARNING These experimental features may change, break or disappear at any time. We make absolutely no guarantees about what may happen if you turn one of these experiments on, and your browser may even spontaneously combust. Jokes aside, your browser may delete all your data or your security and privacy could be compromised in unexpected ways. Any experiments that you enable will be enabled for all users of this browser. Please proceed with caution. Interested in cool new Chrome features? Try our beta channel at chrome.com/beta.

So these are all "experimental" features and, from the sound of it, they could make Chrome's security and performance worse, not better. A few of the tweaks suggested by the article had already disappeared from the settings page.

I did what was still available and a few more tweaks after careful consideration and what happened?

Well at first some pages seemed to load quicker but then I found that:

  • Some sites without a www. sub domain wouldn't load.
  • Some pages wouldn't load at all.
  • When I came into work today even though a Chrome process was running with 0 CPU usage nothing was displayed.
I had to re-boot and try 3 times to open Chrome before getting back to the chrome://flags/ page and restoring all the defaults. Since then everything has been okay.

So if you're going to tweak, be careful: it could take down your whole browser!

The best way to speed it up is to remove all the plugins and add-ons and leave that to FireFox. Turn off 3rd party cookies and any 3rd party services that involve constant lookups and try to keep it clean and simple.

It seems that with the over-usage of AJAX on sites like Facebook, LinkedIn and Google+, which look up every word as you type to see if it matches a name or contact, this "API JIZZ" as I call it has really slowed down the web.

Just having Google+, Facebook and LinkedIn open at the same time can eat up your memory, and I'm on a quad-core 64-bit machine.

In my opinion there should be settings to enable you to turn off the API JIZZ and flashy features that rely on lots of JavaScript and AJAX.

It slows down your computer and is not needed most of the time. Having big database lookups on each keystroke is obviously going to use up lots of memory so it should be an option you can disable.

Anyway that's down to the developers of all these social sites who seem to love AJAX for everything. A simple submit button would do in a lot of cases!


Friday, 6 February 2015

New Premium Version of Strictly Auto Tags version 2.9.9


By Strictly-Software

Did you know you can buy even more Strictly-Software.com products on Etsy.com than on my own site, www.strictly-software.com?

This includes special configuration coupons which enable you to hire me to configure your plugins on your site including Strictly-Software AutoTags and Strictly-Software TweetBOT.

However, if you follow the settings and the debugging process in the Readme.txt file, you should be okay.

Despite that, you may sometimes have a site with an unusual system setup, or issues with your chosen options.

To get me to help, you can now buy coupons from my shop on Etsy.com, which allow me to check your system set-up and sort out any configuration settings that may be set incorrectly.

The popular new premium version of the Strictly-Software AutoTags WordPress plugin can be bought for just £40 on www.strictly-software.com, or for the same amount on my Etsy.com shop. The coupon for help is just £20.

New features in Strictly AutoTags version 2.9.9 include:
  • New option to override the siteurl value returned by the WordPress get_option('siteurl') function, e.g. http://www.mysite.com. This is prepended to any deeplinked tags, e.g. http://www.mysite.com/tags/frankel. However, if you have an unusual site set-up or folder structure this can sometimes cause problems and generate 404 errors when a tag link is clicked. Therefore you can now override the value used for deep linking with a hard-coded value, which is better than re-setting the siteurl value in the wp_options table.
  • A new help section at the top, which includes the test post article. You should always try saving a draft post with Auto Discovery enabled whenever you make changes, to see if the plugin is configured correctly. If no tags are generated there is a problem with other plugins, your configuration settings or your system.
  • Added links to my Facebook page, facebook.com/strictlysoftware, where you can find lots of help related to my products, including detailed debugging steps and notifications about problems. Please "Like" the page if you can.
  • Added links to my article on my Facebook page about steps to take when debugging Strictly AutoTags 
  • Bug Fix - On some options (checkbox / boolean) the system wasn't remembering your old settings on page refreshes. This should now be fixed.


Buy Strictly AutoTags version 2.9.9 now!



Buy a Strictly AutoTags Config Coupon now!

Buy a one time use coupon that offers you my time and expertise setting up the plugin on your site.

With an admin login and some time getting to know your content I can iron out any problems and configure the settings for your site and plugin correctly.

If you have trouble configuring your plugin or need help setting it up and getting the correct settings for your site then you can purchase a coupon for £20 which gives you my time and help on your system.

On receipt of payment of the coupon all I will need is access to your admin area with permission to change the plugin settings so that I can set it up for you and recommend settings for what you want to achieve.

Every site is different, so every site's configuration will be as well.

View the new plugin download page at www.strictly-software.com

Wednesday, 28 January 2015

NOFOLLOW DOES NOT MEAN DO NOT CRAWL!


By Strictly-Software

I have heard it said by "SEO Experts" and others that, to prevent excess crawling of a site, you can add rel="nofollow" to your links and this will stop GoogleBOT from crawling them.

Whilst on the surface this seems to make logical sense (the attribute value does say "nofollow", not "follow if you want"), it isn't true. BOTS will ignore the nofollow and still crawl the links if they want to.

The nofollow attribute value is not meant for blocking access to pages or preventing your content from being indexed or viewed by search engines. Instead, it is used to stop the main page "leaking" its "link juice" to the pages it links to.

As you should know, Google still uses PageRank, even though it matters far less than in years gone by. In the old days it was their prime way of calculating where a page was displayed in their index, and how one page related to another in terms of site authority.

The original algorithm for Page Rank and how it is calculated is below.

PR(A) = (1-d) + d (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn))


An explanation of it can be found here: Page Rank Algorithm Explained.
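As a rough worked example of the formula (the figures below are invented for illustration):

```python
# Worked example of the PageRank formula quoted above:
#   PR(A) = (1-d) + d * (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn))
# d is the damping factor (commonly 0.85), T1..Tn are the pages
# linking to page A, and C(T) is the number of outbound links
# on page T.

def pagerank(inbound, d=0.85):
    """inbound: list of (PR of linking page, its outbound link count)."""
    return (1 - d) + d * sum(pr / links for pr, links in inbound)

# Page A has two inbound links: one from a PR 4 page with 10
# outbound links, and one from a PR 6 page with only 3.
print(round(pagerank([(4, 10), (6, 3)]), 2))  # -> 2.19
```

Notice how the PR 6 page contributes far more (6/3 = 2.0) than the PR 4 page (4/10 = 0.4): the fewer outbound links a page has, the more value each of its links passes on.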

The perfect but totally unrealistic scenario is to have another site with a very high Page Rank value, e.g. 10 (the scale goes from 0 to 10), and to have that site's high-PR page (e.g. their homepage) contain a single link that goes to your site, without a nofollow value in the rel attribute of the link.

This tells the SERP, e.g. GoogleBOT, that this high-ranking site THINKS your site is more important than itself in the great scheme of the World Wide Web.

Think of a pyramid with your site/page ideally at the top with lots of high PR pages and sites all pointing to it, passing their link juice upwards to your site. If your page then doesn't have any links on it at all then no link juice you have obtained from inbound links will be "leaked out".

The more links there are on a page, the less PR value is given to each link, and in theory the less "worthy" each linked site becomes.

So it should be noted again: the nofollow attribute value isn't meant for blocking access to content or preventing content from being indexed by GoogleBOT and other search engines.



Instead, the nofollow attribute is used by sites to stop SERP BOTS like GoogleBOT from passing "authority" and PR value to the page it is linking to.

Therefore GoogleBOT and others can still crawl any link with rel="nofollow" on it.

It just means no Page Rank value is passed to the page being linked to.

Sunday, 25 January 2015

Returning BAD BOTS to where they came from


By Strictly-Software

Recently, in some articles, I mentioned some .htaccess rules for sending "BAD BOTS", e.g. crawlers you don't like, such as anything claiming to be IE 6 because no-one would really be using it anymore, back to where they came from.

Now, the rule I was using was suggested by a commenter on a previous article, and it used the REMOTE_ADDR server variable to do this.

For example in a previous article (which I have now changed) about banning IE 5, 5.5 and IE 6, I originally suggested using this rule for banning all user-agents that were IE 5, 5.5 or IE 6.

RewriteCond %{HTTP_USER_AGENT} (MSIE\s6\.0|MSIE\s5\.0|MSIE\s5\.5) [NC]
RewriteRule .* http://%{REMOTE_ADDR} [L,R=301]

Now, this rewrite rule uses the Apache server variable {REMOTE_ADDR}, which holds the originating IP address of the HTTP request, to send anyone with IE 6 or below back to it.

It is the IP address you would normally see in your servers access logs when someone visits.

Problems with this rule

Now, when I changed the rules on one of my own sites and then started testing a work site by using a user-agent switcher add-on for Chrome, I ran into a problem: every time I went to my own site I was sent to my company's gateway router page.

I had turned the switcher off, but for some reason (a bug in the plugin, a cookie or a session variable) my own site still believed I was on IE 6 and not the latest Chrome version. So every time I went to my site with this rule in place, I was kicked back to my company's gateway router page.

Therefore, after a clean-up, a think and a talk with my server techie guy, he told me I should be using localhost instead of the REMOTE_ADDR IP address. The reason was that a lot of traffic, hackers, HACKBOTS, Spammers and so on would end up being redirected to the gateway page of their ISP.

These ISPs might get a bit pissed off with your website sending their gateway router pages swathes of traffic that could potentially harm them.

Therefore, to prevent getting letters in the post saying you are sending swathes of hackers to someone's home or phone ISP gateway (a lot of phones and tablets use proxies for their browsers anyway), send them back to their own localhost, 127.0.0.1.

Also, instead of using a 301 permanent redirect you should use a 302 temporary redirect, as that is the more appropriate status code.

Use this rule instead

Therefore the rule I now recommend for anyone wanting to ban all IE 5, 5.5 and 6 traffic is below.

RewriteCond %{HTTP_USER_AGENT} (MSIE\s6\.0|MSIE\s5\.0|MSIE\s5\.5) [NC]
RewriteRule .* http://127.0.0.1 [L,R=302]

This rewrite rule bans IE 5, 5.5 and IE 6.0 and sends the crawler back to localhost on its own machine with a 302 redirect. You can obviously add other rules for BOTS and SQL/XSS injection hacks as well.
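If you want to sanity-check which user-agents the pattern in the rule above will catch before deploying it, the same test can be reproduced in a few lines of Python (re.IGNORECASE mirrors Apache's [NC] flag):

```python
# Reproduce the user-agent test the rewrite rule performs, so the
# pattern can be checked against sample agents before deployment.
import re

PATTERN = re.compile(r"(MSIE\s6\.0|MSIE\s5\.0|MSIE\s5\.5)", re.IGNORECASE)

def is_banned(user_agent):
    # True if the .htaccess rule would redirect this agent
    return bool(PATTERN.search(user_agent))

print(is_banned("Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)"))  # True
print(is_banned("Mozilla/5.0 (Windows NT 10.0) Chrome/40.0"))           # False
```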

This is a more appropriate rule, as a 301 is for traffic whose destination has permanently moved, such as when a page changes its name. Here it is an invalid parameter or value in the HTTP request that causes the user to be redirected, and that condition is temporary.

If the user changed its user-agent or parameters, it would then reach the site and get a 200 OK status code rather than a 301 or 302 redirect.

So remember, whilst an idea might seem good at first until you fully test it and ensure it doesn't cause problems it might not be all that it seems.

Wednesday, 14 January 2015

ETSY SHOP OPEN FOR BUSINESS!


By Strictly-Software

My Etsy shop is OPEN again - if you run a WordPress site and want some tools to automate your system then check out: https://www.etsy.com/uk/shop/StrictlySoftware

I didn't know that items in my shop EXPIRED after a while, so the shop looked empty to viewers for the last month and a bit. Now, though, you can buy your tools from Etsy or from my own site's plugin page: http://www.strictly-software.com/plugins (please click on some adverts and help me raise some cash).

Also, my Facebook page, https://www.facebook.com/strictlysoftware, has information about these tools that you should read if you have purchased any of them.

It has help articles, guides on support, possible issues and fixes and much more - feel free to comment and like the page!

The basic idea behind these plugins is:

Run a site all year, 24/7, without having to do anything apart from some regular maintenance, like cleaning out rarely used tags and OPTIMIZING your database tables.

So an RSS/XML feed contains your content (e.g. news about something), and this goes into WordPress at scheduled times (CRON or WebCron jobs) using WordPress plugins like RSSFeeder or WP-O-Matic. Then, as the articles are saved, Strictly AutoTags adds the most relevant tags by using simple pattern matching, such as finding the most frequently used "important" words in the article, e.g. words in the Title, Headers or Strong tags, or just Capitalised Words such as names like John Smith.

This means that if John Smith became famous overnight, you wouldn't have to add a tag for him manually, or wait for a 3rd party plugin to add the word to their own database before it could be used.
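As a toy illustration of that kind of pattern matching (this is not the plugin's actual code), capitalised word runs can be pulled out of an article and ranked by frequency:

```python
# Toy sketch of frequency-based tag discovery: find runs of two or
# more capitalised words (potential names like "John Smith") and
# rank them by how often they appear in the article.
import re
from collections import Counter

def candidate_tags(text, top_n=5):
    # Runs of two or more capitalised words, e.g. "John Smith"
    names = re.findall(r"\b(?:[A-Z][a-z]+ )+[A-Z][a-z]+\b", text)
    return [tag for tag, _ in Counter(names).most_common(top_n)]

article = ("John Smith won again today. Racing pundits say John Smith "
           "could ride at Ascot Races next week.")
print(candidate_tags(article))  # most frequent candidates first
```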

Then, once your article is tagged, you can choose to have the most popular tags converted into links to tag pages (pages containing other articles with the same tag), or just bolded for SEO, or left alone.

You can set certain tags to be "TOP TAGS" which will rank them higher than all other tags. These should be tags related to your site e.g a bit like the old META Keywords.

You can also clean up old HTML, convert text tags to real clickable ones and set up a system where if a tag such as ISIS is found the tag Middle East is used instead. This is all explained on the Strictly AutoTags page on my site.

Then, if you also purchase Strictly TweetBOT PRO, you can use those new post tags as #hashtags in your tweets, and you can set the system up either to tweet to multiple Twitter accounts with different formats and tags, or to tweet to the same account with different wording dependent on the wording in your article.



E.g. if your article was about the Middle East wars, you could say only post the Tweet if the article contains the words "Middle East" OR "Syria", or only post if it contains both "ISIS" AND "War".

The TweetBOT then lets you ensure the post is cached (if you are using a WordPress caching system) by making it live first and making an HTTP request to it so it gets cached. It then waits a custom-defined number of seconds before any Tweets are sent out.

You can then specify a number of seconds between each Tweet that is sent out to prevent Twitter Rushes e.g. Where 50 BOTS all hit your site at the same time.
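The publish, cache, wait, tweet sequence described above can be sketched like this (names and delays are hypothetical; send_tweet stands in for whatever Twitter client is used):

```python
# Hypothetical sketch of the staggered-tweet idea: wait for the
# caching plugin to store the new post, then space tweets out so
# 50 BOTS don't all hit the site at the same moment.
import time

def tweet_with_stagger(tweets, send_tweet, cache_delay=10, gap=30):
    time.sleep(cache_delay)      # give the caching plugin time to work
    for i, tweet in enumerate(tweets):
        if i:
            time.sleep(gap)      # gap between tweets spreads BOT traffic
        send_tweet(tweet)
```

Called with, say, cache_delay=10 and gap=30, the second tweet goes out 30 seconds after the first, so the "Twitter rush" arrives in waves rather than all at once.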

You can ensure no Tweets are sent out if they contain certain words, and add tracking codes, e.g. Google's, to the link before it is shortened by Bit.ly.

A simple PIN number process lets you connect your Twitter Account to your TweetBOT Account.

A dashboard keeps you informed of recent Tweets sent out, any errors from Twitter like "duplicate tweet", or if your Bit.ly account isn't working.

Plus a test button lets you test the system without sending a Tweet by taking the last post, running your settings through it such as shortening the link and post and checking all Twitter accounts are working and connected properly.

If you then link your Twitter account to your Facebook page, as I have with my Horse Racing site http://www.ukhorseracingtipster.com/, my Twitter account @ukhorseracetips and my Facebook page facebook.com/Ukhorseracingtipster, you get social media and SEO impact for free!





Check out the new live shop on Etsy for plugins, and for coupons if you need me to set a plugin up for your site: https://www.etsy.com/uk/shop/StrictlySoftware

You may need help due to your site's special settings or requirements, so a coupon will let me help you set it up correctly.

Saturday, 10 January 2015

2 Quick Ways To Reduce Traffic On Your System


By Strictly-Software

Slowing The 3 Major SERP BOTs Down To Reduce Traffic

If you run a site with a lot of pages, good rankings, or a site that tweets out a lot, e.g. whenever a post comes online, then you will probably get most of your traffic from the big 3 crawlers.

I know that whenever I check my access_log on my server to find out the top visiting IP addresses with a command like

grep "Jan/2015" access_log | sed 's/ - -.*//' | sort | uniq -c | sort -nr | less

I always find the top IPs belong to the big 3 search engines' own BOTS (SERP = Search Engine Results Page, which is why I call them SERP BOTS):

GoogleBot: Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)

Bing: Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)

Yahoo: Mozilla/5.0 (compatible; Yahoo! Slurp; http://help.yahoo.com/help/us/ysearch/slurp)

Unless you ask these BOTS to come NOW, by doing things like refreshing your sitemap and pinging the SERPS, or tweeting out links, they will crawl your site at a time of their own choosing, and with nothing to tell them to slow down they will crawl at their own speed. This could be once every second, and if so it could cause your site performance issues.

However, think about it logically: if you post a news article or a job advert then in reality it only needs to be crawled once by each SERP BOT for it to be indexed.

You don't really want it crawled every day, on every visit by these BOTS, as the content HASN'T changed, so there is really no need for the visit.

Now, I don't know of a way to tell a BOT to crawl a page only if its content is new or has changed in some way. Even if you had a sitemap system that only listed new or edited pages, the BOTS would still just visit your site and crawl it.

If you cannot add rel="nofollow" to internal links that point to duplicate content (which doesn't 100% guarantee the BOT won't crawl it anyway), then there are some things you can try if you find your site is having performance problems or is under pressure from heavy loads.


Crawl-Delay

Now, this only used to be supported by BingBOT and some smaller, newer search engines like Blekko.

However, in recent months of testing I have noticed that most major SERP BOTS, apart from GoogleBOT, now obey the directive. To get Google to reduce its crawl rate you have to use Webmaster Tools and set the crawl rate from the control panel.

For instance, on one of my big news sites I have a Crawl-delay: 25 setting, and when I check my access log for those user-agents there is (roughly) a 25-second delay between each request.

Therefore extending this value will reduce the traffic load from your site's biggest visitors, and it is easily done by adding the directive to your robots.txt file, e.g.

Crawl-delay: 25
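Note that crawlers read the directive from within a User-agent group, so a minimal robots.txt using it might look like this (a sketch only; check each engine's documentation for which agent names it honours):

```
User-agent: bingbot
Crawl-delay: 25

User-agent: Slurp
Crawl-delay: 25
```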

Banning IE 6

Now there is no logical reason in the world for any REAL person to be using this user-agent.

This was probably the worst Browser in history, due to the quirks within it that made web developers' jobs so hard. Even between IE 5.5 and IE 7 there are so many differences from IE 6, and it is the reason IE 8 and 9 had all those settings for compatibility modes and browser modes.

It is also the reason Microsoft is scrapping support for IE 7-9: all the hocus-pocus they introduced just to handle the massive differences between IE 6 and their newer standards-compliant browsers.

Anyone with a Windows computer nowadays should be on at least IE 10. Only if you're still on XP and haven't done any Windows Updates for about 5 years would you be a real IE 6 user.

Yesterday at work I ran a report on the most used Browsers that day. 

IE 6.0 came 4th!

It was below the 3 SERP BOTS I mentioned earlier and above the latest Chrome version.

On more detailed inspection with my custom logger/defence system, which analyses the behaviour of visitors rather than just assuming that an IE 6 agent means a human, I could see these visitors were all BOTS.

I check for things like whether they can run JavaScript, by using JavaScript to log that they can, in the same way as I do for Flash. These users had no JavaScript or Flash support, and the rate they went through pages was way too fast for a human controller.
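A toy version of that behaviour check might look like the following (the thresholds and field names are made up for illustration, not taken from my actual logger):

```python
# Flag a visitor as a likely BOT if it never triggered the
# JavaScript/Flash logging beacons AND its page views arrive
# faster than a human plausibly could. Thresholds are invented.

def looks_like_bot(visit):
    """visit: dict with 'ran_js', 'ran_flash' and 'timestamps' (seconds)."""
    ts = sorted(visit["timestamps"])
    if len(ts) > 1:
        avg_gap = (ts[-1] - ts[0]) / (len(ts) - 1)
    else:
        avg_gap = float("inf")
    no_scripting = not (visit["ran_js"] or visit["ran_flash"])
    return no_scripting and avg_gap < 2  # under 2s per page looks automated

crawler = {"ran_js": False, "ran_flash": False,
           "timestamps": [0, 1, 2, 3, 4]}
human = {"ran_js": True, "ran_flash": False,
         "timestamps": [0, 30, 95]}
print(looks_like_bot(crawler), looks_like_bot(human))  # True False
```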

The only reason I can think people are using this user-agent is that they are script kiddies who have downloaded an old crawling script whose default user-agent is IE 6, and they haven't changed it.

Either they don't have the skill or they are just lazy. However, by banning all IE 6 visitors with a simple .htaccess rule like this, you can reduce your traffic hugely.

RewriteCond %{HTTP_USER_AGENT} (MSIE\s6\.0|MSIE\s5\.0|MSIE\s5\.5) [NC]
RewriteRule .* http://127.0.0.1 [L,R=302]


This rewrite rule bans IE 5, 5.5 and IE 6.0 and sends the crawler back to localhost on its own machine with a 302 redirect.

No normal person would be using these agents. There may be some Intranets from the 90's still using VBScript as a client-side scripting language, but no modern site is designed with IE 6 in the designer's mind, so most sites will not handle IE 6 very well anyway. Like Netscape Navigator it is an old browser, so don't worry about supporting it. By banning IE 6 and below you will find your traffic goes down a lot.

So, two simple ideas to reduce your traffic load. Try them and see how much your site improves.