
Saturday, 18 June 2016

Why just grabbing code from the web can lead to major problems down the line

By Strictly-Software.com

I have written many articles over the years about server, system, website and PC performance, and it seems that the more versions of FireFox and Chrome that come out, the slower they get. I don't think I have ever used IE 11 as much as I have in the last 3 months, mostly just to get Facebook, Radio 1 or Google+ to load within a minute, which FF and Chrome seem to have issues with for some reason.

Some add-ons, like uBlock Origin, prevent 3rd party domain code from being loaded on a site, as well as large image, video or Flash objects. It also stops pop-up windows and the loading of remote CSS fonts, which is all the rage now.

What the developers of these websites don't seem to realise is that loading in code from all over the web just to make a page display or run causes a lot of network traffic. It also introduces the possibility that the code at the source has been tampered with, so you could be loading in Cross Site Scripting hacks, or giving people a way to exploit your site if a certain script exists in the DOM.

A less likely, but related, issue is that the more domains your site has to access to get all its code loaded, the greater the chance the page doesn't load as you want it to, or even at all.

If Script A relies on Script B, but Script B takes a long time to load, then the code in Script A that was going to open a popup window on DOM load, or play a video, just isn't going to work.

I recently overrode the window.onerror event and logged the message, URL and line number with an AJAX call to a log file, before either throwing the error (for modern sites) or swallowing it (for older ones).
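A minimal sketch of that idea follows; the /logjs.php endpoint and the exact log format are my assumptions for illustration, not the site's real code.

```javascript
// Sketch of a global JavaScript error logger. The /logjs.php endpoint
// and message format are assumptions, not the original implementation.

// Build the line we want to write to the log file.
function formatError(msg, url, lineNo) {
  return new Date().toISOString() + " : " + msg + " in " + url + " on line " + lineNo;
}

// Wire it up only when running in a browser.
if (typeof window !== "undefined") {
  window.onerror = function (msg, url, lineNo) {
    var entry = formatError(msg, url, lineNo);
    // Fire-and-forget AJAX call to the logging script.
    var xhr = new XMLHttpRequest();
    xhr.open("POST", "/logjs.php", true);
    xhr.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
    xhr.send("entry=" + encodeURIComponent(entry));
    // Return true to swallow the error (older sites), false to let it surface.
    return false;
  };
}
```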

When I started looking through these files, the number of Google AdSense and tracker scripts not loading due to timeouts was incredible. There are also errors caused by bugs in the scripts themselves, or by slow loading leaving objects unavailable to other scripts that rely on them. An example of just one error is:

24/04/2016 09:54:33 : 8X.XXX.XXX.161 'document.body' is null or not an object in http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js on line 19

People relying on Google for stats shouldn't, for a number of reasons. Not only does the tracking script not always load and record the visit, it also relies on 3rd party cookies and JavaScript being enabled. A log parser or database is a much better way to log every single visitor, BOT or human.

For example, if you are loading your main jQuery script from a CDN or a site you don't control, and that domain is having network problems, then any other code on the site reliant on it won't work until the issue is resolved. Judging by the messages in my JavaScript error log file, this happens a lot.
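A common defence is to test whether the CDN copy actually arrived and fall back to a locally hosted file. This is a sketch; the local path /js/jquery.min.js is an assumption for illustration.

```javascript
// Decide whether a local fallback script is needed after the CDN
// <script> tag has run. Pure helper so the logic is easy to test;
// the local path is an assumption, not a real file on any site.
function fallbackSrc(cdnLoaded, localPath) {
  return cdnLoaded ? null : localPath;
}

// Browser usage, placed straight after the CDN script tag:
// <script src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"></script>
// <script>
//   var src = fallbackSrc(!!window.jQuery, "/js/jquery.min.js");
//   if (src) document.write('<script src="' + src + '"><\/script>');
// </script>
```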

Because of this, a lot of people just grab the code off the net and load it from a local server to get round network delays.

However, by doing this they are stuck at a point in time (the date and version of the file they copied). I hate this: instead of actually learning JavaScript so they know what they are doing, they are relying on some other bloke's framework to solve their problems, e.g. have a look at whose code most of you are building your site with. If there is a bug in jQuery you either have to fix it yourself or wait for John to fix it. If it's your own code, at least you can rely on your own skills and you know how the code works.

The other day I had to solve a jQuery problem where the page in question was using an old version of jQuery alongside a 3rd party script built around jQuery (but not by John) called reveal.js.

As the front end developers wanted to move to the latest version of jQuery they suddenly found that the reveal.js code no longer worked.

After debugging, it was clear that the $().live() function had been removed from jQuery (deprecated in 1.7, removed in 1.9), and the code that did the popup relied on reveal.js, which was built in 2011 with no recent updates. The whole revealing and hiding of modal boxes stopped as soon as a modern version of jQuery was loaded on the site.

I had to waste time reading up on jQuery and then hand-patching our copy of reveal.js to use the new .on() function, so that the new jQuery library would work with old code taken from a library developed in 2011.
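The change itself is small. The snippet below sketches the kind of patch required; the selector is an assumption based on how reveal.js typically binds its trigger links, and jQuery and the root node are passed as parameters purely so the wiring can be exercised outside a browser.

```javascript
// jQuery <1.9 (removed API):
//   $('a[data-reveal-id]').live('click', handler);
// jQuery 1.9+ equivalent, using delegated events via .on():
function bindRevealLinks($, root, handler) {
  // Delegate clicks on reveal trigger links to a root node,
  // which is roughly what .live() did under the hood.
  return $(root).on('click', 'a[data-reveal-id]', handler);
}

// Browser usage (assumes jQuery 1.9+ is loaded):
// bindRevealLinks(jQuery, document, function (e) { /* open the modal */ });
```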

This is one thing I hate about front end developers who just pick 'n' choose libraries off the web, despite them all doing the same things, like event binding and removal, multiple times in multiple ways.

If they are relying on a 3rd party library from 2011 that in turn relies on a constantly updated framework like jQuery, which is always dropping and adding methods, then how can people expect sites to keep working when a method these libraries rely on is removed?

If they cannot even write some basic notes to say that this page relies on this script, e.g. reveal.js, which came with jQuery 1.4.5, then it makes people like me, who hate debugging other people's frameworks, hate 3rd party code even more.

Not only do I have my own Getme.js framework, which is simple, uses CSS selectors and has linked methods where the array of objects is passed down from function to function, but now that most browsers support the single line of code that finds objects by selector, there is no need to add Sizzle.js to it any more. Unless you really want to support old IE versions, you can just use this single line:

// where query is the CSS selector
document.querySelectorAll( query ); 
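In the browser that one line returns a NodeList, which a tiny helper can walk without any library. This is just a sketch, not part of Getme.js.

```javascript
// Walk any array-like collection (a NodeList in the browser, a plain
// array here) and call fn once per item.
function each(list, fn) {
  for (var i = 0; i < list.length; i++) {
    fn(list[i], i);
  }
}

// Browser usage (assumes a DOM is present):
// each(document.querySelectorAll('DIV#Main > A.menu'), function (a) { alert(a.id); });
```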

For example, in my Getme.js code, the following line will loop through all anchor nodes with a class of "menu" inside the DIV with the ID "Main". I just alert out each element's ID.

G('DIV#Main > A.menu').each(function(){
   alert(this.id);
})

Obviously, if you do all your styling in CSS or inline JS, you have options for how to style a series of objects; for example, with the .setAtts method you can pass in any element attributes and their values.

This applies a mixture of a class and inline styles to the paragraphs inside DIV tags. It also uses chaining, where the array of objects is passed from one function to the next, just like other frameworks.

The first example just looks for DIV tags with P's inside and sets the class to "warningRed" and the style of the font to bold and red. The class can do most of the styling, or all of it.

It's just an example, as is the 2nd one, which selects all P tags containing a SPAN with the class "info", sets a help message with the .setHTML method, then colours the text with the .setStyle method.


G('DIV > P').setAtts({"class":"warningRed", style:"color:red; font-weight:bold"});

G('P > SPAN.info').setHTML('Click for help.').setStyle({color:"red", fontSize:"8px"});


I used a G instead of $ just to distinguish it from all the other frameworks and because it's called Getme.js.

If you want to learn how to write your own chainable framework, have a read of this article of mine. I've kept Getme.js simple, as I hate it when people just copy code from the web, especially when it goes wrong.

At least this way I have a wrapper object that allows for chaining, the setting of multiple attributes at once, and the use of selectors. However, I still like to use pure JavaScript inside my functions so people down the line can get their heads around it.
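The core trick behind that chaining is simply returning the wrapper object from every method. This sketch is my own stripped-down illustration, not the real Getme.js source; it works on a plain array of objects so it can run anywhere.

```javascript
// Minimal chainable wrapper: each method returns the wrapper itself,
// so calls can be strung together just like G(...).setHTML(...).setStyle(...).
function Wrap(items) {
  if (!(this instanceof Wrap)) return new Wrap(items);
  this.items = items || [];
}
Wrap.prototype.each = function (fn) {
  for (var i = 0; i < this.items.length; i++) {
    fn.call(this.items[i], i); // "this" inside the callback is the item
  }
  return this; // returning the wrapper is what makes the chain work
};
Wrap.prototype.setAtts = function (atts) {
  return this.each(function () {
    for (var k in atts) this[k] = atts[k];
  });
};
```

In the real framework the constructor would take a CSS selector and fill this.items from document.querySelectorAll(query) instead of a ready-made array.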

So next time I get a jQuery problem because John Resig has decided to remove a core function from his framework, causing a chain reaction through all the other frameworks built around that version of jQuery, I can at least (hopefully) use my own simple framework to apply the CSS the designers need, rather than spend a day hunting around for fixes to other people's code.

That is something I really hate doing.



By Strictly-Software.com 

© 2016 Strictly-Software.com

Don't Be Fooled By "Turbo Boost" and Windows Performance / Cleaner Applications


By Strictly-Software.com

I bet if you have been online more than a few times you will undoubtedly have seen adverts for tools and applications that will "Speed up your computer" or "Tune it up", remove unnecessary files and even malware.

Most of these apps are con tricks, in that they will run, show you a really high number of problems to do with security, privacy or performance, and when you go to fix them you are told you must pay a fee of £29.99 for the full version.

Scam code I call it.

Mainly because people don't know what half the items recorded as security holes or performance issues actually are. For example, to get a nice big list of privacy concerns, 20,000 or so, they might list every single cookie you have from every browser.

If you don't know what a cookie is, it's a harmless small text file that holds a tiny amount of information about your visit to a site, e.g. linking your username to a member ID so that the next time you visit you don't have to keep re-typing your username into the login box.

For example, if you install the Web Developer Toolbar in FireFox you can view all the cookies on a site or domain, including session cookies. Viewing the cookies for this site, I see one that gives me this really important information....

Name: SNID
Value: 72=i-mBmgOp22ixVNh68LucZ_88i1MnYk0FkV2k8k3s=uNr4G5YjLe6X9iAQ
Host: .google.com
Path: /verify
Expires: Mon, 11 Apr 2016 16:43:43 GMT
Secure: No
HttpOnly: Yes

I have no idea what the cookie value for SNID means, and most people apart from its web developers won't, so when people try to scare you with "cookies are dangerous" - something I have heard from my parents many times - just ignore their ignorance of web development.

They just need to realise that unless your password is stored in a plain text cookie (which never happens) you don't have much to fear from cookies at all. They just fill up your local data directories the more sites you visit.
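You can see for yourself exactly what a page can read: in the browser, document.cookie is just one string of name=value pairs, and a few lines split it apart. A sketch:

```javascript
// Split a document.cookie style string ("a=1; b=2") into an object.
// Plain text in, plain text out - no passwords, no executable code.
function parseCookies(cookieStr) {
  var out = {};
  var parts = (cookieStr || "").split("; ");
  for (var i = 0; i < parts.length; i++) {
    var eq = parts[i].indexOf("=");
    if (eq > 0) {
      out[parts[i].slice(0, eq)] = parts[i].slice(eq + 1);
    }
  }
  return out;
}

// Browser usage: var jar = parseCookies(document.cookie);
```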

The one thing you may not like is tracking cookies, e.g. Google's, which try to track you from site to site to see what kind of information you are interested in so they can show you relevant adverts.

Turning off 3rd party cookies in Chrome, or the browser of your choice, and setting DNT (Do Not Track) to ON is worth doing, even though some sites simply ignore the DNT header.
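Sites can read the setting back through navigator.doNotTrack, but browsers have reported the value inconsistently over the years, so a small normalising helper is useful. A sketch, with the value list an assumption based on the browsers of the time:

```javascript
// Interpret a navigator.doNotTrack value. Browsers have reported it
// inconsistently ("1", "yes", "0", "unspecified" or null).
function dntEnabled(value) {
  return value === "1" || value === "yes";
}

// Browser usage:
// if (typeof navigator !== "undefined" && dntEnabled(navigator.doNotTrack)) {
//   /* skip loading the tracker script */
// }
```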

Turbo Mode

Turbo mode is one of those cool sounding options that seems to suggest that just by pressing the "Turbo ON" button your whole machine will speed up. In reality it does a few things, many of which might not even be happening at the time you press it.

These include:

-Stopping a scheduled de-fragmentation of your hard disk; something that is rarely needed anyway, but does consume memory and CPU if running.
-Stopping any scheduled tasks from running. These could be updates, downloads of applications that require updates, and the automatic creation of system backup and restore points.
-Postponing the automatic download and installation of important application and Windows updates.

You will be informed about the postponing of downloads and automatic updates, such as Windows Updates, if they are enabled.

In reality it doesn't do much, but it sounds and looks good when it says it has boosted your system's performance by 25% or so. Just beware that there is no way it can really know how much it has helped, and the gain is probably negligible anyway.

If you really want to speed up your PC, open Task Manager, enable the option to show processes from all users, and order the results by CPU or memory. Programs at the top using over 1GB should certainly be looked at, as they may have memory leaks.

Shutting those applications down and re-opening them might help you out a lot. I find some apps, like MS SQL Server, really drain my memory if I leave them running for days, and a reboot now and then is the best remedy for most problems.

It may be a joke from The IT Crowd to "Turn it off and on again", but in reality that does solve a hell of a lot of problems with computers running high memory or CPU.

Always try to install Windows updates regularly, so you are not waiting around for hours for 64 updates to install, as I have a number of times after repeatedly hitting the "Remind me in 15 minutes" button. A reboot with the most up to date software is the best thing you can do for your PC, along with removing applications and browser plugins that you never use.

The more unnecessary applications you have on your system, the more apps you will find in your Windows Start Up list running just to check for updates. Google does it, iTunes does it, and many other programs do as well. The more you can trim your system down so it only runs what you want it to run, the better.

Plugins on browsers that were only used once should be removed afterwards. Regularly check whether you are actually using all your browser plugins, as when they are updated the old versions are hardly ever removed.

Applications you downloaded to do one task should also be uninstalled before you forget about them.

The leaner the machine, the quicker the machine. I have a 16GB RAM, 64-bit Windows box at work and I regularly hit 12-13GB of memory usage. I usually know it is happening because the radio cuts out. However, as I hate closing everything down, waiting for the installations and then trying to remember what I had open at the time, I tend to let the memory rise and rise and then get frustrated as everything slows down.

If someone could invent a program that would remember what was open and then, after rebooting, re-open every app, file (with its text) and program that was running before, they would make a mint. If something like this already exists, PLEASE TELL ME WHERE I CAN FIND IT!

Clean your PC manually

This part of the article shows how this myriad of application cleaner tools, which trick you into paying money to speed up your PC, are basically useless. Tests have proved that running the following built-in Windows 8+ system tools can be just as effective.

Use the built in Disk Cleanup tool included with Windows. It’s focused on freeing up space on your hard drive, but it will also delete old temporary files and other useless things. Just tap the Windows key, type Disk Cleanup, and press Enter to launch it. You can even schedule a Disk Cleanup to clean your computer automatically.

When the tool pops up it will list a number of user and system folders containing files that build up over time the more you use your PC.

Whilst a browser cache is good for sites you visit constantly, since photos and other files are stored locally on your computer, saving a network request to download them again, most cached files are used once and forgotten about. This causes the folder size to rise and rise, slowing down access. If you don't visit sites often enough for the cache to be useful, clean it out. A tool like CCleaner can let you decide which sites get cleaned and which don't.

Remember to regularly clean the following:
  • Your Downloads folder: apps, videos and other files that you have installed or watched and no longer need.
  • Device driver downloads after installation.
  • The Recycle Bin.
  • System error and memory dump files.
  • Temporary files.
  • User file history.

There are free tools that help you do all this, backing up your PC before the deletions in case something goes wrong. We will look at CCleaner in a bit.

So if you don't want to rely on costly tools that try to trick you into paying money to feel safe, there are plenty of ways around it.

1. Don't be tricked by the salesperson at PC World who promises you McAfee Anti-Virus software is the best way to protect your PC. It's insurance, and they get the money - a bonus for the salesperson, so to speak.

There is no need to waste money on a tool that will kill your CPU by constantly scanning every single file your computer accesses (which is a lot) when there are free tools like Malwarebytes Anti-Malware that can be downloaded online. There is a premium version if you do require constant analysis of every file your PC touches, but I haven't found it to be needed.

Just run a scan once a week, and make sure never to open .ZIP, .EXE, .DOCX or .PDF files in emails, especially when you are not expecting them or they are from people you don't know.

Also, please remember that it is VERY EASY to fake the "FROM" address in an email (one line of code), so if you're a member of a site and someone sends you a flashy looking email that seems to be from PayPal, Facebook or your bank, with an address like admin@facebook.com, do at least a few things before opening any attachment.

1. Open the full email headers so that you can see the original sender of the email. Is it really from Facebook or your bank?

2. If you are not sure, because it's an IP address, e.g. 134.1.34.248, then run it through a command prompt with the line nslookup 134.1.34.248 and make sure it returns a known address. If it comes back empty, or with an unknown name, e.g. RuskiHCKER.com, use an online WHOIS tool (there are lots online), or if you have WhoisCL installed on your Windows computer type whoisCL RuskiHCKER.com and see what the WHOIS details return about the owner of the address. It should tell you what country it's from and give an email address to complain to if you are being spammed by it.

3. If the HTML email looks fancy, like your bank's or Facebook's, move your mouse over some of the links in the footer or sidebar. Most site strippers only bother putting code behind the main buttons, so they can log your typing, e.g. Login, Password, Forgot Password etc. If you roll your mouse over the "About" or "Help" links and all you see is a # instead of a proper URL, that is suspicious. Delete the email ASAP!
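If you save the suspicious email as HTML, you can let the browser console do the rollover check for you. This sketch pairs a small pure function with a hypothetical browser one-liner; neither is taken from any real anti-phishing tool.

```javascript
// Return the hrefs that go nowhere - a classic sign of a stripped
// phishing copy where only the login buttons are wired up.
function deadLinks(hrefs) {
  return hrefs.filter(function (h) {
    return h === "#" || h === "";
  });
}

// Browser usage on a saved email/page (hypothetical):
// deadLinks(Array.prototype.map.call(document.querySelectorAll('a'),
//           function (a) { return a.getAttribute('href') || ''; }));
```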

Remember: banks never ask you for your PIN code, so never trust a site asking for it. Also, if it asks you for the information used to verify you on other sites, such as your mother's maiden name, first pet, first school or favourite colour, you should shut it down ASAP.

4. If the headers look okay, it could still be a hacked mail server or a man-in-the-middle attack, so right click the file and, if you installed Malwarebytes properly, you should be able to run a virus scan over the file with one click before saving or opening it. If you can't, save it to your computer and run a virus check on the file before opening it. Never just open the file, whoever you may think it's from.

Regularly clear your browser history, or even better, set your browser to automatically clear its history when you close it. Better still, use your browser's private browsing option, e.g. Chrome's Incognito mode, which lets you surf the web without leaving a history or storing cookies on your machine.

Also clear your browser cache every now and then. Whilst a cache is good for quick loading of images and files (JS, CSS, JPEGs) that are used often, once it becomes too large it gets slower and slower to find the files you need, negating its usefulness.

Run the Disk Defragmenter included with Windows. This isn't necessary if you use an SSD (solid-state drive).

Don’t bother with a registry cleaner or other performance tool if you have to pay for it. If you want an application to help you, then CCleaner is that tool.

You can download it from here: CCleaner. The good thing about it is that it's the best-tested registry cleaner out there.

I always run a registry clean after removing applications from my computer to ensure any registry keys and file extensions left over are also removed. CCleaner will also delete your browser cache for all the browsers you use, as well as cookies, saved passwords, web history and temporary files for other programs.

You have the choice to tick what you want to clean and what not to clean, but the free CCleaner tool does a lot more than many of these PC cleaning apps do. A test performed in 2011 by Windows Secrets found that the Disk Cleanup tool included with Windows was just as good as paid PC cleaning apps.

Note that this is true even though PC cleaning apps fix “registry errors” while the Disk Cleanup app doesn't, which shows just how unnecessary registry cleaners are. So don't waste money being "blackmailed" into buying the premium version of these clean-up tools.

So yes, it’s been tested, PC cleaning apps are worthless. Tune your PC yourself and you will get better results.

If you want to download CCleaner, the recommended tool that professionals use, you can get it from www.piriform.com/ccleaner/download.

By Strictly-Software.com 

© 2016 Strictly-Software.com

Friday, 29 April 2016

Chrome and FireFox really getting on my tits....

By Strictly-Software.com

Chrome was my browser of choice, due to being lightweight and fast.

FireFox was in 2nd place due to the range of plugins available.

I had relegated IE to being used only to test code for cross browser compatibility issues.

However I am finding that I am actually using Internet Explorer more and more due to constant issues with both of the latest versions of these browsers.

I am running Chrome 50.0.2661.75 (64-bit) and FireFox 46.0, build no. 20160421124000 (64-bit), on all 3 of my machines (Win 7 & Win 8.1).

There was a stage when both these honeys were humming like bees. I even put up some articles on how to improve the speed of both browsers:

Speeding Up Chrome Can Kill It
Speeding up Google Chrome with DNS Pre-Fetching
Performance Tuning FireFox

I also put up a general PC and Browser tune up article with free tools, command line prompts and some basic things to try if you had a slow computer: Speeding up your PC and Internet connection.

However, I now find myself using IE 11 more and more due to constant hanging, pages not loading at all with the "processing request" message in the footer, or waiting for some 3rd party, non-asynchronously loaded script to download and run, which blocks the site or page from working.

I think there is far too much "API JIZZ" in the community at the moment.

What I mean is that developers, with their natural urge to impress and gold-plate code even when the spec doesn't call for it, are now using so many 3rd party and remotely hosted plugins - jQuery, Google Graphs, tracker code, plus loads of funky looking CPU-consuming widgets - to make their pages look good.

You only have to go into Facebook or G+ and try to write a message. Not only will Google Plus's new post box move around the page before you can start writing, but both websites constantly analyse your keystrokes to find out if the previous string matches a contact, community or page in your contact book for them to link to.

The more people and pages you have stored, the slower this process becomes. Yes, it might be handy, but why not just require a symbol like + in Google+ to be put before the person's name, so that the code only checks that word for a relation?

Imagine having a list of thousands of pages, liked communities and contacts constantly checked on every keydown press with AJAX requests. That is overkill, and it slows down systems.
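The standard fix for that kind of keystroke-driven AJAX is to throttle (or debounce) the handler, so the lookup fires at most once every few hundred milliseconds rather than on every keydown. A sketch, with the clock injectable so the behaviour is easy to verify:

```javascript
// Wrap fn so it runs at most once per waitMs milliseconds.
// "now" is injectable (defaults to Date.now) so the logic can be
// exercised with a fake clock instead of real timers.
function throttle(fn, waitMs, now) {
  now = now || Date.now;
  var last = -Infinity;
  return function () {
    var t = now();
    if (t - last >= waitMs) {
      last = t;
      fn.apply(this, arguments);
    }
  };
}

// Browser usage (hypothetical handler name):
// input.onkeydown = throttle(lookupContacts, 300);
```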

I still have two Chrome windows spinning away (on Google Blogger blogs) at the moment. There is not much 3rd party code on these pages, but they are struggling, showing the common "Waiting for Cache" and "Processing Request" messages in the status bar.

I get the same sort of thing in FireFox, although in that browser what kills me is just the slowness of getting from page to page. On many sites I have to refresh multiple times before the code all loads, and this goes for everything from online banking to online betting sites. Just trying to watch a race on their Flash screens is a nightmare.

I had a bet on a horse the other day on Bet365.com, just so I could watch the big race featuring Douvan, unbeaten in 11 straight wins. However, Bet365.com's video didn't start, and on SkyBet it was stuttery and kept losing picture and sound. I missed the end of one race where a horse I had backed jumped the last fence in the lead, but when the picture came back it had finished 3rd!

They keep telling me to clear the cache, reboot the router and do speed tests - things I have done many times. I have a 54Mbps download speed at work and 28Mbps at home. I can stream 4K UHD TV to multiple screens, so download speed is not the issue; something else is.

Speedof.me is the best online speed testing site I have found, as it uses no extra files and runs in pure HTML5, with no Flash, Java or ActiveX-type objects needing to be loaded for it to run.

What is causing the problem I have no idea, as my broadband speed seems okay. I suspect it's the large number of reverse proxies being used, and the download of shared 3rd party scripts and widgets that can hang due to the large number of HTTP requests.

I tried deleting my user data file for Google by searching for it in the address bar of Windows Explorer with this line: %USERPROFILE%\AppData\Local\Google\Chrome\User Data

I have also tried disabling Flash, as I so often see the "An object has crashed" bar in the header, related to the Flash container object failing. Sometimes a reload works; other times it doesn't.

However, so many sites STILL use Flash that it is hard to live without it. For example, the WHOLE of Bet365.com is made in Flash, which makes it very user-unfriendly and hard to use, with sticky scrollbars and issues selecting items.

If anyone has similar issues or ideas on resolving them let me know, as I never thought I would be going back to IE to use as my main browser!

By Strictly-Software.com

©2016 Strictly-Software.com

Wednesday, 10 July 2013

Apache Performance Tuning BASH Script

BASH Script to tune Apache Configuration Settings

As you might know, a lot of the time I think the LAMP / Wordpress combo is a big bag of shite.

There are so many configuration options, at so many different levels, that need tuning to get optimal performance that it is a nightmare to find the right information. There are also too many people offering various solutions for Wordpress / Linux / Apache / MySQL configuration.

Different people recommend different sizes for your config values, and just trying to link server load with page/URL/script requests to find the cause of a performance issue is a nightmare in itself.

I would have thought there would be a basic tool out there that could log server load, memory and disk swapping over time, and then link that up with the MySQL slow query log and the Apache error AND access logs, so that when you had issues you could easily tell which processes were running, which URLs were being hit, and how much activity was going on, to identify culprits for tuning. I have even thought of learning PERL just to write one - not that I want to!

Even with all the MySQL tuning possible, caching plugins installed and memory limits on potentially intensive tasks, it can be a nightmare to get the best out of a 1GB RAM, 40GB virtual server that is constantly hammered by BOTS, crawlers and humans. I ban over 50% of my traffic and I still get performance issues at various times of the day - why? I have no FXXING idea!

Without throwing RAM at the problem, you can try to set the values in your Apache config file appropriately for your server and MPM fork type.

For older versions of Apache, the Prefork MPM (a non-threaded, pre-forking webserver) is well suited, as long as the configuration is correct. However, it can consume lots of memory if misconfigured.

For newer versions (2.x) the Worker MPM is considered better for high traffic servers, as each thread handles one connection at a time and the memory footprint is smaller. However, getting PHP working with this MPM apparently needs a lot of configuration, and you should read up on it before considering a change.

Read about Apache performance tuning here Apache Performance Tuning.

To find out your current Apache version, from the console run

apache2 -v OR httpd -v (depending on your server type; if you run top and see apache2 threads then use apache2, otherwise use httpd)

You will get something like this.

Server version: Apache/2.2.9 (Debian) Server built: Feb 5 2012 21:40:20

To find out your current module configuration, from the console run

apache2 -V OR httpd -V

Server version: Apache/2.2.9 (Debian)
Server built: Feb 5 2012 21:40:20
Server's Module Magic Number: 20051115:15
Server loaded: APR 1.2.12, APR-Util 1.2.12
Compiled using: APR 1.2.12, APR-Util 1.2.12
Architecture: 64-bit Server
MPM: Prefork threaded: no forked: yes (variable process count)
etc etc etc...

There are lots of people giving "suitable" values for the various Apache settings, but one thing you need to do, if you run top and notice high memory usage (especially high virtual memory usage), is try to reduce disk swapping.

I have noticed that when Apache is consuming a lot of memory, your virtual (disk based) memory will be high, and you will often experience either high server loads and long waits for pages to load, OR very low server loads, e.g. 0.01-0.05, an unresponsive website and lots of "MySQL Server Has Gone Away" messages in your error log file.

You need to optimise your settings so that disk swapping is minimal, which means optimising your MySQL settings using the various MySQL tuning tools I have written about, as well as working out the right sizes for your Apache configuration values.

One problem is that if you use up your memory by giving MySQL enough room to cache everything it needs, you can find yourself with little left for Apache. Depending on how much memory each process consumes, a sudden spike in concurrent hits can easily use up all available memory and start disk swapping.

Therefore, apart from MySQL using the disk to carry out or cache large queries, you need to find the right number of clients to allow at any one time. If you allow too many and don't have enough memory to contain them all, then the server load will go up, people will wait, and the amount of disk swapping will increase and increase until you enter a spiral of doom that only a restart fixes.

It is far better to allow fewer connections and serve them up quickly, with a small queue and less waiting, than to open more than your server can handle and create a massive queue with no hope of ending.

One of the things you should watch out for are Twitter Rushes, caused by automatically tweeting your posts to Twitter accounts, as this can cause 30-50 BOTS to hit your site at once. If they consume all your memory it can cause a problem, which I have written about before.

Working out your MaxClients value

To work out the correct number of clients to allow, you need to do some maths, and to help you I have created a little BASH script to do it.

What it does is find the size of the largest Apache process, then stop and restart Apache so that the correct "free memory" value can be obtained.

It then divides the free memory by the Apache process size. The value you get should be roughly right for your MaxClients setting.

It will also show you how much swapped (virtual) memory you are using, as well as the size of your MySQL process.

I noticed on my own server that when it was under-performing I was using twice as much disk swap as RAM. However, when I re-configured my options and gave the system enough RAM to accommodate all the MySQL / Apache processes, it worked fine with little swapping.

Therefore, if your virtual memory usage is greater than your total RAM, e.g. you are using 1.5GB of hard disk space as virtual memory but only have 1GB of RAM, the script will show an error message.

Also, as a number of Apache tuners claim that your MinSpareServers should be 10-25% of your MaxClients value and your MaxSpareServers value 25-50% of MaxClients, I have included the calculations for these settings as well.


#!/bin/bash
echo "Calculate MaxClients by dividing biggest Apache thread by free memory"
if [ -e /etc/debian_version ]; then
 APACHE="apache2"
elif [ -e /etc/redhat-release ]; then
 APACHE="httpd"
fi
APACHEMEM=$(ps -aylC $APACHE |grep "$APACHE" |awk '{print $8}' |sort -n |tail -n 1)
APACHEMEM=$(expr $APACHEMEM / 1024)
SQLMEM=$(ps -aylC mysqld |grep "mysqld" |awk '{print $8}' |sort -n |tail -n 1)
SQLMEM=$(expr $SQLMEM / 1024)
echo "Stopping $APACHE to calculate the amount of free memory"
/etc/init.d/$APACHE stop &> /dev/null
TOTALFREEMEM=$(free -m |head -n 2 |tail -n 1 |awk '{free=($4); print free}')
TOTALMEM=$(free -m |head -n 2 |tail -n 1 |awk '{total=($2); print total}')
SWAP=$(free -m |head -n 4 |tail -n 1 |awk '{swap=($3); print swap}')
MAXCLIENTS=$(expr $TOTALFREEMEM / $APACHEMEM)
MINSPARESERVERS=$(expr $MAXCLIENTS / 4)
MAXSPARESERVERS=$(expr $MAXCLIENTS / 2)
echo "Starting $APACHE again"
/etc/init.d/$APACHE start &> /dev/null
echo "Total memory $TOTALMEM"
echo "Free memory $TOTALFREEMEM"
echo "Amount of virtual memory being used $SWAP"
echo "Largest Apache Thread size $APACHEMEM"
echo "Amount of memory taken up by MySQL $SQLMEM"
if [ "$SWAP" -gt "$TOTALMEM" ]; then
      ERR="Virtual memory is too high"
else
      ERR="Virtual memory is ok"
fi
echo "$ERR"
echo "Total Free Memory $TOTALFREEMEM"
echo "MaxClients should be around $MAXCLIENTS"
echo "MinSpareServers should be around $MINSPARESERVERS"
echo "MaxSpareServers should be around $MAXSPARESERVERS"


If you get 0 for either of the last two values then consider increasing your memory or working out what is causing your memory issues. Either that or set your MinSpareServers to 2 and MaxSpareServers to 4.
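Once you have the numbers, they go into the prefork MPM section of your Apache configuration (apache2.conf on Debian, httpd.conf on Red Hat). The values below are purely illustrative; substitute whatever the script printed for your own server:

```apache
# Illustrative values only - use the numbers the script calculated for you
<IfModule mpm_prefork_module>
    StartServers          5
    MinSpareServers      10    # roughly 25% of MaxClients
    MaxSpareServers      20    # roughly 50% of MaxClients
    MaxClients           40    # free memory / largest Apache process size
    MaxRequestsPerChild 1000   # recycle children so memory use can't creep up forever
</IfModule>
```

Remember to reload Apache after changing these so the new limits take effect.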

There are many other settings for which you can find appropriate values, but adding indexes to your database tables and ensuring your database table/query caches fit in memory rather than being swapped to disk is a good way to improve performance without having to resort to more caching at all the various levels WordPress/Apache/Linux users love adding.
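For example, on a MySQL 5.x server the relevant caches live in my.cnf. The sizes below are purely illustrative; the important point is that their total must fit inside your real RAM budget alongside Apache:

```ini
# Illustrative my.cnf sizes only - they must fit in RAM alongside Apache
[mysqld]
key_buffer_size     = 64M   # MyISAM index cache
query_cache_size    = 32M   # cache for repeated SELECT results
query_cache_limit   = 1M    # skip caching any single result bigger than this
tmp_table_size      = 32M   # in-memory temp tables before spilling to disk
max_heap_table_size = 32M   # keep in step with tmp_table_size
```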

If you do use a caching plugin for Wordpress then I would recommend tuning it so that it doesn't cause you problems.

At first I thought WP SuperCache was a solution and pre-caching all my files would speed things up due to static HTML being served quicker than PHP.

However I found that the pre-cache stalled often, caused lots of background queries to rebuild the files which consumed memory and also took up lots of disk space.

If you are going to pre-cache everything then hold the files for as long as possible; if they don't change, there seems little point in deleting and rebuilding them every hour or so and using up SQL/IO etc.

I have also turned off gzip compression in the plugin and enabled it at Apache level. It seems pointless doing it twice, and compressing in PHP will use more resources than letting the web server do it.
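Enabling it at the Apache level is just a case of loading mod_deflate and adding something like the following (a minimal sketch; the exact config file location depends on your distribution):

```apache
# Compress text responses once, at the Apache level, via mod_deflate
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/plain text/css text/javascript application/javascript
</IfModule>
```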

The only settings I have enabled in WP-Super-Cache at the moment are:


  • Don’t cache pages with GET parameters. (?x=y at the end of a url) 
  • Cache rebuild.
  • Serve a supercache file to anonymous users while a new file is being generated. 
  • Extra homepage checks. (Very occasionally stops homepage caching)
  • Only refresh current page when comments made. 
  • Cache Timeout is set to 100000 seconds (why rebuild constantly?)
  • Pre-Load - disabled.

Also, I have left the Rejected User Agents box blank, as I see no reason NOT to let BOTS like googlebot create cached pages for other people to use. As BOTS will most likely be your biggest visitors, it seems odd not to let them create cached files.

So far this has given me some extra performance.

Hopefully the tuning I have done tonight will help the issues I am getting of very high server loads, "MySQL server has gone away" errors and heavy disk swapping. I will have to wait and see!

Sunday, 27 December 2009

Reasons why Google Chrome is a great Browser

A list of reasons why Google Chrome is a great Browser

Ever since Google introduced Chrome I have been using it all the time for surfing the net. I had been using FireFox purely for the standards compliance, speed and features but then the more features that were added the slower it got.

I still use FireFox for development and its rich library of plugins and you can speed it up by following the tweaks listed in my Increasing FireFox Performance article. However when I just want to surf the net, watch movies or read articles Chrome is the browser I choose. Here are some reasons why.

  • Reason 1 Chrome is great > It's simple to use and not full of features that are purely there for marketing but you never use. It does the job simply and it does it well.
  • Reason 2 Chrome is great > The Bookmark bar is great for storing quick links to your favourite sites.
  • Reason 3 Chrome is great > It's fast: fast to start up and fast to load pages. Its JavaScript engine is also fast and solid. It's less forgiving than FireFox, which could be considered a good thing or a bad thing depending on how sloppy your code is.
  • Reason 4 Chrome is great > It's easy to surf in a semi-private mode, e.g Incognito browsing, where any cookies are destroyed after browsing and no download or browsing history is kept.
  • Reason 5 Chrome is great > The inbuilt developer tools are pretty good. The Developer console offers a good debugger, JavaScript console, DOM viewer and element inspector.
  • Reason 6 Chrome is great > It's standards compliant.
  • Reason 7 Chrome is great > It now has support for add-ons and plugins so it can compete with FireFox for great features. Turn off adverts and Flash by default to speed up load times even more. Read my article on Google Chrome plugins for more details.

Thursday, 24 December 2009

Performance Tuning your PC and Internet Connection

How to performance tune your Computer and Internet Connection

I recently had major issues with performance on my laptop and an intermittent slowdown which meant that I couldn't watch streamed movies (e.g YouTube) or remotely access my office computer due to the slow internet connection. Certain times of the day it was fine but at night it was generally bad. This article is based on the steps that I used to diagnose and overcome the problem. It can also be used by those of you who just wish to get the best performance out of your computers.

Is the problem related to your Internet speed or overall computer performance?

Are you only experiencing problems when you are on the Internet, such as slow loading web pages, stuttering video streaming or videos just not playing? Or are you having problems running desktop applications, such as programs that are slow to open or files that are slow to save? Is just navigating your PC a task in itself, or are you experiencing pop-ups all the time that you don't recognise, asking you to "run performance checks" or "install this Spyware checker", or pages filled with adverts or links to advertisements that you don't know the origin of?

Computer Related Problems

First thing is to ensure you don't have a virus, Trojan or Spyware on your PC.
  • If you use Internet Explorer to surf the Internet then there is a good chance you might have a virus as this browser is well known for its many security holes. Consider changing your browser to either Chrome or Firefox. Chrome is a very fast browser and Firefox is a favourite of developers due to the huge number of add-ons available for it.
  • If you use a PC, make sure you install any Windows updates as they regularly contain patches for security vulnerabilities.
  • If you don't have a virus / spyware checker installed then download one of the good free ones, e.g Malwarebytes Anti-Malware, Spybot Search & Destroy or Ad-Aware, or even better download multiple applications, as it's not uncommon for one app to find items that another will not. Remember to always update the virus definitions before running a scan.
  • If your virus software doesn't find a virus it doesn't mean you don't have one; it could just mean that it's either a new virus that definitions haven't been created for, or it's already managed to take hold of your PC and block any virus checker from finding it. Try running a program such as Trend Micro's HijackThis which checks for suspicious looking processes and activity on your PC rather than looking for known virus definitions. If you are unsure about a flagged item you should send the outputted report to one of the recommended forums where specialists will analyse the report and give you detailed info on any action required, such as running the Trojan removal tool SDFix.exe.
Once spyware and viruses have been ruled out you should run some basic maintenance on your computer which can be done manually or by downloading one of the many optimiser tools that are available on the net. I have investigated many of these tools and by far the best one I have found is TuneUp Utilities which offers all the tools you need to clean and speed up your PC and browser with a very easy to use interface.

TuneUp Utilities 2010

It offers the ability to modify computer and browser settings to speed up your browsing, remove un-used programs, clean up and defrag your hard-drive and registry, and speed up your PC by disabling a number of memory and CPU intensive operations that offer little benefit, and much more. There is also a "One Click Optimiser" button which checks your system and offers solutions. If you want to save a lot of time downloading numerous tools, or doing it all by hand, then this is the tool for you.

Tuning up your PC Manually

  • Defrag your hard-drive. Over time your disk will get fragmented as new files are added and existing ones are edited or deleted. A heavily fragmented drive slows down file retrieval and saving. You can do this through the Accessories > System Tools > Disk Defragmenter option or you can download a tool like Defraggler to do this for you.
  • Remove old programs and shortcuts to those programs if you never use them any-more. You can use the Add-Remove programs option from the Control Panel to do this or download a program like CCleaner which offers a number of options to help clean up your computer.
  • Remove anything from your startup menu that you hardly use or don't require to be running when you start-up your computer.
  • Clean up your Registry. Often when files are installed or deleted keys are left in the registry that are no longer required. Like any database the more useless information it contains the slower the retrieval of useful info becomes. A tool like TuneUp Utilities or CCleaner offers you the ability to do this easily without having to trawl through the registry looking for keys by hand.
  • Disable memory and CPU intensive operations that run in the background when you require optimal performance. For example disk defragmentation or a full virus scan will slow down your PC when running. This is one of the good things about TuneUp Utilities Turbo Mode as it can be set on or off when required and will ensure that any CPU or Memory intensive operations can be disabled when you require optimal performance.
  • Configure the advanced settings in Control Panel > System > Advanced > Performance.
    1. Under the Visual Effects tab you should set the option to "Adjust for best performance".
    2. On the Advanced tab you should ensure Processor Scheduling and Memory Usage is set to Programs
    3. For Virtual Memory make sure both the initial and maximum size are set to the same value, which Microsoft recommends should be 1.5 times your system memory.
    4. Under the Data Execution Prevention tab you should set it to "turn on DEP for all programs and services except those I select".
  • Clean up your temporary browser files. Make sure your cache and Internet history doesn't get too large so clean all temporary Internet files on a regular basis. The cache is great for helping sites you regularly visit load quickly but the larger it gets the slower page loads get for all sites.
  • Remove any add-ons that you never use anymore. In Firefox the more add-ons you have the slower the browser can be when loading and they can even cause errors. You will often have duplicate add-ons e.g different versions of Java which can be removed.
  • Install Advert and Flash blocker add-ons if your browser supports it (Firefox, Chrome). Without having to load Flash files and other adverts the page load times can be increased dramatically.
  • Disable JavaScript by default. Not only do most web delivered viruses use JavaScript to infect new PC's it can slow down page load times and make pages seem unresponsive during certain events e.g window, DOM load. All browsers will let you disable JavaScript and in IE VBScript from their inbuilt Options. However to make it easier to set which sites have it on and off you can install add-ons such as NoScript or the Web Developer toolbar. A lot of sites use JavaScript to display adverts, load flash or other videos, validate form fields and deliver other forms of content. Therefore you may find that by having JavaScript disabled you have reduced functionality on many sites. However pages should load a lot quicker and if you do trust the site or require the missing functionality you can always re-enable it.
  • Disable 3rd party cookies. These are cookies that are not set by the site you are visiting and are usually used by advertisers for tracking the sites you visit so that they can deliver more targeted advertisements. Even Google uses these kinds of cookies now and many people consider them an invasion of their privacy, which is why most Spyware tools identify them as items to be removed. This is how to disable 3rd party cookies in the top 3 browsers.
    1. Chrome you can do this by going to Tools > Options > Under The Hood > Privacy > Cookie Settings > Accept cookies only from sites I visit.
    2. Internet Explorer go to Tools > Internet Options > Privacy and then set your Privacy level to Medium high which will disable most 3rd party cookies and some 1st party ones. This will still allow you to login to sites but should prevent all the tracker and advert cookies that accumulate as you surf the net.
    3. Firefox removed the option to block 3rd party cookies in version 2, claiming it was impossible to accomplish. However, you can still do this by either installing an add-on called CookieSafe or changing your user preferences: enter about:config in the address bar and search for network.cookie.cookieBehavior. The possible values are 0 (accept all cookies), 1 (only accept cookies from the same server) and 2 (disable all cookies). Set it to 1 to block 3rd party cookies.
  • Enable Popup blockers and disable any un-used toolbars e.g Google, Yahoo etc.
  • In FireFox disable Firebug and any other DOM manipulating add-ons and only enable them when required. Firebug has steadily got worse over the years in slowing down sites due to all the extra functionality that has been added to it. Therefore it should only be used when developing sites or when you need to use one of its features. The same goes for any other add-ons that you only use on certain sites or at certain times. Having less add-ons to load will increase page load times.
  • In Firefox tweak your config settings to improve performance. Read this article on which settings to tweak to get the best performance possible.
Testing for Network Problems

If you are having issues with slow loading pages when browsing or video streaming then you need to find out whether the problem is local to your home or a general network problem that you need to contact your ISP about.

Before doing anything else you should get some basic details of your network if you don't know them already such as the IP address of your gateway to the internet. Open a command prompt window and type "ipconfig". You should note down the results e.g

C:\Documents and Settings\me>ipconfig

Windows IP Configuration

Ethernet adapter Local Area Connection:

Media State . . . . . . . . . . . : Media disconnected

Ethernet adapter Wireless Network Connection:

Connection-specific DNS Suffix  . :
IP Address. . . . . . . . . . . . : 192.168.1.3
Subnet Mask . . . . . . . . . . . : 255.255.255.0
Default Gateway . . . . . . . . . : 192.168.1.1 


Note down the IP address and the Default Gateway address. The IP Address is your computer and the Default Gateway is your connection to the outside world. In this case it's a wireless router which is then connected to the Virgin Cable box.

We can now test whether the network problem lies between my PC and the wireless router, at the main router, or somewhere else by doing some PING tests.

A "Ping" measures the time that passes between the initial send of the ping and the receipt of the reply from the machine you pinged. The time a ping takes is slightly influenced by the amount of hardware the ping is passed through, as each device has to relay it further. However, there is no set formula for this, as the ping speed also depends on the speed of the network, how busy it is, and so on.

  • A ping to your default gateway should be very quick e.g 1-2 ms
  • A ping to other computers on your LAN should be between 1-10ms (good)
  • Pings to external websites such as www.google.com take anything from 20-150ms; anything under 50ms to an external site is good.
  • Pings to sites on the other side of the world that go through many hops e.g from the UK to www.china.com should report times of <500ms if the network is good.
So let's do some pings: first to my gateway, then to www.google.com, and then to somewhere very far away, e.g. www.china.com.

C:\Documents and Settings\me>ping 192.168.1.3

Pinging 192.168.1.3 with 32 bytes of data:

Reply from 192.168.1.3: bytes=32 time<1ms TTL=128
Reply from 192.168.1.3: bytes=32 time<1ms TTL=128
Reply from 192.168.1.3: bytes=32 time<1ms TTL=128
Reply from 192.168.1.3: bytes=32 time<1ms TTL=128

Ping statistics for 192.168.1.3:
 Packets: Sent = 4, Received = 4, Lost = 0 (0% loss),
Approximate round trip times in milli-seconds:
 Minimum = 0ms, Maximum = 0ms, Average = 0ms

C:\Documents and Settings\me>ping www.google.com

Pinging www-tmmdi.l.google.com [216.239.59.103] with 32 bytes of data:

Reply from 216.239.59.103: bytes=32 time=32ms TTL=52
Reply from 216.239.59.103: bytes=32 time=28ms TTL=52
Reply from 216.239.59.103: bytes=32 time=32ms TTL=52
Reply from 216.239.59.103: bytes=32 time=30ms TTL=52

Ping statistics for 216.239.59.103:
Packets: Sent = 4, Received = 4, Lost = 0 (0% loss),
Approximate round trip times in milli-seconds:
Minimum = 28ms, Maximum = 32ms, Average = 30ms

C:\Documents and Settings\me>ping www.china.com

Pinging chcache.china.com [124.238.253.102] with 32 bytes of data:

Reply from 124.238.253.102: bytes=32 time=606ms TTL=48
Reply from 124.238.253.102: bytes=32 time=526ms TTL=48
Reply from 124.238.253.102: bytes=32 time=446ms TTL=48
Reply from 124.238.253.102: bytes=32 time=445ms TTL=48

Ping statistics for 124.238.253.102:
Packets: Sent = 4, Received = 4, Lost = 0 (0% loss),
Approximate round trip times in milli-seconds:
Minimum = 445ms, Maximum = 606ms, Average = 505ms

If you are suffering packet loss or long delays then you should investigate further.
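If you want to check several hosts regularly, the manual tests above can be scripted. Below is a small bash sketch, assuming the Linux ping summary format (e.g "4 packets transmitted, 4 received, 0% packet loss"); the hosts to check are passed on the command line:

```shell
#!/bin/bash
# Ping each host given on the command line and flag any packet loss.
# Assumes the Linux ping summary line, e.g.
# "4 packets transmitted, 4 received, 0% packet loss, time 3004ms".

# Pull the loss percentage out of a captured ping summary
loss_pct() {
    echo "$1" | grep -o '[0-9]*% packet loss' | grep -o '^[0-9]*'
}

check_host() {
    local host="$1" out loss
    out=$(ping -c 4 "$host" 2>&1)
    loss=$(loss_pct "$out")
    if [ -z "$loss" ] || [ "$loss" -gt 0 ]; then
        echo "$host: ${loss:-100}% loss - investigate further"
    else
        echo "$host: ok"
    fi
}

# e.g. ./pingcheck.sh 192.168.1.1 www.google.com
for h in "$@"; do
    check_host "$h"
done
```

Run it against your gateway first and then an external site to see which leg of the connection is losing packets.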

Another good test from the command prompt is either the tracert / traceroute command or pathping which will do a series of pings from your PC to the destination showing you the addresses of each router it has to pass through and any delay it suffers on the way.

For example lets try a pathping to www.google.com.

C:\Documents and Settings\me>pathping www.google.com

Tracing route to www-tmmdi.l.google.com [216.239.59.99]
over a maximum of 30 hops:
0  strl03455wxp.domain.compname.co.uk [192.168.1.3]
1  192.168.1.1
2  10.129.132.1
3  glfd-cam-1b-v111.network.virginmedia.net [80.4.30.233]
4  glfd-core-1b-ge-115-0.network.virginmedia.net [195.182.181.237]
5  gfd-bb-b-ge-220-0.network.virginmedia.net [213.105.175.89]
6  man-bb-a-ae3-0.network.virginmedia.net [213.105.175.145]
7  man-bb-b-ae0-0.network.virginmedia.net [62.253.187.178]
8  tele-ic-3-ae0-0.network.virginmedia.net [212.43.163.70]
9  158-14-250-212.static.virginmedia.com [212.250.14.158]
10  209.85.255.175
11  209.85.251.190
12  66.249.95.169
13  216.239.49.126
14  gv-in-f99.1e100.net [216.239.59.99]

Computing statistics for 350 seconds...
Source to Here   This Node/Link
Hop  RTT    Lost/Sent = Pct  Lost/Sent = Pct  Address
0                                           strl03455wxp.domain.compname.co.uk
[192.168.1.3]
              0/ 100 =  0%   |
1    0ms     1/ 100 =  1%     1/ 100 =  1%  192.168.1.1
              0/ 100 =  0%   |
2  ---     100/ 100 =100%   100/ 100 =100%  10.129.132.1
              0/ 100 =  0%   |
3   14ms     4/ 100 =  4%     4/ 100 =  4%  glfd-cam-1b-v111.network.virginmed
ia.net [80.4.30.233]
              0/ 100 =  0%   |
4   16ms     2/ 100 =  2%     2/ 100 =  2%  glfd-core-1b-ge-115-0.network.virg
inmedia.net [195.182.181.237]
              0/ 100 =  0%   |
5   14ms     2/ 100 =  2%     2/ 100 =  2%  gfd-bb-b-ge-220-0.network.virginme
dia.net [213.105.175.89]
              0/ 100 =  0%   |
6   27ms     1/ 100 =  1%     1/ 100 =  1%  man-bb-a-ae3-0.network.virginmedia
.net [213.105.175.145]
              0/ 100 =  0%   |
7   24ms     1/ 100 =  1%     1/ 100 =  1%  man-bb-b-ae0-0.network.virginmedia
.net [62.253.187.178]
              0/ 100 =  0%   |
8   31ms     1/ 100 =  1%     1/ 100 =  1%  tele-ic-3-ae0-0.network.virginmedi
a.net [212.43.163.70]
              0/ 100 =  0%   |
9   33ms     0/ 100 =  0%     0/ 100 =  0%  158-14-250-212.static.virginmedia.
com [212.250.14.158]
              0/ 100 =  0%   |
10   25ms     1/ 100 =  1%     1/ 100 =  1%  209.85.255.175
              0/ 100 =  0%   |
11   37ms     1/ 100 =  1%     1/ 100 =  1%  209.85.251.190
              0/ 100 =  0%   |
12   39ms     1/ 100 =  1%     1/ 100 =  1%  66.249.95.169
              0/ 100 =  0%   |
13   41ms     0/ 100 =  0%     0/ 100 =  0%  216.239.49.126
              0/ 100 =  0%   |
14   35ms     0/ 100 =  0%     0/ 100 =  0%  gv-in-f99.1e100.net [216.239.59.99
]

Trace complete.

If you are suffering severe packet loss between routers then that could signify a problem or it may just be that the router is not set up to respond to pings and therefore any ping to that IP would report a time out.

Another test is to compare whether the speeds promised by your broadband provider are actually being delivered to you. There are many speed test sites out there but I tend to use www.broadbandspeedchecker.co.uk OR www.speedtest.net which will measure your download and upload speeds.

You should always do multiple tests and then take an average reading. When I was debugging the issue with my laptop and the wireless connection it had to my main PC and router I was alternating tests between both machines and recording the times to note any difference.

Broadband providers never seem to deliver exactly what they promise, but if you are currently getting anything over 2Mbps you shouldn't be getting video streaming issues unless it's High Definition movies. Upload speeds will always be a lot less than download speeds, so don't expect equality between those two measurements. However, if like me you were getting periods of the day where your download speed measured less than 100Kbps, then there is definitely something wrong somewhere.

One thing you should remember when dealing with speeds on the net is that the measurements are different from those for disk space. 1Mb is one megabit and 1MB is one megabyte. You can always tell by the letter b: if it's capitalised then it's bytes and if it's lower case it's bits. Another thing to note is that a rate of one kilobyte per second (KBps) equals 1000 (not 1024) bytes per second.
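To make the arithmetic concrete, here is a quick bash sketch converting an advertised line speed in megabits per second into the kilobytes-per-second figure a download meter would show. It ignores protocol overhead, so real-world rates will be a little lower:

```shell
#!/bin/bash
# Convert an advertised line speed in Mbps (megabits per second)
# into KBps (kilobytes per second), using 1000 rather than 1024.
mbps_to_kBps() {
    # 1 megabit = 1,000,000 bits; 8 bits per byte; 1000 bytes per KB
    echo $(( $1 * 1000000 / 8 / 1000 ))
}

mbps_to_kBps 2     # a "2Mbps" line peaks at 250 KBps
mbps_to_kBps 20    # a "20Mbps" line peaks at 2500 KBps
```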

If your network problems are intermittent then you should download a tool like networx which allows you to monitor your bandwidth usage, show hourly, daily, monthly reports, set limits on usage and run diagnosis tools such as tracert and ping but in a visual manner.

Run the bandwidth monitoring tool throughout the day and run hourly speed tests; this should tell you whether your network problems happen at certain times of the day and provide you with evidence, which you can then download as an XLS, to give to your ISP when you contact them to complain.

Wireless Network Issues

If like me you use a laptop that is connected to the main router by a wireless connection then you should rule out problems with the wireless set-up. Run some pings from your PC to the wireless router to check for any issues but ensure that your router is set-up to accept ping requests first.

  • Make sure you have the latest firmware, software and drivers for your router, modem and network adaptor. Communications and hardware companies are always updating the software inside their devices so you should make sure you have the most up to date drivers and other software for your equipment. You should be able to download this from the manufacturer's website.
  • Tune your wireless access point. If you get substantially higher speeds when you connect directly to your broadband instead of using wireless networking, this can be due to interference from other Wi-Fi installations nearby, especially if you are in a city. Find out if there is a problem by plugging the network output from your broadband modem directly into the Ethernet port on your laptop or desktop and seeing if speeds improve. If so, try changing the channel of your wireless network: there'll be a setting in its configuration screen, which you can get to via your browser. Check your handbook for details of your router. You should also try moving your laptop around the house to see if you get a better or worse signal depending on where you are.
  • Make sure you are not getting electrical or radio interference from other devices in your house. Lots of gadgets including radios, media streamers, mobile phones and tools to send TV signals around the house use Wi-Fi and they're all sharing the same airwaves. Try turning off all electrical equipment to see if that improves the signal and then one by one turn them on again until you find the culprit. Even mains wiring that runs alongside telephone or network cables can cause a problem.
  • Whilst on the wireless network place your laptop right next to the main router and run some speed tests. If you are having issues with speed whilst directly next to the router then it may be a problem with the wireless router itself or the wireless internet card your PC or laptop is using.
TCP / IP Tuning

Computers are shipped with default TCP / IP settings that are designed to work with all network speeds, dial ups, DSL and Cable. This means that you can tweak various settings so that they are optimal for your computer.

There are various tools that can help you do this easily, such as TuneUp Utilities, or there are those such as DrTCP or TCP Optimizer that allow you to view and edit various settings such as your MTU (Maximum Transmission Unit, i.e maximum packet size) and RWIN (TCP Receive Window). Of the two, TCP Optimizer offers more configuration options, a registry editor and some tests to calculate your MTU correctly.

For those of you interested in what these values mean then the MTU is the maximum Ethernet packet size your PC will send. If a packet that is too large is sent then it will get split up into chunks (fragmented) and then re-assembled at the destination which obviously is not optimal. Therefore you want the MTU value to be the largest packet size that can be sent without becoming fragmented.

Unless otherwise set, Windows defaults MTU to 1500, or a lower value of 576 for external networks. 1500 is OK unless you are running PPPoE, want to use IPSec (Secure VPNs) or both, then it's too big. 576 is not efficient for the broadband/Internet as it's too small. For Windows VISTA users it's recommended to leave this value alone as apparently it does a pretty good job of automatically calculating these settings anyway.

You can calculate this yourself with the command prompt by doing the following tests.

Windows 2000/XP users:

ping -f -l 1472 www.google.com
(That is a dash lower case "L," not a dash "1." Also note the spaces in between the sections.)

Linux users:

ping -s 1472 www.google.com

OS X users:

ping -D -s 1472 www.dslreports.com

Linux and OS X commands are case sensitive.

Press Enter. Then reduce 1472 by 10 until you no longer get the "packet needs to be fragmented" error message. Then increase by 1 until you are 1 less away from getting the "packet needs to be fragmented" message again.

Add 28 more to this (since you specified ping packet size, not including IP/ICMP header of 28 bytes), and this is your MaxMTU.

If you can ping through with the number at 1472, you are done! Stop right there. Add 28 and your MaxMTU is 1500.

For PPPoE, your MaxMTU should be no more than 1492 to allow space for the 8 byte PPPoE "wrapper," but again, experiment to find the optimal value. For PPPoE, the stakes are high as if you get your MTU wrong, you may not just be sub-optimal, things like uploading files or web pages may stall or not work at all.

This example shows you how to do it by hand. If you downloaded the TCP Optimizer tool, go to the largest MTU tab and run the test. You will see that it does a similar test to the one below, but obviously it's automated to save you time.

C:\Documents and Settings\me>ping -f -l 1472 www.google.com

Pinging www-tmmdi.l.google.com [216.239.59.147] with 1472 bytes of data:

Packet needs to be fragmented but DF set.
Packet needs to be fragmented but DF set.
Packet needs to be fragmented but DF set.
Packet needs to be fragmented but DF set.

Ping statistics for 216.239.59.147:
Packets: Sent = 4, Received = 0, Lost = 4 (100% loss),

C:\Documents and Settings\me>ping -f -l 1462 www.google.com

Pinging www-tmmdi.l.google.com [216.239.59.147] with 1462 bytes of data:

Reply from 216.239.59.147: bytes=64 (sent 1462) time=33ms TTL=52
Reply from 216.239.59.147: bytes=64 (sent 1462) time=31ms TTL=52
Reply from 216.239.59.147: bytes=64 (sent 1462) time=33ms TTL=52
Reply from 216.239.59.147: bytes=64 (sent 1462) time=42ms TTL=52

Ping statistics for 216.239.59.147:
Packets: Sent = 4, Received = 4, Lost = 0 (0% loss),
Approximate round trip times in milli-seconds:
Minimum = 31ms, Maximum = 42ms, Average = 34ms

C:\Documents and Settings\me>ping -f -l 1463 www.google.com

Pinging www-tmmdi.l.google.com [216.239.59.147] with 1463 bytes of data:

Reply from 216.239.59.147: bytes=64 (sent 1463) time=32ms TTL=52
Reply from 216.239.59.147: bytes=64 (sent 1463) time=29ms TTL=52
Reply from 216.239.59.147: bytes=64 (sent 1463) time=30ms TTL=52
Reply from 216.239.59.147: bytes=64 (sent 1463) time=32ms TTL=52

Ping statistics for 216.239.59.147:
Packets: Sent = 4, Received = 4, Lost = 0 (0% loss),
Approximate round trip times in milli-seconds:
Minimum = 29ms, Maximum = 32ms, Average = 30ms

C:\Documents and Settings\me>ping -f -l 1465 www.google.com

Pinging www-tmmdi.l.google.com [216.239.59.147] with 1465 bytes of data:

Packet needs to be fragmented but DF set.
Packet needs to be fragmented but DF set.
Packet needs to be fragmented but DF set.
Packet needs to be fragmented but DF set.

Ping statistics for 216.239.59.147:
Packets: Sent = 4, Received = 0, Lost = 4 (100% loss),

There you go: the MTU is 1464 + 28 = 1492
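If you would rather script the hunt than step through it by hand, here is a rough bash sketch using the Linux ping syntax (-M do sets the Don't Fragment flag, -s the payload size). It binary-searches for the largest payload that gets through un-fragmented; the default host and the starting bounds are just examples:

```shell
#!/bin/bash
# Automate the MTU hunt (Linux ping syntax). Binary-search for the
# largest payload that survives with the Don't Fragment flag set,
# then add 28 bytes of IP/ICMP header to get the MTU.
HOST="${1:-www.google.com}"

probe() {
    # success if a DF-flagged packet with payload size $1 gets through
    ping -c 1 -M do -s "$1" "$HOST" > /dev/null 2>&1
}

find_max_payload() {
    local lo=500 hi=1472 mid
    probe "$hi" && { echo "$hi"; return; }   # 1500-byte MTU, done
    while [ $((hi - lo)) -gt 1 ]; do
        mid=$(( (lo + hi) / 2 ))
        if probe "$mid"; then lo=$mid; else hi=$mid; fi
    done
    echo "$lo"
}

# Uncomment to run against a live host:
# echo "MaxMTU = $(( $(find_max_payload) + 28 ))"
```

On the link above this would report a payload of 1464 and therefore a MaxMTU of 1492, the same answer as the manual pings.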

The other settings available in the TCP Optimizer tool are:

Tcp1323Opts
This parameter controls the use of RFC 1323 Timestamp and Window Scale TCP options. Explicit settings for timestamps and window scaling are manipulated with flag bits. Bit 0 controls window scaling, and bit 1 controls timestamps.

GlobalMaxTcpWindowSize
Description: While the TcpWindowSize parameter sets the receive window on a per-interface basis, this parameter sets a global limit for the TCP window size on a system-wide basis.

TCP Window size
This parameter determines the maximum TCP receive window size offered. The receive window specifies the number of bytes that a sender can transmit without receiving an acknowledgment. In general, larger receive windows improve performance over high-delay, high-bandwidth networks. For greatest efficiency, the receive window should be an even multiple of the TCP Maximum Segment Size (MSS). This parameter is both a per-interface parameter and a global parameter, depending upon where the registry key is located. If there is a value for a specific interface, that value overrides the system-wide value. See also GlobalMaxTcpWindowSize.

Contact your ISP

If you have cleaned and tuned your computer and browser and optimised all your settings to rule everything else out and you're still having problems related to network speed, then contact your ISP. Provide them with as much of the information you have gathered as possible to show that the problem is not related to your computer set-up. If you have intermittent speed issues, show them the charts from NetWorx that you can print out (by hour, by day) to demonstrate the problem. Do not give your ISP a chance to blame the issue on your own PC or setup, as most companies will try to get out of paying for something if they possibly can. You never know: they may offer you a new modem and raise you from 2Mbps to 20Mbps like they did to me. Funnily enough, as soon as the new modem was plugged in all my network issues were solved instantly!

Hopefully this article has been a good guide to performance tweaks, and remember: if you want to do it the easy way, purchase TuneUp Utilities as it could save you a lot of time, effort and heartache. I don't often recommend software to buy, but at only £29.99 you cannot really go wrong when compared with the amount of time you will save.





Monday, 21 December 2009

Add-On Support for Chrome

Google Chrome Plugins

Google Chrome has been my favourite browser for plain old web surfing since it came out, but one of the few downsides to this fast loading and stable browser is the lack of support for add-ons. This is one of the reasons FireFox has claimed such a major stake in the browser market: there are literally hundreds if not thousands of add-ons available to extend the browser's functionality.

Now Chrome has finally caught up, and if you are subscribed to Google's dev channel you will be pleased to know that there are already a hundred or so add-ons waiting to be installed on your favourite browser. I have already installed Flashblock and Adblock and they seem to be working well. There is a web developer toolbar, but its functionality is so cut down and basic at the moment that it's probably not worth getting.

Now if you are like me and have been working around the lack of plugins for Chrome by using bookmarklets then you will be pleased with this news. However if you are still interested in using bookmarklets which will work cross browser then here is another good one. It allows you to download Flash videos from YouTube at a click of a button so that you can watch them later on.

All you need to do is right click on the bookmarks bar (which should always be made visible) and then click "Add". Then in the Name you should put "YouTube Flash Downloader" or something similar and then for the URL you should paste in the following code:

javascript:window.location.href = 'http://youtube.com/get_video?video_id=' + yt.getConfig("SWF_ARGS")['video_id'] + "&sk=" + yt.getConfig("SWF_ARGS")['sk'] + '&t=' + yt.getConfig("SWF_ARGS")['t'];



This code has been updated to work with the recent changes in YouTube's API, so don't worry: I know there is an old version of this bookmarklet floating around the web that stopped working in November this year.

Now when you are on YouTube and a video starts playing you can just click on that bookmark link and the video should start downloading to your computer.

If you don't have a Flash player installed on your computer and want to play the FLV files in Windows Media Player then you can download an extension from the following location:


If you are interested in other cool bookmarklets then check out my previous article which contained one for viewing the generated source code of a page and one for dynamic DOM inspection.

Remember, the cool thing about bookmarklets as opposed to add-ons is that they should work in all modern browsers as they are pure JavaScript. That even means IE 6!!

Also, before everyone gets carried away installing lots of add-ons for Chrome, just take a step back and remember why you're using Chrome in the first place. For me it's because FireFox, which I use for all my web development, has so many add-ons installed that it's become very slow to load, and many of the add-ons (e.g. Firebug) can cause slow page loads or errors. Therefore if you're like me and use many browsers, keep the add-ons to a minimum and keep your web browsing fast.

I will shortly be writing an article about increasing speed and performance for all the major browsers but as a teaser I would give these pointers to quicker browsing.

  • Remove all add-ons that you never use anymore and disable those you rarely use.
  • Add-ons that will make your web surfing quicker such as FlashBlock, AdBlockPlus and NoScript are all good. Only enable those Flash movies and adverts if you really need to.
  • Turn off inbuilt RSS readers and use a special app if you require it.
  • Turn off all 3rd party cookies but keep Session cookies enabled.
  • Trawl through those preference settings and disable anything you do not use.
  • Disable link pre-fetching.
  • Regularly clear your cache, history and auto-complete data.
Those are some very simple tips for increasing speed. I will go into more detail about the user preferences, http.pipelining, max-connections and TCP/IP settings another day.

Friday, 28 August 2009

Firebug So So Slow

Developer Toolbars Slowing Sites Down

Following on from my whinge the other day about how IE 8.0's developer toolbar causes pages to hang and the CPU to max out (50% on a dual core), I have noticed similar problems on a few sites since upgrading to Firefox 3.5.2 and Firebug 1.4.2.

I have just been to my football site www.hattrickheaven.com and was clicking some of the leagues down the sidebars and wondering why the pages were taking so long to load. I then went to Chrome and tried the same links and the pages loaded as fast as lightning. This made me wonder about Firebug and, lo and behold, disabling Firebug stopped the delay and the pages loaded very quickly. I would like some other people to try the same thing if they have Firebug and let me know the results, e.g. try these links with Firebug enabled and with it disabled:



With Firebug disabled they should load pretty quickly, however with Firebug enabled it's taking anything up to 15 seconds! Also check the Task Manager and see what CPU level Firefox shows, as mine shows 50%.

I don't know if this is something to do with something within my page that Firebug cannot handle but the only validation errors I have are missing ALT attributes.

So as a tip for slow loading sites I would suggest in both IE and Firefox disabling the developer toolbar and Firebug and seeing if that helps speed up the load times. It seems to have worked for a number of sites now which is a shame as both toolbars should be great add-ons if only they could work without slowing everything down.

Tuesday, 9 September 2008

SQL Performance Top Tips

Another SQL Performance Top Tips?

I know there are lots of Top Tips for improving SQL performance out there on the web but you can never have too much of a good thing so I have created my own list of issues to identify and resolve when trying to improve the performance of an SQL database.


1. Investigate and resolve dead locks or blocked processes.


All it takes is one blocked process on a commonly hit table for your whole site to hang and start reporting timeout issues.

On my jobboard sites I had locking issues with my main JOBS table that had to be joined to a number of other tables to get the data needed on results pages that were hit constantly by crawlers. These tables were also being updated constantly by clients and automated feeds from multi-posters which meant that locks were likely.

As I use one database for multiple sites it also meant that a lock caused by one site would cause timeout issues for the other sites.

To resolve this I created some de-normalised flat tables per site that contain all the fields needed for reporting and searching which meant:

  • The JOB table is only updated at regular intervals and not instantly. SQL 2005 introduced the synonym feature which I demonstrate here. This is used to swap between tables being built and tables in use by the site so that a new table can be built behind the scenes with up to date data.
  • Clustered indexes can be site specific depending on the columns the site searches on most.
  • No joins needed when accessing the table and results are returned very fast.
  • Any problems accessing this table do not affect other sites.
Another trick to handle instances where you are selecting from tables that maybe updated constantly such as job or banner hit tables is to look at the following:
  1. If the table you are selecting from is never updated or deleted from during the times you are selecting from then you can use the WITH (NOLOCK) statement on your SQL SELECTS. You won't have to worry about dirty reads as you are not actually updating the data and you bypass all the overhead SQL has to go through to maintain the LOCKING.
  2. Use the LOCK_TIMEOUT statement to reduce the time LOCKS are held for.
  3. Use the TRY / CATCH statement to catch deadlocks and other blocking errors in combination with a retry loop. After X retries return from the proc. View my example of using LOCK_TIMEOUT to handle DEADLOCKS and BLOCKING.
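The retry approach above can be sketched roughly like this (SQL 2005 syntax; the table, columns and error-handling policy are illustrative assumptions, not code from a real system):

```sql
-- Rough sketch: LOCK_TIMEOUT plus a TRY / CATCH retry loop.
-- dbo.JOBS and its columns are invented for this example.
SET LOCK_TIMEOUT 500 -- give up waiting for a lock after 500ms

DECLARE @Retries INT
SET @Retries = 0

WHILE @Retries < 3
BEGIN
    BEGIN TRY
        SELECT JobID, JobTitle
        FROM   dbo.JOBS
        WHERE  CategoryID = 10

        BREAK -- success, so leave the retry loop
    END TRY
    BEGIN CATCH
        -- 1222 = lock request timed out, 1205 = chosen as deadlock victim
        IF ERROR_NUMBER() IN (1222, 1205)
            SET @Retries = @Retries + 1
        ELSE
        BEGIN
            -- any other error is re-raised and we stop retrying
            DECLARE @Msg NVARCHAR(2048)
            SET @Msg = ERROR_MESSAGE()
            RAISERROR(@Msg, 16, 1)
            BREAK
        END
    END CATCH
END
```

After three failed attempts the loop simply falls through; a real procedure would also want to report that it gave up.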



2. Look into large batch processes that have long run times, take up I/O, have long wait times or cause blocking.

A good example is where you have to transfer lots of records from one table to another e.g from a daily table that only receives INSERTS (e.g job / banner hits) into a historical table for reporting.

If you are deleting large numbers of records from a table that is also being used then you don't want to cause blocks that will freeze your site. In conjunction with point 1 you should look into how you handle your DELETE / UPDATE statements so that they are done in small BATCHES using the SQL 2005+ TOP command.

Read this article for an example of updating or deleting in batches to prevent blocking.
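As a rough sketch of the batching idea, assuming a hypothetical dbo.JOB_HITS table and an arbitrary 30-day retention period:

```sql
-- Delete in small batches so locks are short-lived and other
-- processes can get in between each batch. Names are illustrative.
DECLARE @RowsDeleted INT
SET @RowsDeleted = 1

WHILE @RowsDeleted > 0
BEGIN
    DELETE TOP (5000)
    FROM dbo.JOB_HITS
    WHERE HitDate < DATEADD(day, -30, GETDATE())

    SET @RowsDeleted = @@ROWCOUNT

    -- a brief pause between batches gives other queries a chance at the table
    WAITFOR DELAY '00:00:01'
END
```

The batch size of 5000 is a guess to tune against your own system: small enough to keep each transaction short, large enough not to drag the whole job out.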

To find out problematic queries after the fact you can utilise the SQL 2005+ Dynamic Management Views (DMVs) which hold key information about the system.

Read this article on my top queries for performance tuning to help identify those queries that need tuning.


3. Re-Index and De-Frag tables

Make sure you regularly re-index and de-fragment your tables, as over time fragmentation will build up and performance will be affected. I tend to set up a weekly MS Agent job that runs a de-frag / re-organize on tables with fragmentation over a set percentage, as well as updating index statistics.

I then also try to schedule a full re-index of all my main tables once a quarter at scheduled down times, as during a full rebuild the tables can be taken offline depending on your SQL setup.
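A hedged sketch of the kind of weekly check such a job can run, using the SQL 2005+ DMVs. The 5% / 30% thresholds are common rules of thumb rather than fixed rules, and the index and table names in the ALTER INDEX examples are hypothetical:

```sql
-- List fragmented indexes in the current database with a suggested action.
SELECT  OBJECT_NAME(ips.object_id) AS TableName,
        i.name                     AS IndexName,
        ips.avg_fragmentation_in_percent,
        CASE
            WHEN ips.avg_fragmentation_in_percent > 30 THEN 'REBUILD'
            WHEN ips.avg_fragmentation_in_percent > 5  THEN 'REORGANIZE'
            ELSE 'OK'
        END AS SuggestedAction
FROM    sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') ips
JOIN    sys.indexes i
  ON    i.object_id = ips.object_id AND i.index_id = ips.index_id
WHERE   ips.index_id > 0 -- skip heaps
ORDER BY ips.avg_fragmentation_in_percent DESC

-- Then, for a flagged index (name is made up), an online-friendly tidy up:
ALTER INDEX IX_JOBS_CreateDate ON dbo.JOBS REORGANIZE

-- Or at a scheduled down time, a full rebuild:
ALTER INDEX ALL ON dbo.JOBS REBUILD WITH (FILLFACTOR = 90)
```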


4. Identify Slow Queries

Investigate your slowest running queries to make sure they are covered adequately by indexes but also try not to over use indexes. Try to cover as many queries with as few indexes as possible.

Try running this SQL performance report which will identify a dozen or so areas which could be improved on your system including:


  • Causes of the server waits
  • Databases using the most IO
  • Count of missing indexes, by database
  • Most important missing indexes
  • Unused Indexes
  • Most costly indexes (high maintenance)
  • Most used indexes
  • Most fragmented indexes
  • Most costly queries, by average IO
  • Most costly queries, by average CPU
  • Most costly CLR queries, by average CLR time
  • Most executed queries
  • Queries suffering most from blocking
  • Queries with the lowest plan reuse

This is an invaluable tool for any SQL DBA or SQL performance tuning developer.



5. Split tables

If your tables are used for inserting/updating as well as selection then for each index you have on a table that's an extra update required when a record is saved. On some of my big tables that are used heavily for reporting I split the data into daily and historical data. The daily table will allow updates but the historical table will not.

At night a job will transfer the daily data into the historical table, dropping all current indexes, populating the new data and then rebuilding the indexes with a 100% fill factor. You need to balance out whether speed on data retrieval or on update is more important if the table is used for both. You should also look into points 1 and 2 about managing blocking and batch updates/deletes to handle instances where a table is being deleted whilst accessed at the same time.
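A very rough sketch of that nightly transfer might look like this. All table and index names are invented for illustration, and a real job would want error handling and may prefer the synonym-swap approach from point 1:

```sql
-- Move the day's rows into the read-only history table.
BEGIN TRANSACTION

-- drop the non-clustered index so the bulk insert runs quickly
DROP INDEX IX_HITS_HISTORY_Date ON dbo.HITS_HISTORY

INSERT INTO dbo.HITS_HISTORY (JobID, HitDate, Referrer)
SELECT JobID, HitDate, Referrer
FROM dbo.HITS_DAILY

-- clear the daily table ready for tomorrow's inserts
TRUNCATE TABLE dbo.HITS_DAILY

COMMIT TRANSACTION

-- rebuild with a 100% fill factor, as the history table is
-- read-only until the next transfer so page splits are not a concern
CREATE INDEX IX_HITS_HISTORY_Date
ON dbo.HITS_HISTORY (HitDate)
WITH (FILLFACTOR = 100)
```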


6. Re-write complex queries and make WHERE clauses SARGABLE

Can you rewrite complex queries to be more efficient? Can you remove LEFT JOINs to large tables by populating temporary tables first? Are you putting functions on the column in the WHERE or HAVING clause and negating any index usage? If so, can you rewrite the clause to make it SARGABLE so that the index can be used? For example, a WHERE clause like this:

WHERE DateDiff(day, CreateDate, GetDate()) = 0

which is filtering by today's date, should be rewritten as:

WHERE CreateDate >= '2008-sep-09 00:00:00' -- midnight at the start of today

Put the calculated midnight value into a variable before the SELECT and then use that in the filter, so that no function wraps the column and any index on CreateDate will be available.
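As a sketch, the variable approach keeps the clause SARGABLE (dbo.JOBS and its columns are assumptions for the example):

```sql
-- Work out midnight at the start of today once, up front.
DECLARE @Midnight DATETIME
SET @Midnight = DATEADD(day, DATEDIFF(day, 0, GETDATE()), 0)

-- The column is now compared bare, so an index on CreateDate can be seeked.
SELECT JobID, JobTitle
FROM   dbo.JOBS
WHERE  CreateDate >= @Midnight
```

The DATEADD / DATEDIFF trick floors the current date-time to midnight without any string conversion, so it works whatever the server's date format settings are.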

Read this article of mine which proves the benefits of SARGABLE clauses in SQL.


7. Query Plan Caching

Check that you are benefiting from query plan caching. If you are using stored procedures that contain branching through use of IF statements that run different SQL depending on the values passed in by parameters then the cached query plan for this procedure may not be the best for all parameter variations.

You can either rewrite the procedure so that each IF branch calls another stored procedure that contains the query required as it will be this plan that gets used.

Or you could rewrite the query to use dynamic SQL so that you build up a string containing the appropriate syntax and parameters and then execute it using sp_executesql, which will take advantage of plan re-use. Do not use EXEC as this will not take advantage of plan re-use and it's not as safe in terms of SQL injection.
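A minimal sketch of the sp_executesql approach (the table and parameter names are invented):

```sql
-- Build the statement once with a named parameter rather than a literal,
-- so the cached plan is re-used whatever value is passed in.
DECLARE @SQL NVARCHAR(1000)

SET @SQL = N'SELECT JobID, JobTitle
             FROM dbo.JOBS
             WHERE CategoryID = @CategoryID'

EXEC sp_executesql
     @SQL,
     N'@CategoryID INT', -- parameter definition list
     @CategoryID = 10    -- parameter value for this call
```

Because the value travels as a parameter rather than being concatenated into the string, this also closes off the SQL injection risk that plain EXEC string-building carries.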

You should also look into whether your query plans are not getting used due to your SQL Server settings as the default mode is to use SIMPLE PARAMETERIZATION and not FORCED PARAMETERIZATION.

This feature takes ad hoc queries containing literal values and replaces those values with parameters. This means that the cached query plan can be re-used for similar queries that have different values, which can aid performance as it reduces compilation time.

When the system is set to SIMPLE mode, only ad hoc query plans that contain the same literal values get cached and re-used, which in many cases is not good enough as the values for most queries change all the time. For example, in SIMPLE mode only the plan for the exact query below will be cached.


SELECT *
FROM PRODUCTS
WHERE ProductID = 10

This means only queries looking for the ProductID of 10 will benefit from the cached plan. However, with FORCED PARAMETERIZATION enabled you would have a plan with parameters so that any ProductID can benefit from it, e.g.


SELECT *
FROM PRODUCTS
WHERE ProductID = @1 -- name of SQL parameter


For a more detailed look at the benefits of this method read my article on forced parameterization.
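The setting itself is a one-line database option (the database name here is obviously a placeholder, and the change affects plan caching for every ad hoc query, so test it first):

```sql
-- Switch the database to forced parameterization (SQL 2005+).
ALTER DATABASE MyJobBoard SET PARAMETERIZATION FORCED

-- And to revert to the default behaviour:
ALTER DATABASE MyJobBoard SET PARAMETERIZATION SIMPLE
```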


8. Use the appropriate table for the situation

SQL has a variety of tables from fixed permanent tables to global and local temporary tables to table variables that are both stored in tempdb.

I have found a number of times now that the use of table variables start off being used in stored procedures as very useful memory efficient storage mechanisms but once the datasets stored within them rises above some threshold (I have not found the exact threshold amount yet) the performance drops incredibly.

Whilst useful for array like behaviour within procs and quick for small datasets they should not be used with record sets of many thousands or millions of records. The reason being that no indexes can be added to them and any joins or lookups will result in table scans which are fine with 10-100 records but with 20 million will cause huge problems.

Swapping table variables for either fixed tables or temporary tables with indexes should be considered when the size of data is too great for a table variable.
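A hedged sketch of that swap, with invented names, might look like:

```sql
-- A temporary table can carry indexes, unlike a table variable,
-- so joins and lookups against it can seek instead of scan.
CREATE TABLE #JobResults
(
    JobID    INT NOT NULL,
    JobTitle NVARCHAR(100),
    Salary   MONEY
)

CREATE CLUSTERED INDEX IX_JobResults_JobID ON #JobResults (JobID)

INSERT INTO #JobResults (JobID, JobTitle, Salary)
SELECT JobID, JobTitle, Salary
FROM dbo.JOBS
WHERE Salary > 30000

-- ...joins against #JobResults here can now use the clustered index...

DROP TABLE #JobResults
```

Note the CREATE and DROP are the DDL statements the next paragraph warns about: a plain read-only website login will not have permission to run them.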

If your procedures are being called by website logins with execute permission you will need to impersonate a login with higher privileges to allow for the DDL statements CREATE and DROP as a basic website login should have nothing more than read privileges to prevent SQL injection attacks.

If you don't want to risk the rise in privileges then consider using a fixed table and the use of a stamp and unique key so that multiple calls don't clash with each other. Appropriate indexes should be used to ensure speedy data retrieval from these tables.

A real world example I have found is explained in this article on speeding up batch processes that use table variables.


9. Tuning for MySQL

MS SQL has a vast array of useful tools to help you performance tune it, from its Dynamic Management Views and Activity Monitor to its detailed Query Execution Plans. However, if you are using MySQL and have to suffer a tool like NAVICAT then you are not in such a good position to tune your queries.

The EXPLAIN option is nowhere near as useful as the MS SQL Query Execution Plan but it can be used to ensure the appropriate indexes are added if missing and the Slow Query Log is a useful place to identify problematic queries.

If you are using LINUX or Wordpress and running your system on MySQL then read this article of mine on performance tuning for MySQL.


These are just some tips to look at when trying to improve back end database performance and I will add to this list when I get some more time. Feel free to post your own tips for identifying problematic SQL and then fixing it.

Monday, 8 September 2008

Top 10 Tips for improving ASP performance

Top 10 Tips For Speeding up ASP Classic Sites

1. Look for nested loops and replace them with SQL stored procs. It's surprising how many times I have looked at sites that, when outputting product hierarchies or menus on those old catalogue sites, use nested loops to build up the content. It's usually done by people who got into programming on the web side with little or no SQL knowledge, which was then learnt later only as and when they required somewhere to persist the data. Replace nested loops with SQL Stacks (see BOL) or, in 2005, a recursive CTE. From tests I have found the Stack to perform slightly faster than the CTE but both will outperform nested loops. The data will most likely be in an adjacency table anyway so it's just the method of outputting it. I remember replacing one such beast of nested loops with a stored procedure stack and it increased performance by a factor of 20.

2. Re-use those objects. Do you have helper functions that carry out regular expressions or file system object actions and that instantiate the object within the function? If you call these functions more than once on a page you are creating an unnecessary overhead. Have a global include file where you declare some often used variables. Then, the first time you call the function, initialise the object if it doesn't exist, otherwise use the existing object pointer. Then destroy them all in your global footer.

3. Simplify any complex regular expressions by breaking them down into multiple parts. I have had problems with badly written regular expressions causing CPU on our webserver to max at 100%. On our quad it would jump to 25%, four hits at the same time and 100% and then no other pages would load. Also complex expressions that do lots of lookbacks will cause overhead and the longer the string you are testing against the more overhead you will get. Remember VBs Regular Expression engine is not as good as some others so its better to break those long complex expressions up into smaller tests if possible.

4. String concatenation. In ASP.NET you have the string builder class but in ASP classic you should create your own using Arrays to store the string and then do a JOIN at the end to return it. Doing something like the following

For x = 1 to 100
strNew = strNew & strOtherString
Next
may work okay if the size of the string in strOtherString is small and you have few iterations, but once those strings get larger the amount of memory used to copy the existing string and then append to it grows rapidly (the cost is quadratic in the number of concatenations).

5. Try to ReDim arrays as few times as possible, especially if you are doing a Preserve. It's always best to Dim to the exact size beforehand, but if you don't know how big your array will be then Dim it to a large number first and use a counter to hold how many elements you actually add. Then at the end you ReDim Preserve once, back down to the number of elements you actually used. This uses a lot less memory than having to ReDim Preserve on each addition to the array.

6. Use stored procedures for complex SQL and try to reduce database calls by calling multiple SQL together and then using NextRecordset() to move to the next set of records when you need to. Using stored procedures will also reduce network traffic as its much easier to connect to the DB once and pass through the names of 3 stored procedures than connect 3 times passing through the X lines of SELECT statements that each recordset requires.
strSQL = "EXEC dbo.usp_asp_get_records1; EXEC dbo.usp_asp_get_recorddetails; EXEC dbo.usp_asp_get_country_list;"

Set objRS = objConnection.Execute(strSQL)

' first recordset
If Not (objRS.BOF And objRS.EOF) Then
    arrRecords = objRS.GetRows()
End If
Set objRS = objRS.NextRecordset() ' always advance, even if the recordset was empty

' second recordset
If Not (objRS.BOF And objRS.EOF) Then
    arrRecordDetails = objRS.GetRows()
End If
Set objRS = objRS.NextRecordset()

' third recordset
If Not (objRS.BOF And objRS.EOF) Then
    arrCountryList = objRS.GetRows()
End If
7. Make sure that you don't put multiple conditions in IF statements. ASP doesn't support short-circuiting, which means it evaluates every condition in an IF statement even if the first one is false. So rewrite code such as:

If intCount = 1 AND strName = "CBE" AND strDataType = "R" Then
'Do something
End If

should be rewritten as nested IF statements:

If intCount = 1 Then
If strName = "CBE" Then
If strDataType = "R" Then
'Do something
End If
End If
End If
Also make sure you evaluate conditions in the correct order. If you are checking a value for True and False then you don't want to waste an extra check when you can just change the order of the conditions.

If Not (bVal) Then
    Response.Write("NO")
Else
    Response.Write("YES")
End If

should obviously be:

If (bVal) Then
    Response.Write("YES")
Else
    Response.Write("NO")
End If
8. Cache as much data as possible if it doesn't change frequently. If you have lists that come from a database but never change or very infrequently change then you could either store them in memory and reload every so often or write the array to a file and then use that static array and only re-create the file when the data changes. I always add an admin only function into my sites that on the press of a button will reload everything I need to into files and cache. This prevents un-required database lookups.

9. Pages that contain database content that's added through the website that only changes when the user saves that content can be created to static HTML files if possible. Rather than rebuilding the page from the DB each time someone views it you can use the static version. Only rebuild the page when the user updates the content. For example a lot of CMS systems store the content within the DB but create a static version of the page for quick loading.

10. Encode and format content on the way into the database, not each time it's output to be displayed. It's more cost efficient to do something once on the way in than hundreds of times on the way out.

These are just a few of the things I have found that can be very beneficial in speeding up a classic ASP site on the web side of things. As well as any performance tweaks that can be done application side, you should take a look at the database to make sure that's performing as well as possible. If your site is data driven then a lot of the performance will be related to how quickly you can build those recordsets up and deliver them to the client.