Friday 30 December 2011

New form of web site DoS attack leaves millions vulnerable

A new security vulnerability leaves most popular websites susceptible to DoS (denial of service) attacks.

In a presentation titled "Efficient Denial of Service Attacks on Web Application Platforms",
Alexander “alech” Klink and Julian “zeri” Wälde explained in detail how most web programming languages implement hash tables and manage collisions.

We are not talking about cryptographic hashing here but the everyday sort of hashing that lets us store data in key/value type array objects. A simple mathematical hash function is used to speed up storing and retrieving data, and most web programming languages like PHP and JavaScript use exactly this technique to hold the parameters posted to web pages.
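To make that concrete, here is a rough sketch of the kind of hash function involved. PHP uses a multiply-by-33 scheme known as DJBX33A; the version below is my own simplified illustration of that idea, not PHP's actual implementation.

```javascript
// Toy DJBX33A-style hash: multiply the running value by 33 and add each
// character code, then pick a bucket. Illustrative sketch only.
function hashKey(key, numBuckets) {
  var h = 5381;
  for (var i = 0; i < key.length; i++) {
    h = ((h * 33) + key.charCodeAt(i)) >>> 0; // keep as unsigned 32-bit
  }
  return h % numBuckets; // bucket index where the key/value pair is stored
}

// Different keys normally land in different buckets, so lookups are fast:
hashKey("ProductName", 8); // some bucket between 0 and 7
hashKey("Version", 8);     // usually a different bucket
```

The attack works because this function is completely predictable: anyone can precompute thousands of strings that all land in the same bucket.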

A very simple example, which doesn't cover all the ways this technique is used and can be exploited, would be the following. I know this is a client side JavaScript hash, so it would cause more problems for the browser and the user's computer than for the webserver, but as you can run arbitrary JavaScript on any page very easily, and XSS hacks are very common nowadays, it is still worth showing.

Say I had a client side hash that held some simple values about the system the site was running on. These key/values hold different bits of information that are relevant to the object I am using, and if users want to obtain a piece of information they supply the key for the value they want.

var System = {
 ProductName: "MySystem",
 Version: 3,
 Build: 3,
 Company: "My Company PLC",
 ServerType: "dev",
 LastBuildDate: "2011-Dec-24 12:45:21"
};


You can get a value out of the hash table by accessing its key. In this example, if I wanted to find out the current version of the System I would do something like this.

if( System.Version < 3){
 alert("Please upgrade to the latest version");
}

Not a very big hash table but in some cases hash tables can be absolutely huge.

The problem comes when a hacker posts a large number of different keys that have been deliberately crafted so they all compute to the same hash value.

Instead of the entries being spread evenly across the table, they all pile up in a single bucket, so every insert or lookup has to be compared against every key already stored there. In the report the authors say:

"An example given showed how submitting approximately two megabytes of values that all compute to the same hash causes the web server to do more than 40 billion string comparisons."

This is obviously a huge amount of calculation just to parse the data posted to a webpage, and it is a massive overhead that can basically grind the server to a halt.
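A quick back-of-the-envelope calculation shows where a number like 40 billion comes from. Inserting the i-th colliding key requires comparing it against the i-1 keys already in the bucket, so n colliding keys cost about n*(n-1)/2 comparisons in total. The 7 bytes per key below is my own assumption, not a figure from the talk.

```javascript
// Rough check of the 40 billion figure from ~2MB of POST data.
var postBytes = 2 * 1024 * 1024;             // ~2MB of POST data
var bytesPerKey = 7;                         // assumed: short key plus separator
var n = Math.floor(postBytes / bytesPerKey); // roughly 300,000 colliding keys

// Total string comparisons when every key lands in the same bucket:
var comparisons = n * (n - 1) / 2;
console.log(comparisons); // on the order of 4.5e10, i.e. tens of billions
```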

You can see from my example that if all the keys were crafted to collide, then storing or looking up System.Version would involve checking every key in the System object one by one, because the hash no longer narrows the search down to a single bucket. Each lookup degrades from a quick constant-time operation into a full linear scan (and in this case it would be the browser doing the work rather than the server, as it's client side code).

As the authors say, only 2MB of carefully chosen values can cause tens of billions of string comparisons, which would slow the machine down no end.
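To see the slowdown in miniature, here is a toy chained hash table that counts its string comparisons. The structure and names are my own sketch; real language runtimes differ in the details, but the quadratic blow-up is the same.

```javascript
// A tiny chained hash table that counts string comparisons. With a
// deliberately bad hash that sends every key to bucket 0, each insert
// must walk the entire existing chain before appending.
function ChainedTable(numBuckets, hashFn) {
  this.buckets = [];
  for (var i = 0; i < numBuckets; i++) this.buckets[i] = [];
  this.hashFn = hashFn;
  this.comparisons = 0;
}
ChainedTable.prototype.set = function (key, value) {
  var chain = this.buckets[this.hashFn(key)];
  for (var i = 0; i < chain.length; i++) {
    this.comparisons++; // one string comparison per entry in the chain
    if (chain[i].key === key) { chain[i].value = value; return; }
  }
  chain.push({ key: key, value: value });
};

// 1,000 keys that all "collide" (the hash function always returns 0):
var evil = new ChainedTable(8, function () { return 0; });
for (var k = 0; k < 1000; k++) evil.set("key" + k, k);
console.log(evil.comparisons); // 499500, i.e. 1000*999/2 comparisons
```

Just 1,000 colliding keys already cost half a million comparisons; scale that up to hundreds of thousands of keys and you reach the billions quoted in the talk.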

Apparently Perl already fixed this vulnerability some time back, by randomising its hash function in 2003, but no-one else has yet followed their lead, and hopefully it won't take a few big sites going down before it is fixed across the board.

Without fixing the hashing functions in the languages themselves, there are three techniques available to website operators to prevent problems occurring.
  • Reduce the length of parameters that can be posted.
  • Reduce the number of parameters accepted by the web application framework.
  • Limit the amount of CPU time that any given thread is allowed to run for.
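As a sketch of the second option, a site could count the parameters in a form-encoded body and reject oversized requests before the expensive parsing even starts. The MAX_PARAMS limit and function names below are my own choices for illustration, not part of any framework.

```javascript
// Cap the number of POST parameters before handing the body to the
// framework's parameter parser. MAX_PARAMS is an arbitrary example limit.
var MAX_PARAMS = 1000;

function countParams(body) {
  // form-encoded bodies separate parameters with '&'
  return body.length === 0 ? 0 : body.split("&").length;
}

function safeParse(body) {
  if (countParams(body) > MAX_PARAMS) {
    throw new Error("413: too many parameters"); // reject before parsing
  }
  // ...hand the body off to the normal parameter parser here...
  return true;
}
```

Counting separators is cheap (a single linear pass), so this check costs almost nothing compared with the quadratic work it prevents.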

Microsoft have released a fix less than 24 hours after disclosure. ASP.NET admins can download patch MS11-100 to protect their IIS web assets; you can find out more about this patch from blogs.technet.com
