Tuesday, December 15, 2009

Return of the Ping of Death

An old exploit recently made a return to the Linux kernel. Send a sufficiently large ICMP packet to a vulnerable system and it will crash, causing a denial of service. The exploit, known as the Ping of Death, works because the maximum allowed size of an IP packet (and hence of the ICMP message it carries) is 65,535 bytes. It is possible to send a larger packet by fragmenting it. The receiving system reassembles the fragments on arrival and, if it is vulnerable, the resulting payload is bigger than the buffer allocated to receive it, causing an overflow that can crash the system.
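The arithmetic behind the overflow can be sketched in a few lines of Python (purely illustrative, not an exploit; the 13-bit fragment offset field, counted in 8-byte units, comes from the IP header format):

```python
# Ping of Death arithmetic (illustrative only).
# An IP packet may legally be at most 65,535 bytes. The fragment offset
# field is 13 bits wide and counts 8-byte units, so the final fragment
# can claim an offset of up to (2**13 - 1) * 8 = 65,528 bytes.
MAX_IP_PACKET = 65535
MAX_FRAG_OFFSET = (2 ** 13 - 1) * 8   # 65528

# A final fragment at the maximum offset carrying more than 7 bytes of
# payload pushes the reassembled packet past the legal limit.
last_fragment_payload = 1480          # a typical Ethernet-sized payload
reassembled = MAX_FRAG_OFFSET + last_fragment_payload

print(reassembled)                    # 67008
print(reassembled > MAX_IP_PACKET)    # True: overflows a fixed 65,535-byte buffer
```

A naive reassembly routine that allocates a 65,535-byte buffer and trusts the offset field is exactly what the attack exploits.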

The attack first made its appearance in the 1990s. It was particularly effective as it was even possible to bypass firewalls by spoofing the source IP address.

Back then there were plenty of exploits that worked with ICMP. Using a broadcast address for either source or destination was a particularly good way of causing denial of service just by generating large amounts of traffic. As internet access was typically dialup of around 56Kbps and corporate wide area networks weren’t much faster, a lot of damage could be done.

In theory, these kinds of exploits, known as Smurf attacks, can't really happen any more, as systems are configured not to respond to broadcasts and routers are set not to forward packets directed to a broadcast address. For old times' sake, I gave my systems a test using hping and found this to be the case: the broadcast pings went unanswered.
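As a reminder of why directed broadcasts were so dangerous, here is a small Python sketch of the amplification involved (the 192.168.1.0/24 network is just an example):

```python
import ipaddress

# In a Smurf attack, a single ICMP echo request spoofed from the victim's
# address and sent to a network's directed broadcast address could elicit
# a reply from every live host on that segment.
net = ipaddress.ip_network("192.168.1.0/24")   # example network

print(net.broadcast_address)    # 192.168.1.255
# Potential responders: all usable host addresses on the segment.
print(net.num_addresses - 2)    # 254 -- up to a 254x amplification factor
```

On a dialup or 56 Kbps-era WAN link, multiplying your traffic a couple of hundred times over was more than enough to saturate the victim.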

Monday, December 14, 2009

Thunderbirds are Go for French Military

Something that caught my eye on Slashdot this morning was the story about the French military adopting Thunderbird as their mail client. I'm doing a bit of work comparing the security features of Linux and Windows and, by extension, open source vs. closed source. An early conclusion is that the merits and defects of each approach are very subjective; indeed, any attempt at rational debate on the subject tends to descend into a slanging match between the different camps. What is interesting about the Thunderbird story is that it shows the French consider the open source mail client secure enough for military use.

It would be naive to think the only selection criteria used was security; indeed the French government has a policy of seeking "maximum technological and commercial independence" for all its software. However, one would hope that security was a major factor in the selection process.

Further reading suggests that a plus point of Thunderbird for the French Military was that it allowed them to develop their own security extensions. Being able to review the original source code was also advantageous.

Tuesday, December 8, 2009

WPA Cracker

There was an interesting article on The Register this morning about a new cloud-based service that lets you brute-force wireless WPA passwords. The service, run by Moxie Marlinspike of null-byte-prefix fame, claims it can compare your key against a 135-million-word dictionary, optimised for WPA passwords, in around 20 minutes. It achieves this speed by spreading the load over a 400-CPU cloud cluster.

Although the figures are impressive, the service falls well short of guaranteeing that it can crack your WPA password (note, it doesn't claim it can). For an 8-character key drawn from upper case, lower case and digits there are 2.18 × 10^14 possible combinations, rising to 4.77 × 10^28 for 16 characters. Hence the chance of the service finding your password depends on how closely it resembles a dictionary word.
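Those figures are easy to check (the 20-minute dictionary run time is as claimed by the service; everything else follows from the character set):

```python
# Upper case + lower case + digits gives 26 + 26 + 10 = 62 characters.
charset = 26 + 26 + 10

print(charset ** 8)               # 218340105584896, i.e. ~2.18 x 10^14
print(f"{charset ** 16:.2e}")     # 4.77e+28

# The service checks 135 million words in about 20 minutes. At that
# rate, exhausting even the full 8-character space would take decades:
batches = charset ** 8 / 135_000_000
years = batches * 20 / (60 * 24 * 365)
print(round(years))               # 62
```

Which is precisely why it runs a dictionary rather than attempting an exhaustive search.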

Of course, in reality your WPA key almost certainly does resemble a dictionary word. If you want to make it safer but still memorable, increase its length as discussed here.

If you want to test the strength of your WPA password, you need to capture the WPA handshake using something like Aircrack-ng, then submit it to the site and hand over $17.

Friday, December 4, 2009

Practical Password Management

I did a quick count today of the passwords that I use at least once per month and was surprised to find that I have 41. I appreciate that this is higher than the average but suspect that anyone who uses a PC for work and does a bit of online shopping or banking is at least in double figures. It’s all very well security specialists (like me) telling us to use different complex passwords for each account and to change them regularly but how the hell can you remember 41 passwords? What I imagine most people do is to use the same password for all accounts and change them as infrequently as possible. This is obviously far from ideal so a compromise needs to be found.

The most important password is the one for your email. Nearly every other account relies on your email address in some form to reset a forgotten password, so a unique, strong password here is vital. After that, it is a case of assessing the importance of the data held in each account. If it holds only limited personal information you can get away with your generic password, although that should still be difficult to guess.

Another option is to use a password management tool. The idea is that you keep a secure encrypted database of all your passwords, protected by a strong passphrase, which means you only need to remember one password. I use Ubuntu Linux, which comes with just such a program called Seahorse; it can also be used to manage PGP keys and certificates. More recent versions of Windows come with Credential Manager, which is fine in a Windows-centric world but isn't much use for storing passwords where the authentication is built into the end application. An open source utility that I use for password management on Windows, but which also works on Linux and possibly Mac OS X, is Password Safe. There are many programs of this type available for free, but this one is particularly easy to use and so far has never let me down.
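Once a manager is remembering everything for you, the stored passwords no longer need to be memorable at all. A minimal Python sketch for generating one per account, using the operating system's secure random source via random.SystemRandom, might look like this:

```python
import random
import string

# Use the OS's cryptographically secure random source rather than the
# default pseudo-random generator.
rng = random.SystemRandom()
ALPHABET = string.ascii_letters + string.digits

def make_password(length=16):
    """Return a random alphanumeric password of the given length."""
    return "".join(rng.choice(ALPHABET) for _ in range(length))

pw = make_password()
print(len(pw))                          # 16
print(all(c in ALPHABET for c in pw))   # True
```

Generate one of these for each account, paste it into the manager's database, and the only thing left to memorise is the master passphrase.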

Another thing to do, at least for your own systems, is to change the default username. A recent study from Microsoft showed that brute force attacks target usernames such as administrator or administrateur.

Tuesday, December 1, 2009

Colocation vs Cloud

The received wisdom on locating your applications in the Cloud is that you benefit from scalability, availability and cost. The downside is that you surrender your data to a third party, so you need to be sure you can trust them implicitly.

I've recently been on a mission to source a suitable location to host a large web application with a required availability of close to 24x7 and some guaranteed performance levels. In Cloud terminology, I was looking for Infrastructure as a Service (IaaS). To give an idea of the scale, the application was expected to have over 6 million unique visitors per month and over 50 million page views.

I approached four "enterprise" level cloud providers with a fairly detailed spec of what I expected, but was fairly flexible on what they could offer as a solution. Not surprisingly, given the initial spec, the proposed solutions were what could be termed "private cloud", or something close to what used to be called Managed Hosting. Although there were some price differences, all the offers were in the same range. Most importantly, all the providers gave me confidence that I could trust them with my data. I was also sure that they could meet the availability and scalability requirements.

As a follow-up exercise, I carried out a cost analysis of the Cloud solution compared to a colocated equivalent. To do so it was necessary to make some fairly large assumptions, including that the capital for the initial investment in the infrastructure was available and that the cost could be written off over a period of three years. Extra staff also needed to be factored in.

The end result showed that for this application the colocation and Cloud solutions were very similar in cost, with colocation slightly cheaper. What was more interesting were the costs when doubling the expected load on the application and hence the supporting infrastructure. In this case, the colocation solution becomes up to 40% cheaper than the Cloud one as economies of scale begin to take effect.
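The shape of that comparison can be sketched with entirely hypothetical figures (every number below is invented for illustration; the point is the method of amortising capital over three years and adding staff costs against a flat monthly cloud fee):

```python
# Three-year cost comparison; all figures are hypothetical.

def colocation_cost(capex, monthly_hosting, staff_per_year, years=3):
    """Capital written off over the period, plus rack space and extra staff."""
    return capex + monthly_hosting * 12 * years + staff_per_year * years

def cloud_cost(monthly_fee, years=3):
    """A managed/private-cloud offering billed as a flat monthly fee."""
    return monthly_fee * 12 * years

# Baseline load: the two come out very close.
print(colocation_cost(280_000, 5_000, 60_000))   # 640000
print(cloud_cost(18_000))                        # 648000

# Double the load: hardware roughly doubles, but staff and rack costs
# grow much more slowly, while the cloud fee scales almost linearly.
colo2 = colocation_cost(450_000, 7_000, 65_000)
cloud2 = cloud_cost(36_000)
print(colo2, cloud2)                             # 897000 1296000
print(round((1 - colo2 / cloud2) * 100))         # 31 -- colocation ~30% cheaper
```

With made-up inputs like these the gap lands around 30%; with the real figures from the exercise it reached 40%, but the mechanism is the same: the fixed costs of colocation are diluted as the load grows.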

Of course, it would be foolish to conclude that colocation is cheaper than Cloud. There are many things to consider, including how much you need the fast scalability and provisioning capabilities of some Cloud offerings, the level of support and monitoring required, and your ability to recruit and retain the right staff for colocation, as well as security features. What I do think is a fair conclusion is that the larger your hosting requirements, the more you should investigate the available options.