February 19, 2004

Wide-open default configurations and "user freedom"

In the midst of a new deluge of viruses for Windows (are they really "for Windows", or more appropriately "for compulsive attachment openers"?) it is time to stop and think a little more. After the inevitable realisation that anti-virus software just does not work should come the illumination that perhaps it is not quite normal for software to joyfully open anything without question.

My mother recently followed the example set by her younger sister and bought herself a laptop with Windows XP so that she could send and receive e-mail. I would have preferred a Mac, but it cost too much in comparison with this ultra-cheap AMD laptop deal. It was delivered with the "standard stuff": Outlook Express, IE6 and Norton anti-virus.

She diligently updates the anti-virus each and every time it asks her to; she even runs the occasional Windows Update, which I recommended doing as often as possible despite her slow dial-up line. The outcome of all this is that I am continually asked: "why do I have to do it?".

That is actually a very good question: why exactly is the default configuration not good enough? Why will her Outlook Express, as delivered, happily execute any possible rubbish entering her inbox? More to the point, why, after numerous runs of the update facility, does it still execute any possible rubbish? Did nobody take notice of the rather recurrent viral techniques involving attachments?

This could well become a long rant against Microsoft, but this behaviour is not limited to Windows, far from it: I will never forget the maintenance nightmare of the RedHat 5 "default installs" which people all over Imperial College were putting on their PCs when Linux was becoming fashionable. These installations had everything possible running: there were more DNS servers within Imperial College than you can imagine, Samba servers listening to anyone willing to talk, not to mention Apache servers offering the standard RedHat index page and the full man pages of the system. Do you know of many stand-alone workstations requiring a DNS service running on the host? Or indeed a Samba server?
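
Purely as an illustration of what "everything possible running" looks like from the outside, here is a minimal sketch in Python which knocks on the usual suspects. The port-to-service mapping is my own guess at a typical default install of that era; netstat on the machine itself is the more honest check.

```python
import socket

# Ports a RedHat-style "default install" would typically be listening on.
# The mapping is illustrative, not authoritative.
DEFAULT_SERVICES = {
    25: "SMTP (sendmail)",
    53: "DNS (named)",
    80: "HTTP (Apache)",
    139: "NetBIOS session (Samba)",
}

def check_host(host="127.0.0.1"):
    for port, name in sorted(DEFAULT_SERVICES.items()):
        sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        sock.settimeout(0.5)
        try:
            if sock.connect_ex((host, port)) == 0:
                print("port %d open: %s appears to be running" % (port, name))
        finally:
            sock.close()

if __name__ == "__main__":
    check_host()
```

On a stand-alone workstation there is no good reason for any of these ports to answer at all.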

Just recently I allowed myself to be pulled into a discussion with someone trying to convince me that OpenBSD is "secure by default" only because it runs no services by default, and that otherwise it would be as insecure as Windows (yes, a somewhat inflammatory remark). As I listened to his arguments I was thinking that perhaps he had never noticed that the average user does not actually need to run Apache, DNS and sendmail. There was no way to explain to him that forcing people to turn services on, rather than off, is a rather good idea. Even the analogy of allowing people to decide whether or not Outlook should execute attachments was lost on him; it was all a matter of "freedom of choice". Apparently the "choice" to have your system turned into a lump of useless plastic and metal is an important one.

So, have users forsaken security for a badly defined "freedom of choice"? Or have they been led to believe that allowing a computer to do all the "thinking" is the equivalent of "freedom"?

Posted by arrigo at 09:10 AM

February 10, 2004

Can the Grid be secured?

One of the latest European high-tech projects is the EU DataGrid, which will eventually link all the key research centres into a huge virtual distributed supercomputer.

The idea in itself isn't exactly novel: PVM and Condor have been offering some of its capabilities for a long time.

One of the issues in the EU DataGrid which I find particularly interesting is security, which has, in some ways, been addressed.

The project set up a serious PKI which is used to authorise job submissions and to authenticate DataGrid users to the Grid itself. This would normally indicate a serious concern for security: after all, you want to ensure that the computing power is not used by some kid in school to improve his ranking on Seti@Home, and that rogue systems can't join the Grid.
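
To make the mechanism concrete: the gatekeeper's side of such a scheme comes down to checking that a submitted certificate chains back to a CA the Grid trusts. What follows is a minimal sketch using pyOpenSSL, not the DataGrid's actual code; the file names are placeholders.

```python
from OpenSSL import crypto

def load_pem(path):
    """Read a PEM-encoded certificate from disk."""
    with open(path, "rb") as f:
        return crypto.load_certificate(crypto.FILETYPE_PEM, f.read())

# Placeholder paths: the Grid CA certificate and a user's certificate.
ca_cert = load_pem("grid-ca.pem")
user_cert = load_pem("user.pem")

# Build a store of trusted roots and verify the user's chain against it.
store = crypto.X509Store()
store.add_cert(ca_cert)
try:
    crypto.X509StoreContext(store, user_cert).verify_certificate()
    print("certificate accepted:", user_cert.get_subject().CN)
except crypto.X509StoreContextError as err:
    print("certificate rejected:", err)
```

Authorisation proper (is this particular user allowed to submit this particular job?) is a separate decision, taken only after a check of this kind succeeds.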

I'd like to offer a different slant on the security issue: not so much who is allowed to use the DataGrid as where the data flows.

It should be pretty clear that the security of the Grid as a whole depends on the security of the individual systems, and also that, sadly, no system connected to a network can be guaranteed to be secure. What concerns me most is that for computations to take place you have to ship data off to systems which you do not control; if one of these is compromised, the data is wide open.

The argument often put forward is that this is scientific data, so it doesn't really matter if someone obtains access to it. This is akin to the justification for universities having lax security, and we shall leave it at that; the difference here is that the Grid is being offered for use in other fields, for example medical research.

Here we hit a problem: it is no longer physicists working with the constituents of matter but medical researchers working with data which could contain either patentable information or clinical trial results. In theory the data could be encrypted and decrypted on the fly during job runs, but once you have control of a system it doesn't take much work to access the decrypted data.
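
To spell out why on-the-fly encryption doesn't rescue the situation: the worker node has to hold the key in order to do anything useful with the data, so whoever controls the node sees the plaintext. A minimal sketch with symmetric encryption, using Python's cryptography library purely for illustration (the payload is invented):

```python
from cryptography.fernet import Fernet

# The submitter encrypts the dataset before shipping it off to the worker.
key = Fernet.generate_key()
cipher = Fernet(key)
ciphertext = cipher.encrypt(b"patient 042: trial arm B, responded to treatment")

# For the job to run, the worker must also receive the key,
# so anyone who controls the worker can perform exactly this step.
plaintext = Fernet(key).decrypt(ciphertext)
print(plaintext.decode())
```

The encryption protects the data in transit and at rest; it cannot protect it from the machine doing the computing.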

I would agree that it requires a certain amount of dedication to enact the above, but that sort of dedication has definitely been shown recently by the spamming community...

Posted by arrigo at 11:12 AM