Security is such an odd thing – what can make one person feel more secure can make others feel less secure, as we see too often in Israel and Palestine. Take, for example, the controversial new shoot-to-kill policy of London’s Metropolitan Police. If they have reason to believe someone is a suicide bomber then they will aim to kill the suspect and not disable them as they would with other suspects.
If nobody knew about this policy it would be quite effective: suicide bombers wouldn’t be able to plan for it, and would assume British police would continue their extremely restrained use of firearms. But once the policy becomes known, as it did through the awful shooting of the innocent Brazilian electrician Jean Charles de Menezes, the cat is out of the bag.
Would-be bombers can simply assume they will be shot and build that into their plans. A dead man’s switch is easy to construct and, of course, comes ready-made in most hand grenades. So, once public, the police policy is rendered ineffective. The police are in a bit of a bind: they’re supposed to be accountable to the public and really should make their policies known to the citizenry, but in this case the policy is an operational detail which, in other circumstances, would not be divulged. We don’t know how police officers follow people, tap phone lines or track stolen cars. Well, in general we don’t know; there will be many who know parts of police practice through hearsay, leaks or being on the receiving end of a police operation.
You see the problem with all this secrecy is that it’s security through obscurity, as we say so snappily in the IT world. Once someone knows the secret, the shoot-to-kill policy or the car tracking procedure, then the security is lost and there’s no way to get the security back without changing the policy or procedure. In other words a new secret must be made.
In software, things have many times been claimed to be ‘secure’ simply because nobody outside the supplier knew how the algorithm or protocol worked. But once someone reverse engineered it, the game was up. What’s so beautiful about good cryptography is that everyone knows the algorithms and protocols – they’re publicly available for review and criticism. If the algorithm is good then the security relies on the quality of the key, whether it be a password, token, biometric or a combination of these. In theory, at least…
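This principle – that only the key need be secret – is easy to demonstrate with a message authentication code from Python’s standard library. Everything below is a minimal sketch: the algorithm (HMAC-SHA256) is completely public, and the key and message are made-up examples.

```python
import hashlib
import hmac

message = b"meet at the usual place"

# The algorithm is public knowledge; only this key is secret.
key = b"correct horse battery staple"

# The genuine sender computes an authentication tag over the message.
tag = hmac.new(key, message, hashlib.sha256).hexdigest()

# An attacker who knows the algorithm perfectly but lacks the key
# produces a different tag, so the forgery is detectable.
forged = hmac.new(b"wrong key", message, hashlib.sha256).hexdigest()
assert tag != forged

# The receiver, holding the same key, verifies the tag
# (compare_digest avoids leaking information through timing).
expected = hmac.new(key, message, hashlib.sha256).hexdigest()
assert hmac.compare_digest(tag, expected)
```

Publishing the algorithm costs nothing here; all the security lives in the key, which is exactly why public scrutiny of the algorithm is a strength rather than a weakness.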
In reality no algorithm or protocol is an island. Data passes through routers, switches, cables, WiFi, keyboard cables, graphics cards and so on. There are a huge number of places where vulnerabilities can be found, weaknesses exploited and data stolen. Hence we get into the world of compromise and cost-benefit analysis. Nothing can be 100% secure, but we can make data more expensive to steal than it is worth.
Nobody has motion detectors and laser beams guarding their fridge, because while someone eating your last yoghurt is irritating it just isn’t worth the cost or inconvenience. On the other hand if you trade in cut diamonds you might want to have a little more security than a locked door.
It is important to recognise that costs and benefits are all in the eye of the beholder. At first blush many people are outraged when they learn that anybody can read their emails, or that faking the From field of an email is trivial. Yet when alternatives such as GnuPG are proposed, many realise that they just don’t care enough to go through the hassle. This tells me two things: firstly, that people don’t value their personal communications as much as I think they should; and secondly, that the interfaces in GnuPG, PGP and their brethren are still too clunky for average everyday use.
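Just how trivial is faking the From field? A short sketch with Python’s standard email library makes the point – the addresses below are invented examples, and nothing stops you putting anyone’s name in that header:

```python
from email.message import EmailMessage

# Plain email carries no built-in authentication: the From header is
# just text the sender types in, so "forging" it is a one-liner.
msg = EmailMessage()
msg["From"] = "The Prime Minister <pm@number10.example>"
msg["To"] = "victim@example.com"
msg["Subject"] = "Urgent"
msg.set_content("Please wire the funds today.")

# Nothing in the message itself proves who actually wrote it.
print(msg["From"])
```

Handing such a message to any SMTP server would deliver it with that sender name intact; only a cryptographic signature (or server-side schemes layered on later) lets the recipient tell the difference.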
I’m not sure why all decent security software tends to be so clunky. Perhaps it’s because the propellerheads who really understand security are not particularly artistic. Certainly there is a trade-off between ease of use and security but, come on, I cannot believe we’re in 2005 and email encryption still isn’t standard. This is one of those bandwagons which never got rolling – how many emails do you receive saying “You can read this if you get GnuPG, click here” or, less radically, “You could be confident in who sent this if you were using PGP!”? All I ever seem to get is email with adverts for instant messaging or meaningless ‘virus scanned by xyz’ footers. We expect a lot from our postal service; we should be ensuring people expect at least as much from their email.
I’d really like to know how readers feel about everyday users’ use of security software, in particular email encryption and authentication. Let me know your thoughts and experiences at email@example.com (GnuPG signature optional, for the moment!).
This column first appeared in the excellent LinuxUser magazine, available internationally. For more information visit http://www.linuxuser.co.uk