Ryan Ackroyd, a member of LulzSec who has done time in prison for his hacking activities, gave a talk yesterday at Sheffield Hallam University – and it was a good talk indeed!
What follows is what a friend of mine and I discussed on the way back to Nottingham afterwards, and the TL;DR (too long, didn’t read – so this is the summary) can be subsumed in a quote from the talk (knowingly taken out of context here): “Data Security is non-existent.”
Ryan was part of the LulzSec team, a hacktivist splinter group of Anonymous best known for attacking and downing major websites, such as PayPal, in the wake of Private Manning’s leaks to WikiLeaks.
One of the main points of Ryan’s talk was how simple some of the vulnerabilities found in today’s software are – especially SQL injection attacks, where web applications and websites don’t check the text (or other content) that users feed them, which allows attackers to run their own queries against your database.
Protecting yourself from this kind of attack is actually very simple, and it’s really surprising how many of the sites we use daily don’t actually check for it.
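To make this concrete, here is a minimal sketch of the attack and the fix, using Python’s built-in sqlite3 module and a made-up `users` table (all names here are illustrative, not from the talk). The vulnerable version pastes user input straight into the SQL string; the safe version uses a parameterised query, which treats the input as data rather than as SQL:

```python
import sqlite3

# A throwaway in-memory database for the demo.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

# The classic injection payload: it turns the WHERE clause into a tautology.
user_input = "' OR '1'='1"

# UNSAFE: the input becomes part of the SQL itself.
unsafe_query = "SELECT * FROM users WHERE name = '" + user_input + "'"
print(conn.execute(unsafe_query).fetchall())  # returns every row in the table!

# SAFE: the ? placeholder keeps the input as a plain value.
safe_query = "SELECT * FROM users WHERE name = ?"
print(conn.execute(safe_query, (user_input,)).fetchall())  # returns []
```

The same placeholder mechanism exists in essentially every database library, which is why it is so surprising that string-glued queries are still out there.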
Why don’t they do it, you ask? Well, there are a great many excuses not to program in a safe way; the most common one is “this needs to work, and it needs to be working soon”.
And this is what I wanted to talk about. In industry, data security is sometimes seen as something of a “necessary evil”, because it is something that will not earn you money, but protects you from losing it. Start-ups in the digital economy are usually focussed on getting something demonstrable (it’s even called the “minimum viable product”) running quickly, deploying it early, then keeping their ears to the ground to find out what users like and dislike about the product, and using that feedback to improve the idea they started off with. And that is one thing of concern here: when someone manages to infiltrate your digital home (a.k.a. server, or Facebook account, or Google search history), you usually don’t see shards of broken glass on the floor, and all of your assets will (usually) still be there – it’s just that they’re somewhere else as well.
Data security, and privacy as well, is then sometimes a conscious decision in the trade-off between how much it costs to keep something safe and the worst-case prediction of how much it would cost if that data were to be stolen.
Now, something I find quite interesting in this topic is the original meaning of the word “data” – it stems from the Latin word “datum”, and means “something given”.
And in my opinion, this is how data should be treated: as something that is given to us who develop and maintain platforms – a present from someone else. However mundane, presents like these hold meaning, and also, hopefully, instill a sense of responsibility for that which you were given. In other words, leaning on Voltaire (and, later on, Peter Parker’s Uncle Ben): “With Big Data comes Big Responsibility”. And that responsibility is, inevitably, on us – the developers and those designing and creating systems.
Now, that is somewhat of a bombshell, and I know that to some this kind of radical stance might seem a bit overzealous, but I’m trying to be honest here. I think everyone using computer systems deserves to know this is the case right now. Most importantly, software developers need to be aware that they need to keep their windows closed – that is, do at least some security-proofing of your own servers: sanitise all your input interfaces, and try to make the easy attacks harder by using penetration-testing tools on your own services and getting friendly others to try and penetrate them. And, for god’s sake: everyone, please use (and don’t re-use) safe passwords.
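The flip side of the password advice is on us developers: even the safest password is worthless if the server stores it in plain text. Here is a minimal sketch of doing it properly with nothing but the Python standard library – salted, slow hashing via PBKDF2 and a constant-time comparison (the function names and the iteration count are my own illustrative choices, not something from the talk):

```python
import hashlib
import hmac
import secrets

ITERATIONS = 200_000  # deliberately slow, to frustrate brute-force attacks


def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, digest); store both, never the password itself."""
    salt = secrets.token_bytes(16)  # a fresh random salt per password
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest


def verify_password(password: str, salt: bytes, stored: bytes) -> bool:
    """Re-derive the digest and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, stored)


salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("password123", salt, digest))  # False
```

That way, even if your user table does get stolen, the attackers walk away with salted hashes instead of everyone’s actual passwords.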
This way, even though it still is the wild, wild west out there, we at least get to live!
And as always, please let me know what you think about this by leaving a comment below.