
Privacy considered harmful?

The government has once again returned to the vision of giving each of us an electronic health record shared throughout the NHS. This is about the fourth attempt in twenty years, yet the ferocity of the latest push has taken doctors by surprise.

Seventeen years ago, I was advising the BMA on safety and privacy, and we explained patiently why this was a bad idea. The next government went ahead anyway, which led predictably to the disaster of NPfIT. Nonetheless, enough central systems were got working to undermine privacy seriously. Colleagues and I wrote the Database State report on the dangers of such systems; it was adopted as Lib Dem policy, and aspects were adopted by the Conservatives too. That did lead to the abandonment of the ContactPoint children’s database, but there was a rapid U-turn on health privacy after the election.

The big pharma lobbyists got their way after they got health IT lobbyist Tim Kelsey appointed as Cameron’s privacy tsar, and it’s all been downhill from there. The minister says we have an opt-out, but no-one seems to have told him that GPs will in future be compelled to upload a lot of information about us through a system called GPES if they want to be paid (they had an opt-out, but it’s being withdrawn from April). And you can’t even register under a false name any more unless you use a stolen passport.

Will the Information Commissioner be consistent?

This afternoon, the Information Commissioner will unveil a code of practice for data anonymisation. His office is under pressure; as I described back in August, Big Pharma wants all our medical records and has persuaded the Prime Minister it should have access so long as our names and addresses are removed. The theory is that a scientist doing research into cardiology (for example) could have access to the anonymised records of all heart patients.

The ICO’s blog suggests that he will consider data to be anonymous, and thus no longer private, if they cannot be re-identified by reference to any other data already in the public domain. But this is trickier than you might think. For example, Tim Gowers just revealed on his excellent blog that he had an ablation procedure for atrial fibrillation a couple of weeks ago. So if our researcher can search for all males aged 45-54 who had such a procedure on November 6th 2012, he can pull Tim’s record, including everything that Tim intended to keep private. Even with a central cardiology register, it’s hard to think of a practical mechanism that could block Tim’s record as soon as he made that blog post. And now that researchers are starting to carry round millions of people’s records on their laptops, protecting privacy is getting really hard.
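To make the attack concrete, here is a toy sketch in Python. The field names and records are invented for illustration, but the logic is exactly the quasi-identifier search described above.

    # A toy re-identification attack on "anonymised" records.
    # The fields and values below are invented for illustration.
    records = [
        {"patient": "a9f3", "sex": "M", "age_band": "45-54",
         "procedure": "ablation for atrial fibrillation", "date": "2012-11-06"},
        {"patient": "77c0", "sex": "F", "age_band": "65-74",
         "procedure": "angioplasty", "date": "2012-11-03"},
    ]

    def matching(records, **quasi_identifiers):
        """Return the records that match a set of publicly known attributes."""
        return [r for r in records
                if all(r.get(k) == v for k, v in quasi_identifiers.items())]

    # Everything queried below is public: the procedure and date were blogged,
    # and the patient's age band is easy enough to look up.
    hits = matching(records, sex="M", age_band="45-54",
                    procedure="ablation for atrial fibrillation",
                    date="2012-11-06")
    if len(hits) == 1:
        print("Unique match; the 'anonymous' record is re-identified:", hits[0])

If the query returns exactly one record, the patient’s entire history falls out, names and addresses notwithstanding.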

In his role as data protection regulator, the Commissioner has been eager to disregard the risk of re-identification using privately held information. Yet Maurice Frankel of the Campaign for Freedom of Information has pointed out to me that he regularly applies a very different rule in Freedom of Information cases, including one involving the University of Cambridge. There, he refused a freedom of information request about university dismissals on the grounds that “friends, former colleagues, or acquaintances of a dismissed person may, through their contact with that person, know something of the circumstances of that person’s departure” (see para 30).

So I will be curious to see this afternoon whether the Commissioner places greater value on the consistency of his legal rulings, or their convenience to the powerful.

Who will screen the screeners?

Last time I flew through Luton airport it was a Sunday morning, and I went up to screening with a copy of the Sunday Times in my hand; it’s non-metallic after all. The guard by the portal asked me to put it in the tray with my bag and jacket, and I did so. But when the tray came out, the newspaper wasn’t there. I approached the guard and complained. He tried to dismiss me but I was politely insistent. He spoke to the lady sitting at the screen; she picked up something with a guilty sideways look at me, and a few seconds later my paper came down the rollers. As I left the screening area, there were two women police constables, and I wondered whether I should report the attempted theft of a newspaper. As my flight was leaving in less than an hour, I walked on by. But who will screen the screeners?

This morning I once more flew through Luton, and I started to suspect it wouldn’t be the airport’s management. This time the guard took exception to the size of the clear plastic bag holding my toothpaste, mouthwash and deodorant, showing me with glee that it was half a centimetre wider than the official outline on a card he had right to hand. I should mention that I was using a Sainsbury’s freezer bag, a standard item in our kitchen which we’ve used for travel for years. No matter; the guard ordered me to buy an approved one for a pound from a slot machine placed conveniently beside the belt. (And we thought Ryanair’s threat to charge us a pound to use the loo was just a marketing gimmick.) But what sort of signal do you give to low-wage security staff if the airport merely sees security as an excuse to shake down the public? And after I got through to the lounge and tried to go online, I found that the old Openzone service (which charged by the minute) is no longer on offer; instead Luton Airport now demands five pounds for an hour’s access. So I’m writing this blog post from Amsterdam, and next time I’ll probably fly from Stansted.

Perhaps one of these days I’ll write a paper on “Why Security Usability is Hard”. Meanwhile, if anyone reading this is near Amsterdam on Monday, may I recommend the Amsterdam Privacy Conference? Many interesting people will be talking about the ways in which governments bother us. (I’m talking about how the UK government is trying to nobble the Data Protection Regulation in order to undermine health privacy.)

The Perils of Smart Metering

Alex Henney and I have decided to publish a paper on smart metering that we prepared in February for the Cabinet Office and for ministers. DECC is running a smart metering project that is supposed to save energy by replacing all Britain’s gas and electricity meters with computerised ones by 2019, and to cost only £11bn. Yet the meters will be controlled by the utilities, whose interest is to maximise sales volumes, so there is no realistic prospect that the meters will save energy. What’s more, smart metering already exhibits all the classic symptoms of a failed public-sector IT project.

The paper we release today describes how, when Ed Miliband was Secretary of State, DECC cooked the books to make the project appear economically worthwhile. It then avoided the control procedures that are mandatory for large IT procurements by pretending it was not an IT project but an engineering project. We have already written on the security economics of smart meters, their technical security, the privacy aspects and why the project is failing.

We managed to secure a Cabinet Office review of the project, which came up with a red traffic light – a recommendation that the project be abandoned. However, DECC dug its heels in and the project appears to be going ahead. Hey, we did our best. The failure should be evident in time for the next election; just remember, you read it here first.

The rush to 'anonymised' data

The Guardian has published an op-ed I wrote on the risks of anonymised medical records along with a news article on CPRD, a system that will make our medical records available for researchers from next month, albeit with the names and addresses removed.

The government has been pushing for this since last year, having appointed medical datamining enthusiast Tim Kelsey as its “transparency tsar”. There have been two consultations on how records should be anonymised, and how effective it could be; you can read our responses here and here (see also FIPR blog here). Anonymisation has long been known to be harder than it looks (and the Royal Society recently issued an authoritative report which said so). But getting civil servants to listen to X when the Prime Minister has declared for Not-X is harder still!

Despite promises that the anonymity mechanisms would be open for public scrutiny, CPRD refused a Freedom of Information request to disclose them, apparently fearing that disclosure would damage security. Yet research papers written using CPRD data will surely have to disclose how the data were manipulated. So the security mechanisms will become known anyway; the secrecy merely shields them from scrutiny while researchers grow careless. I fear we can expect a lot more incidents like this one.

Online traceability: Who did that?

Consumer Focus have recently published my expert report on the issues that arise when attempting to track down people who are using peer-to-peer (P2P) systems to share copyright material without appropriate permissions. They have submitted this report to Ofcom, who have been consulting on how they should regulate this sort of tracking when the Digital Economy Act 2010 (DEA) mechanisms that are intended to prevent unlawful file sharing finally start to be implemented, probably sometime in 2014.

The basic idea behind the DEA provisions is that the rights holders (or more usually specialist companies) will join the P2P systems and download files that are being shared unlawfully. Because the current generation of P2P systems fails to provide any real anonymity, the rights holders will learn the IP addresses of the wrongdoers. They will then consult public records at RIPE (and the other Regional Internet Registries) to learn which ISPs were allocated the IP addresses. Those ISPs will then be approached and will be obliged, by the DEA, to consult their records and tell the appropriate account holder that someone using their Internet connection has been misbehaving. There are further provisions for telling the rights holders about repeat offenders, and perhaps even for “technical measures” to disrupt file sharing traffic.
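The registry lookup in the middle of that chain is simple enough to sketch. Here is a minimal WHOIS query in Python; the protocol itself is the standard one (RFC 3912), but the choice of whois.ripe.net and the sample address are purely illustrative, and real monitoring systems are rather more elaborate.

    import socket

    def whois(ip_address, server="whois.ripe.net", port=43):
        """Send a plain WHOIS query and return the raw response text."""
        with socket.create_connection((server, port), timeout=10) as s:
            s.sendall((ip_address + "\r\n").encode("ascii"))
            chunks = []
            while True:
                data = s.recv(4096)
                if not data:
                    break
                chunks.append(data)
        return b"".join(chunks).decode("utf-8", errors="replace")

    # Look up a (placeholder) address logged from a P2P swarm; the response
    # names the network block and the ISP it was allocated to.
    print(whois("193.0.6.139"))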

From a technical point of view, the traceability part of the DEA process can (in principle) be made to work in a robust manner. However, there’s a lot of detail to get right in practice, both in recording the data generated by the P2P activity and within the ISPs’ systems — and history shows that mistakes are often made. I have some first-hand experience of this: my report relates how I helped the police track down a series of traceability mistakes that were made in a 2006 murder case! Hence I spend many pages in my report explaining what can go wrong, and I set out in considerable detail the sort of procedures that I believe Ofcom should insist upon to ensure that mistakes are rare and rapidly detected.
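To give a flavour of how such mistakes happen, here is one classic pitfall, sketched with invented data: if a monitoring system logs local wall-clock time without a timezone, then around the autumn clock change a single logged time corresponds to two different UTC instants, and hence potentially to two different subscribers.

    from datetime import datetime
    from zoneinfo import ZoneInfo

    # "01:30" was logged on the night the UK clocks went back, with no zone.
    naive = datetime(2012, 10, 28, 1, 30)

    for fold in (0, 1):  # the BST reading, then the GMT reading
        local = naive.replace(tzinfo=ZoneInfo("Europe/London"), fold=fold)
        print(local.astimezone(ZoneInfo("UTC")))
    # Prints 00:30 UTC and then 01:30 UTC: an hour apart, which can be long
    # enough for a dynamic IP address to have changed hands.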

My report also explains the difficulties (in many cases the insuperable difficulties) that the account holder will have in determining the individual who was responsible for the P2P activity. Consumer Focus takes the view that “this makes the proposed appeals process flawed and potentially unfair and we ask Government to rethink this process”. Sadly, there’s been no sign so far that this sort of criticism will derail the DEA juggernaut, although some commentators are starting to wonder whether the rights holders will see the process as passing a cost/benefit test.

Workshop on the Economics of Information Security 2012

I’m liveblogging WEIS 2012, as I did in 2011, 2010 and 2009. The event is being held today and tomorrow at the Academy of Sciences in Berlin. We were welcomed by Nicolas Zimmer, Berlin’s permanent secretary for economics and research, who mentioned the “explosive cocktail” of Streetview and of using social media for credit ratings, in the context of very different national privacy cultures; the Swedes put tax returns online and Britain has CCTV everywhere, while neither is on the agenda in Germany. Yet Germany, like other countries, wants the benefits of public data – and its army has set up a cyber-warfare unit. In short, cyber security is giving rise to multiple policy conflicts, and security economics research might help policymakers navigate them.

The refereed paper sessions will be blogged in comments below this post.

Debunking cybercrime myths

Our paper Measuring the Cost of Cybercrime sets out to debunk the scaremongering around online crime that governments and defence contractors are using to justify everything from increased surveillance to preparations for cyberwar. It will appear at the Workshop on the Economics of Information Security later this month. There’s also some press coverage.

Last year the Cabinet Office published a report by Detica claiming that cybercrime cost the UK £27bn a year. This was greeted with derision, whereupon the Ministry of Defence’s chief scientific adviser, Mark Welland, asked us whether we could come up with some more defensible numbers.

We assembled a team of experts and collated what’s known. We came up with a number of interesting conclusions. For example, we compared the direct costs of cybercrimes (the amount stolen) with the indirect costs (costs in anticipation, such as countermeasures, and costs in consequence, such as paying compensation). With traditional crimes that are now classed as “cyber” because they’re done online, such as welfare fraud, the indirect costs are much less than the direct ones, while for “pure” cybercrimes that didn’t exist before (such as fake antivirus software) the indirect costs are much greater. As a striking example, the botnet behind a third of the spam in 2010 earned its owner about $2.7m, while the worldwide costs of fighting spam were around $1bn.
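A back-of-envelope calculation makes the asymmetry vivid, using the round figures just quoted:

    # Direct gain to the criminal versus society's indirect defence costs.
    direct_gain = 2.7e6    # ~$2.7m earned by the botnet's owner in 2010
    indirect_cost = 1.0e9  # ~$1bn spent worldwide fighting spam

    print(f"Defence costs exceed the criminal's takings "
          f"{indirect_cost / direct_gain:.0f}-fold")  # roughly 370-fold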

Some of the reasons for this are already well-known; traditional crimes tend to be local, while the more modern cybercrimes tend to be global and have strong externalities. As for what should be done, our research suggests we should perhaps spend less on technical countermeasures and more on locking up the bad guys. Rather than giving most of its cybersecurity budget to GCHQ, the government should improve the police’s cybercrime and forensics capabilities, and back this up with stronger consumer protection.

I'm from the Government and I'm here to help

Two years ago, Hyoungshick Kim, Jun Ho Huh and I wrote a paper On the Security of Internet banking in South Korea in which we discussed an IT security policy that had gone horribly wrong. The Government of Korea had tried in 1998 to secure electronic commerce by getting all the banks to use an officially-approved ActiveX plugin, effectively locking most Koreans into IE. We argued in 2010 that this provided less security than it seemed, and imposed high usability and compatibility costs. Hyoungshick presented our paper at a special conference, and the government withdrew the ActiveX mandate.

It’s now apparent that the problem is still there. The bureaucracy created a procedure to approve alternative technologies, and (surprise) still hasn’t approved any. Korean web businesses remain trapped in the bubble, and fall farther and farther behind. This may well come to be seen as a warning to other governments to adopt true open standards, if they want to avoid a similar fate. The Cabinet Office should take note – and don’t forget to respond to their consultation!

Scrambling for Safety 2012

On the first of April, the Sunday Times carried a story that the Home Secretary planned to expand the scope of the Regulation of Investigatory Powers Act. Some thought this was an April Fool, but no: security minister James Brokenshire confirmed the next day that it was for real. This led to much media coverage; here is a more detailed historical timeline.

There have been eight previous Scrambling for Safety conferences organised while the UK government was considering the RIP Act and the regulations that followed it. The goal is to bring together different stakeholders interested in surveillance policy for an open exchange of views. The conference is open to the public, but you have to register here.

Here is the programme and the event website.