Category Archives: Privacy technology

Anonymous communication, data protection

Privacy economics: evidence from the field

It has been argued that privacy is the new currency on the Web. Services offered for free are actually paid for using personal information, which is then turned into money (e.g., through targeted advertising). But what is the exchange rate for privacy? In the largest such experiment to date, and the first conducted in the field, we shed new light on consumers’ willingness to pay for added privacy.

One in three Web shoppers pays half a euro extra to keep their mobile phone number private. If privacy comes for free, more than 80% of consumers choose the company that collects less personal information, our study concludes.

Continue reading Privacy economics: evidence from the field

Social authentication – harder than it looks!

This is the title of a paper we’ll be presenting next week at the Financial Crypto conference (slides). There is also coverage in the New Scientist.

Facebook has a social authentication mechanism where you may be asked to recognise some of your friends from photos as part of the login process. We analysed this and found it vulnerable both to guessing by your friends and to modern face-recognition systems. The first problem is that most people want privacy only from those close to them: if you’re having an affair you don’t want your partner to find out, but you don’t care if someone in Mongolia learns about it. And if your partner finds out and becomes your ex, you don’t want them to be able to cause havoc on your account. Yet friends and exes are precisely the people best placed to recognise your other friends in photos and so pass the test. Celebrities are similar, except that everyone is their friend (and potentially their enemy).

Second, if someone outside your circle of friends is mounting a targeted attack on you, then by friending your friends they can gain enough access to your social circle to collect photos, which they might feed into face-recognition software, or even use manually, to pass the test.
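To illustrate the second attack, here is a minimal sketch of how harvested photos could be matched against a login challenge using the open-source face_recognition library. This is my own illustration under stated assumptions: the library, the file names and the matching threshold are hypothetical stand-ins, not the system evaluated in the paper.

```python
# Hypothetical sketch: matching photos harvested from a target's friends
# against a social-authentication challenge image. The face_recognition
# library and the file names are illustrative assumptions, not the tools
# used in the paper.
import face_recognition

# Encode faces from a photo harvested from a friend's profile.
harvested = face_recognition.load_image_file("harvested_friend_photo.jpg")
known_encodings = face_recognition.face_encodings(harvested)

# Load the photo shown in the login challenge.
challenge = face_recognition.load_image_file("challenge_photo.jpg")

# If any face in the challenge matches a harvested face, the attacker can
# answer the question without actually knowing the person.
for encoding in face_recognition.face_encodings(challenge):
    matches = face_recognition.compare_faces(known_encodings, encoding,
                                             tolerance=0.6)
    if any(matches):
        print("Match: challenge answerable by an outsider")
```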
Continue reading Social authentication – harder than it looks!

Cloudy with a Chance of Privacy

Three Paper Thursday is an experimental new feature in which we highlight research that group members find interesting.

When new technologies become popular, we privacy people are sometimes miffed that nobody asked for our opinions during the design phase. Sometimes this leads us to make sweeping generalisations such as “only use the Cloud for things you don’t care about protecting” or “Facebook is only for people who don’t care about privacy.” We have long accused others of assuming that the real world is incompatible with privacy, but are we guilty of assuming the converse?

On this Three Paper Thursday, I’d like to highlight three short papers that challenge these zero-sum assumptions. Each is eight pages long and none requires a degree in mathematics to understand; I hope you enjoy them.

Continue reading Cloudy with a Chance of Privacy

Call for Papers: 12th Privacy Enhancing Technologies Symposium (PETS 2012)

Privacy and anonymity are increasingly important in the online world. Corporations, governments, and other organizations are realizing and exploiting their power to track users and their behavior. Approaches to protecting individuals and groups, as well as companies and governments, from profiling and censorship include decentralization, encryption, distributed trust, and automated policy disclosure.

The 12th Privacy Enhancing Technologies Symposium addresses the design and realization of such privacy services for the Internet and other data systems and communication networks by bringing together anonymity and privacy experts from around the world to discuss recent advances and new perspectives.

The symposium seeks submissions from academia and industry presenting novel research on all theoretical and practical aspects of privacy technologies, as well as experimental studies of fielded systems. We also encourage submissions from other communities, such as law, business, and data protection authorities, presenting their perspectives on technological issues.

Submissions are due 20 February 2012, 23:59 UTC. Further details can be found in the full Call for Papers.

Blood donation and privacy

The UK’s National Blood Service screens all donors for a variety of health and lifestyle risks prior to donation. Many of the questions are highly sensitive, particularly those about sexual history and drug use. So I found it disappointing that, after consulting with a nurse who took detailed notes about specific behaviours and when they occurred, I was expected to consent to this information being stored indefinitely. When I pressed as to why this data is retained, I was told it was necessary so that I can be contacted as soon as I’m eligible to donate blood again, and to prevent me from donating before then.

The first reason seems weak: contacting donors on an annual or semi-annual basis wouldn’t greatly decrease the level of donation, since most risk-factor restrictions last at least 12 months or are indefinite. The second reason is a security fantasy, as it would only catch donors who lie at a second visit after being honest initially. I doubt donor dishonesty is a major problem, and all blood is tested anyway. The purpose of lifestyle restrictions is to reduce the base rate of unsafe blood, because all tests have false negatives. Storing detailed donor history doesn’t even save much time: the history has to be re-taken before each donation, since lifestyle risks can change.
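To see why the base rate matters, here is a back-of-the-envelope calculation. The numbers are illustrative assumptions of mine, not NBS statistics:

```python
# Illustrative assumptions, not NBS figures: lifestyle screening lowers the
# base rate of infected donations, and the blood test then misses a small
# fraction of those (its false-negative rate).
donations = 1_000_000         # units donated per year (hypothetical)
false_negative_rate = 0.005   # test misses 1 in 200 infected units (hypothetical)

for label, base_rate in [("without screening", 1 / 1_000),
                         ("with screening",    1 / 10_000)]:
    unsafe_released = donations * base_rate * false_negative_rate
    print(f"{label}: ~{unsafe_released:.1f} infected units pass the test")
```

Even with a good test, cutting the base rate tenfold cuts the number of infected units that slip through tenfold. That is what the screening questions buy, and it doesn’t require storing anyone’s answers.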

I certainly don’t think the NBS is trying to stockpile data for nefarious reasons. I suspect instead that the ever-falling technical cost of storing data makes its very minor secondary uses seem to justify indefinite retention, so long as one ignores the risk of a massive compromise (the NBS gets about 2 M donors per year). I wonder whether the inherent hazard of data collection was considered in the NBS’ cost/benefit analysis when this privacy policy was adopted. Security engineers and privacy advocates would do well to push for non-collection of sensitive data before reaching for fancier privacy-enhancing technologies. The NBS provides a vital service, but it can’t do so without its donors, who are always in short supply. It would be a shame to discourage anybody from donating, or from being honest about their health history, by demanding to store their data forever.

Privacy event on Wednesday

I will be talking in London on Wednesday at a workshop on Anonymity, Privacy, and Open Data about the difficulty of anonymising medical records properly. I’ll be on a panel with Kieron O’Hara who wrote a report on open data for the Cabinet Office earlier this year, and a spokesman from the ICO.

This will be the first public event on the technology and policy issues surrounding anonymisation since yesterday’s announcement that the government will give wide access to anonymous versions of our medical records. I’ve written extensively on the subject: for an overview, see my book chapter which explores the security of medical systems in general from p 282 and the particular problems of using “anonymous” records in research from p 298. For the full Monty, start here.

Anonymity is hard enough even when the data controller is capable and motivated to try hard. In the case of the NHS, anonymisation has always been perfunctory: the default is to remove patients’ names and addresses but leave their postcodes and dates of birth. This makes it easy to re-identify about 99% of patients (the exceptions are mostly twins, soldiers, students and prisoners). And since I wrote that book chapter, the predicted problems have come to pass; for example, the NHS lost a laptop containing over eight million patients’ records.
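As a toy illustration of why postcode plus date of birth is so identifying, here is a sketch of a linkage attack in pandas. The datasets and values are entirely hypothetical:

```python
import pandas as pd

# Hypothetical "anonymised" extract: names and addresses removed, but
# postcodes and dates of birth retained.
anonymised = pd.DataFrame({
    "postcode":  ["CB3 0FD", "CB2 1TN"],
    "dob":       ["1970-01-02", "1985-06-30"],
    "diagnosis": ["asthma", "type 2 diabetes"],
})

# Hypothetical public dataset (say, an edited electoral roll) carrying the
# same two fields alongside names.
public = pd.DataFrame({
    "name":     ["A. Patient", "B. Patient"],
    "postcode": ["CB3 0FD", "CB2 1TN"],
    "dob":      ["1970-01-02", "1985-06-30"],
})

# Joining on (postcode, dob) re-identifies the records, because very few
# people share both a full postcode and a date of birth.
print(anonymised.merge(public, on=["postcode", "dob"]))
```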

Here we go again

The Sunday media have been trailing a speech by David Cameron tomorrow about giving us online access to our medical records and our kids’ school records, and making anonymised versions of them widely available to researchers, companies and others. Here is coverage in the BBC, the Mail and the Telegraph; there’s also a Cabinet Office paper. The measures are supported by the CEO of Glaxo and opposed by many NGOs.

If the Government is going to “ensure all NHS patients can access their personal GP records online by the end of this Parliament”, they’ll have to compel the thousands of GPs who still keep patient records on their own machines to transfer them to centrally-hosted facilities. Such centrally-hosted systems are maintained by people who have to please the Secretary of State rather than GPs, and so become progressively less useful. This won’t just waste doctors’ time; it will have real consequences for patient safety and the quality of care.

We’ve seen this repeatedly over the lifetime of NPfIT and its predecessor the NHS IM&T strategy. Officials who can’t develop working systems become envious of systems created by doctors; they wrest control, and the deterioration starts.

It’s astounding that a Conservative prime minister could get the idea that nationalising something is the best way to make it work better. It’s also astonishing that a Government containing Liberals who believe in human rights, the rule of law and privacy should support the centralisation of medical records a mere two years after the Joseph Rowntree Reform Trust, a Liberal charity, produced the Database State report, which explained how the centralisation of medical records (and for that matter children’s records) destroys privacy and contravenes human-rights law.

The coming debate will no doubt be vigorous, and will draw on many aspects of information security: from the dreadful security usability (and safety usability) of centrally-purchased NHS systems, through the real hazards of coerced access by vulnerable patients, to the fact that anonymisation doesn’t really work. There’s much more here. Of course the new centralisation effort will probably fail, just like the last two; health informatics is a hard problem, and even Google gave up. But our privacy should not depend on the government being incompetent at wrongdoing. It should refrain from wrongdoing in the first place.

Trusted Computing 2.1

We’re steadily learning more about the latest Trusted Computing proposals. People have started to grok that building signed boot into UEFI will extend Microsoft’s power over the markets for AV software and other security tools that install around boot time; while ‘Metro’ style apps (i.e. web/tablet/html5 style stuff) could be limited to distribution via the MS app store. Even if users can opt out, most of them won’t. That’s a lot of firms suddenly finding Steve Ballmer’s boot on their jugular.

We’ve also been starting to think about the issues of law-enforcement access that arose during the crypto wars, and that have come to light again with certification authorities. These issues are even more wicked with trusted boot. If the Turkish government can compel Microsoft to include the Tubitak key in Windows so that their intelligence services can do man-in-the-middle attacks on Kurdish MPs’ Gmail, then I expect they’ll also tell Microsoft to issue them a UEFI key to authenticate their keylogger malware. Hey, I removed the Tubitak key from my browser, but how do I identify and block all foreign governments’ UEFI keys?

Our Greek colleagues are already a bit cheesed off with Wall Street. How happy will they be if in future they won’t be able to install the security software of their choice on their PCs, but the Turkish secret police will?

Debate at Cambridge Festival of Ideas: Internet Freedom

On the evening of Thursday 27 October, I will be participating in a debate on Internet Freedom at the Cambridge Festival of Ideas. Other speakers include Jim Killock, executive director of the Open Rights Group; Herbert Snorrason, founder of Openleaks.org; and David Clemente of Chatham House. Further details can be found on the festival website.

Attendance is free, but booking is required.

PhD studentship available for research on anonymity and privacy

Funding is available for a PhD student to work at the University of Cambridge Computer Laboratory, on the topic of privacy enhancing technologies and anonymous communications, starting in April 2012.

The sponsorship is provided jointly by Microsoft Research Cambridge and the Dorothy Hodgkin Postgraduate Awards scheme. As such, applicants must be nationals of India, China, Hong Kong, South Africa, Brazil, Russia, or countries in the developing world as defined by the Development Assistance Committee of the OECD.

The application deadline is soon (28 October 2011), so please circulate this advertisement to anyone who you think might find it of interest.

Further details can be found on the University website, and enquiries should be sent to me (Steven.Murdoch@cl.cam.ac.uk).