Category Archives: Privacy technology

Anonymous communication, data protection

Developments on health privacy…

The Register reports a leaked document from the NHS which concludes that sensitive patient records would probably be safer held locally, rather than stored on a national database as the Government proposes.

This follows a poll last week in which a majority of GPs said they would not upload their patients’ records to the national database. Together the poll and the leak are a double whammy for the misguided and wasteful project to centralise all computer systems in the NHS.

On Wednesday we are launching a campaign to persuade patients to opt out too. The inaugural meeting will be held from 7 to 9 PM at Imperial College, London. For background, see recent posts on opting out and on kids’ databases.

Kids’ databases

The Information Commissioner has just published a report we wrote for him on the UK Government’s plans to link up most of the public-sector databases that contain information on children. We’re concerned that aggregating this data will be both unsafe and illegal. Our report has got coverage in the Guardian, the Telegraph (with a leader), the Daily Mail, the BBC and the Evening Standard.

Traffic Data Retention and Forensic Imaging

Last week I participated in yet another workshop on traffic data retention, at ICRI, with the added twist that traffic data retention is now part of European law, and will become national law in most EU countries very soon. It was a special treat to be talking just after Chief Superintendent Luc Beirens, Head of the Belgian Federal Computer Crime Unit, who tried to sell the idea of retention to a crowd of people from the flagship EU privacy project PRIME.

As usual, Beirens assured us that proper judicial oversight exists and will regulate access to traffic data. Yet a different picture emerged when we got into the details of how cyber-crime investigations are conducted. It turns out that the first thing the police do, to victims of cyber-crime as well as suspects, is to take a forensic image of their hard disk. This is a sound precaution: booting up the machine to extract evidence may activate malware that erases traces on a victim’s machine, or an alert system on a suspect’s computer.

The obvious question becomes: how does this policy of automatic forensic imaging and analysis of a hard disk interact with traffic data retention? Luc was keen to acknowledge that the investigation procedure would proceed unchanged: an image of a hard disk that may contain retained data would be taken, and forensic tools used on the totality of the hard disk. To be fair, tools that take a partial forensic image, or that examine only those parts of a disk permitted by a set security policy, simply do not exist.

What does this mean? If you are a victim of cyber-crime, or a company you have given your data to is a victim of cyber-crime, all the data will end up with the police. This will be the case irrespective of judicial oversight or any other safeguards. You may ask yourself what the chance is that retained data will be kept on a computer that becomes part of an investigation. First, do not underestimate the fact that these machines will be on-line to serve requests, and will therefore be subject to their fair share of attacks. But most importantly, this case will obviously occur as part of any investigation into the misuse of, unauthorized access to, or attempted access to the traffic data retention systems themselves!

This standard procedure may also explain why companies are so reluctant to call in the high tech crime units to help them investigate cyber-crime. Their procedures are simply incompatible with any security policy with a confidentiality component. Would you report some of your documents being stolen from your home or business, if this meant the police taking a copy of every single paper in the building?

Opting out of the NHS Database

The front page lead in today’s Guardian explains how personal medical data (including details of mental illness, abortions, pregnancy, drug taking, alcohol abuse, fitting of colostomy bags etc etc) are to be uploaded to a central NHS database regardless of patients’ wishes.

The Government claims that especially sensitive data can be put into a “sealed envelope” which would not ordinarily be available… except that NHS staff will be able to “break the seal” under some circumstances; the police and Government agencies will be able to look at the whole record — and besides, this part of the database software doesn’t even exist yet, and so the system will be running without it for some time.

The Guardian has more details in the article From cradle to grave, your files available to a cast of thousands; some comments from doctors and other health professionals, A national database is not essential; and a leading article, Spine-chilling.

The Guardian gives details on how to opt out of data sharing in What can patients do?, using suggestions for a letter from our own Ross Anderson, who has worked on medical privacy for over a decade (see his links to relevant research).

If you are concerned (and in my view, you really should be — once your data is uploaded it will be pretty much public forever), then discuss it with your GP and write off to the Department of Health [*]. The Guardian gives some suitable text, or you could use the opt-out letter that FIPR developed last year (PDF or Word versions available).

[*] See Ross’s comment on this article first!

Yet another insecure banking system

The banks are thinking about introducing a new anti-phishing measure called the ‘chip authentication protocol’ (CAP). Each customer gets a device like a pocket calculator into which you put your ‘chip and PIN’ (EMV) card and enter your PIN (the same PIN you use at ATMs); it then displays a one-time authentication code that you use to log on to your electronic banking service, instead of the current password and security question. The code will be computed by the card, which will encrypt a transaction counter using the EMV authentication cryptogram generation key – the same key the EMV protocol uses to generate a MAC on an ATM or store transaction. The use model is that everyone will have a CAP calculator; you’ll usually use your own, but can lend it to a friend if he’s caught short.
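For illustration, here is a minimal sketch of the one-time-code idea in Python. I am substituting an HMAC for the card’s actual cryptogram computation and borrowing HOTP-style truncation; the key length, counter width and truncation are my assumptions, not the real CAP specification.

```python
import hmac
import hashlib

def cap_code(card_key: bytes, counter: int, digits: int = 8) -> str:
    """Illustrative one-time code: MAC a transaction counter with the
    card's symmetric key, then truncate to a short decimal code.
    (The real computation uses the card's EMV cryptogram generation
    key; this HMAC stand-in just shows the shape of the scheme.)"""
    msg = counter.to_bytes(4, "big")
    mac = hmac.new(card_key, msg, hashlib.sha256).digest()
    # Dynamic truncation, as in HOTP (RFC 4226)
    offset = mac[-1] & 0x0F
    code = int.from_bytes(mac[offset:offset + 4], "big") & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)

key = b"\x01" * 16        # the card's secret key (hypothetical value)
print(cap_code(key, 41))  # each counter value yields a fresh code
print(cap_code(key, 42))
```

The point to notice is that anyone who holds the card key and the counter can compute as many future codes as they like – which is exactly the problem with a compromised terminal.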

I can see several problems with this. First, when your wallet gets nicked the thief will be able to read your PIN digits from the calculator – they will be the dirty and worn keys. If you just use one bank card, then the thief’s chance of guessing your PIN in 3 tries has just come down from about 1 in 3000 to about 1 in 10. Second, when you use your card in a Mafia-owned shop (or in a shop whose terminals have been quietly reprogrammed) the bad guys have everything they need to loot your account. Not only that – they can compute a series of CAP codes to give them access in the future, and use your account for wicked purposes such as money laundering. Oh, and once all UK banks (not just Coutts) use one-time passwords, the phishermen will just rewrite their scripts to do real-time man-in-the-middle attacks.
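The arithmetic behind those odds is easy to check. With no information, three guesses at a four-digit PIN succeed with probability about 3 in 10,000; if the worn keys give away which four digits are used (assuming four distinct digits) but not their order, only 4! = 24 candidate PINs remain:

```python
from math import factorial

total_pins = 10**4   # all four-digit PINs, equally likely a priori
tries = 3            # typical lockout after three wrong attempts

p_blind = tries / total_pins   # no information: 3/10000, ~1 in 3333
orderings = factorial(4)       # worn keys reveal the four digits,
p_worn = tries / orderings     # leaving only 4! = 24 orderings: 1 in 8

print(f"blind guess: 1 in {round(1 / p_blind)}")  # 1 in 3333
print(f"worn keys:   1 in {round(1 / p_worn)}")   # 1 in 8
```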

I suspect the idea of trying to have a uniform UK solution to the phishing problem may be misguided. Bankers are herd animals by nature, but herding is a maladaptive response to phishing and other automated attacks. It might be better to go to the other extreme, and have a different interface for each customer. Life would be harder for the phishermen, for example, if I never got an email from the NatWest but only ever from Bernie Smith my ‘relationship banker’ – and if I were clearly instructed that if anyone other than Bernie ever emailed me from the NatWest then it was a scam. But I don’t expect that the banks will start to act rationally on security until the liability issues get fixed.

How to hack your GP's computer system

It’s easy – you just send them a letter on what appears to be Department of Health notepaper telling them to go to a URL, download a program, load it on their practice system, and run it. The program does something with the database, extracts some information and sends it back to whoever wrote it.

I have written to one of the medical magazines explaining why this is not a good way to do things. Doctors would never dream of injecting some random potion they received through the post into their patients – they’d insist on peer review, licensing, and a trustworthy supply chain. So who reviewed the specification of this software? Who evaluated the implementation? Who accepts liability if it corrupts the patient database, leading to a fatal accident?

Were it not for the Computer Misuse Act, I would email 100 practices at random with a version of the above letter, telling them to run my software – which would simply report back who ran it. From talking to a handful of doctors I reckon most of them would fall for it.

No doubt the bad guys will start doing this sort of thing. Eventually doctors, lawyers and everyone else will learn the simple lesson ‘don’t install software’. Until then, this will be the smart way to help yourself to the juicy, meaty bits of a target organisation’s data. So what will we call it? Philleting?

New website on NHS IT problems

At http://nhs-it.info, colleagues and I have collected material on the NHS National Programme for IT, which shows all the classic symptoms of a large project failure in the making. If it goes belly-up, it could be the largest IT disaster ever, and could have grave consequences for healthcare in Britain. With 22 other computer science professors, I wrote to the Health Select Committee urging them to review the project. The Government is dragging its feet, and things seem to be going from bad to worse.

Random isn't always useful

It’s common to think of random numbers as an essential building block in security systems. Cryptographic session keys are chosen at random, then shared with the remote party. Security protocols use “nonces” for “freshness”. Randomness can also slow down information-gathering attacks, although it is seldom a panacea there. However, as George Danezis and I recently explained in “Route Fingerprinting in Anonymous Communications”, randomness can lead to uniqueness — exactly the property you don’t want in an anonymity system.
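A back-of-the-envelope calculation shows why random choices can fingerprint clients. If each client of an anonymity network knows only a random subset of the mix nodes, and builds routes from that subset, the subset itself identifies the client: the number of possible subsets dwarfs the number of clients, so two clients almost never share one. (The figures below are illustrative, not taken from the paper.)

```python
from math import comb

n_nodes = 1000     # mix nodes in the network (assumed figure)
known = 50         # nodes each client happens to have discovered
clients = 100_000  # clients using the network (assumed figure)

# Number of possible "known node" sets - the fingerprint space
subsets = comb(n_nodes, known)

# Birthday-style upper bound on the chance that ANY two clients
# end up with the same known-node set
collision_bound = clients * (clients - 1) / (2 * subsets)

print(f"possible fingerprints: ~10^{len(str(subsets)) - 1}")
print(f"collision probability bound: {collision_bound:.2e}")
```

With these numbers the fingerprint space is around 10^85, so every client’s view of the network is effectively unique — randomness has produced exactly the distinguisher the system was meant to avoid.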

Which services should remain offline?

Yesterday I gave a talk on confidentiality at the EMIS annual conference, and gained yet more insights into Britain’s disaster-prone health computerisation project. Why, for example, will it cost eleven figures, when EMIS writes the software used by 60% of England’s GPs to manage their practices on an annual development budget of only 25m?

On the consent front, it turns out that patients who exercise even the mildest form of opt-out from the national database (having their addresses stop-noted, which is the equivalent of going ex-directory — designed for celebs and people in witness protection) will not be able to use many of the swish new features we’re promised, such as automatic repeat prescriptions. There are concerns that providing a degraded health service to people who tick the privacy box might undermine the validity of consent to information sharing.

On the confidentiality front, people are starting to wrestle with the implications of allowing patients online access to their records. Vulnerable patients — for example, under-age girls who have had pregnancy terminations without telling their parents — could be at risk if they can access sensitive data online. They may be coerced into accessing it, or their passwords may become known to friends and family. So there’s talk of a two-tier online record — in effect introducing multilevel security into record access. Patients would be asked whether they wanted some, all, or none of their records to be available to them online. I don’t think the Department of Health understands the difficulties of multilevel security. I can’t help wondering whether online patient access is needed at all. Very few patients ever exercise their right to view and get a copy of their records; making all records available online seems more and more like a political gimmick to get people to accept the agenda of central data collection.

We don’t seem to have good ways of deciding what services should be kept offline. There’s been much debate about elections, and here’s an interesting case from healthcare. What else will come up, and are there any general principles we’re missing?

A Study on The Value of Location Privacy

There is a Workshop on Privacy in The Electronic Society taking place at the beginning of November. We (George Danezis, Marek Kumpost, Vashek Matyas, and I) will present there the results of A Study on the Value of Location Privacy, which we conducted six months ago.

We questioned a sample of over 1200 people from five EU countries, and used tools from experimental psychology and economics to extract the value they attach to their location data. We compare this value across national groups, gender and technical awareness, and also examine the perceived difference between academic use and commercial exploitation. We provide some analysis of the self-selection bias of such a study, and look further at the valuation of location data over time, using data from another experiment.

The countries we gathered the data from were Germany, Belgium, Greece, the Czech Republic, and the Slovak Republic. As some of these countries have local currencies, we recalculated the values of bids in the different countries using a “value of money” coefficient, computed as a ratio of average salaries and price levels in the particular countries — this data was taken from Eurostat statistics.
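As a sketch of that normalisation, with made-up figures standing in for the Eurostat data (the numbers, the base-country choice and the direction of the division are my assumptions, not the paper’s actual methodology):

```python
# Hypothetical Eurostat-style figures (illustrative, not the real data):
# average monthly salary and comparative price level per country.
stats = {
    "Germany":        {"salary": 2500, "price_level": 105},
    "Czech Republic": {"salary":  700, "price_level":  60},
}

def money_coefficient(country: str, base: str = "Germany") -> float:
    """Value-of-money coefficient: purchasing power of a unit of
    money in `country`, relative to the base country."""
    c, b = stats[country], stats[base]
    return (c["salary"] / c["price_level"]) / (b["salary"] / b["price_level"])

def normalise_bid(bid: float, country: str) -> float:
    # Divide the local bid by the coefficient to express it in
    # base-country terms, making bids comparable across countries.
    return bid / money_coefficient(country)

print(normalise_bid(10.0, "Czech Republic"))
```

A bid of a given nominal size from a lower-salary, lower-price country represents more purchasing power, so it is scaled up when compared with bids from the base country.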

We gathered bids for three auctions, or scenarios. The first and second bids were for one month’s tracking: the former data were to be used for academic purposes only, the latter for commercial purposes. The third bids were for a scenario where participants agreed to year-long tracking, with the data free for commercial exploitation. Let us start with the first bids.

Differences among Countries

The distributions of the first bids are shown in the following plot. Although there are differences between all the nations, the Greek bids are well beyond our expectations.

Distributions of bids in the first auction round.
