Category Archives: News coverage

Media reports that may interest you

Snoopers’ Charter 2.0

I have been invited to give evidence in Parliament this afternoon at 4.30, to the Joint Select Committee on the Investigatory Powers Bill.

This follows evidence I gave on the technical aspects of the bill to the Science and Technology Committee on November 10th; see video and documents. Of particular interest may be comments by my Cambridge colleague Richard Clayton; an analysis by my UCL colleague George Danezis; the ORG wiki; and finally the text of the bill itself.

While the USA has reacted to the Snowden revelations by restraining the NSA in various ways, the UK reaction appears to be the opposite. Do we really want to follow countries like China, Russia and Kazakhstan, and take the risk that we’ll tip countries like Brazil and India into following our lead? If the Internet fragments into national islands, that will not only do grave harm to the world economy, but make life a lot harder for GCHQ too.

Internet of Bad Things

A lot of people are starting to ask about the security and privacy implications of the “Internet of Things”. Once there’s software in everything, what will go wrong? We’ve seen a botnet recruiting CCTV cameras, and a former Director of GCHQ recently told a parliamentary committee that it might be convenient if a suspect’s car could be infected with malware that would cause it to continually report its GPS position. (The new Investigatory Powers Bill will give the police and the spooks the power to hack any device they want.)

So here is the video of a talk I gave on The Internet of Bad Things to the Virus Bulletin conference. As the devices around us become smarter they will become less loyal, and it’s not just about malware (whether written by cops or by crooks). We can expect all sorts of novel business models, many of them exploitative, as well as some downright dishonesty: the recent Volkswagen scandal won’t be the last.

But dealing with pervasive malware in everything will demand new approaches. Our approach to the Internet of Bad Things includes our new Cambridge Cybercrime Centre, which will let us monitor bad things online at the kind of scale that will be required.

Award-winning case history of the care.data health privacy scandal

Each year we divide our masters of public policy students into teams and get them to write case studies of public policy failures. The winning team this year wrote a case study of the care.data fiasco. The UK government collected personal health information on tens of millions of people who had had hospital treatment in England and then sold it off to researchers, drug companies and even marketing firms, with only a token gesture of anonymisation. In practice patients were easy to identify. The resulting scandal stalled plans to centralise GP data as well, at least for a while.
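For readers wondering how "anonymised" records unravel so easily, here is a minimal sketch of a linkage attack. The file names and columns are hypothetical, but the mechanism is the one that defeats token pseudonymisation: quasi-identifiers such as birth date, sex and postcode, left in the data for research utility, can simply be joined against any identified dataset the recipient already holds.

```python
# Minimal sketch of a linkage attack on "anonymised" hospital records.
# File names and column names are hypothetical; the point is that
# stripping names is not enough when quasi-identifiers survive.
import pandas as pd

# Released "anonymised" episodes: names removed, quasi-identifiers kept.
episodes = pd.read_csv("hospital_episodes.csv")   # dob, sex, postcode, diagnosis

# Any identified dataset the attacker holds, e.g. a marketing list
# or the electoral roll.
identified = pd.read_csv("electoral_roll.csv")    # name, dob, sex, postcode

# A plain join on the quasi-identifiers re-identifies every patient whose
# (dob, sex, postcode) combination is unique - which, with full postcodes,
# is most of the population.
reidentified = episodes.merge(identified, on=["dob", "sex", "postcode"])
print(f"{len(reidentified)} episodes re-identified")
```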

Congratulations to Lizzie Presser, Maia Hruskova, Helen Rowbottom and Jesse Kancir, who tell the story of how mismanagement, conflicts and miscommunication led to a failure of patient privacy on an industrial scale, and discuss the lessons that might be learned. Their case study has just appeared today in Technology Science, a new open-access journal for people studying conflicts that arise between technology and society. LBT readers will recall several posts reporting the problem, but it’s great to have a proper, peer-reviewed case study that we can give to future generations of students. (Incidentally, the previous year’s winning case study was on a related topic, the failure of the NHS National Programme for IT.)

FCA view on unauthorised transactions

Yesterday the Financial Conduct Authority (the UK bank regulator) issued a report on Fair treatment for consumers who suffer unauthorised transactions. This is an issue in which we have an interest, as fraud victims regularly come to us after being turned away by their bank and by the financial ombudsman service. Yet the FCA have found that everything is hunky dory, and conclude “we do not believe that further thematic work is required at this stage”.

One of the things the FCA asked their consultants was whether there's any evidence that claims are rejected on the sole basis that a PIN was used. The consultants didn't want to rely on existing work but instead surveyed a nationally representative sample of 948 people and found that 16% had a transaction dispute in the last year. Of these disputes, 37% were MOTO (mail order / telephone order), 22% cancelled future-dated payments, 15% ATM cash, 13% shop transactions and 13% lump sums from a bank account. Of customers who complained, 43% were offered their money back spontaneously and a further 41% had to ask; in the end a total of 68% got refunds after varying periods of time. In all, 7% (15 victims) had their claim declined, most because the bank said the transaction was “authorised” or followed a “contract with merchant”, and 2 for chip and PIN (one of them an ATM transaction; the other admitted sharing their PIN). Twelve of these 15 considered the result unfair. These figures are entirely consistent with what we learn from the British Crime Survey and elsewhere: two million UK victims a year; while most get their money back, many don't; and a hard core of perhaps a few tens of thousands end up feeling that their bank has screwed them.

The case studies profiled in the consultants’ paper were of glowing happy people who got their money back; the 12 sad losers were not profiled, and the consultants concluded that “Customers might be being denied refunds on the sole basis that Chip and PIN were used … we found little evidence of this” (p 49) and went on to remark helpfully that some customers admitted sharing their PINs and felt OK lying about this. The FCA happily paraphrases this as “We also did not find any evidence of firms holding customers liable for unauthorised transactions solely on the basis that the PIN was used to make the transaction” (main report, p 13, 3.25).

According to recent news reports, the former head of the FCA, Martin Wheatley, was sacked by George Osborne for being too harsh on the banks.

Crypto Wars 2.0

Today we unveil a major report on whether law enforcement and intelligence agencies should have exceptional access to cryptographic keys and to our computer and communications data generally. David Cameron has called for this, as have US law enforcement leaders such as FBI Director James Comey.

This policy repeats a mistake of the 1990s. The Clinton administration tried for years to seize control of civilian cryptography, first with the Clipper Chip, and then with various proposals for ‘key escrow’ or ‘trusted third party encryption’. Back then, a group of experts on cryptography and computer security got together to explain why this was a bad idea. We have now reconvened in response to the attempt by Cameron and Comey to resuscitate the old dead horse of the 1990s.

Our report, Keys Under Doormats: Mandating insecurity by requiring government access to all data and communications, is timed to set the stage for a Wednesday hearing of the Senate Judiciary Committee at which Mr Comey will present his proposals. The reply to Comey will come from Peter Swire, who was on the other side twenty years ago (he was a Clinton staffer) and has written a briefing on the first crypto war here. Peter was recently on President Obama’s NSA review group. He argues that the real way to fix the problems complained of is to fix the mutual legal assistance process – which is also my own view.

Our report is also highly relevant to the new ‘Snoopers’ Charter’ that Home Secretary Theresa May has promised to put before parliament this fall. Mrs May has made clear she wants access to everything.

However, this is both wrong in principle and unworkable in practice. Building back doors into all computer and communication systems is against most of the principles of security engineering, and it is also against the principles of human rights. Our right to privacy, set out in Article 8 of the European Convention on Human Rights, can only be overridden by mechanisms that meet three tests. First, they must be set out in law, with sufficient clarity for their effects to be foreseeable; second, they must be proportionate; third, they must be necessary in a democratic society. As our report makes clear, universal exceptional access will fail all these tests by a mile.

For more, see the New York Times.

Thinking of selling your old phone? Watch out!

Today we unveil two papers describing serious and widespread vulnerabilities in Android mobile phones. The first presents a Security Analysis of Factory Resets. Now that hundreds of millions of people buy and sell smartphones secondhand and use them for everything from banking to dating, it’s important to be able to sanitize your phone. You need to clean it when you buy it, so you don’t get caught by malware; and even more when you sell it, so you don’t give away your bank credentials or other personal information. So does the factory reset function actually work? We bought a couple of dozen second-hand Android phones and tested them to find out.

The news is not at all good. We were able to retrieve the Google master cookie from the great majority of phones, which means that we could have logged on to the previous owner’s Gmail account. The reasons for failure are complex; new phones are generally better than old ones, and Google’s own-brand phones are better than the OEM offerings. However, the vendors need to do a fair bit of work, and users need to take a fair amount of care.
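To give a flavour of how such remnants can be found, here is a hedged sketch of the kind of scan one might run over a raw dump of a phone’s data partition after a reset. The image path and the byte patterns are illustrative assumptions for this sketch, not the exact signatures used in the paper.

```python
# Hedged sketch: scan a raw flash dump, taken after a factory reset,
# for remnants of the previous owner's data. The dump path and the
# patterns below are illustrative assumptions.
import mmap

PATTERNS = [
    b"oauth",        # OAuth-style tokens (e.g. a Google master token)
    b"accounts.db",  # remnants of Android's accounts database
    b"@gmail.com",   # email addresses identifying the previous owner
]

with open("userdata_partition.img", "rb") as f:
    # Memory-map the image so multi-gigabyte dumps need not be read into RAM.
    dump = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
    for pattern in PATTERNS:
        offset = dump.find(pattern)
        while offset != -1:
            # Show a little context around each hit for manual inspection.
            print(f"{pattern!r} at offset {offset:#x}: "
                  f"{dump[offset:offset + 64]!r}")
            offset = dump.find(pattern, offset + 1)
```

Any hit for a live credential token in a "wiped" image means the reset deleted file-system metadata without actually erasing the underlying flash blocks.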

Attacks on a sold phone that could not be properly sanitized are one example of what we call a “user-not-present” attack. Another is when your phone is stolen. Many security software vendors offer a facility to lock or wipe your phone remotely when this happens, and it’s a standard feature with mobile antivirus products. Do these ‘solutions’ work?

You guessed it. Antivirus software that relies on a faulty factory reset can only go so far, and there’s only so much you can do with a user process. The AV vendors have struggled with a number of design tradeoffs, but the results are not that impressive. See Security Analysis of Consumer-Grade Anti-Theft Solutions Provided by Android Mobile Anti-Virus Apps for the gory details. These failings mean that staff at firms which handle lots of second-hand phones (whether lost, stolen, sold or given to charity) could launch some truly industrial-scale attacks. These papers appear today at the Mobile Security Technology workshop at IEEE Security and Privacy.

Another scandal about forensics

The FBI overstated forensic hair matches in nearly all trials up to 2000: 26 of its 28 examiners overstated matches in ways that favoured prosecutors in more than 95 percent of the 268 trials reviewed so far. Thirty-two defendants were sentenced to death, of whom 14 were executed or died in prison.

In the District of Columbia, the only jurisdiction where defenders and prosecutors have re-investigated all FBI hair convictions, three of seven defendants whose trials included flawed FBI testimony have been exonerated through DNA testing since 2009, and courts have cleared two more. All five served 20 to 30 years in prison for rape or murder. The FBI examiners in question also taught 500 to 1,000 state and local crime lab analysts to testify in the same ways.

Systematically flawed forensic evidence should be familiar enough to readers of this blog. In four previous posts here I’ve described problems with the curfew tags that are used to monitor the movements of parolees and terrorism suspects in the UK. We have also written extensively on the unreliability of card payment evidence, particularly in banking disputes. However, payment evidence can also be relevant to serious criminal trials, of which the most shocking cases are probably those described here and here. Hundreds, perhaps thousands, of men were arrested after being wrongly suspected of buying indecent images of children, when in fact they were victims of credit card fraud. Having been an expert witness in one of those cases, I wrote to the then DPP Keir Starmer on his appointment, asking him to open a formal inquiry into the police failure to understand credit card fraud, and to review cases as appropriate. My letter was ignored.

The Washington Post article argues cogently that the USA lacks, and needs, a mechanism to deal with systematic failures of the justice system, particularly when these are related to its inability to cope with technology. The same holds here too. In addition to the hundreds of men wrongly arrested for child porn offences in Operation Ore, there have been over two hundred prosecutions for curfew tag tampering, no doubt with evidence similar to that offered in cases where we secured acquittals. There have been scandals in the past over DNA and fingerprints, as I describe in my book. How many more scandals are waiting to break? And as everything goes online, digital evidence will play an ever larger role, leading to more systematic failures in future. How should we try to forestall them?

Whodunnit? Fascinating Forensics by BBC’s Naked Scientists

BBC’s Naked Scientists recently did an hour-long show with a live audience about forensic science, during which they solved a (fictitious) murder with the help of six forensic scientists and practitioners.

Chris Smith and Ginny Smith covered the forensic process from crime scene to court room and discussed all the evidence in between, including how to retrieve forensic evidence from a crime scene, digital forensics and the (lack of) randomness of numbers, toxicology, pathology, eye-witness testimony and our work on motion-based lie detection.

You can find the podcast here.

Media coverage of our “to freeze or not to freeze” paper

On the 5th of January this year we presented a paper on the automatic detection of deception based on full-body movements at HICSS (Hawaii), which we blogged about here at LBT. We measured the movements of truth tellers and liars using full-body motion capture suits and found that liars move more than truth tellers; when combined with interviewing techniques designed to increase the cognitive load of liars, but not of truth tellers, liars moved almost twice as much as truth tellers. These results indicate that absolute movement, when measured automatically, may be a reliable cue to deceit.

We are now aiming to find out whether this increase in body movement when lying is stable across situations and people. At the same time, we are developing two lines of technology that will make the method more usable in practice. First, we are building software to analyse behaviour in real time, that is, during the interview rather than afterwards. Second, we are investigating remote ways to analyse behaviour, so that interviewees will not have to wear a body suit when being interviewed. We will keep you updated on new developments.
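For the curious, here is a minimal sketch of what an automatic “absolute movement” measure can look like, assuming motion-capture output as an array of 3D joint positions per frame. The joint count and the toy data are hypothetical, and the paper’s exact metric may differ; this just illustrates the idea of summing how far the whole body moves over an interview.

```python
# Hedged sketch of an "absolute movement" score from motion-capture data.
# frames: array of shape (n_frames, n_joints, 3) holding each tracked
# joint's 3D position in each frame. Joint count and data are toy values.
import numpy as np

def total_movement(frames: np.ndarray) -> float:
    # Displacement of every joint between consecutive frames...
    deltas = np.diff(frames, axis=0)            # (n_frames - 1, n_joints, 3)
    distances = np.linalg.norm(deltas, axis=2)  # Euclidean distance per joint
    # ...summed over all joints and all frames.
    return float(distances.sum())

# Toy comparison: a hypothetical liar fidgeting twice as much, per frame,
# as a hypothetical truth teller (17 joints, 1000 frames each).
rng = np.random.default_rng(0)
truth_teller = rng.normal(scale=0.01, size=(1000, 17, 3)).cumsum(axis=0)
liar = rng.normal(scale=0.02, size=(1000, 17, 3)).cumsum(axis=0)
print(total_movement(truth_teller), total_movement(liar))
```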

In the meantime, we received quite a lot of national and international media attention. Here is some TV and radio coverage of our work by Dailymotion, Fox (US), BBC World radio, Zoomin TV (NL), WNL Vandaag de dag (NL, part 2, starts at 5:20min), RTL Boulevard (NL), Radio 2 (NL), BNR (NL) and Radio 538 (NL). Our work was also covered by newspapers, websites and blogs, including the Guardian, the Register, the Telegraph, the Telegraph (incl. polygraph), the Daily Mail, Mail Online, Cambridge News, King’s College Cambridge, Lancaster University, Security Lancaster, Bruce Schneier’s blog, International Business Times, RT, PC World, PC Advisor, Engadget, News Nation, Techie News, ABP Live, TweakTown, Computer World, MyScience, King World News, La Celosia (Spanish), de Morgen (BE), NRC (NL), Algemeen Dagblad (NL), de Volkskrant (NL), KIJK (NL), and RTV Utrecht (NL).

Can we have medical privacy, cloud computing and genomics all at the same time?

Today sees the publication of a report I helped to write for the Nuffield Bioethics Council on what happens to medical ethics in a world of cloud-based medical records and pervasive genomics.

As the information we gave to our doctors in private to help them treat us is now collected and treated as an industrial raw material, there has been scandal after scandal. From failures of anonymisation through unethical sales to the care.data catastrophe, things just seem to get worse. Where is it all going, and what must a medical data user do to behave ethically?

We put forward four principles. First, respect persons; do not treat their confidential data as if it were coal or bauxite. Second, respect established human-rights and data-protection law, rather than trying to find ways round it. Third, consult people who’ll be affected or who have morally relevant interests. And fourth, tell them what you’ve done – including errors and security breaches.

The collection, linking and use of data in biomedical research and health care: ethical issues took over a year to write. Our working group came from the medical profession, academics, insurers and drug companies. We had lots of arguments. But it taught us a lot, and we hope it will lead to a more informed debate on some very important issues. And since medicine is the canary in the mine, we hope that the privacy lessons can be of value elsewhere – from consumer data to law enforcement and human rights.