Virgin Money sends email helping phishers

It’s not unusual for banks to send emails which are confusingly similar to phishing, but this recent one I received from Virgin Money is exceptionally bad. It tells customers that the bank (Northern Rock) is changing domain names from their usual one (northernrock.co.uk) to virginmoney.com, and that customers should use their usual security credentials to log in at the new domain name. Mail clients will often helpfully turn the virginmoney.com into a clickable link.

This message is exactly what phishers would like customers to fall for. While this email was legitimate (albeit very unwise), a criminal could follow up with an email saying that savings customers should access their account at virginsavings.net (which is currently available for registration). Virgin Money have trained their customers to accept such emails as legitimate, which is a very dangerous lesson to teach.

It would have been safer not to do the rebranding, but if that’s considered essential for commercial reasons, then customers should have been told to continue accessing the site at their usual domain name and been redirected from there (via HTTPS) to the new site. It would mean keeping hold of the Northern Rock domain names for the foreseeable future, but that is almost certainly what Virgin Money are planning anyway.
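
Just as an illustration (a minimal sketch, not Virgin Money’s actual setup, and the virginmoney.com target is simply the obvious assumption), the safer approach amounts to keeping a server answering on the old domain that sends every request on to the new HTTPS site:

# Minimal sketch: keep serving the old domain and answer every request
# with a permanent redirect to the new HTTPS site, so customers never
# need to type (or trust) an unfamiliar domain name taken from an email.
from http.server import BaseHTTPRequestHandler, HTTPServer

NEW_SITE = "https://virginmoney.com"  # assumed destination, for illustration only

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(301)                        # moved permanently
        self.send_header("Location", NEW_SITE + self.path)
        self.end_headers()

if __name__ == "__main__":
    # In practice this would sit behind TLS on northernrock.co.uk; plain
    # HTTP on port 8080 just keeps the sketch self-contained and runnable.
    HTTPServer(("", 8080), RedirectHandler).serve_forever()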



Job opening: post-doctoral researcher in usable security

(post UPDATED with new job opening)

I am delighted to announce a job opening in the Cambridge Security Group. Thanks to generous funding from the European Research Council I am in a position to recruit several post-doc research associates to work with me on the Pico project, whose ambitious aim is ultimately to liberate the world from the annoyance and insecurity of passwords, which everyone hates.

In previous posts I hinted at why it’s going to be quite difficult (Oakland paper) and what my vision for Pico is (SPW paper, USENIX invited talk). What I want to do, now that I have the investment to back my idea, is to assemble an interdisciplinary team of the best possible people, with backgrounds not just in security and software but crucially in psychology, interaction design and embedded hardware. We’ll design and build a prototype, build a batch of them and then have real people (not geeks) try them out and tell us why they’re all wrong. And then design and build a better one and try it out again. And iterate as necessary, always driven by what works for real humans, not technologists. I expect that the final Pico will be rather different, and a lot better, than the one I envisaged in 2011. Oh, and by the way, to encourage universal uptake, I already promised I won’t patent any of it.

As I wrote in the papers above, I don’t expect we’ll see the end of passwords anytime soon, nor that Pico will displace passwords as soon as it exists. But I do want to be ready with a fully worked out solution for when we finally collectively decide that we’ve had enough.

Imagine we could restart from zero and do things right. Have you got a relevant PhD, or are you about to get one? Are you keen to use it to change the world for the better? Are you the best of the best, and do you have the track record to prove it? Are you willing to be the first member of my brilliant interdisciplinary team? Are you ready for the intellectually challenging and stimulating environment of one of the top research universities in the world? Are you ready to be given your own real challenges and responsibilities, and the authority to be in charge of your work? Then great, I want to hear from you and here’s what you need to do to apply (post UPDATED with new opening).

(By the way: I’m off to Norway next week for passwords^12, a lively 3-day conference organized by Per Thorsheim and totally devoted to nothing else than passwords.)

Since I was passing…

When you register an Internet domain name in “.com” (and some other top level domains) you have the choice of using a “privacy” or “proxy” service rather than having your name and contact details recorded within the “whois” systems that provide a public record of domain name ownership.

A privacy service will record that you are the owner of the domain name but your contact details will be hidden. A proxy service will hide your identity as well.

The privacy-conscious use these services to avoid disclosing information about themselves (and to avoid the trivial amount of spam sent to contact email addresses). The cyber criminals use these services as well — so that it is hard for the Good Guys to link domains into groups and hard for them to argue (in an Al Capone tax-evasion manner) that “you may not understand this criminality or be convinced by this evidence, but just take a look at the invalid details given when registering the domain”.

I’m currently working on a project for ICANN that will measure the prevalence of privacy/proxy usage by different types of cybercriminals… of which more at another time — because at present I’m having a holiday! I went to Palm Cove (just north of Cairns) to see the recent total solar eclipse… and my holiday involves a short(ish) drive south to Melbourne

… and since I was passing Nobby Beach (just south of Brisbane) I took the opportunity to peek at the home of one of the larger Internet domain name proxy services:
[Photo: Richard points at PrivacyProtect.org’s PO Box]
whose details appear in whois records like this:

PrivacyProtect.org
Domain Admin (contact@privacyprotect.org)
ID#10760, PO Box 16
Note - All Postal Mails Rejected, visit Privacyprotect.org
Nobby Beach
null,QLD 4218
AU
Tel. +45.36946676
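
The “linking domains into groups” mentioned above needs little more than clustering whois records by the contact email address they contain; here is a rough, deliberately naive sketch (my illustration only — it assumes the raw whois text has already been fetched by some other means):

# Illustrative sketch: cluster domains by the contact email found in their
# whois text. Proxy-protected registrations collapse into a few huge clusters.
import re
from collections import defaultdict

EMAIL_RE = re.compile(r"[\w.+-]+@[\w.-]+\.\w+")

def contact_email(whois_text):
    """Return the first email address found in a whois record, or ''."""
    m = EMAIL_RE.search(whois_text)
    return m.group(0).lower() if m else ""

def cluster_by_contact(whois_records):
    """Map contact email -> list of domains sharing it.

    whois_records is a dict of domain -> raw whois text, assumed to have
    been collected separately."""
    clusters = defaultdict(list)
    for domain, text in whois_records.items():
        clusters[contact_email(text)].append(domain)
    return clusters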

There are at present (according to domainnametools.com) some 2,584,758 domains associated with contact@privacyprotect.org. You can see why they don’t want any postal mail, because their PO box is merely a standard size:
[Photo: close-up of PO Box #16]
The reality of course is that you should contact Privacy Protection by email or their website… but then you’d miss out on getting to look at some of the nearby beaches!
[Photo: the beach at Surfer’s Paradise]

Will the Information Commissioner be consistent?

This afternoon, the Information Commissioner will unveil a code of practice for data anonymisation. His office is under pressure; as I described back in August, Big Pharma wants all our medical records and has persuaded the Prime Minister it should have access so long as our names and addresses are removed. The theory is that a scientist doing research into cardiology (for example) could have access to the anonymised records of all heart patients.

The ICO’s blog suggests that he will consider data to be anonymous, and thus no longer private, if they cannot be re-identified by reference to any other data already in the public domain. But this is trickier than you might think. For example, Tim Gowers just revealed on his excellent blog that he had an ablation procedure for atrial fibrillation a couple of weeks ago. So if our researcher can search for all males aged 45-54 who had such a procedure on November 6th 2012 he can pull Tim’s record, including everything that Tim intended to keep private. Even with a central cardiology register, it’s hard to think of a practical mechanism that could block Tim’s record as soon as he made that blog post. And now that researchers are starting to carry round millions of people’s records on their laptops, protecting privacy is getting really hard.
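
To make the risk concrete, here is a toy illustration (with invented data, not anything drawn from a real register): stripping out names does nothing if publicly volunteered facts are enough to pick out a single row.

# Toy example of re-identification: an "anonymised" extract with names
# removed can still be narrowed to one record using publicly known facts.
records = [
    # (age band, sex, procedure, date) -- invented data for illustration
    {"age_band": "45-54", "sex": "M", "procedure": "ablation",    "date": "2012-11-06"},
    {"age_band": "45-54", "sex": "M", "procedure": "angioplasty", "date": "2012-11-06"},
    {"age_band": "65-74", "sex": "F", "procedure": "ablation",    "date": "2012-11-06"},
]

# Facts volunteered on a public blog serve as the query key.
matches = [r for r in records
           if r["age_band"] == "45-54" and r["sex"] == "M"
           and r["procedure"] == "ablation" and r["date"] == "2012-11-06"]

print(len(matches))  # 1 -> the "anonymous" record is effectively identified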

In his role as data protection regulator, the Commissioner has been eager to disregard the risk of re-identification from private information. Yet Maurice Frankel of the Campaign for Freedom of Information has pointed out to me that he regularly applies a very different rule in Freedom of Information cases, including one involving the University of Cambridge. There, he refused a freedom of information request about university dismissals on the grounds that “friends, former colleagues, or acquaintances of a dismissed person may, through their contact with that person, know something of the circumstances of that person’s departure” (see para 30).

So I will be curious to see this afternoon whether the Commissioner places greater value on the consistency of his legal rulings, or their convenience to the powerful.

ACM Queue interview on research into the hardware-software interface

ACM Queue has posted my August 2012 interview on research into the hardware-software interface. We discuss the importance of a whole-stack view in addressing contemporary application security problems, which are often grounded in how we represent and execute software over lower-level substrates. We need to consider CPU design, operating systems, programming languages, applications, and formal methods — which requires building collaborations that span traditional silos in computer science research. I also consider the impact of open source on software security research methodology, and how we might extend those ideas to CPU research. A motivation for this investigation is our experimental CHERI hybrid capability processor, part of the CTSRD Project, a long-term research collaboration between the security, operating systems, and computer architecture groups at the University of Cambridge Computer Laboratory and the systems and formal methods groups at the SRI International Computer Science Laboratory.

GetCash from NatWest

It has been four or five months since NatWest launched a new function in its mobile phone app – GetCash. The goal was to allow customers to withdraw cash from NatWest’s ATMs without a debit or credit card. The app receives a six-digit code that customers can type into an ATM to get as much as £100 at a time. I am not sure how useful it is, as I personally forget my mobile phone more often than my wallet, but it appears that some crooks found it very useful indeed.

News that the service had been suspended broke on the 6th of October, and it was covered on BBC Breakfast today. I have several thoughts related to this incident.

Who will screen the screeners?

Last time I flew through Luton airport it was a Sunday morning, and I went up to screening with a copy of the Sunday Times in my hand; it’s non-metallic after all. The guard by the portal asked me to put it in the tray with my bag and jacket, and I did so. But when the tray came out, the newspaper wasn’t there. I approached the guard and complained. He tried to dismiss me but I was politely insistent. He spoke to the lady sitting at the screen; she picked up something with a guilty look sideways at me, and a few seconds later my paper came down the rollers. As I left the screening area, there were two women police constables, and I wondered whether I should report the attempted theft of a newspaper. As my flight was leaving in less than an hour, I walked on by. But who will screen the screeners?

This morning I once more flew through Luton, and I started to suspect it wouldn’t be the airport’s management. This time the guard took exception to the size of the clear plastic bag holding my toothpaste, mouthwash and deodorant, showing me with glee that it was half a centimetre wider than the official outline on a card he had right to hand. I should mention that I was using a Sainsbury’s freezer bag, a standard item in our kitchen which we’ve used for travel for years. No matter; the guard ordered me to buy an approved one for a pound from a slot machine placed conveniently beside the belt. (And we thought Ryanair’s threat to charge us a pound to use the loo was just a marketing gimmick.) But what sort of signal do you give to low-wage security staff if the airport merely sees security as an excuse to shake down the public? And after I got through to the lounge and tried to go online, I found that the old Openzone service (which charged by the minute) is no longer on offer; instead Luton Airport now demands five pounds for an hour’s access. So I’m writing this blog post from Amsterdam, and next time I’ll probably fly from Stansted.

Perhaps one of these days I’ll write a paper on “Why Security Usability is Hard”. Meanwhile, if anyone reading this is near Amsterdam on Monday, may I recommend the Amsterdam Privacy Conference? Many interesting people will be talking about the ways in which governments bother us. (I’m talking about how the UK government is trying to nobble the Data Protection Regulation in order to undermine health privacy.)

Plaintext Password Reminders

There was a public outcry, followed by the ICO “making enquiries”, when Troy Hunt published a post about Tesco’s plaintext password reminders exactly a month ago.

I wanted to use the reference for a text I was writing last week, when someone asked me about online accounts at Companies House. At that moment I said to myself: wait a second, Companies House sends plaintext reminders as well. How strange. I sent ComputerWorld a link to a short post about it, and they in turn managed to get a statement from Companies House that includes:

“… although it is [Companies House] certified to the ISO 27001 standard and adheres to the government’s Security Policy Framework, it will carry out a review of its systems in order to establish whether there is a threat to companies’ confidential information.”
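
The underlying point is that a site able to email you your password must be keeping it in a recoverable form. A properly designed system stores only a salted hash, so the best it can ever offer is a reset; here is a minimal sketch using the Python standard library (my illustration, not anything Companies House or Tesco actually run):

# Sketch: store a salted hash, never the password itself, so a
# "password reminder" email is simply impossible to generate.
import hashlib, hmac, os

def hash_password(password):
    salt = os.urandom(16)
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt, digest            # the plaintext is never stored

def verify_password(password, salt, digest):
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison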

The Perils of Smart Metering

Alex Henney and I have decided to publish a paper on smart metering that we prepared in February for the Cabinet Office and for ministers. DECC is running a smart metering project that is supposed to save energy by replacing all Britain’s gas and electricity meters with computerised ones by 2019, and to cost only £11bn. Yet the meters will be controlled by the utilities, whose interest is to maximise sales volumes, so there is no realistic prospect that the meters will save energy. What’s more, smart metering already exhibits all the classic symptoms of a failed public-sector IT project.

The paper we release today describes how, when Ed Miliband was Secretary of State, DECC cooked the books to make the project appear economically worthwhile. It then avoided the control procedures that are mandatory for large IT procurements by pretending it was not an IT project but an engineering project. We have already written on the security economics of smart meters, their technical security, the privacy aspects and why the project is failing.

We managed to secure a Cabinet Office review of the project which came up with a red traffic light – a recommendation that the project be abandoned. However DECC dug its heels in and the project appears to be going ahead. Hey, we did our best. The failure should be evident in time for the next election; just remember, you read it here first.

Chip and Skim: cloning EMV cards with the pre-play attack

November last, on the Eurostar back from Paris, something struck me as I looked at the logs of ATM withdrawals disputed by Alex Gambin, a customer of HSBC in Malta. Comparing four grainy log pages on a tiny phone screen, I had to scroll away from the transaction data to see the page numbers, so I couldn’t take in the big picture in one go. I differentiated pages instead using the EMV Unpredictable Number field – a 32-bit field that’s supposed to be unique to each transaction. I soon got muddled up… it turned out that the unpredictable numbers… well… weren’t. Each shared 17 bits in common and the remaining 15 looked at first glance like a counter. The numbers are tabulated as follows:

F1246E04
F1241354
F1244328
F1247348

And with that the ball started rolling on an exciting direction of research that’s kept us busy the last nine months. You see, an EMV payment card authenticates itself with a MAC of transaction data, for which the freshly generated component is the unpredictable number (UN). If you can predict it, you can record everything you need from momentary access to a chip card to play it back and impersonate the card at a future date and location. You can as good as clone the chip. It’s called a “pre-play” attack. Just like most vulnerabilities we find these days, some in industry already knew about it but covered it up; we have indications the crooks know about this too, and we believe it explains a good portion of the unsolved phantom withdrawal cases reported to us for which we had, until recently, no explanation.
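
As a rough illustration of the sanity check involved (a sketch, not our actual analysis tooling), one can simply ask how many leading bits a batch of logged UNs have in common; for genuinely random 32-bit values the answer should be close to zero, whereas the figures above share 17:

# Sketch: how many leading bits do the logged "unpredictable" numbers share?
uns = [0xF1246E04, 0xF1241354, 0xF1244328, 0xF1247348]

def common_prefix_bits(values, width=32):
    shared = 0
    for bit in range(width - 1, -1, -1):              # walk from the top bit down
        if len({(v >> bit) & 1 for v in values}) > 1:
            break                                     # first bit where samples disagree
        shared += 1
    return shared

print(common_prefix_bits(uns))   # 17 for the values tabulated above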

Mike Bond, Omar Choudary, Steven J. Murdoch, Sergei Skorobogatov, and Ross Anderson wrote a paper on the research, and Steven is presenting our work as keynote speaker at Cryptographic Hardware and Embedded Systems (CHES) 2012, in Leuven, Belgium. We discovered that the significance of these numbers went far beyond this one case.
