Category Archives: Politics

Operational security failure

A shocking article appeared yesterday on the BMJ website. It recounts how auditors called 45 GP surgeries asking for personal information about 51 patients. In only one case were they asked to verify their identity; the attack succeeded against the other 50 patients.

This is an old problem. In 1996, when I was advising the BMA on clinical system safety and privacy, we trained the staff at one health authority to detect false-pretext phone calls, and they found 30 a week. We reported this to the Department of Health, hoping it would introduce some operational security measures nationwide; instead the Department was furious at us for treading on its turf and ordered the health authority to stop cooperating (the story's told in my book). More recently, at a conference early last year, I confronted the NHS chief executive, David Nicholson, and the patient tsar, Harry Cayton, with the issue; they claimed there wasn't a problem nowadays, now that people have all these computers.

What will it take to get the Department of Health to care about patient privacy? Lack of confidentiality already costs lives, albeit indirectly. Will it require a really high-profile fatality?

Justice, in one case at least

This morning Jane Badger was acquitted of fraud at Birmingham Crown Court. The judge found there was no case to answer.

Her case was remarkably similar to that of John Munden, about whom I wrote here (and in my book here). Like John, she worked for the police; like John, she complained to a bank about some ATM debits on her bank statement that she did not recognise; like John, she was arrested and suspended from work; like John, she faced a bank (in her case, Egg) claiming that as its systems were secure, she must be trying to defraud them; and like John, she faced police expert evidence that was technically illiterate and just took the bank’s claims as gospel.

In her case, Egg said that the transactions must have been done with the card issued to her rather than with a card clone, and to back this up they produced a printout allocating a transaction code of 05 to each withdrawal, plus a rubric stating that 05 meant “Integrated Circuit Card read – CVV data reliable”, followed in brackets by the explanatory phrase “(chip read)”. This seemed strange. If the chip of an EMV card is read, the reader will verify the signature on the certificate; if its magnetic strip is read (perhaps because the chip is unserviceable) then the bank will check the CVV, which is there to prevent magnetic-strip forgery. The question was therefore whether the dash in the above rubric meant “OR”, as the technology would suggest, or “AND”, as the bank and the CPS hoped. The technology is explained in more detail in our recent submission to the Hunt Review of the Financial Services Ombudsman (see below). I therefore advised the defence to apply for the court to order Egg to produce the actual transaction logs and supporting material, so that we could verify the transaction certificates, if any.
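The distinction matters because the two read paths are verified quite differently, so a single code that conflates them tells the court nothing. As a rough sketch (the key, field names and helper functions here are hypothetical illustrations, not Egg's records or the actual EMV formats), a chip transaction is vouched for by a cryptogram the card itself computes, while a stripe fallback is vouched for only by a static CVV:

```python
import hmac
import hashlib

# Hypothetical issuer key -- purely illustrative, not the real EMV key hierarchy.
ISSUER_KEY = b"issuer-secret"

def expected_mac(payload: bytes) -> str:
    """What the issuer would expect the card's chip to have computed."""
    return hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()

def verify_withdrawal(record: dict) -> bool:
    """Model the two distinct paths that the '05' rubric conflates."""
    if record["entry_mode"] == "chip":
        # Chip read: the issuer checks a cryptogram computed by the card
        # over the transaction data -- dynamic, unforgeable without the key.
        return hmac.compare_digest(record["cryptogram"],
                                   expected_mac(record["payload"]))
    elif record["entry_mode"] == "magstripe":
        # Stripe fallback: the issuer checks a static CVV copied from the
        # stripe -- exactly what a card clone would reproduce.
        return record["cvv"] == record["expected_cvv"]
    return False
```

The point of asking for the actual transaction logs is that only the chip path produces per-transaction cryptograms that can be independently re-verified; a printout of code numbers proves nothing either way.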

The prosecution folded and today Jane walked free. I hope she wins an absolute shipload of compensation from Egg!

Opting out

The British Journal of General Practice has just published an editorial I wrote on Patient confidentiality and central databases. I’m encouraging GPs to make clear to patients that it’s OK to opt out – that they won’t incur the practice’s disapproval. Some practices have distributed leaflets from www.TheBigOptOut.org while others – such as The Oakland practice – have produced their own leaflets. These practices have seen the proportion of patients opting out rise from about 1% to between 6% and 19%. The same thing happened a few years ago in Iceland, where GP participation led to 11% of the population opting out of a central database project, which as a result did not become universal. GPs can help patients do the same here.

Financial Ombudsman losing it?

I appeared on “You and Yours” (Radio 4) today at 12.35 with an official from the Financial Ombudsman Service, after I coauthored a FIPR submission to a review of the service which is currently being conducted by Lord Hunt.

Our submission looks at three cases in particular in which the ombudsman decided in favour of the banks and against bank customers over disputed ATM transactions. We found that the adjudicators employed by the ombudsman made numerous errors both of law and of technology, and concluded that their decisions were an affront to reason and to justice.

One of the cases has already appeared here on lightbluetouchpaper; the other two cardholders appeared on an investigation into card fraud on “Tonight with Trevor MacDonald”, and their case papers are included, with their permission, as appendices to our submission. These papers are damning, but the Hunt review’s staff declined to publish them on the somewhat surprising grounds that the information in them might be used to commit identity theft against the customers in question. Eventually they published our submission minus the two appendices of case papers. (If knowing someone’s residential address and the account number to a now-defunct bank account is enough for a criminal to steal money from you, then the regulatory failures afflicting the British banking system are even deeper than I thought.)

The Financial Ombudsman Service, and its predecessor the Banking Ombudsman, have for many years found against bank customers and in favour of the banks. In the early-to-mid 1990s, they upheld the banks’ outrageous claim that mag-stripe ATM cards were invulnerable to cloning; this led to the court cases described here and here. That position collapsed when ATM criminals started being sent to prison. Now we have another wave of ATM card cloning, which we’ve discussed several times: we’ve shown you a chip and PIN terminal playing Tetris and described relay attacks. There’s much more to come.

The radio programme is online here (the piece starts 29 minutes and 40 seconds in). We clearly have them rattled; the ombudsman was patronising and abusive, and made a number of misleading statements. He also said that the “independent” Hunt review was commissioned by his board of directors. I hope it turns out to be a bit more independent than that. If it doesn’t, then consumer advocates should campaign for the FOS to be abolished and for customers to be empowered to take disputes to the courts, as we argue in sections 31-32 of our submission.

Government security failure

In breaking news, the Chancellor of the Exchequer will announce at 1530 that HM Revenue and Customs has lost the data of 15 million child benefit recipients, and that the head of HMRC has resigned.

FIPR has been saying since last November’s publication of our report on Children’s Databases for the Information Commissioner that the proposed centralisation of public-sector data on the nation’s children was not only unsafe but illegal.

But that isn’t all. The Health Select Committee recently made a number of recommendations to improve safety and privacy of electronic medical records, and to give patients more rights to opt out. Ministers dismissed these recommendations, and a poll today shows doctors are so worried about confidentiality that many will opt out of using the new shared care record system.

The report of the Lords Science and Technology Committee into Personal Internet Security also pointed out a lot of government failings in preventing electronic crime – which ministers contemptuously dismissed. It’s surely clear by now that the whole public-sector computer-security establishment is no longer fit for purpose. The next government should replace CESG with a civilian agency staffed by competent people. Ministers need much better advice than they’re currently getting.

Developing …

(added later: coverage from the BBC, the Guardian, Channel 4, the Times, Computer Weekly and e-Health Insider; and here’s the ORG Blog)

Happy Birthday ORG!

The Open Rights Group (ORG) has today published a report on their first two years of operation.

ORG’s origins lie in an online pledge, which got a thousand people agreeing to pay a fiver a month to fund a campaigning organisation for digital rights. This mass membership gives it credibility, and it’s used that credibility on campaigns on Digital Rights Management, Copyright Term Extension (“release the music“), Software Patents, Crown Copyright and E-Voting (for one small part of which Steven Murdoch and I stayed up into the small hours to chronicle the debacle in Bedford).

ORG is now lobbying in the highest of circles (though, like everyone else who gives the Government good advice, they aren’t always listened to), and they are getting extensively quoted in the press as journalists discover their expertise and their unique constituency.

Naturally ORG needs even more members, to become even more effective, and to be able to afford to campaign on even more issues in the future. So whilst you look at their annual report, do think about whether you can really afford not to support them!

ObDisclaimer: I’m one of ORG’s advisory council members. I’m happy to advise them to keep it up!

Government ignores Personal Medical Security

The Government has just published their response to the Health Committee’s report on The Electronic Patient Record. This response is shocking but not surprising.

For example, on pages 6-7 the Department reject the committee’s recommendation that sealed-envelope data should be kept out of the secondary uses service (SUS). Sealed-envelope data is the stuff you don’t want shared, and SUS is the database that gives civil servants, medical researchers and others access to masses of health data. The Department’s justification (para 4, page 6) is not just an evasion but is simply untruthful: they claim that the design of SUS `ensures that patient confidentiality is protected’ when in fact it doesn’t. The data there are not pseudonymised (though the government says it’s setting up a research programme to look at this – report p 23). Already many organisations have access.
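Pseudonymisation of the sort that research programme would presumably study is not exotic. A minimal sketch (the key and the truncation length here are hypothetical choices, not anything the NHS has specified) replaces each NHS number with a keyed hash, so records about the same patient can still be linked for research while the identifier itself is hidden from SUS users:

```python
import hmac
import hashlib

# Hypothetical secret, which would be held by a trusted party rather than
# stored alongside the data -- otherwise the pseudonyms are reversible.
PSEUDONYM_KEY = b"kept-by-a-trusted-party"

def pseudonymise(nhs_number: str) -> str:
    """Map an NHS number to a stable, unlinkable-without-the-key pseudonym.

    The same input always yields the same pseudonym, so longitudinal record
    linkage still works; without the key, the mapping cannot be inverted or
    recomputed by a browsing user.
    """
    digest = hmac.new(PSEUDONYM_KEY, nhs_number.encode(), hashlib.sha256)
    return digest.hexdigest()[:16]  # truncated for readability in this sketch
```

That such a scheme is routine is rather the point: the barrier to pseudonymising SUS is institutional will, not technology.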

The Department also refuses to publish information about security evaluations, test results and breaches (p9) and reliability failures (p19). Their faith in security-by-obscurity is touching.

The biggest existing security problem in the NHS – that many staff carelessly give out data on the phone to anyone who asks for it – will be subject to `assessment’, which `will feed into the further implementation’. Yeah, I’m sure. But as for the recommendation that the NHS provide a substantial audit resource – as there is to detect careless and abusive disclosure from the police national computer – we just get a long-winded evasion (pp 10-11).

Finally, the fundamental changes to the NPfIT business process that would be needed to make the project work are rejected (pp 14-15): Sir Humphrey will maintain central control of IT and there will be no `catalogue’ of approved systems from which trusts can choose. And the proposals that the UK participate in open standards, along the lines of the more successful Swedish or Dutch model, draw just a long evasion (p16). I fear the whole project will just continue on its slow slide towards becoming the biggest IT disaster ever.

Government ignores Personal Internet Security

At the end of last week the Government published their response to the House of Lords Science and Technology Committee Report on Personal Internet Security. The original report was published in mid-August and I blogged about it (and my role in assisting the Committee) at that time.

The Government has turned down pretty much every recommendation. The most positive verbs used were “consider” or “working towards setting up”. That’s more than a little surprising, because the report made a great deal of sense, and their lordships aren’t fools. So is the Government ignorant, stupid, or in thrall to some special interest group?

On balance I think it starts from ignorance.

Some of the most compelling evidence that the Committee heard was at private meetings in the USA from companies such as Microsoft, Cisco, Verisign, and in particular from Team Cymru, who monitor the “underground economy”. I don’t think that the Whitehall mandarins have heard these briefings, or have bothered to read the handful of published articles such as this one in ;login, or this more recent analysis that will appear at CCS next week. If the Government were up to speed on what researchers are documenting, they wouldn’t be arguing that there is more crime solely because there are more users — and they could not possibly say that they “refute the suggestion […] that lawlessness is rife”.

However, we cannot rule out stupidity.

Some of the Select Committee recommendations were intended to address the lack of authoritative data — and these were rejected as well. The Government doesn’t think it’s urgently necessary to capture more information about the prevalence of eCrime; they don’t think that having the banks collate crime reports gets all the incentives wrong; and they “do not accept that the incidence of loss of personal data by companies is on an upward path” (despite there being no figures in the UK to support or refute that notion, and considerable evidence of regular data loss in the United States).

The bottom line is that the Select Committee did some “out-of-the-box thinking” and came up with a number of proposals for measurement, for incentive alignment, and for bolstering law enforcement’s response to eCrime. The Government have settled for complacency, quibbling about the wording of the recommendations, and picking out a handful of the more minor recommendations to “note”, to “consider” and to “keep under review”.

A whole series of missed opportunities.

NHS Computer Project Failing

The House of Commons Health Select Committee has just published a Report on the Electronic Patient Record. This concludes that the NHS National Programme for IT (NPfIT), the 20-billion-pound project to rip out all the computers in the NHS and replace them with systems that store data in central server farms rather than in the surgery or hospital, is failing to meet its stated core objective – of providing clinically rich, interoperable detailed care records. What’s more, privacy’s at serious risk. Here is comment from e-Health Insider.

For the last few years I’ve been using the London Ambulance Service disaster as the standard teaching example of how things go wrong in big software projects. It looks like I will have to refresh my notes for the Software Engineering course next month!

I’ve been warning about the safety and privacy risks of the Department of Health’s repeated attempts to centralise healthcare IT since 1995. Here is an analysis of patient privacy I wrote earlier this year, and here are my older writings on the security of clinical information systems. It doesn’t give me any great pleasure to be proved right, though.

House of Lords Inquiry: Personal Internet Security

For the last year I’ve been involved with the House of Lords Science and Technology Committee’s Inquiry into “Personal Internet Security”. My role has been that of “Specialist Adviser”, which means that I have been briefing the committee about the issues, suggesting experts who they might wish to question, and assisting with the questions and their understanding of the answers they received. The Committee’s report is published today (Friday 10th August) and can be found on the Parliamentary website here.

For readers who are unfamiliar with the UK system — the House of Lords is the second chamber of the UK Parliament and is currently composed mainly of “the great and the good” although 92 hereditary peers still remain, including the Earl of Erroll who was one of the more computer-literate people on the committee.

The Select Committee reports are the result of in-depth study of particular topics by people who reached the top of their professions (and who are therefore quick learners, even if they start by knowing little of the topic); their careful reasoning and endorsement of convincing expert views carry considerable weight. The Government is obliged to respond formally, and there will, at some point, be a few hours of debate on the report in the House of Lords.

My appointment letter made it clear that I wasn’t required to publicly support the conclusions that their lordships came to, but I am generally happy to do so. There are quite a lot of these conclusions and recommendations, but I believe that three areas particularly stand out.

The first area where the committee has assessed the evidence, not as experts, but as intelligent outsiders, is where the responsibility for Personal Internet Security lies. Almost every witness was asked about this, but very few gave an especially wide-ranging answer. A lot of people, notably the ISPs and the Government, dumped a lot of the responsibility onto individuals, which neatly avoided them having to shoulder very much themselves. But individuals are just not well-informed enough to understand the security implications of their actions, and although it’s desirable that they aren’t encouraged to do dumb things, most of the time they’re not in a position to know if an action is dumb or not. The committee have a series of recommendations to address this — there should be BSI kite marks to allow consumers to select services that are likely to be secure, ISPs should lose mere conduit exemptions if they don’t act to deal with compromised end-user machines and the banks should be statutorily obliged to bear losses from phishing. None of these measures will fix things directly, but they will change the incentives, and that has to be the way forward.

Secondly, the committee are recommending that the UK bring in a data breach notification law, along the general lines of the California law, and 34 other US states. This would require companies that leaked personal data (because of a hacked website, or a stolen laptop, or just by failing to secure it) to notify the people concerned that this had happened. At first that might sound rather weak — they just have to tell people; but in practice the US experience shows that it makes a difference. Companies don’t like the publicity, and of course the people involved are able to take precautions against identity theft (and tell all their friends quite how trustworthy the company is…) It’s a simple, low-key law, but it produces all the right incentives for taking security seriously, and for deploying systems such as whole-disk encryption that mean that losing a laptop stops being synonymous with losing data.

The third area, and this is where the committee has been most far-sighted, and therefore in the short term this may well be their most controversial recommendation, is that they wish to see a software liability regime, viz: that software companies should become responsible for their security failures. The benefits of such a regime were cogently argued by Bruce Schneier, who appeared before the committee in February, and I recommend reading his evidence to understand why he swayed the committee. Unlike the data breach notification law the committee recommendation isn’t to get a statute onto the books sooner rather than later. There’s all sorts of competition issues and international ramifications — and in practice it may be a decade or two before there’s sufficient case law for vendors to know quite where they stand if they ship a product with a buffer overflow, or a race condition, or just a default password. Almost everyone who gave evidence, apart from Bruce Schneier, argued against such a law, but their lordships have seen through the special pleading and the self-interest and looked to find a way to make the Internet a safer place. Though I can foresee a lot of complications and a rocky road towards liability, looking to the long term, I think their lordships have got this one right.