Category Archives: Legal issues

Security-related legislation, government initiatives, court cases

Government security failure

In breaking news, the Chancellor of the Exchequer will announce at 1530 that HM Revenue and Customs has lost the data of 15 million child benefit recipients, and that the head of HMRC has resigned.

FIPR has been saying since last November’s publication of our report on Children’s Databases for the Information Commissioner that the proposed centralisation of public-sector data on the nation’s children was not only unsafe but illegal.

But that isn’t all. The Health Select Committee recently made a number of recommendations to improve safety and privacy of electronic medical records, and to give patients more rights to opt out. Ministers dismissed these recommendations, and a poll today shows doctors are so worried about confidentiality that many will opt out of using the new shared care record system.

The report of the Lords Science and Technology Committee into Personal Internet Security also pointed out a lot of government failings in preventing electronic crime – which ministers contemptuously dismissed. It’s surely clear by now that the whole public-sector computer-security establishment is no longer fit for purpose. The next government should replace CESG with a civilian agency staffed by competent people. Ministers need much better advice than they’re currently getting.

Developing …

(added later: coverage from the BBC, the Guardian, Channel 4, the Times, Computer Weekly and e-Health Insider; and here’s the ORG Blog)

Government ignores Personal Medical Security

The Government has just published their response to the Health Committee’s report on The Electronic Patient Record. This response is shocking but not surprising.

For example, on pages 6-7 the Department reject the committee’s recommendation that sealed-envelope data should be kept out of the secondary uses service (SUS). Sealed-envelope data is the stuff you don’t want shared, and SUS is the database that gives civil servants, medical researchers and others access to masses of health data. The Department’s justification (para 4, page 6) is not just an evasion but simply untruthful: they claim that the design of SUS `ensures that patient confidentiality is protected’ when in fact it doesn’t. The data there are not pseudonymised (though the government says it’s setting up a research programme to look at this – report p 23). Already, many organisations have access.
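(For readers unfamiliar with the term: pseudonymisation means replacing direct identifiers such as NHS numbers with consistent but opaque tokens before the data reach secondary users. One common approach is a keyed hash over the identifier; the sketch below is purely illustrative, with an invented key, and says nothing about how SUS actually works.)

```python
import hmac, hashlib

SECRET_KEY = b"held-by-the-data-custodian-only"   # invented for illustration

def pseudonymise(nhs_number: str) -> str:
    """Replace an identifier with a stable, opaque token. The same input
    always yields the same token, so records can still be linked across
    datasets, but the raw identifier is not stored in the research copy."""
    return hmac.new(SECRET_KEY, nhs_number.encode(), hashlib.sha256).hexdigest()

print(pseudonymise("943 476 5919"))
```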

The Department also refuses to publish information about security evaluations, test results and breaches (p9) and reliability failures (p19). Their faith in security-by-obscurity is touching.

The biggest existing security problem in the NHS – that many staff carelessly give out data on the phone to anyone who asks for it – will be subject to `assessment’, which `will feed into the further implementation’. Yeah, I’m sure. But as for the recommendation that the NHS provide a substantial audit resource – as there is to detect careless and abusive disclosure from the police national computer – we just get a long-winded evasion (pp 10-11).

Finally, the fundamental changes to the NPfIT business process that would be needed to make the project work are rejected (p14-15): Sir Humphrey will maintain central control of IT and there will be no `catalogue’ of approved systems from which trusts can choose. And the proposals that the UK participate in open standards, along the lines of the more successful Swedish or Dutch model, draw just a long evasion (p16). I fear the whole project will just continue on its slow slide towards becoming the biggest IT disaster ever.

Time to forget?

In a few hours’ time, Part III of the Regulation of Investigatory Powers Act 2000 will come into effect. The commencement order means that as of October 1st a section 49 notice can be served which requires that encrypted data be “put into an intelligible form” (what you and I might call “decrypted”). Extended forms of such a notice may, under the provisions of s51, require you to hand over your decryption key, and/or under s54 include a “no tipping off” provision.

If you fail to comply with a notice (or breach a tipping-off requirement by telling someone about it) then you will have committed an offence. The maximum penalty for failing to comply is two years’ imprisonment or a fine, or both. It’s five years for “tipping off”, and also five years (an amendment made by s15 of the Terrorism Act 2006) where the case relates to “national security”.

By convention, laws in the UK very seldom have retrospective effect, so that if you do something today, Parliament is very loth to pass a law tomorrow to make your actions illegal. However, the offences in Part III relate to failing to obey a s49 notice, and that notice could be served on you tomorrow (or thereafter) even though the material may have been encrypted by you today (or before).

Potentially therefore, the police could start demanding the putting into an intelligible form, not only of information that they seize in a raid tomorrow morning, but also of material that they seized weeks, months or years ago. In the 1995 Smith case (part of Operation Starburst), the defendant only received a suspended sentence because the bulk of the material was encrypted. In this particular example, the police may be constrained, by double jeopardy or by the time that has elapsed, from serving a notice on Mr Smith, but there’s nothing in RIP itself, or the accompanying Code of Practice, to prevent them serving a s49 notice on more recently seized encrypted material if they deem it to be necessary and proportionate.

In fact, they might even be nipping round to Jack Straw’s house demanding a decryption key, as this stunt from 1999 makes possible (back when the wording of the predecessor bill was rather more inane than the form RIP was eventually amended into).

There are some defences in the statute to failing to comply with a notice — one of which is that you can claim to have forgotten the decryption key (in practice, the passphrase under which the key is stored). In such a case the prosecution (the burden of proof was amended during the passage of the Bill) must show beyond a reasonable doubt that you have not forgotten it. Since they can’t mind-read, the expectation must be that they would attempt to show regular usage of the passphrase, and invite the jury to conclude that the forgetting has been faked — and this might be hard to manage if a hard disk has been in a police evidence store for over a decade.

However, if you’re still using such a passphrase and still have access to the disk, and if the contents are going to incriminate you, then perhaps a sledgehammer might be a suitable investment.

Me? I set up my alibi long ago 🙂

Econometrics of wickedness

Last Thursday I gave a tech talk at Google; you can now watch it online. It’s about work a number of us have done on searching for covert communities, with a focus on reputation thieves, phishermen, fake banks and other dodgy businesses.

While in California I also gave a talk on Information Security Economics, first as a keynote talk at Crypto and later as a seminar at Berkeley (the slides are here).

Digital signatures hit the road

For about thirty years now, security researchers have been talking about using digital signatures in court. Thousands of academic papers have had punchlines like “the judge then raises X to the power Y, finds it’s equal to Z, and sends Bob to jail”. So far, this has been pleasant speculation.
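For readers who haven’t seen the joke spelled out, here is a minimal textbook-RSA sketch of that “raise X to the power Y and check it equals Z” step, in Python. It is illustrative only: real signature schemes (including whatever the tachograph system actually uses) add standardised padding, certificates and key management that are omitted here, and the parameters below would be toy values.

```python
from hashlib import sha256

def rsa_verify(message: bytes, signature: int, e: int, n: int) -> bool:
    """Textbook RSA verification: raise the signature (X) to the public
    exponent (Y) modulo n, and check it equals the value (Z) expected
    from the message digest. No padding scheme -- illustration only."""
    recovered = pow(signature, e, n)                          # X ** Y mod n
    expected = int.from_bytes(sha256(message).digest(), "big") % n
    return recovered == expected                              # equal to Z?
```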

Now the rubber starts to hit the road. Since 2006 trucks in Europe have been using digital tachographs. Tachographs record a vehicle’s speed history and help enforce restrictions on drivers’ working hours. For many years they have used circular waxed paper charts, which have been accepted in court as evidence just like any other paper record. However, paper charts are now being replaced with smartcards. Each driver has a card that records 28 days of infringement history, protected by digital signatures. So we’ve now got the first widely-deployed system in which digital signatures are routinely adduced in evidence. The signed records are being produced to support prosecutions for working too long hours, for speeding, for tachograph tampering, and sundry other villainy.

So do magistrates really raise X to the power Y, find it’s equal to Z, and send Eddie off to jail? Not according to enforcement folks I’ve spoken to. Apparently judges find digital signatures too “difficult” as they’re all in hex. The police, always eager to please, have resolved the problem by applying standard procedures for “securing” digital evidence. When they raid a dodgy trucking company, they image the PC’s disk drive and take copies on DVDs that are sealed in evidence bags. One gets given to the defence and one kept for appeal. The paper logs documenting the procedure are available for Their Worships to inspect. Everyone’s happy, and truckers duly get fined.

In fact the trucking companies are very happy. I understand that 20% of British trucks now use digital tachographs, well ahead of expectations. Perhaps this is not uncorrelated with the fact that digital tachographs keep much less detailed data than could be coaxed out of the old paper charts. Just remember, you read it here first.

Hacking tools are legal for a little longer

It’s well over a year since the Government first brought forward their proposals to crack down on hacking tools (or, less charitably, to make security research illegal).

They revised their proposals a bit in the face of considerable lobbying about so-called “dual-use” tools. These are programs that might be used by security professionals to check whether machines are secure, and by criminals to look for insecure ones to break into. In fact, most of the tools on a professional’s laptop, from nmap through wireshark to perl, could be used for both good and bad purposes.

The final wording means that to successfully prosecute the author of a tool you must show that they intended it to be used to commit computer crime; intent would also have to be proved for obtaining, adapting, supplying or offering to supply … so most security professionals have nothing to worry about, in theory. In practice, of course, being accused of wickedness and having to convince a jury that there was no intent would be pretty traumatic!

The most important issue that the Home Office refused to concede was the distribution offence. The offence is to "supply or offer to supply, believing that it is likely to be used to commit, or to assist in the commission of [a Computer Misuse Act s1/s3 offence]". The Home Office claim that “likely” means “more than a 50% chance” (apparently there’s caselaw on what likely means in a statute).

This is of course entirely unsatisfactory: you can run a website for people to download nmap from for years without problems, but if one day you look at your weblogs and find that everyone in Ruritania (a well-known Eastern European criminal paradise) is downloading from you, then suddenly you’re committing an offence. Of course, if you didn’t look at your logs then you would not know, and maybe the lack of mens rea will get you off? (IANAL! so take advice before trying this at home!)

The hacking tools offences were added to the Computer Misuse Act 1990 (CMA), along with other changes to make it clear that DDoS is illegal, and along with changes to the tariffs on other offences to make them much more serious — and extraditable.

The additions are in the form of amendments that are incorporated in the Police and Justice Act 2006 which received its Royal Assent on the 8th November 2006.

However, the relevant sections, s35–38, are not yet in force! viz: hacking tools are still not illegal and will not be illegal until, probably, April 2008.


Follow the money, stupid

The Federal Reserve commissioned me to research and write a paper on fraud, risk and nonbank payment systems. I found that phishing is facilitated by payment systems like eGold and Western Union which make the recovery of stolen funds more difficult. Traditional payment systems like cheques and credit card payments are revocable; cheques can bounce and credit card charges can be charged back. However some modern systems provide irrevocability without charging an appropriate risk premium, and this attracts the bad guys. (After I submitted the paper, and before it was presented on Friday, eGold was indicted.)

I also became convinced that the financial market controls used to fight fraud, money laundering and terrorist finance have become unbalanced as they have been beefed up post-9/11. The modern obsession with ‘identity’ – of asking even poor people living in huts in Africa for an ID document and two utility bills before they can open a bank account – is not just ridiculous and often discriminatory. It has also led banks and regulators to take their eye off the ball, and to replace risk reduction with due diligence.

In real life, following the money is just as important as following the man. It’s time for the system to be rebalanced.

Extreme online risks

An article in the Guardian, and a more detailed story in PC Pro, give the background to Operation Ore. In this operation, hundreds (and possibly thousands) of innocent men were raided by the police on suspicion of downloading child pornography, when in fact they had simply been victims of credit card fraud. The police appear to have completely misunderstood the forensic evidence; once the light began to dawn, it seems that they closed ranks and covered up. These stories follow an earlier piece in PC Pro which first brought the problem to public attention in 2005.

Recently we were asked by the Lords Science and Technology Committee whether failures of online security caused real problems, or were exaggerated. While there is no doubt that many people talk up the threats, here is a real case in which online fraud has done much worse harm than simply emptying bank accounts. Having the police turn up at six in the morning, search your house, tell your wife that you’re a suspected pedophile, and with social workers in tow to interview your children, must be a horrific experience. Over thirty men have killed themselves. At least one appears to have been innocent. As this story develops, I believe it will come to be seen as the worst policing scandal in the UK for many years.

I remarked recently that it was a bad idea for the police to depend on the banks for expertise on card fraud, and to accept their money to fund such investigations as the banks wanted carried out. Although Home Office and DTI ministers say they’re happy with these arrangements, the tragic events of Operation Ore show that the police should not compromise their independence and their technical capability for short-term political or financial convenience. The results can simply be tragic.

There aren’t that many serious spammers any more

I’ve recently been analysing the incoming email traffic data for Demon Internet, a large(ish) UK ISP, for the first four weeks of March 2007. The raw totals show a very interesting picture:

Email & Spam traffic at Demon Internet, March 2007

The top four lines are the amount of incoming email that was detected as “spam” by the Cloudmark technology that Demon now uses. The values lie in a range of 5 to 13 million items per day, with the day of the week being irrelevant, and huge swings from day to day. See how 5 million items on Saturday 18th is followed by 13 million items on Monday 20th!

The bottom four lines are the amount of incoming email that was not detected as spam (and it also excludes incoming items with a “null” sender, which will be bounces, almost certainly all “backscatter” from remote sites “bouncing” spam with forged senders). The values here are between about 2 and 4 million items a day, with a clear pattern being followed from week to week, with lower values at the weekends.
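The “null” sender mentioned above is the empty SMTP reverse-path (MAIL FROM:&lt;&gt;) that bounce messages use, which is what makes backscatter easy to count separately. As a rough sketch of the bookkeeping, and not Demon’s actual tooling, a log-analysis script might split a day’s records like this:

```python
from collections import Counter

def split_traffic(records):
    """records: iterable of (envelope_sender, flagged_as_spam) pairs.
    Returns counts of spam, bounces (null sender) and other mail."""
    counts = Counter()
    for sender, flagged_as_spam in records:
        if flagged_as_spam:
            counts["spam"] += 1
        elif sender == "<>":          # null reverse-path: a bounce,
            counts["bounce"] += 1     # mostly backscatter from forged spam
        else:
            counts["other"] += 1
    return counts

print(split_traffic([("alice@example.com", False), ("<>", False),
                     ("bob@example.org", True)]))
```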

There’s an interesting rise in non-spam email on Tuesday 27th, which corresponds to a new type of “pump and dump” spam (mainly in German) which clearly wasn’t immediately spotted as spam. By the next day, things were back to normal.

The figures and patterns are interesting in themselves, but they show how summarising the traffic as a single average spam figure (it was in fact 73%) hides a much more complex picture.
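To see how much a single average can hide, here is a toy calculation. The daily figures are invented, chosen only to fall inside the ranges quoted above rather than taken from Demon’s real data:

```python
# Invented figures for illustration only (spam 5-13M/day, other mail 2-4M/day).
days = [("Sat", 5_000_000, 2_000_000),
        ("Mon", 13_000_000, 4_000_000),
        ("Tue", 8_000_000, 3_500_000)]

spam_total = sum(spam for _, spam, _ in days)
other_total = sum(other for _, _, other in days)
print(f"overall: {spam_total / (spam_total + other_total):.0%} spam")

for name, spam, other in days:
    share = spam / (spam + other)
    print(f"{name}: {spam + other:>10,} messages, {share:.0%} spam")
```

The overall figure comes out as a tidy single percentage, while the per-day lines show total volume more than doubling from one day to the next.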

The picture is also hiding a deeper truth. There’s no “law of large numbers” operating here. That is to say, the incoming spam is not composed of lots of individual spam gangs, each doing their own thing and thereby generating a fairly steady amount of spam from day to day. Instead, it is clear that very significant volumes of spam are being sent by a very small number of gangs, and the totals swing as they switch their destinations around: today it’s .uk, tomorrow it’s aol.com and on Tuesday it will be .de (hmm, perhaps that’s why they hit .demon addresses? a missing $ from their regular expression!).
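That aside about a missing $ is pure speculation, but it is easy to see how such a bug would behave: a pattern meant to pick out German (.de) addresses will, without an end-of-string anchor, also match .demon.co.uk ones. A hypothetical illustration:

```python
import re

addresses = ["kurt@example.de", "fred@widgets.demon.co.uk"]

unanchored = re.compile(r"\.de")   # matches ".de" anywhere, including ".demon..."
anchored   = re.compile(r"\.de$")  # matches only addresses ending in ".de"

print([a for a in addresses if unanchored.search(a)])  # both addresses
print([a for a in addresses if anchored.search(a)])    # only the German one
```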

If there are only a few large gangs operating (and other people are detecting these huge swings of activity as well) then that’s very significant for public policy. One can have sympathy for police officers and regulators faced with the prospect of dealing with hundreds or thousands of spammers; dealing with them all would take many (rather boring and frustrating) lifetimes. But if there are, say, five big gangs at most, then that suddenly looks like a tractable problem.

Spam is costing us [allegedly] billions (and is a growing problem for the developing world), so there’s all sorts of economic and diplomatic reasons for tackling it. So tell your local spam law enforcement officials to have a look at the graph of Demon Internet’s traffic. It tells them that trying to do something about the spammers currently makes a lot of sense — and that by just tracking down a handful of people, they will be capable of making a real difference!

TK Maxx and banking regulation

Today’s news coverage of the theft of 46m credit card numbers from TK Maxx underlines a number of important issues in security, economics and regulation. First, US cardholders are treated much better than customers here – over there, the store will have to write to them and apologise. Here, cardholders might not have been told at all were it not that some US cardholders also had their data stolen from the computer centre in Watford. We need a breach reporting law in the UK; even the ICO agrees.

Second, from the end of this month, UK citizens won’t be able to report bank or card fraud to the police; you’ll have to report it to the bank instead, which may or may not then report it to the police. (The Home Office wants to massage the crime statistics downwards, while the banks want to be able to control and direct such police investigations as take place.)

Third, this week the UK government agreed to support the EU Payment Services Directive, which (unless the European Parliament amends it) looks set to level down consumer protection against card fraud in Europe to the lowest common denominator.

Oh, and I think it’s disgraceful that the police’s Dedicated Cheque and Plastic Crime Unit is jointly funded and staffed by the banks. The Financial Ombudsman service, which is also funded by the banks, is notoriously biased against cardholders, and it’s not acceptable for the police to follow them down that path. When bankers tell customers who complain about fraud ‘Our systems are secure so it must be your fault’, that’s fraud. Police officers should not side with fraudsters against their victims. And it’s not just financial crime investigations that suffer because policemen leave it to the banks to investigate and adjudicate card fraud; when policemen don’t understand fraud, they screw up elsewhere too. For example, there have been dozens of cases where people whose credit card numbers were stolen and used to buy child pornography were wrongfully prosecuted, including at least one tragic case.