Category Archives: Academic papers

Internet Censorship and Control

The Internet is and has always been a space where participants battle for control. The two core protocols that define the Internet – TCP and IP – are both designed to allow separate networks to connect to each other easily, so that networks that differ not only in hardware implementation (wired vs. satellite vs. radio networks) but also in their politics of control (consumer vs. research vs. military networks) can interoperate easily. It is a feature of the Internet, not a bug, that China – with its extensive, explicit censorship infrastructure – can interact with the rest of the Internet.

Today we have released an open-access collection (also published as a special issue of IEEE Internet Computing) of five peer-reviewed papers on the topic of Internet censorship and control, edited by Hal Roberts and myself (Steven Murdoch). The topics of the papers include a broad look at information controls, censorship of microblogs in China, new modes of online censorship, the balance of power in Internet governance, and control in the certificate authority model.

These papers make it clear that there is no global consensus on what mechanisms of control are best suited for managing conflicts on the Internet, just as there is none for other fields of human endeavour. That said, there is optimism that, with vigilance and continuing efforts to maintain transparency, the Internet can remain a force for increasing freedom rather than becoming a tool for more efficient repression.

Workshop on the Economics of Information Security 2013

I’m liveblogging WEIS 2013, as I did in 2012, 2011, 2010 and 2009. This is the twelfth workshop on the economics of information security, and the sessions are being held today and tomorrow at Georgetown University. The panels and refereed paper sessions will be blogged in comments below this post (and there’s another liveblog by Vaibhav Garg).

Security and Human Behaviour 2013

I’m liveblogging the Workshop on Security and Human Behaviour which is being held at USC in Los Angeles. The participants’ papers are here; for background, see the liveblogs for SHB 2008-12 which are linked here and here. Blog posts summarising the talks at the workshop sessions will appear as followups below. (Added: there is another liveblog by Vaibhav Garg.)

A further observation on quantum computing

Today we’ve published a paper showing that Bell’s inequality is violated in fluid mechanics. What has this to do with computing or security? Well, when we posted a paper back in February pointing out that hydrodynamic models of quantum physics raise questions about the scalability of quantum computing, a number of people asked for a better explanation of how this squares with the Bell tests. John Bell proved an inequality in 1964 that applies to classical particles but that is broken by quantum mechanical ones. In today’s paper we show that Bell’s inequality does not hold in classical fluid dynamics, as angular momentum and energy are delocalised in the fluid.
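For concreteness, the inequality in question is most often quoted in its CHSH form (a 1969 generalisation of Bell's 1964 result; the paper may use a different but equivalent statement). For correlation functions E measured at detector settings a, a′ on one side and b, b′ on the other, any local hidden-variable theory obeys

\[
S = E(a,b) - E(a,b') + E(a',b) + E(a',b'), \qquad |S| \le 2,
\]

whereas quantum mechanics allows values of \(|S|\) up to \(2\sqrt{2}\). The result in today's paper is that a classical fluid can also exceed the bound of 2, precisely because angular momentum and energy are delocalised in it.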

This may have implications for engineering, science and philosophy. On the engineering front, nine-figure sums have been poured into developing quantum computers, but even advocates of quantum computing admit they don’t really work. As our February paper argued, a hydrodynamic interpretation of quantum mechanics may suggest reasons why.

On the scientific front, the Bell tests are commonly seen as excluding not just local hidden-variable models of quantum mechanics, but local realism too. Our paper shows that the two are distinct, and thus leaves more room for research on quantum foundations. It also shows that we should be more careful in our use of terms such as ‘local’ – which might be of interest to the philosophers; the Bell tests do not draw quite as clear a dividing line between the quantum and classical worlds as many have believed.

Revisiting secure introduction via hyperlinks

Today at W2SP I presented a new paper making the case for distributing security policy in hyperlinks. The basic idea is old, but I think the time is right to re-examine it. After the DigiNotar debacle, the community is getting serious about fixing PKI on the web. It was a hot topic at this week’s IEEE Security & Privacy (Oakland), highlighted by Jeremy Clark and Paul van Oorschot’s excellent survey paper. There is a slew of protocols under development, like key pinning (HPKP), Certificate Transparency, TACK, and others. To these I add s-links, a complementary mechanism to declare support for new proposals in HTML links.

A search engine for code

In a seminar today, we will unveil Rendezvous, a search engine for code. Built by Wei-Ming Khoo, it will analyse an unknown binary, parse it into functions, index them, and compare them with a library of code harvested from open-source projects.

As time goes on, the programs we need to reverse engineer get ever larger, so we need better tools. Yet most code nowadays is not written from scratch, but cut and pasted. Programmers are not an order of magnitude more efficient than a generation ago; it’s just that we have more and better libraries to draw on nowadays, and a growing shared heritage of open software. So our idea is to reframe the decompilation problem as a search problem, and harness search-engine technology to the task.
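To make the search framing concrete, here is a minimal sketch in Python (the function names and token stream are invented; this is not the Rendezvous implementation, which works on features extracted from disassembled binaries, such as instruction mnemonics, control-flow structure and constants). Known functions are indexed by token n-grams, and an unknown function is matched by ranking candidates on n-gram overlap:

    # Toy sketch: index known functions by token n-grams and match an unknown
    # function by n-gram overlap (Jaccard similarity). Illustrative only.
    from collections import defaultdict

    def ngrams(tokens, n=4):
        """Return the set of n-grams of a token sequence."""
        return {tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)}

    class CodeIndex:
        def __init__(self):
            self.postings = defaultdict(set)  # n-gram -> names of functions containing it
            self.features = {}                # function name -> its n-gram set

        def add(self, name, tokens):
            feats = ngrams(tokens)
            self.features[name] = feats
            for f in feats:
                self.postings[f].add(name)

        def search(self, tokens, top=3):
            query = ngrams(tokens)
            candidates = {name for f in query for name in self.postings.get(f, ())}
            scored = [(len(query & self.features[c]) / len(query | self.features[c]), c)
                      for c in candidates]
            return sorted(scored, reverse=True)[:top]

    # Example: index a known library function, then look up a near-identical copy.
    idx = CodeIndex()
    idx.add("libfoo:checksum", "push mov xor loop add cmp jne ret".split())
    print(idx.search("push mov xor loop add cmp jne ret".split()))

Running it prints the best-matching known function with a similarity score; real systems add many refinements, but the shape of the problem (index once, query cheaply) is the same as for text search.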

As with a text search engine, Rendezvous uses a number of different techniques to index a target binary, some of which are described in this paper, along with the main engineering problems. As well as reverse engineering suspicious binaries, code search engines could be used for many other purposes such as monitoring GPL compliance, plagiarism detection, and quality control. On the dark side, code search can be used to find new instances of disclosed vulnerabilities. Every responsible software vendor or security auditor should build one. If you’re curious, here is the demo.

Call for Papers: Free and Open Communications on the Internet (FOCI '13)

The 3rd USENIX Workshop on Free and Open Communications on the Internet (FOCI ’13) seeks to bring together researchers and practitioners from technology, law, and policy who are working on means to study, detect, or circumvent practices that inhibit free and open communications on the Internet. We invite papers in two distinct tracks: a technical track for technically focused position papers or works-in-progress, and a social science track for papers focused on policy, law, regulation, economics, or related fields of study.

FOCI will favor interesting and new ideas and early results that lead to well-founded position papers. We envision that work presented at FOCI will ultimately be published at relevant, high-quality conferences. Papers will be selected primarily based on originality, with additional consideration given to their potential to generate discussion at the workshop. Papers in the technical track will also be evaluated based on technical merit. As with other USENIX events, papers accepted for FOCI ’13 will be made freely available on the USENIX website.

For further details, see the call for papers (PDF version). The submission deadline is 6 May 2013.

Call for Nominations: 2013 PET Award

I am on the award committee for the 2013 PET Award and we are looking for nominations of papers which have made an outstanding contribution to the theory, design, implementation, or deployment of privacy enhancing technology.

The 2013 award will be presented at the Privacy Enhancing Technologies Symposium (PETS) and carries a prize of US$3,000, thanks to the generous support of Microsoft. The crystal prize itself is offered by the Office of the Information and Privacy Commissioner of Ontario, Canada.

Any paper by any author written in the area of privacy enhancing technologies is eligible for nomination. However, the paper must have appeared in a refereed journal, conference, or workshop with proceedings published in the period from 16 April 2011 until 31 March 2013.

To submit a nomination, please see the instructions on the award page.

How Certification Systems Fail: Lessons from the Ware Report

Research in the Security Group has uncovered various flaws in systems despite their being certified as secure. Sometimes the certification criteria have been inadequate, and sometimes the certification process has been subverted. Not only do these failures affect the owners of the system, but when evidence of certification comes up in court, the impact can be much wider.

There’s a variety of approaches to certification, ranging from extremely generic (such as Common Criteria) to highly specific (such as EMV), but all are (at least partially) descendants of a report by Willis H. Ware – “Security Controls for Computer Systems”. There’s much that can be learned from this report, particularly the rationale for why certification systems are set up the way they are. The differences between how Ware envisaged certification and how certification is now performed are also informative, whether these differences are for good or for ill.

Along with Mike Bond and Ross Anderson, I have written an article for the “Lost Treasures” edition of IEEE Security & Privacy, where we discuss what the Ware report can teach us about how today’s certifications work and how they should work. In particular, we explore how the failure to follow the recommendations in the Ware report can explain why flaws in certified banking systems were not detected earlier. Our article, “How Certification Systems Fail: Lessons from the Ware Report”, is available open-access in the version submitted to the IEEE. The edited version, as it appears in the print edition (IEEE Security & Privacy, volume 10, issue 6, pages 40–44, Nov–Dec 2012, DOI: 10.1109/MSP.2012.89), is only available to IEEE subscribers.

Hard questions about quantum crypto and quantum computing

We’ve been assured for 29 years that quantum crypto is secure, and for 19 years that quantum computing is set to make public-key cryptography obsolete. Yet despite immense research funding, attempts to build a quantum computer that scales beyond a few qubits have failed. What’s going on?

In a new paper Why quantum computing is hard – and quantum cryptography is not provably secure, Robert Brady and I try to analyse what’s going on. We argue that quantum entanglement may be modelled by coupled oscillators (as it already is in the study of Josephson junctions) and this could explain why it’s hard to get more than about three qubits. A companion paper of Robert’s on The irrotational motion of a compressible inviscid fluid presents a soliton model of the electron which shows for the first time how spin-1/2 symmetry, and the Dirac equation, can emerge in a completely classical system. There has been a growing amount of work recently on classical models of quantum behaviour; see for example Yves Couder’s beautiful experiments.
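To give a flavour of where classically generated correlations can come from (this is just the textbook picture of coupled oscillators, not the specific model in our paper), consider two identical oscillators joined by a coupling term:

\[
\ddot{x}_1 = -\omega_0^2 x_1 + \kappa (x_2 - x_1), \qquad
\ddot{x}_2 = -\omega_0^2 x_2 + \kappa (x_1 - x_2).
\]

The coupling splits the motion into an in-phase normal mode at frequency \(\omega_0\) and an anti-phase mode at \(\sqrt{\omega_0^2 + 2\kappa}\), and the two displacements become phase-locked and correlated purely through classical dynamics.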

The soliton model challenges the Bell tests which purport to show that the wavefunctions of entangled particles are nonlocal. It also challenges the assumption that the physical state of a quantum system is entirely captured by its wavefunction Ψ. It follows that local hidden-variable theories of quantum mechanics are not excluded by the Bell tests, and that in consequence we do not have to believe the security proofs offered for EPR-based quantum cryptography. We gave a talk on this at the theoretical physics seminar at Warwick on January 31st; here are the slides and here’s the video, parts 1, 2, 3, 4 and 5.