I’ve written a rebuttal in today’s Guardian to an article that appeared last week by Martin Rees, the President of the Royal Society. Martin argued that science should be subjected to more surveillance and control in case terrorists do bad things with it.
Those of us who work with cryptography and computer security have been subjected to a lot of attempts by governments to restrict what we do and publish. It’s a long-running debate: the first book written on cryptology in English, by Bishop John Wilkins in 1641, remarked that ‘If all those useful Inventions that are liable to abuse, should therefore be concealed, there is not any Art or Science which might be lawfully profest’. (John, like Martin, was Master of Trinity in his day.)
In 2001–2, the government put an export control act through Parliament which, in its original form, would have required those of us working on subjects with possible military applications (that is, most subjects) to get export licences before talking to foreigners about our work. FIPR colleagues and I opposed this; we organised Universities UK, the AUT, the Royal Society, the Conservatives and the Liberals to bring in an amendment in the Lords creating a research exemption for scientists. We mustn’t lose that exemption. If scientists end up labouring under the same bureaucratic controls as companies that sell guns, then both science and nonproliferation will be seriously weakened.
Some people love to worry: Martin wrote a whole book wondering about how the human race will end. But maybe we should rather worry about something a bit closer to hand — how our civilisation will end. If a society turns inwards and builds walls to keep the barbarians out, then competition abates, momentum gets lost, confidence seeps away, and eventually the barbarians win. Imperial Rome, Ming Dynasty China, … ?
The control of research can be counterproductive in other ways too. I remember how, in the early days of public-key cryptography, Europe (and the rest of the world) got a boost from the American policy of classifying encryption hardware and software as “munitions” and banning their export. We could happily develop cryptography among ourselves, secure in the knowledge that there would be no US competition.
In that case we had the added benefit that the key “RSA” patent was based on previously published research. It was therefore valid in the USA, which gives inventors a grace period after publication (so the patent constrained the activities of US companies), but invalid elsewhere, where prior publication destroys novelty.
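For readers who have not met it, here is a minimal textbook sketch of the RSA scheme the patent covered: key generation from two primes, then encryption and decryption by modular exponentiation. The toy primes and the absence of padding are illustrative assumptions only; real deployments use far larger parameters and proper padding.

```python
# Textbook RSA sketch with toy parameters (illustration only, not production crypto).

def egcd(a, b):
    # Extended Euclid: returns (g, x, y) with a*x + b*y == g == gcd(a, b)
    if b == 0:
        return a, 1, 0
    g, x, y = egcd(b, a % b)
    return g, y, x - (a // b) * y

def modinv(a, m):
    # Modular inverse of a mod m, if it exists
    g, x, _ = egcd(a, m)
    if g != 1:
        raise ValueError("no modular inverse")
    return x % m

# Key generation with toy primes
p, q = 61, 53
n = p * q                      # public modulus
phi = (p - 1) * (q - 1)        # Euler's totient of n
e = 17                         # public exponent, coprime to phi
d = modinv(e, phi)             # private exponent

# Encrypt and decrypt a small integer message m < n
m = 42
c = pow(m, e, n)               # ciphertext: m^e mod n
assert pow(c, d, n) == m       # decryption recovers m
```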