I have just been at the Cambridge Risk and Uncertainty Conference which brings together people who educate the public about risks. They include public-health doctors trying to get people to eat better and exercise more, statisticians trying to keep governments honest about crime statistics, and climatologists trying to educate us about global warming – an eclectic and interesting bunch.
Most of the people in this community see their role as dispelling ignorance, or motivating the slothful. Yet in most of the cases we discussed, the public get risk wrong because powerful interests make a serious effort to scare them about some of life's little hazards, or to reassure them about others. When this is put to the risk-communication folks – whether in a question after a talk or in the corridor – they readily admit they're up against a torrent of misleading marketing. But they don't see what they're doing as adversarial, and I strongly suspect that many risk interventions are less effective as a result.
In my talk (slides) I set this out as simply and starkly as I could. We spend too much on terrorism, because both the terrorists and the governments who're supposed to protect us from them big up the threat; we spend too little on cybercrime, because everyone from the crooks through the police and the banks to the computer industry has their own reason to talk down the threat. I mentioned recent cases such as WannaCry as examples of how institutions communicate risk in self-serving, misleading ways. I discussed our own study of browser warnings, which suggests that people at least subconsciously know that most of the warnings they see are written to benefit others rather than them; they tune out all but the most specific.
What struck me with some force when preparing my talk, though, is that there's just nobody in academia who takes a holistic view of adversarial risk communication. Many people look at some small part of the problem, from David Rios' game-theoretic analysis of adversarial risk and John Mueller's studies of terrorism risk to Alessandro Acquisti's behavioural economics of privacy, and on to criminologists who study pathways into crime and psychologists who study deception. Of all these, the literature on deception might be the most relevant, though we should also look at politics, propaganda, and studies of why people stubbornly persist in their beliefs – including the excellent work by Bénabou and Tirole on the value people place on belief. Perhaps the professionals whose job comes closest to adversarial risk communication are political spin doctors. So when should we talk about new facts, and when should we talk about who's deceiving you and why?
Given the current concern over populism and the role of social media in the Brexit and Trump votes, it might be time for a more careful cross-disciplinary study of how we can change people’s minds about risk in the presence of smart and persistent adversaries. We know, for example, that a college education makes people much less susceptible to propaganda and marketing; but what is the science behind designing interventions that are quicker and cheaper in specific circumstances?
Besides political spin-doctors, an even more adversarial risk-perception marketing challenge is to be found in religious missionary work. Here, compared with an idealised image of what spin-doctors do, a larger part of the persuasion potentially rests on faith and a lesser part on debate and appeals to reason; although in practice the two converge, since politics is also largely ideological and faith-based, and any religion with real relevance to our everyday lives has to rest on real-life facts and reason as well as faith.
I spent two years as a full-time religious missionary.
Perceived risk was everything: a few of the ministers of other churches (who I suppose might have perceived us as a potential threat to their income) would sometimes stop at nothing to prejudice people's minds before we could get to them, telling their congregations all sorts of foolish tales about the heinous teachings of Mormonism and their likelihood of ending up in the wrong place in the afterlife if they allowed themselves to be deceived by our message!
In the late 1800s, this involved spreading rumours around coal-mining villages in South Yorkshire that the Mormon Elders would kidnap pretty young ladies and take them through the tunnel from Liverpool to Salt Lake City (some feat of engineering!). One such lady had allegedly been imprisoned in Temple Square and got so distraught that she jumped into the Great Salt Lake and died (a jump of over 40 miles, as my Hemsworth-born grandfather was much amused to learn on visiting Salt Lake!). By the 1900s they got a little more sophisticated, dressing up their “objections” in scientific and scholarly, historical-authority or cultural-justice branding (the way some religious people speak now, you would think that the Holy Bible was the only thing that contained any truth at all, and that everything else was from the devil! They’re painting themselves into a corner in so many ways.)
Our job was simply to get people thinking about the potential motivations of those who were persuading them one way or another, and share the evidence so that people could make a proper judgement for themselves, based on an enlightened interpretation of the facts.
I would love to see advocates using some universal scale for describing the impact of risky behaviour. Professor Spiegelhalter attempted this with microlives and micromorts (for details, see https://understandinguncertainty.org/).
I wonder if the risks could be described in financial terms, perhaps as a risk to the national income – a “microbill”?
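To make that concrete, here is a rough back-of-the-envelope conversion (the figures are my own assumptions for illustration, not anything from the conference). A micromort is a one-in-a-million chance of death, so if one takes a value of a prevented fatality of roughly £2m, in line with typical UK government appraisal figures, a micromort comes out at about £2,000,000 ÷ 1,000,000 = £2. A “microbill” could then be defined analogously as one millionth of some reference income, say GDP per head, so that health risks and financial risks could be quoted side by side on comparable per-person scales.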
Umm. I may be missing something, but I pretty much think of my last 15 years’ campaigning as “adversarial risk communication”…
The link to the conference (first line, first paragraph) points to a holding page. Is there another resource where I could learn about the conference?
FYI the link ‘behavioural economics of privacy’ is malformed: it should point to http://www.heinz.cmu.edu/~acquisti/papers/Acquisti-Grossklags-Chapter-Etrics.pdf
Thanks – fixed!