Do warnings work?

Bill Durodie

Bill Durodie is Professor and Chair of International Relations in the Department of Politics, Languages and International Studies at the University of Bath, UK.

The narrative presented here was supported by an award from the Gerda Henkel Stiftung under their Special Programme – Security, Society and the State.

 

It is commonly asserted that the first duty of government is to protect its citizens. But one of the challenges confronting authorities that produce advice and issue alerts is the extent to which precautionary messages have become an integral part of our cultural landscape in recent times. From public health to counter-terrorism, climate change to child safety, a profusion of agencies – both official and unofficial – are constantly seeking to raise our awareness and modify our behaviour whether we know it or not. This may be done with the best of intentions – but we should be mindful of where that may lead.

 

Issuing a warning presumes negative outcomes if it is not heeded. Accordingly, it transfers a degree of responsibility onto recipients who may not have sought such counsel – or been consulted. Indeed, recipients may come to interpret it as a mechanism to deflect blame and accountability. And, aside from the intended response – presumed appropriate by those imparting the information – others may dispute the evidence presented, its interpretation, and the intentions behind both, as acts of complacency and defiance attest.

 

Such negative consequences – deemed maladaptive by politicians and officials who have swallowed the psychologised lexicon of our times – reveal an important truth in our supposedly post-truth societies: people are not driven by evidence alone. Addressing their core values and beliefs is more critical to motivating change and achieving influence. This requires respecting their moral independence and recognising the importance of ideas. Process- and data-driven protectionist paternalism, on the other hand, reflects a low view of human beings, and is readily self-defeating.

 

Altering our choice architecture, as some describe it, encourages self-fulfilling prophecies that interfere with our autonomy and undermine consent in the name of improving welfare or keeping us safe. And while there is a wealth of literature regarding such interventions and their purported effectiveness, most of it relates to single cases or relies largely on precedent – such as preparing for terror attacks or controlling tobacco use – rather than examining the implicit assumptions and wider societal consequences of such approaches.

 

Responses like overreaction, habituation and fatigue derive not so much from specific instances of warning as from the cumulative impact of a cultural proclivity to issue such guidance. The latter, in turn, speaks to the growing disconnect between those providing advice – even if at arm’s length from the state (thereby inducing a limited sense of civic engagement) – and those charged with living by it. To a self-consciously isolated political class, proffering instructions and regulating behaviour appears to offer direction and legitimacy in an age when it is bereft of any broader social vision.

 

Yet, reflecting on the UK Foreign and Commonwealth Office’s provision of travel advisories before and after the 2002 Bali bombings, the distinguished Professor of War Studies, Lawrence Freedman, noted how such guidance ‘is bound to be incomplete and uncertain’. ‘[I]t is unclear’, he continued, ‘what can be achieved through general exhortations’. Far more important, if government is to avoid accusations of complacency or alarmism – ‘the sins of omission and commission’, as he put it – is the need to impart and share a sense of strategic framing with the public. We might call this politics.

 

In his 2002 speech at the Lord Mayor’s Banquet, the then British Prime Minister, Tony Blair, advised how intelligence on possible security threats crossed his desk ‘all the time’. Only some was reliable. The remainder included misinformation and gossip. He sought to distinguish between specific intelligence, suggestive intelligence and acting ‘on the basis of a general warning’, which would effectively ‘be doing [the terrorists’] job for them’.

 

Blair explained how there was a balance to be struck and a judgement to be made ‘day by day, week by week’ in order not to shut down society. He noted that keeping citizens alert, vigilant and cooperative would test ‘not just our ability to fight, but … our belief in our own way of life’. In doing so, he implicitly pointed to the need for wider critical engagement and our having a sense of collective purpose beyond the immediacy of any threat.

 

But nudging people to act without their conscious support and endlessly raising awareness about all manner of presumed risks and adverse behaviours precludes both of these essential elements. Indeed, when some suggest that the general population are inherently ignorant, unqualified, too immature or not to be relied on to handle complex evidence and determine matters for their own good (an argument as old as Plato), they display a considerable complacency of their own, as well as an unwillingness to engage and an inability to inspire a broader constituency to effect change.

 

People can only become more knowledgeable, mature and reliable when they participate actively in matters of consequence. There can be no shared sense of social purpose if citizens are not treated as adults. Otherwise, official pronouncements come across as the disengaged exhortations of remote authorities, and warnings – as with the increasingly graphic images on cigarette packets – simply become the background noise of the self-righteous.

 

The refusal to be inoculated against H1N1 pandemic influenza once a vaccine was developed for it in 2009, for example, did not stem from the social media propagation of ‘rumours’ and ‘speculation’ acting on ‘volatile’ public opinion, as some supposed. Rather, and more damagingly still, it was a conscious rejection led by healthcare workers themselves, informed by their own experience of the virus and inured to the declarations of senior officials who announced that ‘it really is all of humanity that is under threat’, as well as to those who responded uncritically in kind, developing models where none applied.

 

The language of warnings has shifted over the years from articulating threats, which could promote individual responsibility, to simply eliciting desired behaviours. Indeed, the proliferation of biological metaphors – ideas go viral, individuals are vulnerable, activities are addictive – reflects the demise of any wider moral or political outlook. But encouraging a responsive sensitivity and tacit acceptance by evoking negative emotions can readily backfire. It is unlikely to generate a critical culture or social solidarity.

 

So – do warnings work? It depends. Facts alone do not motivate many. It is how they are interpreted that matters. And the framing of these today often dismisses our agency and promotes a powerful sense of determinism. The Nobel Prize-winning economist Daniel Kahneman noted how ‘[t]here are domains in which expertise is not possible’. Decision-making – like democracy – is a moral choice in which we are all equals.

 

Not everything of value has a value and few things that are worthy have a worth. That is why the singular pursuit of evidence and data by those in authority, with a view to inducing acceptance and behaviour change, fails to inspire those who seek more from life than the mere protection of the state. Where are the ideas and ideals capable of leading us beyond narrow, existential concern for our own well-being and towards a broader appreciation of the potential of the collective human project?