Development action with informed and engaged societies
After nearly 28 years, The Communication Initiative (The CI) Global is entering a new chapter. Following a period of transition, the global website has been transferred to the University of the Witwatersrand (Wits) in South Africa, where it will be administered by the Social and Behaviour Change Communication Division. Wits' commitment to social change and justice makes it a trusted steward for The CI's legacy and future.
 
Co-founder Victoria Martin is pleased to see this work continue under Wits' leadership. Victoria knows that co-founder Warren Feek (1953–2024) would have felt deep pride in The CI Global's Africa-led direction.
 
We honour the team and partners who sustained The CI for decades. Meanwhile, La Iniciativa de Comunicación (CILA) continues independently at cila.comminitcila.com and is linked with The CI Global site.

Learning from Behavioural Changes That Fail

Affiliation

Queen Mary University of London (Osman, McLachlan, Fenton, Neil); King's College London (Löfstedt); Health and Medical University (Meder); Max Planck Institute for Human Development (Meder)

Summary

"Understanding why behavioural changes fail, and being able to anticipate possible types of failures when designing interventions could help to save time and public funds invested in these techniques, and overall increase their success in achieving the desired behavioural change." - Magda Osman

The popularity of the behavioural change approach is evident in a count of at least 83 behaviour change frameworks, which are used by institutions such as the World Health Organization (WHO) and governmental agencies worldwide to provide practical and ethical guidance for designing and implementing interventions to spark positive change. A recent example is health agencies drawing on such frameworks to develop campaigns to mobilise public compliance with strategies to manage the COVID-19 pandemic, such as social distancing and wearing masks. This article argues that there is value in examining interventions that inadvertently fail to achieve their desired behavioural change - not only those that succeed. It (i) highlights that reports of failure and backfiring are common in the literature; (ii) identifies commonalities and causal pathways underlying these failures; and (iii) presents a taxonomy derived from the identified commonalities that researchers and practitioners can use in characterising and analysing different types of failure as a foundation for evidence-based policy.

The researchers analysed 65 articles, published between 2008 and 2019, that identify behavioural approaches that failed to meet their intended objectives. An example of a behavioural approach is the use of "nudges", a collection of approaches designed to alter choice environments ("choice architectures") to achieve behaviour change. The researchers identified 8 different types of failure in total, constituting a taxonomy depicted in figure 1 of the paper. In short:

  • No treatment effects - e.g., using a moral persuasion message to increase tax compliance may have no overall effect.
  • Backfiring - e.g., dieters can express higher desire for and show increased consumption of unhealthy foods after receiving a message highlighting the negative aspects of particular food items.
  • Treatment offset by negative side effects - e.g., an environmental campaign may reduce residents' water consumption but at the same time increase their electricity consumption.
  • No treatment effect, but positive side effects - i.e., interventions yield unforeseen positive consequences, even if the actual target behaviour remains unchanged.
  • Only proxy changes, not actual criterion - e.g., information provision may increase healthy food selections in a simulated supermarket but have no long-term impact on body mass index and lifestyle.
  • Positive treatment effect is offset by later behaviour - e.g., charitable organisations sending reminders to potential donors can increase donations but later lead to higher unsubscriber rates from the mailing list, thereby jeopardising future donations.
  • Environment does not support change - e.g., in the 1980s, Croatia moved from an opt-in to an opt-out system for organ donations, but for several years this had little impact on actual donation rates because the necessary medical infrastructure was not in place.
  • Intervention triggers counteracting forces - e.g., consumers may struggle to curb their consumption of unhealthy sugary beverages in response to regulators' choice-restricting methods, such as a sugar tax, because drinks companies challenge the regulation.
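For readers coding up studies in their own reviews, the eight failure types above could be captured as a simple enumeration for tagging intervention records. A minimal sketch in Python; the names are our paraphrases of the taxonomy, not the paper's exact labels, and the study record is hypothetical:

```python
# Illustrative encoding of the paper's eight failure types as an enum,
# usable for tagging studies in a systematic review. Names are paraphrases.
from enum import Enum, auto

class FailureType(Enum):
    NO_TREATMENT_EFFECT = auto()          # no overall effect
    BACKFIRING = auto()                   # effect opposite to the one intended
    OFFSET_BY_NEGATIVE_SIDE_EFFECTS = auto()
    NO_EFFECT_POSITIVE_SIDE_EFFECTS = auto()
    PROXY_CHANGE_ONLY = auto()            # proxy moves, criterion does not
    OFFSET_BY_LATER_BEHAVIOUR = auto()
    UNSUPPORTIVE_ENVIRONMENT = auto()
    COUNTERACTING_FORCES = auto()

# Hypothetical study record tagged with a failure type
study = {"id": "unhealthy-food-messaging", "failure": FailureType.BACKFIRING}
print(study["failure"].name)  # → BACKFIRING
```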

The most common types of interventions resulting in failure were those involving social norming or social comparisons, in which individuals are provided with information about the behaviour of their peers in order to encourage a desired behaviour change. Interventions that involved the provision of information through letters or text messaging accounted for almost one-quarter of the failed studies.

As the researchers observe, "a recurring theme across many studies and types of failures is that..., for different subgroups, the interventions either worked, did not take, or backfired. This constitutes a practical gain for systematically analysing failures and can help answer the question 'what works for whom and why (not)?' by pointing researchers and practitioners to relevant factors they need to consider before embarking on new studies."

The article also examines the use of computational causal modelling techniques (e.g., causal Bayesian network approaches) to map out the factors that can influence specific behavioural interventions and their likelihood of success. A causal model contains 3 basic elements - nodes, arrows, and probabilities - and involves the creation of probability tables. Such techniques may give researchers and decision-makers a way of mapping out in advance both what might work and what might undermine the intervention.
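To make the nodes-arrows-probabilities idea concrete, here is a minimal causal Bayesian network sketch in plain Python. All node names and probability values are illustrative assumptions (loosely echoing the sugar-tax example above), not figures from the paper:

```python
# Toy causal Bayesian network: Intervention (I) -> Counteracting force (K),
# and both I and K -> desired Behaviour change (B). We condition on I and
# marginalise over K. All probabilities are made-up illustrative values.

P_K_given_I = {True: 0.6, False: 0.1}  # pushback more likely once regulation exists

# Conditional probability table for B given its parents (I, K)
P_B_given_IK = {
    (True, True): 0.3,   # intervention present, but pushback undermines it
    (True, False): 0.7,  # intervention present, no pushback
    (False, True): 0.1,
    (False, False): 0.1, # baseline rate of the desired behaviour
}

def p_behaviour_change(intervention: bool) -> float:
    """P(B | I) obtained by summing out the counteracting-force node K."""
    total = 0.0
    for k in (True, False):
        p_k = P_K_given_I[intervention] if k else 1 - P_K_given_I[intervention]
        total += p_k * P_B_given_IK[(intervention, k)]
    return total

print(p_behaviour_change(True))   # → 0.46: pushback partly offsets the effect
print(p_behaviour_change(False))  # baseline without the intervention
```

Even this toy model shows the analytic payoff the authors describe: the expected gain from the intervention shrinks once a plausible counteracting pathway is made explicit, which can be examined before a field study is run.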

In conclusion: "Rather than asking in hindsight after an intervention failed 'what went wrong?', researchers and practitioners should ask in advance 'what could go wrong, and how could it go wrong?'... Doing this exposes the relevant factors and potential causal dependencies beyond the local cause and effect relationships that are at the heart of the policy problem. Conducting such a principled analysis in advance is particularly helpful when conducting field experiments, where there is great difficulty in controlling for externalities."

Source

Trends in Cognitive Sciences https://doi.org/10.1016/j.tics.2020.09.009 - sent via email from Magda Osman to The Communication Initiative on October 30, 2020, and sourced from: "Nudges Fail More Often than Is Reported, Experts Warn", by Queen Mary University of London, MedicalXpress, October 28, 2020 - accessed on November 2, 2020. Image credit: cienpies