Applying ‘dark logic’ in practice – Nick Axford

Not a million miles away from where I write this lies a once-grand seaside resort now better known – at least in parts of the media – for high rates of personal debt and teenage pregnancy. In an attempt to change this, planners had a bright idea: if they built a new road to the area, more people would come to shop, holiday and do business. Regeneration would follow.

Several years passed and the road got built. It did indeed make it easier to get to the resort. In that respect it seemed like a success story. Except for an unanticipated consequence: the road also made it easier to leave. Better-off, more mobile residents have taken the opportunity to head in the opposite direction to shop and work in a neighbouring and now more accessible affluent city.

This is an example of a seemingly good idea having unexpected harmful consequences. Recent years have seen a growing awareness of these in the field of psychosocial interventions designed to improve aspects of children’s health and development.

An influential paper published in 2015 argued that designers and evaluators of programmes should consider up front the possible adverse effects of well-intentioned interventions and seek to monitor them. Just as it is common to develop logic models, explaining the mechanisms by which programme activities will contribute to desired outcomes, so we should develop what Chris Bonell and his colleagues christened ‘dark logic models’. These can be used to guide the evaluation of potential harms.

The impetus came from Bonell’s own experience: an intervention he evaluated to reduce teenage pregnancies had precisely the opposite effect. No-one had expected this. So he and his team went back to the drawing board. They consulted intervention providers and recipients, interrogated the literature and started to pull apart the intervention logic model in an attempt to understand why things had gone so badly wrong.

The resulting paper suggests three approaches to developing dark logic models, one of which is to “consult individuals or groups who have particular insights into local contexts and how interventions might operate within these” (p.97).

***

Also in 2015, results of the Building Blocks trial were published to shocked gasps and considerable disappointment in the field of early intervention. The study evaluated Family Nurse Partnership (FNP), a high-profile, evidence-based home visiting programme for teenage mothers with three decades of trial data from the US demonstrating its impact on child development and maternal life course outcomes. It had been introduced to England by the Department of Health in 2007. Within eight years it was reaching 25% of its target group of teenage mothers. From a scale perspective this was a phenomenal success.

The problem, as the Building Blocks trial showed, was that FNP had no effect on primary outcomes and small effects on only a handful of secondary outcomes. While some critics of the evidence-based enterprise seemed to take delight in this, and Daily Mail headlines bemoaned millions of pounds wasted on ‘teen mothers’, others pondered what had gone wrong and how to respond. Leaving aside the valid debate about whether the primary outcomes and some measures of other outcomes were suitable, the trial was generally regarded as robust and, whichever way you looked at the results, there was clearly scope for FNP and universal services to do better in areas such as smoking cessation, breastfeeding, and domestic violence.

The FNP National Unit seized the initiative and set up a project in 11 of its (then) 130-plus sites to adapt the programme and test those adaptations. Part of this involved co-production teams comprising nurses, commissioners, local experts and clients working together to design clinical adaptations. As such, the work sought to meld practice and client experience with scientific evidence on how problems develop and what works to prevent or address them.

Teams were assisted by the Dartington Service Design Lab and the FNP National Unit to develop four key documents that together captured the DNA of each adaptation: a logic model (diagram and narrative); an intervention description (using the TIDieR framework); a context map setting out local and wider cultural, social, economic and political factors deemed likely to affect the success of the adaptation; and a dark logic model incorporating a rating of the likelihood of the possible adverse effects of the adaptation and, if necessary, suitable mitigating actions.

***

Although sites mainly worked independently, there were commonalities across the eight clinical adaptations in the kinds of dark logic that were hypothesised.

Some were fairly general. One was the concern that introducing new content would displace other intervention activity; for example, extra time spent addressing smoking cessation would mean that other important issues, such as mental health, could not be covered. Another worry was that addressing difficult issues, such as child neglect, would cause the client to disengage or drop out of the programme entirely.

Concerns were also expressed about the adverse effect on specific outcomes, albeit not those targeted by the adaptation. For instance, trying to engage wider family in supporting the client to breastfeed could contribute to family conflict if the grandmother was resistant – perhaps because she had not breastfed herself and felt guilty or judged. Similarly, the use of video coaching to help mothers interact more sensitively with their babies was deemed likely to trigger feelings of low self-esteem in some clients.

Then there was dark logic that focused on the functioning of the programme and the wider system. Where innovations required nurses to undertake extra training and make significant changes to their practice, co-production teams thought this might make nurses’ jobs harder, particularly in a context of financial cuts and general upheaval, with knock-on effects for the quality of their work. In one site the introduction of new approaches to maternal mental health was hypothesised to impede multidisciplinary work unless it was carefully aligned with what other agencies were doing.

***

Two general observations about these hypotheses can be made. The first is that, although in some cases the hypothesised dark logic suggested an adverse effect on the targeted outcome (what Bonell and colleagues call ‘paradoxical effects’), it is striking how many concerned other outcomes (referred to in the article as ‘harmful externalities’).

Second, while some of the potential harmful effects that were articulated would affect the client directly, more often than not they are indirect: mediated by other people in the client’s life, or even by their impact on the functioning of the intervention and the wider service system.

It is also worth reflecting briefly on the process by which the dark logic was articulated. Broadly, the approach was well received. Co-production teams found the idea intuitive and helpful. Importantly, it clearly shaped adaptation designs; in some cases, seemingly good ideas were shelved after reflection, and in others initial ideas were refined and mitigating actions were put in place. This is not the place to detail what those mitigating actions comprised, but they included enhanced monitoring to detect early signs of problems, extra nurse training, and strong referral pathways and safeguarding protocols to guard against harm to the client and their child.

***

Thinking ahead to next steps, an obvious opportunity is to develop a comprehensive typology of dark logic that extends beyond FNP and the early years. This could be used prospectively when designing interventions, almost like a checklist: ‘Have you considered the possibility of…?’.

A related innovation would be to examine the empirical evidence for the likelihood of different types of hypothesised dark logic materialising. Put another way, it would be helpful for those designing interventions to have a better understanding of which potential harms are actually known to occur. There is strong evidence, for example, that putting young people who are engaged in antisocial behaviour or crime together with similar young people – in a group intervention, for example – contributes to delinquency (because they learn from one another, normalise the maladaptive behaviour, and so on). Criminologists call this ‘social deviancy training’. But other types of dark logic might simply be needless worry, which, if it stifles risky but potentially valuable innovation, could – ironically – itself be detrimental.
