Focusing on what works, not just what is new


This blog is the first in a series looking at five features the programme has identified that help organisations replicate effectively and build an evidence base for their work. Each of these features is explored in the Realising Ambition mid-programme report.

Here Tim Hobbs and Cassandra Ohlson from the Social Research Unit explore the first feature: the importance of having a tightly defined intervention with a clear focus on outcomes.

Innovation has a lot going for it. It suggests momentum and energy, new and better ways of doing things, and efficiencies in both time and money. Innovation in public services has long been fashionable. It assumes that services could be better and a greater impact on outcomes could be achieved – and who could argue with this?

However, the problem with focusing on ‘the new’ is that it can breed initiatives that are rarely sustained or replicated. As a result, even services or programmes with a body of evidence supporting the difference they make are overlooked in the search for the new (and remember, once upon a time these programmes were probably viewed as innovative too).

But what if we looked for the things that work and tried to copy them, rather than looking for something different? This is the thinking behind Realising Ambition, a £25m Big Lottery Fund programme testing the replication of promising and evidence-based interventions as a way of improving outcomes for children and young people. This means spreading good practice, underpinned by strong evidence, to new geographical areas or new groups of people. We think this is just as exciting as innovation, bordering on edgy in the current climate!

Copying – sounds simple?

Replicating, or copying, existing programmes might sound easy. Our experience of the Realising Ambition programme has taught us that it is hard work, but that a few critical factors can help.

The recent Realising Ambition mid-programme report highlighted five characteristics of organisations that replicate successfully. Here we take a closer look at the first: being totally clear about what you are attempting to replicate. We refer to this as ‘intervention specificity’ – having a tightly defined intervention, or programme. We think it is the foundation upon which effective replication is built.

Without being clear about what is being replicated, it is easy to end up with 10 variable services, with variable outcomes, delivered in 10 different areas, rather than one well-defined and evidenced programme replicated across those 10 areas. Research has shown that sticking as closely as possible to the way a programme is intended to be delivered makes it more likely that we’ll achieve the impact on outcomes we hope for.

What makes the difference in replication?

A tightly defined intervention has a clear focus on what outcomes it is seeking to achieve, for whom, and how the service or its component activities are expected to bring about changes in those outcomes. This might sound straightforward, but it can be surprisingly difficult to achieve.

A logic model can help you articulate this: a visual representation of the connections between the intervention and the desired outcomes, ideally presented alongside evidence supporting each link in the chain.

This kind of clarity really helps when it comes to replication.

All 25 Realising Ambition interventions have developed a model to express the logic underpinning what they do, but these models aren’t set in stone; some are still evolving as organisations reflect on their practice and refine their delivery. External support and challenge can be useful: at the Social Research Unit at Dartington, we have supported the organisations in the Realising Ambition portfolio to get to this point.

Being clear and specific about the ‘what outcomes’, ‘for whom’ and ‘how’ paves the way for successful replication, but other things matter too.

It’s important to test and refine the assumptions of a logic model using both existing evidence (what has worked before, in what circumstances and for whom), and routinely collected data on the outcomes for the children and young people you’re working with.

It’s also vital to monitor and demonstrate delivery that is faithful to the original programme design. This can be supported by the development and use of eligibility criteria and implementation manuals to guide and monitor delivery.

Being really clear about the intervention also helps produce more accurate start-up and unit cost estimates. This in turn informs a solid and realistic business plan, so that you can replicate again, and again, and again…

Paving the way for innovation

Lastly, a commitment to learning and continuous improvement requires putting all of these pieces together. Knowing what is being replicated, how well it is being delivered and the impact it has on outcomes helps organisations to further refine their intervention and improve the quality of its delivery, all of which aids future replication.

It may not be edgy, but when it comes to replication this attention to detail, specificity, logic and evidence is what counts.

But one of the most exciting things about replication is that it paves the way for innovation. Replication allows us to test things that have worked in other places, in new areas and with new audiences. It suggests opportunities for innovation – what might we try to do differently to improve a programme? What can change and what needs to stay the same? Replication and innovation are two sides of the same coin – working towards better outcomes for children and young people.

Where to find out more

  • The Realising Ambition mid-programme report discusses the importance of being clear about what you intend to replicate. A copy of the report can be downloaded here.
  • Design & Refine: developing effective interventions for children and young people – a free guide focusing on designing an intervention and planning for its implementation. This draws upon the Social Research Unit’s ‘What Works’ Standards of Evidence.
  • Also take a look at our freely accessible webinars on logic models, fidelity monitoring, manuals, and more!

Tim Hobbs PhD is Head of Analytics and Cassandra Ohlson a Researcher at The Social Research Unit.
