What has the IMD ever done for us?

No-one is arguing that the Index of Multiple Deprivation is bad, but does it actually make a difference? It's a genuine question that's been bugging me for a while, and so, with a bit of apprehension, I've written about why.

I'm admittedly a little apprehensive about writing this blog, but it's a question that has been on my mind for a while. There are a couple of reasons for this apprehension. Firstly, in the data world it's pretty much a given that the Index of Multiple Deprivation (IMD) is a good thing, and I don't want to lose my good-data card. Secondly, maybe the answer to my question is so blindingly obvious that I'm just not smart enough to see it. But it's been bugging me for a while, and so here I go...

I've been using the Index of Multiple Deprivation for years. I've pointed people to it. I've built products with it. I've laid it over maps. We all have, right? You've probably cited it in reports, used it to justify funding bids, and watched it shape commissioning decisions across the country.

And yet I've always had this niggling question: has it actually made anything better?

This isn't a technical critique. I'm not here to argue about the weighting of domains or the methodology behind the indicators. Smarter people than me have done that work. What I'm asking is something more fundamental: has the existence of the IMD, this tool we've embedded so deeply into how policy gets made across the UK, actually improved outcomes for the communities it describes?

The case for

Some would argue: yes, of course, stop asking ridiculous questions, Tom. The IMD introduced a multidimensional way of understanding deprivation: not just income, but health, education, employment, housing, crime, the living environment. It helped us think about how different dimensions of need intersect. It gave us a shared language and a common dataset that everyone from local councils to national government could point to.

And that's a good thing. Having a consistent, open dataset that covers England is genuinely useful. But has it made a difference?

The uncomfortable evidence

In 2021, the Institute for Community Studies at The Young Foundation published something that should have landed like a bombshell. They found a 0% change in the relative economic advantage of the UK's most deprived neighbourhoods over 15 years, despite targeted investment of over £20 billion across consecutive Labour, coalition, and Conservative governments.

Zero percent, over fifteen years in which we've had the data to use.

The most deprived places in 2004 were, by and large, the most deprived places in 2019. Research looking at deprivation trajectories from 1971 to 2020 found that around 82% of areas in the most deprived decile in 2004 were still there in 2010, and that by 2015-2019 nearly 88% of the most deprived areas remained in that bottom decile.

So here's my question: if the IMD has been the primary thing guiding where billions of pounds of investment go, and the places it identifies as most deprived remain most deprived decade after decade, what exactly is it doing for us?

A static snapshot dressed as understanding

The IMD is updated roughly every five years. It's built from administrative data: benefit claims, educational attainment, health records, crime statistics. It tells us about communities through the lens of what government systems already collect. It's a portrait drawn from the paperwork.

And it's a relative measure. It can tell you one area is more deprived than another, but not by how much. It can't tell you whether things are getting better in absolute terms. An area could see genuine improvements in the lives of its residents and still sit in the same decile, because everywhere else improved too. And there's another problem, a big one for me: the IMD describes areas, not people.
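That relative-ranking point is easy to demonstrate with a toy example. This is a minimal Python sketch using made-up scores (nothing here comes from the real IMD data or methodology): every area improves in absolute terms, yet the ranking, and therefore each area's decile position, is completely unchanged.

```python
# Toy example with invented deprivation scores: higher = less deprived.
scores_2004 = {"A": 10, "B": 20, "C": 30, "D": 40}

# Fifteen years later, every single area has genuinely improved...
scores_2019 = {area: score + 15 for area, score in scores_2004.items()}

def most_to_least_deprived(scores):
    """Order areas from most deprived (lowest score) to least deprived."""
    return sorted(scores, key=scores.get)

# ...but the relative order is untouched, so area "A" is still reported
# as the most deprived place, exactly as it was in 2004.
print(most_to_least_deprived(scores_2004))  # ['A', 'B', 'C', 'D']
print(most_to_least_deprived(scores_2019))  # ['A', 'B', 'C', 'D']
```

The same logic applies however many areas you rank: a relative index can only ever redistribute positions, never show that everywhere got better.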

False certainty?

I sometimes wonder whether the IMD gives us a false sense that we know what's happening and can control it.

Think about how it gets used in practice. A commissioner pulls up the IMD map, or uses one of the many tools (more on this later) that list the areas they need to worry about. They see the red areas, write them into their strategy document, and feel they're doing the right thing. There's a comfortable certainty to it: the data says this, so we do that. Is this just New Public Management wrapped up in maps?

But does the IMD actually tell us about what's happening in those communities? Does it tell us about the relationships between people? About the assets they hold, the community groups, the informal networks, the knowledge and capability that exists in every place? Does it capture what people care about, what they're trying to build, what keeps them up at night?

And if I'm certain about anything, which at the moment is not a whole lot, it's that real change on a local level only comes from people who live there having real agency.

The IMD tells policymakers about communities. It doesn't enable communities to tell their own story. And there's a world of difference between those two things.

When certainty stops us learning

Here's what I think might be the real cost. When we have a dataset that feels authoritative and comprehensive (seven domains! thirty-nine indicators! every small area in England!) we tend to stop asking questions. We stop being curious. We stop going to places and listening to people. Because we already know. The data told us.

And so we stop learning.

We wrap new programmes, new funding rounds, new policy initiatives in the same dataset, the same statutory data repackaged in a different way. The Independent Commission on Neighbourhoods just released a new dashboard about neighbourhoods. It's nice, but guess what: it's the IMD with a couple of other indices.

New project, new programme management framework, different logo. But the same underlying assumption: that understanding deprivation means looking at the numbers, and that looking at the numbers means we know what to do. This isn't a knock on ICON, either; we've all done it. And again, maybe I'm just not smart enough to see that this really does make a difference.

So what instead?

I'm not arguing we should throw the IMD away. Data matters. Understanding patterns of deprivation at scale matters. But I am arguing we need to be honest about what it is and what it isn't.

The IMD is a backward-looking snapshot of administrative data. It's useful for understanding broad patterns. It's not a substitute for understanding communities, and it's certainly not a basis for designing interventions.

Perhaps what we should be doing is leaning into the uncertainty. Acknowledging that no dataset, however well-constructed, can capture the complexity of what's happening in a place. That the right response isn't to point at a map and say "there, do something about that" but to go to that place and ask "what's going on here? What do you need? What are you already doing?" Maps as conversations, not conclusions.

An invitation

I'm genuinely asking these questions, not just rhetorically. I've used the IMD extensively and I suspect many reading this have too. So I'd love to know:

Has it changed something for the better in your experience? Has it led to a decision that wouldn't have been made otherwise, one that actually improved things? Or has it become just something we do? A box we tick, a map we show, a certainty we lean on because the alternative, admitting we don't really know, is too uncomfortable?

I think there's a conversation to be had here. One about data and power, about who gets to define a place, and about whether the tools we've built to understand deprivation might actually be getting in the way of doing something about it.
