How do we measure if we're having an impact?

Note: I wrote this post about the excellent report when it first came out in March 2020, but posting it was delayed by lockdown, and since then, sadly, MobLab has had to close down. Although the context for some of our campaigning has changed, the themes in it feel as important as ever.

The team at Mobilisation Lab has done the campaigning community a huge service by bringing together the ‘Measuring People Power in 2020’ report, which surveyed 500+ changemakers about the metrics they’re using in their campaigning. It’s an important read for any campaign leader.

(Full disclosure: I was involved in the advisory group for the report.)

It’s full of takeaways, but the one that has got me thinking is that it’s time for us to drop the vanity metrics and really push into finding measures that capture the depth of our work.

We’ve known for years the limits of vanity metrics – measures like list size or page views that focus on the breadth of activity happening, but can often have little bearing on the depth of our advocacy or the impact it’s having.

But the report finds that most of us are still using them – 91% of respondents say they do – and, importantly for leaders, that senior management may be perpetuating this by signalling an ongoing interest in them.

The report also finds that respondents perceive only a moderate or small amount of support from senior leaders for measuring people power.

So if we’re to change that, we’ll need to be part of leading it. 

Now I know from my own experience leading campaigning and organising work at Save the Children UK that it’s easy to get enthralled with vanity metrics. They’re easy to report on, they stand up well against similar figures presented by other colleagues, and they make you feel good when explaining them to the CEO – who doesn’t want to be able to report the number of campaigners who signed your latest petition?

But they limit the story we can tell about what it takes to create change, and they prevent us from doing the hard work needed to find new measures of people power. Of course, I’d love to say that the report has found a single unifying metric we can all use to explain the impact of our campaigning, but we don’t have that.

So where do we go next?

‘The holy grail of people power is a measurement that captures (a) the breadth of a campaign or organization’s reach, (b) the depth of sustained supporter engagement and leadership, and (c) the impact these factors have on achieving the mission’.

https://mobilisationlab.org/resources/measuring-people-power/

Helpfully, the report has some thoughts about what we can do differently, and how we might start the search for people power metrics that reflect that holy grail.

1. We need to talk about power in our measurement – our focus as campaigners is change; a good day is when the work you’re doing comes together to win change. But how many of our metrics reflect this? Are they rooted in an understanding of power as something dynamic, something that shifts and needs to reflect the theories of change we’re using? Are we adapting our measures to how our campaigns expect change to happen?

2. Look outside our organisations to learn from others – the report highlights how some organisations are experimenting with different approaches to measuring people power, focusing more on the depth and impact of their movements. For example, Friends of the Earth in the Netherlands has moved to measuring the leadership capacity within its movement.

There is also interesting literature coming out of academia on the evidence for which approaches work – the report is full of useful snippets of insight from academics, for example work finding that the volume of contact with decision-makers may matter less than the quality of those contacts.

3. Make it playful and fun – there is a brilliant quote in the report from Rachel Collinson, who says ‘a measure is good if it is precise, practical and playful’. That resonates, because it’s easy for our conversations on measurement to feel like a chore at the end of the process. How, as leaders, do we ensure we support the creation of measures that bring joy to the process as well as reflection? How do we draw from others who are using behavioural insight to create ways of capturing information, and measures, that are fun?

4. Celebrate what we’re already doing – I’m sure many organisations have already moved beyond vanity metrics, but when the report says that one in five respondents aren’t aware of any promising people power metric, perhaps we’re not good at sharing what we’re doing. We may feel a little fragile about sharing things until they’re perfect, but as leaders, how do we share our ‘work in progress’?

For example, I’m working on a project at the moment that’s looking to build local campaigning infrastructure using a composite metric to measure group health; in another area of our work, we’re looking at how we can measure the number of ‘youth-led’ advocacy initiatives; and with our fundraising colleagues, we’re exploring a lifetime value metric that tries to properly quantify the contribution our supporters make through their campaigning actions. It might not feel groundbreaking, but it’s perhaps helpful to talk about it more.
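To make the idea of a composite group-health metric concrete, here is a minimal sketch of how such a score could be built. The indicator names, weights, and scoring are my own illustrative assumptions, not the actual measures used in the project described above – the point is only to show how several depth-focused signals can be combined into one number.

```python
# Hypothetical sketch of a composite "group health" score.
# Indicator names and weights are illustrative assumptions, not the
# real measures from the project mentioned in the post.

def group_health(indicators, weights):
    """Weighted average of indicator scores, each clamped to [0, 1]."""
    total_weight = sum(weights.values())
    return sum(
        weights[name] * min(max(score, 0.0), 1.0)  # clamp to [0, 1]
        for name, score in indicators.items()
    ) / total_weight

# Example: one local group scored on three illustrative dimensions.
example = {
    "meeting_regularity": 0.8,  # share of planned meetings actually held
    "leadership_depth": 0.5,    # share of members taking on lead roles
    "external_activity": 0.6,   # campaigning actions this quarter, scaled 0-1
}
# Weight leadership depth more heavily, reflecting a depth-first focus.
weights = {"meeting_regularity": 1, "leadership_depth": 2, "external_activity": 1}

print(round(group_health(example, weights), 2))  # → 0.6
```

A weighted average like this keeps the measure ‘precise and practical’ in Collinson’s terms: each indicator stays legible on its own, while the weights make the organisation’s theory of change explicit.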

(If anyone’s interested and based in the UK, I’d be up for convening a session where we all bring our current ‘works in progress’ – get in touch via Twitter.)

So, lots to think about – and the report helps to start more of a conversation about how we talk about, and measure, what’s working and what isn’t.
