🧵Impact Report Best Practices ⬇️
Just finished reading the latest
@dofe.org Impact Report and their Technical Report, and wow, the level of transparency and rigour in their measurement framework is impressive. 📊👏
impctlab.uk/DoEImpactReport
impctlab.uk/DoETechnicalReport
(1/7)
They’re using validated questions (like those from ONS wellbeing measures), which is such a refreshing change from the vague and untested indicators we often see in the charity sector. ✅
This means their data has both credibility and comparability. (2/7)
I also love how upfront they are about the sample, both in terms of who responded and how representative it is. Too many reports hide this detail in footnotes or ignore it entirely. The Duke of Edinburgh's Award is setting a standard here. 🏅
(3/7)
However, there’s a challenge that stood out.
More than 50% of participants come from areas of low deprivation (IMD deciles 8-10, i.e. the least deprived and typically more affluent areas).
While that isn't inherently a bad thing, it raises questions about whether the programme is reaching those who might benefit the most. 🏙️↔️🏞️
(4/7)
A key opportunity for improvement would be stronger segmentation of their impact data.
For example, we need to see how outcomes differ by demographics like deprivation level, ethnicity, or region. This is critical to understanding equity of access and impact. 🔧
(5/7)
Imagine the power of being able to say,
“Young people from high deprivation areas experienced this level of improved wellbeing compared to their peers from lower deprivation areas.”
That’s where the real story lies. 💡
(6/7)
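(Aside for the data folks: here's a rough sketch of what that segmentation could look like in Python/pandas. The data and column names, imd_decile and wellbeing_change, are hypothetical and invented for illustration, not taken from the DofE report.)

```python
import pandas as pd

# Hypothetical participant-level survey data; values and column names are
# illustrative only, not drawn from the DofE Technical Report.
df = pd.DataFrame({
    "imd_decile": [1, 2, 9, 10, 3, 8, 1, 10],  # 1 = most deprived, 10 = least deprived
    "wellbeing_change": [1.2, 0.9, 0.4, 0.3, 1.1, 0.5, 1.4, 0.2],  # post minus pre score
})

# Group participants into deprivation bands, then compare average wellbeing change.
df["deprivation_group"] = pd.cut(
    df["imd_decile"],
    bins=[0, 3, 7, 10],
    labels=["high deprivation (IMD 1-3)", "mid (IMD 4-7)", "low deprivation (IMD 8-10)"],
)
print(
    df.groupby("deprivation_group", observed=True)["wellbeing_change"]
      .agg(["count", "mean"])
)
```

Even a simple cut like this would let the report say how outcomes differ across deprivation bands, not just overall.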
Overall, I’m super impressed. The DofE is showing that rigorous, transparent, and well-communicated impact measurement is possible at scale. They’re already doing so much right, but with sharper segmentation, they could lead the field even further. 🚀
(7/7)