A lot of official statistical releases cross our desks at Full Fact HQ. We don't get time to devour them all.
But one in particular caught our attention this month: a report by the Justice Data Lab, based in the Ministry of Justice, on the Family Man Programme.
Family Man is a family relationships programme for male prisoners aimed, ultimately, at reducing re-offending. It's run by Safe Ground, a charity.
The Justice Data Lab report contains analysis to help Safe Ground understand whether or not the programme is working. The Lab provides this service to other organisations working with offenders as well, and makes its findings public.
It compares the re-offending rates of people in a particular programme, like Family Man, with the re-offending rate of a control group of similar offenders.
That requires sophisticated statistical techniques, and an understanding that not all differences between a programme group and a control group are significant.
A committed charity awaiting the results of a new type of intervention might be eager to take a lower re-offending rate after one year as proof that their approach is the one that works. But there may not be enough evidence to draw those conclusions from a limited sample.
It's the job of statisticians to make those and other caveats clear. We were struck by how effectively the Justice Data Lab does this.
In the Family Man report, the numbers say that 31% of offenders who completed Family Man went on to re-offend within one year, compared with 37% for a control group.
The summary on the first page contains the following guidance:
What you can say: There is insufficient evidence at this stage to draw a conclusion about the impact of Family Man Programme run by Safe Ground on the one-year proven re-offending rate.
What you cannot say: This analysis shows that attending the Family Man Programme run by Safe Ground decreased the one-year proven re-offending rate by 6 percentage points, or by any other amount.
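To see why a six percentage point gap might not clear the evidential bar, here is a minimal sketch of a two-proportion z-test in Python. The sample sizes used (60 participants, 600 controls) are invented for illustration and are not the Justice Data Lab's figures; the Lab's actual methodology, based on matched control groups, is more sophisticated than this.

```python
import math

def two_prop_z(p1, n1, p2, n2):
    """z statistic for the difference between two sample proportions."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# 31% re-offending among a hypothetical 60 participants,
# vs 37% among a hypothetical 600 matched controls
z = two_prop_z(0.31, 60, 0.37, 600)
significant = abs(z) > 1.96  # 5% significance threshold (two-tailed)

print(f"z = {z:.2f}, significant at 5% level: {significant}")
```

With a programme group this small, |z| comes out well below 1.96, so the apparent six-point improvement could plausibly be chance variation; with several hundred participants, the same percentages would be significant.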
The Justice Data Lab told us that these clear health warnings were rolled out in response to questions from charities receiving these highly technical reports, asking what they could and couldn't say in public about the impact of their re-offending projects.
In this case, the small number of Family Man participants assessed contributed to the uncertainty, so there's an argument that the exercise is of limited use until more data become available. Other reports, with bigger sample sizes, have more concrete conclusions under the "What you can say" heading.
We haven't seen anything like this done elsewhere. Other producers of official numbers may have some idea of the flawed conclusions people tend to draw from more regular statistical releases, and could consider applying a similar system to ward off misunderstandings.
It might be that some already do, and we just haven't spotted it. We plan to ask the National Statistician's Good Practice Team. If you've seen other examples of clear communication in statistical releases that we should highlight to them, please let us know.