Find out what happened when the scrutiny review library opened its doors and invited people in
Scrutiny review library
Setting up and designing the new scrutiny review library was something of a leap of faith. At the outset, there was no guarantee that councils would take the time to contribute, or that the idea would gain traction beyond initial curiosity. Like many sector-wide initiatives, its success depended entirely on voluntary engagement. The big idea was to create a practical, shared space where councils could see how others have approached the same thorny challenges—bringing together bite-sized insights, real examples, and links to the work behind them. Councils were invited to share submissions and help build a stronger, more informed approach to scrutiny across the sector.
The question was simple: would people see the value and act on it?
We launched the library through our newsletter in July 2025, and the early response was both immediate and revealing.
Submissions didn’t build gradually; they arrived in a surge. August 2025 alone accounted for around 80% of all entries. That kind of concentration tells its own story. The launch created a clear moment, and councils responded to it.
However, momentum faded over time. Without continued promotion or prompting, submissions dropped off sharply after August. A few smaller clusters appeared in September, and then activity slowed to a trickle. By the end of March 2026, the library stood at 63 submissions, a respectable total, but one clearly shaped by that early spike rather than sustained growth.
There’s a lesson here about how this kind of resource behaves. It doesn’t grow passively. It needs nudging.
What does the quality of reviews tell us about scrutiny in practice?
If the volume tells one story, the content tells another, and this is where the library becomes most useful.
While the library wasn’t designed as a formal quality assessment exercise, some clear patterns do emerge.
At its best, the work submitted reflects a confident and maturing scrutiny function. Reviews are generally well structured and outcome-focused, with a clear emphasis on practical recommendations rather than description for its own sake. Many draw on a mix of evidence such as stakeholder views, service data, and external comparisons to support their conclusions.
There is also a noticeable shift towards more complex, cross-cutting issues. Topics such as cost of living, health inequalities, and environmental challenges feature prominently, suggesting scrutiny is increasingly engaging with whole-system questions rather than narrow service silos. This points to scrutiny at its strongest: focused, evidence-led, and working across boundaries to address real-world issues.
That said, quality is not entirely consistent. There is variation in depth and clarity of impact. Some reviews set out a strong line of inquiry and a clear route to measurable change, while others are more high-level or process-driven.
A recurring gap is around what happens next. Fewer submissions clearly articulate how recommendations will be monitored, or what success would look like in practice. That follow-through, often the hardest part of scrutiny, is also where its value is ultimately proven.
Taken together, this suggests a familiar pattern. Scrutiny is most effective where it is clearly focused and tied to outcomes, and less effective where it becomes broader, more procedural, or disconnected from impact.
Breadth of review topics
The spread of topics across submissions is broad, and in many ways reassuring. The most common areas, climate and environment (16), children and families (15), housing (14), and health (12), align closely with the major pressures facing local government.
Looking across the reviews, most focus on the places people live, things like the environment, housing, and local infrastructure, and on the services that affect people’s day-to-day lives, such as children’s services, health, and inequality. What’s noticeable is how often these issues overlap. These are not marginal issues; they are the things that matter to families, residents and communities.
Beyond these, we see strong representation in community safety (10), economic development and infrastructure (both 9), alongside a wider mix including corporate services, transport, and adult social care.
At the other end of the scale, areas such as digital and finance appear less frequently as standalone reviews. That may not indicate absence so much as integration, as these themes are often embedded in the general business of scrutiny.
Who’s been adding reviews?
Contributions are highly concentrated among a small number of councils. Stockton-on-Tees Borough Council alone accounts for 16 submissions, with East Sussex County Council contributing 6. The library was developing a fanbase of users, keen to spotlight and share their work. In one sense, this is positive. It shows that where councils see value, they are willing to engage repeatedly. The library is not just being used; it is being adopted.
Beyond this core group, most councils submitted only one to three reviews. This creates a familiar “long tail” pattern: a handful of highly engaged authorities driving volume, alongside a wider group contributing more sporadically.
This raises some important questions. The library clearly has value, evidenced by both the volume and quality of early submissions. But its reach is uneven. Only 20 councils in total have stepped forward with their reviews. Without sustained engagement or prompts, participation quickly tails off. And while the diversity of topics is strong, the contributor base is still relatively narrow compared to the sector as a whole.
The library has proven the concept: councils are willing to share scrutiny work when given a platform. But it has also exposed the limits of passive collection. If the aim is to build a truly representative and continuously growing resource, it will require more than a one-off launch: ongoing curation, active outreach, and a clearer incentive for councils to contribute regularly.
What next?
The library has, in many ways, done what it set out to do. It has demonstrated that councils are willing to share their work when given a platform, and that there is an appetite to learn from one another.
But it has also highlighted the limits of a passive approach. If this is to become a genuinely representative and continuously growing resource, it will need more than a one-off launch. It will require active curation, an annual call to action for submissions, and a clearer sense of why contributing matters.
One possible next step is to bring users into the process more directly, perhaps through peer review, shared reflection, or even light-touch benchmarking. Not as a formal assessment exercise, but as a way of building collective ownership and raising the bar over time.
The value of the library isn’t just in what’s been submitted. It’s in what it helps improve next.