I’ve blogged before about which impacts of digital engagement we should be measuring. For me – with ongoing policy engagement in particular – I’ve been especially interested in whether we could measure the impacts on both individual policies and the teams who develop them.
In response to commitments made in the BIS Digital Strategy, we’ve just published our first (and second) detailed case studies, with more to come. They both give a fairly balanced view of what is possible through online engagement – and it’s interesting to see in the science and society example (in which I have an added interest, having seen that site develop from the policy side) that what has worked before can’t always be guaranteed to keep working. But as we’ve been working on the case studies, and going back to teams to assess where they are now, it’s great to see that there were real impacts beyond the usual suspects of shares, likes and views, including:
- Tangible effects on the types of evidence collected during a review – with one team commenting on the high quality of responses and evidence obtained;
- Allied to that, real-time monitoring of responses proved useful in adapting thinking on an ongoing basis. It was felt that – as responses came in within hours, rather than the months associated with traditional consultation processes – it was easier to get a feel for the themes that were emerging. It didn’t happen in these particular reviews, but this is potentially valuable for dispelling myths before they become an issue;
- The digital approach being seen as a more open and transparent way of making policy;
- Digital approaches actively considered for the future by the teams themselves, and more widely by others around them.
There was also a noticeable increase in teams’ digital skills, with impacts including:
- Teams taking on responsibility for monitoring and evaluation on an ongoing basis;
- A recognition of the value of listening and monitoring as a tool which could potentially inform approaches to briefing;
- One individual having a defined digital engagement objective;
- Transferring those skills to new teams.
It wasn’t specifically mentioned, but in many ways the individuals involved in these projects became ambassadors for digital – they may be asked to talk to other teams about their experiences, for example.
I’m very aware of the dangers of drawing too many conclusions from relatively isolated examples, and there’s no guarantee that every team using digital for a specific project will continue to do so. But these are all promising signs that digital can be effectively incorporated into policy teams’ day-to-day activities. Let’s see what our next batch of case studies brings – but in the meantime, do let us know what you think.