
Live Labs blog: Monitoring & Evaluation - a new approach

Karen Farquharson, Director of Proving Services, has performed the monitoring and evaluation role for the Live Labs programme. This blog accompanies the White Paper, A New Approach to Monitoring and Evaluation, and offers some observations, reflections and learning from the evaluation process to date.

Since its formation in 2003, Proving has completed many independent assessments of project and programme portfolios in both the private and public sectors, but the nature and ambition of this innovation programme were different. We welcomed the opportunity to participate.

The ethos of the Live Labs programme is distinctive. It is focused, inclusive and collaborative, with the objective of acquiring valuable learning and delivering tangible benefits for the sector. Governance throughout the programme is intended to be light-touch and agile, responding quickly to support Live Lab projects facing emerging issues and challenges. The evaluation and monitoring process had to reinforce this approach and contribute to the success of the programme.

The evaluation process adopted by Proving is based on academic research, industry best practice and client experience. It is designed to be focused, accurate, flexible and efficient, reducing unnecessary bureaucracy and cost. The emphasis is on the delivery of valuable outcomes, answering the ‘So What?’ test, rather than on assessing the management of inputs (costs and resources). It is therefore critical that the evaluation factor set for the two dimensions, Attractiveness and Achievability, accurately reflects the objectives, characteristics and constraints of the programme. The programme management team and Live Lab representatives were all involved in defining and weighting the factor set, and this was time well spent. The factor set has remained constant during the life of the programme, with only the factor weightings adjusted to reflect changing priorities as the programme progresses.
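To make the idea of a weighted factor set concrete, the following is a minimal, purely illustrative sketch in Python of how per-factor ratings might be combined into a single score for each dimension. The factor names, rating scale and weights below are invented for illustration and are not taken from Proving's actual toolkit.

```python
# Illustrative only: combining per-factor ratings (0-10 scale assumed)
# into one score per evaluation dimension using normalised weights.

def weighted_score(ratings: dict[str, float], weights: dict[str, float]) -> float:
    """Return the weight-normalised average of the rated factors."""
    total_weight = sum(weights.values())
    return sum(ratings[f] * weights[f] for f in weights) / total_weight

# Hypothetical factor sets for the two dimensions described above.
attractiveness = {"benefits to sector": 8.0, "transferable learning": 7.0}
achievability = {"delivery capability": 6.0, "stakeholder support": 9.0}

# Weights can be adjusted between evaluations while the factor set
# itself stays constant, mirroring the approach described in the blog.
weights_attr = {"benefits to sector": 0.6, "transferable learning": 0.4}
weights_achv = {"delivery capability": 0.5, "stakeholder support": 0.5}

print(weighted_score(attractiveness, weights_attr))  # 7.6
print(weighted_score(achievability, weights_achv))   # 7.5
```

Keeping the factors fixed and varying only the weights, as the programme does, means scores remain comparable across evaluation rounds while still reflecting shifting priorities.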

Each Live Lab is required to undertake six-monthly self-assessments between the Proving evaluations, using the same factor set and evaluation toolkit. This gives each Live Lab the opportunity to record its own view of project progress and performance. The self-assessments to date vary in depth and supporting commentary, although Proving has had few concerns about their overall accuracy. As might be expected, the Live Labs generally assessed themselves a little more positively than the Proving evaluations did.

Rather than extensive form-filling and the capture of metrics, the approach encourages participation and debate among the project management team and delivery partners, with the aim of reaching a consensus on performance and the actions necessary to improve. For more complex projects with multiple workstreams, the evaluation workshop may be one of the few occasions when the Live Lab project stakeholders get together and discuss the complete programme of work using an open, honest but structured process. They often learn about aspects of their project they are not actively involved in. This is reflected in feedback from Deborah Fox of the TfWM Network Resilience Live Lab: ‘It is always good to get everyone from the project around the table and share our progress and learnings to date’. Proving was pleased to see that, for all Live Labs, senior management actively encouraged staff, partners and providers to participate fully and share their thoughts and observations without fear of repercussions.

All levels of Live Labs programme governance and project delivery are focused on delivering programme success. This should be a given, but Proving has evaluated many programmes over the years where there is some distrust and the evaluation is regarded as an audit, aimed solely at finding problems rather than contributing to success. The Live Lab Commissioning Board encouraged this positive ethos. Proving provides evaluation reports to the board that focus on critical issues, project successes, points of interest and future areas of evaluation.

In keeping with the programme objectives, the evaluation and monitoring process was also treated as a source of learning, captured not only to improve current programme performance but also to assist future innovation projects and portfolios.

You can find the White Paper, A New Approach to Monitoring and Evaluation, here.
