Published 21st September 2018
By Hannah Ormston, Impact Officer, Royal Society of Edinburgh
The Royal Society of Edinburgh (RSE) is a multi-disciplinary educational charity based in Scotland. We draw on the expertise and experience of our 1,600 Fellows, distinguished individuals elected in recognition of their achievements in academia and beyond, to deliver public benefit initiatives. Our main activities include funding for research in Scotland and overseas; policy advice papers that provide independent evidence to inform public policy decisions; free public lectures and events; and international activities with our stakeholders abroad. This broad range of activities, made possible only by the scope of the Fellowship's expertise, is a unique strength of the RSE. In turn, assessing the impact and evaluating the wide reach of the organisation can be a challenging task.
Since early 2017, we have improved our internal evaluation practices to better understand the impact we make as an organisation, with the goal of embedding evaluation into everything we do. As the RSE's Impact Officer, my role has been to facilitate this change and to work with staff to develop new approaches to measuring and collecting impact evidence.
Many people who work in evaluation would agree that the 'go-to' tool for data collection is often SurveyMonkey. Before we started our impact journey, the words SurveyMonkey and evaluation were sometimes used interchangeably by staff, and although it's a great tool for gathering evidence, many other software packages are available. However, data collection doesn't always have to rely on fancy software. Over the past 12 months, we have trialled alternative methods and tools with success and learned to tailor the technique to the audience. This has included a range of tools, from Mentimeter, a free presentation tool that lets respondents submit answers from their mobile phones, to sticky walls with post-it notes and marker pens.
Evaluation terminology like outcomes, indicators and evidence can be daunting. It's unfair to expect staff to evaluate their activities without appropriate training, so we invested in training the team to ensure that everyone is on the same page and comes from the same starting point. When a new member of staff starts, they receive an impact induction so they know what is expected from the outset. Evaluation Support Scotland (ESS), an Inspiring Impact partner, has provided an array of support, from leading a session at our team Away Day last year to providing tailored support for our Policy Team. They also have an excellent array of resources on their website, which have helped to guide staff sessions and are a handy reference point.
Over the past 12+ months I have learned never to underestimate the power of peer learning. We established an internal Impact Champions Group that meets every six weeks to discuss ideas and share experiences in an informal setting. Working with this group of like-minded individuals who are committed to impact practice has been invaluable, and having a regular meeting in the diary has helped to keep our evaluation momentum going.
Knowing where to start, and deciding which activity is most important to evaluate first, can be a daunting prospect. As part of our annual appraisal process this year, every member of our 40+ team set themselves an impact objective. This has given the team focus and demonstrated our senior management team's commitment to evaluating the impact we make. Having buy-in from staff at all levels is vital when trying to build a culture of impact practice.