Tuesday, 5 October 2010

Self-Evaluating at Organisation Level

There are many reasons for undertaking a self-evaluation and many different ways to do it. Let's start with an evaluation undertaken by the organisation with a view to improving it, making it more effective or more efficient.

Evaluating or reviewing a process would usually call for some investigation at different levels within an organisation. How well does the process involved meet strategic objectives? Is it catered for in written strategy and thus adequately resourced, or does it meet a more localised need, being resourced internally by a team or department? Even so, does it then feature in that department's strategy and do senior management accept it as a necessary process so that the department receives adequate funding to include it?

Who are the end-users and participants in the process? How is the process governed, resourced, managed, delivered and monitored? Do the people within the process understand why it exists and how it meets strategic drivers? Do they have access to written guidelines and a route for giving feedback, raising concerns and suggesting improvements? And are these things adequately monitored and acted upon?

A while ago I managed a project called Embedding Business & Community Engagement Through Business Process Improvement and Internal Engagement. The Embedding BCE project was concerned with only one (albeit wide-ranging) type of activity within Further and Higher Education (FE and HE) institutions. Yet the process of managing and delivering services under the umbrella of BCE meant that we had to engage with senior management; central co-ordinating units; core business process delivery teams such as HR, Finance, IT, Libraries, Estates/Facilities, Information Systems and Marketing; and practitioners from academic departments, research institutes and business units all over the institution.

No one person could hope to know everything necessary to conduct a self-evaluation on their own.

The interview process in the Embedding BCE project gathered perceptions. The views put forward did not necessarily reflect the truth of the situation: we received conflicting views and assertions from different people within the same process. The point, though, is that each interviewee believed it was the truth, or more properly the true situation, that they were giving us.

So a self-evaluation should aim to bring these different viewpoints together. After our interview process we staged a half-day workshop using a workbook that is downloadable from the link at the end of this entry.

The workbook contained around 25 questions and the workshop took around five hours. The intention was to stimulate discussion by allowing these different perceptions to surface and be challenged by a small group of 10-12 people representing all levels, from the Senior Management Team (SMT) to practitioners, and drawn from a range of teams and departments.

It identified quick wins: things one department was doing really well that could 'easily' be communicated and replicated across the organisation. Although the workbook asked for scores against each question, this was again aimed at showing that some departments would score differently from others. Improved internal communications would benefit just about every organisation of 5+ staff!

In several cases the workshop discussions involved raised voices at some stage. But the managed conflict sparked ideas and suggestions that were quickly turned into a list of potential improvements or developments. It identified both strengths and weaknesses. It stimulated and inspired many participants. It bored a few.

It made me wary of the results of surveys where a single person had completed the questions, or where several people had completed different sections from their own perspective without any interaction. And it made me think that any system that attempts to score an organisation as a whole will both short-change good practice and paper over cracks at the same time. Even with a scoring scale of only 1-4 we had arguments over whether a question should be scored 2.3 or 2.6, as the department that would have scored a 3 were reluctant to accept a score of 2! In most cases where these surveys are made public, there is no option to split scores.
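The effect of a single organisation-wide score can be sketched with a toy example (hypothetical departments and numbers, not data from the project): two departments whose honest scores differ by a whole point collapse into one middling figure that reflects neither.

```python
# Hypothetical per-department scores on a 1-4 scale for one workbook question.
# (Illustrative numbers only -- not actual Embedding BCE data.)
scores = {"Library": 3, "Finance": 2}

# A single organisation-wide score averages the difference away...
aggregate = sum(scores.values()) / len(scores)
print(f"Aggregate score: {aggregate}")  # 2.5 -- true of neither department

# ...whereas split scores preserve both the good practice and the gap.
for dept, score in scores.items():
    print(f"{dept}: {score}")
```

The aggregate of 2.5 short-changes the department doing well and papers over the crack in the one that isn't; the split scores keep both visible.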

Self-evaluating and then acting on the findings are essential to ensure widespread uptake of good practice and that external evaluations find consistency.

The methodology and findings of the Embedding BCE project are published in the Embedding BCE infoKit on the JISC infoNet website. I can be contacted through JISC infoNet to help facilitate reviews or self-evaluation workshops of BCE in UK Further and Higher Education institutions if required.
