Elevator interview with Penelope Jackson, Division Manager Quality Assurance, AfDB

Friday, 4 December, 2020
Evaluation Matters Edition: 
Q3 2020: Evaluation Week Special Edition

Why do you think it is important for development institutions to draw learning from monitoring and evaluations? We have been doing this for so long, surely we should know all there is to know by now?

Any successful organization has to learn continuously. Learning is not a one-time event. What there is to learn – like the real world – evolves. Most importantly, we must also apply the lessons to enact change. Monitoring provides us with real-time information to help adapt and, if needed, redirect our efforts. Evaluation looks deeper and asks not just how we are doing, but why we are doing it. That’s an important role.

Which institutional factors (e.g., corporate structure and hierarchies, ownership, client relationships, etc.) enable or inhibit learning from evaluations? How so?

Of course, many institutional factors come into play. Two factors you didn’t mention in your question are time and resources. Many staff are very interested in learning from evaluations, but carving out the time to do so is not always easy. However, barriers to learning are just as much about the evaluations themselves. Is the evaluation on a relevant topic about which we need new information? Is it delivered at the opportune time to inform a new direction or strategy? Is it credible – meaning is it of high enough quality, based on solid evidence, and offering practical recommendations that will help to address the issues found?

Does a culture of learning in an institution automatically imply a culture of evaluation?

Not necessarily. An evaluation culture is more specific than a learning culture, but it can also involve aspects other than learning. Evaluations have sometimes been much more focused on accountability than learning, even if on paper they seek to cover both. The current focus on ratings, even at sector, country, and theme level, can be a distraction from the learning objective, as discussion inevitably centers on the rating itself. For evaluation to really be about learning, it needs to focus on what has worked and what has not, including why not, and what we need to do differently.

How do you convince staff to see evaluation as a beneficial process that leads to improvement of the Bank’s work, rather than a necessary evil?

I believe it was Winston Churchill who said “I like learning but I don’t like being taught”. I think evaluators sometimes underestimate the extent to which the organization wants and values evaluation and broader learning. Staff are interested and want to be involved both in learning and in generating the knowledge. Yes, people may find it hard to fit into their busy schedules, but my experience is that if the evaluation is of high quality and well-timed, staff certainly do see it as beneficial. Often a good evaluation will ring true to those dealing with certain challenges day-to-day, and they appreciate that an external party has taken the time to examine, reflect, and offer possible solutions. It can also open doors to obtaining answers they know are needed but have not been able to get traction on themselves.

As institutional learning evolves, what are some of the trends or directions you see organizations such as the AfDB moving into, in terms of learning and knowledge management, and what role will evaluation play in these changes?

We are all digital champions now, right? Online learning, workshops, seminars, and the like are important, and each has pros and cons. For operations staff at the AfDB, we are investing in online training, but that has to be supplemented by the kind of discussion that elicits broader learning. And most importantly, the real learning happens not by talking about it, but by doing it. Benjamin Franklin is quoted as saying: “Tell me and I forget, teach me and I may remember, involve me and I learn”. It’s true. The best learning is by doing.

But evaluation can bring important evidence to the fore and pull out specific lessons from past experiences. Targeting those lessons on specific issues (avoiding highly generic lessons) and disseminating them to the right, well-targeted audience is key.


Author name: 
Penelope Jackson