When we work with our government partners to encourage evidence generation and use, we draw on our experiences from success cases we know well. Uganda is a particularly important case: 3ie supported evaluations of several flagship government programmes there and contributed to capacity development. 3ie's partnership with Uganda's Office of the Prime Minister (OPM) offers an example of how we can work together to build an evidence culture within government. Much of this work fell under 3ie's Uganda Evidence Programme, which ran from 2015 to 2019 in partnership with OPM, with funding from DFID.

As a recent blog argues, several governments in Africa are working hard on building evidence ecosystems, and there is a lot we can learn from each of them. With the programme drawing to a close, we take the opportunity to reflect on some of our key takeaways from Uganda.

Institutional design within the government matters

The placement of the monitoring and evaluation (M&E) office within the Ugandan OPM helped the programme run smoothly. As the apex body within the government, the OPM has the authority and legitimacy to enforce the M&E policy. Experiences in other countries have shown that the specific location of the national M&E unit within the government matters for ensuring accountability and enforcement, and Uganda is no exception.
Networks within the government, such as the evaluation sub-committee (which involves key government institutions, civil society, academia and donors) and the national and ministry M&E working groups, are important platforms for relaying and discussing evidence. Evaluations feed into the discussions at these network meetings, an important first step in making evidence visible.

Start with the evaluations there is already an appetite for

Securing buy-in for evaluation work can be a time-consuming undertaking in itself, as 3ie's experiences elsewhere have shown. In this case, buy-in came from starting with evaluations that the OPM wanted to carry out: process evaluations of flagship programmes.

Importantly, the process evaluations found resonance within the ministries because they provided concrete recommendations for improving programme performance through implementation tweaks.

For example, the process evaluation of the Ugandan government's Youth Livelihood Programme recently informed the cabinet's decision to make several key changes to programme design: limiting the size of groups and increasing the budget for institutional support. Separately, the Ministry of Health revised its training manual on family planning services, drawing on evidence from the process evaluation of the National Family Planning Programme.

These examples illustrate the political reality that it is easier to modify programme design than to ask for radical policy change, such as the scaling down of a programme. But that reality also points to a limitation of demand-driven evaluations with the government in the driver's seat: they may avoid topics that are inconvenient for the government.

Individual evidence champions matter

A conducive institutional set-up will make no difference without motivated and committed individuals, as most standout examples of evidence use in decision-making show. In Uganda, we found such champions at different levels of the government. The OPM, for instance, took the lead in using the recommendations of the public sector assessment to push for a cabinet stipulation that put on hold the creation of new government agencies and authorities. In the Ministry of Gender, Labour and Social Development, the permanent secretary's interest in evidence rubbed off on ministry staff. Despite the political sensitivity surrounding the Youth Livelihood Programme, the ministry team persisted for months to get the cabinet to approve programme design changes.

Capacity development builds the appetite for evidence

The fact that Ugandan researchers led the 3ie-supported impact evaluations in this programme helped sustain engagement with government ministries. The evaluators made themselves available for numerous meetings with government officials and carried out workshops with government staff. Ministry staff members told 3ie that the programme spurred them to think more about commissioning evaluations and seeking out capacity development opportunities.

Nonetheless, challenges remain in the evaluation capacity of government ministries. An assessment carried out by 3ie highlighted some persistent issues: limited staff capacity, limited budgets, and poor incentives for implementing evaluation recommendations. Furthermore, donor-driven M&E was not integrated with ministry M&E systems, and within the government the focus was on monitoring rather than evaluation. Finally, the technical language of evaluations and the impracticality of recommendations were impediments to their implementation.

This assessment adds weight to the case for a long-term view of capacity development. Rather than one-off workshops, investment in sustained, hands-on, learning-by-doing models can reap benefits.

Partnerships are crucial for sustaining the evidence revolution

Although this specific programme has concluded, 3ie's work in Uganda certainly has not. 3ie and OPM continue to work together to advocate for evidence-informed decision-making. This partnership is the kind of relationship we believe can keep the evidence revolution going.

Francis Rathinam was the lead for the Uganda Evidence Programme, and Radhika Menon contributed to this blog while working with 3ie.
