Conditional cash transfers increase school enrolments and use of health facilities. Community-level water supply does not have health benefits. There is emerging evidence that community-driven development programmes do not increase social cohesion.

These statements can be made with confidence based on the considerable body of evidence from impact evaluations undertaken to answer the question of what works in development. 3ie is adding to this body of evidence as more completed studies become available.

Knowing which interventions don’t work can save scarce development resources. And knowing which work best, and most cost-effectively, can ensure these resources are better spent. But impact evaluations can do a lot more: they can help inform programme design. They do this in two ways: (i) through multiple treatment arm designs, and (ii) through theory-based evaluation designs. (I will discuss theory-based designs in a subsequent blog.)

Studies with multiple treatment arms examine the impact of different programme designs. One treatment arm gets one design, say supplementary feeding to tackle child malnutrition; a second arm gets another, say nutritional counselling. Having two treatment arms lets us compare which of the two treatments is more effective. If we also have a no-treatment control arm, we can measure the absolute impact and cost-effectiveness of each treatment.
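
To make this concrete, here is a minimal sketch in Python of how such a three-arm trial could be analysed. All sample sizes and effect sizes are hypothetical, invented purely for illustration: the point is that regressing the outcome on treatment arm dummies, with the control group as the omitted category, gives each arm’s absolute impact and lets us compare the two treatments directly.

```python
# Minimal sketch of a three-arm trial analysis (hypothetical numbers).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 600  # 200 children per arm (assumed)
arm = np.repeat(["control", "feeding", "counselling"], n // 3)

# Assumed impacts on weight-for-age z-scores: +0.25 SD for supplementary
# feeding, +0.15 SD for nutritional counselling. These are made up.
effect = {"control": 0.0, "feeding": 0.25, "counselling": 0.15}
z_score = rng.normal(0, 1, n) + np.array([effect[a] for a in arm])

df = pd.DataFrame({"arm": arm, "z_score": z_score})
# Control is the omitted category, so each coefficient is that arm's
# impact relative to no treatment.
model = smf.ols("z_score ~ C(arm, Treatment('control'))", data=df).fit()
print(model.summary())
```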

Multiple treatment arms address intervention design questions of interest to policymakers. Do conditions make a difference for conditional cash transfers, and for which outcomes? (Some evidence related to education says yes.) Does it matter when and how often the transfer is paid? (There is evidence that a large transfer just before school fees are due has a larger impact on enrolments.) Does it matter who receives the transfer? (There is substantial evidence that women are more likely than men to use income for their children’s welfare.) What sort of administrative arrangements work best? (Bureaucratic procedures, including ‘entering offices’, can deter ordinary people.) Should payment be in cash or in kind?

Take the example of computer-assisted learning, which has been shown to have a substantial, cost-effective impact on learning outcomes, notably at the basic (primary or elementary) level. But which design is most cost-effective? How many computers are needed for a class of 30 children? It is plausible that the learning effects are greater with two children per computer than with one. The learning effects may be lower with three children per computer, but the cost-effectiveness may still be higher.

Multiple treatment arm studies can test the cost-effectiveness of different student-to-computer ratios, as the sketch below illustrates. And what sort of technical support do teachers need? Is it sufficient that they know the basics of how to operate the computer, if that, or do they need intensive training to understand the learning objectives of the software?
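
To illustrate the arithmetic, here is a back-of-the-envelope sketch. Every figure in it (the computer cost, class size and learning gains per ratio) is an assumption made up for illustration, not a study result. It shows how the cost per standard deviation of learning gained can keep falling even after the learning gain itself starts to fall.

```python
# Hypothetical cost-effectiveness of student-to-computer ratios.
cost_per_computer = 300.0  # assumed cost over the programme's life (USD)
class_size = 30

# Assumed learning gains (in SD of test scores) for each ratio, following
# the pattern described above: two per computer beats one; three is lower.
gain_by_ratio = {1: 0.20, 2: 0.25, 3: 0.22}

for ratio, gain in gain_by_ratio.items():
    computers = class_size / ratio
    cost_per_pupil = computers * cost_per_computer / class_size
    print(f"{ratio} per computer: {gain:.2f} SD gain, "
          f"${cost_per_pupil:.0f}/pupil, "
          f"${cost_per_pupil / gain:.0f} per SD of learning")
```

On these invented numbers, three children per computer is the most cost-effective option ($455 per standard deviation of learning) even though two per computer produces the largest gain.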

Well-designed impact evaluations won’t just tell us whether computer-assisted learning programmes work (we know they do, provided they come with appropriate software and the school infrastructure is sufficient to support them), but how to make them work better.

It is often argued that there are complementarities between development interventions: for example, that extension services are only effective when combined with input subsidies. A special case of multiple treatment arms is the factorial design, a powerful design that is particularly under-used.

Factorial designs explore the impact of interventions A, B and C = A + B (the two combined), preferably with a ‘no treatment’ comparison group, though there may be practical, political or ethical objections to that. For example, A could be improved water supply, B hygiene education, and C the two together. Or A is microcredit, B business support services, and C the two combined.
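
In regression terms, the complementarity question is whether the combined effect of A and B exceeds the sum of their separate effects, which is exactly what the interaction coefficient in a 2x2 factorial analysis measures. Here is a minimal sketch, again with invented effect sizes:

```python
# Minimal sketch of a 2x2 factorial analysis (hypothetical effects).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2000  # 500 households per cell (assumed)
A = np.tile([0, 0, 1, 1], n // 4)  # e.g. improved water supply
B = np.tile([0, 1, 0, 1], n // 4)  # e.g. hygiene education

# Assumed: A alone adds 0.1 SD, B alone 0.1 SD, plus an extra 0.2 SD
# only when the two are combined (the complementarity).
y = 0.1 * A + 0.1 * B + 0.2 * A * B + rng.normal(0, 1, n)

df = pd.DataFrame({"y": y, "A": A, "B": B})
model = smf.ols("y ~ A + B + A:B", data=df).fit()
print(model.params)  # 'A:B' is the complementarity (interaction) effect
```

Note that even with 500 households per cell, an interaction of this size is only borderline detectable, which leads directly to the power problem discussed next.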

Factorial designs can test this, but only if the study has sufficient statistical power. Because of the large number of possible combinations, statistical power often becomes a constraint: each combination requires a group of participants large enough to detect a statistically significant response. So researchers need to build the analysis of complementarities into the design from the start.
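
To see why power bites, here is a minimal sketch of a standard power calculation using statsmodels: the number of participants needed in each cell to detect a given effect in a two-group comparison at the conventional 5 per cent significance level and 80 per cent power.

```python
# Sample size needed per cell for a simple two-group comparison.
from statsmodels.stats.power import TTestIndPower

power = TTestIndPower()
for effect_size in (0.1, 0.2, 0.3):  # standardised effect sizes (assumed)
    n_per_cell = power.solve_power(effect_size=effect_size,
                                   alpha=0.05, power=0.8)
    print(f"effect of {effect_size:.1f} SD: ~{n_per_cell:.0f} per cell")
```

A 2x2 factorial has four cells, so the total sample is four times the per-cell figure, and interaction effects, which are usually smaller than main effects and are estimated less precisely, need larger samples still.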

Multiple treatment arm studies apply experimental or quasi-experimental designs to make a counterfactual assessment of the (cost-)effectiveness of variations in intervention design, and so inform better designs. The design variations being tested should be those of interest to policymakers: ones they will implement if they are proven to be effective.

Further Reading

On the effectiveness of CCTs: Conditional cash transfers and health: unpacking the causal chain, Marie M. Gaarder, Amanda Glassman and Jessica E. Todd, Journal of Development Effectiveness, vol. 2, issue 1, 2010

On the ineffectiveness of community-level water supply: Water, sanitation and hygiene interventions to combat childhood diarrhoea in developing countries, Hugh Waddington, Birte Snilstveit, Howard White and Lorna Fewtrell

On the ineffectiveness of community-driven development:
Interventions to promote social cohesion in Sub-Saharan Africa, Elisabeth King, Cyrus Samii and Birte Snilstveit

The GoBifo Project Evaluation Report: Assessing the Impacts of Community Driven Development in Sierra Leone, Katherine Casey, Rachel Glennerster, Edward Miguel

Effects of a Community Driven Reconstruction Program in Eastern Democratic Republic of Congo, Macartan Humphreys, Raul Sanchez de la Sierra, Peter van der Windt

For an example of research on computer-assisted learning:

Remedying Education: Evidence from Two Randomized Experiments in India, Abhijit Banerjee, Shawn Cole, Esther Duflo and Leigh Linden, Quarterly Journal of Economics, vol. 122, issue 3, 2007

On impact evaluation design

Theory-based impact evaluation: principles and practice, Howard White, 3ie Working Paper 3

An introduction to the use of randomized control trials to evaluate development interventions, Howard White, 3ie Working Paper 9

Achieving high-quality impact evaluation design through mixed methods: the case of infrastructure, Howard White, Journal of Development Effectiveness, vol. 3, issue 1
