
Paradox of Evaluation Counterfactuals

Why does the counterfactual evaluation principle seem to break down in the following case?

Suppose eleven organisations each bring enough vaccines to vaccinate 100,000 children, and there are a million children who need the vaccine now. There are enough vaccines to go round, but only just: there are about 10% more shots than needed. One of the organisations, Organisation X, wants to know whether its intervention had any impact. People on the ground are impressed: “You saved their lives,” they say. But Evelyn the Evaluator says no: we have learned to think in counterfactuals, the World Bank told us to. So Organisation X actually had no impact, because even if it hadn’t come, all the children would still have got vaccinated; the other ten organisations’ shots would, after all, have been just enough to go round.
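Here is a minimal sketch of Evelyn’s arithmetic in R, assuming (my own simplification) that each organisation delivers exactly its 100,000 doses and that doses can be freely reallocated among the million children:

```r
# Toy illustration of the counterfactual arithmetic (a sketch, not anyone's real model)
children      <- 1e6   # children needing the vaccine
doses_per_org <- 1e5   # doses brought by each organisation
n_orgs        <- 11

# Proportion of children vaccinated, given how many organisations turn up
coverage <- function(orgs_present) {
  min(orgs_present * doses_per_org, children) / children
}

coverage(n_orgs)                          # with Organisation X: 1 (everyone vaccinated)
coverage(n_orgs - 1)                      # without Organisation X: still 1
coverage(n_orgs) - coverage(n_orgs - 1)   # X's "counterfactual impact": 0

coverage(n_orgs) - coverage(0)            # yet taken together the organisations matter: 1
```

The same subtraction gives zero impact for each of the eleven organisations taken one at a time, even though removing all of them would leave every child unvaccinated; that is the puzzle.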

So the counterfactuals paradigm tells us one answer, whereas most evaluators would probably disagree to some extent and for a variety of reasons. I guess some would argue that we can’t be certain the other organisations would all have been so effective, or that, taken together, all eleven interventions really did have an impact.

But I just read Mohr (1999), who has a more profound objection to the counterfactual argument: he discusses the idea of physical causation, the kind of “hammer hits glass, glass smashes” evidence that we just don’t usually doubt. If we saw the vaccinations happening with our very eyes, why would we need any other kind of evidence that the intervention by Organisation X caused the children to be vaccinated? Aren’t the people who say “we saw you saving their lives” right in a way that we might miss if we have been on too many evaluation seminars? And isn’t the counterfactual argument nonsense when used in this way?

Mohr, L. B. (1999). The qualitative method of impact analysis. American Journal of Evaluation, 20(1), 69.
