Much like journalists, development experts and organizations often stress the need to measure and evaluate the impact of their work. But while journalists typically work with analytics and engagement metrics, development organizations evaluate their impact through dense, lengthy research papers and studies.
Knowing this, how can journalists best report on development organizations' impact evaluations? And how can they use data to better evaluate development projects' impact, and find out whether those projects are actually working in their home countries?
A recent impactAFRICA webinar, hosted by the International Center for Journalists (ICFJ) and Code for Africa, in partnership with the World Bank’s Strategic Impact Evaluation Fund (SIEF), offered some solutions. These tips aren’t just useful for development reporters — any journalist who covers current research and studies can glean lessons from the webinar.
Here are IJNet’s top takeaways:
Identify red flags
Before reporting on any study, journalists should know how to identify warning signs that could skew a story and misrepresent the reality of a program’s outcome, explained Dave Evans, senior economist in the Chief Economist's Office for the Africa Region of the World Bank.
Studies that lack a comparison between the program's outcomes and what would have happened without the program (the counterfactual) are likely relying on a simple before/after comparison, and are therefore less reliable, he said. Studies with small sample sizes are also problematic: their results are noisier and may reflect only a small, unrepresentative portion of the development effort.
For example, soon after Liberia was declared Ebola-free last year, the White House released a press statement “claiming a great deal of credit for this,” Evans said. However, the statement failed to address why the White House believed its program in particular was the one to eradicate Ebola.
“It wasn't at all clear what the causal link was, what their counterfactual (what would have happened without the program) was or why they believed their program was what had led to this difference,” Evans said.
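Evans' point is easy to demonstrate with a toy simulation. The sketch below (Python, with entirely invented numbers) shows how a before/after comparison absorbs a background trend that has nothing to do with the program, while comparing against a control group, a stand-in for the counterfactual, recovers something close to the true effect. Shrinking the sample size also shows why small studies are a red flag: the estimates get noisy.

```python
# A minimal, illustrative sketch (all numbers invented) of why a
# before/after comparison can mislead when there is no counterfactual.
import numpy as np

rng = np.random.default_rng(0)

n = 500       # participants per group; try n = 20 to see how noisy small samples get
trend = 5.0   # everyone improves by ~5 points over the year, program or not
effect = 2.0  # the program's true added effect

baseline = rng.normal(50, 10, size=2 * n)
treated, control = baseline[:n], baseline[n:]

# Outcomes one year later: both groups ride the same underlying trend,
# but only the treated group also gets the program effect.
treated_after = treated + trend + effect + rng.normal(0, 5, size=n)
control_after = control + trend + rng.normal(0, 5, size=n)

# Red flag: a before/after comparison attributes the whole trend to the program.
before_after = treated_after.mean() - treated.mean()

# Better: subtract the control group's change (the counterfactual),
# a simple difference-in-differences.
did = before_after - (control_after.mean() - control.mean())

print(f"before/after estimate:     {before_after:.1f}  (inflated by the background trend)")
print(f"difference-in-differences: {did:.1f}  (close to the true effect of {effect})")
```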
Understand correlation vs. causation
Journalists must be able to differentiate between correlation and causation when reporting on impact evaluation data.
A correlation is a simple statement of the relationship between two variables, such as noting that women earn lower salaries than men. A causal claim goes further and names the cause of that relationship: women earn lower salaries than men because they face discrimination in the workplace. While both kinds of statements have their uses, Evans stressed the need for solid evidence before making a causal claim.
“It's important that we don't make a causal claim, that we don't say ‘The reason for this correlation is because of a certain action by an individual or a group of individuals,’ unless we have evidence — which is where the impact evaluations come back in,” he said.
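A small simulation makes the trap concrete. In the hypothetical sketch below (invented data), a third "confounding" variable drives both x and y, so the two correlate strongly even though neither causes the other.

```python
# A minimal sketch (invented data) of how a lurking third variable can
# produce a strong correlation with no causal link between x and y.
import numpy as np

rng = np.random.default_rng(1)
n = 1000

confounder = rng.normal(0, 1, size=n)        # some unobserved common factor
x = confounder + rng.normal(0, 0.5, size=n)  # driven by the confounder
y = confounder + rng.normal(0, 0.5, size=n)  # also driven by the confounder

# x and y correlate strongly even though neither causes the other.
print(f"correlation(x, y) = {np.corrcoef(x, y)[0, 1]:.2f}")

# Hold the confounder fixed and the relationship vanishes. (Because the
# confounder enters both variables with coefficient 1 in this toy setup,
# subtracting it is equivalent to regressing it out.)
x_resid = x - confounder
y_resid = y - confounder
print(f"after controlling for the confounder: {np.corrcoef(x_resid, y_resid)[0, 1]:.2f}")
```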
Tell individual stories that reflect overall change
“A lot of the best reporting that I read tells the stories of individuals who benefit from programs or who don't,” Evans said.
While it might be tempting to tell the most exceptional story from an impact evaluation study, doing so can misrepresent the study's results. For example, a story about one individual who thrived in a program that produced only lukewarm results for most other aid recipients won't fairly represent the program's efficacy.
“If we really want to highlight the story of someone who has benefited, we may want to counterbalance that with the story of someone who reflects more of the average effect of the program,” Evans said.
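One way to see why the exceptional story misleads: in the invented, skewed distribution of outcomes sketched below, the single best outcome is many times the average effect the evaluation would actually report.

```python
# A minimal sketch (invented numbers): in a skewed distribution of program
# effects, the most exceptional beneficiary says little about the average.
import numpy as np

rng = np.random.default_rng(2)

# Simulated income gains (in dollars) for 1,000 hypothetical participants:
# most see modest gains, a handful see very large ones.
gains = rng.lognormal(mean=3.0, sigma=1.0, size=1000)

print(f"average gain: ${gains.mean():,.0f}")      # what the evaluation reports
print(f"median gain:  ${np.median(gains):,.0f}")  # the 'typical' participant
print(f"best outcome: ${gains.max():,.0f}")       # the tempting but unrepresentative story
```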
Reach out to study authors
Lastly, Evans said journalists reporting on impact evaluation should always make an effort to contact a study's authors. Doing so gives the authors a chance to clarify details of their impact evaluation and helps ensure you don't misrepresent their results. Evans said authors' contact information is usually easy to find, as it's often displayed on their websites.
“Obviously, you're an independent journalist. You're not beholden to report exactly what they say,” he said. “But their input can be extremely valuable and can make sure that you get the reporting on the research right.”
Watch the full webinar below:
Image CC-licensed via Flickr by USAID (U.S. Agency for International Development).