
Advice for journalists researching scholarly studies and reports


Sherry Ricchiardi | April 26, 2017

Journalists often turn to scholarly studies and reports to add credibility and depth to their stories. Usually, the practice works well — unless the information is biased or flawed.

There are looming questions: How do media professionals distinguish good data from bad? How do we keep from being taken in by shoddy or self-serving research? What are the red flags?

Reporters may not be whizzes at deciphering scientific methodology and statistical analysis, but they do know how to follow a line of questioning to get closer to the truth.

Two postings on the Journalist’s Resource (JR) website help differentiate between a quality study and a dubious one. Both are the result of one reporter’s nagging concerns about the validity of data she published in her own stories.

“Even though I was cautious in selecting reports and studies, I often worried I had picked a bad piece of research — something that relied on a flawed analysis, for example,” said JR managing editor Denise-Marie Ordway. The project is part of the Carnegie-Knight Initiative on the Future of Journalism Education.

Journalists must give research the same critical review they give government budgets and legislative policies, said Ordway. Earlier this year, she posted "How to tell good research from bad: 13 questions to ask." Among the standards journalists should consider:

  • Is this research peer-reviewed? A study published in a peer-reviewed journal typically undergoes a detailed critique by a small number of qualified scholars. The peer-review process, while imperfect, is designed for quality control.

  • Is it published in a top-tier academic journal? Top journals are more likely to feature high-quality research. Their peer-review process tends to be more rigorous.

  • Who funded the research? Authors of studies published in academic journals are required to disclose funding sources. Studies funded by organizations such as the National Science Foundation tend to be trustworthy because the funding process itself is subject to an exhaustive peer review process.

  • What are the author’s credentials? Knowing where the authors work and how often they have been published can help assess their expertise in a field of study.

  • Do the authors have a conflict of interest? Be leery of research conducted by individuals or organizations that stand to gain from the findings.

  • Does the study rely on survey results? Survey results can be biased if respondents were not chosen by random selection. Beware of any survey that relies on respondents who self-select, as many internet-based surveys do.

Ordway also posted 10 things she wished she’d known about research earlier in her career, which included stints at The Philadelphia Inquirer and the Orlando Sentinel. She was a Nieman Fellow at Harvard University and serves on the National Education Writers Association advisory board.

She offers practical, user-friendly tips on how to select the best data to add context to a story or fact-check a claim. Included on Ordway’s list:

  • General Google searches are not the best way to find good research. A better source is Google Scholar, which searches for research published in peer-reviewed journals. Other good resources: PubMed, Microsoft Academic, PLOS and the National Bureau of Economic Research.

  • Researchers are generally accessible and like to talk about their work. “We have found that researchers respond more quickly to email than phone calls,” wrote Ordway. “They also may share free copies of their work or tell you how to access them for free.”

  • If reporters are confused by a data analysis and don’t have a strong background in statistics or research methods, they should reach out to someone who does. Many scholars are eager to help journalists describe their research findings correctly.

  • Don’t spend much time on the abstract. The best places to find information about key findings: 1) the “results” section, which typically is in the middle of a research article, and 2) the “discussion” or “conclusions” section, which usually is located at the end of the paper and offers a summary of findings.

In today’s lightning-fast media environment, knowing how to evaluate scientific studies serves another important purpose.

“In my opinion, high-quality research is one way journalists can fight misinformation and fake news. It is critical that we be able to spot a bad piece of research when we see it,” said Ordway. “Basically, I wish someone had given me these two tip sheets 15 years ago.”

Main image CC-licensed via Flickr by Leo Hidalgo.
