Critically Evaluating Claims

- Author: Megha Satyanarayana
- Full Title: Critically Evaluating Claims
- Type: #snippet✂️
- Document Tags: #sci_comms
- URL: https://www.theopennotebook.com/2022/01/25/critically-evaluating-claims/
Highlights
- To cut through the murkiness and hype, science journalists need to vet the information and sources they come across and be on the lookout for red flags. Also essential is understanding our own biases—what we wish to be true, and how that plays into our decision making. Here, both skepticism and self-awareness can be key.
- Press releases are as much about getting attention for the institution, the company, or the researcher as they are about the research, says Janet Stemwedel, a philosophy professor at San José State University who has written on the topic of evaluating claims in research. A lot of nonprofits and institutions use press releases and media they have developed in-house to raise funds, so the tone and language will almost always be optimistic and positive. They may even oversell the value of the research.
- For example, journalists love reporting on foods like chocolate, says Alice Lichtenstein, a nutrition professor at Tufts University who helps dispel myths about food. But nutrition reporting is often full of holes, and journalists often bring too little skepticism to their coverage of food research. It’s hard to pin down why, but she thinks it has something to do with the idea that because we all eat, we all think we are experts in doing it.
- Another common reporting mistake can occur when journalists don’t fully understand the nature of the study they are reporting on. A lot of climate change studies, for instance, are based on models of different events, with conclusions drawn from those models. Not understanding how the models work, and what their shortcomings are, can lead to overselling or underselling the research. And in biomedical or clinical research, it’s important to draw distinctions between interventional studies, where researchers change people’s behaviors or treatments and look for effects, and observational studies, where they just look for patterns in what people are already doing. “With an intervention study, it’s essentially cause and effect, where with an observational study, you just look at associations,” Lichtenstein says.
- And of course, it’s important to navigate conflicts of interest: ask sources who funds the research. This is especially relevant to climate change and environment reporting, where organizations with a stake in the outcome of climate change mitigation can be prone to hyperbole.