Table 2

Participant views on reasons for the prevalence of misinformation and possible strategies for change

Each entry below gives the reason for the problem, an example quotation, and the strategy for change.
Bad science—poor-quality or biased science research
Reason: Academics experience huge pressure to publish.
Quotation: ‘We have pressure to publish for publications and for grants and also in terms of promotions … I [think], not only governments but funding sources, they should not put that much pressure on how many papers we publish, but in the quality.’ (P2, early/mid-career researcher)
Strategy: Change the incentives for researchers—reward translation activities rather than just publication numbers.
Reason: Commercial influences.
Quotation: ‘If you have a study [that] is funded by industry, you’re going to get cited much more often and you’re also going to get cited much more quickly in systematic reviews. And so what happens, overall, is that systematic reviews now end up reflecting industry funded studies much more heavily … One of the things that I think is really useful for when we’re talking about primary research … is around synthesis ability … checklists are really useful for that. Just to say, “These are the things that you need to think about when you’re designing and then reporting your study that are going to make it easier for us to do our jobs in evidence synthesis”.’ (P3, experienced researcher)
Strategy: Increase public funding of science to minimise commercial influences; train researchers in synthesis ability.
Reason: Questionable research practices, for example, recruiting until results are statistically significant then stopping.
Quotation: ‘You see a lot of people doing very poor statistics … [I’ve seen] multiple papers in [high impact factor journals] that are terrible. It’s not just the poor journals … I think in the future we’re going to be more heavily scrutinised … We could look at your p-values and if your p-values are always just below .05 you might need to explain yourself.’ (P4, experienced researcher)
Strategy: Increase oversight of research quality in academic institutions and publishing houses.
Inadequate access—lack of access to research that is free, timely, understandable and trustworthy
Reason: Publication bias: for example, only publishing results that are favourable to funders/political leaders.
Quotation: ‘Share a league table of people who publish their protocol, whose protocol matched the actual analysis … to reward the good behaviour, to get the big institutions and the big funders in government to prioritise that and put that on a pedestal. Maybe that could be part of the block funding, how well you’re doing that. And then straight away, that dramatically changes the incentive, and then everybody has to follow suit on that.’ (P4, experienced researcher)
Strategy: Encourage protocol registration.
Reason: Impenetrable language and concepts, and the loss of specialist science journalists who can explain and critically evaluate scientific studies.
Quotation: ‘There needs to be a middle ground between the press release which is, “This is going to cure cancer” and the scientific paper, which is impregnable … Maybe just like a one-pager that’s like, “This is what we’re seeking to find out. This is a preprint. This is how many people. These are the shortcomings … and this is where this research fits into the arc of research” … It’d certainly be good for health journalists. Because not every outlet has dedicated health reporters anymore anyway.’ (P1, science communicator)
Strategy: Reward plain language publications, including simplified versions published in tandem with full studies.
Reason: The peer review system is inefficient and lacks transparency.
Quotation: ‘A colleague told me a ridiculous story where she was sent a paper … for rapid peer review for COVID, and she reviewed it … within a couple of days and said, “This is terrible,” and … they’d actually published it by then.’ (P4, experienced researcher)
Strategy: Open peer review, with academic reward.
Reason: Academic publication paywalls.
Quotation: ‘We’re a really unusual journal in that we’re completely open access. But … it’s certainly not a viable business model. I don’t know how much longer we can do it for.’ (P5, science communicator)
Strategy: Open access facilitated by governments, funders and institutions.
Low information and science literacy—people do not read high-quality science information
Reason: People use unreliable and algorithm-driven sources for science news.
Quotation: ‘[Don’t] read all your news on Facebook, you have to read something else! … [Because] social media … are so good with these algorithms, you’re only going to see what’s going to reinforce what you already think.’ (P6, early/mid-career researcher)
Strategy: Educate the public about where to find and how to evaluate good science.
Reason: People attend less to the trustworthiness of news than to its visual or narrative appeal.
Quotation: ‘I think there’s this disconnect between what goes on in the health research world and the findings and then what all the, especially younger people are looking at on the internet … I look at people when they’re looking at their mobile phones and they just flick through so quickly. So the amount of time you have to get someone’s attention is so miniscule now … It’s almost incidental that you’re impressing upon them that this is a trusted source of information because I think a lot of people don’t even ask that question.’ (P7, science communicator)
Strategy: Train scientists to use engaging communication tools, such as visuals and narrative.
Reason: People expect certainty, precision and immediate answers from science.
Quotation: ‘Science is about embracing uncertainty, actually. That’s where it is strongest, I suppose. That’s what it is. Whereas I think the popular imagination is scientists can deliver certainty. Scientists know things. And I think it’s partly the way it’s taught in schools. Mathematics and science, you know? There is a correct answer.’ (P8, science communicator)
Strategy: Educate the public about the scientific process.