Why are people antiscience, and what can we do about it?

Aviva Philipp-Muller, Spike W. S. Lee, and Richard E. Petty


Targeting Basis 3: Increasing Acceptance of Scientific Information Even When It Contradicts One’s Beliefs and Attitudes

To tackle rejection of scientific information that contradicts an audience’s beliefs, prevention is better than cure: Whenever possible, minimize the formation of ill-informed beliefs in the first place. One preventive strategy is to train people in scientific reasoning (i.e., the ability to evaluate the quality of scientific information). People equipped with scientific reasoning skills are more likely to accept high-quality scientific evidence (84). This strategy is especially apt for combating the rise of fake news, which is another major problem that requires societal-level changes in digital infrastructure (125). Arming media consumers with the skills to differentiate between true and false scientific information leads them to become more discerning about which beliefs to adopt (125). Critically, this strategy depends on conveying the correct scientific information before any misinformation is adopted. An additional caveat is that, although encouraging critical reasoning decreases belief in scientific misinformation, simply telling people that they should trust science more can actually increase belief in and dissemination of misinformation framed as being scientific (compared with misinformation not framed as being scientific) (126).

Related to the broader notion of training in scientific reasoning, a specific strategy is called prebunking. Derived from the logic of disease inoculation (127), it involves forewarning people that they will be receiving misinformation, then giving them a small dose of misinformation (the “vaccine”) and refuting it, so that they will be better able to resist misinformation when they encounter it in the wild (the “disease”). A field experiment among older adults found this strategy effective for minimizing the impact of disinformation on people’s intention to receive a COVID-19 vaccine (128).

Another preventive strategy, which sounds intuitive but turns out to be ineffective for enhancing acceptance of scientific information, is increasing a population’s general scientific literacy. Unlike specialized scientific knowledge, general scientific literacy does not involve a deep dive into why a scientific phenomenon occurs (89). Unlike scientific reasoning skills, general scientific literacy does not teach people how to parse scientific information (84). Instead, it merely entails imparting an unelaborated list of scientific information (89). Why is it ineffective for enhancing acceptance of scientific information? Because people with more scientific literacy are simply more sophisticated at bolstering their existing beliefs by cherry-picking ideas and information to defend their worldview (84). Higher levels of scientific literacy, instead of leading people to coalesce around scientific truths, can increase polarization of beliefs (84). Similarly, greater cognitive sophistication (e.g., stronger analytic thinking) does not necessarily reduce antiscience views, as the most cognitively sophisticated and educated people can also be the most polarized (129), although the evidence for and interpretation of this pattern have been subject to debate (130).

When preventive strategies are infeasible, curative ones are necessary. Simply learning information is often uncorrelated with attitude change (48, 131). What matters more than whether people learn or remember the information they have been told is how they react to that information. If people have positive reactions to a message, they are more likely to change their attitudes to be in line with that message (132). By implication, merely informing the public of scientific information is insufficient; one must also persuade them. Strong, well-reasoned, and well-substantiated arguments, delivered by skilled science communicators, have been found effective for altering even entrenched attitudes, such as attitudes toward climate change (133) and the safety of electronic health records (134).

But for the particularly intransigent, additional strategies should be used to supplement persuasive arguments. As noted earlier, a fundamental mechanism that leads people to reject scientific information contradicting their beliefs is cognitive dissonance. This aversive state has been found to be reduced by a procedure called self-affirmation, which involves prompting people to conjure and affirm values that matter to them (e.g., caring for one’s family) in ways unrelated to the cognitive conflict at hand (135). Why does self-affirmation reduce dissonance? Because it increases one’s sense of self-integrity and security, which lessens the threat that dissonance poses to the self. Self-affirmation interventions have been used successfully to reduce defensiveness and increase acceptance of scientific information regarding health behaviors (136) and climate change (137).

Sometimes, scientific messages conflict not only with a person’s beliefs and attitudes but also with their particular moral concerns. To manage this, an effective strategy is to identify the specific morals the recipient endorses and reframe the scientific message to accord with them. Conservatives, who endorse the moral foundation of ingroup loyalty, are more persuaded by messages about climate change framed as a matter of loyalty to one’s country. Liberals, who endorse the moral foundation of intentional care, are more persuaded by messages about climate change framed as a matter of care for innocent creatures (138). Moral reframing has also been found effective for minimizing morally based opposition to vaccines and stem cell technology (138). Similarly, for recipients who think about public health in more (vs. less) moral terms, messages that use moral arguments, such as framing physical distancing during the COVID-19 pandemic as benefiting others (vs. oneself), are more persuasive (139).

To increase acceptance of scientific evidence among those who have strong moral intuitions about naturalness/purity, science communicators can reframe scientific innovations as consonant with nature. For example, increasing the perceived naturalness of geoengineering has been found to increase people’s acceptance of it as a strategy to combat climate change (140). Overall, these findings suggest that science communicators can create multiple moral frames when communicating scientific information to distinct audiences (e.g., liberals vs. conservatives, religious vs. nonreligious) who are likely to have different moral intuitions or views.

