Why are people antiscience, and what can we do about it?

Aviva Philipp-Muller, Spike W. S. Lee, and Richard E. Petty

Basis 3: The Scientific Message Itself

People do not always think and behave in line with what science suggests. One reason is that they are unaware of the scientific evidence [i.e., the deficit model (48)]. Sometimes, when people simply learn about the scientific consensus, their thoughts and feelings follow suit [i.e., the gateway belief model (49)]. Other times, however, when scientific information contradicts people’s existing beliefs about what is factually true, they can reject even the strongest scientific evidence, because harboring conflicting cognitions is aversive. This phenomenon is known as cognitive dissonance (50), which arises when a person is exposed to information that conflicts with their existing beliefs, attitudes, or behaviors. Dissonance elicits discomfort. Given this aversive feeling, people are motivated to resolve the contradiction and eliminate the discomfort in a number of ways, such as rejecting the new information, trivializing the topic, rationalizing that there is no contradiction, or revising their existing thoughts (51).

Critically, people tend to resolve dissonance via the path of least resistance. To a person who has smoked their entire life, it is far easier to reject or trivialize scientific evidence about the health risks of smoking than to alter the ingrained habit. With dissonance, the intransigence of existing beliefs resembles the stickiness of existing behaviors: It is easier to reject a piece of scientific information than to revise an entire system of beliefs that one has accumulated and integrated into a worldview over the years, often reinforced by social consensus. One’s existing beliefs can be based on valid scientific information, previously accepted but now outdated scientific information, or scientific misinformation. As an example of dissonance arising from outdated scientific belief: for thousands of years, it was widely believed that Earth was the center of the universe and that the sun orbited Earth (52). To a person who had always believed the sun revolved around Earth, it was far easier to reject the notion of Copernican heliocentrism than to overhaul the geocentric model of the universe, which had long been accepted and felt subjectively coherent, and thus seemed in no obvious need of revision.

In addition to rejecting new information from scientific progress and updates, individuals might possess beliefs that contradict scientific evidence due to the spread of misinformation. The last few years have witnessed a proliferation of fake news (53), catalyzed by social media, which facilitates the rapid spread of information regardless of whether it is true. Sadly, fake news spreads “significantly farther, faster, deeper, and more broadly” than true news on social media platforms, because fake news stories often evoke stronger emotional reactions and come across as more novel than true ones, which are attributes that increase sharing behavior (54). Although some individuals might be sharing misinformation merely because of inattention to veracity (not because of endorsement of content) (55), extensive sharing of fake news among one’s ingroup makes it likely to be accepted, due to the dynamics of social identity outlined earlier, which can result in rapid acceptance of pseudoscientific or antiscientific beliefs.

Once misinformation has spread, it is difficult to correct (56): the misinformation often continues to exert influence even after it has been retracted. Corrections issued by media sources are typically ineffective at reducing belief in the misinformation; indeed, corrections can sometimes reinforce the belief by making it more salient (56). Unfortunately, misinformation on many scientific topics has been widely disseminated, such as exaggerated and unfounded risks of vaccines (including in pre-COVID times), denial of climate change, and dismissal of evidence for evolution (57).

Scientific misinformation is especially difficult to correct when it provides a causal explanation for a phenomenon. Correcting the misinformation would leave a gap in people’s mental model of why an event or a situation has occurred (58) and would cause discomfort (59). People often refill that gap with misinformation to make sense of the issue at hand. Circling back to the example of heliocentrism, telling a geocentricist that Earth is actually not the center of the universe would leave a gap in their mental model of why the sun clearly appears to be revolving around Earth, a gap that is easy to refill by reaffirming their existing causal belief. Similar cognitive dynamics have long been observed in pseudoscience (60) and continue to result in rejection of scientific information today.

Not only do people possess beliefs about whether things are true or false, they also evaluate things as desirable or undesirable (attitudes) (9), important or unimportant (values) (61), and right or wrong (morals) (62). Some moral views are at odds with particular kinds of scientific information, resulting in morally fueled rejection. For example, people who endorse the moral significance of naturalness and purity are prone to resisting scientific technologies and innovations seen as tampering with nature. Vaccines (63) and genetically modified food (64), despite their documented benefits, are often rejected due to perceptions that they are unnatural. This cluster of moral intuitions about naturalness and purity is highly related to individual differences in aversion to “playing God,” an aversion that predicts lower willingness to fund the NSF and less monetary donation to organizations supporting novel scientific procedures (65).

Attitudes rooted in one’s notions of right and wrong (e.g., not eating meat as a moral issue rather than as a taste preference) are particularly strong (66) and tend to be more extreme, persistent over time, resistant to change, and predictive of behavior (67). For example, people with moralized attitudes toward recycling are more resistant to counterattitudinal information regarding the efficacy of recycling (68). To resolve dissonance from conflicting information, rejecting the novel scientific information is often the path of lesser resistance than revising one’s existing moralized attitudes. Likewise, when misinformation is consistent with one’s existing attitudes, it is difficult to correct (69). To people who love driving high-horsepower but gas-guzzling vehicles, misinformation such as “climate change is a hoax” would be attitude consistent, whereas scientific correction of this misinformation would be attitude inconsistent and thus prone to rejection.
