Why are people antiscience, and what can we do about it?

Aviva Philipp-Muller, Spike W. S. Lee, and Richard E. Petty

Featured Article

Basis 4: Mismatch between the Delivery of the Scientific Message and the Recipient's Epistemic Style

Even when scientific information does not conflict with an individual’s beliefs or attitudes, it can still be rejected for reasons beyond the content of the message. In particular, when scientific information is delivered in ways that are at odds with a person’s style of thinking about the topic at hand or their general approach to information processing, it is less likely to be processed and more likely to be rejected (70).

For example, when people construe an issue in abstract/high-level (vs. concrete/low-level) terms, concrete (vs. abstract) scientific information about the issue mismatches their construal level and tends to be rejected. People typically construe the issue of climate change in abstract/high-level terms (e.g., global environmental degradation), because the consequences of climate change are seen as psychologically distant (71), and distance promotes abstract construal (72). Thus, when ecofriendly products are described in concrete/low-level terms (e.g., fine details about the product’s carbon savings), the descriptions tend to be rejected even when they make a compelling case (71). Evaluation and choice of sustainable products are likewise undermined when the products are described in concrete terms of self-interested economic savings to consumers who think about sustainability abstractly (73).

Even holding the level of abstractness/concreteness constant, scientific information can be presented in a gain frame or a loss frame. Describing a vaccine as 90% effective (gain frame) is technically equivalent to describing it as 10% ineffective (loss frame), but the two framings have dissimilar psychological effects, because a frame can be at odds with people’s regulatory focus (74). Promotion focus orients people toward eagerly attaining gains; prevention focus orients people toward cautiously preventing losses. When scientific information is framed as promoting gains (vs. preventing losses), it tends to be rejected by people who are prevention focused (vs. promotion focused) (74). Such mismatch effects have been found to result in rejection of climate change messages (75) and health messages (e.g., about vaccination and smoking cessation) (76).

Framing of scientific information also varies in how certain and decisive it seems. Even when there is a high degree of scientific consensus, scientific information is often disseminated in terms that signal uncertainty. Such terminology, while technically accurate, leads people with high need for closure (i.e., low tolerance of epistemic uncertainty) (77) to reject it. For example, when people receive mixed scientific information about vaccines, those with high need for closure are particularly likely to become entrenched in their existing views and reject the mixed information (78). More generally, people with high need for closure are more likely to reject novel information that challenges their currently held conclusions or assumptions (77). This poses a challenge for scientists, who are trained to hedge their findings and avoid overclaiming certainty as they try to communicate the preliminary, inconclusive, nuanced, or evolving nature of scientific evidence.

Finally, scientific information varies in its quality. Intuitively, high-quality arguments are more persuasive than low-quality ones (79). But this is often not true for people with low need for cognition (i.e., people who do not enjoy thinking), for whom low-quality arguments can be just as persuasive as high-quality ones if positive peripheral cues (e.g., a likable source) are present (80). Therefore, while high-quality scientific evidence is, overall, more likely to be accepted than low-quality evidence (81), people who do not enjoy thinking are less likely to appreciate such quality distinctions. They are less likely to process complex information, as comprehending it requires active thinking (79). They are also less likely to choose to read nuanced science blog posts (82) and less likely to accept evidence for climate change and evolution (83).

Construal level, regulatory focus, need for closure, and need for cognition are different dimensions of epistemic style. On any of these dimensions, a mismatch between how scientific information is delivered and how the recipient thinks will increase the probability of rejection. More generally, mismatches between message delivery and recipients' epistemic styles (basis 4), content that conflicts with existing beliefs (basis 3), messages at odds with social identity (basis 2), and sources lacking credibility (basis 1) all contribute to antiscience attitudes. They also point to why politics is a particularly potent driver of these attitudes.

Epistemic style: an individual's habitual or favored process of making a judgment or solving a problem.

