certain and wrong: you can’t enfactify your way out of it

what if we’re wrong about 'being wrong'? rethinking polarisation as more than simply a bunch of cognitive biases in action

Shane O'Mara
Jul 16, 2025


how cognitive biases, values, and identity turn facts into flashpoints

Modern politics is dominated by hyperpolarisation; even issues of settled science (climate change, vaccines, evolution) have become fiercely divisive (this varies by country, of course: anti-vax gibberish is marginal in many countries).

What's going on?

One way of explaining what’s going on is to point to a malign constellation of psychological mechanisms (confirmation bias, myside reasoning, psychological reactance, and identity-protective cognition), amplified further by assortative social interactions, all coming together to bring out the worst in us, rather than the best of us, the ‘better angels of our nature’ 😇 .

Related post: Smarter thinking in everyday life (Jan 16)

confirmation bias drives polarisation

Confirmation bias is the tendency to seek out and interpret information consistent with your existing beliefs. It is a powerful driver of polarisation because humans instinctively avoid cognitive dissonance (the discomfort felt when encountering information that contradicts your existing views): we prefer the feeling of coherence and comfort in our beliefs.

As a result, individuals selectively seek information sources aligning with their existing beliefs, ignore or dismiss opposing evidence, and interpret ambiguous information as further proof of their original convictions.

They may even actively choose an information diet that exclusively feeds their biases: after all, finding out you're right because some podcaster tells you you are is rewarding, especially if said podcaster is esteemed by others in your social circle.

Media of various sorts further reinforce these biases by continuously supplying users with belief-consistent content, creating echo chambers and deepening divides.

Here's a good example (embedded post not shown; FYI: he’s not president anymore!).

myside bias and motivated reasoning

Closely related to confirmation bias is myside bias: treating your own arguments as inherently rational, reasonable, and logical, while viewing opposing arguments as irrational, unreasonable, and illogical.

Myside reasoning is less about discovering empirical reality than protecting your existing beliefs and perhaps persuading others.

This often leads us to dismiss strong counterarguments that challenge our beliefs: research indicates that presenting partisans with opposing evidence rarely prompts genuine reconsideration. Instead, it triggers motivated scepticism: counterarguments are perceived as threats to identity, and individuals become emotionally invested in dismissing or undermining opposing perspectives.

(My book Talking Heads has a US Kindle edition available here, and of course you can order the hard copy and audio editions as well.)

the role of identity and values

Political polarisation is driven by cultural identities and core values: people often accept or reject factual information based on its implications for their social identity. Findings labelled as “settled science” may become divisive if accepting them signals betrayal of group loyalties or values.

Related post: The pope and the tooth fairy (June 12, 2024)

Dan Kahan’s theory of cultural cognition illustrates this: when scientific conclusions threaten group identities or cherished moral convictions, identity-protective cognition takes over. The desire to maintain group solidarity or uphold sacred values (such as freedom, autonomy, or religious principles) can override empirical accuracy. Scientific literacy alone does not solve this; in fact, better-informed individuals often become more adept at aligning their factual beliefs with group identities rather than the empirical world.

Related post: the self as a governance unit (Jul 2)

confirmation bias in digital environments

How does confirmation bias operate online? A study by Boonprakong and colleagues (ref at bottom) examined how people interact with polarising content in a Twitter-like news feed. They found that users with strong political beliefs and a tendency towards low-effort thinking were significantly more prone to confirmation bias when exposed to emotionally charged posts. Participants selected content matching their views, and also recalled it and rated it as more accurate and persuasive.

These findings suggest confirmation bias arises from both belief and context: the emotional tone of content, the layout of information feeds, and the cognitive effort demanded all influence polarisation. Social media platforms could therefore design interventions (such as reducing polarising expression and enhancing media literacy tools) to dampen these bias-amplifying effects.

This result complements prior findings: we humans are not simply passive recipients of information, but active selectors and interpreters, often unknowingly reinforcing our prior views through interface affordances and cognitive biases. And we end up in a self-curated epistemic bubble, shaped by design and disposition.

psychological reactance and resistance

Another psychological mechanism driving polarisation is reactance: our instinctive resistance to perceived attempts at coercion, control, or removal of agency. When authorities or experts deliver recommendations by diktat (such as vaccination mandates or climate policies), many individuals experience these as threats to personal autonomy. Psychological reactance can provoke resistance even to scientifically sound recommendations, making compliance feel like submission rather than rational agreement.

Related post: psychological reactance theory: this is why you resist change (May 22)

Reactance research shows messaging perceived as controlling or paternalistic frequently backfires, prompting increased resistance among groups already sensitive to autonomy restrictions (for instance, among those holding libertarian or anti-authoritarian values).

assortative interactions amplify divisions

Polarisation is also intensified by assortative social dynamics: we humans prefer associating with like-minded others. Over time, assortative interactions can create homogeneous groups whose members reinforce each other’s beliefs. Shared narratives become dominant, dissent is discouraged, and internal views harden into identity markers.

Social network experiments demonstrate how clustering like-minded individuals amplifies polarisation. Once social groups become ideologically homogeneous, group members rarely encounter genuine counterarguments, and the social costs of dissent rise. Such environments foster radicalisation; members increasingly adopt extreme positions, as these signal greater group loyalty (‘Yes, the election was stolen’; ‘yes, I believe windmills cause cancer’; ‘of course, vaccines cause autism’; ‘indeed dewormers can treat covid’; ‘something, something, space lasers mumble weather’ and many more: I discuss examples in my book ‘Talking Heads’ - US Kindle Ed available here).

Related post: the myth of a fixed personality (May 13)

moral reasoning and emotional commitment

Underlying these cognitive and social dynamics is moral reasoning: beliefs around controversial scientific issues often evolve into sacred moral convictions, where rejection of opposing views becomes morally justified rather than logically argued. When science conflicts with deeply held moral or religious values, factual corrections are often rejected outright, as they are considered morally threatening rather than merely incorrect.

Moralising beliefs heightens emotional commitment to narratives that justify ignoring or dismissing conflicting evidence. Emotional narratives about personal freedom, bodily autonomy, or loyalty to community identities come to dominate logical argumentation, making opposing viewpoints seem morally unacceptable rather than merely factually incorrect.


why it matters

Understanding how cognitive biases, values, identity, and emotional reasoning converge explains why even scientifically settled questions become fiercely polarised. Interventions that respect these psychological realities are likely to be more effective at reducing polarisation than those that censor or override them. Understanding the psychological roots of polarisation helps us engage constructively with seemingly irrational beliefs, and creates opportunities for meaningful dialogue in politically charged environments.

Related post: the planning trap: when more means less (Jun 17)

but what if we’re wrong about why we’re divided?

(or why my argument might be wrong)

I’ve argued that cognitive biases (confirmation bias, myside reasoning, psychological reactance) help explain why arguments around “settled science” can become fiercely polarised: people often reject facts because their beliefs serve deeper values, identities, and social affiliations. Add in the emotional pull of like-minded communities and the reinforcement loops of social media, and you get a recipe for ideological siloing and belief hardening: people are not polarised simply because they are misinformed.

That account, I think, still holds—but it may not be the whole story.

Here, I take a step back and ask: what might be wrong, incomplete, or misleading about this account? What have we missed by focusing on cognition, and how might the story shift if we widen the lens? In other words, if we test my argument above for its own confirmation bias and myside reasoning, what will we find?

If you enjoyed this post, and want more essays on how neuroscience and psychology can help us build better lives, subscribe below. You’ll get new ideas from BrainPizza. And a major new series is coming very soon - focusing on how we can rethink our democracies for the better.

maybe it’s not just about bias

(Below: extended discussion focused on why the view above might be wrong (this was harder to write, as I like to think I’m correct!); overcoming bias - point by point; plus a handy downloadable smart thinking pdf link, and a list of further reading).

© 2025 Shane O'Mara