Truth under Attack

Sep 29, 2022

Truths should be stubborn things, right? Not in today’s society. A set of polls conducted this summer revealed that about 70 percent of Republican voters still believe Joe Biden did not win the 2020 presidential election, despite extensive bipartisan investigations into voter fraud that confirmed the election’s trustworthiness. Online, YouTube’s recommendation algorithm has been shown to steer viewers toward ever more extreme and far-fetched videos, spreading conspiracy theories and fringe beliefs. And users on other platforms such as TikTok and Twitter deliberately disseminate misinformation about lifesaving vaccines.

Lies, extremism and the manipulation of reality are recurring themes in today’s news. Because untruths of every kind are antithetical to science, we hope this issue will serve in some measure as an antidote to the poison of manipulated facts and other forms of mendacity. Never has it been more important to understand the science of how we humans determine what is true.

For starters, our perception is inherently subjective. We may believe we are open-minded creatures, but most people latch on to ideas that seem to validate their preconceived beliefs, even when doing so prevents them from seeing new solutions. Such ingrained confirmation bias served us well over the course of evolution, but in the modern era it more often leads us astray.

Indeed, humans famously make, and commit to, decisions even when they don’t have all the facts, and in some cases those leaps to conclusions lead people to accept conspiracy theories and other misinformation. The good news: questioning your deepest-held beliefs, especially in light of strong evidence, can strengthen your objectivity and critical thinking skills.

Nowhere are our failings at objective reasoning more exploitable than on social media, which billions of people use worldwide. Facebook and other platforms enable the spread of misinformation that sows social unrest; meme culture, in particular, has been shown to propagate lies and deepen division. Platform algorithms that take advantage of our psychological vulnerabilities trap us in echo chambers. In the end, users become the unwitting vectors of these threats.

Civic life suffers because of these malevolent forces. Turmoil, anxiety and a sense that society is in jeopardy lead to the kind of polarization that makes winning an argument more important than understanding opponents’ viewpoints. We are stuck in what philosopher Kathleen Higgins describes as the post-truth era, where there is no longer an expectation that politicians or pundits will be honest. Rejection of expertise and sound data has even led the highest court in the land to issue rulings that endanger human health.

Although the human mind comes equipped with built-in obstacles to objective thinking, we shouldn’t give in to ignorance and bias. Psychologist Douglas T. Kenrick and his co-authors offer simple interventions that can make us more open-minded, scientific thinkers. And scientists themselves can look to philosophy for help in examining how much the tools of science, wielded by subjective creatures, can ultimately discover.

The common theme running through these otherwise dispiriting examinations of the state of our society is a heartening one. Simply being aware of how we perceive information helps protect us from disinformation and hogwash. We won’t always agree, but at least we’ll be anchored in what is real and what is not.