“We the People,” as we self-identify in the preamble to the Constitution, are more polarized than ever in these ostensibly United States. When it is misinformation that fosters this polarization, all eyes immediately turn to social media. Research shows that Twitter, Facebook and other platforms may widen social gaps, not by creating echo chambers but by motivating users to prove commitment to identities, causes and political parties. It’s harder to overlook differences and form connections with those on “the other side” if the whole world is watching a heated dialogue in which you are personally engaged. Then there is the very public roster of the club to which you belong—the lists that specify precisely whom you follow on social media. Following the “wrong” accounts draws judgments of complicity by association, whereas following a “desirable” cadre of people and sharing their posts affirms a feeling of belonging.

Misinformation spreads through more traditional forms of social communication as well, including the mailbox, television ads and simple person-to-person contact. Outside the online global forum, we tend to maintain messier social circles and personal interactions. Audiences are smaller, mostly friends or relatives with whom you hash out differences and must coexist. These brick-and-mortar relationships leave less room to drift toward polarized viewpoints.

Social media, by contrast, pulls users en masse in opposite directions, hardening partisan differences. In this environment, misinformation circulating in the public sphere creates an opening to inflict damage, especially on platforms where users seem most prone to sharing content with the intent of harming “the other side.” With billions of people worldwide connecting through social media platforms, the new viral channels are lightning quick and present relentless challenges to keeping our skepticism scalpels sharp and minimizing our gullibility.

So what can we do? Giving in to pressures to exclude outsiders and prove in-group bona fides is a natural impulse. How then can social media users avoid falling prey to misinformation manipulators whose mission of chaos, confusion and lies threatens the stability of our nation’s elections and our democracy? This question has come into sharp relief as voters weigh the spate of accusations of malfeasance and misbehavior leading up to next month’s midterms.

The role of misinformation in public life is at the center of psychologist Jay Van Bavel’s research. He has probed how social media and other factors exploit the weak spots of our human need to belong—and exclude—leading to the spread of false information that has been so prevalent before November’s tipping-point elections. An associate professor of psychology and neural science at New York University, Van Bavel spoke with Scientific American about who is most vulnerable to these lies, how they spread and what we can do to avoid becoming victims of those who benefit from the ecosystem of misinformation.

[An edited transcript of the interview follows.]

How does misinformation spread and why? And how is it having an impact in the current political environment?

Social media users are scrolling through about 300 feet of news feed a day. That is a lot. Misinformation tends to be designed to grab attention: it is more emotionally loaded or infused with morality and identity, so it stands out in a sea of content. The reason we share it is that it is interesting or relevant or signals our affiliation with our political leader or party. A lot of it is salacious allegations of affairs or corruption. These types of messages can be really damaging to a candidate as they spread widely.

A surprising data point from the 2016 election was that older people spread misinformation [more than] young people [do]. Older people are more polarized and more committed to identities, and they are arguably not as savvy on social media as young people, who have been using it a lot. Older people also vote at about twice the rate of young people in midterms.

What are some factors in the success of viral misinformation?

It’s a combination of things. In structural terms, the U.S. is more polarized than it’s been in 20 years, more driven by out-group hate than by in-group love.

Then you have political leaders. During the first months of the pandemic, it was [Donald] Trump who downplayed the risks. Privately, he said it was a risk, so he clearly knew, but he was running for reelection and didn’t want to damage the economy and have people blame him for it.

And then [you have] the set of people around them, political elites. Political elites play a big role: they give cues to people about what to believe and whom to trust. And it’s kind of a trickle down, [starting with] senators who are politically active on social media or Fox News and then people like us online, scrolling and sharing or retweeting it or mentioning it to a friend or e-mailing it. People tend to trust news from family and friends more; if you see it from a family member, it seems more trustworthy.

Leaders are not the only sources of misinformation. Trump was retweeting conspiracy theories and misinformation from his followers, and he would amplify them if they were favorable to him. Conspiracy theorists are often creative in generating stories, and that trickles up and gets amplified by elites. There is definitely an ecosystem, with vicious cycles: the more [misinformation is] shared, the more incentive people have to share it, get monetized and get invited onto Fox News. There are all kinds of incentive structures embedded in this.

Which platforms are the likeliest incubators of misinformation?

TikTok is a big one, although [the company] is trying to update its policies to manage misinformation. Facebook probably has more people online than Twitter. There is a big study that found that people who got their COVID information from Facebook had the highest levels of vaccine hesitancy, even compared with those who got it from Fox News. Those data suggest that Facebook is, if anything, a big risk factor, given that it is where most people are getting news.

How do the parties responsible for generating misinformation decide on what to disseminate?

The misinformation that catches on and spreads is the stuff that connects to themes people have already heard or believe. One from the last election was “stop the steal,” the [false] notion that the election was stolen. During the pandemic, there were conspiracy theories or misinformation about every stage of it, everything from vaccines and how they affect kids to masks. And a lot of it was spread in an effort to discredit the health risks the pandemic posed.

What surprises might be on the horizon for the midterms?

It’s a little harder to predict than a presidential election. Most misinformation is targeted at specific candidates, probably in the most contested Senate races, such as those in Georgia and Pennsylvania. It will be about the candidates running in those races or maybe some swing districts in Congress. That is where the architects of misinformation want to shake things up.

A small number of people tend to be very politically extreme. Most are people on the right, so Republican political actors. You can imagine that there would be some misinformation from the left as well, but it tends not to be spread as widely. The people spreading [misinformation] are the usual suspects—Alex Jones, Roger Stone. It tends to originate from a few accounts and then gets spread by people who are politically aligned.

What can people do to be less vulnerable to influences from these misinformation campaigns?

We need to become more skeptical about information that comes from our own groups and our political leaders. I’ve shared maybe two or three pieces of misinformation. One was a parody, and I didn’t realize it because it kind of aligned with something in the zeitgeist, with my beliefs and identity. Friends on social media corrected me, and I immediately felt humiliated and took it down. In my universe, if I were to start sharing a lot of information like that from Alex Jones, I’d stop getting invited to conferences.

We need norms around correcting one another so we don’t have to do it all ourselves. That is the whole thing with science—peer review: scholars come along afterward and point out our errors. We [scientists] live in a community that helps us get smarter over time. You want to embed yourself in communities that make that normative, so you don’t get defensive and you stay open-minded about being fact-checked.

Anything else?

Another [tactic] is “prebunking,” getting the facts out before misinformation spreads too widely. That acts like a vaccine. Get an inoculation, and the brain has antibodies to this misinformation, so that when you encounter it in the wild, your brain can recognize that you’re being manipulated and counterargue it. That seems to make people more skeptical overall.

Online there is a really cool game, the Bad News Game, that teaches how [misinformation spreads] and what works. The data suggest that [the game] helps people.

Journalists have a role in identifying misinformation that’s likely to spread in this cycle, something that will piggyback on “stop the steal” or “fake ballot” allegations. I’d encourage journalists to get out accurate information about how people are likely to be manipulated, based on the last election, and give people resources or places they can go to get higher-quality information on these issues.

How does someone with an entrenched belief react to fact-checks?

For the most part, fact-checks work, but the impact they have in studies is really small compared with that of partisan identity. [Some of] the time, [fact-checking] actually seemed to backfire: people got more entrenched, mostly when something got fact-checked by the other side, and the people who had this backfire effect strongly identified with a particular group. People who drive around in a pickup with five Trump flags or a car covered with stickers for every single liberal cause they support are more susceptible to this kind of entrenchment. People who don’t make politics an essential part of their identity seem less susceptible to getting entrenched.

I think that this is going to be a big issue. It’s not going away; it’s only getting bigger. And it’s not just that individuals need to be more savvy. It’s that we are embedded in systems that reward misinformation spreaders economically. Alex Jones is a good test case: he is making so much every year selling conspiracy theories to his audience. People don’t like being manipulated, and they need a greater understanding that they are being manipulated by people profiting from these lies.