It’s not just that one uncle, the one who’s no longer welcome at Thanksgiving, who’s been spreading misinformation online. The practice began long before the rise of social media; governments around the world have been doing it for centuries. But it wasn’t until the modern era, fueled by algorithmic recommendation engines built to maximize engagement, that nation-states managed to weaponize disinformation to such a degree. In his new book Tyrants on Twitter: Protecting Democracies from Information Warfare, David Sloss, Professor of Law at Santa Clara University, explores how social media platforms like Facebook, Instagram, and TikTok have become staging grounds for political operations with very real, and very dire, consequences for democracy. He argues that governments should unite to create a global framework that regulates these networks and protects them from information warfare.

Excerpted from Tyrants on Twitter: Protecting Democracies from Information Warfare, by David L. Sloss, published by Stanford University Press, ©2022 by the Board of Trustees of the Leland Stanford Junior University. All Rights Reserved.


Governments were practicing disinformation long before the advent of social media. However, social media accelerates the spread of false information, both misinformation and disinformation, by enabling people to reach a large audience at low cost. “Misinformation” includes any false or misleading information; “disinformation” is false or misleading information that is purposefully crafted or strategically placed to achieve a political goal.

The political objectives of a disinformation campaign could be either foreign or domestic. Prior chapters focused on foreign affairs. Here, let us consider domestic disinformation campaigns. The “Pizzagate” story is a good example. In fall 2016, a Twitter post alleged that Hillary Clinton was “the kingpin of an international child enslavement and sex ring.” The story quickly spread on social media, leading to the creation of a discussion board on Reddit with the title “Pizzagate.” As various contributors embellished the story, they identified a specific pizza parlor in Washington, DC, Comet Ping Pong, as the base of operations for the child sex operation. “These bizarre and evidence-free allegations soon spread beyond the dark underbelly of the internet to relatively mainstream right-wing media such as the Drudge Report and Infowars.” Alex Jones, the creator of Infowars, “has more than 2 million followers on YouTube and 730,000 followers on Twitter; by spreading the rumors, Jones vastly increased their reach.” (Jones has since been banned from most major social media platforms.) Ultimately, a young man who believed the story arrived at Comet Ping Pong with “an AR-15 semiautomatic rifle… and opened fire, unloading multiple rounds.” Although the story was debunked, “pollsters found that more than a quarter of adults surveyed were either certain that Clinton was connected to the child sex ring or that some part of the story must have been true.”

Several features of the current information environment accelerate the spread of misinformation. Before the rise of the internet, major media companies like CBS and the New York Times had the capacity to distribute stories to millions of people, but they were generally bound by professional standards of journalistic ethics and would not deliberately spread false stories. They were far from perfect, but they did help prevent the widespread dissemination of false information. The internet effectively removed that filtering function, enabling anyone with a social media account, and a basic working knowledge of how messages go viral, to spread misinformation to a very large audience very quickly.

The digital age has given rise to automated accounts known as “bots.” A bot is “a software tool that performs specific actions on computers connected in a network without the intervention of human users.” Political operatives with a moderate degree of technical sophistication can utilize bots to accelerate the spread of messages on social media. Moreover, social media platforms facilitate the use of microtargeting: “the process of preparing and delivering customized messages to voters or consumers.” In summer 2017, political activists in the United Kingdom built a bot to disseminate messages on Tinder, a dating app, that were designed to attract new supporters for the Labour Party. “The bot accounts sent between thirty thousand and forty thousand messages in all, targeting eighteen- to twenty-five-year-olds in constituencies where the Labour candidates needed help.” In the ensuing election, “the Labour Party either won or successfully defended some of these targeted districts by just a few votes. In celebrating their victory over Twitter, campaign managers thanked… their team of bots.” There is no evidence in this case that the bots were spreading false information, but unethical political operatives can also use bots and microtargeting to spread false messages quickly via social media. 
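To give a concrete sense of how little technical sophistication this requires, here is a minimal, illustrative sketch in Python of the kind of automated message dissemination described above. The SocialApiClient class, its send_message method, and the audience list are hypothetical stand-ins; the UK Tinder bot’s actual code was never published, so nothing here is drawn from it.

```python
import random
import time

# Hypothetical stand-in for a real platform API client; illustrative only.
class SocialApiClient:
    def send_message(self, user_id: str, text: str) -> None:
        print(f"[sent to {user_id}] {text}")

# Assumed input: user IDs already filtered to a target demographic
# (e.g., 18- to 25-year-olds in marginal constituencies).
target_users = ["user_001", "user_002", "user_003"]

# A handful of message templates; a real operation would rotate many more.
templates = [
    "Hey! Are you planning to vote on Thursday?",
    "Quick question: have you registered to vote yet?",
]

client = SocialApiClient()
for user_id in target_users:
    # Vary the wording and pause between sends to mimic human behavior
    # and evade rate limits; this pacing is most of the "sophistication."
    client.send_message(user_id, random.choice(templates))
    time.sleep(random.uniform(1.0, 5.0))
```

The point of the sketch is the loop itself: once an account is connected to a platform’s messaging interface, reaching tens of thousands of targeted users is a matter of letting the script run.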

In the past two decades, we have seen the growth of an entire industry of paid political consultants who have developed expertise in utilizing social media to influence political outcomes. The Polish firm discussed earlier in this chapter is one example. Philip Howard, a leading expert on misinformation, claims: “It is safe to say that every country in the world has some homegrown political consulting firm that specializes in marketing political misinformation.” Political consultants work with data mining companies that have accumulated huge amounts of information about individuals by collecting data from a variety of sources, including social media platforms, and aggregating that information in proprietary databases. The data mining industry “supplies the information that campaign managers need to make strategic decisions about whom to target, where, when, with what message, and over which device and platform.”
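To make the aggregation step concrete, here is a small, invented Python example of merging records from different sources into per-person profiles that feed targeting decisions. The field names, records, and selection criteria are all hypothetical; the book does not describe any firm’s actual schema or data.

```python
# Invented records standing in for data bought or collected from
# different sources; no real firm's data is represented here.
voter_file = [
    {"id": "p1", "name": "A. Jones", "district": "North-04", "age": 22},
    {"id": "p2", "name": "B. Smith", "district": "North-04", "age": 57},
]
social_signals = [
    {"id": "p1", "interests": ["housing", "student debt"]},
    {"id": "p2", "interests": ["pensions"]},
]

# Aggregate by person ID into a single profile: the basic move behind
# deciding "whom to target, where, when, with what message."
profiles = {row["id"]: dict(row) for row in voter_file}
for row in social_signals:
    profiles[row["id"]]["interests"] = row["interests"]

# Example targeting decision: young voters in a contested district who
# care about an issue the campaign has a tailored message for.
targets = [
    p for p in profiles.values()
    if p["district"] == "North-04" and p["age"] <= 25
    and "student debt" in p.get("interests", [])
]
print(targets)  # -> [the profile for A. Jones]
```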

Political consulting firms use both bots and human-operated “fake accounts” to disseminate messages via social media. (A “fake account” is a social media account whose operator adopts a false identity in order to mislead other users about who is behind it.) They combine data from the data mining industry with the technical features of social media platforms to engage in very sophisticated microtargeting, sending customized messages to select groups of voters to shape public opinion or influence political outcomes. “Social media algorithms allow for the constant testing and refinement of campaign messages, so that the most advanced techniques of behavioral science can sharpen the message in time for those strategically crucial final days” before an important vote. Many such messages are undoubtedly truthful, but there are several well-documented cases in which paid political consultants have deliberately spread false information in service of a political objective. For example, Howard has documented the strategic use of disinformation by the Vote Leave campaign in the final weeks before the UK referendum on Brexit.
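The “constant testing and refinement” quoted above maps onto standard online experimentation techniques. The sketch below is a minimal epsilon-greedy loop over message variants, written as an illustration under simulated engagement data; the variant texts and response rates are invented, and nothing here comes from the book or any actual campaign.

```python
import random

# Invented message variants with simulated true response rates; a real
# campaign would learn these rates from live engagement signals.
variants = {
    "A: economy-focused message": 0.05,
    "B: healthcare-focused message": 0.08,
    "C: immigration-focused message": 0.03,
}

counts = {v: 0 for v in variants}   # times each variant was shown
wins = {v: 0 for v in variants}     # positive responses observed
EPSILON = 0.1                       # fraction of traffic spent exploring

def pick_variant() -> str:
    # Explore occasionally; otherwise exploit the best-performing message.
    if random.random() < EPSILON:
        return random.choice(list(variants))
    return max(counts, key=lambda v: wins[v] / counts[v] if counts[v] else 0.0)

for _ in range(10_000):
    v = pick_variant()
    counts[v] += 1
    # Simulated user reaction standing in for a real engagement signal.
    if random.random() < variants[v]:
        wins[v] += 1

for v in variants:
    rate = wins[v] / counts[v] if counts[v] else 0.0
    print(f"{v}: shown {counts[v]} times, observed rate {rate:.3f}")
```

Run long enough, a loop like this concentrates traffic on whichever message performs best, which is the mechanical core of “sharpening the message” before a vote, whether the message is truthful or not.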

It bears emphasis that disinformation does not have to be believed to erode the foundations of our democratic institutions. Disinformation “does not necessarily succeed by changing minds but by sowing confusion, undermining trust in information and institutions, and eroding shared reference points.” For democracy to function effectively, we need shared reference points. An authoritarian government can require citizens to wear masks and practice social distancing during a pandemic by instilling fear that leads to obedience. In a democratic society, by contrast, governments must persuade a large majority of citizens that scientific evidence demonstrates that wearing masks and practicing social distancing saves lives. Unfortunately, misinformation spread on social media undermines trust in both government and scientific authority. Without that trust, it becomes increasingly difficult for government leaders to build the consensus needed to formulate and implement effective policies to address pressing social problems, such as slowing the spread of a pandemic.
