Hitting the Books: Modern social media has made misinformation so, so much worse

It’s not just that one uncle who’s banned from attending Thanksgiving who’s spreading misinformation online. The practice began long before the rise of social media; governments around the world have been doing it for centuries. But only in the modern era, fueled by algorithmic recommendation engines designed to endlessly maximize engagement, have nation-states managed to weaponize disinformation to such a degree. In his new book Tyrants on Twitter: Protecting Democracies from Information Warfare, Santa Clara University law professor David Sloss examines how social media sites like Facebook, Instagram and TikTok have become platforms for political operations with very real and very dire consequences for democracy, and makes the case for governments to unite in creating a global framework to regulate these networks and protect against information warfare.

Tyrants on Twitter cover art

David Castle

Excerpted from Tyrants on Twitter: Protecting Democracies from Information Warfare by David L. Sloss, published by Stanford University Press. ©2022 by the Board of Trustees of the Leland Stanford Junior University. All rights reserved.

Governments practiced disinformation long before the advent of social media. However, social media accelerates the spread of false information by enabling people to reach large audiences at low cost. It speeds the circulation of both misinformation and disinformation: “misinformation” includes any false or misleading information, while “disinformation” is false or misleading information intentionally created or strategically placed to achieve a political objective.

The political goals of a disinformation campaign can be either domestic or foreign. Earlier chapters focused on foreign policy; let’s consider domestic disinformation campaigns here. The “Pizzagate” story is a good example. In the fall of 2016, a Twitter post claimed Hillary Clinton was “the protagonist of an international child slavery and sex ring.” The story quickly spread on social media, leading to the creation of a discussion forum on Reddit titled Pizzagate. As various contributors embellished the story, they identified a particular Washington, DC pizza joint, Comet Ping Pong, as the base of operations for the child sex operation. “These bizarre and unproven allegations soon spread beyond the dark underbelly of the internet to fairly mainstream media outlets like the Drudge Report and Infowars.” Alex Jones, the creator of Infowars, “has more than 2 million followers on YouTube and 730,000 followers on Twitter; by spreading the rumours, Jones increased his reach enormously.” (Jones has since been banned from most major social media platforms.) Ultimately, a young man who believed the story arrived with “an AR-15 semi-automatic rifle… and opened fire at Comet Ping Pong and discharged several rounds.” Although the story was debunked, “pollsters found that more than a quarter of the adults surveyed were either certain that Clinton was connected to the child sex ring or believed that some part of the story must have been true.”

Several characteristics of the current information environment accelerate the spread of misinformation. Before the rise of the internet, major media outlets like CBS and the New York Times had the ability to spread stories to millions of people, but they were usually bound by professional standards of journalistic ethics to avoid deliberately spreading false stories. They were far from perfect, but they did help prevent the widespread dissemination of false information. The internet effectively eliminated the filtering function of large media organizations, allowing anyone with a social media account (and a basic working knowledge of how news goes viral) to spread misinformation to very large audiences very quickly.

The digital age has spawned automated accounts known as “bots.” A bot is “a software tool that performs specific actions on computers that are connected in a network without human user intervention.” Political activists with a moderate level of technical sophistication can use bots to speed the spread of news on social media. Additionally, social media platforms facilitate microtargeting: “the process of preparing and delivering tailored messages to voters or consumers.” In the summer of 2017, political activists in the UK built a bot to spread messages on Tinder, a dating app, with the goal of attracting new supporters to the Labour Party. “The bot accounts sent out between thirty thousand and forty thousand messages in total, targeting eighteen to twenty-five year olds in constituencies where Labour candidates needed help.” Some of those constituencies were ultimately decided by just a few votes. “Celebrating their victory via Twitter, the campaign managers thanked… their team of bots.” There is no evidence that the bots spread false information in this case, but unethical political operatives can also use bots and microtargeting to spread false stories quickly across social media.

Over the past two decades, we’ve seen the growth of an entire industry of paid political consultants who have developed expertise in using social media to influence political outcomes. One example is the Polish company discussed earlier in this chapter. Philip Howard, a leading expert on misinformation, claims: “It’s safe to say that every country in the world has a domestic political consultancy that specializes in marketing political misinformation.” Meanwhile, data mining companies have accumulated vast amounts of information about individuals by collecting data from a variety of sources, including social media platforms, and aggregating that information into proprietary databases. The data mining industry “provides the information campaign managers need to make strategic decisions about who to target, where, when, with what message, and through what device and platform.”

Political consulting firms use both bots and human-operated “fake accounts” to spread messages across social media. (A “fake account” is a social media account operated by someone assuming a false identity in order to mislead other users about who is operating the account.) Drawing on data from the data mining industry and the technical features of social media platforms, they engage in very sophisticated microtargeting, sending customized messages to select constituencies to influence public opinion and/or political outcomes. “Social media algorithms allow for constant testing and refinement of campaign messages, allowing the most advanced behavioral science techniques to sharpen the message in time for those strategically crucial ‘final days’ before an important vote.” Many such messages are undoubtedly true, but there are several well-documented instances in which paid political consultants deliberately disseminated false information in the service of political ends. For example, Howard has documented the strategic use of disinformation by the Vote Leave campaign in the weeks leading up to the UK’s Brexit referendum.

It bears emphasis that disinformation need not change anyone’s mind to undermine the very foundations of our democratic institutions. Disinformation works “not necessarily by changing minds, but by creating confusion, undermining trust in information and institutions, and eroding common points of reference.” For democracy to work effectively, we need those common reference points. An authoritarian government can require citizens to wear masks and practice social distancing during a pandemic by instilling fear that leads to obedience. In contrast, in a democratic society, governments must convince a large majority of citizens that the scientific evidence shows that wearing masks and practicing social distancing saves lives. Unfortunately, misinformation circulating on social media undermines trust in both government and scientific authority. Without that trust, it becomes increasingly difficult for leaders to build the consensus needed to formulate and implement effective policies to address pressing social problems like slowing the spread of a pandemic.

