How disinformation could sway the 2020 election


Source: The Conversation

In 2016, Russian operatives used Facebook, Twitter and YouTube to sow division among American voters and boost Donald Trump’s presidential campaign.

What the Russians used to accomplish this is called “disinformation,” which is false or misleading content intended to deceive or promote discord. Now, with the first presidential primary vote only five months away, the public should be aware of the sources and types of online disinformation likely to surface during the 2020 election.

First, the Russians will be back. Don’t be reassured by the notorious Russian Internet Research Agency’s relatively muted presence during last year’s midterm elections. The agency might have been keeping its powder dry in anticipation of the 2020 presidential race. And it helped that U.S. Cyber Command, an arm of the military, reportedly blocked the agency’s internet access for a few days right before the election in November 2018.

Temporarily shutting down the Internet Research Agency won’t be enough to stop the flow of harmful content. Lee Foster, who leads the disinformation team at the cybersecurity firm FireEye, told me in an interview that the agency is “a small component of the overall Russian operation,” which also includes Moscow’s military intelligence service and possibly other organizations. Over time, Foster said, “All of these actors rework their approaches and tactics.”

And there’s more to fear than just the Russians. I’m the author of a new report on disinformation and the 2020 election published by the New York University Stern Center for Business and Human Rights. In the report, I predict that the Russians won’t be alone in spreading disinformation in 2020. Their most likely imitator will be Iran, especially if hostility between Tehran and Washington continues to mount.

Disinformation isn’t just Russian

In May, acting on a tip from FireEye, Facebook took down nearly 100 Iran-related accounts, pages and groups. The Iranian network had used fake American identities to espouse both conservative and liberal political views, while also promoting extremely divisive anti-Saudi, anti-Israel and pro-Palestinian themes.

As Senate Intelligence Committee Vice Chairman Mark Warner, a Virginia Democrat, has said, “The Iranians are now following the Kremlin’s playbook.”

While foreign election interference has dominated discussion of disinformation, most intentionally false content targeting U.S. social media is generated by domestic sources.

I believe that will continue to be the case in 2020. President Trump often uses Twitter to circulate conspiracy theories and cast his foes as corrupt. One story line he pushes is that Facebook, Twitter and Google are colluding with Democrats to undermine him. Introducing a right-wing “social media summit” at the White House in July, he tweeted about the “tremendous dishonesty, bias, discrimination, and suppression practiced by certain companies.”

Supporters of Democrats also have trafficked in disinformation. In December 2017, a group of liberal activists created fake Facebook pages designed to mislead conservative voters in a special U.S. Senate race in Alabama. Matt Osborne, who has acknowledged being involved in the Alabama scheme, told me that in 2020, “you’re going to see a movement toward [political spending from undisclosed sources] on digital campaigns in the closing days of the race.” He suggests there could be an effort to discourage Republicans from voting with “an image of a red wave with a triumphal statement that imbues them with a sense of inevitable victory: ‘No need to bother voting. Trump has got it in the bag.’”

Spreading fake videos

Also likely to surface next year: “deepfake” videos. This technique produces highly convincing – but false – images and audio. In a recent letter to the CEOs of Facebook, Google and Twitter, House Intelligence Committee Chairman Adam Schiff, a California Democrat, wrote: “A timely, convincing deepfake video of a candidate” that goes viral on a platform “could hijack a race – and even alter the course of history. … The consequences for our democracy could be devastating.”


Instagram could be a vehicle for deepfakes. Owned by Facebook, the photo and video platform played a much bigger role in Russia’s manipulation of the 2016 U.S. election than most people realize, and it could be exploited again in 2020. The Russian Internet Research Agency enjoyed more user engagement on Instagram than it did on any other platform, according to a December 2018 report commissioned by the Senate Intelligence Committee. “Instagram is likely to be a key battleground on an ongoing basis,” the report added.

Companies could step up

The social media companies are responding to the problem of disinformation by improving their artificial intelligence filters and hiring thousands of additional employees devoted to safety and security. “The companies are getting much better at detection and removal of fake accounts,” Dipayan Ghosh, co-director of the Harvard Kennedy School’s Platform Accountability Project, told me.

But the companies often do not remove the content they identify as false; instead, they reduce how frequently it appears in users’ feeds and sometimes attach a label noting that it is false.

In my view, provably false material should be eliminated from feeds and recommendations, with a copy retained in a cordoned-off archive available for research purposes to scholars, journalists and others.

Another problem is that responsibility for content decisions now tends to be scattered among different teams within each of the social media companies. Our report recommends that to streamline and centralize, each company should hire a senior official who reports to the CEO and is responsible for overseeing the fight against disinformation. Such executives could marshal resources more easily within each company and more effectively coordinate efforts across social media companies.

Finally, the platforms could also cooperate more than they currently do to stamp out disinformation. They’ve collaborated effectively to root out child pornography and terrorist incitement. I believe they now have a collective responsibility to rid the coming election of as much disinformation as possible. An electorate that has been fed lies about candidates and issues can’t make informed decisions. Votes will be based on falsehoods. And that means the future of American democracy – in 2020 and beyond – depends on dealing effectively with disinformation.

Paul Barrett joined the Center as deputy director in September 2017 after spending more than three decades as a journalist and author focusing on the intersection of business, law, and society. Most recently, Paul worked for 12 years for Bloomberg Businessweek magazine, where he served at different times as the editor of an award-winning investigative team and a writer covering topics such as energy and the environment, military procurement, and the civilian firearm industry. From 1986 to 2005, he wrote for The Wall Street Journal, serving as the newspaper’s Supreme Court correspondent and later as the page one special projects editor. Paul is the author of four critically acclaimed nonfiction books, the most recent of which are GLOCK: The Rise of America’s Gun (2012), a New York Times Bestseller, and LAW OF THE JUNGLE: The $19 Billion Legal Battle Over Oil in the Rain Forest and the Lawyer Who’d Stop at Nothing to Win (2014). Both of those books have been optioned for Hollywood movies. Since 2008, Paul has served as an adjunct professor at New York University School of Law. He co-teaches a seminar called “Law, Economics, and Journalism,” in which students learn to analyze social issues with the tools of those three professions. Paul has a J.D. from Harvard Law School and an A.B. from Harvard College.
