News. Debate. Community. Levers for a better democracy.
Drew Angerer/Getty Images

U.S. President Donald Trump speaks during a 'Make America Great Again' campaign rally on May 20, 2019 in Montoursville, Pennsylvania.

How disinformation could sway the 2020 election

Barrett is the deputy director of the Center for Business and Human Rights at the Stern School of Business and is an adjunct professor of law at New York University.

In 2016, Russian operatives used Facebook, Twitter and YouTube to sow division among American voters and boost Donald Trump's presidential campaign.


What the Russians used to accomplish this is called "disinformation," which is false or misleading content intended to deceive or promote discord. Now, with the first presidential primary vote only five months away, the public should be aware of the sources and types of online disinformation likely to surface during the 2020 election.

First, the Russians will be back. Don't be reassured by the notorious Russian Internet Research Agency's relatively negligible presence during last year's midterm elections. The agency might have been keeping its powder dry in anticipation of the 2020 presidential race. And it helped that U.S. Cyber Command, an arm of the military, reportedly blocked the agency's internet access for a few days right before the election in November 2018.

Temporarily shutting down the Internet Research Agency won't be enough to stop the flow of harmful content. Lee Foster, who leads the disinformation team at the cybersecurity firm FireEye, told me in an interview that the agency is "a small component of the overall Russian operation," which also includes Moscow's military intelligence service and possibly other organizations. Over time, Foster said, "All of these actors rework their approaches and tactics."

And there's more to fear than just the Russians. I'm the author of a new report on disinformation and the 2020 election published by the New York University Stern Center for Business and Human Rights. In the report, I predict that the Russians won't be alone in spreading disinformation in 2020. Their most likely imitator will be Iran, especially if hostility between Tehran and Washington continues to mount.

In May, acting on a tip from FireEye, Facebook took down nearly 100 Iranian-related accounts, pages and groups. The Iranian network had used fake American identities to espouse both conservative and liberal political views, while also promoting extremely divisive anti-Saudi, anti-Israel and pro-Palestinian themes.

As Senate Intelligence Committee co-chair Mark Warner, a Virginia Democrat, has said, "The Iranians are now following the Kremlin's playbook."

While foreign election interference has dominated discussion of disinformation, most intentionally false content targeting U.S. social media is generated by domestic sources.

I believe that will continue to be the case in 2020. President Trump often uses Twitter to circulate conspiracy theories and cast his foes as corrupt. One story line he pushes is that Facebook, Twitter and Google are colluding with Democrats to undermine him. Introducing a right-wing "social media summit" at the White House in July, he tweeted about the "tremendous dishonesty, bias, discrimination, and suppression practiced by certain companies."

Supporters of Democrats also have trafficked in disinformation. In December 2017, a group of liberal activists created fake Facebook pages designed to mislead conservative voters in a special U.S. Senate race in Alabama. Matt Osborne, who has acknowledged being involved in the Alabama scheme, told me that in 2020, "you're going to see a movement toward [political spending from undisclosed sources] on digital campaigns in the closing days of the race." He suggests there could be an effort to discourage Republicans from voting with "an image of a red wave with a triumphal statement that imbues them with a sense of inevitable victory: 'No need to bother voting. Trump has got it in the bag.'"

Also likely to surface next year: "deepfake" videos. This technique produces highly convincing – but false – images and audio. In a recent letter to the CEOs of Facebook, Google and Twitter, House Intelligence Committee Chairman Adam Schiff, a California Democrat, wrote: "A timely, convincing deepfake video of a candidate" that goes viral on a platform "could hijack a race – and even alter the course of history. … The consequences for our democracy could be devastating."



Instagram could be a vehicle for deepfakes. Owned by Facebook, the photo and video platform played a much bigger role in Russia's manipulation of the 2016 U.S. election than most people realize, and it could be exploited again in 2020. The Russian Internet Research Agency enjoyed more user engagement on Instagram than it did on any other platform, according to a December 2018 report commissioned by the Senate Intelligence Committee. "Instagram is likely to be a key battleground on an ongoing basis," the report added.

The social media companies are responding to the problem of disinformation by improving their artificial intelligence filters and hiring thousands of additional employees devoted to safety and security. "The companies are getting much better at detection and removal of fake accounts," Dipayan Ghosh, co-director of the Harvard Kennedy School's Platform Accountability Project, told me.

But the companies do not completely remove much of the content they pinpoint as false; they merely reduce how often it appears for users, and sometimes post a message noting that it's false.

In my view, provably false material should be eliminated from feeds and recommendations, with a copy retained in a cordoned-off archive available for research purposes to scholars, journalists and others.

Another problem is that responsibility for content decisions now tends to be scattered among different teams within each of the social media companies. Our report recommends that to streamline and centralize, each company should hire a senior official who reports to the CEO and is responsible for overseeing the fight against disinformation. Such executives could marshal resources more easily within each company and more effectively coordinate efforts across social media companies.

Finally, the platforms could also cooperate more than they currently do to stamp out disinformation. They've collaborated effectively to root out child pornography and terrorist incitement. I believe they now have a collective responsibility to rid the coming election of as much disinformation as possible. An electorate that has been fed lies about candidates and issues can't make informed decisions. Votes will be based on falsehoods. And that means the future of American democracy – in 2020 and beyond – depends on dealing effectively with disinformation.

This article is republished from The Conversation under a Creative Commons license. Read the original article.




Justin Sullivan/Getty Images

Computer company Hewlett Packard received a perfect score from the index for its policies on political spending disclosure.

Big companies disclosing more could-be-secret political spending, analysis shows

An increasing number of the country's largest publicly traded companies are disclosing more than ever about political spending habits that the law permits them to keep secret.

That's the central finding of the fifth annual report from a group of academics and corporate ethicists, who say the average disclosure score among companies in the S&P 500 – the biggest companies traded on American exchanges – has gone up each year since 2014.

Though corporate political action committees must disclose their giving to candidates, those numbers are very often dwarfed by the donations businesses make to trade associations and other outside groups, which have driven much of the steady rise in election spending. Conservatives will read the new report as evidence that robust voluntary disclosure is the best way to regulate money in politics and is working fine. Those who want more assertive federal regulation of campaign finance will read it as evidence that corporate transparency remains inconsistent and inadequate to the task.

Tasos Katopodis/Getty Images

Sen. Ron Wyden of Oregon is joined by fellow Democratic members of the House and Senate this summer to discuss legislation that would attempt to prevent hacking into the country's election systems. Intelligence officials announced late last week an outline for how to release information about possible hacks during the 2020 elections.

Intel community promising more transparency about election hacking efforts

A year from the presidential election, U.S. intelligence agencies have adopted a new framework for how they will inform candidates, groups and the public about attempts to disrupt our country's elections by foreign operatives.

But the one-page summary of the plan, released late last week, is so general that it remains unclear what the intelligence community plans to do if and when it discovers something suspicious.

The summary by the director of national intelligence states that the federal government will "follow a process and principles designed to ensure, to the greatest extent possible, that notification decisions are consistent, well-informed and unbiased."

The new framework is designed to prevent a repeat of some of what happened after the 2016 election.
