
Report suggests plan for limiting election disinformation

Rudzhan Nagiev/Getty Images

Eight months after Inauguration Day, one-third of Americans told pollsters they still believed Donald Trump actually won the election and that Joe Biden had stolen it from him. A new report offers a mix of government and corporate reforms to limit the spread and influence of such election disinformation.

The Common Cause Education Fund, an affiliate of the democracy reform advocacy group Common Cause, issued a report in late October reviewing the state of disinformation campaigns and offering a series of recommendations designed to stem the tide.

"Just as we came together last year, rising up to vote safely and securely in record numbers during a global pandemic, we must now rise up to stop election disinformation efforts in future elections," the researchers wrote.


The report groups its 14 recommendations into three categories: statutory reforms, executive and regulatory agency reforms, and corporate policy reforms for social media businesses.

"There is no single policy solution to the problem of election disinformation," according to the report. "We need strong voting rights laws, strong campaign finance laws, strong communications and privacy laws, strong media literacy laws, and strong corporate civic integrity policies."

While many of the solutions require some mix of legislative activity, increased civic education and media literacy, and grassroots advocacy, others are easier to achieve — particularly self-imposed corporate reforms, said Jesse Littlewood, vice president of campaigns for Common Cause. For example, he suggested it would not be complicated for social media platforms to consistently enforce their own standards.

"They're not taking any action on disinformation from the 2020 campaign," said Littlewood, referring to ongoing claims that the election was stolen, claims the platforms' own rules should have addressed last year. "When you let your enforcement grow lax, it allows false narratives to grow."

Littlewood also identified the need to spend more time on civic integrity.

"We learned a lot through the Facebook Papers about the disbanding of the civic team right after the 2020 election, and the historic underinvestment in content moderation particularly in civic integrity issues," he said.

The statutory recommendations focus on five areas:

  • Voter intimidation and false election speech, including state and federal legislation prohibiting the spread of election disinformation.
  • Campaign finance reforms, such as passing federal and state disclosure laws to expose "dark money" and strengthening the Federal Election Commission.
  • Passing media literacy legislation at the state level.
  • Enacting state privacy laws that include civil rights protections.
  • Approving federal legislation to curb some online business practices, such as banning discriminatory algorithms, limiting and protecting the data collected online, and supporting local and watchdog journalism.

Some aspects of these proposals already exist in federal legislation that has stalled in Congress.

The regulatory recommendations fall into four buckets:

  • Demonstrating state and federal leadership through executive action to stop the spread of election disinformation.
  • Stepping up enforcement of state and federal laws that ban voter intimidation and other election interference efforts.
  • Empowering the Federal Trade Commission to step up its privacy protection work.
  • Using the FEC and state agencies to update and enforce disclosure requirements and rules against disinformation.

Finally, Common Cause suggests five areas of improvement for social media corporations:

  • Directing users to official state and local sources of information about voting and elections.
  • Maintaining and improving their self-imposed disinformation rules, throughout election and non-election years.
  • Developing technology, such as artificial intelligence and algorithms, that limits the spread of disinformation.
  • Granting journalists and researchers more access to social media data.
  • Increasing investment in efforts to stop non-English disinformation.

Littlewood said access to the data is one of the most important recommendations, because it affects the ability to achieve many of the others.

"It's going to be hard to make progress from a regulatory process if there isn't transparency," said Littlewood. "We're trending in the other direction now, which is really problematic.

"Without access to the data, it's very hard to understand what's happening. It's very difficult to come up with recommendations that balance the private interests of the platform and the public interest. That's got to be our starting point."

Read the full report.

Read More

“There is a real public hunger for accurate, local, fact-based information”

Monica Campbell

Credit: Ximena Natera

At a time when democracy feels fragile and newsrooms are shrinking, Monica Campbell has spent her career asking how journalism can still serve the public good. She is Director of the California Local News Fellowship at the University of California, Berkeley, and a former editor at The Washington Post and The World. Her work has focused on press freedom, disinformation, and the civic role of journalism. In this conversation, she reflects on the state of free press in the United States, what she learned reporting in Latin America, and what still gives her hope for the future of the profession.

You have worked in both international and U.S. journalism for decades. How would you describe the current state of press freedom in the United States?


The digital public square rewards outrage over empathy. To save democracy, we must redesign our online spaces to prioritize dialogue, trust, and civility.

Getty Images, Tiwaporn Khemwatcharalerd

Rebuilding Civic Trust in the Age of Algorithmic Division

A headline about a new education policy flashes across a news-aggregation app. Within minutes, the comment section fills: one reader suggests the proposal has merit; a dozen others pounce. Words like idiot, sheep, and propaganda fly faster than the article loads. No one asks what the commenter meant. The thread scrolls on—another small fire in a forest already smoldering.

It’s a small scene, but it captures something larger: how the public square has turned reactive by design. The digital environments where citizens now meet were built to reward intensity, not inquiry. Each click, share, and outrage serves an invisible metric that prizes attention over understanding.


Pop-ups on federal websites blaming Democrats for the shutdown spark Hatch Act concerns, raising questions about neutrality in government communications.

Getty Images, Igor Suka

When Federal Websites Get Political: The Hatch Act in the Digital Age

As the federal government entered a shutdown on October 1, a new controversy emerged over how federal agencies communicate during political standoffs. Pop-ups and banners appeared on agency websites blaming one side of Congress for the funding lapse, prompting questions about whether such messaging violated federal rules meant to keep government communications neutral. The episode has drawn bipartisan concern and renewed scrutiny of the Hatch Act, a 1939 law that governs political activity in federal workplaces.

The Shutdown and Federal Website Pop-ups

The government shutdown began after negotiations over the federal budget collapsed. Republicans, who control both chambers of Congress, needed Democratic support in the Senate to pass a series of funding bills, or continuing resolutions, but failed to reach an agreement before the deadline. In the hours before the shutdown took effect, the Department of Housing and Urban Development, or HUD, posted a full-screen red banner stating, “The Radical Left in Congress shut down the government. HUD will use available resources to help Americans in need.” Users could not access the website until clicking through the message.

Congress Must Lead On AI While It Still Can
Photo by Igor Omilaev on Unsplash

Last month, Matthew and Maria Raine testified before Congress, describing how their 16-year-old son confided suicidal thoughts to AI chatbots, only to be met with validation, encouragement, and even help drafting a suicide note. The Raines are among multiple families who have recently filed lawsuits alleging that AI chatbots were responsible for their children’s suicides. Those deaths underscore an argument now playing out in federal courts: artificial intelligence is no longer an abstraction of the future; it is already shaping life and death.

And these teens are not outliers. According to Common Sense Media, a nonprofit dedicated to improving the lives of kids and families, 72 percent of teenagers report using AI companions, often relying on them for emotional support. This dependence is developing far ahead of any emerging national safety standard.
