More than 100 groups demand social media platforms do more to fight election disinformation

Online disinformation. (tommy/Getty Images)

Two days after Elon Musk said he would lift Twitter’s permanent ban on Donald Trump if his acquisition of the company goes through, more than 100 organizations called on social media companies to combat disinformation during the 2022 midterm elections.

In a letter sent Thursday to the CEOs of the biggest social media companies, the leaders of civil rights and democracy reform groups requested that the platforms take a series of steps, including “introducing friction to reduce the spread and amplification of disinformation; consistent enforcement of robust civic integrity policies; and greater transparency into business models that allow disinformation to spread.”

The letter – whose signatories include Common Cause, the Leadership Conference on Civil and Human Rights, the Campaign Legal Center, the League of Women Voters and the NAACP – praised the companies for instituting plans to combat disinformation while demanding the platforms do more and do it consistently.


Meanwhile, Democratic Sen. Michael Bennet plans to introduce a bill Thursday that would establish a federal commission to regulate tech companies. According to The Washington Post, the bill would give the government authority to review platforms’ algorithms and to create rules governing content moderation.

“We need an agency with expertise to have a thoughtful approach here,” Bennet told the Post.

But for now, the companies receiving the letter (Facebook, Google, TikTok, Snap, YouTube, Twitter and Instagram) make their own rules. And in the wake of the Jan. 6, 2021, Capitol insurrection fueled by unfounded claims that Joe Biden stole the presidential election from Donald Trump, the groups behind the letter fear further damage to democratic institutions.

“Disinformation related to the 2020 election has not gone away but has only continued to proliferate. In fact, according to recent polls, more than 40 percent of Americans still do not believe President Biden legitimately won the 2020 presidential election. Further, fewer Americans have confidence in elections today than they did in the immediate aftermath of the January 6th insurrection,” they wrote.

The letter lays out eight steps the companies can take to stop the spread of disinformation:

  • Limit the opportunities for users to interact with election disinformation, going beyond the warning labels that have been introduced.
  • Devote more resources to blocking disinformation that targets people who do not speak English.
  • Consistently enforce “civic integrity policies” during election and non-election years.
  • Apply those policies to live content.
  • Prioritize efforts to stop the spread of unfounded voter fraud claims, known as the “Big Lie.”
  • Increase fact-checking of election content, including political advertisements and statements from public officials.
  • Allow outside researchers and watchdogs access to social media data.
  • Increase transparency of internal policies, political ads and algorithms.

“The last presidential election, and the lies that continued to flourish in its wake on social media, demonstrated the dire threat that election disinformation poses to our democracy,” said Yosef Getachew, director of the media and democracy program for Common Cause. “Social media companies must learn from what was unleashed on their platforms in 2020 and helped foster the lies that led a violent, racist mob to storm the Capitol on January 6th. The companies must take concrete steps to prepare their platforms for the coming onslaught of disinformation in the midterm elections. These social media giants must implement meaningful reforms to prevent and reduce the spread of election disinformation while safeguarding our democracy and protecting public safety.”

Read More

Parents: It’s Time To Get Mad About Online Child Sexual Abuse

A person using a smartphone. (Getty Images, ljubaphoto)

With millions of child abuse images reported annually and AI creating new dangers, advocates are calling for accountability from Big Tech and stronger laws to keep kids safe online.

Forty-five years ago this month, Mothers Against Drunk Driving held its first national press conference, and a global movement to stop impaired driving was born. MADD was founded by Candace Lightner after her 13-year-old daughter was struck and killed by a drunk driver while walking to a church carnival in 1980. Terms like “designated driver” and the slogan “Friends don’t let friends drive drunk” came out of MADD’s campaigning, and state and federal laws, such as a lowered blood alcohol limit and a raised minimum legal drinking age, were instituted thanks to its advocacy. Over time, social norms evolved, and driving drunk was no longer seen as a “folk crime” but as a serious, conscious choice with serious consequences.

Movements like this one, started by fed-up, grieving parents working with law enforcement and lawmakers, lowered road fatalities nationwide, inspired similar campaigns in other countries, and saved countless lives.

King, Pope, Jedi, Superman: Trump’s Social Media Images Exclusively Target His Base and Try To Blur Political Reality

Two Instagram images put out by the White House. (White House Instagram)

A grim-faced President Donald J. Trump looks out at the reader, under the headline “LAW AND ORDER.” Graffiti pictured in the corner of the White House Facebook post reads “Death to ICE.” Beneath that, a photo of protesters, choking on tear gas. And underneath it all, a smaller headline: “President Trump Deploys 2,000 National Guard After ICE Agents Attacked, No Mercy for Lawless Riots and Looters.”

The official communication from the White House appeared on Facebook in June 2025, after Trump sent in troops to quell protests against Immigration and Customs Enforcement agents in Los Angeles. Visually, it is melodramatic, almost campy, resembling a TV promotion.

When the Lights Go Out — and When They Never Do

A person standing in a doorway with a light coming through it.

The massive outage that crippled Amazon Web Services this past October 20th sent shockwaves through the digital world. Overnight, the invisible backbone of our online lives buckled: Websites went dark, apps froze, transactions stalled, and billions of dollars in productivity and trust evaporated. For a few hours, the modern economy’s nervous system failed. And in that silence, something was revealed — how utterly dependent we have become on a single corporate infrastructure to keep our civilization’s pulse steady.

When Amazon sneezes, the world catches a fever. That is not a mark of efficiency or innovation. It is evidence of recklessness. For years, business leaders have mocked antitrust reformers like FTC Chair Lina Khan, dismissing warnings about the dangers of monopoly concentration as outdated paranoia. But the AWS outage was not a cyberattack or an act of God — it was simply the predictable outcome of a world that has traded resilience for convenience, diversity for cost-cutting, and independence for “efficiency.” Executives who proudly tout their “risk management frameworks” now find themselves helpless before a single vendor’s internal failure.

Fear of AI Makes for Bad Policy

Fear is the worst possible response to AI. Actions taken out of fear are rarely a good thing, especially when it comes to emerging technology. Empirically driven scrutiny, on the other hand, is a savvy and necessary reaction to technologies like AI that introduce both great benefits and great harms. The difference lies in whether policy is driven by emotion or by ongoing, rigorous evaluation.

A few reminders of tech policy gone wrong, due at least in part to fear, help make this point clear. Fear is what has led the US to become a laggard in nuclear energy, while many of our allies and adversaries enjoy cheaper, more reliable energy. Fear is what explains opposition to autonomous vehicles in some communities, even though human drivers were responsible for 120 deaths per day as of 2022. Fear is what sustains delays in making drones more broadly available, even though many other countries are tackling issues like rural access to key medicine via drones.
