More than 100 groups demand social media platforms do more to fight election disinformation

Two days after Elon Musk said he would lift Twitter’s permanent ban on Donald Trump if his acquisition goes through, more than 100 organizations called on social media companies to combat disinformation during the 2022 midterm elections.

In a letter sent to the CEOs of the biggest social media companies on Thursday, the leaders of civil rights and democracy reform groups requested the platforms take a series of steps, including “introducing friction to reduce the spread and amplification of disinformation; consistent enforcement of robust civic integrity policies; and greater transparency into business models that allow disinformation to spread.”

The letter – whose signatories include Common Cause, the Leadership Conference on Civil and Human Rights, Campaign Legal Center, the League of Women Voters and the NAACP – praised the companies for instituting plans to combat disinformation while demanding the platforms do more and do it consistently.


Meanwhile, Democratic Sen. Michael Bennet of Colorado plans to introduce a bill Thursday to establish a federal commission to regulate tech companies. According to The Washington Post, the bill would give the government authority to review platforms’ algorithms and to create rules governing content moderation.

“We need an agency with expertise to have a thoughtful approach here,” Bennet told the Post.

But for now, the companies receiving the letter (Facebook, Google, TikTok, Snap, YouTube, Twitter and Instagram) make their own rules. And in the wake of the Jan. 6, 2021, Capitol insurrection fueled by unfounded claims that Joe Biden stole the presidential election from Donald Trump, the groups behind the letter fear further damage to democratic institutions.

“Disinformation related to the 2020 election has not gone away but has only continued to proliferate. In fact, according to recent polls, more than 40 percent of Americans still do not believe President Biden legitimately won the 2020 presidential election. Further, fewer Americans have confidence in elections today than they did in the immediate aftermath of the January 6th insurrection,” they wrote.

The letter lays out eight steps the companies can take to stop the spread of disinformation:

  • Limit the opportunities for users to interact with election disinformation, going beyond the warning labels that have been introduced.
  • Devote more resources to blocking disinformation that targets people who do not speak English.
  • Consistently enforce “civic integrity policies” during election and non-election years.
  • Apply those policies to live content.
  • Prioritize efforts to stop the spread of unfounded voter fraud claims, known as the “Big Lie.”
  • Increase fact-checking of election content, including political advertisements and statements from public officials.
  • Allow outside researchers and watchdogs access to social media data.
  • Increase transparency of internal policies, political ads and algorithms.

“The last presidential election, and the lies that continued to flourish in its wake on social media, demonstrated the dire threat that election disinformation poses to our democracy,” said Yosef Getachew, director of the media and democracy program for Common Cause. “Social media companies must learn from what was unleashed on their platforms in 2020 and helped foster the lies that led a violent, racist mob to storm the Capitol on January 6th. The companies must take concrete steps to prepare their platforms for the coming onslaught of disinformation in the midterm elections. These social media giants must implement meaningful reforms to prevent and reduce the spread of election disinformation while safeguarding our democracy and protecting public safety.”
