
TikTok has become a hotbed of misinformation


In the last election cycle, Facebook and Twitter came under heavy criticism because they were used to spread misinformation and disinformation. But as those platforms have matured and others have surged to the forefront, researchers are now examining the negative influence of newer players like TikTok.

The platform, which allows users to create and share short videos, has become tremendously popular, particularly among teens and young adults. It was the second most downloaded app during the first quarter of 2022, according to Forbes, and it has become the second most popular social media platform among teens this year, per the Pew Research Center.


And because TikTok is increasingly used as a search engine, cutting into a big chunk of Google’s search dominance, it has become a significant source of misinformation.

Earlier this month, researchers at NewsGuard sampled TikTok search results on a variety of topics, covering the 2020 presidential election, the midterm elections, Covid-19, abortion and school shootings. They found that nearly 20 percent of the results contained misinformation.

Emphasis theirs:

For example, the first result in a search for the phrase “Was the 2020 election stolen?” was a July 2022 video with the text “The Election Was Stolen!” The narrator stated that the “2020 election was overturned. President Trump should get the next two years and he should also be able to run for the next four years. Since he won the election, he deserves it.” (Election officials in all 50 states have affirmed the integrity of the election, and top officials in the Trump administration have dismissed claims of widespread fraud.)


Of the first 20 videos in the search results, six contained misinformation (if not disinformation), including one that used a QAnon hashtag. The same search on Google did not result in web pages promoting misinformation.

Similarly, a search for “January 6 FBI” on TikTok returned eight videos containing misinformation among the top 20, including the top result. Again, Google did not have any misinformation in the top 20.

While Google will search the entire internet – from government websites to news to videos to recipes – a TikTok search will only return videos uploaded to the platform by its users.

TikTok does have a content moderation system and states in its guidelines that misinformation is not accepted. But users appear to have found ways around the AI system that serves as the first line of defense against misinformation.

“There is endless variety, and efforts to evade content moderation (as indicated in [NewsGuard’s] report) will always stay several steps ahead of the efforts by the platform,” said Cameron Hickey, project director for algorithmic transparency at the National Conference on Citizenship, when asked whether there is anything the platforms can do to prevent misinformation from surfacing in search results. “That doesn’t mean the answer is always no, but it means that concrete investment in both understanding what misinformation is out there, how people talk about it, and effectively judging both the validity and danger are a significant undertaking.”

While advocates encourage social media platforms to step up their anti-misinformation efforts, there are also steps that can be taken on the user end, particularly by improving education on how to identify falsehoods.

“Users on social media need greater media literacy skills in general, but a key focus should be on understanding why messages stick,” said Hickey.

He pointed to three reasons people latch onto misinformation:

  • Motivated reasoning: People want to find content that aligns with their beliefs and values.
  • Emotional appeals: Media consumers need to pause when they have an emotional response to some information and evaluate the cause of the reaction.
  • Easy answers: Be wary of any information that seems too good to be true.
