How can social media better inform young users?

Social media apps on a phone
Jonathan Raa/NurPhoto via Getty Images

Downey is an intern for The Fulcrum and a graduate student at Northwestern's Medill School of Journalism.

Social media platforms’ algorithms are tailored to promote content that excites. It doesn’t matter if a viral video of a politician yelling at a constituent was taken out of context or even artificially generated: if it evokes an emotional response, it is more likely to show up on other users’ feeds, meaning more views, more likes, more comments and more shares.

Half of 18- to 29-year-olds in the United States said they had “some or a lot of trust in the information they get from social media sites,” according to a 2022 Pew Research Center study. But if the information they are seeing on these platforms is inaccurate or entirely fabricated, there is a risk that young people, the biggest consumers of social media content, will fall victim to false information.


“The algorithm really does not care about what’s true or what’s helpful or what’s civically engaged. It cares about keeping you entertained and participating,” said Ethan Zuckerman, director of the Initiative for Digital Public Infrastructure at the University of Massachusetts at Amherst.

The responsibility to combat the rise of misinformation on these platforms does not fall to users, social media companies or policymakers alone, according to Michael Best, a professor of international affairs and interactive computing at Georgia Tech; it has to be a group effort.


“At this point it’s an all-hands-on-deck kind of challenge because it’s so significant and pervasive. So I would not say that one piece of the equation can fully respond to the challenge,” he said.

The first piece of the equation is the users, according to Best. They have a personal responsibility to develop what he called their “media consumption literacy.”

“Don’t just trust ‘randos’ because they get a lot of likes or attention,” he said.

Social platforms’ business models are focused on engagement. So those “randos” who have a ton of views are often the result of the algorithm “tipping the scales toward the most exciting content,” Best added. “And exciting, again, often privileges content of concern.”

He and other experts suggest that the best way to counteract the threat of spreading or consuming misinformation is to fact-check content from unknown users or even what’s shared by the people you follow. Comparing social media content to coverage from more traditional news sources that have proven their legitimacy over time is one of the easiest ways to do this, experts said.

“In general, it’s always good to get multiple points of view. And I think that’s true for news as it is for anything else, so I would be worried about anyone who’s just getting their news on TikTok,” Zuckerman said.

Social media consumers used to find more mainstream news on certain social media platforms.

Facebook’s referral traffic once drove many users to news outlets’ websites, but in recent years Facebook and Meta’s other platforms, like Instagram, have moved away from news and politics. In a post on Threads last summer, Instagram head Adam Mosseri shared that the negativity associated with news and politics “is not at all worth the scrutiny.” Just last month, Facebook removed its Facebook News tab in the United States, signaling a massive shift away from news and political content.

X, formerly known as Twitter, also changed how it shares news once Elon Musk took ownership in October 2022. Last year, he announced that X would stop displaying the headlines on links to news articles because he believed “it will greatly improve the esthetics” of tweets, but after complaints from users, headlines were restored earlier this year (although much smaller in size).

While news consumption on other social media sites has declined or stayed stagnant, the share of U.S. TikTok users who get their news on the platform has doubled since 2020. Nearly half of users said “they regularly get news there,” a Pew Research Center study released last month found.

Like consumers of any content, especially algorithmic content, users should “consider diversifying your diet so that you’re getting a wide variety of stuff,” Zuckerman said. But supplementing social media with more traditional media is only one piece of the puzzle. Social media companies themselves have a large responsibility to ensure that the information on their platforms is not spreading dis-, mis- and malinformation.

Social media companies have adopted some measures to combat misinformation. Meta partners with third-party fact-checking organizations on Facebook, Instagram and most recently Threads to review the accuracy of posts and stories. Content identified as false is labeled as misinformation and its distribution is reduced. This type of fact-checking became prevalent after COVID-19 misinformation spread across its platforms during the pandemic.

Like Meta, TikTok adopted a global fact-checking program to assess the accuracy of content posted to the platform. If content is flagged as harmful misinformation, TikTok will remove the video or restrict its distribution. X rebranded its old fact-checking platform, Birdwatch, to Community Notes once Musk took over in 2022. Community Notes allows users to submit helpful context to posts that could be misleading.

Social media feeds are not designed for facts and news; they “are optimized to keep you engaged,” Zuckerman said. “They’re optimized to keep you participating, keep you clicking.” The hold social media companies have on users is strong, especially when algorithms are built to confirm a user’s biases and beliefs. But there are a few ways platforms can better serve the young users who rely heavily on social media for their news.

“You could build social media feeds that aim for diversity,” Zuckerman said. For example, he described an algorithm that would suggest a few Republicans for the user to follow if it noticed a user followed Democrats exclusively. If the algorithm saw a user following a lot of Americans, it would recommend users and accounts that provide the person with a more global view.

“People might not enjoy it as much as they enjoy their current confirmations, but you could imagine it being civically useful, you could imagine it giving you a wider view of the world. We just haven't seen much of it,” Zuckerman added.
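To make the idea of a diversity-aiming feed concrete, here is a minimal sketch in Python of the kind of rule Zuckerman describes. The account names, affiliation labels and the `suggest_for_balance` helper are hypothetical illustrations for this article, not anything a platform has published.

```python
from collections import Counter

def suggest_for_balance(followed_affiliations, candidate_accounts, n_suggestions=3):
    """If a user's follows skew entirely toward one group, surface a few
    accounts from outside that group -- a feed rule that aims for diversity."""
    counts = Counter(followed_affiliations)
    if len(counts) > 1:
        return []  # the user already follows a mix; no nudge needed
    (only_group,) = counts.keys()
    # candidate_accounts maps account handle -> affiliation label
    outside = [name for name, group in candidate_accounts.items() if group != only_group]
    return outside[:n_suggestions]

# Example: a user who follows only one party's accounts gets a few from the other side.
follows = ["Democrat", "Democrat", "Democrat"]
candidates = {"@rep_a": "Republican", "@rep_b": "Republican", "@dem_c": "Democrat"}
print(suggest_for_balance(follows, candidates))  # ['@rep_a', '@rep_b']
```

The same pattern could nudge a user who follows only American accounts toward a more global mix, as Zuckerman suggests.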

De-ranking is another step social media companies can take against content they consider offensive, harmful or extreme. “That’s not censoring it, it’s just making it less prominent,” Best said. De-ranking moves content seen as harmful or false out of the top search results and further down in the algorithm so fewer people view it. This is a step shy of the more extreme option of deplatforming.
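As a rough illustration of how de-ranking differs from removal (this is not any platform’s actual ranking code), the sketch below down-weights flagged posts instead of deleting them; the penalty multiplier and the `flagged` field are assumptions made for the example.

```python
# Hypothetical post records; 'flagged' would come from a fact-checking
# partner or an internal classifier, not from this sketch.
posts = [
    {"id": 1, "engagement": 9_500, "flagged": False},
    {"id": 2, "engagement": 12_000, "flagged": True},   # flagged as likely misinformation
    {"id": 3, "engagement": 4_000, "flagged": False},
]

DERANK_PENALTY = 0.1  # assumed multiplier; real platforms do not publish theirs

def ranking_score(post):
    """Down-weight, rather than remove, content flagged as harmful or false."""
    score = post["engagement"]
    if post["flagged"]:
        score *= DERANK_PENALTY  # still visible, just pushed down the feed
    return score

feed = sorted(posts, key=ranking_score, reverse=True)
print([p["id"] for p in feed])  # [1, 3, 2] -- the flagged post drops to the bottom
```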

“Deplatforming is sort of the ultimate method that generally platforms would have against content they’re concerned with. That means removing those offending users,” Best said. De-ranking and deplatforming carry the risk of infringing on users’ First Amendment rights, though, and the courts have come to competing conclusions so far.

The U.S. Supreme Court heard two cases on the issue earlier this year. A Texas law prevented social media companies from “censoring, banning, demonetizing or otherwise restricting” content strictly because it is a user’s opinion; a federal appeals court ruled in favor of the state, determining corporations do not have a First Amendment right to censor what people say. A Florida law imposed daily penalties on social media companies that deplatformed political candidates or “journalistic enterprises.” In that case, a federal appeals court ruled in favor of the social media companies, which as private entities are entitled to moderate content. A Supreme Court ruling expected by the end of June will affect both the content users can post and what social media companies can moderate in the future.

Representatives for Meta, TikTok and X did not respond to requests for interviews.

Policymakers are the last piece of the equation, Best said. In the last year, policymakers have started to hold social media companies accountable, with Meta CEO Mark Zuckerberg testifying before Congress earlier this year and Congress passing a bill last month that bans TikTok if it’s not sold by its Chinese owner. Best suggested that policymakers have the ability to hold social media companies accountable for their business models and possibly influence them to move away from algorithms that reward content of concern.

Doing away with algorithms entirely may not change things though, Zuckerman said. “If you get rid of the algorithm altogether, you’re probably going to end up with more and more people isolating themselves because that’s what tends to happen when people have choice,” he said.

Even with all its flaws, social media at its best serves a purpose for young people and for democracy, Zuckerman argued.

“Democracy requires media,” Zuckerman said. “We have to have the capability of talking to each other and making up our minds about who we want to represent us. So you have to be able to have some space in society where people can have those conversations.”
