Dealing with false facts: How to correct online misinformation


A comparison of an original and deepfake video of Facebook CEO Mark Zuckerberg.

Elyse Samuels/The Washington Post via Getty Images

Sanfilippo is an assistant professor in the School of Information Sciences at the University of Illinois Urbana-Champaign and book series editor for Cambridge Studies on Governing Knowledge Commons. She is a public voices fellow of The OpEd Project.

Deepfakes of celebrities and misinformation about public figures may not be new in 2024, but they are more common, and many people seem increasingly resigned to them as inevitable.

The problems posed by false online content extend far beyond public figures, impacting everyone, including youth.


In a recent press conference, New York Mayor Eric Adams emphasized that many depend on platforms to fix these problems, but that parents, voters and policymakers need to take action. “These companies are well aware that negative, frightening and outrageous content generates continued engagement and greater revenue,” Adams said.

Recent efforts by Taylor Swift’s fans, coordinated via #ProtectTaylorSwift, to take down, bury and correct fake and obscene content about her offer a welcome and hopeful example of what can be done about false and problematic content online.

Still, deepfakes (videos, photos and audio manipulated by artificial intelligence to make something look or sound real) and misinformation have drastically changed social media over the past decade, highlighting the challenges of content moderation and serious implications for consumers, politics and public health.


At the same time, generative AI — with ChatGPT at the forefront — is changing the scale of these problems, challenging the digital literacy skills recommended for scrutinizing online content and radically reshaping content on social media.

The transition from Twitter to X — which has 1.3 billion users — and the rise of TikTok — with 232 million downloads in 2023 — highlight how social media experiences have evolved as a result.

From colleagues at conferences explaining why they’ve left LinkedIn to students asking whether they really need to use it, people recognize the decline in the quality of content on that platform (and others) due to bots, AI and the incentives to produce more content.

LinkedIn has established itself as key to career development, yet some say it is not preserving expectations of trustworthiness and legitimacy associated with professional networks or protecting contributors.

In some ways, the reverse is true: User data is being used to train LinkedIn Learning’s AI coaching “with an expert lens,” which is already being monetized as a “professional development” opportunity for paid LinkedIn Premium users.

Regulation of AI is needed, as is enhanced consumer protection around technology. Users cannot meaningfully consent to use platforms and their ever-changing terms of service without transparency about what will happen with an individual’s engagement data and content.

Not everything can be solved by users. Market-driven regulation is failing us.

There need to be meaningful alternatives and the ability to opt out. Action can be as simple as individuals reporting content for moderation: When multiple people flag content for review, it is more likely to reach a human moderator, who research shows is key to effective content moderation, including removal and appropriate labeling.

Collective action is also needed. Communities can address problems of false information by working together to report concerns and by using their collective engagement to steer recommendation systems away from false and damaging content.

Professionals must also build trust with the communities they serve, so that they can promote reliable sources and develop digital literacy around sources of misinformation and the ways AI promotes and generates it. Policymakers must also regulate social media more carefully.

Truth matters: to an informed electorate, to the safety of online spaces for children, to trustworthy professional networks and to mental health. We cannot leave it up to the companies that caused the problem to fix it.

Read More


Election Countdown, with guest Isaac Saul of Tangle News

Scott Klug is a former Republican member of Congress from Wisconsin who unseated a 32-year Democratic incumbent. Despite winning his four elections by an average of 63 percent, he stayed true to his term-limit pledge and retired.

During his time in Congress, Klug had the third most independent voting record of any Wisconsin lawmaker in the last 50 years. In September 2023, he launched a podcast, “Lost in the Middle,” to shine a spotlight on the oft-ignored political center.

“The podcast was born,” Klug told Madison Magazine, “out of the sentiment that a wide swath of the American public, myself included, can’t figure out how in the hell we got to this place. And more importantly, is there a way for us out of it.”

CNN's John King and the Magic Wall

CNN and other media outlets need to explain the process, not just predict the winner on election night.

YouTube

This election night, the media can better explain how results work

Johnson is the executive director of the Election Reformers Network. Penniman is the founder and CEO of Issue One and author of “Nation on the Take: How Big Money Corrupts Our Democracy and What We Can Do About It.”

Watching election night on cable or network news is a great national tradition. Memorable moments arise as the networks announce their projections in key states. Anchors and commentators demonstrate extraordinary understanding of the unique politics of hundreds of cities and counties across the country. As the results of the most consequential election on the planet unfold, there’s a powerful sense of shared witness.

But our polarized politics has revealed a serious flaw in election night coverage. As disinformation abounds, it is increasingly important for voters to know how the actual, legally certain election results are determined. And right now, voters are not seeing enough of that information on their screens on election night.


The media has held Kamala Harris to a different standard than Donald Trump.

Jabin Botsford/The Washington Post via Getty Images

The media is normalizing the abnormal

Rikleen is executive director of Lawyers Defending American Democracy and the editor of “Her Honor – Stories of Challenge and Triumph from Women Judges.”

As we near the end of a tumultuous election season, too many traditional media outlets are inexplicably continuing their practice of covering candidates who meet standards of normalcy differently than the candidate who has long defied them.

By claiming to take the high road of neutrality in their reporting, these major outlets are committing grave harm. First, they are failing to address what is in plain sight. Second, through those continued omissions, the media has abdicated its primary responsibility of contributing to an informed electorate.

Michigan ballot box
RobinOlimb/Getty Images

Register for Election Overtime Project briefing for Michigan media

Becvar is co-publisher of The Fulcrum and executive director of the Bridge Alliance Education Fund. Nevins is co-publisher of The Fulcrum and co-founder and board chairman of the Bridge Alliance Education Fund.

The Election Overtime Project, an effort to prepare journalists to cover the outcome of the 2024 election, is hosting its third swing-state briefing on Oct. 25, this time focused on Michigan.

The series is a part of an effort to help reporters, TV anchors and others prepare America to understand and not fear close elections. Election Overtime is an initiative of the Election Reformers Network and developed in partnership with the Bridge Alliance, which publishes The Fulcrum.
