
Misinformation isn’t just coming from your cranky uncle on Facebook

Opinion


Minichiello writes The Sift newsletter for educators at the News Literacy Project.

I see misinformation all the time. Scrolling through Instagram, I saw a musician I follow sharing false posts about the Israel-Hamas war. At a restaurant, a server making friendly small talk described the true crime content she finds online and rattled off names of accounts that I later discovered were conspiracy-minded. A friend of mine thinks the conspiracy theory site Infowars is a delight. And there’s my relative who started entertaining the idea that the world is flat after watching YouTube videos.

Misinformation affects everything from our health decisions to our personal relationships to business to, of course, democracy. It’s easy to get angry when we’re confronted with misinformation — that’s what it’s designed to do — but learning how to sort fact from fiction online while also practicing empathy will go a long way in fixing the misinformation crisis.


And the News Literacy Project, where I now work after a career in journalism, can teach people how to identify credible news.

First, if you’re fortunate enough not to live in a local news desert, get your information from your local newspaper, radio station or TV news program. Beware of online “news” sites that are really partisan propaganda: if they carry very few or no local stories but plenty of politically charged articles, that’s a red flag. Also, follow news outlets that adhere to journalism standards and ethics, such as being transparent about corrections. Quality local news not only empowers us individually, but studies show it’s also good for democracy.

Second, when navigating difficult conversations where misinformation may come up, try practicing what NLP calls “PEP”: patience, empathy and persistence. You can’t convince people of anything in one heated conversation, but if you listen you can walk away with a better understanding of how they developed their beliefs.

“We don’t all think the same,” said one of my relatives, who refused to get a Covid-19 booster shot after seeing misinformation about how it might affect someone’s hair.

I responded: “I’m not saying we need to think the same thing. I’m saying we need to know what’s true before we agree or disagree.”

It’s hard when someone you love is repeating falsehoods, but that doesn’t mean they’re unintelligent or a bad person. We’re living in a system where social media platforms don’t properly moderate content for misinformation and, in fact, incentivize its spread by rewarding outrage. Laws and regulations haven’t caught up to how quickly artificial intelligence tools that generate misleading images, videos and posts are being developed.

Additionally, remember that none of us are exempt from falling for misinformation. It’s embarrassing, but it’s happened to me too. During the early days of the pandemic, photos spread online of dolphins swimming in Venice canals. It seemed mesmerizing and magical. Of course, those photos later turned out to be false. And I felt like a fool – something to remember the next time I spar with a friend over misinformation.

It’s a systemic problem. We can do our part to counter it by seeking out credible sources of news. We can keep in mind that it’s easy to be fooled by falsehoods. And we can engage in tough, face-to-face conversations with the people we care about, but lead with empathy — not accusations.

One tactic that has helped me immensely — not just in countering misinformation I see online but also when talking through hot misinformation topics with loved ones — is to simply take ... a ... pause. This is actually a news literacy skill. Being news-literate means being able to identify credible information, which is often as simple as pausing to confirm whether something is true before sharing it online or in conversations.

It also means seeking out credible news sources like your local TV station or paper. By practicing news literacy skills, we’ll be better equipped to find trustworthy information and engage in difficult conversations about conspiracy theories and misinformation.

During National News Literacy Week, which runs Jan. 22-26, you can learn how to push back on misinformation and empower yourself and your communities to seek quality, vetted information. Join us in our effort to build a national movement to advance the practice of news literacy throughout American society, creating better informed, more engaged and more empowered individuals — and ultimately a stronger democracy.

Read More

The digital public square rewards outrage over empathy. To save democracy, we must redesign our online spaces to prioritize dialogue, trust, and civility.

Rebuilding Civic Trust in the Age of Algorithmic Division

A headline about a new education policy flashes across a news-aggregation app. Within minutes, the comment section fills: one reader suggests the proposal has merit; a dozen others pounce. Words like idiot, sheep, and propaganda fly faster than the article loads. No one asks what the commenter meant. The thread scrolls on—another small fire in a forest already smoldering.

It’s a small scene, but it captures something larger: how the public square has turned reactive by design. The digital environments where citizens now meet were built to reward intensity, not inquiry. Each click, share, and outrage serves an invisible metric that prizes attention over understanding.

Pop-ups on federal websites blaming Democrats for the shutdown spark Hatch Act concerns, raising questions about neutrality in government communications.

When Federal Websites Get Political: The Hatch Act in the Digital Age

As the federal government entered a shutdown on October 1st, a new controversy emerged over how federal agencies communicate during political standoffs. Pop-ups and banners appeared on agency websites blaming one side of Congress for the funding lapse, prompting questions about whether such messaging violated federal rules meant to keep government communications neutral. The episode has drawn bipartisan concern and renewed scrutiny of the Hatch Act, a 1939 law that governs political activity in federal workplaces.

The Shutdown and Federal Website Pop-ups

The government shutdown began after negotiations over the federal budget collapsed. Republicans, who control both chambers of Congress, needed Democratic support in the Senate to pass a series of funding bills, or Continuing Resolutions, but failed to reach an agreement before the deadline. In the hours before the shutdown took effect, the Department of Housing and Urban Development, or HUD, posted a full-screen red banner stating, “The Radical Left in Congress shut down the government. HUD will use available resources to help Americans in need.” Users could not access the website until clicking through the message.


Congress Must Lead On AI While It Still Can

Last month, Matthew and Maria Raine testified before Congress, describing how their 16-year-old son confided suicidal thoughts to AI chatbots, only to be met with validation, encouragement, and even help drafting a suicide note. The Raines are among multiple families who have recently filed lawsuits alleging that AI chatbots were responsible for their children’s suicides. Those deaths underscore an argument now playing out in federal courts: artificial intelligence is no longer an abstraction of the future; it is already shaping life and death.

And these teens are not outliers. According to Common Sense Media, a nonprofit dedicated to improving the lives of kids and families, 72 percent of teenagers report using AI companions, often relying on them for emotional support. This dependence is developing far ahead of any emerging national safety standard.

With millions of child abuse images reported annually and AI creating new dangers, advocates are calling for accountability from Big Tech and stronger laws to keep kids safe online.

Parents: It’s Time To Get Mad About Online Child Sexual Abuse

Forty-five years ago this month, Mothers Against Drunk Driving held its first national press conference, and a global movement to stop impaired driving was born. MADD was founded by Candace Lightner after her 13-year-old daughter was struck and killed by a drunk driver while walking to a church carnival in 1980. Terms like “designated driver” and the slogan “Friends don’t let friends drive drunk” came out of MADD’s campaigning, and a variety of state and federal laws, like a lowered blood alcohol limit and a raised legal drinking age, were instituted thanks to its advocacy. Over time, social norms evolved, and driving drunk was no longer seen as a “folk crime” but as a serious, conscious choice with serious consequences.

Movements like this one, started by fed-up, grieving parents working with law enforcement and lawmakers, have lowered road fatalities nationwide, inspired similar campaigns in other countries, and saved countless lives.
