Yes, consumers can change Facebook’s polarizing behavior

Opinion

It's up to social media consumers to force Mark Zuckerberg and his company to make changes, writes Aftergut. (Photo: Drew Angerer/Getty Images)

Aftergut is a former federal prosecutor in San Francisco.

The oft-recited close of T.S. Eliot’s celebrated 20th century poem “The Hollow Men” captures a truth about the fall of Rome and other tragic losses: the world ends “not with a bang but a whimper.” In human affairs, catastrophe seldom arrives via a meteor strike; it comes by a slow rollout of flaws that are part of who we are.

The 21st century update might be that the world ends not with a bang but a click: one short keystroke on a Facebook page.


(Facebook rebranded itself as “Meta” in October, but let’s continue to call it “Facebook” here.)

Many a morning routine starts with a check of a “feed” … the latest news, photos, and updates from family and friends. Our brains get an endorphin rush from likes, emoji hearts, or posted pics from a child or sister. Then there’s the unexpected outreach from a childhood crush or long-lost roommate. Oh, and those photos from Yellowstone or the Amalfi Coast.

Facebook studies neuro-marketing to match its product to our brains. As David Rock, author of “Your Brain at Work,” has written: “The circuitry activated when you connect online is the seeking circuitry of dopamine. Yet ... we don’t tend to get the oxytocin or serotonin calming reward that happens when we bond with someone in real time, when our circuits resonate with real-time shared emotions and experiences.”

Facebook has mined dopamine brain science to maximize its advertising audience. The results are chilling. The documents that Facebook whistleblower Frances Haugen recently revealed make one’s neck hair stand straight up. They tell of Facebook promoting political polarization, driving teen anxiety and feeding negative emotions to draw clicks.

On Dec. 2, we learned that Facebook has been making money selling ads comparing vaccine mandates to Nazism and the Holocaust. This follows a pattern first seen with Facebook posts promoting human trafficking. Although the ads violated Facebook policy, the company ran them, removing them only after news outlets notified Facebook that a story was about to run.

According to Facebook researchers, its algorithm for spreading posts focused on increasing anger in political communications: “Misinformation, toxicity, and violent content are inordinately prevalent among reshares.”

Facebook CEO Mark Zuckerberg resisted proposed changes: “Mark doesn’t think we could go broad” with the fix, according to the documents. “We wouldn’t launch if there was a material [business] impact.”

Internal memos also show that Facebook was aware that its subsidiary Instagram is harmful to children. Per the Wall Street Journal’s exposé, Facebook research confessed: “We make body image issues worse for one in three teen girls. Teens blame Instagram for increases in the rate of anxiety and depression.”

On Oct. 6, Zuckerberg posted a damage-control statement, saying the critique makes no sense: “We make money from ads, and advertisers consistently tell us they don’t want their ads next to harmful or angry content.”

Maybe so, but notice what he didn’t say: that ad buys have dropped, or that Facebook has taken steps to slow its spread of harmful messages. Filippo Menczer, Indiana University professor of informatics and computer science, has written about ways to do so.

In Zuckerberg’s post trying to quiet the storm, he also asked, “If social media were as responsible for polarizing society as some people claim, then why are we seeing polarization increase in the US while it stays flat or declines in many countries ... ?”

Pu-leeeeeze. No one said social media is solely responsible for polarization. Different countries have different national histories of discord, racism and xenophobia. They also have had different leaders and have applied differing governmental remedies. All these factors produce disparate levels of polarization.

Shell-game PR sophistry doesn’t get a CEO off the hook for damaging society.

Today, Instagram CEO Adam Mosseri testifies before a Senate Committee. Some say only Congress can crack down on Facebook. But it’s no newsflash that Congress is paralyzed, frozen in inaction by special-interest contributors, including Facebook. Is it sensible to leave regulating dysfunction to the dysfunctional?

The point here is that anyone troubled by Facebook putting profits over healthy children and society can replace their Facebook habit and stop adding to its profits. After all, advertising revenue turns on user numbers.

Life is not only possible after Facebook; it can be better. A 2019 report found that going off Facebook yields “more in-person time with friends and family ... less partisan fever ... [plus a] bump in one’s daily moods and life satisfaction. And ... an extra hour a day of downtime.”

With trust in Facebook falling and reasons to drop it rising, there are alternative social media sites for friends and family to join. Responsible news curators and outlets abound that don’t organize insurrections or tell us in a million ways that we’re not good-looking enough or not spending enough time in Tahiti. Plus, they don’t work overtime to addict us.

Young women who recognize the damage Instagram wreaks have started using it for messaging but not posting. For those declaring selective independence from social media addiction, support and resources on sites like Good for MEdia build healthy resistance to FOMO and peer pressure.

As for dropping a habit, I can’t say, “There’s an app for that.” But each of us, if motivated, has an aptitude for it. Together, maybe we can delay the world ending with a click or a whimper.
