Radwell, a business executive and global thought leader in consumer marketing, is the author of American Schism, the 2022 International Book Award Winner for Best General Nonfiction. He speaks often on the civic crisis we face and how we can reestablish a more rational discussion to replace our divisive contemporary political discourse.
The recent news is that Facebook’s parent company, Meta, posted record earnings that have driven its stock up more than 20 percent. In the world of social media, Facebook is ubiquitous. From its humble beginnings as a Harvard student’s pet project, it has grown to 2.89 billion users worldwide, making it the largest social network—roughly 600 million users ahead of its closest rival.
While Facebook founder Mark Zuckerberg began the site as a way for college students to connect with one another, it quickly ballooned into a place where people get information not only about what their friends are doing, but also about what’s going on in the world. A recent Pew Research study found that 36 percent of American adults regularly use Facebook to get their news. By comparison, about 23 percent reported using YouTube for news, and 15 percent use Twitter.
In American Schism, winner of the 2022 International Book Award for Best General Nonfiction, I discuss how the media incentives in both digital and cable “news” are misaligned with the public good of accurate information. While this is a complex problem, there are myriad ways that better incentive systems can be created.
In the case of Facebook, the problem is twofold. First, because Facebook’s algorithm—the set of rules that determines what content is displayed—is driven by what users interact with through likes, comments, and shares, it creates a lopsided picture of the news. More outrageous content tends to get noticed more, which generates more interactions and sets off a self-perpetuating promotion cycle. This has nothing to do with newsworthiness and everything to do with using effective stimuli to elicit more clicks, and thus greater advertising revenue for Facebook. Furthermore, Eli Pariser, author of The Filter Bubble: What the Internet Is Hiding from You, found that users are less likely to interact with posts that present viewpoints counter to their own, which significantly impacts public discourse. “The most serious political problem posed by filter bubbles is that they make it increasingly difficult to have a public argument,” Pariser wrote. “As the number of different segments and messages increases, it becomes harder and harder for the campaigns to track who’s saying what to whom.”
The second issue is that because Facebook isn’t bound by the ethical standards of journalism or information sharing—standards the mainstream media arguably has a tenuous grasp on as it is—the chance of encountering heavily biased or downright false information is high. Unlike a traditional news organization, which uses professional judgment to determine which stories get prominent attention and abides by journalistic standards of verification, Facebook’s algorithm is tuned to serve high-interest content that will garner the most engagement—regardless of its accuracy. According to a Washington Post report, news publishers known for releasing misinformation got six times more “likes, shares, and interactions” on the platform than trustworthy news sites.
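To make these two dynamics concrete, here is a minimal, purely illustrative sketch in Python. It is not Facebook’s actual system—every post, weight, and exposure rule below is hypothetical—but it shows how ranking a feed solely by engagement can reward whatever provokes the most reactions, feed back on itself, and never once consult accuracy.

```python
# Toy model of engagement-based feed ranking. This is NOT Facebook's actual
# algorithm; the posts, weights, and exposure rule below are all hypothetical.
from dataclasses import dataclass


@dataclass
class Post:
    text: str
    accurate: bool          # never consulted by the ranking below
    likes: int = 0
    comments: int = 0
    shares: int = 0

    def engagement_score(self) -> int:
        # The score reflects reactions only; accuracy and newsworthiness play no part.
        return self.likes + 2 * self.comments + 3 * self.shares


def rank_feed(posts: list[Post]) -> list[Post]:
    """Order the feed purely by engagement score, highest first."""
    return sorted(posts, key=Post.engagement_score, reverse=True)


def simulate_round(posts: list[Post]) -> None:
    """One feedback cycle: higher-ranked posts get more exposure, so they
    collect engagement faster, which lifts their rank in the next cycle."""
    for slot, post in enumerate(rank_feed(posts)):
        exposure = len(posts) - slot        # top slots are seen by more users
        post.likes += exposure
        post.shares += exposure // 2


if __name__ == "__main__":
    feed = [
        Post("Measured, accurate policy analysis", accurate=True, likes=6),
        Post("Outrage-bait rumor", accurate=False, likes=8, shares=3),
    ]
    for _ in range(3):
        simulate_round(feed)
    for post in rank_feed(feed):
        print(f"{post.text}: score {post.engagement_score()}")
```

Run for a few cycles, the post that started with a small engagement edge pulls further ahead each round, whether or not it is accurate—the self-perpetuating promotion cycle described above.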
Even more disturbing is how Facebook began serving ads that invited users to “like” certain media outlet pages, which caused a significant increase in traffic to these sites. In 2013, Buzzfeed reported a 69 percent bump in page views linked from Facebook. While news outlets appreciated additional readership, they also came to understand Facebook was able to control WHERE its users got their information. “Across the landscape, it began to dawn on people who thought about these kinds of things: Damn, Facebook owns us. They had taken over media distribution,” Alexis C. Madrigal wrote in The Atlantic.
With so much influence over what news gets distributed to whom, Facebook has demonstrated its ability to shape not only its users’ opinions, but also the fate of democracy.
How do we tame this social media juggernaut?
Trust Busting
If we look to history, we can find several examples of companies growing too big for the public good. These monopolies, which occur when a company becomes the sole supplier of a particular product, eliminate competition in the marketplace and, as a result, can take advantage of their customers.
The most famous examples of monopolies in US history are J.P. Morgan’s Northern Securities Company and John D. Rockefeller’s Standard Oil Company, both of which bought up smaller competitors to control their respective markets.
Congress passed the Sherman Antitrust Act in 1890 to curb such monopolies, but the law was largely toothless until Teddy Roosevelt began putting it to use. Roosevelt’s “trust busting” became something for which his administration would be remembered.
This moniker is a bit of a misnomer, because Roosevelt did not always believe big business to be bad. As long as it grew through legitimately outmatching its competition, he felt it served a public good. But if a business grew through unfair or unreasonable practices, Roosevelt felt the government should intervene in order to protect the public.
The Northern Securities Company, which dominated the railroad industry, was ordered to dissolve following a 1904 Supreme Court ruling that the company was violating the Sherman Antitrust Act. Likewise, in 1911, Standard Oil was ordered to break into 34 separate companies following a Supreme Court decision that determined the company’s majority market share had come as a result of economic threats against competitors and secret deals with railroads.
In more recent history, a 1982 settlement of a US Justice Department antitrust suit forced AT&T—the dominant provider of telephone service in the country through its Bell System—to break up, spinning off its local operations into seven independent regional companies in 1984. The breakup of the so-called “Baby Bells” allowed competitors to re-enter the markets for telephone service and related equipment.
Many would argue that Facebook is a textbook example of a monopoly, but the Federal Trade Commission has so far struggled to prove it. The FTC filed suit against the company in 2020, but because of the way it framed its case, it could not establish that Facebook controls a dominant share of the market for personal social networking services.
The reason the initial filing failed, according to a TechCrunch report, is that the agency did not factor messaging apps—like the Facebook-owned WhatsApp—into the equation. Were that data included, Facebook would control over 90 percent of the market. The FTC re-filed its lawsuit in August of this year, shortly before Zuckerberg announced that the Facebook company would rebrand itself as Meta.
The parallels between the monopolies of the early 20th century and Facebook are striking. Instead of competing with alternatives, Facebook has in essence pulled them into its machine—or attempted to, as when it made bids to buy both Twitter and Snapchat. “Facebook has, for many years, continued to engage in a course of anti-competitive conduct with the aim of suppressing, neutralizing, and deterring serious competitive threats to Facebook,” the FTC suit alleges.
Time For Change
With evidence that Facebook has become powerful enough to shape an election or sway public opinion, the company has demonstrated it is too big for the public good. Furthermore, from both a civic and an economic perspective, accurate information is a public good. A student of microeconomics would readily recognize that this strengthens the case for government action. While I am not always in favor of government intervention, when there is a factual basis for monopoly behavior that limits society’s consumption of a public good, limited regulation is necessary.
There should absolutely be competing models of social media communication—ideally, models that value accuracy over salaciousness.
If the FTC suit and the UK government’s recent order for Facebook to sell Giphy are any indication, Facebook’s reign as so many people’s primary source of information will soon come to a close. Until then, it’s important for users to understand that the information they receive from Facebook shouldn’t be wholly trusted. Before sharing a story, consider the source and verify its accuracy against other reputable sources. If it seems too scandalous to be true, it probably is.
Your Turn
How do you think the government—and the public—should handle Facebook’s monopoly?
Please feel free to share your thoughts with the author at: sradwell@SethDavidRadwell.com.