AI leaves us no choice but to learn from the past


Frazier is an assistant professor at the Crump College of Law at St. Thomas University. He previously clerked for the Montana Supreme Court.

Generations from now, historians will wonder why we chose to leave certain communities behind while allowing technology to race ahead. They’ll point out that we had plenty of examples to learn from and yet prioritized “progress” over people.

The lessons we could have learned are tragically obvious. In the 1850s and ’60s, the telegraph took off, and people assumed it would serve a great civic purpose. Instead, Western Union captured the market, jacked up message rates and restricted the use of this powerful technology to the already powerful. Next, in the 1990s and 2000s, the internet inspired us all to dream of a more connected future. Instead, a digital divide formed, leaving certain communities and individuals without meaningful access to an increasingly essential technology. Others could surely add similar examples.


Clear steps could have mitigated those outcomes. Generations of postmasters general called for increasing access to the telegraph network; Congress said it was too expensive. Likewise, for decades advocates have been calling on the government to invest in the infrastructure necessary to bring reliable, high-speed internet to every home; again, equal access to opportunity was deemed too costly.

So far, the introduction of artificial intelligence seems to fit this pattern: Despite its potential to benefit billions, it has been harnessed by those who already profited from the last technological advance. And while it’s true that some AI use cases may have tremendous benefits for all, those benefits seem likely to go first to those who are already financially secure and technologically savvy.

Reversing the historical trend of technological progress widening and entrenching inequality won’t be cheap, but it’s imperative if we want AI to live up to its potential.

First things first, we have to make sure all Americans have access to the internet. COVID-19 reminded us of the digital divide and, for a brief moment, led to massive government spending that helped increase access to educational, cultural and professional opportunities. That funding now appears to be going the way of the dodo. President Joe Biden ought to insist that internet access be a core part of our national AI strategy. Unless access is made a priority, we’re bound to repeat a problematic past.

Second, the government should at a minimum nudge, and more appropriately subsidize, the development of AI models that specifically address the needs and challenges of communities that have traditionally been on the losing end of similar advances.

Third, AI labs should release annual societal impact statements. Such reports would give policymakers and the public a chance to evaluate whether the pros of AI advances really outweigh the cons.

All of this will cost money, require time and (likely) slow the pace of AI development and deployment. Nevertheless, it's an investment in our community and our collective potential. If any of the three steps above are pursued, my hunch is that history will celebrate a shift in our priorities from profit and “progress” to people and patience.

What’s clear is that we cannot afford to stick with the traditional playbook. Technology must always be viewed as a tool — one we can deploy, delay and ... gasp ... decide to forgo. That’s right: AI is not the solution to everything, and it should not be allowed to upend every aspect of our individual and collective affairs.

Learning from the past is dang hard. But there’s still time for us to redirect the future by reorienting our approach to AI in the present. For too long, certain Americans have been digitally forgotten; AI has given us a chance to remind ourselves of the importance of aligning technology with the public interest.
