We need to address the ‘pacing problem’ before AI gets out of control

Opinion


If we can use our regulatory imaginations, writes Frazier, "then there’s a chance that future surges in technology can be directed to align with the public interest."


Frazier is an assistant professor at the Crump College of Law at St. Thomas University. He previously clerked for the Montana Supreme Court.

The "pacing problem" is the most worrying phenomenon you've never heard of but already understand. In short, it refers to technological advances outpacing laws and regulations. It's as easy to observe as a streaker at a football game.

Here's a quick summary: It took 30 years from the introduction of electricity for 10 percent of households to be able to turn on the lights; 25 years for the same percentage of Americans to be able to pick up the phone; about five years for the internet to hit that mark; and, seemingly, about five weeks for ChatGPT to spread around the world.

Ask any high schooler and they’ll tell you that a longer deadline will lead to a better grade. Well, what’s true of juniors and seniors is true of senators and House members – they can develop better policies when they have more time to respond to an emerging technology. The pacing problem, though, robs our elected officials of the time to ponder how best to regulate something like artificial intelligence: As the rate of adoption increases, the window for action shrinks.


A little more than a year out from the release of ChatGPT, it’s already clear that generative AI tools have become entrenched in society. Lawyers are attempting to use them. Students are hoping to rely on them. And, of course, businesses are successfully exploiting them to increase their bottom lines. As a result, any attempt by Congress to regulate AI will be greeted by an ever-expanding and well-paid army of advocates who want to make sure AI is regulated only in ways that don’t inhibit their clients’ use of the novel technology.

ChatGPT is the beginning of the Age of AI. Another wave of transformational technologies is inevitable. What’s uncertain is whether we will recognize the need for some regulatory imagination. If we stick with the status quo – governance by a Congress operated by expert fundraisers more so than expert policymakers – then the pacing problem will only get worse. If we instead opt to use our regulatory imaginations, then there’s a chance that future surges in technology can be directed to align with the public interest.

Regulatory imagination is like a pink pony – theoretically easy to spot; in reality, difficult to create. The first step is to encourage our regulators to dream big. One small step toward that goal: Create an innovation team within each agency. These teams would have a mandate to study how the sausage is made, then analyze and share ways to make that process faster, smarter and more responsive to changes in technology.

The second step would be to embrace experimentation. Congress currently operates like someone trying to break the home run record – it only takes big swings, and it commonly misses. A wiser strategy would be to bunt and see if we can get any runners into scoring position; in other words, Congress should lean into testing novel policy ideas by passing laws with sunset clauses. Laws with expiration dates would increase Congress’ willingness to test new ideas and monitor their effectiveness.

Third, and finally, Congress should work more closely with the leading developers of emerging technologies. Case in point: Americans would benefit from AI labs like OpenAI and Google being more transparent with Congress about what technology they plan to release and when. Surprise announcements may please stakeholders, but companies should instead aim to minimize their odds of disrupting society. This sort of information sharing, even if not made public, could go a long way toward solving the pacing problem.

Technological “progress” does not always move society forward. We’ve got to address the pacing problem if advances in technology are going to serve the common good.

