We may face another 'too big to fail' scenario as AI labs go unchecked

Our stock market pivots on the performance of a handful of AI-focused companies like Nvidia. (Photo of Nvidia headquarters: hapabapa/Getty Images)

Frazier is an assistant professor at the Crump College of Law at St. Thomas University and a Tarbell fellow.

In the span of two or so years, OpenAI, Nvidia and a handful of other companies essential to the development of artificial intelligence have become economic behemoths. Their valuations and stock prices have soared. Their products have become essential to Fortune 500 companies. Their business plans are the focus of the national security industry. Their collapse would be, well, unacceptable. They are too big to fail.

The good news is we’ve been in similar situations before. The bad news is we’ve yet to really learn our lesson.


In the mid-1970s, a bank known for its conservative growth strategy decided to more aggressively pursue profits. The strategy worked. In just a few years the bank became the largest commercial and industrial lender in the nation. The impressive growth caught the attention of others — competitors looked on with envy, shareholders with appreciation and analysts with bullish optimism. As the balance sheet grew, however, so did the broader economic importance of the bank. It became too big to fail.

Regulators missed the signs of systemic risk. A kick of the bank’s tires gave no reason to panic. But a look under the hood — specifically, at the bank’s loan-to-assets ratio and average return on loans — would have revealed a simple truth: The bank had taken on far too much risk. The tactics that fueled its go-go years left it overexposed to sectors suffering through tough economic times. Rumors soon spread that the bank was in a financially sketchy spot. It was the Titanic, without the band, to paraphrase an employee.

When the inevitable run on the bank started, regulators had no choice but to spend billions to keep the bank afloat — and to keep it from dragging the rest of the economy down with it. Of course, a similar situation played out during the Great Recession — risky behavior by a few bad companies imposed bailout payments on the rest of us.

AI labs are similarly taking gambles that have good odds of making many of us losers. As major labs rush to release their latest models, they are not stopping to ask whether we have the social safety nets ready should things backfire. Nor are they meaningfully contributing to building those necessary safeguards.

Instead, we find ourselves in a highly volatile situation. Our stock market seemingly pivots on earnings of just a few companies — the world came to a near standstill last month as everyone awaited Nvidia’s financial outlook. Our leading businesses and essential government services are quick to adopt the latest AI models despite real uncertainty as to whether they will operate as intended. If any of these labs took a financial tumble or any of the models were significantly flawed, the public would likely again be asked to find a way to save the risk takers.

This outcome may be likely, but it’s not inevitable. The Dodd-Frank Act, passed in response to the Great Recession and intended to prevent another too-big-to-fail situation in the financial sector, has been roundly criticized for its inadequacy. We should learn from its faults as we think through how to make sure AI goliaths don’t crush all of us Davids.

Some sample steps include mandating and enforcing more rigorous testing of AI models before deployment. It would also behoove us to prevent the government from relying too heavily on any one model — this could be accomplished by requiring public service providers to maintain analog processes for use in emergencies. Finally, we can reduce the economic sway of a few labs by fostering more competition in the space.

Too-big-to-fail scenarios have played out on too many occasions. There’s no excuse for allowing AI labs to become so large and so essential that we collectively end up paying for their mistakes.
