The Untold Costs of AI: The West Is Paying for the Future That Hasn’t Arrived

Artificial intelligence (AI) has been heralded as a technological revolution that will transform our world. From curing diseases to automating dangerous jobs to discovering new inventions, the possibilities are tantalizing. We’re told that AI could bring unprecedented good—if only we continue to invest in its development and allow labs to seize precious, finite natural resources.

Yet, despite these grand promises, most Americans haven't experienced any meaningful benefits from AI. It has yet to meaningfully address most health issues, and for many it is not significantly improving everyday life beyond drafting emails and making bad memes. In fact, AI usage is still largely confined to a narrow segment of the population: highly educated professionals in tech hubs and urban centers. An August 2024 survey by the Federal Reserve and Harvard Kennedy School found that while 39.4% of U.S. adults aged 18-64 reported using generative AI, adoption rates vary significantly. Workers with a bachelor's degree or higher are twice as likely to use AI at work as those without a college degree (40% vs. 20%), and usage is highest in computer/mathematical occupations (49.6%) and management roles (49.0%).


For the majority of Americans, especially those in personal services (12.5% adoption) and blue-collar occupations (22.1% adoption), AI remains an abstraction, something that exists in the future rather than their present.

While the rewards of AI are still speculative, the costs are becoming increasingly tangible. And the people paying those costs are not the ones benefiting from AI today. In fact, much of the burden of AI’s development is falling squarely on the shoulders of the American West—both its people and its land. According to recent research, data centers in the United States are consuming an increasing share of the country's total electricity. These facilities, which are crucial for AI deployment, used about 3% of all U.S. electricity in 2022. By 2030, their share is estimated to grow to 9% of total U.S. electricity consumption.

This surge in energy demand is particularly significant for the Western United States, with its concentration of tech hubs and data centers. Moreover, the carbon dioxide emissions from data centers may more than double between 2022 and 2030, further intensifying the environmental impact on these regions.

Here’s why: developing and deploying AI requires enormous amounts of energy. Advanced machine learning models demand computing power on a scale that most people can barely comprehend. Recent International Energy Agency projections highlight the magnitude of this demand: global electricity consumption from data centers, cryptocurrencies, and AI is expected to reach between 620 and 1,050 terawatt-hours (TWh) by 2026. To put that in perspective, 1,000 TWh could provide electricity to about 94.3 million American homes for an entire year.
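The homes-powered comparison is easy to verify. A minimal sketch, assuming an average U.S. household uses roughly 10,600 kWh of electricity per year (a commonly cited EIA-style estimate; the exact figure varies by year and region):

```python
# Sanity check of the "1,000 TWh ≈ 94.3 million homes" comparison.
# Assumption: average U.S. household consumption of ~10,600 kWh/year.
TWH_TO_KWH = 1e9               # 1 terawatt-hour = 1 billion kilowatt-hours
AVG_HOME_KWH_PER_YEAR = 10_600  # assumed average annual household usage

projected_twh = 1_000           # upper end of the IEA projection range
homes_powered = projected_twh * TWH_TO_KWH / AVG_HOME_KWH_PER_YEAR

print(f"{homes_powered / 1e6:.1f} million homes")  # ≈ 94.3 million
```

With a different assumed household average, the headline number shifts proportionally, but the order of magnitude, tens of millions of homes, holds.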

All that energy has to come from somewhere. Increasingly, it’s coming from the West, the part of the country that has long been tapped to fuel the nation’s ambitions, from oil and gas to solar, wind, and hydropower.

This energy extraction is putting immense pressure on the West’s already strained resources. Land is being consumed, water is being diverted, and communities are being disrupted, all to keep the lights on in tech labs far removed from the realities of life on the ground. The irony is that the very regions making AI possible are the least likely to benefit from it.

The rush to ramp up energy production for AI feels eerily familiar. We’ve seen these “get rich quick” schemes before—industries that swoop into rural areas, extract valuable resources, and leave environmental and social destruction in their wake. The West has been exploited before by out-of-state interests with big promises and shallow commitments, and AI risks becoming the latest chapter in that story.

We need to have an honest conversation about the true costs of AI development—particularly when it comes to energy consumption. AI labs may talk about curing diseases and inventing new technologies, but until those breakthroughs become reality, the rest of us—especially those in the West—are left footing the bill. And right now, that bill is being paid in the form of depleted resources and communities that are being squeezed for the sake of a future that remains distant and uncertain.

The truth is, we can’t continue to deplete our resources in the hope that AI’s promises will eventually materialize. We must demand accountability and transparency from those developing AI. Where is the energy coming from? Who is being impacted? And most importantly, who will benefit?

AI’s future may hold incredible potential, but we must make sure that we’re not sacrificing the West’s present for a future that may never arrive. If AI is going to reshape our world, it must do so in a way that lifts up all Americans, not just a select few. Until then, we need to be clear-eyed about the costs—and demand better.

Frazier is an adjunct professor at Delaware Law and an affiliated scholar of emerging technology and constitutional law at St. Thomas University College of Law.

