Society needs to take a breather

Opinion

Sign saying "slow down"
gerenme/Getty Images

Frazier is an assistant professor at the Crump College of Law at St. Thomas University. He previously clerked for the Montana Supreme Court.

Patience is a virtue. It’s a simple refrain I learned well in Mrs. Campbell’s class — rather than read the instructions on a pop quiz, I rushed to start answering the questions. Turns out I missed three bonus points awarded simply for writing my name on the back of the page rather than the front. Speed, though, has become the dominant social, political and economic norm.

We need our food delivered in minutes — you can even pay to have your UberEats order prioritized over others. We need real-time updates on our social media feeds — news and nonsense from friends and foes alike, full of instantaneous “hot takes” rather than reasoned analysis. We demand our politicians deliver progress on our ideological aims now — candidates jockey to be viewed as the one most likely to succeed on short-term policy goals and partisan preferences. And, we expect and encourage companies to compete to be the “first mover” instead of the “steady and responsible actor.”

It’s time to slow things down. We need to resist our urge to act with haste. So here are a few suggestions for how we can exercise more patience — and, by doing so, save some lives, improve our politics and enhance our discourse.


First, let’s lower the speed limit just about everywhere and expand automated enforcement to ticket those who prioritize their speed over the safety of others. It’s not rocket science that fewer bad things happen when cars move slower — but don’t take my word for it. Consider that officials in Edmonton, Canada, saw a 50 percent drop in fatal and injurious crashes following a 6 mph reduction in the speed limit. These benefits can come at relatively low cost, too. The European Union mandated intelligent speed assistance systems in all new cars to increase driver awareness of excessive speed and to ease enforcement. We can and should do the same in the United States.

Next, let’s consider granting our elected officials a single, longer term. Imagine if senators served only a single, eight-year term. Do you think they might reevaluate how they spend their time in office? I sure do. Rather than waste a third or half of their day calling donors, they could spend more time talking with their colleagues about substantive reforms. And, in place of prioritizing bills they know will please their “base,” they could more thoroughly consider legislation that may not score political points but will nevertheless further the public interest.

Finally, let’s turn to social media. This may be the most obvious place where our addiction to speed has caused poor social outcomes. Speaking from unfortunate personal experience, I know it can feel impossible to pull away from the drama and debates that endlessly populate my apps. And you may be able to relate to my temptation to post hot takes just to see how folks respond. Those urges are understandable because that’s exactly what the apps are built to provoke. We can and should advocate for more responsible platform design. The Prosocial Design Network, a community of behavioral science and design experts, offers a menu of proven strategies that platforms can incorporate to make their feeds more conducive to quality deliberation — ideally, platforms would voluntarily adopt these straightforward approaches for a better social media experience.

Realizing a slower society won’t be easy. The steps mentioned above, though, can help create space for safer streets, a more responsive democracy and a more deliberative social media ecosystem. Here’s to moving slowly and thinking about things.

