AI Progress Delayed Is Progress Denied

Opinion


Earlier this summer, I recorded an episode of the Scaling Laws podcast with MacKenzie Price, founder of Alpha Schools—schools “where kids crush academics in two hours, build life skills through workshops, and thrive beyond the classroom.” The secret is AI, but likely not the sort of AI that comes to mind.

Students at Alpha Schools work with “adaptive AI” that allows 1:1 learning at the pace necessary to master a subject, moving at the speed of the student’s learning rather than that of the entire class. By relying on AI to set that tempo, the school shaves hours off the traditional classroom model and reallocates that time to activities that allow students to more fully explore their interests, from horseback riding to documentary filmmaking.


This approach also allows far more individualized communication between Alpha’s teachers, or “guides,” and students. Price asserted that guides and students have around two dozen 1:1 meetings over the course of the year. By contrast, she noted that teachers in traditional classrooms spend only a few minutes in total with each student over an entire year.

If all of this sounds too good to be true, Price has the evidence to make the case for her approach. She can easily list all the universities that Alpha graduates are headed off to and can quickly share how students have managed incredible knowledge gains in a short amount of time. If pressed, Price will let you know they have troves of empirical data on student success because Alpha is serious about the importance of using data to continually improve their adaptive AI system.

Alpha is available to students in Austin, Miami, and, soon, even more communities. Of course, access to such an innovative and controversial model comes at a price. It’s about $40,000 per year to attend an Alpha School. Unsurprisingly, that price point leaves a lot of families stuck with the traditional model.

Most schools operate as if it were 1975, rather than 2025. Our students are stuck in institutions built in bygone eras and trapped in pedagogical practices that aimed to train reliable factory workers rather than the interdisciplinary and thoughtful leaders we need in the age of AI.

I’m the “fun uncle” or “funcle” to several nieces and nephews. When I catch up with my friends about how these youngsters are doing, I hear about students struggling to get the attention they need to make reliable progress. For instance, my nephew Tommy (not his real name) recently struggled through a math lesson on fractions. Tommy’s teacher, managing 28 other students, had maybe 30 seconds to spend with him before moving on. Tommy fell further behind, his confusion deepening with each passing day.

Meanwhile, on a tablet in the school's unused computer lab, an AI tutor akin to the one used at Alpha sat dormant—one capable of detecting exactly where Tommy’s understanding broke down, adjusting its approach in real-time, and working with him until the concept clicked. The technology to give Tommy what wealthy families have always bought their children—personalized, patient, adaptive instruction—was right there. But school policies, procurement red tape, and institutional inertia kept it locked away.

This scene plays out millions of times daily across America, and it represents something more troubling than inefficiency. We're not just failing to help Tommy learn about improper fractions as well as we could; we're actively choosing to let him struggle when we know exactly how to help him succeed. And Tommy isn't alone. Across education, healthcare, and justice, we're systematically denying ourselves the transformative benefits of artificial intelligence by clinging to institutions designed for a world that no longer exists.

In a few years, it's likely that Tommy's school and schools just like his will get access to a generic version of Alpha's AI tool. Some will say that's soon enough—after all, we cannot expect all schools to overhaul their systems to look more like Alpha's. I say that's a load of hooey. Our charge is to give our students and all future generations the tools required to thrive today and for all foreseeable tomorrows. State education departments spend tens of thousands of dollars per year on each student. Those funds should not continue to subsidize a flawed and antiquated approach to education. As large enterprises, school districts are in a position to bargain with AI companies for discounted tools. They also have the means to train teachers and demand the adoption of new tools.

Will such a change be easy? No. But it is not optional. Progress delayed is progress denied. Each day that Tommy and others are stuck in classrooms of the past, they are missing out on new knowledge and the opportunities afforded by that knowledge.

Kevin Frazier is an AI Innovation and Law Fellow at Texas Law and the author of the Appleseed AI Substack.
