Opinion

Medical Schools Are Falling Behind in the Age of Generative AI

"To prepare tomorrow’s doctors, medical school deans, elected officials, and health care regulators must invest in training that matches the pace and promise of this technology," writes Dr. Robert Pearl.

Getty Images, ArtistGNDphotography

While colleges across the nation are adapting their curricula to harness the power of generative AI, U.S. medical schools remain dangerously behind.

Most students entering medicine today will graduate without ever being trained to use GenAI tools effectively. That must change. To prepare tomorrow’s doctors – and protect tomorrow’s patients – medical school deans, elected officials, and health care regulators must invest in training that matches the pace and promise of this technology.


Universities embrace AI as medical schools fall behind

Across the country, colleges and universities are reimagining how they educate students in the age of generative AI.

  • At Duke University, every new student receives a custom AI assistant dubbed DukeGPT.
  • At California State University, more than 460,000 students across 23 campuses now have access to a 24/7 ChatGPT toolkit.

These aren’t niche experiments. They’re part of a sweeping, systems-level transformation aimed at preparing graduates for a rapidly evolving workforce.

Most medical schools, however, have not kept pace. Instead of training students to apply modern tools toward clinical care, they continue to emphasize memorization — testing students on biochemical pathways and obscure facts rarely used in practice.

Early fears about plagiarism and declining academic rigor led many university departments to proceed cautiously after ChatGPT’s release in 2022. But since then, an increasing number of these educational institutions have shifted from policing AI to requiring faculty to incorporate GenAI into their coursework. And the American Federation of Teachers announced earlier this month that it would start an AI training hub for educators with $23 million from tech giants Microsoft, OpenAI, and Anthropic.

Medical education remains an outlier. A recent Educause study found that just 14% of medical schools have developed a formal GenAI curriculum, compared to 60% of undergraduate programs. Most medical school leaders and doctors still regard large language models as administrative aids rather than essential clinical tools.

This view is short-sighted. Within a few years, physicians will rely on generative AI to synthesize vast amounts of medical research, identify diagnostic patterns, and recommend treatment options tailored to the latest evidence. Patients will arrive at appointments already equipped with GenAI-assisted insights.

Used responsibly, generative AI can help prevent the 400,000 deaths each year from diagnostic errors, 250,000 deaths from preventable medical mistakes, and 500,000 deaths from poorly controlled chronic diseases. Elected officials and regulators need to support this life-saving approach.

How medical schools can catch up

In the past, medical students were evaluated on their ability to recall information. In the future, they will be judged by their ability to help AI-empowered patients manage chronic illnesses, prevent life-threatening disease complications, and maximize their health.

With generative AI capabilities doubling every year, matriculating medical students will be entering clinical practice equipped with tools over 30 times more powerful than today’s models. Yet few doctors will have received structured training on how to use them effectively.
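
To make the arithmetic behind that multiplier explicit: compounding one doubling per year over the roughly five years between matriculation and clinical practice gives 2^5 = 32. The toy calculation below is a sketch of that compounding; both the doubling rate and the five-year horizon are the article's premises, not measured benchmarks.

```python
# Toy calculation behind "over 30 times more powerful": compound an assumed
# annual doubling of capability over ~5 years from matriculation to practice.
# Both numbers are illustrative premises, not measured benchmarks.
DOUBLINGS_PER_YEAR = 1   # assumed: capabilities double every year
YEARS_TO_PRACTICE = 5    # assumed: ~5 years from matriculation to clinical practice

multiplier = 2 ** (DOUBLINGS_PER_YEAR * YEARS_TO_PRACTICE)
print(f"Relative capability at entry to practice: {multiplier}x")  # prints 32x
```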

Modernizing medical education starts with faculty training. Students entering medical school in 2025 will arrive already comfortable using generative AI tools like ChatGPT. Most instructors, however, will need to build that fluency.

To close this gap, academic leaders should provide faculty training programs before the start of the next academic year. These sessions would introduce educators to prompt engineering, output evaluation, and reliability assessment: foundational skills for teaching and applying GenAI in clinical scenarios.
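
As one hedged illustration of what such a session might cover, the sketch below pairs a structured clinical prompt template with a simple reliability checklist that faculty could score a model's answer against. The template fields, checklist items, and function names are hypothetical; they are not drawn from any existing curriculum or vendor tool.

```python
# Hypothetical faculty-training exercise: build a structured clinical prompt
# (the "prompt engineering" step) and score the model's answer against a
# reliability checklist (the "output evaluation" step). All names are illustrative.

CLINICAL_PROMPT_TEMPLATE = """You are assisting a physician in training.
Patient summary: {summary}
Task: {task}
Requirements:
- Cite the guideline or evidence you are relying on.
- State your uncertainty and what additional data would change the answer.
- Present options for clinician review; do not issue a final treatment decision."""

RELIABILITY_CHECKLIST = [
    "Names a specific guideline or evidence source",
    "States uncertainty or limitations",
    "Defers the final decision to the clinician",
    "Contains no fabricated statistics or citations",
]


def build_prompt(summary: str, task: str) -> str:
    """Fill in the structured template for a given case."""
    return CLINICAL_PROMPT_TEMPLATE.format(summary=summary, task=task)


def checklist_score(items_satisfied: set[str]) -> float:
    """Faculty tick off which checklist items the model's answer met; return a 0-1 score."""
    return len(items_satisfied & set(RELIABILITY_CHECKLIST)) / len(RELIABILITY_CHECKLIST)


if __name__ == "__main__":
    prompt = build_prompt(
        summary="45-year-old man, routine checkup, no prior problems, BP 140/100 mm Hg",
        task="Suggest an initial workup and follow-up plan for possible hypertension",
    )
    print(prompt)
    print("Reliability score:",
          checklist_score({RELIABILITY_CHECKLIST[0], RELIABILITY_CHECKLIST[2]}))
```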

Once faculty are prepared, schools can begin building case-based curricula that reflect modern clinical realities.

Sample Exercise: Managing chronic disease with GenAI support

In this scenario, students imagine seeing a 45-year-old man during a routine checkup. The patient has no prior medical problems, but on physical exam, his blood pressure reads 140/100 mm Hg.

First, students walk through the traditional diagnostic process:

  • What additional history would they obtain?
  • Which physical findings warrant follow-up?
  • What laboratory tests would they order?
  • What treatment and follow-up plan would they recommend?

Next, they enter the same case into a generative AI tool and compare its output to their own. Where do they align? Where do they differ (and, importantly, why)?
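
A simple way to structure that comparison, sketched below with entirely invented workup lists, is to diff the tests the student proposed against the tests mentioned in the GenAI output and discuss each discrepancy.

```python
# Hypothetical comparison step: diff the student's proposed workup against the
# items extracted from the GenAI output. Both lists are invented for illustration.
student_workup = {"basic metabolic panel", "lipid panel", "urinalysis", "ECG"}
genai_workup = {"basic metabolic panel", "lipid panel", "TSH", "ECG", "HbA1c"}

print("Agree on:", sorted(student_workup & genai_workup))
print("Student only:", sorted(student_workup - genai_workup))  # discuss: still needed?
print("GenAI only:", sorted(genai_workup - student_workup))    # discuss: evidence-based or noise?
```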

Finally, students design a care plan that incorporates GenAI’s growing capabilities, such as:

  • Analyzing data from at-home blood pressure monitors (a minimal sketch of this step follows the list).
  • Customizing educational guidance.
  • Enabling patients to actively manage their chronic diseases between visits.
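
As a rough sketch of the first item on that list, the example below averages a week of home readings and flags whether they warrant clinician follow-up. The readings are invented, and the 135/85 mm Hg cutoff is an assumed home-monitoring threshold used only for illustration, not clinical guidance.

```python
# Hypothetical sketch: summarize a week of at-home blood pressure readings and
# flag them for clinician follow-up. Data and the 135/85 cutoff are illustrative
# assumptions, not clinical guidance.
from statistics import mean

# (systolic, diastolic) pairs reported by the patient's home monitor
readings = [(142, 96), (138, 92), (135, 90), (144, 98), (139, 94), (137, 91), (141, 95)]

avg_systolic = mean(s for s, _ in readings)
avg_diastolic = mean(d for _, d in readings)

SYS_CUTOFF, DIA_CUTOFF = 135, 85  # assumed home-monitoring threshold (mm Hg)
needs_follow_up = avg_systolic >= SYS_CUTOFF or avg_diastolic >= DIA_CUTOFF

print(f"7-day average: {avg_systolic:.0f}/{avg_diastolic:.0f} mm Hg")
print("Flag for clinician follow-up:", needs_follow_up)
```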

This type of training – integrated alongside the traditional curriculum – prepares future clinicians not just to master the technology but also to understand how it can be used to transform medical care.

A call to government: Empower the next generation of physicians

Medical schools can’t do this alone. Because most physician training is funded through federal grants and Medicare-supported residency programs, meaningful reform will require coordinated leadership from academic institutions, government agencies, and lawmakers.

Preparing future doctors to use GenAI safely and effectively should be treated as a national imperative. Medicare will need to fund new educational initiatives, and agencies like the FDA must streamline the approval process for GenAI-assisted clinical applications.

This month, the Trump administration encouraged U.S. companies and nonprofits to develop AI training programs for schools, educators, and students. Leading tech companies — including Nvidia, Amazon, and Microsoft — quickly signed on.

If medical school deans demonstrate similar openness to innovation, we can expect policymakers and industry leaders to invest in medical education, too.

But if medical educators and government leaders hesitate, for-profit companies and private equity firms will fill the void. And they will use GenAI not to improve patient care but primarily to increase margins and drive revenue.

As deans prepare to welcome the class of 2029 (and as lawmakers face the growing costs of American health care), they must ask themselves:

Are we preparing students to practice yesterday’s medicine or to lead tomorrow’s?

Dr. Robert Pearl, the author of “ChatGPT, MD,” teaches at both the Stanford University School of Medicine and the Stanford Graduate School of Business. He is a former CEO of The Permanente Medical Group.

Read More

A person on their phone, using a type of artificial intelligence.

AI is transforming the workplace faster than ever. Experts warn that automation could reshape jobs, wages, and opportunities for millions of American workers.

Getty Images, d3sign

AI Reshapes the American Workplace—But Where Are the Jobs?

In recent years, American workers have been going through an unprecedented experiment in how we work. During the COVID pandemic and social distancing, U.S. businesses embraced the latest online technologies to vastly expand remote work. That, in turn, ushered in the slow creep of artificial intelligence (AI) applications into every crack and seam of society, including in the workplace.

If 2023 was about increasing adoption of AI coming out of the pandemic, experts are saying 2025-26 will be when companies implement deeper changes in the workplace based on ever more pervasive AI.

A child looking at a cellphone at night.

AI is changing childhood. Kevin Frazier explains why it's critical for parents and mentors to start having the “AI talk” and teach kids safe, responsible AI use.

Getty Images, Elva Etienne

The New Talk: The Need To Discuss AI With Kids

“[I]t is a massively more powerful and scary thing than I knew about.” That’s how Adam Raine’s dad characterized ChatGPT when he reviewed his son’s conversations with the AI tool. Adam tragically died by suicide. His parents are now suing OpenAI and Sam Altman, the company’s CEO, based on allegations that the tool contributed to his death.

This tragic story has rightfully caused a push for tech companies to institute changes and for lawmakers to institute sweeping regulations. While both of those strategies have some merit, computer code and AI-related laws will not address the underlying issue: our kids need guidance from their parents, educators, and mentors about how and when to use AI.


US President Donald Trump reacts next to Erika Kirk, widow of Charlie Kirk, after speaking at the public memorial service for right-wing activist Charlie Kirk at State Farm Stadium in Glendale, Arizona, on September 21, 2025.

Photo by Mandel Ngan/AFP via Getty Images

Could Trump’s campaign against the media come back to bite conservatives?

In the wake of Jimmy Kimmel’s (apparently temporary) suspension from late-night TV, a (tragically small) number of prominent conservatives and Republicans have taken exception to the Trump administration’s comfort with “jawboning” critics into submission.

Sen. Ted Cruz condemned the administration’s “mafioso behavior.” “Going down this road, there will come a time when a Democrat wins again — wins the White House … they will silence us,” Cruz warned during his Friday podcast. “They will use this power, and they will use it ruthlessly. And that is dangerous.”


Rep. Angie Craig’s No Social Media at School Act would ban TikTok, Instagram & Snapchat during K-12 school hours. See what’s in the bill.

Getty Images, Daniel de la Hoz

Congress Bill Spotlight: No Social Media at School Act

Gen Z’s worst nightmare: TikTok, Instagram, and Snapchat couldn’t be used during school hours.

What the bill does

Rep. Angie Craig (D-MN2) introduced the No Social Media at School Act, which would require social media companies to use “geofencing” to block access to their products on K-12 school grounds during school hours.
