Large bipartisan majorities favor government regulation of AI

The U.S. Capitol with a tech background. Greggory DiSalvo/Getty Images

Kull is program director of the Program for Public Consultation. Lewitus is a research analyst at Voice of the People. Thomas is vice president of Voice of the People and director of Voice of the People Action.

As the House of Representatives’ new Task Force on Artificial Intelligence considers how the government should address AI issues, such as election-related deepfakes and bias in algorithms, a new survey finds very large bipartisan majorities favor giving the federal government broad powers to regulate AI.

Voters endorse seven proposals currently under consideration in Congress and the executive branch for regulating AI-generated deepfakes and AI that makes decisions with the potential for harm. They also favor international treaties prohibiting AI-controlled weapons and establishing an international agency to regulate large-scale AI projects.


The survey was conducted by the Program for Public Consultation at the University of Maryland’s School of Public Policy. As in all public consultation surveys, to ensure that respondents fully understood the issues around AI, they were given in-depth briefings and arguments for and against each proposal, reviewed by experts on each side of the debate.

Creating new laws for AI-generated deepfakes registered overwhelming bipartisan support. Advancements in AI have made it easy to create hyper-realistic images, video and audio. All three deepfake proposals surveyed garnered the support of more than eight in 10 Republicans and Democrats.

Large bipartisan majorities also favor three proposals for closely regulating AI programs that make decisions with significant impacts on people’s lives, including in health care, banking, hiring, criminal justice and welfare, in a manner similar to the way the FDA regulates drugs. There is evidence that some of these programs have violated regulations, shown bias (by race, gender or age, for example) and caused material harm to individuals.

More than seven in 10 voters favor proposals that would:

  • Require that these AI programs pass a test before they can be put into use, evaluating whether they may violate regulations, make biased decisions or have security vulnerabilities (national 81 percent, Republicans 76 percent, Democrats 88 percent).
  • Allow the government to audit programs that are in use, and require the AI company to fix any problems that are found (national 77 percent, Republicans 74 percent, Democrats 82 percent).
  • Require AI companies to disclose information to the government about how the decision-making AI was trained, if requested, to aid with pre-testing and audits (national 72 percent, Republicans 67 percent, Democrats 81 percent).

These proposals come from the Algorithmic Accountability Act, and mirror regulations in the European Union’s Artificial Intelligence Act.
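
To make concrete what an audit for biased decision-making might check, here is a minimal, hypothetical sketch in Python. It is not drawn from the Algorithmic Accountability Act, the EU act or the survey; the group names, data and the four-fifths threshold (a common disparate-impact heuristic) are illustrative assumptions. It simply compares an AI system's approval rates across demographic groups and flags any group whose rate falls below 80 percent of the highest group's rate.

```python
from collections import defaultdict

def selection_rates(decisions):
    """decisions: iterable of (group, approved) pairs from a decision system's output log."""
    totals, approved = defaultdict(int), defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        approved[group] += int(ok)
    return {g: approved[g] / totals[g] for g in totals}

def four_fifths_check(rates, threshold=0.8):
    """Flag groups whose approval rate is below 80% of the highest group's rate."""
    best = max(rates.values())
    return {g: (r / best) >= threshold for g, r in rates.items()}

# Hypothetical audit log for an AI lending model (illustrative data only).
log = ([("group_a", True)] * 80 + [("group_a", False)] * 20
       + [("group_b", True)] * 55 + [("group_b", False)] * 45)

rates = selection_rates(log)
print(rates)                      # {'group_a': 0.8, 'group_b': 0.55}
print(four_fifths_check(rates))   # group_b fails: 0.55 / 0.8 ≈ 0.69 < 0.8
```

Real pre-deployment tests and audits under these proposals would cover much more, including security vulnerabilities and regulatory compliance, but the basic pattern is the same: measure outcomes by group and flag disparities for the company to fix.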

Creating a new federal agency for AI to enforce regulations, oversee development and provide guidance on policy is supported by 74 percent of voters (Republicans 68 percent, Democrats 81 percent). This proposal is based on the Digital Platforms Commission Act.

In the international realm, Americans also support the creation of an international regulatory agency for large-scale AI, modeled after the International Atomic Energy Agency, as proposed by OpenAI, New York University professor Gary Marcus, and U.N. Secretary-General António Guterres. A large bipartisan majority (77 percent) favors the creation of such an international agency that would develop standards for large-scale AI and have the authority to monitor and inspect whether those standards are being met (Republicans 71 percent, Democrats 84 percent).

Also in the international realm, Americans support creating a treaty to prohibit the development of weapons that can use AI to fire on targets without human control – called lethal autonomous weapons – as has been called for by the International Committee of the Red Cross and the Campaign to Stop Killer Robots. A large bipartisan majority (81 percent) favors the U.S. actively working to establish such a treaty and creating an international agency to enforce that prohibition (Republicans 77 percent, Democrats 85 percent).

Clearly, Americans are seriously concerned about the current and potential harms from AI. Large majorities of Republicans as well as Democrats favor creating robust federal and international agencies to regulate AI and protect people from deepfakes, biased decision-making and other potential harms.

When respondents evaluated arguments for and against each of the above proposals, larger majorities of both Republicans and Democrats found the arguments in favor of regulation convincing. However, majorities also found many of the arguments against convincing, including that regulation will stifle innovation, that prohibitions violate freedom of expression and that international agencies may abuse their power.

Americans are wary of government regulation, but they are clearly more wary of the unconstrained development and use of AI.

The survey was fielded online February 16-23, 2024, with a representative non-probability sample of 3,610 registered voters provided by Precision Sample from its larger online panel. The confidence interval varies from +/- 1.4 to 1.8 percent.
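
For readers curious how a margin of that size relates to the sample, the sketch below applies the textbook margin-of-error formula for a proportion, z * sqrt(p(1-p)/n), at a 95 percent confidence level. This is only a rough illustration, not the report's actual methodology, which uses a weighted non-probability panel; the published range also reflects how the observed proportions vary across questions.

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for an observed proportion p with sample size n."""
    return z * math.sqrt(p * (1 - p) / n)

# Full registered-voter sample reported in the survey.
n_total = 3610

# Margins are widest near p = 50% and narrow toward the extremes.
for p in (0.50, 0.74, 0.81):
    print(f"p = {p:.0%}: +/- {margin_of_error(p, n_total):.1%}")
```

Smaller partisan subsamples carry correspondingly larger margins than the full sample.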

Read the full report and the questionnaire.
