Overcoming AI voice cloning attacks on election integrity

Computer image of a person speaking (ArtemisDiana/Getty Images)

Levine is an election integrity and management consultant who works to ensure that eligible voters can vote, free and fair elections are perceived as legitimate, and election processes are properly administered and secured.

Imagine it’s Election Day. You’re getting ready to go vote when you receive a call from a public official telling you to vote at an early voting location rather than your Election Day polling site. So, you go there only to discover it’s closed. Turns out that the call wasn’t from the public official but from a replica created by voice cloning technology.

That might sound like something out of a sci-fi movie, but many New Hampshire voters experienced something like it two days before the 2024 presidential primary. They received robocalls featuring a deepfake simulating the voice of President Joe Biden that discouraged them from participating in the primary.


To be sure, there’s no indication that the fake Biden robocalls had a discernible impact on the New Hampshire primary, but the incident is a stark reminder of the growing threat posed by tactics like this, which are increasingly being used by malign actors to target elections not only in the U.S. but in Slovakia, Argentina and elsewhere.


As artificial intelligence tools become more accessible and affordable, deepfake attacks (of which voice cloning is only one example) are becoming more frequent. How can voters protect themselves from similar efforts to ensure that they make informed decisions for the November general election? Here are a few tips:

1. Avoid answering calls from unknown numbers: Picking up a call from an unknown number increases the likelihood of falling for a scam. Additionally, if you answer a call from an unknown number and speak, a scammer can record your voice and use it to create cloned scam calls to trick your family members and friends.

2. Verify the caller’s identity: If you do answer a call that raises suspicion, take steps to verify the caller’s identity. Several New Hampshire voters did this after receiving the Biden robocall and were able to confirm that the voice was fake. Try to contact the person (or their campaign) through an alternative channel to confirm that the call was actually from the person or organization it purported to be from.

3. Report potential voice cloning: If you suspect you have received an AI voice scam call, contact the appropriate authorities so they can use their expertise to investigate further. This can help address your case, as well as others, and deter similar behavior in the future. After New Hampshire voters alerted law enforcement and their attorney general about the robocall that used AI to impersonate Biden, the alleged culprit was identified and charged with 13 counts of voter suppression, a felony, and 13 counts of impersonating a candidate, a misdemeanor. He also faces a proposed $6 million fine from the Federal Communications Commission.

4. Educate yourself: Knowledge is your best defense against emerging threats. Take the time to educate yourself and those around you about the dangers of voice cloning. Be skeptical of unsolicited calls, especially if they involve urgent requests, offer suspicious information or try to get you to engage in behavior that sounds “off” (like sending gift cards to supposed relatives or friends).

5. Rely on trusted sources: Our information ecosystem is awash in lies and inaccurate information, but at least in the elections space we know whom to seek out for accurate information about the administration of elections: state and local election officials (and those who support their efforts).

6. Make a plan to vote in advance of Election Day: Devising a voting plan allows you to confirm when, where and how you can vote. It also enables you to consider alternatives in case your preferred way of voting does not work out because of something unforeseen, like an illness. Finally, planning makes it less likely that you’ll be tricked by something like a voice cloning attack, even if it appears real.

Voice cloning attacks are part of the “new frontier” in malign efforts to meddle in U.S. elections. By staying informed, establishing safeguards, and remaining skeptical of unexpected communications, voters can increase their chances of thwarting these threats before they cause real damage.
