Overcoming AI voice cloning attacks on election integrity

Computer image of a person speaking. (ArtemisDiana/Getty Images)

Levine is an election integrity and management consultant who works to ensure that eligible voters can vote, free and fair elections are perceived as legitimate, and election processes are properly administered and secured.

Imagine it’s Election Day. You’re getting ready to go vote when you receive a call from a public official telling you to vote at an early voting location rather than your Election Day polling site. So you go there, only to discover it’s closed. It turns out the call wasn’t from the public official at all but from a replica of their voice created with cloning technology.

That might sound like something out of a sci-fi movie, but many New Hampshire voters experienced something like it two days before the 2024 presidential primary. They received robocalls featuring a deepfake simulating the voice of President Joe Biden that discouraged them from participating in the primary.


To be sure, there’s no indication that the fake Biden robocalls had a discernible impact on the New Hampshire primary, but the incident is a stark reminder of the growing threat posed by tactics like this, which malign actors are increasingly using to target elections not only in the U.S. but also in Slovakia, Argentina and elsewhere.


As artificial intelligence tools become more accessible and affordable, deepfake attacks (of which voice cloning is only one example) are becoming more frequent. How can voters protect themselves from similar efforts so they can make informed decisions in the November general election? Here are a few tips:

1. Avoid answering calls from unknown numbers: Picking up a call from an unknown number increases the likelihood of falling for a scam. Additionally, if you answer a call from an unknown number and speak, a scammer can record your voice and use it to create cloned scam calls to trick your family members and friends.

2. Verify the caller’s identity: If you do answer a call that raises suspicion, take steps to verify the caller’s identity. Several New Hampshire voters did this after receiving the Biden robocall and were able to confirm that the voice was fake. Try to contact the person (or their campaign) through an alternative channel to confirm that the call was actually from the person or organization it purported to be from.

3. Report potential voice cloning: If you think you may have received an AI voice scam call, contact the appropriate authorities so they can use their expertise to investigate further. This can help address your case, as well as others, and deter similar behavior in the future. After New Hampshire voters alerted law enforcement and their attorney general about the robocall that used AI to impersonate Biden, the alleged culprit was identified and charged with 13 counts of voter suppression, a felony, and 13 counts of impersonating a candidate, a misdemeanor. He also faces a proposed $6 million fine from the Federal Communications Commission.

4. Educate yourself: Knowledge is your best defense against emerging threats. Take the time to educate yourself and those around you about the dangers of voice cloning. Be skeptical of unsolicited calls, especially those that make urgent requests, offer suspicious information or try to get you to engage in behavior that sounds “off” (like sending gift cards to supposed relatives or friends).

5. Rely on trusted sources: Our information ecosystem is awash in lies and inaccurate information, but at least in the elections space we know whom to seek out for accurate information about the administration of elections: state and local election officials (and those who support their efforts).

6. Make a plan to vote in advance of Election Day: Devising a vote plan allows you to confirm when, where and how you can vote. It also enables you to consider alternatives in case your preferred plan for voting does not work out because of something unforeseen like an illness. Finally, planning makes it less likely that you’ll be tricked by something like a voice cloning attack, even if it appears real.

Voice cloning attacks are part of the “new frontier” in malign efforts to meddle in U.S. elections. By staying informed, establishing safeguards, and remaining skeptical of unexpected communications, voters can increase their chances of thwarting these threats before they cause real damage.
