Deepfakers beware: Do it in California or Texas and you'll be in deep trouble
California has decided to throw a flag on people who post deepfake videos of candidates running for public office.
Gov. Gavin Newsom has signed legislation that prohibits distribution of these artificially created or manipulated videos within 60 days of an election unless the video carries a statement disclosing it has been altered. Texas enacted a similar law late last month.
That the nation's most populous state, where lawmaking power is entirely in Democrats' hands, would mirror a new policy in the third-largest state, formulated entirely by Republicans, is a clear indicator that the new world of deepfakes is causing big-time bipartisan worry among politicians. But some experts question whether the laws will survive legal challenges.
Deepfake is a relatively new weapon in the increasingly nasty arsenal of attacks used on candidates and politicians.
The concept received broad attention after a video of Speaker Nancy Pelosi was slowed down in a way that made it appear she was slurring her speech. President Trump then tweeted a link to the video along with this executive summary: "PELOSI STAMMERS THROUGH NEWS CONFERENCE."
Experts said this was not technically an example of a deepfake, a technique that uses artificial intelligence to synthesize human images, often by superimposing existing images onto source images or videos.
Regardless, concern has been growing about the use of deepfakes by foreign adversaries or home-grown political operatives as a way to manipulate campaigns.
A study released this week by Deeptrace said the firm found nearly 15,000 deepfake videos on the internet, nearly twice the number it identified in December 2018. Most of those were pornography.
A second bill signed by Newsom last week gives California residents the right to sue people who distribute sexually explicit videos using their image without their consent.
"Voters have a right to know when video, audio, and images that they are being shown to try to influence their vote in an upcoming election have been manipulated and do not represent reality," said the author of California's deepfake bill, Democratic state Rep. Marc Berman. "In the context of elections, the ability to attribute speech or conduct to a candidate that is false — that never happened — makes deepfake technology a powerful and dangerous new tool in the arsenal of those who want to wage misinformation campaigns to confuse voters."
Legal scholars, however, question whether the California and Texas laws violate free speech protections.
"I believe there are serious constitutional questions with the new law," wrote Rick Hasen, professor at the University of California, Irvine law school.
An increasing number of the country's largest publicly traded companies are disclosing more than ever about political spending that the law permits them to keep secret.
That's the central finding of the fifth annual report from a group of academics and corporate ethicists, who say the average score among the biggest companies traded on American exchanges, the S&P 500, has gone up each year since 2014.
Though corporate political action committees must disclose their giving to candidates, those numbers are very often dwarfed by the donations businesses make to trade associations and other outside groups, which have driven much of the steady rise in election spending. Conservatives say robust disclosure of these behaviors is the best form of regulating money in politics and is working fine, and that this new report reflects as much. Those who favor more assertive federal regulation of campaign finance will argue such corporate transparency is inconsistent and inadequate to the task, and that the new report underscores that.
A year from the presidential election, U.S. intelligence agencies have adopted a new framework for how they will inform candidates, groups and the public about attempts to disrupt our country's elections by foreign operatives.
But the one-page summary of the plan, released late last week, is so general that it remains unclear what the intelligence community plans to do if and when it discovers something suspicious.
The summary by the director of national intelligence states that the federal government will "follow a process and principles designed to ensure, to the greatest extent possible, that notification decisions are consistent, well-informed and unbiased."
The new framework is designed to prevent a repeat of some of what happened after the 2016 election.