Deepfake Bill That Contains Protections for Politicians Passes California State Senate
Most of us get our news in video form these days -- whether via the nightly news, Facebook uploads, or snippets spotted in our Twitter feed. And video of a political candidate can be especially powerful, demonstrating their verbal acuity, persuasive rhetorical style, or, on the other hand, mental gaffes and slips of the tongue. Imagine how compelling video of President Donald Trump passionately proselytizing for Medicare for All would be. Or Bernie Sanders admitting that, actually, he loves billionaires and can't wait to repeal income tax.
Wait, what? Those are obviously extreme examples that contradict both candidates' stated political beliefs, but both would be possible with deepfake technology -- artificial intelligence that can, for example, credibly swap Nicolas Cage's face onto Amy Adams's body. So, you can see how deepfakes have the potential to wreak havoc on an election. So could the California legislature, which last week advanced a bill that would allow political candidates to sue over deepfakes.
Media, Malice, and Material Deception
California Assembly Bill 730 would prohibit a person or organization "from distributing with actual malice materially deceptive audio or visual media of the candidate with the intent to injure the candidate's reputation or to deceive a voter into voting for or against the candidate, unless the media includes a disclosure stating that the media has been manipulated" within 60 days of an election. The bill goes on to define "materially deceptive audio or visual media" as:
...an image or audio or visual video recording of a candidate's appearance, speech, or conduct that has been intentionally manipulated in a manner such that the image or audio or video recording would falsely appear to a reasonable person to be authentic and would cause a reasonable person to have a fundamentally different understanding or impression of the expressive content of the image or audio or video recording than that person would have if the person were hearing or seeing the unaltered, original version of the image or audio or video recording.
Targeted candidates would have the ability to sue any person, committee, or other entity that shares deepfakes without warning labels, except for media outlets paid to broadcast campaign ads.
Free Speech or Fake News?
Not everyone is a fan of the proposed law. Free speech advocates, along with media organizations like the California News Publishers Association and California Cable and Telecommunications Association, oppose the legislation, although legal experts don't believe the law would unfairly burden First Amendment rights.
"Most importantly," noted Constitutional scholar Erwin Chemerinsky, "the U.S. Supreme Court has said that speech which is defamatory of public officials and public figures has no First Amendment protection if the speaker knows the statements are false or acts with reckless disregard of the truth. The Court has explained that the importance of preventing wrongful harm to reputation and of protecting the marketplace of ideas justifies the liability for the false speech."
Notably, the California statute would only apply in the Golden State, so everyone should be on the lookout for deepfakes this election season.
- Deepfake Videos Explained: What They Are and How to Identify Them (Salon)
- Criminals and Crimefighters Test the Legal Boundaries of AI Use (FindLaw Blotter)
- Can You Legally Swap Someone's Face Into Porn Without Consent? (FindLaw's Legally Weird)
- SF Bans Facial Recognition Tech -- For SF (FindLaw's Technologist)
You Don’t Have To Solve This on Your Own – Get a Lawyer’s Help
Meeting with a lawyer can help you understand your options and how to best protect your rights. Visit our attorney directory to find a lawyer near you who can help.