Recognizing the negative effect misinformation can have on voters, Minnesota passed a law in 2023 designed to punish a specific type of deception: deepfakes. The law covers any use of technology to misrepresent a candidate by creating images or footage of them saying or doing something they never did, and it targets those who create and disseminate such deepfakes. According to a lawsuit filed by X Corporation (X), the law is unconstitutional.
The suit alleges violations of the First and Fourteenth Amendments and argues that the Minnesota law's vague wording would place an undue burden on X's moderation of content posted on its platform. It also alleges that Minnesota Attorney General Keith Ellison holds a personal vendetta against X and its owner.
Stopping the Spread of Misinformation
As the use of generative artificial intelligence (GAI) continues to expand, so does the possibility that it can be harnessed for nefarious ends. Advanced GAI tools can create believable videos, still images, and audio of any person they have enough information about. These deepfakes can be used to confuse, deceive, and maliciously harm anyone, especially those running for public office.
In 2023, the Minnesota Legislature passed a law focused on preventing election deepfakes and punishing those who create and disseminate them. If a deepfake is made without the candidate's consent, both the person who creates it and those who contract to disseminate it are subject to fines and possible jail time if they act with reckless disregard.
Specifically, it makes it a crime to share a deepfake that falsely impersonates someone within 90 days of an election if the person both:
- Knows or should have known it was a deepfake
- Acts with intent to harm a candidate or influence an election
While Minnesota was the first state to enact such a law, several other states have enacted similar measures, although some only make it a civil penalty, not a criminal one.
X Marks the Spot
X's Terms of Service prohibit deepfake content posted on its platform. It does not "support, endorse, represent, or guarantee the completeness, truthfulness, accuracy, or reliability of any content." In a subchapter, the ToS states that users "may not share inauthentic content on X that may deceive people or lead to harm." While the term "deepfake" is not used, X specifies that media depicting a real person that has been fabricated or simulated through AI is prohibited.
X goes on to indicate that if it can't "reliably determine" whether manipulated media is misleading, it won't act on the media, which can then continue to be shared.
The law would not hold X itself liable for failing to remove content that violates Minnesota law. However, because the law punishes those who "disseminate" such AI-generated images, sharing such content could lead to criminal liability. Musk, who is fond of sharing X users' AI-generated content, for example an image of Kamala Harris in a Soviet Russia military uniform, could therefore run afoul of the law if he shares AI-generated images with the intent to harm a candidate.
X's Arguments
X's lawsuit alleges that the Minnesota law's wording is so vague that it will cause social media platforms to resort to blanket censorship, smothering political speech that should be protected by the Constitution. X argues that its "Community Notes" feature, which allows users to comment on posted content, is a viable alternative for flagging deepfake content. It also claims its AI chatbot can help explain the content to viewers.
X has demanded a jury trial and is seeking a permanent injunction enjoining Minnesota from enforcing the law. To date, no other social media platform has tried to join X's lawsuit or filed one of its own against the Minnesota deepfake law. However, a Minnesota social media content creator, Christopher Kohls, sued Minnesota in 2024. Kohls argues that the law could lead to criminal prosecution even for political satire or parody accounts. That lawsuit is still underway in a federal court in Minnesota.
Minnesota Attorney General's Office Reviewing
The Minnesota Attorney General's Office issued a press release stating that the lawsuit was being reviewed. Minnesota State Senator Erin Maye Quade told a local CBS affiliate that the legislation is narrowly tailored enough to withstand the courts' strict scrutiny.
Meanwhile, the Minnesota Legislature has been trying to expand the law to include "nudification," the creation of false intimate images, as a type of "revenge porn."
Related Resources
> Deepfake Dilemma: What Are Lawmakers Doing About AI Rip-Offs? (FindLaw's Law and Daily Life)
> Tips For Catching Deepfakes in Evidence (FindLaw's Practice of Law)
> Social Media Privacy Laws (FindLaw's Consumer Protection Law)