What Legal Responsibilities Do Online Platforms Have for Curbing Violent Rhetoric?

By Joseph Fawbush, Esq.

In the wake of mass shootings and acts of terror, we often hear about manifestos, grievances, and calls to action posted online. Prior to going on his rampage, the El Paso shooter posted just such a manifesto to 8chan, an online message board. 8chan subsequently went down for a few hours the day after the attacks and has struggled to stay online since.

This is because Cloudflare, a cybersecurity company, terminated its support of the website, as reported by The New York Times. Cloudflare’s CEO openly discussed his rationale and the competing priorities behind the decision. Ultimately, however, he concluded that 8chan promoted extremism and violence and could not be supported. Many tech companies and online platforms must make similar decisions daily.

There may be an ethical imperative for tech companies to act similarly. But is there a legal one? A recent case out of the 2nd Circuit illustrates the types of issues courts are facing in an era of social media dominance and terrorist attacks.

2nd Circuit: Facebook Not Liable for Content Promoting Hamas Attacks

The 2nd Circuit case arose after Hamas posted content on Facebook that both directly and indirectly led to attacks that killed several Israeli nationals, including a 10-year-old girl. The victims and their families subsequently sued Facebook for allowing the posts to remain and for “matching” terrorist sympathizers with terrorist organizations.

Although the content at issue violated Facebook’s terms of service and Community Standards, Facebook did not remove it before the attacks. Facebook monitors posted content for terrorism-related posts and activity through artificial intelligence software, trained employees, and user reports.

Federal Law Gives Broad Immunity to an ‘Interactive Computer Service’

Under Section 230 of the Communications Decency Act, no provider of an “interactive computer service” may be treated as the publisher of content that a third party posts. This law has been widely used to immunize social media platforms and tech companies from liability for their users’ posts.

The 2nd Circuit followed these cases when it affirmed the district court’s dismissal of the lawsuit. Facebook does not edit content; it either allows posts as-is or removes them, so it is not the author of the posts. Its failure to spot and remove Hamas’s content, the court reasoned, does not make it liable for that content.

It would have been bigger news had the case been decided the other way. There was, however, a dissent suggesting that Congress may wish to revisit the law in light of court interpretations that have “generally immunized social media companies.” Unless the law changes, it is unlikely that social media companies will be held responsible for the content people post to their platforms.

Related Resources:

2nd Circuit: Trump Can’t Block Twitter Foes (FindLaw’s U.S. Second Circuit)

SCOTUS 'Crime of Violence' Ruling Opens Door for Deportation Review (FindLaw’s U.S. Second Circuit)

Third Party Liability in Third Circuit: Amazon Can't Skirt Defective Products Claims (FindLaw’s U.S. Third Circuit)
