Remember Joe Camel?
During the 1980s and 1990s, Joe was a cartoon mascot for Camel cigarettes. His smiling likeness appeared in print ads and on billboards everywhere. He probably enjoyed as much celebrity as Tony the Tiger or Ronald McDonald.
Today, though, we remember Joe not as a cute icon but as a symbol of a nefarious corporate scheme aimed at luring adolescents to become smokers of Camel cigarettes. Camel's parent company put Joe out to pasture in 1997 after documents surfaced showing that the company enlisted Joe as a central figure in a campaign targeting children as future smokers.
Social media companies aren't trying to hook kids into buying unhealthy commodities like cigarettes. But critics say they are nevertheless targeting children and luring them to engage in harmful activity — including suicide.
The first wrongful death lawsuits against social media companies are emerging, and federal and state lawmakers are considering further steps to require social media platforms to better safeguard children.
Legal pressure began to mount last October when Frances Haugen, a former data scientist at Facebook, told a U.S. Senate subcommittee that the tech giant knowingly employed damaging algorithms.
Social media platforms use algorithms to measure users' interests and guide them to content that may keep them online longer, so they see more ads. Part of Haugen's testimony focused on Instagram, a platform popular with adolescents and owned by Facebook's parent company, Meta. She leaked several of Meta's own internal studies on the topic, including surveys of Instagram's effects on teen users.
Lawmakers on both sides of the aisle in Washington already wanted to break up Big Tech and they latched onto Haugen's testimony as proof that action was needed. The problem is, Section 230 of the Communications Decency Act generally immunizes social media companies from being sued over what users post.
But what about those algorithms?
According to Seattle lawyer Matthew Bergman, the revelations about algorithms opened the door to litigation. He says lawsuits against social media companies are possible based on traditional product liability law, specifically defective design. That is, the algorithms were designed to be addictive despite the knowledge that heavy use of social media can cause mental and physical harm to minors.
Bergman formed the Social Media Victims Law Center in November 2021 and now represents 20 families who have filed wrongful death lawsuits against social media companies.
Researchers have found that the risk of suicide among adolescents is greater the more time they spend online. Overall, suicides among people ages 10 to 24 have increased every year since 2007 and have escalated during the pandemic.
Meanwhile, legal action against social media companies is stirring on another front. Plaintiffs have filed wrongful death lawsuits claiming that algorithms encourage young people to engage in dangerous behavior.
Last December, 10-year-old Nylah Anderson of Chester, Pennsylvania, died after taking part in a "Blackout Challenge" on TikTok, a video-sharing app. These "challenges" have been a staple activity for TikTok users. In the "Fire Challenge," users doused an object with a flammable liquid and set it on fire. In the "Milk Crate Challenge," users stacked milk crates and walked across them.
The "Blackout Challenge" dared viewers to choke themselves with household items until they passed out. After regaining consciousness, they shared the recorded event with fellow TikTok users.
Not all regain consciousness, however. On May 12, Nylah's mother, Tawainna, filed a wrongful death lawsuit against TikTok and its parent company, ByteDance, in federal court. The complaint claims that at least four other people have died from the Blackout Challenge.
Eight attorneys general recently launched an investigation into TikTok's impact on young people. Connecticut Attorney General William Tong said the group is concerned about "reckless viral challenges" and will examine "what TikTok knew about the risks to our children, and precisely what they have been doing to keep our kids online."
Meanwhile, California legislators are considering a bill that would allow parents to sue social media companies that endanger children with addictive features. Backers of the bill contend that it gets around the Section 230 prohibition by focusing narrowly on whether apps use addictive algorithms, not on their overall content.
If you are the parent of an adolescent, it's important to be aware of the risks social media poses for their age group. Educating yourself on the apps and platforms is a good place to start.
There are other protective steps worth considering as well.
In a worst-case scenario involving injury or death, keep in mind that the legal boundaries on social media responsibility appear to be changing. Contacting an experienced personal injury lawyer near you could be a wise decision.
Meeting with a lawyer can help you understand your options and how to best protect your rights. Visit our attorney directory to find a lawyer near you who can help.