FindLaw columnist Eric Sinrod writes regularly in this section on legal developments surrounding technology and the internet.
Social media outlets now connect billions of people around the globe on a constant basis. Facebook, by headcount, has become the largest nation on the planet, with approximately two billion users. A tremendous number of these users communicate with others via their social media accounts many times a day. Of course, there are many positive aspects of social media communications; but, regrettably, there are palpable negatives as well.
Cyberbullying is one of those negatives. All too often, for example, a minor or a group of minors bullies another minor, with disastrous consequences. The victim can be ostracized, humiliated, and driven to anxiety, depression, and even self-destruction. This can even happen with adults. We learned in the news recently of a woman who was prosecuted for encouraging her boyfriend, via text messages, to commit suicide. She ultimately was found guilty of manslaughter.
And social media has made it easy for people and groups with different opinions not only to engage in civil political discourse, but also to voice extreme accusations, make racist and sexist remarks, and even suggest potential violence. Social media has made it simple to reach a vast audience while demonizing others who are not truly known on a personal, individual basis.
So, what is to be done about all of this?
It is not realistic to expect that social media companies will ensure that these types of communications never appear on their outlets. Why? First, under Section 230 of the Communications Decency Act, internet service providers generally are immune from liability for third-party content posted on their sites.
And second, it simply is not practical to assume that social media companies could police the many billions of communications that appear on their sites daily. These companies could not possibly hire an army big enough to get this job done while also remaining economically viable. Sure, in extreme instances, social media companies can be informed of truly horrifying posts so that they can be removed, and these companies do devote some level of resources to finding and addressing such posts. But more needs to be done.
We really need to take this on individually, as people dealing with real people.
Parents need to be involved in the Internet activities of their children and teenagers. It is true that technology advances at warp speed, and often children and teenagers know more about cyberspace than their parents do. It is up to parents to become educated in the first instance so that they can educate their children when their children first go online. Parents should inform their kids about the specific risks on the Internet and how to properly treat other people online. Kids also should know that they can come to their parents when they become concerned about anything they encounter online.
And as adults, we need to treat people in cyberspace as we would in a face-to-face conversation. On top of that, we should get out from behind our computers and actually see people in the real world — and not just people who share our beliefs, but people outside of our respective bubbles, so that we can understand that people with different points of view are still human beings deserving of respect and civility.
These recommendations may seem like only a start and not enough. That is probably true. Further ideas to bring people together are welcome and should be considered.
Eric Sinrod (@EricSinrod on Twitter) is a partner in the San Francisco office of Duane Morris LLP, where he focuses on litigation matters of various types, including information technology and intellectual property disputes. You can read his professional biography here. To receive a weekly email link to Mr. Sinrod's columns, please email him at email@example.com with Subscribe in the Subject line. This column is prepared and published for informational purposes only and should not be construed as legal advice. The views expressed in this column are those of the author and do not necessarily reflect the views of the author's law firm or its individual partners.