What Happens Next for Social Media?
Remember when we thought social media platforms were a path to global peace, love, and understanding?
Boy, were we wrong.
Sure, they're great for many things — keeping up with friends and family, sharing useful or amusing information, and more. But we never imagined the dark side: that they would become havens for people with destructive agendas and ulterior motives.
We never imagined that, instead of being a force for enlightenment and wise decision-making, social media would create and deepen political polarization, sometimes culminating in violence.
Presidential Influence
When we think about the divisiveness, of course, the first thing that comes to mind is President Donald Trump's relentless use of Twitter and other social media platforms to energize his followers. From the time he announced his candidacy in June 2015 through the four years of his presidency, Trump issued more than 34,000 tweets, many of them containing falsehoods.
In June 2020, the New York Times monitored Trump's tweets for a week and found that one-third of them contained falsehoods or were misleading. On January 13, CNBC reported its findings that Trump's most popular tweets were those that were most likely to be false or inflammatory. “Of Trump's 10 most popular tweets, four contained false claims related to the 2020 election results," CNBC said. “Of his 100 most popular posts, 36 contained election-related falsehoods."
As time went on, Twitter began to hide some of Trump's tweets or add labels that the contents were disputed or misleading. Then, after he allegedly incited the storming of the U.S. Capitol Building on January 6, both Twitter and Facebook announced that Trump's accounts had been suspended from their platforms. Other companies took similar steps. Apple and Google removed the alternative Parler social media platform from their app stores, and Amazon stopped hosting the service.
Changing Perceptions
If Democrats thought Facebook was on their side a few years ago, that began to change with the 2016 election and revelations that Russians may have tipped the scales against them with inflammatory posts designed to sow discord. Since then, the left has been increasingly critical of Facebook as a “right-wing echo chamber" that gave Trump an edge.
Facebook denies that it plays political favorites. In September, a Facebook executive told POLITICO there's a reason why the pages of conservatives drive such high interaction: “Right-wing populism is always more engaging" because it speaks to “an incredibly strong, primitive emotion" by touching on such topics as “nation, protection, the other, anger, fear."
In October, the BBC reported on left/right use of Facebook and found that right-leaning commentators appeared to hold an advantage: “Data from CrowdTangle, a public insights tool owned by Facebook, puts together the most popular posts for each day on Facebook. On any given day, the top 10 most popular political posts are dominated by right-leaning commentators like Dan Bongino and Ben Shapiro, along with posts by Fox News and President Trump."
In addition, the BBC noted, Trump's Facebook page received 10 times more traffic than Biden's.
Despite Democrats' increasing unhappiness with Facebook, and even though Trump had been using Twitter to his advantage throughout his presidency, Republicans still believed that social media companies were biased against them. In August 2020, the Pew Research Center released a poll finding that 69% of Republicans and Republican-leaners believe that social media sites favor Democrats, while only 25% of Democrats feel that way.
Growing Pressure for Greater Accountability
Of course, Trump's opponents were pleased when social media muzzled Trump following the January 6 riot. But in the wake of those actions, there appears to be broad agreement across the political spectrum that social media companies have too much power and that regulation is in order.
But what might that look like?
Facebook's chief executive Mark Zuckerberg has long been successful in fending off attempts to regulate his platform, on the grounds that it is not a traditional media company: it does not generate content, it merely hosts it.
It's also a business that is unlike traditional media in other ways.
Viewers and readers of traditional news pick and choose the content they wish to consume. But with social media, users “have almost no control over the content they see," Dipayan Ghosh, co-director of the Digital Platforms & Democracy Project at Harvard's Kennedy School of Government, writes. “Instead, platforms use complex algorithms to serve content they think will keep users scrolling, often exposing them to more radical posts that they may never have sought out on their own."
For more than two decades, social media platforms have enjoyed protection from legal liability for third-party content under Section 230 of the Communications Decency Act of 1996. But that statutory protection may not last, and pressure is building on other fronts as well: on December 9, 2020, the U.S. government filed a landmark antitrust lawsuit against Facebook, calling for the company to be broken up.
In other words, the wheels are already in motion, and the Biden administration will almost certainly be looking at ways to better protect the public from harmful social media content.
Should Section 230 Be Repealed?
While Section 230 is drawing ire from every direction, there's also broad agreement that it has allowed new companies to form and thrive. And while you might applaud seeing political speech you find offensive taken down, Section 230's removal “would also apply to your political speech, along with photos of your kids that you want to share with your family," says David Greene, senior staff attorney at the Electronic Frontier Foundation.
“It's part of the architecture of the modern Internet," he told CBS News. “Everything you do online depends on it."
But Bradley Tusk, an attorney and founder of Tusk Ventures, a venture capital and political strategy firm, says that the events of January 6 were the “death knell" for Section 230.
“When our product becomes so dangerous that it physically endangers someone's health, then it's the government's responsibility to step in and do something about it," he told the website OneZero. “If you were to support the repeal of Section 230, that would be your perspective."
Ghosh, of the Kennedy School of Government, suggests that Congress might keep Section 230 in place but require platforms to meet certain standards for transparency and data protection. (Bipartisan legislation was already introduced to do that.)
Others believe that it's important for social media platforms to self-regulate. Three academic authors in the Harvard Business Review argue that self-regulation has worked well in several pre-Internet industries: movies, broadcasting, video games, and TV advertising.
“Given the increasing likelihood of government action," they write, “the goal of self-regulation should be to avoid a tragedy of the commons, where a lack of trust destroys the environment that has allowed digital platforms to thrive."
It's hard to predict exactly what will happen. One thing seems certain, though: social media is in for some dramatic changes.
Related Resources:
- Is It Legal for Debt Collectors to Contact You Via Social Media? (FindLaw's Law and Daily Life)
- How Influential Are Social Media Influencers? (FindLaw's Free Enterprise)
- Yes, Trump Can Be Charged With Inciting a Riot (FindLaw's U.S. Supreme Court)