Recently, Facebook CEO Mark Zuckerberg announced that his company intends to establish an independent oversight group to help the struggling social media giant address the growing tension between free speech and censorship. In the article he wrote, Zuckerberg noted, “As I’ve thought about these content issues, I’ve increasingly come to believe that Facebook should not make so many important decisions about free expression and safety on our own.”
It seems that the brain trust at Facebook is coming to the conclusion that transparency and independent processes are the only ways to avoid government regulation or, even worse, the repeal of the protections it enjoys due to an extremely favorable provision of Section 230 of the Communications Decency Act of 1996, which states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider” (47 U.S.C. § 230).
The original intention of the law was to shield companies from liability for content uploaded or posted by third parties to interactive computer services such as internet service providers, blogs or social media platforms, thereby creating an environment favorable to allowing new technologies to evolve.
While fostering the innovation of new technologies is indeed a noble objective, the reality — especially with respect to social media companies — has become far more complex and, in fact, has resulted in many unintended yet foreseeable consequences.
Facebook, Twitter, Google and others have had to confront the challenge of providing billions of users, some of them anonymous or artificial, instant global reach on their platforms while balancing the risks of criminality, misinformation and other dangers that unfold in real time. There are no simple answers.
From a legal perspective, social media companies have to walk an extremely fine line. Users of social media find it a popular way to informally connect, comment and communicate without the formality of the written or spoken word. Unfortunately, social media companies have struggled with the reality that virtual or online conduct can have real-world consequences. Simply put, we live in a time when we are told that if we “see something” dangerous, threatening or strangely unusual, we are supposed to “say something.”
Why is social media different? Essentially, it is a system of small, medium and global virtual communities interacting online. Merriam-Webster defines social media as, “Forms of electronic communication through which users create online communities to share information, ideas, personal messages and other content.”
If it is our civic responsibility to report crime, terror or other threats we see in the real world, then why does that not apply to online behavior? For example, if person A threatens person B in a crowded room by saying, “I am going to kill you,” most would agree that the appropriate and responsible response for a good Samaritan bystander would be to report it immediately to authorities. Let us not forget the finale of Seinfeld.
In the world of social media, the across-the-board mentality seems to be too focused on simply finding offensive or dangerous material and deleting it as soon as possible. According to Twitter’s guidelines for law enforcement, the onus remains on law enforcement to find and request information regarding such imminent threats, and then the company “may” comply with an emergency request for vital information: “If we receive information that provides us with a good faith belief that there is an exigent emergency involving the danger of death or serious physical injury to a person, we may provide information necessary to prevent that harm, if we have it.”
In 2017, Facebook came under fire for a handful of disturbing videos that were uploaded to the company’s platform. This raises the question: Should the onus really be on users to moderate social media sites, or should these companies make a greater effort to contact authorities when necessary?
Balancing the privacy and security of users with the reality that they are using public forms of communication, I believe the scale must tip in favor of reporting illegality and threats. Additionally, the platforms themselves should not be the arbiters of what is or is not offensive, threatening or dangerous. In my opinion, that should be left to the local community and authorities.
Given their current Section 230 protection, there is little risk in social media companies maintaining the status quo. However, it is becoming far less tolerable to society, and there are arguments to be made for how social media companies and police task forces can collaborate in an ethical manner.
Some simple but important ways social media can address the problem and avoid heavy governmental regulation:
• Allow police to monitor social media just as they would walk down a sidewalk looking for crime. In its current state, I believe the privatization of public activity has resulted in online anarchy.
• Allow rights owners to freely access social media platforms to detect and remediate piracy, counterfeiting and diversion of their property.
• Remember that the virtualization of the global community makes it no less threatening than the real world. Bad actors will find ways to inflict harm and cause damage in any environment. Unless some degree of police and regulatory sovereignty extends online, those actors have little reason to comply with the law.
We all must remember that unlike a conversation on the street, or even a broadcast on terrestrial radio or television, online social media activity is by definition instant and global. There are 195 nations around the globe today, and each has its own set of laws and moral standards.
One set of laws will simply never work for internet governance, so the practical solution is for each nation and third parties to monitor, document and, if need be, enforce laws and standards online. Social media companies can still protect user privacy without hindering the legitimate mining of public statements on their platforms.