LEGAL ISSUES
The End of Social Media Immunity?
by George H. Pike
It’s been a tough year for social media. On the one hand, Facebook, Twitter, and Instagram are as ubiquitous as ever. Twitter, through the president’s regular and often provocative tweets, as well as media and late-night talk show responses to those tweets, probably has as high a profile as it has ever had. If my kids are any example, Instagram and Snapchat remain among the go-to social media platforms for stories and image-sharing. Facebook has more than 2 billion monthly users, nearly 30% of the world’s population, and more than half of those users are online daily.
On the other hand, the platforms have taken some hits. Some may strike us as trivial, such as when Kylie Jenner’s negative tweet about Snapchat wiped more than $1 billion off Snap’s market value. But others, particularly the role of social media in the dissemination of fake news and clickbait, as well as the ongoing investigation of Russia’s use of social media to influence the 2016 election, have given these platforms a black eye. In mid-March, Facebook lost 9% of its market value when it was revealed that 50 million users’ personal data had been improperly harvested by Cambridge Analytica on behalf of political clients, a revelation that triggered at least some talk of Facebook boycotts.
FOSTA
But things may have gotten tougher. On March 21, Congress passed the Allow States and Victims to Fight Online Sex Trafficking Act of 2017 (FOSTA, aka HR 1865; congress.gov/bill/115th-congress/house-bill/1865). The bill provides that operators of “interactive computer services” that “promote or facilitate the prostitution of another person” are subject to a fine, up to 10 years in federal prison, or both.
This is being seen as carving out the first major hole in Section 230 of the Communications Decency Act (Title 47, Section 230 of the U.S. Code; law.cornell.edu/uscode/text/47/230), the exemption from liability for third-party content that has allowed the internet to thrive over the last 22 years and that is largely responsible for the success of user-driven content, including social media.
Interactive Computer Services
The Communications Decency Act was passed in 1996, just as the internet was accelerating as a global communication and information platform. The act was primarily an attempt to regulate online pornographic content, and much of it was struck down by the U.S. Supreme Court. However, Section 230 survived. It states, in part, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
On a practical level, this has given ISPs and websites broad immunity from legal liability for posts and other content provided by their users. As long as Facebook posts, tweets, Wikipedia entries, forum and email list discussions, comments on articles, YouTube videos, and Craigslist advertisements are provided by third parties and are not substantially edited or modified by the hosting website, they do not expose that website to liability for defamation or infringement, or even to criminal liability.
Praised and Criticized
Section 230 ensured that ISPs would not have to worry about being liable for the unknown actions of their millions of users, and it has been applied broadly to cover even small web platforms that rely on user content. Over the years, it has been praised for driving the phenomenal growth of the internet and social media and equally criticized for facilitating the internet’s worst abuses by allowing websites to avoid any responsibility for them.
FOSTA is the first significant exception to Section 230’s immunity. It was largely driven by media, law enforcement, and lawmaker reaction to thinly veiled (and some not-so-thinly veiled) advertisements for sexual services on websites such as Craigslist and Backpage, along with the belief that those ads have been exploitive and have led to dramatic increases in sex trafficking, including the trafficking of children. Prosecutors, lawmakers, victims, and victims’ rights groups have long argued that Section 230 has allowed these sites to profit and thrive from sex trafficking while evading civil or criminal liability. FOSTA now exposes those sites to criminal and civil liability for intending to promote or facilitate prostitution or sex trafficking.
Self-Censorship and Stifling Internet Development
FOSTA was passed with overwhelming bipartisan support, but it got mixed reactions from the technology industry. Facebook supported the measure as a means to “allow responsible companies to continue fighting sex trafficking. …” However, other technology companies and internet free-speech advocates expressed concerns. The Electronic Frontier Foundation (EFF) worries that FOSTA will encourage overly broad self-censorship, because content that even remotely touches on sexual services, such as the exchange of healthcare or safety information or resources for trafficking victims and sex workers, could be seen as part of an “intent to promote” prostitution. Other critics have said that while large platforms such as Facebook may have the resources to address FOSTA, small and startup companies could either over-censor to avoid liability or decline to host any user content, thus stifling internet development.
Others, including Sen. Mark Warner (D-Va.), have suggested that moves to limit Section 230 could go beyond sex trafficking to other areas. (Warner ultimately voted in favor of FOSTA.) Some commentators share his view, suggesting that other victims’ rights groups will use FOSTA to advocate for additional exemptions or that FOSTA could serve as a precedent for legislation to target fake news or political advertising.
Immediate Impact
Two days after FOSTA was passed, Craigslist shut down its personals platform completely, and Reddit banned several of its communities. The longer-term impact will take time to unfold. As with any legal change, the language of the statute will be parsed and analyzed to see exactly how it plays out in practice. Advances in artificial intelligence and machine learning may help websites get smarter about separating illegal content from legal content. Self-regulation within the industry can also go a long way toward easing the legal pressure and addressing some of the other concerns, which could lead to a smoother future for social media.