The Supreme Court sidestepped a legal challenge Thursday, saying there was no need to reinterpret Big Tech’s key liability protections as it ruled in favor of Google and Twitter in two terrorism cases brought against them. The decision means Section 230 of the Communications Decency Act – a 1996 provision preventing platforms from being held liable for content generated by their users – remains unchanged.
Supporters of Section 230 feared a ruling that would narrow its scope or fundamentally change what types of content social media companies can be held legally liable for. Such a ruling could have forced platforms like Facebook and YouTube to reconsider how they use recommendation algorithms to serve content to users. In short, the lack of any changes to 230 should come as a great relief to tech companies that host content on the web.
The court dismissed the first case, Gonzalez v. Google, in a short, unsigned three-page opinion. It declined to weigh in on Section 230, saying the underlying case was simply too weak. In that case, the parents of a teenager killed by ISIS claimed that Google encouraged terrorism by promoting terrorist content through its YouTube recommendation algorithm.
“We decline to consider the application of Section 230 to a complaint that appears to contain little, if any, plausible right to a remedy,” the court ruled.
The court also ruled unanimously in favor of the tech companies in the separate but related case Twitter v. Taamneh, which alleged that Twitter supported terrorism by failing to sufficiently remove ISIS content from the platform following a 2017 attack.
In a statement sent to Gizmodo, Google welcomed the ruling and said it should come as a relief to people who speak out online.
“The countless companies, academics, content creators and civil society organizations that have joined us in this case will be reassured by this outcome,” said Halimah DeLaine Prado, Google’s general counsel. Twitter responded to Gizmodo with a poop emoji.
Taken together, the two rulings represent a major win for the tech industry, which has relied on Section 230’s safeguards to fuel its growth since the provision’s inception nearly 30 years ago. The rulings also underscore the court’s wariness about changing a provision that defines the internet. That uncertainty was on clear display during oral arguments for the cases earlier this year.
“These are not the nine greatest experts on the internet,” Justice Elena Kagan said during arguments. “We are a court. We really don’t know about these things.”
“Freedom of Speech lives to fight another day”
Technology industry groups like NetChoice hailed the court’s ruling, describing it as a “major victory” for freedom of speech and expression online. It’s also a win for social media companies interested in moderating content on their platforms without the constant threat of a looming lawsuit. In a statement, Chris Marchese, director of the NetChoice Litigation Center, said a weakening of Section 230 protections would have made tech companies less willing to moderate potentially harmful content.
“With billions of pieces of content added to the internet every day, content moderation is an imperfect – but important – tool to ensure user safety and the functioning of the internet,” Marchese said. “If such services were subject to liability for harmful content unintentionally leaking across the web, it would discourage them from hosting user-generated content.”
Tech companies and industry groups weren’t the only ones to welcome the court’s decision. Civil rights and privacy organizations including the ACLU, the Electronic Frontier Foundation and the Knight First Amendment Institute all issued statements praising it. Although child safety groups and a growing cohort of lawmakers have railed against 230 in recent years, arguing it makes it difficult to hold tech companies accountable for boosting harmful misinformation, supporters of the provision say a sudden reversal could have a chilling effect on internet speech.
“With this decision, online free speech lives to fight for another day,” said Patrick Toomey, associate director of the ACLU National Security Project.
Update May 18 12:44 p.m. EST: Added statements from Google, NetChoice and the ACLU.