The US Supreme Court ruled: social networks are exempt from liability for terrorist content


The US Supreme Court today (Thursday) sided with the social media platforms in lawsuits seeking to hold them responsible for terrorism-related content.

In one of the two cases, Twitter v. Taamneh, the Supreme Court ruled that Twitter would not have to face accusations that it aided terrorism when it allowed tweets from the terrorist organization ISIS. The court also rejected Gonzalez v. Google, thereby avoiding narrowing the interpretation of Section 230 of the Communications Decency Act. This decision upholds a lower court ruling that protected social media platforms from content curation lawsuits.

The decision regarding Twitter was unanimous and was written by Justice Clarence Thomas, who said that social media platforms are little different from other digital technologies. “Bad actors like ISIS may use social media platforms for illegal, and sometimes horrific, purposes,” Thomas wrote, “but the same can happen on cell phones, email, or the Internet.”

The court ruled that the fact that Twitter hosted terrorist content does not create indirect legal responsibility for specific terrorist attacks. In doing so, it set the bar for similar claims in the future. “The plaintiffs’ claims are not sufficient to establish that the defendants assisted ISIS in the relevant attack,” Thomas wrote.

The Twitter v. Taamneh case focused on whether social media platforms can be sued under the US Anti-Terrorism Act for providing a platform for terrorist-related content that has a distant connection to a specific terrorist attack. The plaintiffs in the case, the family of Nawras Alassaf, who was killed in an ISIS attack in Istanbul in 2017, argued that social media platforms like Twitter knowingly aided ISIS, violating federal anti-terrorism law by allowing content to be published on their platform, despite policies restricting this type of content.

“Many companies, content creators and researchers will be relieved by the court’s decision,” said Halimah DeLaine Prado, Google’s general counsel. “We will continue to protect freedom of expression on the Internet, fight harmful content and support businesses and creators who benefit from the Internet.”

The court rejected the lawsuit against Google with only a short opinion, leaving intact the ruling that Google is immune from a lawsuit accusing its subsidiary, YouTube, of aiding terrorism. This ruling reassures not only Google, but also the many websites and social media platforms that had urged the Supreme Court not to reduce legal protections on the Internet.

The family of Nohemi Gonzalez, who was killed in the 2015 ISIS attack in Paris, claimed that YouTube’s targeted recommendations violated the American anti-terrorism law, thereby helping to promote the worldview of ISIS. The lawsuit asked that certain content recommendations not receive protection under Section 230, which would expose technology platforms to greater liability for the way they manage their services.

Google and other technology companies said this interpretation of Section 230 would increase the legal risks associated with sorting and curating online content, a fundamental feature of the modern Internet. Google argued that in such a scenario, websites would either remove much more content than necessary, or completely give up content management and give way to more harmful material on their platforms.

Oregon Democratic Senator Ron Wyden and former California Republican Representative Chris Cox, the original authors of Section 230, argued to the court that Congress’ intent in passing the law was to give websites broad discretion in managing content. In December, the Biden administration submitted a brief on the issue, arguing that Section 230 does protect Google and YouTube from lawsuits “for not removing third-party content, including the content it recommended.”