Their college student daughter was murdered by ISIS. Now the Supreme Court will hear a landmark Google case that could transform the Internet.

Beatrice Gonzalez and Jose Hernandez, the mother and stepfather of Nohemi Gonzalez. (Jonathan Ernst/Reuters via The Washington Post)

When the cloud-based Internet became the center of our real-world reality, many legal experts called it the Wild West. How would the law govern this AI-driven universe, where privacy, antitrust, and hundreds of other areas of US law must be enforced?

The Internet, born in the early 1980s, has come a long way since those first heady days when computers could first communicate with one another. From the beginning, scientists and other experts warned that artificial intelligence would one day be able to run the Internet without human intervention, and some forty years later, that prediction is looking like reality. This Google suit is a prime example of litigation that could shift the entire foundation of Internet law, which rests on Section 230 of the Communications Decency Act of 1996.

Section 230, however, dates from a simpler time in the history of the internet, before algorithms evolved into the engines that now power global tech platforms such as Google, Twitter, Facebook, and Instagram. On today’s platforms, algorithms select and promote content at tremendous volume and speed. Section 230 was written well before this evolution, back when human beings controlled what content appeared.

The Supreme Court is today hearing a landmark Google case that could have a ripple effect across the Internet, the metaverse, and beyond. Attorneys and legal experts are glued to their seats, because the Court’s decision could change Internet law forever, and with it the way all Americans use the Internet.

The Supreme Court will hear oral arguments today in Gonzalez v. Google, a lawsuit over whether technology firms can be held liable for specific, dangerous content on the internet that their algorithms recommend to users.

Gonzalez v. Google came about in 2017, after a 2015 ISIS attack on a Paris restaurant killed 23-year-old Nohemi Gonzalez, one of 130 people murdered in coordinated attacks across the city on November 13. Gonzalez, a college exchange student, died of her injuries at the horrific scene.

Gonzalez’s parents filed the lawsuit against Google, arguing that YouTube, the company’s video platform, is legally liable for harmful content that its algorithms promoted. In court documents, the family says that by recommending ISIS videos to users, YouTube violated the US Anti-Terrorism Act.

Specifically, the family alleges that Google and YouTube violated the US Anti-Terrorism Act by promoting ISIS propaganda and videos through their recommendation algorithms.

Google disagrees, arguing that the claim fails because US Internet law protects technology companies from being sued over content posted by their users. For nearly three decades, Section 230 has shielded Internet companies such as Google, its YouTube subsidiary, Facebook, and other tech firms from liability in such cases, a protection that gave the global tech behemoths great leeway to grow their platforms.

After lower courts sided with Google, the Gonzalez family appealed, taking the matter to the Supreme Court.

The Supreme Court’s decision could dramatically affect technology companies, the internet, and Section 230 itself. Algorithm-driven recommendations shape much of what users see on major platforms, so a ruling by the Court could change the way visitors experience the internet. If the Supreme Court agrees with the Gonzalez family’s arguments, a wave of similar lawsuits from victims and their families would most likely follow.

On one side, the Gonzalez family argues that internet and tech companies must be held liable for harmful content, and that Section 230, as currently interpreted, lets them spread dangerous propaganda without consequence.

Google general counsel Halimah DeLaine Prado wrote a blog post about the issue, saying, “The stakes could not be higher. A decision undermining Section 230 would make websites either remove potentially controversial material or shut their eyes to objectionable content to avoid knowledge of it.”

Advocates for Section 230 believe the Internet must remain a free, open space where each company polices its own content. The Gonzalez family believes their case is strong, and that giant tech companies should not be allowed to recommend deadly terrorist content without facing liability.

In particular, YouTube’s algorithms often do not “police” recommended content that many find harmful. Disturbing footage and images of the 2015 ISIS attacks in Paris were widely shared on YouTube and other social media platforms. Because algorithms decide which videos YouTube recommends, the Gonzalez family argues that by promoting Islamic State videos, YouTube acted beyond the scope of the liability protections Section 230 gives tech companies.

A second case, brought by other families of terrorist attack victims, will be heard by the Supreme Court the day after the Gonzalez v. Google arguments. That case centers on families who argue that social media firms are liable for hosting Islamic State content that harmed victims and their families.

Diane Lilli
Diane Lilli is an award-winning Journalist, Editor, and Author with over 18 years of experience contributing to New Jersey news outlets, both in print and online. Notably, she played a pivotal role in launching the first daily digital newspaper, Jersey Tomato Press, in 2005. Her work has been featured in various newspapers, journals, magazines, and literary publications across the nation. Diane is the proud recipient of the Shirley Chisholm Journalism Award.