The Supreme Court is seen on Feb 21, 2023, in Washington. (PHOTO / AP)

WASHINGTON – US Supreme Court justices on Tuesday expressed uncertainty over whether to narrow a legal shield protecting internet companies from a wide array of lawsuits in a major case involving YouTube and the family of an American student fatally shot in a 2015 rampage by Islamist militants in Paris.

The justices heard arguments in an appeal by the family of Nohemi Gonzalez, a 23-year-old California State University, Long Beach student who was studying in France, challenging a lower court's dismissal of their lawsuit against Google LLC-owned YouTube. Google and YouTube are part of Alphabet Inc.

The Supreme Court for the first time in this case is scrutinizing the scope of a much-debated 1996 federal law called Section 230 of the Communications Decency Act, which protects internet companies from liability for content posted by their users. In dismissing the lawsuit, the San Francisco-based 9th US Circuit Court of Appeals relied upon Section 230.

The justices signaled concern about the potential consequences of limiting immunity for internet companies and the difficulty of deciding where to draw that line, while also expressing skepticism that these businesses should be shielded from liability for certain types of harmful or defamatory content.

"These are not like the nine greatest experts on the internet," liberal Justice Elena Kagan said of the court's members, eliciting laughter in the courtroom.

The lawsuit accused Google of providing "material support" for terrorism and claimed that YouTube, through the video-sharing platform's computer algorithms, unlawfully recommended videos by the Islamic State militant group, which claimed responsibility for the Paris attacks that killed 130 people, to certain users. The recommendations helped spread Islamic State's message and recruit jihadist fighters, according to the lawsuit.

Kagan and conservative colleague Justice Brett Kavanaugh both suggested Congress might be better suited to adjust legal protections for internet companies if warranted.

Kagan told a lawyer for the Gonzalez family, Eric Schnapper, that algorithms are widely used to organize and prioritize material on the internet and asked: "Does your position send us down the road such that (Section) 230 really can't mean anything at all?"

Schnapper replied no and added, "As you say, algorithms are ubiquitous. But the question is, 'What does the defendant do with the algorithm?'"

Anti-terrorism law

The lawsuit was brought under the US Anti-Terrorism Act, a federal law that lets Americans recover damages related to "an act of international terrorism."

Google and its supporters have said a win for the plaintiffs could prompt a flood of litigation against platforms and upend how the internet works. The case is a threat to free speech, they added, because it could force platforms to stifle anything that could be considered remotely controversial.

Beatriz Gonzalez (right) and Jose Hernandez (second right), the mother and stepfather of Nohemi Gonzalez, who died in a terrorist attack in Paris in 2015, speak to the media outside the US Supreme Court following oral arguments in Gonzalez v. Google in Washington, DC, on Feb 21, 2023. (PHOTO / AFP)

The justices wondered whether YouTube should lose immunity if the algorithms that provide recommendations are "neutral" or used to organize content based on users' interests.

"I'm trying to get you to explain to us how something that is standard on YouTube for virtually anything that you have an interest in suddenly amounts to 'aiding and abetting' because you're in the ISIS category," Justice Clarence Thomas told Schnapper, using initials for the Islamic State group.

Justice Samuel Alito asked Lisa Blatt, the lawyer representing Google: "Would Google collapse and the internet be destroyed if YouTube, and therefore Google, were potentially liable for hosting and refusing to take down videos that it knows are defamatory and false?"

Blatt responded, "Well, I don't think Google would. I think probably every other website might be because they're not as big as Google."

The justices struggled with where to draw the line in potentially eroding Section 230 protections.

This illustration picture taken on July 24, 2019 in Paris shows the logo of Youtube on the screen of a tablet. (PHOTO / AFP)

Conservative Chief Justice John Roberts questioned whether Section 230 should apply given that recommendations are provided by YouTube itself.

"The videos just don't appear out of thin air, they appear pursuant to the algorithms," Roberts said.

Kagan wondered about a website delivering defamatory content to millions of its users.

"Why should there be protection for that?" Kagan asked.

Section 230 protects "interactive computer services" by ensuring they cannot be treated as the "publisher or speaker" of information provided by users.

Critics have said Section 230 too often prevents platforms from being held accountable for real-world harms. Liberals have complained of misinformation and hate speech on social media while conservatives have said voices on the right are censored.

President Joe Biden's administration urged the Supreme Court to revive the lawsuit by Nohemi Gonzalez's family.

A ruling is due by the end of June.

The justices on Wednesday will hear arguments in a related case over whether Twitter Inc can be held liable under the Anti-Terrorism Act for aiding and abetting an "act of international terrorism" by allegedly failing to adequately screen its platform for the presence of militant groups.