Monday, February 27, 2023

Supreme Court Case 21-1333, Gonzalez v. Google on Section 230(c)(1)



47 U.S. Code § 230 - Protection for private blocking and screening of offensive material

Section 230 (c)Protection for “Good Samaritan” blocking and screening of offensive material

(1) Treatment of publisher or speaker

No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

This is known as the twenty-six words that created the Internet. Jeff Kosseff wrote a book about it, The Twenty-Six Words That Created the Internet, which I recommend; it provides context about the law and background for this test before the Court. This section has historically allowed information to flow freely on the Internet and has shielded platforms and carriers from legal action, both civil and criminal, for content posted by third parties.

On February 21, 2023, this case was argued before the Supreme Court.  You can listen to the arguments and see the facts of the case on Oyez.  The question presented is:

Does Section 230(c)(1) of the Communications Decency Act immunize interactive computer services when they make targeted recommendations of information provided by another information content provider?

Google, the respondent, is accused of creating an algorithm that recommends YouTube videos to people who may not have been looking for them.  In this instance, the videos in contention are ISIS videos used for recruitment and radicalization; the petitioners argue this makes Google liable for aiding and abetting international terrorism by allowing ISIS to use the platform to spread its message.

Eric Schnapper, arguing for the Petitioners, states, “So, if I may make clear, as I may not have done that well, the distinction we're drawing, our claim is not that they did an inadequate job of block -- of keeping things off their -- their computers that you can access from -- from outside or from failure to -- to block it. It's that that's the -- that's the heartland of the statute.  What we're saying is that insofar as they were encouraging people to go look at things, that's what's outside the protection of the statute, not that the stuff was there. If they stopped recommending things tomorrow and -- and all sorts of horrible stuff was on their website, as far as we read the statute, they're fine.”

This leaves us with the question: since Google/YouTube used an algorithm to suggest videos for users to watch, should it still be protected under what is commonly referred to as “230”?

Listening to the rest of the argument was not compelling, because neither counsel nor the Justices completely understood the technology.  Kudos to Justice Ketanji Brown Jackson for being the most computer-savvy in grasping the concepts.  I predict the Court will find for the respondent in this instance.  There should have been more focus on the argument that YouTube's use of an algorithm to push content to users who are not looking for it makes it a content provider by adding value.  Eric Schnapper failed to communicate that position.


