Patterico's Pontifications

2/21/2023

Supreme Court Today: Gonzalez v. Google

Filed under: General — Dana @ 11:46 am



[guest post by Dana]

Takeaways from today:

Supreme Court justices appeared broadly concerned Tuesday about the potential unintended consequences of allowing websites to be sued for their automatic recommendations of user content, highlighting the challenges facing attorneys who want to hold Google accountable for suggesting YouTube videos created by terrorist groups…

The attorney for the Gonzalez family argued that narrowing Section 230 of the Communications Decency Act – the federal law protecting websites’ right to moderate their platforms as they see fit – would not lead to sweeping consequences for the internet. But both the Court’s liberals and conservatives worried about the impact of such a decision on everything from “pilaf [recipes] from Uzbekistan” to individual users of YouTube, Twitter and other social media platforms…

A big concern of the justices seems to be the waves of lawsuits that could happen if the court rules against Google.

“Lawsuits will be nonstop,” Justice Brett Kavanaugh said at one point.

But Eric Schnapper, representing the plaintiffs, argued that a ruling for Gonzalez would not have far-reaching effects because even if websites could face new liability as a result of the ruling, most suits would likely be thrown out anyway.

“The implications are limited,” Schnapper said, “because the kinds of circumstance in which a recommendation would be actionable are limited.”

Later, Justice Elena Kagan warned that narrowing Section 230 could lead to a wave of lawsuits, even if many of them would eventually be thrown out, in a line of questioning with US Deputy Solicitor General Malcolm Stewart.

“You are creating a world of lawsuits,” Kagan said. “Really, anytime you have content, you also have these presentational and prioritization choices that can be subject to suit.”

Chief Justice John Roberts mused that under a narrowed version of Section 230, terrorism-related cases might only be a small share of a much wider range of future lawsuits against websites alleging antitrust violations, discrimination, defamation and infliction of emotional distress, just to name a few.

“I wouldn’t necessarily agree with ‘there would be lots of lawsuits’ simply because there are a lot of things to sue about,” Stewart said, “but they would not be suits that have much likelihood of prevailing, especially if the court makes clear that even after there’s a recommendation, the website still can’t be treated as the publisher or speaker of the underlying third party.”

This:

Constricting Section 230’s liability protections is also likely to backfire. Section 230 does not protect only large platforms; it protects all platforms. Upstart social-media sites like Gab, Parler, and Rumble enjoy the same protections as Facebook and Twitter.

If Section 230 no longer protects automated recommendation systems, upstarts will likely suffer the most. Even setting aside the plain text of Section 230, then, its protections for automated recommendation systems should be preserved as a matter of sound policy. Such protections nurture today’s startups, benefit our economy, and help generate jobs.

–Dana

28 Responses to “Supreme Court Today: Gonzalez v. Google”

  1. Hello.

    Dana (1225fc)

  2. Nowhere in the post or in the linked story is there any mention of the incident that drives this lawsuit.

    Nohemi Gonzalez was one of over a hundred people killed in the 2015 Paris terror attack, an attack incited by terroristic videos amplified by warped YouTube algorithms. There should be some accountability for trafficking and making a profit off such content.

    But as we’ve already seen, incitement of violence only seems to be a concern when it’s directed at elected officials.

    JF (e6556b)

  3. “Most of the lawsuits will fail” is a disingenuous argument.

    Just the threat of a lawsuit will cause people to give in to shakedown artists.

    More people suing each other is not the cure for what ails America.

    norcal (7345e5)

  4. “I admit I’m completely confused by whatever argument you’re making at the present time,” Justice Samuel Alito said.

    Confused? ‘The Shadow’ knew; ‘The Leaker’ knows…

    “Confused? Damn-it, un confuse ’em!” – Walter Short [Jason Robards], ‘Tora! Tora! Tora!’ (1970)

    DCSCA (7c12ab)

    The existential question for Section 230 is whether it was ever intended to address topics outside the subject matter of the Communications Decency Act. Certainly it allows websites to purge pornographic material, but nothing in the Act clearly says they can purge political speech, since the Act never addressed that.

    I think that this is going to be strictly ignored, as too much has been built on this edifice, but Congress really needs to revisit this at some time they can do it without playing to the crowd.

    Kevin M (1ea396)

  6. Prediction: Vote 8-1 for the web giants. Maybe DIG.

    Kevin M (1ea396)

    You can bet a lot of political websites (right and left) could be sued out of existence for defamation based on their posted comments.

    Rip Murdock (d2a2a8)

  8. Congress really needs to revisit this at some time they can do it without playing to the crowd.

    Kevin M (1ea396) — 2/21/2023 @ 1:34 pm

    LOL! Congress always plays to the crowd.

    Rip Murdock (d2a2a8)

  9. The crowd with the largest campaign contributions, that is.

    Rip Murdock (d2a2a8)

  10. From the CNN article linked by Dana:

    Under the interpretation of (Eric Schnapper, representing the plaintiffs), could liking, retweeting or saying “check this out” expose individuals to lawsuits that they could not deflect by invoking Section 230?

    Yes, Schnapper acknowledged, because “that’s content you’ve created.”

    There goes Twitter.

    Rip Murdock (d2a2a8)

  11. Nowhere in the post or in the linked story is there any mention of the incident that drives this lawsuit.

    Second paragraph:

    For nearly three hours on Tuesday, the nine justices peppered attorneys representing Google, the US government and the family of Nohemi Gonzalez, an American student killed in a 2015 ISIS attack, with questions about how the court could design a ruling that exposes harmful content recommendations to liability while still protecting innocuous ones.

    It may not include all the particulars, but it made it clear that the case was related to her death in an ISIS attack.

    Rip Murdock (d2a2a8)

  12. Rip Murdock (d2a2a8) — 2/21/2023 @ 1:57 pm

    Gonzalez has not alleged that the terrorists ever viewed YouTube. From Google’s brief (page 13):

    Petitioners alleged that, by operating YouTube, Google committed or abetted “an act of international terrorism” that caused Ms. Gonzalez’s death. Petitioners did not allege that Google had any role in encouraging or committing the Paris attack, or that any of the Paris terrorists were recruited or radicalized through YouTube or used YouTube to plan or conduct the attack.

    Rip Murdock (d2a2a8)

  13. “Lawsuits will be nonstop,” Justice Brett Kavanaugh said at one point.

    Ever in character: just like a 24/7 beer tap, eh Brett.

    DCSCA (fc15e5)

  14. Here’s a live-blog of the oral arguments by Kate Klonick, Mary Anne Franks, Mike Godwin, James Grimmelmann, Gus Hurwitz, Jeff Kosseff, Emma Llanso, Alan Rozenshtein, Eugene Volokh, Ben Wittes and Jonathan Zittrain. As with most live blogs, the top of the page is the end, so scroll to the bottom to read chronologically.

    lurker (cd7cd4)

  15. people only need to know what mr. jeff bezo and mr. keith rupert murdock want them to know and everything else is just conspicuous consumption

    nk (5daf94)

    IANAL, so I am curious how you can sue someone for something that you don’t allege they took any action to do.

    Nic (896fdf)

  17. They are alleging that Google et al failed to take actions that they should have taken — negligence. After hijackings, people would sue airlines for lax security.

    Kevin M (1ea396)

  18. I look forward to their proof of causation.

    lurker (cd7cd4)

  19. There’s a federal statute that creates a private cause of action for providing assistance to terrorists.

    Even if the ambulance chasers get past Section 230, they will still have to get past the First Amendment “clear and present danger” test.

    nk (e70658)

  20. Obama was able to lock up the Benghazi video-maker only because he was on parole and not allowed to post on the internet at all, and by lying a lot.

    nk (e70658)

  21. Takeaways from the Supreme Court’s hearing on Twitter’s liability for terrorist use of its platform

    After back-to-back oral arguments this week, the Supreme Court appears reluctant to hand down the kind of sweeping ruling about liability for terrorist content on social media that some feared would upend the internet.

    On Wednesday, the justices struggled with claims that Twitter contributed to a 2017 ISIS attack in Istanbul by hosting content unrelated to the specific incident. Arguments in that case, Twitter v. Taamneh, came a day after the court considered whether YouTube can be sued for recommending videos created by ISIS to its users.

    For nearly three hours of oral argument, the justices asked attorneys for Twitter, the US government and the family of Nawras Alassaf – a Jordanian citizen killed in the 2017 attack – how to weigh several factors that might determine Twitter’s level of legal responsibility, if any. But while the justices quickly identified what the relevant factors were, they seemed divided on how to analyze them.

    The court’s conservatives appeared more open to Twitter’s arguments that it is not liable under the Anti-Terrorism Act, with Justice Amy Coney Barrett at one point theorizing point-by-point how such an opinion could be written and Justice Neil Gorsuch repeatedly offering Twitter what he believed to be a winning argument about how to read the statute.

    The panel’s liberals, by contrast, seemed uncomfortable with finding that Twitter should face no liability for hosting ISIS content.
    ……..
    The law says liability can be established for “any person who aids and abets, by knowingly providing substantial assistance, or who conspires with the person who committed such an act of international terrorism.”

    Justice Sonia Sotomayor seemed unpersuaded by Twitter attorney Seth Waxman’s arguments that Twitter could have been liable if the company were warned that specific accounts were planning a specific attack, but that those were not the facts of the case and Twitter was therefore not liable in the absence of such activity and such warnings.
    ……..
    “Substantial assistance” would hinge on the degree to which a terror group actually uses a platform such as Twitter to plan, coordinate and carry out a terrorist attack, Waxman said at one point. The existence of some tweets that generally benefited ISIS, he argued, should not be considered substantial assistance.
    ……..
    Eric Schnapper, an attorney representing the Alassaf family – who had also argued on behalf of the plaintiffs in Tuesday’s Supreme Court arguments in Gonzalez v. Google – again struggled to answer justices’ questions as they sought to find some limiting principle to constrain the scope of the Anti-Terrorism Act.
    ……..
    Justice Clarence Thomas hinted at the potential expansiveness of what Schnapper was proposing in calling for Twitter to be held liable for the ISIS tweets.

    “If we’re not pinpointing cause-and-effect or proximate cause for specific things, and you’re focused on infrastructure or just the availability of these platforms, then it would seem that every terrorist attack that uses this platform would also mean that Twitter is an aider and abettor in those instances,” Thomas said.

    “I think in the way that you phrased it, that would probably be, yes,” Schnapper replied, going on to suggest a test involving “remoteness and time, weighed together with volume of activity.”
    ………
    Gorsuch put forward a theory for why Twitter should prevail in the case but neither Twitter nor the US Justice Department took him up on it.

    Gorsuch gave Waxman a chance to reframe his arguments for why Twitter shouldn’t be liable, based on language in the law suggesting a defendant is liable for assistance provided to a person who commits an act of international terrorism. Gorsuch noted the lawsuit against Twitter doesn’t link Twitter to the three people involved in the 2017 attack on the Istanbul nightclub.
    ………
    The Taamneh case is viewed as a turning point for the future of the internet, because a ruling against Twitter could expose the platform – and numerous other websites – to new lawsuits based on their hosting of terrorist content in spite of their efforts to remove such material.
    ………

    Rip Murdock (d2a2a8)

  22. Let’s suppose that Twitter and the others hosted accounts by a prominent politician who used those accounts to assert that the upcoming election was going to be stolen and that they should prepare to contest the election’s validity by arming themselves.

    Later, after the election, a sizable body of the politician’s supporters attacked the US Congress while it was in session after listening to an exhortation by the same politician.

    Question: Is Twitter liable? Do they get credit for blocking the politician’s use of Twitter after the first part above, but before the second? Can they avoid a “deep pockets” assessment after the politician turns up bankrupt again?

    Kevin M (1ea396)

  23. Question: Is Twitter liable? Do they get credit for blocking the politician’s use of Twitter after the first part above, but before the second? Can they avoid a “deep pockets” assessment after the politician turns up bankrupt again?

    IANAL, but I think that unless there is a specific statute like the Anti-Terrorism Act or 18 U.S. Code § 2333 (neither of which applies to “domestic terrorism”) that provides a cause of action, Twitter would not be liable.

    Rip Murdock (d2a2a8)

  24. OK, limit the plaintiff to a Congressional staffer who was injured.

    Kevin M (1ea396)

  25. OK, limit the plaintiff to a Congressional staffer who was injured.

    Kevin M (1ea396) — 2/23/2023 @ 9:56 am

    Same answer. Twitter would not be liable, but the tweeter could be sued. See also here.

    Rip Murdock (d2a2a8)

  26. Twitter (and Facebook, Instagram, Patterico’s Pontifications, Free Republic, etc.) still have Section 230 immunity from lawsuits over what their users post.

    Rip Murdock (d2a2a8)

  27. @22. It’s unlikely anyone would be liable. The incitement exception to the First Amendment applies only to speech that’s intended and likely to produce imminent lawless action. At the very least, your hypothetical fails the imminence test.

    lurker (cd7cd4)

  28. Rip Murdock (d2a2a8) — 2/23/2023 @ 10:08 am

    Twitter (and Facebook, Instagram, Patterico’s Pontifications, Free Republic, etc.) still have Section 230 immunity from lawsuits over what their users post.

    Well, the issue is directing people to specific posts. But the court, if it upheld the suit, would be unlikely to be able to give sites a bright line on what to avoid.

    I think, at a minimum, it has to be something that one could predict in advance would likely lead to a crime.

    Sammy Finkelman (1d215a)

