In This Episode
Danielle Citron, author of The Fight for Privacy: Protecting Dignity, Identity, and Love in the Digital Age, joins Leah and Melissa to preview two Supreme Court cases that ask whether online platforms should be held liable for user-uploaded content. Plus, more drip-drip-drip from the investigation of the Dobbs leak.
- Read The Fight for Privacy: Protecting Dignity, Identity, and Love in the Digital Age (use promo code STRICT10 for 10% off!)
- Read The Onion’s incredible amicus brief
Leah Litman [AD]
Show Intro Mr. Chief Justice, may it please the court. It’s an old joke. When an arrogant man argues against two beautiful ladies like this, they’re going to have the last word. She spoke, not elegantly, but with unmistakable clarity. She said, I ask no favor for my sex. All I ask of our brethren is that they take their feet off our necks.
Leah Litman Hello and welcome back to Strict Scrutiny, your podcast about the Supreme Court and the legal culture that surrounds it. I’m one of your hosts today, Leah Litman.
Melissa Murray And I’m Melissa Murray. And we are back in session. And as you’ll notice, Leah and I are alone. You know what happens when Leah and I are alone? Don’t get excited. We just have hijinks, and Kate’s not here to rein us in. And so this means this podcast is about to be super lit, and it’s the perfect time to get lit, because the court is about to resume hearing cases during what is going to be a massive February argument session. So if January seemed light to you and not full of headline-grabbing cases, you’re right. But February is more than going to make up for that. And to help us preview some of the huge tech cases about the future of Al Gore’s Internet is Strict Scrutiny superfan and super guest Danielle Citron. Danielle is, drumroll, the Jefferson Scholars Foundation Schenck Distinguished Professor in Law and Caddell and Chapman Professor of Law at the University of Virginia School of Law (Wahoowa!), where she writes and teaches about privacy, free expression, and civil rights. And if that wasn’t enough, in 2019 Danielle was named a MacArthur Fellow based on her work on cyberstalking and intimate privacy. So welcome back to the show, genius Danielle Citron.
Danielle Citron It’s fantastic to be with you.
Leah Litman So a little-known fact is that Kate and Danielle actually can’t appear on the same episode of Strict Scrutiny together, since it tips the balance of the podcast’s Cassandra-to-optimist ratio, which, you know, is just not what the universe...
Melissa Murray Can’t do it.
Leah Litman Nope. Not what the universe allows. So, ugh no. I kid. Kate couldn’t be here due to another obligation, but she will be back next week.
Danielle Citron I’m going to try to bring that Pollyanna to the show. How does that sound?
Melissa Murray Well, yeah, I mean, that’s the point, you and Kate are very much of a piece in that regard. But we are still very lucky to have you, Danielle, because you’re going to help us preview the cases the court will hear this week. And we’re also going to talk about your recent book, The Fight for Privacy. But first, we’re going to get into these cases, because they’re pretty big, and as we say, they are about the future of the Internet. So let’s start there. And when we finish up and talk about your book, we’ll follow all of that with a little court culture. We have been missing our court culture dose because we’ve been doing some other kinds of episodes, so we have a lot to catch up on. So let’s get to it.
Leah Litman Indeed. So the Section 230 cases, which are the big tech cases Melissa was alluding to, are actually a pair of cases that the court is going to hear the first week of its February sitting. The cases involve deeply tragic events in which people lost their lives because of international acts of terrorism. And the plaintiffs say these acts of terrorism were fueled by radicalization on the Internet, which is why they brought these cases. The cases also present slightly different questions. So what we’re going to do is summarize the facts and the issues of the cases before we start to unpack them any further. We’ll do a little bit of background about the big issue that most people associate these cases with, and that’s Section 230 of the Communications Decency Act. So I guess let’s start there. Danielle, could you tell us what Section 230 is?
Danielle Citron You know, you emphasized one part of the title of the statute; I’m going to lean into another, which is that it’s part of the Communications Decency Act of 1996. The central purpose of the law was to criminalize the hosting of pornography online, which, you might say, how is that even possible, to have an Internet without porn? And that’s precisely where the Supreme Court came in, striking down basically all of the statute. And the only thing in the embers that remains is Section 230. Now, Section 230 is entitled, and I want to pause here because this is important, “Protection for private blocking and screening of offensive material.” And the one key provision that everyone focuses on, while ignoring the other important key provision, is Section 230(c)(1). It’s entitled “Treatment of publisher or speaker,” and it says no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider. Now, the following subsection, which is sort of a pair with the first, is entitled “Civil liability,” and it explicitly limits the liability that an interactive computer service provider will face for removing speech by third parties. So it specifically says, quote, no provider or user of an interactive computer service shall be held liable on account of (A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable.
Melissa Murray So just to paraphrase, I guess it means that a platform like Facebook or YouTube or whatever can’t be understood as the publisher of items or content that is offensive if that content is actually shared on the platform by other people. So the fact that you host the platform does not make you a publisher or a speaker for purposes of the Communications Decency Act. And then secondarily, if you as the platform decide to do content moderation, you won’t be liable for infringing on the rights of any of the individuals who share that content on your platform when you take those steps to moderate or take down offensive content.
Danielle Citron The entire project of Section 230 is to ensure that interactive computer services, so that’s like the early ISPs that Congress had in mind, engaged in content moderation, and to make sure that they weren’t afraid to engage in content moderation. Because they knew about a 1995 New York state trial court opinion, Stratton Oakmont versus Prodigy. In that case, the New York Supreme Court found that because Prodigy had engaged in some content moderation, it was filtering dirty words, and had left defamatory content up on a bulletin board called Money Talk, it became a publisher, and so strictly liable for that content.
Melissa Murray And that would deter other ISPs from moderating. So essentially this is a measure that’s intended to provide ISPs with the opportunity to moderate content without worrying that they’re going to be liable down the road. It wants to encourage content moderation, because they don’t want the Internet to become a cesspool.
Danielle Citron That’s right. And that’s precisely what then-Representatives Chris Cox and Ron Wyden intended. They got together, a little merry band, and they wrote Section 230. And just to be clear, as you said so well, it was to incentivize cleaning up the Internet. That’s Cox’s phrase, cleaning up the Internet. And it’s really important to underscore that the title of Section 230(c), we just talked about (c)(1) and (c)(2), is “Protection for ‘Good Samaritan’ blocking and screening of offensive material.” So they were leaning hard into the idea that we don’t want to scare you off; we want to encourage you to be Good Samaritans. We want you to have terms of service and decide the kinds of content that are acceptable on your sites. And we’re not going to punish you, we’re not going to increase your liability, for trying to, you know, enforce the rules but failing to remove or keeping up some content, and also for taking down content. That’s (c)(1) and (c)(2) as they work together.
Leah Litman Got it. Okay. So we’ll come back to what Section 230 does or doesn’t do in a second. But first, some specifics of the cases, to give you a sense for how the 230 issue has come up and what it means or doesn’t mean. So Gonzalez versus Google is the case that is the actual Section 230 case. The plaintiffs in that case, who are the petitioners at the Supreme Court, meaning they lost in the courts up until now, are relatives of Nohemi Gonzalez, an American citizen who was murdered in a November 2015 terrorist attack in Paris, France, for which the Islamic State of Iraq and Syria, ISIS, claimed responsibility. So, Danielle, can you tell us about the plaintiffs’ theory for why Google is liable for Ms. Gonzalez’s death, and the case’s procedural history? That is, you know, what happened in the courts until now.
Danielle Citron Plaintiffs sued Google under the Anti-Terrorism Act, and they allege that Google is liable under the ATA for providing resources and assistance to ISIS through Google’s ownership of YouTube, the video-sharing platform which ISIS used to spread its message. More specifically, plaintiffs allege that YouTube’s algorithm, which mined subscribers’ data and then prioritized, fed, and recommended videos, resulted in ISIS videos being recommended to YouTube users who would then be likely to click on those videos. And the plaintiffs also alleged that ISIS was able to derive profits from the videos it posted because of YouTube’s AdSense program. Think of these platforms: they’re essentially advertising platforms, right? YouTube shares some of the profits with content creators, including ISIS, so they share revenue from the ads that are placed alongside the videos. And the courts dismissed the claims against Google, finding that Google and YouTube could not, consistent with Section 230, be held liable under a theory that turned on the content of ISIS’s speech, since ISIS generated the speech, and Google and YouTube can’t be treated as the publisher or speaker of a third party’s speech. And I know we’re going to get into what we think about that, and how the court might view it, but that’s what the courts below found.
Melissa Murray And then there’s a second case here, and it’s technically not a Section 230 case, which may become important down the line. The case is called Twitter versus Taamneh, and the issue it presents is about the scope of liability under the Anti-Terrorism Act. So in 2016, Congress amended the ATA to expressly provide for aiding-and-abetting liability as part of the Justice Against Sponsors of Terrorism Act, JASTA. The amendments say that in an action based on an injury arising from an act of international terrorism committed, planned, or authorized by a foreign terrorist organization, liability may be asserted as to any person who aids and abets, by knowingly providing substantial assistance, or who conspires with the person who committed such an act of international terrorism. JASTA further states that Halberstam versus Welch, a D.C. Circuit decision that interpreted the scope of aiding-and-abetting liability, provides the proper legal framework for how aiding-and-abetting liability should function in this context.
Leah Litman So the facts of Taamneh are just as tragic as those in Gonzalez. ISIS claimed responsibility for an attack in which a gunman fired 120 rounds into a crowd at the Reina nightclub in Istanbul, Turkey, killing 39 people and injuring 69 others. Plaintiffs, who are the respondents in this case, meaning they won below, are family members of Nawras Alassaf, a Jordanian citizen killed in the attack. Plaintiffs brought suit under the Anti-Terrorism Act against three social media companies, Twitter, Facebook, and Google, again for Google’s operation of YouTube, alleging that the companies were a critical part of ISIS’s growth, based on theories similar to the ones Danielle was outlining in the Google case.
Melissa Murray All right. So, Danielle, the procedural history in Taamneh is somewhat different from the procedural history in Gonzalez. So what exactly happened here? How did this case proceed from the district court to the Supreme Court?
Danielle Citron Okay. So the district court dismissed the complaint, finding that the companies’ actions didn’t rise to the level of aiding and abetting terrorism under the Anti-Terrorism Act, as amended by JASTA. And because it dismissed the case against the companies, it never reached the Section 230 issue. So it didn’t decide whether the plaintiffs’ theory of liability depended on treating a content provider’s speech as that of the social media companies. Then the Court of Appeals reversed on the aiding-and-abetting question, finding that the allegations did state a claim for aiding-and-abetting liability under the ATA, and remanded the case to the district court to consider the Section 230 issue.
Leah Litman So given that no court has actually decided whether Section 230 prevents the various social media companies from being held liable here, it is possible that the Supreme Court could dispose of this case, and I think also the Gonzalez versus Google case, without reaching the Section 230 issue. Is that right, Danielle?
Danielle Citron That’s right. And this might be, to use a big word in my world, a hullabaloo about nothing. And I can’t decide if I want them to make a ruling in this case or not, because what they do could be worse than the status quo, which isn’t great.
Leah Litman Imagine that the Supreme Court making the status quo worse. We’ve never encountered that possibility.
Melissa Murray I could not even imagine that.
Leah Litman Danielle. Danielle, you were supposed to be the optimist.
Danielle Citron Right? Well, I might be. Right, it depends. Maybe as we talk there might be, I think, an optimistic way to think about how they might decide this case. I’m going to pitch how I think they should understand it, and that might be a way to move forward in a Pollyanna-ish...
Melissa Murray The way they won’t decide it.
Danielle Citron Kate... I’m channeling Kate, right? ...way to think about these things.
Leah Litman Okay. So we are basically going to bracket the Anti-Terrorism Act question, because the Section 230 issue is really what I think most people are interested in these cases for, and the issue that has the possibility to really, you know, reshape or affect the Internet more so than the Anti-Terrorism Act question of liability does. Okay. So as promised, Danielle, we briefly covered the companies’ argument for why they think Section 230 doesn’t allow them to be held liable. What is the argument for why Section 230 grants immunity from these suits? You know, they say the social media companies, the service providers, can’t be treated as the publishers or speakers of content generated by someone else, here, ISIS’s videos and posts. Now, courts have reached this conclusion in some interesting ways. So can you explain more about those cases? That is, how it came to be that Section 230 is interpreted to block social media companies from being sued in cases like this one?
Danielle Citron You know, in looking at these issues, courts have kind of quickly and solely focused on Section 230(c)(1). Broadly speaking, from the get-go, post-1996, you know, 1997, courts began to look at the issue and say, listen, if we’re talking about a third party’s information, and the lawsuit has anything to do with third-party information, then we’re going to say you’re immune, even if the liability has very little to do with the third party’s information. We’re going to say, look, we’re not going to treat you as a publisher or speaker, because the information is provided by someone else, and you’re off the hook. So there’s a first move that I think very possibly the court is going to focus on. And I think the answer could be, it arguably is, no: this lawsuit is about what YouTube does. It’s not about what ISIS is saying. What this lawsuit is about, the aiding and abetting, is that YouTube and, you know, Google are using people’s personal data, using their algorithms, to tailor and pitch videos to people. The lawsuit is about precisely that, no matter what the videos say. Some of the messages could be pro-ISIS but in ways that are, you know, skipping down the street holding hands, whatever that may be. The theory of liability focuses on YouTube’s activity.
Melissa Murray Basically, the service providers here are saying that Section 230 immunizes them from liability. But the petitioners would argue: no, we’re not saying that you made the ISIS video. What we are saying is that your algorithms direct certain content, like ISIS videos, to people who would be receptive and amenable to ISIS videos, and thus perhaps more likely to act on them. You do that, right? And you cultivate this environment in which these kinds of videos can take on a new life and be disseminated more easily, and you’re actually helping and facilitating what ISIS does. And that’s what we’re suing over. Is that the argument here?
Danielle Citron That’s precisely right. And it’s not that YouTube is failing to remove or leaving up ISIS videos. That, I think, would in many respects be understood as getting the legal shield, because all you’re doing is, you know, trying to be a Good Samaritan but missing some content, failing to remove some of it. But as you said so well, that’s not what the lawsuit is about. The lawsuit, as petitioners would say, and the government agrees in its amicus brief, is focused instead on YouTube’s own conduct.
Melissa Murray You’re cultivating a market for these offensive videos. That’s right. That’s essentially the theory.
Danielle Citron What’s really important is that YouTube is cultivating it by using its massive reservoirs of personal data. Because they have these massive reservoirs of personal data, they can then, you know, throw their algorithms against it and say: who is most susceptible? Who, given their prior viewing habits, is going to click on that video? Because their entire business model is, like, click and share.
Leah Litman And that’s how YouTube always knows I want to see the Taylor Swift video, the cute dog video, and the drag queen video. Like, those are the three.
Melissa Murray Wait, is that why I’m always getting reruns of Suits?
Danielle Citron You know?
Leah Litman Yes, right. We’ve solved the mystery.
Melissa Murray We’ve solved the mystery.
Danielle Citron Melissa Murray, you know, bestie, there you go. I mean, that’s online behavioral advertising at its finest, right? And so the theory is: it’s what you’re doing. You’re using people’s data. You’re getting to know them really well. You’ve collected massive hoards of personal data, you’ve bought it from advertisers, marketers, data brokers, and you’re using all of that to determine who’s going to, like, click and share, so you can make more money. And that’s the beef that petitioners have with YouTube. Meanwhile, YouTube is saying, oh, it’s just about the ISIS videos, and hey everyone, Section 230 is all about free speech. We could talk about how they’re ignoring the other purposes and findings of Congress at the beginning of the statute. But those are the two drastically different framings that we have here.
Leah Litman [AD].
Leah Litman So, you know, when you were telling the origin story of Section 230, it was clear how the provision was intended to allow content moderation without penalizing, you know, those entities for engaging in content moderation, you know, so that it wouldn’t actually convert them into publishers or speakers. And I guess I have a question. You know, if the plaintiffs’ theory of Section 230 succeeds, and the social media companies are turned into publishers or speakers when they use algorithms, is there some concern or risk about what the Internet would become without algorithms? Or, you know, if Google couldn’t curate search content without incurring liability, or TikTok couldn’t recommend I watch more Taylor Swift and dog videos without incurring liability, like, how are these things going to work?
Danielle Citron Right. You know, it’s true, of course, that our online environment is mediated. Your spellcheck, your search, your, you know, Instagram: it’s mediated by algorithms. But when companies highlight and make money from online activity, then they would operate just like their offline counterparts do. That is, they would face liability if the threat of liability was real and genuine. And of course, (c)(2) would remain, so they could block and filter and curate. They could clean up the Internet without any worry, because Section 230(c)(2) would continue to allow them to do it voluntarily, so long as they did it in good faith. So in some respects we get back to the entire goal of Section 230, which was to incentivize companies to moderate content rather than giving them a free pass for even soliciting, you know, illegality. That would no longer be tenable. And I think it’s a resetting. It could be a resetting, I think, in an effective way.
Leah Litman So it seems like the Internet companies at least are somewhat concerned, just based on the lineup of the briefs from Internet service providers.
Melissa Murray And the like. It’s almost like they have vast stores of money and can just hire all the guns in the world. Almost. Yeah, heavy hitters.
Danielle Citron So those babes in the woods from 1995-96, barely a thought in, you know, Wyden’s and Cox’s eyes, Google and YouTube are now the dominant market players. I’m not even sure they’d heard of Prodigy. By now Prodigy is like MySpace, let’s face it. But now these interactive computer services are effectively behavioral advertisers. They have literally had a free pass. They have been scofflaws. They are allowed to make money from our data, and they have no responsibility for the illegality that they then make money from. It’s like clicking and sharing all over the Internet. So I’m not going to cry a river, I have to say, you know, right, that the five biggest market-cap companies, almost, in the world, might have to internalize some of the costs that right now they externalize.
Leah Litman Yeah, I just want to highlight, so people understand, the big guns they have brought to this fight. You know, on the briefs are Paul Clement and Erin Murphy from Clement & Murphy, Lisa Blatt from Williams & Connolly, Seth Waxman from WilmerHale, Brian Willen at Wilson Sonsini.
Melissa Murray Ted Boutrous at Gibson Dunn.
Leah Litman I mean, the list goes on. You know, I mean.
Melissa Murray If you took out these briefs, the Supreme Court bar could literally not function. Like, all the heavy hitters are on them.
Leah Litman And I have some thoughts about what might happen at the oral argument, given that Lisa Blatt is going to be among the lawyers arguing this case. I mean, Danielle, you said the origins of Section 230 were in trying to clean up the Internet from porn. No chance, no chance, we go through an entire oral argument without mentions of porn and who knows what else. Can’t wait. Yeah. Oh, yeah. It’s going to be lit.
Melissa Murray Like, Santa goes into a triple-X porn video store, that’ll be on the Internet. Exactly. So when we previewed this case earlier in the year, we said it had real start-the-rapture energy. And I think part of that is because of one of my favorite justices, Justice Clarence Thomas. Right? I mean, I think he’s going to be all over this, because, one, he has been dying to claw back the scope of Section 230, in large part because there is this ongoing conservative narrative about how Section 230 allows these platforms, these liberal platforms, to censor conservatives from being free in their speech. So that’s one part of the beef here, and I think you will see a lot of action around that. But I also think that this could be part of Justice Thomas’s longstanding effort to undermine, claw back, dismantle the New York Times versus Sullivan regime and its protections for free speech and for journalism more generally. So I think there’s a lot going on here, and Leah is exactly right: this is going to be literally an unhinged oral argument that goes in all kinds of different directions.
Leah Litman This seems to totally underscore, Melissa, like your characterizations of this court as a Thomas court and basically encouraging us all to go back and look at what that guy was writing ten years ago, because that’s going to be the law very soon.
Melissa Murray Clarence Thomas’ burn book.
Leah Litman Yeah, exactly. Facebook is a fugly slut, and that’s what’s written there. I saw it. No, I didn’t. But if he rolls back New York Times versus Sullivan, maybe you can sue me for saying it. I don’t know. Anyways, okay. But, you know, just an example of this, I think, is his take on New York Times versus Sullivan, which is the set of rules that basically insulates media companies from, you know, defamation liability concerning public officials if they, you know, make reasonable mistakes. That was because...
Melissa Murray Well, if the media companies make reasonable mistakes in their reporting about public figures, that’s part of the coverage.
Leah Litman Right?
Melissa Murray That’s all there. Yeah.
Leah Litman But Justice Thomas thinks that media companies maybe should be liable for false statements, you know, even if the media companies took reasonable precautions to guard against that risk. And Justice Thomas, again, has been calling for this, and this movement has now gained some real traction in conservative circles, you know, the set-aside-Sullivan movement. The latest person to jump on this bandwagon is Florida Governor Ron DeSantis. So, again, Justice Thomas is just a trendsetter.
Melissa Murray So this is all to say that Section 230 is actually really interesting in that it is both a target of the left and a target of the right. So conservatives are apoplectic about what they perceive as social media platforms’ censorship of conservatives, like Donald Trump, for example, who was famously booted from Twitter in the wake of the January 6th insurrection. But lefties are also concerned about what they see as the proliferation of hate speech and white nationalism that fuels violence and radicalization. And if you want an example of this, just look at the way progressives have been talking about the new Elon Musk era of Twitter, where content moderation seems to be nonexistent. So there are good arguments, I guess, on both sides, and bad arguments on both sides. But this is in the crosshairs of both the right and the left, which is an unusual posture for a case like this. And it means that we’re not really going to know where the justices are going to come out, or what kinds of strange-bedfellow coalitions we’re going to see on this.
Leah Litman You know, though, that Justice Thomas and Justice Alito are just loving the idea of a Section 230 case that involves ISIS, because they’re going to be like, oh, you censor conservatives, but you don’t censor ISIS? Like, how dare you?
Melissa Murray Here’s what I also mean about the strange-bedfellows aspect. I mean, those guys aren’t strange bedfellows. We knew what they likely would say, and we know how terrorism might play into it. But just imagine the prospect of an issue where Danielle’s Cyber Civil Rights Initiative is filing a brief that’s on the same side as a brief filed by none other than Missouri senator and cross-country runner Josh Hawley. Like, wow, that’s mind-blowing. Here’s another: the briefs supporting neither party, but that nonetheless advance arguments for different limits on Section 230, come from the Lawyers’ Committee for Civil Rights, the Giffords center (that’s Gabrielle Giffords), and the gentleman from Cancún, Ted Cruz. It’s just a wild assortment of people lined up on various parts of these issues. Again, lots of strange-bedfellows energy here.
Danielle Citron Yeah, and like a lot of incoherence, too. Because, on the one hand, and can we go back to Justice Thomas? I think you’ll both enjoy this. On the one hand, he wrote a statement respecting the denial of cert in Malwarebytes versus Enigma, in which he focuses on how Section 230(c)(1), and he’s taking my position here, is too broadly interpreted: it applies to everything and anything that involves ones and zeroes, and it’s not narrowly focused on defamation. It’s just a free pass, and he doesn’t like that. And at the same time, he wants to rip down the edifice of New York Times versus Sullivan, and that makes sense together, because the idea is there should be more liability for harmful speech online, and platforms enabling and facilitating that tortious behavior should pay for it. But then, on the other hand, my understanding is that Thomas is also interested in ensuring that these platforms host all speech. That is, he’s sort of in support of some of those insane state laws that require companies, almost treating them as public utilities, to host, like, a firehose of speech. That, to me, makes zero sense. I don’t know how Thomas lives in that land and holds all three ideas in his head.
Melissa Murray So a lot of don’t censor conservatives and don’t print stuff about Ginni’s text messages. I mean, it’s all right there.
Danielle Citron But then you have to host her text messages, if you treat them as public utilities. See what I’m saying? So I’m just befuddled. No.
Leah Litman Yeah. I mean, it strikes me that people are almost too quick to say this case brings together both sides, because, you know, the different parties have different perspectives and interests. Some of the complaints are about too much moderation, right? Like, that’s the Ted Cruz brief and a conservative grievance narrative about how big tech censors conservatives. Whereas other briefs, even though they’re formally filed in favor of the same result here, are saying there isn’t enough moderation. Like, the Lawyers’ Committee for Civil Rights says Section 230 shouldn’t apply to civil rights violations or other illegal conduct like discrimination.
Melissa Murray But Leah watch that nuance completely fall out.
Leah Litman Oh, of course it will. Of course it will. Justice Alito is going to be like the Lawyers Committee for Civil Rights is telling me I must kill Section 230. So
Danielle Citron There you go. So he’s onboard?
Melissa Murray For civil rights?
Leah Litman Woke-lito.
Danielle Citron You know, but what’s also interesting, just to pick up on that thread, is that conservatives argue that they’re being silenced, and there’s no empirical proof of such a thing having happened, but.
Leah Litman Never, ever let facts get in the way of a good time at the Supreme Court, Danielle.
Danielle Citron Okay. No, no, no, that’s totally fair. But here’s the thing: the provision of Section 230 at issue in this case is (c)(1), not (c)(2). So if the purpose of the statute, and we know it is, let’s go back to the title, right, is protecting private filtering and blocking of offensive speech, then the conservative claim is really that Section 230 should be repealed, because Section 230(c)(2) is going to stay. Right? (c)(2) says you’re immune for taking down, in good faith, speech you find lewd, filthy, harassing. Sorry, I’m using Congress’s words. Right, you’re like, Danielle, what do you mean, lewd, filthy, you know, dirty? Right, objectionable content. That is what Congress says in (c)(2).
Leah Litman But here’s the thing, Danielle. Those are words that Congress used. Those are actual legal arguments. But earlier, you were telling me that this is a Good Samaritan provision. And I’m pretty sure Ted Cruz and Josh Hawley are going to say they’re Good Samaritans, and so is Donald Trump and Marjorie Taylor Greene, and therefore, by logic, Section 230 protects them. I mean.
Danielle Citron But then they’re protected, I guess. Trump has Truth Social, right? So if he’s taking down posts and saying he’s doing it voluntarily and in good faith because it’s objectionable speech, then he can do it.
Melissa Murray So we’ve talked about some of the concerns of the Internet companies. We’ve talked about the strange-bedfellows-ness of the various coalitions, although I think Leah makes a great point that there are some really important distinctions to be made, even among those individuals who are lining up on the same side. Danielle, I want to talk a little bit about a different set of concerns. Your comments allude to some of the arguments that people challenging the scope of Section 230 immunity might make, and I think it’d be useful to hear a little bit more about those debates. You talk about some of that in your book, The Fight for Privacy, where you discuss how judicial decisions regarding Section 230 have created the conditions that allow websites to basically not take any responsibility for their users’ actions, and indeed even create incentives not to protect the privacy of individuals. So can you say a little bit more about that? And, you know, this is a perfect time to say something about the book as well.
Danielle Citron So, you know, thanks to Section 230. I’m going to talk about some specific cases, but I thought I would just sort of lay the groundwork first. Not only does Section 230 ensure that websites don’t have to take responsibility for their users’ activity, but in fact it allows them to solicit, encourage, or keep up illegality that involves intimate privacy violations. And specifically, I’m talking about the nonconsensual taping, sharing, and manufacture of images of people engaged in sex or nude. There are 9,500 sites whose raison d’être is intimate abuse. And they essentially can make money, they have subscribers, and they internalize zero of the costs that they externalize onto all these people. And 98% of all the folks on these 9,500 websites are women, women of color, LGBTQ individuals. Right? So they’re externalizing all these harms. And there’s nothing that victims, you know, individuals whose nude photos are posted online, can do about it. They can’t sue the website operator, because if they sue, the operator is treated as a publisher or speaker of content provided by someone else. So I think it’s worth talking about a specific case, just to give you a sense of the kinds of activity involved. Unlike those sites that make money off non-consensual intimate imagery, there are dating websites like Grindr that have engaged in basically no content moderation. And so Matthew Herrick, living in New York City, he and an ex break up, and the ex begins to impersonate him on Grindr. And what the ex does is send men to his house, to his workplace. And he tells these men that Herrick wants to have anonymous sex. So men come to his doorstep. Over a thousand men within a twelve-month period came to his door, night and day, saying, You told me you want to have sex. And when he explains, It’s not me, it’s my ex impersonating me, they get mad, and he’s terrified. He has to move.
He tries to get an order of protection, and essentially the courts don’t act. It takes about eight or nine months for the courts to do anything about it, that is, to get an order of protection against this ex. Grindr does nothing. Grindr never even responded to the emails, except with a pro forma thank-you-for-your-email. The federal district court, the SDNY, found that Grindr would be treated as a publisher or speaker of the impersonator’s speech, when in fact Herrick’s theory, the plaintiff’s theory, was that the problem was the defective design of the website: knowing with total certainty that people were going to use it to impersonate others and violate intimate privacy, they did nothing. They didn’t redesign the site. Right? They didn’t respond. And so it went up on appeal to the Second Circuit, which also found that Section 230 immunizes Grindr, even for a lawsuit which at the heart of it wasn’t about what was said in the impostor’s posts, but rather about how Grindr designed its site. Every single other gay dating website allows you to block IP addresses to ensure that you protect people from harmful impersonators.
Leah Litman So talking about the harms associated with the Internet and social media companies provides a perfect transition to talking generally about, you know, your book, The Fight for Privacy. So maybe we can just shift to doing that now. So in the book, you say, quote, Civil rights are considered fundamental because they enable us to flourish as whole individuals and active members of society. And you emphasize that not only should intimate privacy be used to fight against discrimination, but that intimate privacy is a civil right. So why did this become the foundation of your book, and how did you get to understanding it this way?
Danielle Citron Let me define intimate privacy for us first, because, you know, there’s all sorts of privacy that we do and should care about. But by my lights, intimate privacy is a foundational value, and it deserves protection as a moral right, as a human right, and as a civil right. So intimate privacy refers to how we manage the boundaries around our bodies, our health, our innermost thoughts, which we frankly document every second we search, we browse, we share, we text, we communicate, our sexual orientation, our gender, our sexual activity, and our close relationships. And we all need intimate privacy to flourish. Intimate privacy is what allows us to welcome people into our lives, and companies that ferry our communications, so that we can go backstage and figure out who we are. And it really matters for social esteem, because you want to be seen as a fully integrated, whole person. I don’t want to just be a fragment of myself. Right? If there’s a photo of my genitals online with my name, all that everyone is going to see is just me naked, right? We’re going to see me as an object, not as a subject, not as a person with autonomy. And critically, we all need intimate privacy to form friendships and love relationships. How do we get to know people? We unpeel the layers, right? We share stories and activities, experiences, dreams, and hopes. Without it, we can’t flourish. We can’t fall in love, develop relationships, enjoy self and social esteem, autonomy. And we know who’s denied intimate privacy more often, by corporate surveillance, individual privacy invaders, and government: it’s women and minorities.
Melissa Murray There’s this really powerful moment in the book where you point out that people’s most vulnerable, intimate, private moments are not necessarily protected by law, but the right to privacy throughout history has been invoked in ways that may actually destroy other people’s privacy. So.
Danielle Citron That’s right.
Melissa Murray Can you expand on that comparison a bit and how it’s evolved in the way that we’re seeing today?
Danielle Citron Right. So in the 18th and 19th centuries, even into the 20th century, of course, what society would see is only the privacy of the privileged. So, you know, consider the privacy of the home. That concept of family privacy was invoked to prevent, you know, raising the curtains on the home. And so we couldn’t, you know, arrest abusers, male abusers. What courts would say in these decisions from the 1890s was that we couldn’t arrest people because the privacy of the home, family privacy, was what mattered. Whose privacy were we talking about? We were talking about the privacy of the man, or not privacy so much as concealment of crimes, concealment of domestic abuse. And we never asked, what about the privacy of the woman? And usually, think of middle-class white women, they had no privacy in the home. Right? They were at the bidding of the husband, children around them. They probably had no privacy within those four corners, and we never considered it. And at the same time, Black women had no privacy at all. Right? Pre-Civil War, enslaved individuals’ bodies weren’t their own. And so, because privacy worked that way in the past, privacy is a concept we should be suspicious of, as Kitty MacKinnon long has been, because she thinks it’s only going to operate in ways that are equality-undermining.
Melissa Murray And consolidate power in those who have already had it, right?
Danielle Citron But, you know, we all deserve intimate privacy, and especially those who have been denied it. We need to see their privacy interests in full view. In those cases about family privacy, where we said we can’t get into the home and arrest you, if we had really looked at those cases, we would have looked at the woman, the person who was being abused, and her right to privacy, and we would have seen her.
Melissa Murray Schneider, the dark side of privacy. Griswold versus Connecticut is not the mammoth achievement we thought it was. No, it’s really interesting, and again, thinking about it in the context of these cases is interesting as well. I mean, privacy can cut both ways, and I think we’re seeing that here.
Leah Litman And speaking of, you know, privacy cutting both ways and the concept of privacy itself being used to take away other people’s privacy. Another kind of similar theme you identify in the book is how, you know, free speech has also been a concept that has been used to take away privacy, you know, from historically excluded, subordinated or marginalized groups. So can you also elaborate on this dynamic, the push and pull between privacy and free speech?
Danielle Citron Right. So, the ways in which we see intimate privacy and free speech being used to reinforce power and privilege at every step. I used to joke, or maybe I still joke, that the First Amendment is like a soul-sucking virus. Whenever we invoke it, it eats up the whole room. And that’s been certainly true in discussions and debates about rape videos on Pornhub. In response to videos that showed people, even people who are under 18, being raped, the response at Pornhub was that it’s free speech. That seems baffling to me when what we’re talking about is coerced sexual expression and coerced sexual activity, which has no value to self-governance. We’re not going to figure out how to live in the world, culturally, politically, socially, from it. It is the taping of a crime. The fact that Pornhub felt comfortable saying publicly that this is free speech, that this is somehow like kink, is actually mind-blowing. That’s where the First Amendment has been turned into a free pass for the powerful: everything is speech. And in some ways that’s the 230 debate as well. But of course the book is also about corporate surveillance of intimate life, and then corporate surveillance as a handmaiden to government, and government itself as an intimate privacy invader. So, you know, we talked about Grindr and its failure to design its site to protect individuals. Grindr, of course, on its profile collects, you know, what tribe you’re in and your sexual preferences. It stores nude photos, and it also encourages people to include their HIV status. Now, if you look at Grindr’s terms of service, they’re selling all of that information to advertisers and marketers, and in turn it’s being sold to data brokers. And when The Wall Street Journal did a story about those data practices, so many people were like, I’m taking my HIV status off my profile, it’s only going to hurt me. It’s a one-way ratchet to discrimination.
So I wanted to make sure to say, and of course, post-Dobbs, those reservoirs of intimate data now feed law enforcement, which already buys access to data brokers, federal, state, and local. And all of those data brokers have, in their profiles, information about individuals’ abortions, their miscarriages, like granular profiles about us: our dating habits, our period-tracking-app information, our location data. So if you’ve gone to a clinic, and then you go to CVS a week later and you get your tampons, all of that is helpful circumstantial evidence in cases where we’re trying to prosecute a provider, or, in states where you can, prosecute the person getting the abortion or pursue civil penalties. And so, my word, it got worse, I have to say.
Leah Litman So with a nod to the wide ranging and expansive nature of the book, which everyone should check out, that is probably all the time we have for this particular segment. So, Danielle, thank you so much for joining us. And listeners, please be sure to check out her book, The Fight for Privacy. Thanks again, Danielle.
Danielle Citron Thank you so much for having me. And it’s wonderful to be with you both.
Leah Litman [AD].
Leah Litman So these are the only cases that the court is going to hear the first week of the February argument session. It will be hearing some additional cases, including the major student loan cases, the second week of the February argument session, and we will preview those cases on the next episode.
Melissa Murray But right now, we want to just tell you to gird your loins because we are going to get opinions this week. So on Wednesday, the court is going to issue some opinions and we don’t know what opinions we’re going to get. But we know that with this court, it’s worth bracing for impact. So just prepare yourself.
Leah Litman We also wanted to cover some court culture because in the last month we have learned some additional things.
Melissa Murray So I’ll start. CNN reported that Jane Roberts, the wife of Chief Justice John Roberts, is a legal recruiter. That wasn’t news to anyone; we knew that. But what was news was that Mrs. Roberts, in her role as legal recruiter, often places lawyers at law firms that have active Supreme Court practices and present cases before the court, and that this bit of news wasn’t necessarily disclosed in previous filings. And, you know, the interwebs were making quite a lot of this a couple of weeks ago. And it was just sort of like, is this, I mean, is she really the Ginni Thomas of the group? The Martha-Ann Alito? I don’t know. I don’t know.
Leah Litman I don’t think so. I mean, I think it was probably some type of unforced error not to disclose this in some capacity. But the idea that she is helping lawyers get jobs at these humongous, like, literally humongous law firms seems pretty different to me, as far as likelihood of bias in a case, from, say, your spouse actively pushing for a position that’s being litigated at the court, just hypothetically. Although, of course, this does, you know, highlight the smaller professional and social milieu of the court. But again, I just don’t. Yeah.
Melissa Murray I also think, again, it can’t be the case that your partner becomes a justice of the Supreme Court and you literally have to live under a rock. I mean, I think we all should aspire to be Cecilia Suyat Marshall, who was very careful and didn’t socialize with people who were going to be before the court. But I mean, it’s D.C. If you’re a lawyer, you can either be in the government, which is unlikely because of who your husband is, or you can be at a law firm, which, you know, maybe that’s incompatible with the fact that your husband’s the chief justice and you have kids, or not. Who knows? But it seems like being a legal recruiter and placing people in these positions is not that far of a stretch. But yes, do disclose it. We’d like to know.
Leah Litman Yeah. So other pieces of news, we got a little bit more drip, drip, drip about that leak investigation, didn’t we?
Melissa Murray Okay, I’m just going to say: did I not call this one? Because when they started talking about how Michael Chertoff, the former secretary of Homeland Security, was called in to review the Marshal’s leak investigation, I was like, well, that’s weird, when he’s an actual professional at security. Maybe he could have done his own independent investigation, and that might have been better than the Marshal’s investigation, not to cast aspersions on the Marshal, but come on. And now we’re finding out that Chertoff was paid and has had an existing relationship with the court to provide security services. So he might have actually been in a really good position to do an independent investigation, but it really does seem to raise questions about how independent his separate review of the Marshal’s investigation was. I mean, this is, again, just come on, guys, get it together.
Leah Litman I mean, look, he could have called all of the clerks into a room and asked them, did you do it?
Melissa Murray You do it, you know.
Leah Litman On the other hand, he’s like, the marshal did that for me. So this all seems great. Yeah. PERF.
Melissa Murray No notes.
Leah Litman No notes.
Melissa Murray No notes.
Leah Litman No notes. Um. Some additional reporting by CNN, The Washington Post, as well as The New York Times. Also, again, kind of focusing on the Supreme Court’s investigation into the Dobbs leak has revealed that, among other things, the justices used their personal e-mail for work business, and no one felt they could say anything about that, since they’re all justices of the Supreme Court.
Melissa Murray I’m just going to say what I know everyone is thinking. But her fucking emails.
Leah Litman Oh, my God.
Melissa Murray What the actual fuck. Where is Jim Comey when you need him?
Leah Litman Where is his press conference? I at least expected some tweets.
Melissa Murray I just can’t even imagine. Like, you know, the chief at gmail.com. Let me send some stuff to myself at home.
Leah Litman Yeah, it’s very heartening to me that these are the justices deciding the major technology cases, like the Section 230 cases, who continue to use their personal email for work, because transitioning to a work email server is just a little bit hard.
Melissa Murray I’m still back on “the printers are not networked.”
Leah Litman I love it. I love.
Melissa Murray It. I mean, yeah. Also, there was reporting on the burn bags. So burn bags are these large receptacles where you put sensitive material, and the idea is that they’re going to be taken offsite to be either burned, hence the term burn bag, or otherwise shredded, so that their contents cannot later be known. And it turns out the Supreme Court does use burn bags. So, check. But, and it’s a big but, it takes these burn bags, in which it places sensitive items like draft opinions, if you will, and leaves them out for a long time. So these are technically not really burn bags but sort of smoldering-ember bags, allowed to lie around for apparently long periods of time before they’re actually taken away so the sensitive materials contained within can be destroyed. So I’m going to say security is looking pretty great here.
Leah Litman You know, Michael Chertoff looked at this and said, you’re doing amazing, sweetie. You’re doing amazing.
Melissa Murray You joke. But I have to say, I bet Kris Jenner would clean this shit up.
Leah Litman Oh, yeah. The momager would not stand for this. Kind of like nothing leaks.
Melissa Murray Unless I want to. Exactly. Exactly right. Yeah, exactly. Yeah. Oh.
Leah Litman Anyway, okay, so one additional thing we wanted to note, kind of part of the “will the court break the Internet” theme of this episode: the court is currently discussing a cert petition in a case we have briefly alluded to before, called Novak versus City of Parma. This is a case where some police officers arrested someone and put him in jail for making a Facebook page parodying the police department. This is like the stuff of Sam Alito’s dreams. He’s like, Can I put everyone in jail for making fun of me?
Melissa Murray I hope we get to be cellmates.
Leah Litman Oh, yeah. After my acceptance speech, you know, I’m definitely going to make the list. Okay, so anyway, as I was saying, the police officers arrested someone and put him in jail for making a funny on the Internet. And the U.S. Court of Appeals for the Sixth Circuit said the officers, who were, of course, sued for arresting someone for making a joke, the Sixth Circuit said, you actually can’t sue the officers. They are entitled to qualified immunity, because the parody page didn’t have, like, a big caption or warning on it that said “parody,” and therefore the officers didn’t realize they couldn’t arrest this guy for making fun of them.
Melissa Murray And the creator of the page is like, Dude, I thought all of the pictures were the tip-off that it was a parody. So anyway, we had previously discussed.
Leah Litman This case, because, again, we just referred to the then-filed amicus brief by The Onion. And if you didn’t go check out the brief then, listeners, do.
Melissa Murray So you missed out.
Leah Litman Do so, now.
Melissa Murray This brief.
Leah Litman It is so.
Melissa Murray It is the best, the absolute best.
Leah Litman It is so funny. I mean, the table of contents alone is funny.
Melissa Murray Hilarious.
Leah Litman The introduction, the statement of their interest in the case is hilarious.
Melissa Murray It should have its own Netflix special. That’s how funny it is. We read a lot of these briefs, and I’m going to say they’re not all great. No, this brief is, like, 100% a banger.
Leah Litman And I just have to play a few of the hits, because it’s that good. Okay, go. So this is from the section of the brief where you’re just supposed to say, I’m a funny news organization, right, that often writes parody, and therefore I have an interest in free speech. This is The Onion’s take on the interest-of-the-amicus section: Rising from its humble beginnings as a print newspaper in 1756, The Onion now enjoys a daily readership of 4.3 trillion and has grown into the single most powerful and influential organization in human history. The Onion’s keen, fact-driven reportage has been cited favorably by one or more local courts, as well as Iran and the Chinese state-run media. I mean, it’s just this amazing brief that goes in and out of different voices, like at some points engaging in parody, at other points walking the reader through a description. It’s just so amazing.
Melissa Murray Here’s another part: the introduction and summary of argument. This is usually the most anodyne part of a brief; they just tell you what the argument is. This is what The Onion says: Americans can be put in jail for poking fun at the government? This was a surprise to America’s Finest News Source and an uncomfortable learning experience for its editorial offices. I mean, the whole thing is hilarious and really well done.
Leah Litman It’s extremely well written. It’s exceptionally well done. I mean, it perfectly demonstrates why you can’t have parody without some deception without pulling one over the audience. It’s so good. It’s just incredible.
Melissa Murray The conclusion the petition for certiorari should be granted, the rights of the people vindicated, and various historical wrongs remedied. The Onion would welcome any one of the three, particularly the first.
Leah Litman You know, co-host privilege. I’m just going to note two other things about the brief super quickly. The beginning of the argument section reads as follows: Tu stultus es. You are dumb. And then it proceeds to quote some Latin phrases, and it says, The Onion’s motto is central to this brief for two important reasons. First, it’s Latin, and The Onion knows that the federal judiciary is staffed entirely by total Latin dorks. They sweetly whisper, quote, stare decisis into their spouses’ ears. I died. I died. I mean.
Melissa Murray That was very funny. Okay, co-host privilege, here is my last one, and then we really are going to end this episode. Okay, this is a footnote: The Onion’s journalists have garnered a sterling reputation for accurately forecasting future events. One such coup, I like how they used the word coup, was The Onion’s scoop revealing that a former president kept nuclear secrets strewn around his beach home’s basement three years before it even happened. Footnote two: see “Mar-a-Lago Assistant Manager Wondering If Anyone Coming to Collect Nuclear Briefcase,” The Onion, March 27, 2017.
Leah Litman It’s. Again
Melissa Murray So good.
Leah Litman Do yourself a favor. We don’t often get to just laugh.
Melissa Murray I mean.
Leah Litman When we’re you know, thinking about the Supreme Court and this brief will give you an occasion to.
Melissa Murray I just want to note: the counsel of record here is one Steven J. van Stempvoort of Miller Johnson. Also on the brief is D. Andrew Portinga.
Leah Litman A michigan law alum. Go Blue.
Melissa Murray Is this Michigan law alum actually writing these jokes or are they getting an assist from the client here?
Leah Litman You know, attorney client privilege? I don’t know.
Melissa Murray Crime fraud.
Leah Litman There has been a murder here, and it is the lower court’s reasoning.
Melissa Murray Anyway, open invitation, fellows, to come on the pod to talk about this. Bring your clients with you.
Leah Litman We will have a good time.
Melissa Murray This brief was a banger.
Leah Litman And Lisa Blatt is going to take that energy and put it into the Section 230 argument. So, listeners, you’re in for a treat.
Melissa Murray All right, listeners, that’s all we have for you. Thanks so much for listening today. And many thanks to Danielle Citron for joining us. Strict Scrutiny is a crooked media production hosted and executive produced by Leah Litman, me, Melissa Murray, and Kate Shaw, who was not here today. That’s why this was so out of control. It is produced and edited by Melody Rowell, Audio Engineering by Kyle Seglin and Music by Eddie Cooper with production support from Ashley Mizuho, Michael Martinez, Sandy Girard and Ari Schwartz, and digital support from Amelia Montooth. See you later.