Lawfare Daily: Catching Up on the State of Platform Governance: Zuckerberg, Durov, and Musk
Published by The Lawfare Institute
in Cooperation With Brookings
It’s been a busy week in the world of social media and technology platforms. Meta CEO Mark Zuckerberg sent an odd letter to the House Judiciary Committee apparently disclaiming some of his company’s past content moderation efforts. Telegram founder Pavel Durov was arrested in France on a wide range of charges involving an investigation into the misuse of his platform. And Elon Musk is engaged in an ongoing battle with Brazilian courts, which have banned access to Twitter (now X) in the country after Musk refused to abide by court orders.
These three news stories speak to a common theme: the difficult and uncertain relationship between tech platforms and the governments that regulate them. To make sense of it all, Quinta Jurecic, a senior editor at Lawfare, sat down with Matt Perault, the director of the Center on Technology Policy at the University of North Carolina at Chapel Hill, and Renée DiResta, author of the new book, “Invisible Rulers: The People Who Turn Lies Into Reality,” and the former technical research manager at the Stanford Internet Observatory.
To receive ad-free podcasts, become a Lawfare Material Supporter at www.patreon.com/lawfare. You can also support Lawfare by making a one-time donation at https://
Please note that the transcript was auto-generated and may contain errors.
Transcript
[Introduction]
Renée DiResta: One of the things that's listed in the charging documents is that he is choosing not to comply with these investigations into this crime and that's what they go after Durov for.
Quinta Jurecic: It's the Lawfare Podcast. I'm Quinta Jurecic, a senior editor at Lawfare with Matt Perault, the director of the Center on Technology Policy at the University of North Carolina at Chapel Hill, and Renée DiResta, author of the new book, Invisible Rulers: The People Who Turn Lies Into Reality. Renée is also the former research manager of the Stanford Internet Observatory.
Matt Perault: Here we have real consequences, and I don't think we want a world where a private entity complies with every foreign government's domestic legal regime. That would mean that every company should try to be operational in China and be compliant with Chinese law or Russian law. I don't think we want that world.
Quinta Jurecic: Today, the three of us checked in on the state of platform governance, examining a trio of news stories that touch on the tensions that arise around regulation of tech companies, or the lack of regulation. First, a puzzling letter from Mark Zuckerberg to the House Judiciary Committee. Second, the arrest of Telegram founder Pavel Durov in France. And third, the ongoing legal battle between Elon Musk and the Brazilian courts over the status of Twitter, also known as X.
[Main Podcast]
We're sitting down together virtually at a moment when there has been a fair amount of hubbub in the world of social media platform governance, particularly in terms of the relationships between major platforms and governments themselves. In the U.S., obviously, we're hurtling toward an election in November, which is always a particularly chaotic period, and there are a lot of different tensions that I think are really in the process of sort of bursting to the surface right now about how these major platforms should be relating to governments, how we think about platform responsibilities to remove content, and, of course, the ever-present question of free speech.
So there are a lot of different stories and angles to address. But to start off, I want to talk about a letter that Mark Zuckerberg sent to the House Judiciary Committee, which I think illuminates some of the different pressures that platforms are facing right now. The core of it is essentially Zuckerberg apologizing to the GOP majority on the Judiciary Committee for two very big-picture platform governance decisions that Meta made in 2020 and 2021: first, temporarily demoting the Hunter Biden laptop story during the 2020 election, and then also removing what Zuckerberg somewhat elliptically describes as COVID-related content in 2021. Renée, I want to turn first to you. Can you just give us some context on what exactly is going on here?
Renée DiResta: So there's a lot of interesting theorizing about this letter. The letter went to Jim Jordan, who has been holding a series of inquiries since January of 2023 in which he has made the argument that there is a vast censorship-industrial complex, and who has, as part of that, sent out, I think, upwards of 91 subpoenas at this point, in addition to a couple hundred more letters.
And in that lettering and subpoenaing process, he's been soliciting documents from academic institutions, including mine; the tech platforms; other academic researchers; civil society organizations; and, I think, we've expanded now into advertisers. And in the context of creating this, you know, this perception that there's a vast censorship-industrial complex, he did send a letter to Meta. And this appears to be a response. It's unclear if there were specific questions asked or if this is something that Zuckerberg sort of proactively sent back. And in this letter, he's referencing some of the very, very big themes that Jordan has harped on in his, you know, sort of innumerable hearings about this topic.
One of them is the jawboning dynamic; there was a court case about this, Murthy v. Missouri, that I imagine we'll talk about, and that was related to purported censorship of COVID- and election-related content. So he references that in one paragraph, the COVID stuff. He references the Hunter Biden laptop scenario, saying that the FBI warned Meta about a potential influence operation, a Russian disinformation operation related to the Biden family. The Hunter Biden laptop did not turn out to be a Russian disinformation effort, but it's also not clear that the FBI said that the Hunter Biden laptop was a Russian disinformation effort. These two things are actually separate, and you see that in the way that he threads the needle in this letter.
And then finally, the third point is Zuckerberg's own personal political donations made through the Chan Zuckerberg Initiative. And those were nonpartisan election infrastructure donations. This of course comes to be called Zuckerbucks by the House Republicans, who are very, very angry about it. They spin up an entire conspiracy theory about whether it was partisan. And you see Zuckerberg saying, you know, it was not intended to be partisan, I've seen no evidence that it had a partisan effect; nonetheless, I'm very, very sorry that I did it. So there's a little bit of groveling, actually, in that passage of the letter that I thought was very interesting. So it's a letter that basically touches on, you know, three of the boogeymen, three of the shibboleths that this committee has pursued for a very long time.
Quinta Jurecic: Matt, how did you read the letter?
Matt Perault: So I think I read it in two different ways. First, I just read the letter, and I actually think that just reading the text, there's a lot of it that makes sense to me. I've written several pieces for Lawfare and for the Knight First Amendment Institute about jawboning, I think in part because when I was at Facebook, being jawboned was such a scarring experience. It was ubiquitous. It happened at the federal level. It happened at the international level. It happened at the state level. It happened from members of both parties. It was kind of a daily part of my life in tech policy.
Quinta Jurecic: And actually, maybe we should back up to explain what we mean by jawboning, because I feel like the term has become kind of ubiquitous, but it's not necessarily obvious what it means.
Matt Perault: Yeah. So I think there's a strict legal definition, which is when a government entity coerces a private entity into changing an approach to a speech-related issue. There's a sort of more generic one, I think, which is maybe where the letter sits. There's no accusation of coercion in the letter, which means that there's no accusation that the government practices here violated the First Amendment. But even if something is not necessarily unconstitutional, it can still be problematic, and I think, in the letter, Mark Zuckerberg is kind of making the point that he feels this kind of government pressure to be inappropriate. And I agree with that point, and a lot of the pieces that I've written, I think, are in that spirit. Regardless of what the Supreme Court decides on jawboning, what's unconstitutional and what's not, there is a way that government actors behave in putting pressure on private entities that I think is problematic. It leads to suboptimal decisions. It oversteps what the government's role should be. And so in this letter, I think Mark Zuckerberg, who is someone who has been jawboned, is giving voice to some of what's problematic about that experience and also saying very clearly, we're going to try to be less responsive in the face of this kind of government pressure in the future.
Of course, the text of the letter is not the entire thing. It occurs, as Renée is describing, in this much broader context: we're in an election season right now, and this is in response to a process that has had a particular political valence from the House. And so I was somewhat confused, and I'm curious about what you guys think, about the timing and the tone. There are lines in the letter that I think are big picture, but I don't think it's what you would write if you were writing, like, Mark Zuckerberg's treatise on jawboning. It responds to specific things. And again, the specific incidents described have a particular political orientation.
I've seen some people speculate, like, I think, Katie Harbath, who had a very good analysis of this, I thought, that maybe he wrote the letter to try to get out of testifying in the fall.
Renée DiResta: Yeah.
Matt Perault: I've heard people say maybe this is part of a deal to not bring KOSA, which is a child safety bill, to the floor of the House. And so the kind of odd thing about that dynamic is that, if there were deals made behind the scenes to solicit this kind of speech on this kind of speech issue, in response to what was an implied or explicit coercive threat around some kind of retribution against Mark Zuckerberg or against Meta, that in itself is jawboning. And so I think the irony here, which I actually am sort of embarrassed I haven't given enough attention to in my own work, is that one of the people who's been leading the fight around jawboning issues is Jim Jordan, who is actually a government official and therefore is barred by the First Amendment from engaging in jawboning. So what we are seeing here may actually be evidence of that: some behind-the-scenes coercive activity that in itself, in theory, should be considered unconstitutional.
Renée DiResta: Yeah, I completely agree. The brutal irony of that was, I think, undercovered, underdiscussed in the coverage of the letter.
Quinta Jurecic: There's also, I don't want to get too speculative here, but I did find it striking that I think only a handful of days after this letter came out, Politico ran a story about a coffee table book that Trump is releasing soon that includes an allegation that Zuckerberg plotted against him in 2020 and that he would “spend the rest of his life in prison” if he did it again. So I did wonder to what extent this is an instance of Zuckerberg trying to kind of triangulate against a potential Trump victory in November. Of course, the strangeness of that is that the letter came out at a time when Trump looks very bad in the polls.
But to zoom out a little bit, this does maybe point to how platforms are arguably in a pretty difficult position right now. You know, we're heading toward the election. There are a lot of concerns about all kinds of election propaganda. There was a story just this morning about a potential Chinese influence operation trying to stir up American tensions in advance of the election. In 2020, platforms were pretty aggressive about trying to shut that kind of thing down. Now we have one political party that is quite aggressive about, you know, pushing jawboning to prevent things from being taken down, and so the platforms arguably find themselves in a pretty difficult political dynamic, even apart from the, you know, questionable wisdom of threatening to jail a political opponent, which we'll table just for a minute, but I want to make sure that we don't lose sight of. I mean, I'm curious what your thoughts are about that, of sort of where the major platforms stand in this kind of political bind right now going into a tense period.
Matt Perault: I just think it's unbelievably challenging. Like, I think a lot of what links the news stories in the content moderation discussion that we're going to have today is what happens in the absence of governance, because we have been expecting there to be governance on a lot of these issues for a period of time. We expected governance on the relationship between algorithms and Section 230; the Supreme Court punted on that for reasons I think are understandable, but we don't have guidance. We were expecting guidance on jawboning, and the Supreme Court, in the case related to social media companies and government officials, punted, and so we don't have governance on that.
And we have some guidance from the court on the First Amendment rights of platforms, but again, a lot of that is going to be decided on remand; the Supreme Court passed on really providing, at least as part of its holding, really clear guidance. And so that leaves tech companies and government officials in the place they've been in for a long time, which is just trying to muscle it out. You know, not rooted, I don't think, in law, theory, or principle, but just trying to muscle it out. And I think that leads to a whole bunch of suboptimal outcomes, not just for government officials, but for people like Renée, who have found themselves in the crosshairs of these attacks. And, you know, the question really is, what are the rights that apply legally?
And then I think, what are kind of the right norms? What do we want to see from government officials in terms of how they communicate with tech platforms? On that issue specifically, I feel just this tremendous amount of disappointment. I mean, it shouldn't be surprising, but I guess I still feel surprised at how politicized this issue has become. Like, your view of the nature of the Biden administration's communications here and how valid they are depends a little bit on how valid you think the underlying communications are. Do you think that COVID misinformation on platforms is something government officials should have raised attention on in the way that they did? And simultaneously, your view of Jim Jordan and the way that he's raising issues, I think, is influenced by your view of the politics at play there. My hope for jawboning is that politics can be pushed to the side a little bit and we can develop what the right norms would be for how government officials, whether it's a member of Congress or an administration, would communicate with a private actor around speech decisions.
And I think that will serve us well no matter what the outcome is. Like you were saying, Quinta, some of this depends on how things might look in terms of the likelihood of Trump winning or Harris winning. My hope would be that, in theory, we could make a decision on what the governance should look like independent of the political outcomes, and then it would bind either a Harris administration or a Trump administration. Again, I don't think the Mark Zuckerberg letter is written in that way. I don't think it's written as, here are the norms that should govern jawboning for any administration going forward, and I think that's a missed opportunity.
Renée DiResta: Yeah, I would agree with that. I think there's precious little in the way of specifics. There's a ton of complaining about content moderation, and there always has been; big umbrella term there. We can maybe get into that with the subject of things like Durov and some of the other aspects of content moderation that are also in the news lately. But one of the interesting aspects of this is that, in addition to there not being any guidance for government, Congress has not done the work of passing basic transparency legislation, right? There are things that we can do to make this problem less fraught. Google and a couple of other platforms proactively, voluntarily disclosed government takedown requests in certain portals.
There was the Lumen database, which was sort of a self-regulatory approach that platforms followed for a while, including Twitter. This was primarily copyright, but Twitter ceased contributing that sort of visibility into government action and government requests in April of last year, after it emerged that it had taken down content at the request of the Modi government, right? That story was embarrassing for it.
And so it simply stopped contributing to this transparency database. So as long as transparency is, you know, a self-regulatory nice-to-have, then we actually don't have any real visibility into what is happening. And that leads to this sort of sustained suspicion and keeps us, I think, in this partisan place where your opinion of content moderation is largely shaped by the communication about it that you read in your media or hear from your influencers. And that's very, very divided at this point.
Quinta Jurecic: So Renée, Matt had mentioned the sort of attacks that you've experienced, which I definitely want to touch on here, not only because I want you to talk about them, but also because it touches on this sort of bigger issue of anti-jawboning posturing that is itself jawboning, I guess, directed not only against, you know, the platforms themselves, but also against researchers and academic institutions that are doing work on these platforms. Matt framed this conversation really nicely as, what happens in the absence of governance. One answer is that we have, you know, researchers and academic institutions that try to provide a check or transparency of some kind, and now we're seeing attacks on that as well. So can you talk a little bit about what that's been like?
Renée DiResta: Yeah, so we got sucked into the Murthy case because, interestingly, they didn't have enough evidence of actual coercive language happening. Rather than saying, okay, there's not a whole lot of coercive language here, you see some of these plaintiffs, like Jay Bhattacharya. I have read every single filing in that case. I have never seen an instance in which a government official mentions the names of most of these plaintiffs to a tech platform. There is no specificity there whatsoever. This is why, of course, as Amy Coney Barrett writes in her opinion, the standing just isn't there.
But in order to try to bolster this perception that some nefarious stuff was happening, they pointed to our research project, the Election Integrity Partnership and the Virality Project. This was an academic research project started by four institutions: it originated at Stanford with the University of Washington, and then the Digital Forensics Research Lab and Graphika rounded out the four core orgs at the center of the partnership. And then we created a tip line; we reached out to the RNC, we reached out to the DNC, civil society organizations, anybody could submit a tip. And we looked at election rumors in real time as they were happening. Most of the time we just sort of logged a ticket and said, okay, this is happening, it's not a big deal, nothing to do here. Occasionally, we would send a note to a tech platform saying, hey, this appears to violate your policies, have a look.
And then the platforms, again, as Zuckerberg notes in his letter, in response to government requests, and in response to requests or notes from us as well, the moderation decision is ultimately made by Meta. And we have no coercive power over Meta as academics. We can't even keep CrowdTangle alive, right? Nonetheless, what you see is these political actors deciding that, well, the censorship must have been laundered through Stanford, that the government was telling Stanford to tell platforms to take things down.
Again, this simply never happened. They have not actually managed to find any evidence of this happening, but that hasn't dissuaded them. Instead they try to, you know, we weren't funded by the government, we weren't originated by the government, we weren't run by the government; nonetheless, our free speech rights, our First Amendment rights, our right to speak to the government, our right to speak to another private institution have come under attack, have been silenced. You know, every email I've ever sent on the topic of content moderation has been read by some Judiciary Committee lawyer at this point, right, under Jim Jordan's committee. And several people had to go in for five-to-seven-hour on-camera, you know, quote unquote, voluntary interviews where they don't get a copy of the transcript, they don't get a copy of the video. That's the sort of cost of doing basic, you know, election rumor tracking work that is a pretty standard academic project to study an American election, and one that we would have done, and have done in the past, looking at other elections worldwide as well. So, you know, the chilling effect, ironically, the silencing of speech, has been happening to us, not from us.
Matt Perault: So I want to make sure I really understand the legal theory here, because it seems so attenuated. The idea is that you have had communications with the government, with the federal government, and then separately you have had communications with private tech platforms, and that you have somehow become the government as a result of those two communication channels.
Renée DiResta: Exactly. I mean, it doesn't make sense, right? When you actually get down into it, it makes no sense. Because again, they try to come up with these, oh, they were funded by an NSF grant, as if that then makes you an agent of the government. This has never been our understanding of what a government grant, you know, kind of confers. You don't become an agent of the government by having government funding through the NSF in particular, but more importantly, the projects that they're upset about had no government funding at all. There was no government funding for the 2020 Election Integrity Partnership. There was no government funding for the Virality Project.
So not only is there no funding, not only were they not run by the government, not only were there no government officials at Stanford or UW, you know, controlling anything, there is no actual cohesion to this theory. It's just a conspiracy theory, but it's one where they make the allegation, right, in order to create sort of a perception that maybe this is the way that it could have happened. And, you know, this is why we had to file amicus briefs in Murthy v. Missouri. We did it both at the Fifth Circuit level and at SCOTUS, just to correct the basic facts of, you know, what was claimed about us, even though we were not actually parties to the case.
Matt Perault: So I'm trying in my head to just assume the allegations are true, or give them weight, just trying, really, I think, to be fair to the theory. And even if you do that, at a minimum, you have to concede, I think, that there are a number of significant steps in the causal chain.
Renée DiResta: Yeah.
Matt Perault: That there's some relationship between a private research institution and the government, and you have to establish some sort of nexus there. Funding strikes me as a weak one. Presumably there could have been Biden administration officials who would say, tell the tech platforms XYZ. If you could prove that, that would be a stronger one. It doesn't seem like there's any proof of that. And all of that's only one part of it; then you have to establish some coercive activity between the research institution and the platform. That at least involves a couple of causal jumps.
The odd thing to me is all the pressure is coming from an entity that is clearly a government entity. There's no dispute about whether a member of Congress is a government entity. And it's influencing speech decisions that a private entity is making. And so, Quinta, I think you call it, like, anti-jawboning jawboning, which does seem like it's at issue here. Even if you could establish the causal links on the Stanford side, you kind of have to concede that there are fewer hoops to jump through in making the claim around members of Congress and the pressure that they're putting on a private research institution.
Renée DiResta: Well, I just want to make one small correction, which is that the people who ran the government during the 2020 Election Integrity Partnership were the Trump administration. Just to-
Quinta Jurecic: Which Zuckerberg kind of skips over in his letter. He mentions things that happened in 2020, but he doesn't mention who was in charge. No, I mean, I think, as you say, Matt, this has all become very politicized in strange ways that run, in some ways, completely orthogonal to the actual issue set, in ways that I think actually make it really difficult to talk about the genuine issues here. And there has been this kind of growing idea, I think really since 2022, maybe I would date it to when the new Congress came in and this investigation began under Jim Jordan, that, you know, content moderation is censorship and anything that is pushing back against that, quote unquote, censorship is in the name of free speech. But as we've been teasing out here, it's actually somewhat more complicated.
I do want to make sure we touch on one example of just how much is included under the banner of content moderation, which is, Renée, something you mentioned earlier: the case of Pavel Durov, the founder of Telegram. So maybe you could just give us a little bit of background on what's happening there. It's a pretty complicated story, and then we can kind of touch on how it has become embroiled in the same dynamics we've been describing here.
Renée DiResta: So Telegram's an interesting platform. It has two different feature sets, right? You have the sort of things that feel like a group chat or a messaging, you know, kind of a DM platform, so a little bit akin to WhatsApp or Signal. And then you have broadcast channels, so one-to-many: you create a channel, people follow it, and then the person who controls the channel can broadcast to all of those folks. So it kind of blends these two different models of social media, the messaging-focused stuff, group chats, and then the broadcast thing. It's important to note that Telegram is not end-to-end encrypted in those broadcasts or those group chats.
In one-to-one communications, if I were to DM you, we could turn on encrypted messaging, and then authorities would not be able to see the substance of our messaging. But one of the things that happens on Telegram is that explicitly illegal content is just sort of out there in plain text; anybody can see it, you know. So in this particular case, on August 24th, Durov was arrested in France, outside of Paris. And on August 26th, the French authorities put out a press release listing a variety of concerns. In that gap, it's important to note, public opinion on Twitter decided that this was an affront to free speech. That Durov was being persecuted for letting people, I think as Elon put it, post memes, basically. That this was because he took a hands-off approach to, quote, content moderation; that was the reason he was being arrested. And the kind of usual very high-profile right-wing influencers on Twitter all decided that this meant political speech, right? That he was being targeted for not censoring inconvenient political speech.
In reality, there was this information vacuum, and it was very bad, I think, of the French authorities to let that happen. But on the 26th, this press release goes out, and it lists a variety of concerns that offer more clues, right? It talks about complicity in enabling illicit transactions by organized groups; possession and distribution of pornographic images of minors, so child pornography or CSAM; complicity in trafficking drugs; refusal to communicate information in response to legitimate law enforcement requests. So you have a series of actual crimes being articulated, right? It is illegal to host and distribute child pornography on a platform in most places. And so you have, per France's Politico, them kind of going after him for refusing to cooperate with a French police inquiry into child sex abuse, material which, again, is not encrypted in a lot of the places where it's being traded on Telegram, so if he receives a request, he actually can go and respond to the request. And so one of the things that's listed in the charging documents is that he is choosing not to comply with these investigations into this crime. And that's what they go after Durov for, it seems, based on what we know as of two days ago, when the preliminary charges were filed.
Quinta Jurecic: And part of the story is, as you say, that the part of American culture that has postured itself as, you know, free speech defenders, anti-content moderation, sort of adopted Durov when this arrest first took place. Then, as you say, it seems like maybe the situation is a bit more complicated. I think there are still questions about the role of encryption technology in these charges against Durov. But the question of child sexual abuse material is a pretty serious one, and one where I think people generally agree platforms should be taking that material down.
I mean, I think what that maybe gets to, to put it in the context of the Zuckerberg conversation we were having, is that a lot of the time the discussions around content moderation issues that become politically charged have to do with, you know, speech that is on the borderline in some way, which is precisely why it becomes controversial, right? You know, can you say that you shouldn't wear a mask, in terms of COVID, or not, right? But when it comes to something that is outright illegal, that's actually a pretty different situation, and that, you know, complicates the discussion a little bit. And Matt, again, to go back to your point about what happens in the absence of governance, with Telegram, this is maybe a reminder that in some areas there actually is quite a lot of governance around what can and can't be on platforms. I mean, does that seem like a fair framing?
Matt Perault: Yeah, but I wonder here, it doesn't seem as black and white to me. Like, you know, illegal content exists on every platform. And I think the main difference, which I think people who have followed this more closely, Renée, also Daphne Keller, Brian Fishman, have made very clear, is: don't draw equivalence lines between what other platforms do and what Telegram does, because illegal content might exist on other platforms, but at least they make efforts to try to address it. But that still seems fuzzy to me.
So, if you make some effort to address it, then you're not criminally liable? How does that work? What are the clear lines there? And I actually see this, Quinta, as closer to there being governance here than our jawboning discussion, but I do think there's still a lot of ambiguity. I mean, what exactly does a platform need to do with respect to illegal content in order to ensure that its CEO won't be arrested when landing in a foreign jurisdiction?
And, you know, there are cases all the time, particularly from a U.S. perspective, of foreign governments passing criminal laws that we would not want the companies to comply with. And so the fact that a lot of the content here is content that most, or I should say the criminal prohibitions here are prohibitions that most people view as sort of sympathetic, you know, that we should have laws against CSAM, for instance, I think makes it seem like maybe an easier case than if the exact same thing applied with fuzzier lines. You know, if there are criminal prohibitions around certain kinds of political speech in India and a U.S. CEO lands in India and then gets arrested, the theory would all still hold, you know, that these are violations of foreign criminal law. And I think in that case, we would feel very differently about the free speech equities at play. So I think there's still a lot here that's uncertain, even if there's, from the people who seem to track these issues closely, a view that, A, Telegram is not like other platforms, and B, that the result of that is that the free speech argument is weaker than it would be in other cases.
Renée DiResta: Well, just to be clear, child pornography is not covered under free speech, right? CSAM is explicitly illegal. This is not a gray area. And in the United States, social media platforms, including X, including Facebook, are required to report any explicit content involving children that they detect or that users report to them. And that has to go to NCMEC, right? They have to file reports. They have to be participating in that process. They also, I mean, I'm not a lawyer, maybe somebody else can weigh in on this, but my understanding is that when warrants and things come through for investigations into those types of cases, they have to comply to the best of their ability, right? So it's not a matter of, oh, this is a speech issue; this is an explicitly illegal content issue. I feel like we're trying to find a slippery slope here when there really isn't one. And it's actually okay to say that in this case.
Matt Perault: Yeah, I just feel more uncomfortable with it. Again, I agree with you about CSAM, I don't disagree, but, you know, the scope of the First Amendment in the U.S. is broad. The volume of speech that the government can prohibit is pretty narrow. And I think the posture of the case could look very similar if it was related to Holocaust denial. I don't know if Holocaust denial is actually a criminal offense in Germany, but you could imagine, you know, a platform that decides it's not going to censor Holocaust denial speech, and a CEO lands in Germany, and the result is the CEO is arrested, or a CEO has to make a decision about not traveling to Germany. That, I think, has the same contours as this case in lots of ways. It's speech that lots of people would find to be, you know, not favored, it's criminally prohibited, and the local jurisdiction is just enforcing its own law. That kind of a situation makes me feel more uncomfortable. And so, you know, as to your point that there are clear lines here, I just don't see it quite the same way. I think if other governments sort of learn from what the French government is doing, I assume very quickly we end up in a place that, at least to me, would be uncomfortable.
Renée DiResta: Following that through, then, what should they do, I guess, is where I sort of come down on it, right? So you have these laws, and, you know, in this particular case, the agency in France that seems to have been involved in this is one of the child protection agencies. What is the correct course of action at that point? I mean, this actually relates a bit, we can connect it to what's happening in Brazil, perhaps, right, where you have a country that has a set of laws. And we can have our opinions about whether the law is just or not, but they have these laws. And these are, just to be clear again, businesses, right? They're conceptualized as public squares, but this is a private company that is making a profit. This is an executive who is choosing to flout laws and policies. We can cheer, you know, when they defy tyrannical laws, and I think that's great. But I think that still leaves us with the question of what is to be done in these cases. What should France do in response to, we are simply not going to comply with your legal requests for information that we appear to have? Because, again, Telegram is non-encrypted in that way. When they are saying that, what is it that you think should be done instead, I guess, is where I'm struggling to get to. In that world, what happens?
Matt Perault: Again, I think Telegram is in some ways a good example and in some ways a bad example, because, again, it operates differently from other platforms and, it seems, has aimed to develop a brand and identity around flouting laws, no matter what those laws look like. I mean, I think most people feel like CSAM is not defensible. If you have a company essentially defending it, that puts them in an extreme outlier position. But generally we disfavor the idea of arresting tech employees because of noncompliance. You know, other preferable remedies would be fines or blocking a service. And so some of these back-and-forths, and I don't actually, I guess, have a strong view of whether the Brazil-Musk debate fits in this or not, but some of them, I think, are issues playing out the way that I would hope they would.
So one example of it is actually the Australia news code and Facebook's decision to pull content there. I think when the government passed the news code, the idea was either you need to comply or you can't offer news in this jurisdiction. So a platform has one of two choices, and the government has, you know, the same set of choices. The platform is making the choice: do we comply with a law that we disagree with, or do we pull content? The government simultaneously would be deciding: do we allow a company that's operating in breach of local law to just continue to operate, or do we enforce against them? And I think when each side makes those decisions, it might be that the right thing is for a company to not be operational in the market, which is, I think, what we have right now with X in Brazil, and what we had for a period of time with Facebook and news in Australia. And that seems sensible to me.
Quinta Jurecic: So I think this is actually a good opportunity to transition a little bit to the situation in Brazil, which touches on a lot of these issues, as you've both hinted. Currently we're recording on the morning of September 3rd. Elon Musk has been in this long-running showdown, which has come to a head over the past few days, with a Brazilian Supreme Court justice, essentially over an ongoing investigation into efforts by supporters of Jair Bolsonaro to keep him in power after he lost Brazil's recent election to the current president, Lula da Silva. Musk has essentially refused to comply in a variety of ways.
I think the last straw here was that the judge, Alexandre de Moraes, ordered Musk to appoint an in-country representative, you know, to be in Brazil as a way of forcing the company to comply with Brazilian law. Musk refused, and de Moraes responded with this order to basically block Twitter, or X, in all of Brazil. It's now been upheld by a panel of the Brazilian Supreme Court, though not the full Supreme Court. And there are, you know, a lot of different issues here, in part because, on the one hand, Musk is refusing to comply with Brazilian law, and if you look at at least the translations of some of the reasoning set out by the Brazilian Supreme Court panel, and I don't know Portuguese, so I'm relying on Google Translate here, there is a lot of language there about, you know, we're a sovereign nation and we have our own laws. On the other hand, you can also say, you know, is this disproportionate? There are real equities here in terms of blocking people in Brazil from accessing a major platform.
And I think this is also, you know, an instance where the question of what constitutes the illegality is also fuzzy. This isn't an instance of, you know, Musk defending his right to blast CSAM to Brazilians; that's not on the table here. The question has to do with, you know, a Brazilian investigation that I think many people on both sides of the political aisle have argued is quite aggressive on the part of the Brazilian court. There are questions here about whether Brazilian law is more expansive than American law in what kind of speech it allows to be restricted, in a way that might arguably put an American company in a difficult position. I don't want to, you know, put Musk on too much of a pedestal here, because it is important to acknowledge that he's been very willing to take down content in places like Turkey and India at the request of governments that have taken a pretty strong autocratic turn. So the commitment to free speech here is maybe not quite so ideologically pure as he's presenting it to be, but this does raise a lot of, I think, really serious, difficult questions. So now that we've set this on the table, I want to go back to the both of you and see how you're thinking about this case in conversation with everything that we've been touching on here so far.
Renée DiResta: I think the interesting question is actually that sovereignty question, right? Musk put out a tweet in April of 2022, where he says: by free speech, I simply mean that which matches the law. I am against censorship that goes far beyond the law. If people want less free speech, they will ask government to pass laws to that effect. Therefore, going beyond the law is contrary to the will of the people. Right? So this is the framing that he lays out back in April of 2022. This touches on, I think, you know, I believe that distributing Nazi content in Germany is actually illegal, right? That's my understanding, and somebody in the comments can correct me if I'm egregiously wrong on that.
Matt Perault: Illegal, but the question is, is there civil liability or criminal liability?
Renée DiResta: Right, and I don't know the answer to that, but there is this question of, well, the platforms have always complied with sort of geofencing, right? With not surfacing certain content to certain people in certain locales because of various, you know, kind of legal concerns related to whether certain types of content are legal in one venue and not in another. And per your point, there's sort of, like, three choices here: you either comply with the law, you exit the market, or you fight. And if you are going to fight, then, just to be clear, the government will fight back, right? And that's one thing that I think has been very interesting with the situation in Brazil. And I know nothing at all about Brazilian law, just to be clear.
So I've been reading, because a lot of people have just gone to Bluesky, right? A lot of Brazilian Twitter users are now over on Bluesky. And they've been posting about this actually quite a bit, right? And it's been very interesting. Maybe this is, of course, a select set of people, you know, maybe members of a particular ideology who were the ones who kind of migrated over to Bluesky. But there is a distaste for the idea that an American businessman would simply flout Brazilian law. And it is interesting. I don't know where public opinion is writ large within the country. There's talk I saw on X that there are going to be protests. I think it's going to be very interesting to see what happens here, because you do see people saying, why should we have unaccountable private power in the form of a private business that is defying these laws? And I think that is actually one of the really interesting questions here, in light of Musk's commentary in 2022, which is, if people want less free speech, they'll ask governments to pass laws to that effect.
What he's been setting out to do, and again, since I do not know Brazilian law, I don't feel equipped to weigh in on whether he's telling the truth or not, is allege that the judge violated Brazilian law in making the requests to him. So just to be clear, he is alleging that his actions are justified because Brazilian judges violated their own constitution in making these takedown requests. I think there were seven accounts or something like that, belonging to people who had participated in the insurrection in Brazil on January 8th of 2023. So that's where the sort of interesting nuance is coming into play here. He is actually alleging that the judge violated Brazilian law, and I think that is actually kind of core to the question here: is that true or not?
Matt Perault: I think there's some stuff that's happening here that's really positive. You know, we've been talking about an absence of governance, and in a lot of ways, Quinta, this seems like the inverse.
Renée DiResta: It’s coming.
Matt Perault: This is real governance, and governance is not consequence-free. And I think that's one thing that is really a positive component of this. One of the things that just frustrates me so much is that there are so many proposals introduced in Congress and we never really get to see how they play out, the costs, the benefits; we just have arguments about how they might play out in theory, knowing that the laws are very unlikely to pass. Here, we have real consequences, and I don't think we want a world where a private entity complies with every foreign government's domestic legal regime. That would mean that every company should try to be operational in China and be compliant with Chinese law, or Russian law. I don't think we want that world. At the same time, I think it's understandable that a company shouldn't expect to be able to operate in a local market while flouting local law. And so you see some companies, for instance, deciding to have businesses in China, and then typically, I think, they're compliant with most components of Chinese law. I think that's an appropriate trade-off, within certain bounds, for some companies to decide to make.
But you can't have it all. You can't be operational, set your terms the way that you want, act in violation of law, and then expect to be allowed to continue to operate your service. And so what we have now is, I think, an appropriate challenge under local law. To the extent that challenge fails and it's determined that Musk is not going to win under local law, then there's an appropriate decision for a company to make about whether it wants to be operational within the country by complying with local law or whether it wants to pull its product. Then there will be, I think, presumably, in the event that Musk decides not to make X operational in the long term in Brazil, a local debate about whether the law should change or not. That feels to me to be the right incentive structure in lots of ways, and to produce, you know, the kinds of debates and discussions that we don't get in the U.S., either because Congress doesn't pass laws or because jawboning allows a lot of these decisions to be negotiated in the shadows instead of in public. What's happening in a very public way in Brazil, I think, is probably the right way to have a debate.
Quinta Jurecic: Yeah. I mean, I feel like in a way we can trace that back to that Musk tweet that you mentioned, Renée, about, you know, the democratic nature of these free speech laws. Which, of course, when he was posting it, was completely ridiculous, because among the countries he was talking about were, you know, Turkey, right, a country that is functionally autocratic, and India, a country that at the time was increasingly trending toward autocracy, so his sort of democratic theory there is very clumsy. But Brazil is a country that, you know, actually has a functioning democracy, and in fact this investigation is part of an effort by the Brazilian government to respond to an effort to end that democracy, however clumsy an effort. You can actually, I think, imagine things working through, Matt, in exactly the way that you're describing. So there's a grain of truth to that Musk tweet, although perhaps only a grain.
So if we started with the absence of governance, and we've moved to what happens when there is definitely governance, perhaps too much governance, I think there's a question of what things are going to look like, governance-wise, in the U.S., certainly at the very least in the coming months. You know, this is always a very hectic period in terms of platform governance in the run-up to an election. And one of the interesting things that's happened in recent months is that, in the wake of the Supreme Court decision in Murthy, some of the channels that had been shut down between the government and social media companies have actually been reopened, along with some actual guidance on jawboning. And Matt, you've been taking a look at that. So I'm curious for your thoughts on sort of what the landscape is like here.
Matt Perault: Yeah. So I'm really excited about this. The FBI, and I think it's the only agency to do this so far, released an undated piece of content on its website outlining its practices with respect to sharing threat intelligence information with social media companies. It's not very long, but it outlines the rationale for the program as well as some basics on the procedures it uses, and kind of the bottom line on the procedures is that it has an internal office that leads its engagement with social media companies and will provide them with information that may be useful to them. And then, as I think Renée described earlier when she was describing Stanford's work, the idea is to be very clear that the company is the one that makes the decision on what stays up and what comes down. And that has, I think, particular resonance when it's the FBI making the request. And so this guidance is very clear that the company has the discretion to do what it wants to do with that information and that any decision it makes is voluntary.
I think this is a baby step, but a very meaningful baby step. If we had agencies outlining more clearly how they intend to work with social media companies, that would give guidance to social media companies, but it would also give guidance to the public about how agencies actually behave, and give guidance to their own employees about the right procedures to follow. That seems really, really positive to me. I've been pushing this idea of a new administration, on day one, issuing an executive order on how its employees should engage with social media companies. That's one way to do it, and the benefit of it would be that it's somewhat more comprehensive; it would govern every government employee. This is agency by agency, and so it's a little bit different, but I think it would get us to the same place if other agencies would follow the FBI's lead.
Renée DiResta: Yeah, I thought that was a great initiative as well. I think it's worth noting that even in the original opinion in Murthy v. Missouri, you see the judge arguing for particular carveouts, recognizing that there are security and national security implications, and that government has to be able to speak in some way to a tech platform. Because otherwise, people do have to realize, you know, you can't just have nothing there, right? That's not going to lead to a set of outcomes that the public is going to like.
You also don't want the platforms fielding investigative teams solely internally, making their own attributions and determinations. This has always been, you know, since 2017 or 2018 or so, this kind of collaborative process, where government would talk to platforms about particular signals; academia would talk to platforms about particular signals; platforms would definitely talk back to academia and, it seems, based on, you know, the Twitter Files emails, back to government. So as long as you have some clear guidance around what is acceptable and what is expected, I think that puts us in a much better place.
Quinta Jurecic: All right, we will end it there. All of these three stories are very much in flux, so we will keep an eye on them and see what happens in the weeks and months to come. Matt, Renée, thank you so much.
Matt Perault: Thanks, Quinta.
Renée DiResta: Thank you.
Quinta Jurecic: The Lawfare Podcast is produced in cooperation with the Brookings Institution. You can get ad-free versions of this and other Lawfare podcasts by becoming a Lawfare material supporter through our website, lawfaremedia.org/support. You'll also get access to this and other content available only to our supporters. Please rate and review us wherever you get your podcasts. Look out for our other podcasts, including Rational Security, Chatter, Allies, and The Aftermath, our latest Lawfare Presents podcast series on the government's response to January 6th. And check out our written work at lawfaremedia.org. The podcast is edited by Jen Patja. Our theme song is from Alibi Music. As always, thanks for listening.