June 14, 2024
Cross Examiners

The Supreme Court is taking up a case that could affect how you read about issues during a crucial election year. Government officials concerned about misinformation related to topics such as COVID and election integrity had requested that social media platforms take down some posts. Several plaintiffs then filed a lawsuit alleging that by doing so, the government was suppressing speech and certain viewpoints. Now, there’s a question of whether such government actions amount to coercion or are legitimate exercises of power. Those who argue coercion believe that government requests limit free speech, claim “misinformation” is subjective, and worry that increased content moderation could lead to authoritarian control over public discourse online. Those who argue legitimate cooperation say that platforms often voluntarily comply with these requests as part of their anti-misinformation policies, making it a collaborative effort for public benefit. They also note that in cases where misinformation threatens public health or safety, such as false information about vaccines, government intervention is justified to protect the public.

Now, in this mock trial, we take on this question of free speech, government, and misinformation on social media platforms.

  • 00:00:04

    John Donvan
    This is Open to Debate. Hi everybody, I’m John Donvan, and in this episode, we are going to be examining a major case that is now before the US Supreme Court. The outcome of this case has huge implications for a lot of different topics, covering public health, election integrity, and national security, and at the same time, it is a First Amendment case that involves the federal government, social media platforms, and misinformation. So there is a lot there. The case is called Murthy v. Missouri. And it asks whether the US government violated the First Amendment by asking companies like Twitter and Facebook to take down some user postings during COVID. The court heard arguments in March, and is expected to rule later this year. We’re going to employ our mock trial format for this one; that is where we ask two lawyers to argue in a way that reveals the constitutional issues involved, arguments that the Supreme Court justices themselves will already have heard. So to begin, let’s imagine we are all in a courtroom, and let’s metaphorically all rise to get started as we meet our two attorneys. Here to argue on the side of the federal government, Rylee Sommers-Flanagan. Rylee is the founder and executive director of Upper Seven Law. Rylee, welcome to Open to Debate.

  • 00:01:17

    Rylee Sommers-Flanagan
    Thank you so much, John. I’m super happy to be here.

  • 00:01:19

    John Donvan
    And here to argue the other side, the side brought by the states of Missouri and Louisiana, Charles “Chip” Miller, a senior attorney at the Institute for Free Speech. Chip, welcome to Open to Debate. Thanks for joining us.

  • 00:01:30

    Charles Miller
    Hi, John. Thanks, uh, thanks for having me. It’s great to be here.

  • 00:01:33

    John Donvan
    So let’s get launched. Let’s get to our opening statements. And again, I want you, our audience, to imagine that you are the Supreme Court justices, and we’re in a courtroom, and you are listening to the arguments being made by these two attorneys. Rylee, you are up first. It’s your chance to address the court in your opening statement.

  • 00:01:49

    Rylee Sommers-Flanagan
    Lovely. Thank you. May it please the court, Rylee Sommers-Flanagan, on behalf of Surgeon General Vivek Murthy, for purposes of this mock trial. We are here today because of an extremist conspiracy theory about censorship. Under that conspiracy theory, the people who made doctors into political ideologues during the COVID-19 pandemic insist that the government and private researchers engaged in a coercion campaign to force social media companies to censor information that they claim is merely the expression of political ideas. And to be sure, if the government were genuinely engaged in efforts to suppress political speech and ideas, it would be cause for grave concern, and it would almost certainly violate the First Amendment. But my friend Mr. Miller represents plaintiffs who have failed to offer any facts that can allow the court to reach the questions that their case professes to present.

    First, the plaintiffs show no connection between their alleged injuries and the government’s actions. That is, their injuries are not traceable to the government. We can’t draw a line between a government action or communication and any alleged injury. And looking at that same problem from a slightly different angle, this means that asking the court to enjoin the government, or to stop the government from speaking to social media companies won’t actually help plaintiffs because their concern is with social media platforms’ content moderation policies, not with the government and its alleged interference. And an injunction against government communication won’t stop social media from removing content that they deem contrary to their policies.

    Second, even though the district court allowed significant discovery, there is still no evidence in this record that shows that the government exerted undue influence on social media platforms in the form of coercive threats or what we call equivalent significant encouragement that could have compelled them to remove content. Those are terms that we use essentially as terms of art within the legal tests that determine whether the government’s actions forced another entity or person to act. Coercive threats are intuitive. They’re things like threats of criminal liability, threats of violence would also count. Significant encouragement in this context, it refers to a different thing. It refers to inducements, but it’s pretty concrete. It’s things like rewards or incentives or, you know, uh, protection from liability potentially that are used to offer something so good that the offer overwhelms a person’s ability to exercise independent judgment. Here, the government engaged in neither.

    And third, in something of a culmination of these first two points, social media companies’ content moderation decisions are theirs alone, and they do not constitute state action, a fact that is clearly demonstrated by their repeated refusals to remove posts and content that the government flagged. Among other examples, companies ignored FBI flags roughly half of the time. The federal government took no adverse action in response to this refusal, and continued to communicate with social media companies openly. Ultimately, courts are here to resolve real disputes, not theoretical ones. Plaintiffs have not established standing because their alleged injuries are too attenuated from government action. In any case, the government did not coerce social media companies to remove content in violation of the First Amendment. And regardless, social media platforms make and control their own content moderation policies.

    Setting aside for a moment the circumstances at issue in this case, which simply do not give rise to the questions presented, the truth is that the underlying fears that motivate this case are important. And they ask us to contend with issues that are far scarier than the government merely informing Facebook of Russian attempts to influence voter turnout or flagging false content. What is actually at issue here is a fight over who controls information, and who can use social media algorithms to advantage themselves in retaining and gaining power. We need the First Amendment to safeguard our ability to argue with, criticize, and persuade one another. What we don’t need are outsized protections for disinformation, which is now so often used to prevent thoughtful civic engagement and to undermine free and fair elections. The government must be able to communicate information, to use the bully pulpit to share its knowledge, to urge action, and to participate in public debate. This case asks the court to silence one side of an information war, to elevate a side that insists on its own weakness, though all evidence is to the contrary. This would turn the First Amendment on its head. Thank you.

  • 00:06:39

    John Donvan
    Thank you, counsel. Chip, it is now your turn to address the justices. Please, your opening statement. Again, you’re representing the states of Missouri, Louisiana, and a handful of individuals in making this case that the Biden administration violated the First Amendment.

  • 00:06:51

    Charles Miller
    Yes, thank you, and may it please the court. This case, as it comes to this court, the Supreme Court, is about the law. It is not about judging whether there’s some sort of extremist ideology that’s being advanced, uh, by one side or another. And to the extent that there’s silencing that we’re concerned about, it’s the silencing that occurs by the government. That is why we are here. All right. And so, as we know, freedom of speech and free and open debate of political issues has been at the core of the protections of our liberties, uh, since the founding of our country. Government officials violate the First Amendment when they secretly solicit third parties to suppress the lawful political speech of any American citizen.

    Doing so violates the axiomatic principle that the government may not induce, encourage, or promote private persons to accomplish what it is constitutionally forbidden to do, which has been established by this court in Norwood. So the government’s solution, as we always know, is to fight political speech that it disagrees with with more speech, speech of its own. It is not to go behind the scenes to ask third party platforms, who are the modern day public square, to secretly censor and silence the opposing view. So when it comes to political speech, political speech is at the core of our First Amendment, and a bright line preventing the government from using its outside voice to suppress dissenting views is necessary to give the First Amendment the breathing room it needs for democracy to survive. A clear, robust rule that prohibits the government from targeting lawful political speech has the added benefit of being easily administrable. It will apply to limited facts.

    It’s the government’s secret efforts to silence political speech by contacting third party platforms, who are not the speaker, and have little interest in taking sides in a particular matter. It is those limited facts, uh, that do not require the court to decide whether a particular email, or phone call, or nuance in conversation is coercive. It doesn’t rely upon any malleable, multifactor standard that gives courts too much discretion, and leads to unpredictability and cases that take years to decide.

    I’ve just received a decision today in another case, uh, out of Connecticut, Markley v. Connecticut, uh, that decided that my clients’ constitutional rights of free speech were violated in a 2018 election. The chilling, uh, effects and silencing of core political speech, which is usually urgent speech that must be heard at the time, uh, is what must be prevented. So this case can be largely resolved on that test.

    The record below showed that public officials routinely flagged content for removal, and, for example, informed social media companies that “Removing bad guy information is one of the easy low bar things that you guys can do to make people like me think that you’re taking action.” That’s communication from the White House. These requests specifically aim to silence speech, uh, about some of the most divisive political issues over the most recent years. While other cases involving other kinds of speech might raise difficult questions about when a government puts too much pressure on parties, or involves sensitive information and non-protected speech, the court need not resolve those issues here. In this case, the federal government targeted specific lawful political speech that it disfavored, and it deployed an extensive clandestine plan to remove that speech from public view and debate.

    If the First Amendment means anything, it must prevent this kind of interference, uh, with the marketplace of ideas. Political speech thus occupies the core of the protection afforded by the First Amendment. To that end, the Supreme Court’s cases have provided heightened judicial protection for political speech. This includes speech that discusses political issues, and speech critical, uh, of political candidates. When core speech, that is, political speech, is at issue, the importance of First Amendment protection is at its zenith. And it’s for this reason that the court has rightly rejected rules that require intricate case-by-case determinations to verify whether political speech is protected. The pitfalls of such an approach are obvious. Vague standards encourage erratic administration of the law, whether the censor be an administrative agency, a court, uh, or in this case, uh, the executive branch. Archetypical political speech is chilled in the meantime. Uh, and that’s why the court should affirm the decision below here.

  • 00:11:52

    John Donvan
    Thank you very much, counselor. All right, we have heard both of our opening arguments. I want to thank you for both of those. And we are going to take a very brief recess, uh, and when we come back we will have opportunities for rebuttal of our opening arguments. Our mock trial of Murthy v. Missouri on conflicts surrounding free speech and misinformation, government and social media platforms will continue. We’ll be right back with more Open to Debate. I’m John Donvan.

    Welcome back to Open to Debate. I’m John Donvan. We’re in the middle of a program that’s a little bit different from our usual format. This time we are doing a mock trial, specifically a mock trial on a case concerning social media and government that is now pending before the Supreme Court. We have with us two attorneys who have laid out arguments for why the government’s actions threaten free speech as well as arguments defending the government’s actions as necessary and not coercive. Chip Miller is arguing for the Missouri side, as it’s called, representing the lawsuit brought by the states of Missouri and Louisiana and several individuals. Rylee Sommers-Flanagan is arguing on the government side, which is known as the Murthy side of the case.

    I want to begin with the issue of standing. Rylee is making the case that those who brought the suits, the states of Missouri and Louisiana, were not themselves directly impacted, in the sense that their speech was not affected by anything that the government did in this. And I think, Rylee, I just wanted you to lay out that case a little bit more so that Chip can respond to it.

  • 00:13:32

    Rylee Sommers-Flanagan
    Of course. So I think i- it’s helpful to acknowledge here that there are five individual plaintiffs who allege that their speech was removed from platforms as a result of recommendations made by the government to social media companies. Um, the states are also involved. Their, their claim to standing, and their argument for why they have standing is essentially that they, they believe that they can stand on behalf of their citizens, and essentially assert their right to listen or their right to hear arguments. It’s a creative suggestion about standing. Um, states generally can’t assert, uh, a right on behalf of their citizens in that particular way. The states are not directly injured, and they don’t have an interest in protecting, sort of, this, this idea that th- that there’s a right to hear, um, particularly information that is actually mediated by a third party.

  • 00:14:24

    John Donvan
    Is it enough, Rylee, that there are also five individuals named whose speech, it can be claimed, was suppressed, theoretically, by the government?

  • 00:14:32

    Rylee Sommers-Flanagan
    Yes. So that’s a much more concrete and direct injury in, in general terms. And surely it would count for an injury, um, in the sense that they experienced the loss of the opportunity to say something that they wanted to say online. But it is key and important that the defendant be responsible for that injury. And in this case, the United States government is not responsible for the decisions that were made by Facebook. It made recommendations. It did not cause the content to be removed.

  • 00:15:01

    John Donvan
    So we’ll get to the merits of that. I just want to take the standing issue one more round, and take it back to Chip.

  • 00:15:05

    Charles Miller
    Yeah. So the rule of standing is, uh, essentially that if there is one plaintiff that has standing, then the case proceeds. As, as was articulated, there are individuals here who have alleged that they were censored, uh, in their own speech. Uh, they put posts up on, on social media that were taken down, uh, either because they were talking about issues that the government wanted removed, or they were simply sharing information by, uh, what the government labeled the disinformation dozen. Um, so that should be sufficient to have standing. And then again, as, as Rylee was explaining, there is a First Amendment right not only to speak, but there’s a First Amendment right to, to hear others, uh, and to receive information.

    So, uh, Bantam Books was a case that originated out of a governing body that would label books, uh, as being obscene, uh, and then it would call up distributors, uh, and bookstores and say, “Hey, you know, here’s our list of obscene materials. Uh, you better not sell them. If you do, we’ll call the local police, and we’ll call the attorney general, and, you know, they might choose to prosecute you.” And one of the constitutional, uh, rights that the Supreme Court found to be violated there was the right of, uh, the people who would read these materials to receive them. And so that’s, uh, what goes to our standing argument, uh, here. And, you know, what’s significant is that Bantam Books involved things, uh, that were obscene or near obscene, uh, which is an area where states have the right to regulate. This case, uh, involves political speech, where states and the government do not have a right to regulate.

  • 00:16:40

    John Donvan
    Okay. I, I want to take a moment just for, again, for listeners who are not familiar with all of the, the outlines of the case that we’re talking about, what, what happened. So I’m sure we all remember the pandemic. I’m sure we all remember, um, that there was a great deal of fraught conversation across social media about steps taken to, in theory, mitigate the impacts of the pandemic. And there were, uh, voices on social media that were, uh, supporting this. And there were voices on social media being very, very skeptical, uh, pushing back, and sharing their views. And sometimes they would share information claiming that the, uh, the vaccines were problematic, that face masks didn’t work, et cetera.

    And what happened in the course of this is that companies like Facebook and Twitter heard from various agencies in the Biden administration asking them to do something about these postings, because they were labeling them as misinformation and dangerous in the midst of a, uh, a national health emergency. Less so was the argument made, “We don’t like their political points of view.” More so it was, “This stuff is hurting the public because this information is having a dangerous impact.” However, those who held those views, and had their speech taken down, uh, have a claim that they were censored by the government. Does everybody agree that I’ve summarized the case between the two sides fairly well?

  • 00:18:02

    Charles Miller
    Yeah, I, I think that’s very accurate. I, I would just say it doesn’t stop being a political issue just because the government says that someone’s speech is hurting someone, uh, because they’re talking about, in this case, a public health emergency that was a central political issue being debated in this country.

  • 00:18:15

    John Donvan
    Rylee, let me take it back to you. Part of the issue here is, when does the government getting involved in something like this constitute not just talking, not just sharing information, but actually bringing pressure on the social media companies with a, kind of, veiled threat that turns this into coercion, which then would turn this into a far more, uh, potent argument that it’s censorship? When is that just information, and when is that a threat of some sort of action, a kind of or else, take it down or else? Where’s the line on that, and what does the law tell us about where the line is on that?

  • 00:18:54

    Rylee Sommers-Flanagan
    Well, I actually think that your question, sort of, gives us the answer. When there is an or else, when there is an offer of a reward that makes it, uh, difficult to impossible for someone to make a decision that’s unbiased and independent, that’s when you see coercion, right? That’s what we’re talking about: it is the language of threats and rewards, of encouragement that exceeds persuasive information. And I think that what we really have to acknowledge here is that we just don’t have facts in this case that come anywhere close to that line. The government expressed itself in strong words at times, absolutely, but there were no repercussions for the social media companies choosing not to adhere to its advice.

    So many different parts of the government, the FBI, CISA, the White House, all of these arms of government are communicating, uh, regularly with social media companies, and advising them about content that’s on their platforms. And at the end of the day, we don’t come even close to the line that you’re asking about.

  • 00:20:07

    John Donvan
    Well, actually, I want to take it back to, to Chip. So Chip, I think you’re saying there was coercion, that it wasn’t just persuasion, it was more than persuasion. So make the case that it’s actually coercion, because, as Rylee said, there was no punishment brought.

  • 00:20:19

    Charles Miller
    Well, right. So, you know, I’m making two arguments. Uh, one, I’m saying there, there was coercion here, but I’m, I’m also making another argument that it really shouldn’t matter if there was coercion. The fact that the government’s requesting this is a problem. But starting with the coercion, uh, aspect of it, look, let’s just look at it going forward, theoretically, from now. All right. So, you know, we, we all know that, uh, that Elon Musk owns X, or Twitter. And going forward, whether it’s a continuing Biden administration or a Trump administration, think of the pressure that would be placed upon him because of his SpaceX contracts and others, if they receive a call saying, “Hey, you know, this information should be removed from X.”

  • 00:21:00

    John Donvan
    But how, how explicit does that have to be to, to count as coercion?

  • 00:21:04

    Charles Miller
    So, you know, going back to Bantam Books, there was never a prosecution of anyone for selling obscene material. It was merely the fact that there’s the potential for it that triggers this, because, you know, we, we need to protect free speech rights so adamantly that anytime the government gets near removing speech… I mean, think about it. The government here admits that they were going to what are our modern day public, uh, forums, our modern day public squares, and saying, “Remove this speech so people can’t hear it.” That should be frightening. And frankly, whether, you know, the, the government was compelling them to do it or significantly encouraging, that’s where the Norwood test comes in. Again, it was another case. And in that case, they said that the government cannot do through private parties what it cannot do on its own.

  • 00:21:51

    John Donvan
    So what I hear Chip saying is that there’s something different. There’s a different energy, a different dynamic when the government comes to you and says, “We really don’t like this.” That the threat does not have to be explicit. And then there was an example in, in the course of this conversation about the FBI. You know, when the FBI calls you, and tells you that something that’s going on is raising their hackles and there’s national security involved, they don’t have to say, “Or we will punish you for that if you don’t comply.” That it’s, it’s very, very much implied in the atmosphere, that it is effectively coercive even if the or else is not spelled out.

  • 00:22:28

    Rylee Sommers-Flanagan
    So, um, I actually think that that’s one of the most valuable examples that we can point to. And the reason is that when the social media companies received contacts from the FBI, they only paid attention about 50% of the time. That’s, that’s really good evidence that this is not a coercive relationship, right? I mean, if my simply talking to you as a member of the administration is coercive… you know, there were conversations also during oral argument about low level congressional staffers, um, other, other folks at different levels. At what point is the level too high for someone to make this request, and for it to become coercive? And I think that what we have to consider is, sort of, is this informational?

    And I think that the, the social media companies’ conduct shows us that they considered this information to be information, not, uh, a coercive tactic that the government was exerting against them to, to get them to remove content. Um, at times they chose to act on the recommendations of the government. They did not do so a good amount of the time. And it varied across who was communicating with them and what the content of the information was, because they have their own content and moderation policies that they enact. And so they responded to the government at times saying, “This does not, uh, run afoul of our policies. And we are a third party. We are a private entity that gets to make decisions about what kind of speech is hosted on our platforms.” And at the end of the day, there also was a national emergency occurring. And we can look at data that shows that when this type of information was readily available to people, and when it was ubiquitous within certain communities, take for example, um, Sean Hannity viewers, you see significantly higher rates of death in those populations. And it is true that people may and should have access to whatever information they, they decide to seek out. It is also true that social media platforms made decisions of their own accord to try to protect people from information that would cause them harm.

  • 00:24:42

    John Donvan
    Rylee, do you, do you in the end think it’s a bad thing to have the other point of view out there if you, if you feel the other point of view is wrong on something like health, something where, arguably, there’s ultimately science to decide it? Is that information that should be suppressed?

  • 00:24:57

    Rylee Sommers-Flanagan
    I’m going to say no. I do think that one of the challenges we face in a social media era, when we have yet to fully understand the technology, is that points of view that are not necessarily particularly popular may be elevated quickly above those of people who are expressing more complex ideas, who may be, um, sharing information that is nuanced and true. Uh, at, at the end of the day, no, we shouldn’t suppress speech, but when we can identify that a particular speaker is gaining outsized influence, and trafficking in information that is killing people, we should make it a little bit more difficult. It shouldn’t be the thing that is naturally ending up on people’s, um, homepage every time they enter any platform. I don’t think that the government should be the one making decisions about that, but I do think that they should be able to communicate clearly, uh, about information that they have so that social media companies can make decisions with full information.

  • 00:26:06

    Charles Miller
    So, again, what we’re talking about is, you know, she said that the FBI had a 50% success rate in removing information. I think that’s pretty scary. Um, and, you know, if we’re talking about a policy matter, the correct policy here is to always have more information. Look, I mean, basically she’s saying, “Hey, we think that there are people out there that are paranoid, and they’re gonna believe the wrong information.” They’re gonna be more paranoid if they know that the government is out there actively suppressing information. Um, so again, I guess using, sort of, the modern, uh, X/Twitter as an example again, they have that community notes feature. They’re not taking down stuff that, uh, is offensive to people, but they’re saying like, “Hey, look, this is up here, but here’s another view that maybe is more prevalent and, uh, informative.”

    I think that’s a great way to go about this. So if there’s speech out there that’s, that’s incorrect, uh, and again, back at the time, that was a very actively developing situation about the, the virus, you know, it was fluid. Uh, and so that’s really when we should have, you know, all of the scientists out there debating this, not simply the ones, you know, who currently, sort of, hold the consensus about what’s going on. Silencing the debate is always a problem.

  • 00:27:11

    John Donvan
    Chip, what’s the difference between doing it in public or doing it in a private email or a phone call, and is there a difference in your view?

  • 00:27:18

    Charles Miller
    Yeah, yeah, there’s, there’s a huge distinction there, and thank you, uh, for, for noticing and asking that. So when it’s out there in the public, it’s public. Everyone can see what the president’s saying. When the president got up there and said, “Social media, you are killing people,” everyone can debate that, and see what’s going on. Uh, and we can have a debate about the debate. When it happens privately, sometimes even the people that are posting, that are putting things up there, don’t know that this is happening. And the fact that things are happening quietly, and information is being suppressed, so we’re not having that debate, that’s the problem.

  • 00:27:49

    John Donvan
    So Rylee, you hear that distinction that, uh, Chip just made, that doing it in public is very different from doing it behind the scenes, that there’s a transparency in one case and not in the other. And the transparency allows for more free-flowing debate. What is your response to that?

  • 00:28:04

    Rylee Sommers-Flanagan
    I am generally an advocate of transparency, as, uh, sunlight is the best disinfectant, right. Um, I think that that is true. Um, and so I hesitate to say this, but I generally agree with what Chip was saying, you know, just, just for purposes. It’s always-

  • 00:28:21

    Charles Miller
    Don’t ever hesitate for that.

  • 00:28:22

    Rylee Sommers-Flanagan
    … good to agree, but just for purposes of the debate. (laughs) Um, but, but I will say, I do think that this idea that communicating privately necessarily gives the communication some dire penumbra, that something secret is being communicated, doesn’t strike me as being accurate. And while I certainly would have advised certain members of the administration not to use some of the language that they used, I also think that fundamentally what we’re talking about are conversations. And the social media companies are powerful, and they need to hear from the government.

  • 00:28:58

    Charles Miller
    Well, and so just to be clear, like, if, you know… I’m, I’m not, uh, chiding the government or anyone for going to Facebook or others and saying, “Here’s information that we have. And, you know, it’s important for you to know this.” It’s simply the communication where they’re going and they’re saying, “Take down opposing views. Take down RFK’s post. Take down, uh, the, the posts and information, you know, by, uh, doctors and physicians who we disagree with.” It’s, it’s the act of taking down the information that I find offensive, not communication about the topic itself.

  • 00:29:32

    Rylee Sommers-Flanagan
    So I, I think it’s important for the government to persuade and be, um, clear about what it thinks is harmful and dangerous. And I think at the end of the day that the social media companies have to make decisions about that, but I think it would be confusing to simply flag information, and not identify, sort of, w- what, what the perceived impact of that information is. And, so I, I hesitate to, to agree that, that requesting that something be taken down is in itself coercive, because we see that it, it didn’t seem to have that effect. And also, y- you know, I, I do think that one of the things, if we, sort of, go back to Bantam Books, is that we do have overt threats in that case. I mean, Chip did a really nice job of, of describing it, right. And you have this, uh, interaction with a bookseller in which you say, “We’re gonna, we’re gonna give the attorney general a call.” That’s not what’s happening here. And, and it’s not, it, it’s a set of circumstances in which you have an exchange, you’ve got back and forth. And if we believe that that’s not appropriate, then that’s something that we can take up in the public square, and that we may be able to push back against as a people, but what it isn’t is unconstitutional.

  • 00:30:52

    John Donvan
    All right, we’re coming up to, uh, to taking a break. (laughs) I just wanna point out that as an organization called Open to Debate, I’m delighted to hear that both of you are opposed to the suppression of debate. We think that that’s a very, very good thing. Uh, when we come back, uh, we’re gonna bring in some other voices to the conversation to, in a sense, help with some cross-examination. This is Open to Debate, our mock trial. I’m John Donvan. We’ll be right back.
    [NEW_PARAGRAPH]Welcome back to Open to Debate. We are resuming from our recess. I’m John Donvan, and I’m joined by Chip Miller and Rylee Sommers-Flanagan, who are currently arguing in our mock trial of Murthy v. Missouri on topics of free speech, misinformation, government, and social media platforms. Now we’re gonna bring in some other voices, some individuals who have been listening in to the debate, who have themselves written and studied and thought a- a lot about the topics that we’re talking about. And up first we have Nina Jankowicz, who is CEO of the American Sunlight Project, former executive director of the Department of Homeland Security’s Disinformation Governance Board. Nina, welcome, and, uh, we are ready to hear how you might wanna cross-examine either of the sides in this conversation. The floor is yours.

  • 00:32:12

    Nina Jankowicz
    Great, thank you, John. Allegations of government censorship through coercion of private entities ought to be taken very seriously. We’ve heard that a lot today. And the US would absolutely benefit from a serious exploration of where public-private cooperation ends and coercion begins. But the brief filed by Missouri and Louisiana is not a serious brief. (laughs) These documents are deeply and deliberately divorced from reality, designed to chill research and government coordination on issues related to the health of our information environment during a key election year. Among other inaccuracies, they claim that the White House press secretary threatened, “Legal consequences against the social media platforms if they did not take action against health misinformation.”

    She never uttered those words. They claim that a White House official’s angry email to Facebook, again, I personally wouldn’t have used those words, (laughs) was related to content moderation when in fact, it was about an unrelated platform bug. They also claim that the federal government, like Mr. Miller has claimed, was suppressing speech via its flags. But I believe that that suppression can’t be happening when, as Rylee pointed out, we know mostly platforms declined to act on these government reports, and in fact, added context to content in question, left it up rather than outright removing it. So Mr. Miller, given these and other seemingly deliberate obfuscations littered throughout these filings, how can you argue in favor of further restrictions on speech? That is, both on the independent researchers who are studying disinformation and on the government’s established right to free expression.

  • 00:33:48

    John Donvan
    So Chip, the, the heart of the question I’m hearing is with, as Nina said, errors and a- elements of the case that, as she said, “were divorced from reality.” Does that undermine the case that was made?

  • 00:33:58

    Charles Miller
    Well, you know, look, I mean, this whole case is about either side thinking the opposing side’s views are divorced from reality. We live in a time where there are extreme, uh, positions on both sides. And it seems that, uh, we, we can waffle between those as far as, as who’s governing. And I think it’s very important for, for people who are very judgmental of the views of the plaintiffs here to recognize that those people could soon be in power. Y- y- and, and it’s t- simply, you know, it is true that the White House called and said, “Hey, you have to take these things down,” and that there’s a problem, and that that’s what we’re concerned with. And, you know, again, I’m talking about what the rule of law should be. If it goes back down to a trial court, and they determine that the facts aren’t exactly present here, then that’s fine. But I’m more concerned about what the rule should be. And the rule should simply be that the government should not be allowed to encourage the removal and repression of information.

    Anybody who’s a fact-checker can say, “Hey, that fact is wrong. You know, I’m a professiona- I’m a professional for disinformation and misinformation, whatever that exactly means. Uh, and I think that this is wrong.” Fine, say that, but let the other side have their say too.

  • 00:35:12

    John Donvan
    Rylee, your thoughts on this?

  • 00:35:13

    Rylee Sommers-Flanagan
    I think that these, these are really important questions related to what is it that people have access to in terms of information. And I think that the challenge here is that when we, sort of, uh, open, we, we want everyone to have access to all of the possible information, but we also want them to understand what has the backing of experts, where is that information coming from? Is it coming from the source that is speaking? Is it coming from somewhere else? Oftentimes I think that the government has a better handle on where information is coming from, but even better than the government are private researchers, disinformation experts, people who are tracking, and trying to understand where things are coming from online. And fundamentally what this, sort of, comes down to is we have to care about what the facts of this case are, and we have to care about whether or not the questions that the court is being asked to decide are questions that are actually supported by the facts in this case, and they aren’t.

  • 00:36:14

    John Donvan
    T- thanks very much, Rylee. I just, in the interest of time, I w- I wanna, uh, be able to move on to some other questioners. So, uh, Nina, thank you very, very much for joining us at Open to Debate. It was really a pleasure to have you. Um, I’d now like to invite into the courtroom Matt Taibbi. Matt is an author and a journalist who focuses on media and politics, and he actually has a little bit of a role in the development of this story, um, as one of the journalists who reported out the Twitter files, and the Twitter files revealed a lot of the, uh, the texture of the communications between Biden administration officials and Twitter in particular. So Matt, welcome to Open to Debate, and the floor is yours.

  • 00:36:50

    Matt Taibbi
    Thanks very much for having me on. I really appreciate it. One quick comment on the number of 50%: as a journalist, if the FBI had a 50% success rate in striking the content of a newspaper that I worked for, I would consider that evidence of living in a police state. So it all depends on how you look at things. I think that’s an extraordinarily high number. Uh, my question is for, uh, Ms. Sommers-Flanagan. Does the state of mind of the people i- working in the platforms play an important role in deciding whether or not coercion took place? Because one of the things that we found in the Twitter files over and over and over again is that Twitter did not understand these communications as mere information, that they did understand it as coercion. Repeatedly, we found regular evidence of that. And just as an example, uh, there was an exchange at one point in 2020, uh, between senior officials in the company about whether or not they were able to say no to a whole variety of intelligence agencies and enforcement agencies. And this is a quote from a former CIA official who was working at Twitter. And he says about refusing, he says, “Our window on that is closing given that our government partners are becoming more aggressive on attribution.”

  • 00:38:05

    John Donvan
    So Matt, it’s, it, it sounds like you’re telling us that the people at Twitter certainly felt coerced.

  • 00:38:05

    Matt Taibbi
    Absolutely.

  • 00:38:10

    John Donvan
    They felt they were being coerced. And I w- and you’re asking, “Rylee, if they feel they’re coerced, is that not evidence of coercion?” Rylee.

  • 00:38:18

    Rylee Sommers-Flanagan
    Well, I, I would start with one important fact, which is that if the entity that’s being coerced feels that it’s being coerced, it is the proper defendant in this case. So I would start there, and point out that I think that those are facts also that should be introduced in the context of the legal filings. And also that it, it would seem that Twitter in particular has the ability and willingness to end policies that are inconsistent with the government’s perspectives. Um, and I’m going to forget the specific content moder- moderation policy and what it’s called, but Twitter did terminate its policy that related to COVID-19 disinformation in November, I think of 2022, may have been 2023. And d- did, did so despite the disagreement with the government. The social media companies were able to push back and are able to reform their moderation policies without fear of repercussion.

  • 00:39:15

    John Donvan
    And Chip, it sounds like t- you know, this is obviously looking like a judgment call, how do, how people feel and does the evidence show that they, they were able to push back, or did they feel scared pushing back? What’s your response to it, [inaudible]?

  • 00:39:25

    Charles Miller
    Y- Yes, somewhat, l- look, I mean, so if, if you are a social media company, if you’re Twitter or you’re Facebook, your entire market participation and people being out there is based upon the assumption that you’re not getting pushed around by the government. All right, so they’re never going to come to court and admit, “Yeah, yeah, we were intimidated.” Because all of a sudden, like, people don’t trust them anymore. And so that’s why I keep saying like the facts of whether or not they were intimidated, or cooperated, or were persuaded or, you know, were enticed, doesn’t matter. It’s the fact that the government is requesting protected political speech be removed from the conversation that’s important.

  • 00:40:04

    John Donvan
    I hear you on that, but I want to understand, is it a request or demand?

  • 00:40:07

    Charles Miller
    When you have the government saying, and you have the White House saying like, “Hey, do this or we’re gonna come for 230,” when you have the FBI calling them to request content be removed, the FBI works for the attorney general, uh, you know, that’s serious. And, you know, what I can tell you is, is that in, in Bantam Books, and the way that the Supreme Court has treated the First Amendment up till now, it’s just, if they’re calling with a deliberate plan to s- to suppress lawful speech through informal censorship, it’s a violation. And that’s what happened here.

  • 00:40:35

    John Donvan
    Okay, Matt, thank you very much for your question. And we have time for one more. I want to bring in Eric Schurenberg, who is, uh, a business journalist and also a media e- executive, and has now, uh, founded a nonprofit called Alliance for Trust in Media. Eric, thanks for joining us on Open to Debate, and please come in with your question.

  • 00:40:51

    Eric Schurenberg
    Thanks for having me here, John. We’re facing what promises to be a highly contentious presidential election in a few months. It’s a time when many Americans would like to think that social media platforms and law enforcement are at their most attuned to anti-democratic and illegal content, like interference from foreign adversaries, or voter suppression schemes. In fact, all such efforts have been chilled by Missouri’s and Louisiana’s claims. So here’s the question. If social media companies can’t talk to the FBI or the Department of Justice, which have a duty to prevent such interference and such schemes, how can we protect election integrity?

  • 00:41:32

    John Donvan
    Rylee, I want to ask you to take that question first. And I think what I hear in Eric’s question is the assumption that a, a decision against the government in this case could really shut down the government’s willingness and practice of communication in general. Do you see that that threat is actually there, that t- the government could be so silenced by the court on this? Is that a realistic outcome?

  • 00:41:54

    Rylee Sommers-Flanagan
    Absolutely. I, I think that what we really need to consider on, in, in this case is what is the impact of telling the government that it can’t communicate, and particularly on the facts of this case. And I know that I sound like I just keep coming back to the same point, but part of the problem here is that we are starting from a place where the proscribed behavior would be so difficult to identify that we would have a very difficult time ensuring that the government could actually comply with the court order. Um, it’s imprecise and overbroad. It requires more, e- it requires the government to make judgment calls about what it can and can’t say that c- could very well endanger our elections, that could prevent social media companies from h- having access to basic information about what is being communicated on their platforms.

    And I think fundamentally, the question here about communication and, and what should be out there, I don’t think it’s a zero-sum game of, “We need to delete all content that says a certain, a- a- a certain thing that may be illegal.” Right. Um, voter suppression is illegal. Or, or that may be intentionally undermining elections, that may be trafficking in information that will harm people. So I think what we, what we need to think about is how do we place appropriate boundaries? This case doesn’t present the facts that allow us to do it in any sort of clear way.

  • 00:43:23

    John Donvan
    Chip, uh, same question for you. The implications of a vote going against the government, a scenario that, uh, Eric has has laid out, could be quite dire if he’s arguing for the government’s ability to communicate things that it, it feels it needs to communicate.

  • 00:43:36

    Charles Miller
    Yeah, so again, this case and the ruling from the court should be tailored to request to remove political speech. So if there’s a scenario where there’s some foreign actor trying to interfere with our election, that’s not protected. Agency for International Development v. uh, the Alliance for Open Society, uh, has held that the government can get involved and do that. Uh, with respect to, um, election misinformation, uh, the government can get out there and speak, and, and request actions for that. That’s Minnesota, uh, Voters Alliance v. Mansky. So what we’re talking about is a narrow rule here for political speech.

  • 00:44:12

    John Donvan
    Thanks very much, uh, for that answer. And Eric, thank you for your question. Uh, uh, obviously what’s, what’s come out in the answer to that question is wh- what, what is at stake here. The reason this is such a big deal, this case, is the implications for national security and for election integrity and for public health. All of these things now in our lives are inextricably linked to conversations that happen on social media in a very powerful way. So now we’re gonna head to our closing round, and in our closing round, each of our attorneys has two minutes to make one more time the thrust of their case, and try to persuade you that they are on the right side of this issue. Rylee, you are up first. Your closing statement, please.

  • 00:44:45

    Rylee Sommers-Flanagan
    At the end of the day, this case is about, one, making sure that courts don’t get, get involved in theoretical disputes. And two, understanding that checks on government overreach are essential to a functioning democracy. And the questions raised in this case, they are divorced from any evidence showing that the government has done wrong. We need courts to remain neutral, unbiased arbiters of the law. Mr. Miller has asked us whether we can decide between political questions. And I think fundamentally what he’s asking is, he’s, he’s telling us that political questions are easy to identify, and that political speech is easy to identify. And I would caution us against using that as a crutch to come to a conclusion in this case. W- what constitutes political speech may change at any given moment depending on who is in power, and who wants most to retain that power, to feed conspiracy theories, who wants to ensure that at the end of the day, people, uh, are afraid or feel that they have access to complete information.

    I think these are the issues that are at the core of the questions before the court, although what will actually, I hope, happen is that the court will take a step back, and will find for the government, not because it has resolved these profound questions of what is political speech, but instead because they haven’t been raised by the issues that are presented in this case, and the communications that we have seen between the government, and the social media companies, and the resulting, um, the, the speech that social media companies determined was in violation of their content moderation policies. Um, and with that, I hand it back over to Mr. Miller.

  • 00:46:45

    John Donvan
    Thanks very much, Rylee, and indeed Chip, it is your turn for your closing statement, please.

  • 00:46:49

    Charles Miller
    If there’s one thing that we know, it’s that if government has the power to act, it will. To the extent that the court rules here that the government can cajole, uh, or work with social media companies to take down speech that it disfavors, it will do so. And I think what’s important for our audience, for the, the court here to recognize is that w- where you currently sit on an issue, uh, should not determine where you stand in this case. Uh, meaning there are those of us who, who felt that it was important to take, uh, necessary precautions to save and protect public health. Um, uh, e- and so we may think, “Okay, the government was being helpful here.” But, you know, Mr. Trump may win, uh, the election this fall. And, you know, you just have to ask yourself, what happens, uh, if that, if that occurs, and what speech will be suppressed then when, uh, it’s his White House that’s calling, uh, to Elon Musk, uh, or to other social media companies to say, “Hey, take this down.” That’s the takeaway from that.

    Um, uh, currently there’s o- obviously ongoing debate, uh, about the, uh, Israeli-Palestinian situation. Uh, uh, no matter what side of that you’re on, imagine if it’s determined that your views should be suppressed. Um, and to the extent, um, uh, that it’s not easy to determine what is political speech, uh, the de- this, th- the default there should be for openness and to allow communication. Uh, you know, John, what I’m, what I’m concerned about here is that it feels that we’re heading to an illiberal age, and it doesn’t matter what party is in power, it seems that they wanna suppress views that they disagree with. That’s scary. It should not happen. And that’s why the court should affirm here.

  • 00:48:42

    John Donvan
    Thank you very much, Chip. And with that, this Open to Debate mock trial is adjourned. I wanna thank our attorneys who took part in this mock trial, Chip and Rylee, for participating, and for bringing your expertise and your experience and especially your insights as e- lawyers to a question that is weighing very much on the public’s mind. We, we love getting these insights into how, uh, the Constitution is supposed to work, and how things play out in the courtroom before the Supreme Court. So thank you very, very much for, for the way both of you did this, and conducted it with such mutual respect as well. We really appreciate that.

  • 00:49:12

    Charles Miller
    Thank you both.

  • 00:49:13

    Rylee Sommers-Flanagan
    Thank you.

  • 00:49:14

    John Donvan
    I wanna also thank our guest questioners, Nina, Matt and Eric for bringing in questions that really made the conversation even m- more interesting. And I wanna thank you, our audience, for tuning in to this special episode of Open to Debate. As a nonprofit, our work to combat extreme polarization through civil and respectful debate is generously funded by listeners like you, and by the Rosenkranz Foundation, and by supporters of Open to Debate. Robert Rosenkranz is our chairman. Clea Conner is our CEO. Lia Matthow is our chief content officer. Elizabeth Kitzenberg is our chief advancement officer. This episode was produced by Alexis Pancrazi and Marlette Sandoval. Editorial and research was by Gabriella Mayer and Andrew Foote. Andrew Lipson and Max Fulton provided production support. Mili Shah is director of audience development, and the Open to Debate team also includes Gabrielle Iannucelli, Rachel Kemp, Linda Lee and Devin Shermer. Damon Whitmore mixed this episode. Our theme music is by Alex Clement, and I’m your host, John Donvan. We’ll see you next time on Open to Debate.
