Episode 217 -

DISINFORMED: Trump's Facebook ban was upheld. Is this censorship? NO! PEN America's Nora Benavidez explains

air date May 11, 2021

retrieved from pen.org on 7/14/2021

On Wednesday, Facebook's Oversight Board upheld the decision to ban Trump from social media for now. Facebook has 6 months to decide if his ban will be permanent.

Some may say banning Trump from social media is a free speech issue, but in this episode of DISINFORMED, Nora Benavidez, PEN America's director of Free Expression Programs, explains why they're wrong.

Follow Nora: https://twitter.com/AttorneyNora

Listen now

Bridget Todd (00:03):

You're listening to DISINFORMED, a mini series from There Are No Girls On The Internet. I'm Bridget Todd. So you might've seen that on Wednesday, Facebook announced they were upholding their ban of the twice-impeached former president Donald Trump, for now. Here's what you need to know. Basically, Facebook played a huge part in the January 6th insurrection. Facebook was the main platform mentioned by those involved. It was used even more than other platforms popular with fringe right-wing types, like Parler, according to charging documents from the Department of Justice.

Bridget Todd (00:39):

This is why we saw Facebook and other platforms ban Trump on January 7th, the day after the insurrection. The ban was enacted after Facebook removed two of Trump's posts during the Capitol riot, including a video in which he told supporters to go home but repeated his false claim of widespread voter fraud, saying, "I know your pain. I know you're hurt. We had an election that was stolen from us." It should go without saying that this was an obvious lie. The election wasn't stolen, and this lie was the entire basis for his supporters storming the Capitol in the first place.

Bridget Todd (01:13):

Even though they banned him, we knew this wasn't permanent. Because despite being a billion-dollar company, Facebook is basically incapable of making a single clear decision. Facebook pretty much threw up their hands, said they were not able to make this decision, and hired the so-called Oversight Board to decide for them whether Trump should be permanently banned or not.

Bridget Todd (01:32):

The Oversight Board upheld Trump's ban for now, but also said that Facebook should not have imposed an immediate suspension without clear standards, and said the company should determine a response consistent with rules applied to other users. So the whole thing was basically a massive waste of time. And we're pretty much back where we started.

Bridget Todd (01:51):

The Facebook board pretty much punted to Facebook and said that they should make the decision about whether to reinstate Trump consistent with their policies. And now Facebook has six months to figure it out.

Bridget Todd (02:02):

This is a real failure of leadership. The Facebook Oversight Board is basically just a way for Facebook leadership to avoid real accountability. You know, the kind of accountability that comes with making a decision. Facebook has already done so much harm to our discourse and democracy. And don't forget the kind of disinformation, harassment, and incitement to violence that has been allowed to fester on their platform, which overwhelmingly hurts women, LGBTQ folks, and people of color.

Bridget Todd (02:29):

Now, many detractors might say that banning Trump from social media is a violation of freedom of speech. But Nora Benavidez, director of the U.S. Free Expression Programs at PEN America, says those people are just wrong. Let's revisit her episode of There Are No Girls On The Internet and listen as she gets into why banning Trump from social media is not a free speech issue.

Bridget Todd (02:51):

Today marks the start of Trump's second impeachment trial for his role in the insurrection on January 6th. Here's Maryland representative Jamie Raskin earlier today at the trial.

Jamie Raskin (03:01):

If that's not an impeachable offense, then there is no such thing. On the day Congress met to finalize the presidential election, he would have you believe there is absolutely nothing the Senate can do about it.

Bridget Todd (03:16):

Now it's clear that Trump continuously used social media to spread disinformation, including the repeated baseless lie that he won the 2020 election and that it was being stolen from him. That lie as we know, culminated in his supporters attacking the Capitol. But even before that, Trump has always used social media to incite violence, stoke tension, and spread distortions and baseless claims. That is until January 8th, when Facebook, Twitter, and other platforms finally gave Trump the boot.

Bridget Todd (03:44):

But don't give tech leaders too much credit. They only did this after years of pressure from platform accountability advocates who warned that failure to act could lead to the kind of violence that we saw on January 6th. And even now, Facebook's oversight board is weighing whether they made the right decision by kicking Trump off the platform. So it's possible that we haven't even heard the last of Trump on social media.

Bridget Todd (04:07):

A lot of people were questioning whether, by banning Trump from social media, his free speech was being threatened. But Nora Benavidez, free speech attorney and the director of U.S. Free Expression Programs at PEN America, a nonprofit that protects and promotes free expression, says that when it comes to understanding free speech on social media platforms, a lot of us are asking the wrong questions.

Nora Benavidez (04:27):

I remember when the insurrection happened, it felt for me like total whiplash. There was still a kind of hesitancy, concern like what was going to happen in the lead up to the inauguration. So that January 6th just somehow felt not too surprising, but still very, very jarring. And in the weeks that have followed, we've just tried to be responsive to people.

Nora Benavidez (04:54):

People have questions like were the people storming the Capitol exercising their First Amendment rights. Did Trump have a right to be on Twitter and on Facebook? And so a lot of questions have emerged that we've just tried to be ready to answer and to talk with people about what's really happening. Because they're really, really complex issues.

Bridget Todd (05:15):

So full disclosure, I have, as you know, worked with organizations that have tried to pressure social media platforms to take disinformation seriously. So a big part of that has been asking them to remove Trump when he tweets things that are incendiary or inciting violence. And truly, I don't think I ever thought I would see Trump be banned from platforms. I remember when that happened, I was fully kind of surprised because I think I was quite used to the idea that he's just above consequences or accountability. There will never be any of it. So I was quite surprised. What did you think when you saw Trump being banned from these platforms?

Nora Benavidez (05:56):

The insurrection happened on I think Wednesday. And then the cascade of first, there were labels placed on a video of his. Then he was permanently suspended from Twitter. And it just sort of felt like this jaw dropping cascade where I was like this is really happening.

Nora Benavidez (06:15):

And immediately though, I was watching on my own Twitter of people saying, "This violates the First Amendment. This is an egregious assault on free speech." There was so much hate and confusion, and frankly miseducation. And part of what is important to keep in mind is that Twitter, Facebook, those are private companies. And you know I am a First Amendment lawyer. I firmly believe in the ability for all of us to engage in open discourse, to hear what other people think. But Trump has absolutely contributed to, if not been the biggest superspreader of dangerous misinformation that I think is absolutely disinformation with the goal of kind of dividing us and making people believe these false narratives.

Bridget Todd (07:08):

Deplatforming is actually pretty effective. Booting Trump from social media almost immediately slowed the spread of false information on those platforms. Zignal Labs found that false claims about election rigging dropped from 2.5 million mentions to 688,000 mentions across platforms, and hashtags commonly used to spread election rigging claims like #marchfortrump dropped by 94.3%.

Bridget Todd (07:32):

And it's not just Trump. A study conducted by the Election Integrity Partnership found that just a few dozen pro-Trump Twitter accounts, including Trump's own account were the original sources for about a fifth of misleading election claims around the 2020 election. It really goes to show you how just a small handful of accounts can be responsible for major chaos on platforms.

Nora Benavidez (07:53):

I was actually very much in favor of his being removed. I think what was necessary and really good was that data immediately emerged from a couple of researchers that found about 73% of the misinformation that had been on Twitter was no longer there thanks to his removal, along with the removal of about 37 others who were sort of the biggest influencers and spreaders of misinformation.

Nora Benavidez (08:20):

And I think that we need data like that to be able to make the case again, to platforms in the future. Because the big criticism was this just didn't happen soon enough. And I'm all here for the criticism. But I also think we need to lay the foundation for what are solid policy and community guidelines and standards that the platforms are implementing that then have data to back up that when they do that, it benefits the health of the internet.

Bridget Todd (08:49):

People were able to say here was the measurable impact of banning Trump and other accounts who were responsible for spreading this kind of thing. Here was the measurable impact. And truly, what has been the downside? Less noise, less lies spreading like wildfire on these platforms. It's been interesting to see how there was such a quick benefit, a quick and obvious benefit with very few in my book, if any downsides.

Nora Benavidez (09:19):

I think there may be the downside of people who supported Trump feeling somehow more, quote unquote, in air quotes I'm doing this, censored. And that somehow, their censorship makes them emotional, driving them potentially to be more upset. And you know my work on the psychology side of misinformation; I think we have to stay vigilant to the ways that people respond emotionally to information, the way we are all emotional about news. I mean, our relationship to news is emotional. We've seen the way platforms hinge our interactions on our likes or dislikes. So it's I think important to make sure we don't completely marginalize people that may actually be movable. Maybe not today, but down the line.

Nora Benavidez (10:12):

So I kind of have this lingering concern about how we engage those that continue to believe alternate realities, that continue to want to see Trump. And it's really hard to reach those people. So it's like a sort of weird gut instinct where I kind of wonder, what are they thinking about? How can we sort of call them in and talk with them about these issues? In ways where if they had been unmovable before, slowly, slowly, some of these Trump supporters are starting to see the cracks in that reality.

Bridget Todd (10:51):

I hadn't even thought about that, the kind of emotional response that someone might have to feeling the shame of a platform saying your ideas, what you have to say is not acceptable here. I hadn't even really thought about that. And I think it really highlights to me that at the heart of a lot of your work with PEN America, and just generally is empathy. And sort of trying to remember that deep down, we're talking about people. People are all complex. All humans are driven by a complex series of feelings and motivations internally. So remembering that these are not just 'users,' but they're people who have these complex emotional responses.

Bridget Todd (11:32):

I think that empathy can get lost in conversations about disinformation and how it spreads, and what motivates people to spread it. But I am grateful that that idea seems to be at the heart of so much of the work that you do to create off-ramps for folks like this.

Nora Benavidez (11:50):

Bridget, I have to say I never thought I would play even close to the therapist role that I feel like I do sometimes. Being a lawyer, thinking about the law, thinking about legality on Twitter or Facebook, and yet being faced with people who ask questions and actually want to engage somehow. And I think the problem the internet has bred is that, as we all know, we're siloed, we seek out our own. And difference of any sort, whether it's something we can accept and listen to or something we find so offensive and egregious, we simply want to shut down and not hear.

Nora Benavidez (12:32):

And Trump I think went so far as to then when he heard things he didn't like, he called those lies. And the tendency of course is to shut people out, to disengage. And that is not getting us anywhere. In fact, I think the idea that we never engage with people who are radicalizing or that we never seek out others has led to things like the insurrection, where we need to engage. We need to find entry points. And I think we need to have a kind of expectation setting where we're not all going to agree. But we can exist in disagreement, and be civil, and believe in each other's humanity.

Nora Benavidez (13:11):

Of course, if our disagreement is grounded in someone believing you or I shouldn't exist or have the same rights, that's different. But the level of empathy that I hope we can bring sometimes to our online discourse is something that can try to create space for understanding. And I'm really glad that you see and talk about empathy in this way, that you see the value of those things. That as someone who is a storyteller, you focus in media and stories. It's so critical to remember that even if we find it somehow offensive, we have to find ways to engage and find each other's humanity. Even if it sounds corny.

Bridget Todd (13:54):

It doesn't sound corny. Often, you said that you feel like a therapist. And when I have these conversations about tech platforms, it's funny how often they turn to things that sound a bit corny, like empathy, or remembering there's a human on the other end of the screen, things like that. And I think you really hit on something that is one of the reasons why I'm so interested in disinformation and making the show is that I feel that that thing that you just described, the silos and people being closed off and all of that, I feel like we've gotten to a place where it's no longer easy to assume good faith when you're having discussions on the internet. Right? When I first got online, the reason why I found such a freedom and love of talking on the internet was because it felt like I was able to reach so many different people, get so many different perspectives. And it felt relatively safe. Now, I feel that that safety is gone, right?

Bridget Todd (14:51):

When I have a conversation with somebody on Twitter, I'm not sure if they're someone who we can have a good faith disagreement or a good faith dialogue. It just feels like we're no longer able to assume good faith in all of our discourse online, because the temperature has been turned up so high in part by things like disinformation, and lies, and violent rhetoric online.

Nora Benavidez (15:14):

I'm going to make it really personal. Sometimes I get retweeted on Twitter by people who then have a comment about my tweet. And often, the assumptions, often wild assumptions, people will make about me and who I am really stress me out. And I totally, totally know what you're saying, where it feels like you may not be able to trust someone. So I often try to then engage with the people that retweet me, whether it is someone on, frankly, any side of the political spectrum making these assumptions. And in no way to defend myself, but just to open conversation.

Nora Benavidez (15:56):

I've had really wonderful learning experiences where people will kind of turn down their temperature when I respond in ways that engage them with open-ended questions, which is hard to do. I mean, you can't do that at scale every day. And that's one of the questions that I think about all the time is how we scale empathy.

Nora Benavidez (16:17):

But in the meantime, those little one-offs that can help change or slowly bring people together and help their attitudes kind of move at a slower pace where they're assuming less about me or you. I think is a great thing.

Nora Benavidez (16:33):

And the internet has changed. I mean, I remember when something like section 230, not to get technical, but when that was an exciting regulation to actually help people feel safer on the internet. To be able to have conversations on blogs. And the internet is just such a different place now, that it feels like you're always just about to have a complete crisis on Twitter. That's how I always feel. I'm like, "This could really go badly." And we can't all spend all of our days having those meaningful, slow conversations. So I guess I just sort of wonder how do we scale empathy.

Bridget Todd (17:15):

Let's take a quick break.

Bridget Todd (17:15):

And we're back. I say this a lot. We need to completely reimagine how we think of the internet. And that means rethinking so much of the infrastructure around technology. It means creating a tech press that centers and valorizes empathy, not scale. It means building platforms that center people, not users, or clicks, or how much time we spend with our faces at a screen.

Bridget Todd (17:48):

Now what I'm describing is a radical reimagining. And it might sound as big or as complicated as smashing the patriarchy or ending white supremacy. But, I believe that it's possible. I think there are enough people out there like you and me who truly believe in the power and possibility of the internet, and the power of expression. And I believe that when we come together, big radical things really are possible. And I know that we have the power to radically rethink the internet so that it's safer and more accessible, and so that it works for more people.

Nora Benavidez (18:18):

I totally agree. I mean, smashing the patriarchy and ending white supremacy are also valid and worthwhile endeavors, is how I feel. I'm like, all of it. Just yes-and to everything. The best technologists, the people working on innovation who I think are doing the best work, understand that these are not tech issues. They're human issues. And we're never going to solve everything by abdicating control, or brainstorming what can be automated, what AI can do, what the platforms can do better somehow, to basically game our way out of what humans are fundamentally now doing. Which is: why are we hateful sometimes? Why are we bigoted? Why do people have an interest in trolling others?

Nora Benavidez (19:09):

So I think that a lot of the solutions have to be human centered where instead of just saying, "The platforms can do X, Y, Z, or the government can do all these other things." What can we do? What can everyday people do to feel more empowered? Maybe they can build. Imagine this. Build an ecosystem of users and other people they enjoy connecting with. I mean, can you imagine the joy of actually thinking that your Twitter feed is an interesting place to learn things instead of just who has the best breaking news 140 character tweet? That would be really exciting I think.

Nora Benavidez (19:47):

So I love the idea, and I think there just need to be more of us doing this. More of us committed to talking about humans, and how humans fit into what we're doing as users. Because it really should go the other way around. Users don't come first, it should be people coming first.

Bridget Todd (20:04):

You're actually doing a lot of that work in terms of building more people who see things like that. At PEN, you have this weekly series where you are joined by other journalists and other thinkers to help people understand the role that we all play in making the internet a better place. Can you talk about some of the steps that anyone listening could take to help make the internet a safer and better place?

Nora Benavidez (20:27):

Oh, Bridget. That's like my favorite question. You got me at the sweet spot. I'm like, here we go. We've looked at disinformation for years. And everyone always asks what's the solution. So kind of to my last point, I think a lot of it is that we have to focus on humans and give people power to take control of what they're seeing online. So I always think it's best to frankly just start with understanding what you do when you're online. How are you consuming information, Bridget? Is it on Twitter? Let's be honest. Are you going to Facebook and are you clicking on links, or are you maybe privately just reading the headline that I shared or your other friend shared? Let's be real. Sometimes that's all I do. I'm like, I can make a wild assumption based on what Bridget shared. And I'm not even clicking the link.

Nora Benavidez (21:19):

So just sort of taking a pulse, being honest with yourself. Are you going to Fox News? Are you going to the New York Times? Are you going to news websites? How are you doing things? Are you listening to podcasts like this? I think it's good to know how you consume information.

Nora Benavidez (21:36):

And then I also think with that, we should all start asking ourselves how we feel when we're online. Are you feeling anxious when you read things? Are you angry? A lot of times, disinformation will thrive because we're emotional, and we often want to share things. We want to engage with people quickly. So we'll accept something without fact-checking it, without thinking about or asking ourselves who wrote this? Where did it come from? Why am I seeing it?

Nora Benavidez (22:04):

So if you can just pause, that is my number one recommendation. Pause to ask a number of things, but just pause. And from slowing down, I think we actually could all benefit. We could potentially share less misinformation. We could also then think more carefully about what our own attitudes are based on what we're seeing. A whole host of, frankly empathy could arise out of just slowing down and getting to a place where we're not just liking, disliking, loving, laughing at things. But maybe where we're in a place of, "Huh. I want to learn more about it."

Bridget Todd (22:44):

Yeah. If you were in my kitchen right now where I am, I have a post-it note on my laptop that just says slow down. And it's just a reminder for me in all the ways, because I can sometimes move too quickly just in life. But I think that the idea of just taking a breath, taking a beat has been so helpful. It sounds so basic. But before I started really thinking critically about the way that I felt when I was online, I would be moving so quickly. And then when I started to slow down a bit, I actually kept a little journal. So if I saw something that made me feel anxious, I would write that down. If I saw something that made my heart race, I would write that down. If I saw something that made me laugh, I would write that down.

Bridget Todd (23:29):

And from doing that, I really saw the ways that it's just like my brain chemistry was just firing so quickly. And I wasn't even stopping to think. And I find that when you have those moments where somebody tweets something they really wish they hadn't, and it's going viral and they're like, "Oh God." I always wonder, were they in that moment where they've just been seeing so many different takes, and different jokes, and different tweets, that their brain is no longer processing what they're doing? That they're just putting a tweet into the world without even really thinking about it.

Bridget Todd (23:59):

So I really find so much value in almost thinking about using the internet as a kind of meditative practice and an opportunity to really be mindful of how you're doing, how you're feeling. Are you anxious? Are you emotional? What's going on? As opposed to something where you're just mindlessly scrolling and not really being in charge of your emotions, and your brain, and all of that.

Nora Benavidez (24:27):

If I could do away with one term this year especially, it would be doomscrolling. Because one, when are we not doomscrolling nowadays? News is never inspiring. But also, it's so passive. And part of what I think is really great about what you're explaining is that you have literally taken control of what you're doing to say, "I'm going to proactively write things down. I'm going to question and reflect on what I'm feeling." Sometimes, just the sheer act of feeling like you're in the driver's seat, whatever it is then that you end up doing. If it's writing your feelings down, if it's deciding not to tweet. But just thinking of yourself as the one that's in control instead of doomscrolling, I think is so exciting for people. It's almost like a mind shift where, once I say it sometimes in our trainings, people will say, "I didn't even think that I could curate my newsfeed that way." And I'm like, "Yes, we can choose." And it would be so exciting to be stimulated, not just totally freaked out by what we're seeing.

Bridget Todd (25:34):

Exactly. And I think that's something that I think that everybody can take away. So just this week, we might not be able to topple the stranglehold that some of these platforms seem to have on our democracy, on our discourse. But individually, you can control how you consume social media. You can delete Facebook from your phone and say, "I'm only going to go on from my browser twice a day," or something like that. We might not be able to control how these things show up in a larger way. But we can control how they show up to us personally, how they show up in our household. Whether you take your phone to bed to doomscroll Twitter before sleep or not, those are choices that we have control over, not anybody else. So really trying to internalize that we do have some control, even though sometimes it feels like we don't.

Nora Benavidez (26:25):

You're not going to be great at this overnight. It won't be like suddenly, you feel more excited or eager to be online and to find this healthier internet. But I think that it can slowly lead to other practices. And if you can start with slowing down, so let's say just questioning yourself, scrolling at a slower pace, finding then interesting articles, and pieces of content, and people to follow and connect with. Then you may frankly start wondering other things. You can maybe get better at spotting when a headline lacks context.

Nora Benavidez (27:10):

I've gotten to the point, and it doesn't happen overnight, but you can sort of be scrolling and you read a piece of content and you're like, "I would normally be really upset by that. Why would I be upset? Could I explore what the underlying reason would be?" And when you slow down like that to ask deeper questions, not just about your own reaction, but about the piece of content you're seeing, I actually think we're all training ourselves to have a kind of media literacy and digital literacy that everyone's going to need in the future. I mean, the internet isn't going away. So we need that eventually.

Nora Benavidez (27:49):

The other thing is disinformation is a service for hire. It's an industry. And disinformation is not going away. Online hate is not going away. So if we all chip away at it a little tiny bit and build up better and better practices, we could truly help create the ecosystems we enjoy, ones with less of that. And at scale, it isn't perfect. But it's at least one piece of how we can help create something that's healthier. And frankly, less hateful.

Bridget Todd (28:21):

More after a quick break.

Bridget Todd (28:34):

Let's get right back into it. I believe the only way that some of these platforms will meaningfully rethink the role they've had in spreading misinformation and hate is if we make it unprofitable, right? So if it hurts their bottom line to traffic in hate, and violent rhetoric, and disinformation, that's the only way that I think that some of these companies will stop trafficking in it, I guess. And I think that the way to do that can be so individual. I'm not going to engage with it. I'm not going to amplify it. I'm not going to support it. If I see that kind of thing online, I'm going to kind of divest from it, I think. I think a lot of that can be driven by individuals.

Nora Benavidez (29:17):

I'm thinking of the headline divesting from disinformation. I love it.

Bridget Todd (29:20):

Yes. We should all be doing that I think.

Nora Benavidez (29:25):

That's your article, Bridget. I mean I think the other ... I don't want to seem too corny in that humans are going to solve all of it, because I think the platforms have so much responsibility. And I think the future of minimizing how toxic the internet can be in part is that the platforms have to begin informing users in better ways.

Nora Benavidez (29:49):

And I don't know if you and I have ever talked about it, but I think that Twitter's very first step into putting labels on content hit a horrible moment from the media's perspective, when people started calling these "warning labels." Because I think that that has such a negative connotation. And I think that labels and the platform mechanisms to slow people down are actually a fantastic measure that we should be encouraging and learning about. It's the stuff where, absent me teaching millions of Americans, let's say, about media literacy, platforms can help slow people down.

Nora Benavidez (30:26):

So I think there's this other piece also that can happen in tandem to you and me taking control, that the platforms try to find ways to very in a non-partisan, careful, educative manner, slow people down. And just do the thing that we're already doing a little bit. Question what we're seeing, wonder if there are ways that we want to explore or learn more about something. If we know that it's misleading.

Bridget Todd (30:55):

Yeah. So you mentioned when you share something based on a headline, that thing that Twitter has now, which I think is a great tool. But sometimes I feel kind of dragged by it where it's like, "Do you want to read the article before you retweet it?"

Nora Benavidez (31:06):

How many times have you not read it, Bridget? Be honest. Because every time, I haven't read it.

Bridget Todd (31:12):

Yes. And it honestly is a good reminder. And when I get it, I'm like, "Come on. I'm retweeting an article by somebody that I know. I trust them." But it is a good reminder to just ask, "What's the hurry? Why don't you just read it?" Sometimes I'm like, drag me, Twitter functionality.

Bridget Todd (31:30):

So I want to switch gears a little bit. As we're seeing right now, elected officials are being held accountable for their role in either supporting or inciting the insurrection. Trump was obviously kicked off social media platforms. Senator Josh Hawley's book was briefly dropped by its publisher. And kind of along the same lines of free speech online, I've seen a lot of people asking questions about what is commonly known as, 'cancel culture.' So what are your thoughts on so-called cancel culture as a free speech attorney? Are these free speech issues?

Nora Benavidez (32:03):

I'm going to give you the short answer, which is it's really hard. Lawyers will often say it depends. And part of what I think is difficult is that cancel culture often seems to imply that you're going to be in a way canceled from an existing platform, from your base. Which also then to me suggests that you have a base to be canceled from. You don't talk about canceling someone who isn't very well known. And I think there's this interesting power dynamic within the paradigm of thinking that there's a cancel culture.

Nora Benavidez (32:37):

I think it's more interesting to weigh what's at stake here. When we think about someone like Donald Trump, was he canceled? I honestly think that he was legitimately taken off the internet because of the real-world harm his words online have. And that is not cancel culture. That is a very deep, difficult balance. Where all of us, whether it is regulators, policymakers, platform people, everyday users like you and me, ask: what are his words doing to our world? It is a drastically dangerous step when he spreads lies, when he sows discord, when he promotes hate. And in all the different ways the platforms were late, I think they were somehow, if I had to guess, telling themselves it would not be that bad. And somehow the insurrection was that crack, that breaking point where people realized online issues have real-world harms.

Nora Benavidez (33:40):

Which has happened before. We've seen it when there are mass shootings. We've seen it when there are acts of terrorism and other types of bombings. And what's fascinating in that moment is that I'm like, "Is that cancel culture?" I don't think that is. I think that that is a private company taking necessary actions to limit or triage the type of real-world harm that's happening. I think it was absolutely too late. And I always think that it was too late.

Nora Benavidez (34:08):

At the same time, Donald Trump is a public figure and was an elected official who was leading this country. So there is also this interest in hearing unedited what he says. So from a free expression perspective, I always wonder what is the most speech we can have with the least harm? I think we passed the harm point frankly some time ago. But there's definitely, I think we have to stay committed to having access, all of us, to various forms of expression. Because free expression rights aren't just Donald Trump's right to say what he wants. We all have free expression rights to access what various people are saying. I have the right to then hear what you are saying. You have the right to hear what I am saying. And in some parts of the world, part of what we have to keep in mind is Facebook, Twitter, that's all the internet. That's all people have. That's the only connection they have to what's happening politically. We are seeing right now in Myanmar, with the internet being shut down, that is a political tactic to silence dissidents and dissent. So it's a difficult question to answer when there are questions implicated in that larger one around who is speaking, what is the effect offline? Is it damaging? When in total, it seems to tip the scales towards violence and hate.

Bridget Todd (35:34):

I have to say in general, I hate conversations about 'cancel culture' because the concept just seems so disingenuous. When people talk about the importance of being willing to engage ideas one might not agree with, it always somehow seems like marginalized people are the ones expected to be doing the engaging. And arguing with people who are hell-bent on misunderstanding you is exhausting. And if you aren't even able to talk about something with a shared set of guiding facts, the sky is blue, two plus two is four, Trump lost the election, then it can really get thorny.

Nora Benavidez (36:06):

When the election happened in November and Biden won, there was I think rightly a kind of exhaustion. People were so tired of fighting and eager to disengage completely from what might be different. And I'm with a lot of those people. I think it's really exhausting work to try to engage with people who disagree. And you have to always know that ultimately, a lot of conversations will not be productive. And it isn't on you or me to do the labor of teaching someone. Someone else's teachable moment isn't necessarily my job.

Nora Benavidez (36:48):

At the same time, I think the crisis, the basic human crisis we're in of how divided we are, merits a very serious examination that we all need to sit with: why are we interested in disengaging all the time from anything different? And that permeates all of us. So I think the thesis is, why are we driven to do things that divide us? And then how can we overcome our own instincts to actually broker conversation? To find an entry point. Even if it's little, and even if it only happens for a couple of minutes. That is the beginning of being able to find what I think is the antidote to disinformation, where we have a shared basis of facts. Even if you and I disagree on policies, we can both look at a set of facts and say, "Yeah, that's what we're basing our opinion on." And we may differ then on what we think needs to happen, but we can agree on certain things. And we have absolutely not reached that point of being able to agree on basic facts.

Bridget Todd (37:50):

Yeah. That's exactly where I want to get us to. We can disagree, but we are all on the same page about some basic facts. Two plus two is four, the sky is blue. We have a shared understanding of reality. And it's sad to me that we're not there.

Bridget Todd (38:04):

Pew actually found that among Trump voters, 40% say that he definitely won the election. Another 36% say that he probably won the election. And this week, Reuters polled members of Congress. Now about 90% of them wouldn't say one way or the other whether or not they agreed with Trump that the election had been stolen from him. And two of them still say the election was stolen from Trump.

Bridget Todd (38:25):

So it's like we're not even speaking the same language. We do not have a shared reality. And it seems like such a low bar, but that's where I want to get us to. A place where we have a shared reality.

Nora Benavidez (38:35):

I mean, it should be a low bar. But honestly, there's also just so much bigotry and hate, that a lot of those kinds of civil discussions are not grounded in mutual humanity. So my civil rights background is like, how do we have the most for the most people? And how do we maintain and protect basic rights for people who have traditionally and historically been marginalized in every institution? So there's this other piece that informs the work I do and how I think about mis and disinformation, because I know that it affects different communities differently. Black and brown people are targeted with disinformation so that they are disenfranchised and believe their voice doesn't matter. So sort of in the middle of all of it, I still very much have eyes wide open that some of us will be affected, targeted, and marginalized in more specific and acute ways that have massive effects on democracy.

Nora Benavidez (39:32):

So I don't want to sit here like Pollyanna, "It's all going to be great if we could just find moments of factual shared belief." Because some of that assumes that other people will believe you and I matter as much as anyone else. And the reality is a lot of people don't think that.

Nora Benavidez (39:51):

But some, some are moveable. Some are people where you engage and they are not hateful people interested in less for certain people. And that's the moment where I'm hoping to reach at least some pockets of the United States.

Bridget Todd (40:08):

So you sound pretty hopeful.

Nora Benavidez (40:10):

Oh, Bridget. I actually have a lot of hope. I mean when I was younger, I don't want to get too misty-eyed, but I would work in the South with people like John Lewis and other civil rights figures. And they always said that change takes a lifetime. It takes many lifetimes. And this intractable issue of how we connect is one of the biggest issues, I think, of our lifetime. And it's very slow. So I am hopeful in the sense that I see, every day, more and more people interested in doing this work. I see movement lawyers. I see amazing researchers finding data and trying to find ways to triage the way the internet is making us all bleed. And I'm very hopeful. It takes a long time. And for every step forward, there'll be three steps back, and then maybe two steps forward. Maybe another step forward. So it's really just sort of a dance with democracy.

Bridget Todd (41:10):

Where can folks keep up with you? And, can you tell us more about the weekly trainings that PEN America is running?

Nora Benavidez (41:18):

So people can always follow me @AttorneyNora. I'm @AttorneyNora on every platform. And then PEN America has trainings around media literacy and disinformation defense almost weekly. And that all is on pen.org. Our next series is going to focus on Black and Latino mistrust around the vaccination process, and try to help walk people through why hesitancy is okay. Why it's okay to question, is the vaccine good? Should I get it? And so we'll be doing a series around disinformation and COVID later this month.

Nora Benavidez (41:56):

And then we're also going to be doing work on how community organizers can embed messaging in their work with communities. We really want to empower people to feel like they know how to talk about these issues when someone comes to them. So more to come. Pen.org, you can check it all out.

Bridget Todd (42:18):

If you enjoyed this podcast, please help us grow by subscribing. Got a story about an interesting thing in tech, or just want to say hi? We'd love to hear from you at hello@tangoti.com. DISINFORMED is brought to you by There Are No Girls on the Internet. It's a production of iHeartRadio and Unbossed Creative. Jonathan Strickland is our executive producer. Tari Harrison is our supervising producer and engineer. Michael Amato is our contributing producer. I'm your host, Bridget Todd. For more great podcasts, check out the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.