I’m Kara Swisher and you’re listening to “Sway.” I’ve criticized Facebook a lot over my years covering big tech, and honestly, they deserve it. But one of Mark Zuckerberg’s most questionable moves of late was the creation last year of an oversight board. On paper, the board is independent: twenty people from across industries, a body Zuckerberg has called a, quote, Supreme Court. But it’s a body that’s been funded by Facebook, and the board’s initial members were picked by the company, which presents quite an optics issue. I’ve called it a Potemkin village, meant to paper over Facebook’s lazy abrogation of responsibility. So the board’s big decision this week to uphold Facebook’s ban on Trump raises a lot of questions for me. They said Facebook had six months to be more specific — a forever ban or a time-limited one. The board said, quote: “In applying a vague, standardless penalty and then referring this case to the board to resolve, Facebook seeks to avoid its responsibilities. The board declines Facebook’s request, and insists that Facebook apply and justify a defined penalty.” I couldn’t have written it better myself. Actually, I have, for years. Alan Rusbridger is one of the founding members of that board. He was Editor in Chief of the UK’s Guardian and steered the paper through WikiLeaks and the Edward Snowden revelations. Now, he’s on the other side, working with a company he once would have covered and demanded answers from. He’s here today to take me behind the board’s decision, because he was in the room where it happened.
Alan, welcome to Sway.
Hi. I’m glad to be here.
How are you doing?
I’m all right.
So you must be exhausted, at this point. Let me just say, this was a decision I did not expect you all to make. There were so many scenarios of what could happen here, and this was fascinating. But I’d first like to talk about how you got to the board. So, you’ve been a journalist for decades. You started reporting at The Guardian in 1979. You were Editor of the paper, as it broke news on WikiLeaks, the News International phone hacks, and of course, the Edward Snowden revelations. So now, you’re an oversight board member. Why? How did they get you to do this?
Well, I mean, you can’t edit a paper for 20 years, and not think deeply about free speech. And I’m really interested in social media, and all that’s good and bad about it. And I’m very cautious of people who just want overnight regulation, as though we’re going to solve this by next Tuesday. And so, when somebody came along and said, would you be interested in being considered? I said, yes. It was a long process. It took about a year for them to pick the first 20. And of course, we were all skeptical, as you are, about whether this is a truly independent thing or not. And I think we were all — those of us who joined — are satisfied that it certainly feels independent.
All right. So, what did they do to convince you? Because Facebook has had a lot of taint over various things, over the years.
Well, I suppose, two things. One, that we would be genuinely free of any influence from Facebook, once we were up and running. And so far as I’m concerned, that is true. I mean, we have our own board of trustees who oversee us, with some very distinguished people who I respect. So I don’t feel that there’s any way that Facebook has any hold over me, which is, actually, what I felt when I was editing The Guardian. I worked for a trust. And similarly, I felt very independent from any pressure. And the second thing was, I guess, we all wanted some assurance that Facebook would be bound to do at least some of what we asked them to do. So there are three things we can do. We can order them to do stuff. We can say, look, take this down. Leave it up. We can ask them questions. And we can make policy recommendations. They’re bound to accept our decisions, but they’re not bound to accept our policy recommendations. So I feel, it would have been pointless joining a board that didn’t have, at least, that element of compulsion.
Of course. But did you have any reservations about the image of Facebook around the world and the idea of affiliating your reputation with it?
Of course, of course. And Facebook is incredibly cack-handed, opaque, arrogant. We could spend a whole podcast discussing all the things that are wrong with Facebook. But if you believe that social media, also, has a good and great function, and has the potential, not only the potential, it is actively doing good as well, then, of course, you could stand on the sidelines and throw rotten tomatoes. But if you wanted to make it work, then sometimes, maybe, you have to roll up your sleeves and try.
Were you worried that this was Facebook’s attempt, as many people think, to stave off not regulation, per se, but to give a veneer of regulation? When, in fact, you’re not really a Supreme Court. I find that unusual, that Mark used that term. And I always look at what he says, and sometimes, he can be very ham-handed in the way he talks about things.
I mean, listen. It’s an odd situation. I’ve never met my fellow judges in real life, because we started in the pandemic. But I’ve come to know them over Zoom and to respect them greatly, and quite a lot of them have very little time for Facebook at all. The idea that this is a hand-picked board of people who are going to give Facebook an easy life just doesn’t hold up. So I think all of us, we’re none of us fools, we realized that this could be used as a fig leaf. I think you have yourself used the words Potemkin village, in relation —
Yes, I believe.
— to what we do.
I think, probably, fig leaf.
So we’re not blind —
I don’t question you, I question Facebook. Just to be clear.
But we’re all aware of that. And maybe, in our judgment yesterday, you see our awareness of that. Please don’t ask us to do stuff that is your responsibility. So thank you very much, but that’s your job. That’s not our job. And so I think, in all our work, you will see us very aware that we don’t want to be used in a way that dodges Facebook’s own responsibility.
Yeah. I think that was the center of my complaint about it. This was their job. I’ve always said that they’ve abrogated responsibility. That’s been my main message. So you signed on to the board last year. Did you ever imagine, at that point, that a sitting US President would be using social media to help incite an attack on the Capitol? Which you all clearly said is what happened in his posts, that he was using these posts to do so.
No. I mean, the thing with Trump is that he just constantly astonished, well, me. He constantly astonished me. Every time, you thought he couldn’t do something more outrageous, or more bad, or more erosive of democracy and of the role of the press, and then he did. But never in my wildest dreams could I have imagined a scenario where he would behave as he did in the run-up to January the 6th, and on the day itself.
And using social media to do so, among the many things. I don’t blame social media, completely, but it was a long con, by Trump, using these tools, among his other tools, like rallies and the television and things like that.
Yeah. I mean, I think we’re all complicit a bit, aren’t we? I mean, the way he used the press. I think the way he used television, it’s all in modern politics, it’s all indivisible. In politics and social media, it’s the same thing. So I think, social media should take a lot of blame, but it’s not just social media.
No, of course not. But my point being is that you all determined that the things he said, as did Twitter, as did YouTube, were a tweet too far, essentially.
Yes. I mean, these are incredibly difficult issues. And it’s taken a 38-page judgment to unpick some of them. I suppose, as a newspaper editor, it’s a crass thing to say, but I’m on the side of free speech, broadly. My instinct is to give more latitude, especially to political leaders. Because I think it’s important, not only that they have the right to be heard, but that we have the right to hear them. And if I want to make a judgment on somebody I might vote for, one day, and it turns out they’re an incorrigible liar, I would like to know that. And I don’t want somebody protecting me from that. So, although I really hated and disliked a lot of the things that Trump did on social media, I thought, broadly, he should be allowed to do it.
We’re not talking about January 6. We’re talking about his life before January the 6th.
Let’s go into what happened. So, of course, January 6 led Facebook and Twitter and eventually, YouTube to remove Donald Trump from their platform. In Facebook’s case, that decision was applied for an indefinite period. That’s the word Facebook used. And the oversight board was expected to review that decision and either overturn it or back a permanent ban. Essentially, they handed over this to you. So let’s talk about the final decision from the oversight board, this week. That’s what people were expecting. Instead we got this decision not to make a final decision. Talk me through the board’s ruling, broadly.
Well, we decided that Facebook was both right and wrong. It was right to remove the posts, on the day. I think that’s relatively uncontroversial. I think it’s broadly accepted that what he was doing, even though the language has got dog-whistle qualities and talks about peace and going home, the dog whistle itself was incredibly dangerous, inflammatory speech, at a time when democracy itself was under attack. So I think, most people think Facebook was right to take the stuff down.
And the board agreed with this.
We agreed with that. And we agreed, I think, that when Facebook met the next day, on the 7th, and banned him for two weeks —
that seemed to be a proportionate, necessary thing. The problem comes with this word, indefinite. So again, I think it’s defensible. I’m not saying I agree with it, but I think it’s defensible, for Twitter to say, permanent. You’re off forever. But Facebook’s use of this word, indefinite, is problematic in two ways. One, it’s nowhere in Facebook’s own rulebook. So it’s Facebook saying, we want people to abide by the rules. It’s a bit rich, then, to break our own rules, by doing something that is not laid down anywhere. And the second, I think, this term, indefinite, is just unfair. That’s why you don’t get indefinite jail sentences or indefinite anything, really. It’s only fair to people to say, it’s two weeks, or it’s forever, or it’s two months or it’s two years, or this is what you have to do in order to be allowed back on. But just to say indefinite gives too much power to Facebook.
Well, some might say they did that so they could hand it over to you. Just hand this —
— hot potato right over into your hands.
We were not blind to that. And again, we don’t think that’s our function. So we said, no, this is your responsibility. And you have six months in which, if you like, you can rewrite the rules.
So talk about this, the idea that they use this word. They have been in business for a little while and have encountered these problems before. The fact is, the rules are so arbitrary, and they don’t enforce them. The rules are opaque. When I was dealing with Alex Jones, they kept changing the number of times he could violate them and then said, we’re just trying to make sure he doesn’t game us. I said, he’s gamed you, already. It’s already happened. Why were there no rules? Did you ever get any insight?
Well, I’m not here simply to blow the oversight board’s trumpet. But maybe we are performing a function, here, in exposing the opacity, or the density, or the inconsistencies, or the fact that a lot of this is not thought out. To give you one example, Facebook have what they call a newsworthy exception.
That’s, I guess, an attempt to give some free speech protection to things because they’re newsworthy, because they’re important, because they’re current. And it turned out, they had never applied that to Donald Trump. And we’ve managed to get that out of them. So I think, we are getting under the bonnet, or under the hood, of Facebook, and beginning to establish things that weren’t visible, previously. I would imagine that over time, there will be lots of examples of that.
Yeah. Yeah, I was shocked that they were being opaque. After years of covering them, that’s all they do. That’s their color. So the Trump case, essentially, you said did not meet the criteria for a permanent ban, or that you just aren’t going to call this one.
Well, both, I think. Well, not so much, I think; it wasn’t our job to establish whether they had met the criteria. We just said, A, it’s too arbitrary, and B, if you want to make a permanent ban, that’s fine. But that’s your decision, it’s not ours.
So one of the things that you’ve just mentioned, newsworthiness, explain what that is. Because you’re an editor. What is newsworthiness, in the way you look at it, here?
Well, it’s saying, are there things that are important or significant and current, which demand an exception to the rules, if you like. I mean, should there be a special protection for speech around certain kinds of issues, because it’s important that we discuss them?
We hear them, like when the looting starts, the shooting starts. Things like that.
Was this a surprise to you, that they never applied newsworthiness to the policy? This, despite the company waving its hands and telling me, actually, that they were referencing this policy during some discussions of Trump.
It was a surprise to me, because I think, actually, newsworthiness is quite an interesting concept. It’s come up in one or two of the other decisions we did. It came up in the decision that we did over COVID. That was highly newsworthy. The particular case that we considered was the case of a doctor in Marseilles who was advocating for hydroxychloroquine. That’s an absolutely newsworthy thing. And we decided, in that case, again, that that should be allowed. But it was interesting to me that they had never applied it to Donald Trump. And of course, anything that we’re saying here about Donald Trump is going to have implications for Facebook, down the line, in terms of handling any number of populist leaders.
Right. We also learned, through the decision, that Facebook had found Trump had violated its rules five times before January 6, three times within the last year. There were also 20 pieces of content that had been posted and flagged, but these didn’t result in any action. And researcher Evelyn Douek suggests that this may be because of a cross-check policy Facebook grants high-profile accounts, which is also opaque.
Yeah. Well again, that was interesting, because this question of political speech is, again, a thorny one that people have been discussing in free speech terms for maybe 200 years. This question of whether, if you’re a public figure, particularly if you’re a politician, this business of the interlinked rights, the right to speak to potential voters, to citizens, and the right of citizens to learn as much as possible from people who might end up leading them and making laws, attracts a kind of special privilege. And we can think, probably, of areas of law. I mean, the famous New York Times v. Sullivan case, which is so determinative of libel, where the judges essentially said, if you’re a public figure, there should be more latitude for the press to write about you without penalty.
So one of the things I just want to make clear, Trump was violating Facebook’s terms well before the Capitol attack. Everybody could see it, in plain sight. And Facebook was one, aware of it, and not allowing for newsworthiness. They let it slide because, did you ever figure out why they let it slide? What was the reason, if it wasn’t newsworthiness?
Well, I don’t know. I’m guessing it’s the thing that I’ve just said: the latitude that they gave Trump, because he was a political figure. As repugnant as many people would have found Trump’s speech, the argument goes, in the end, he’s the elected president. And you can imagine the discussion that we could be having now, had they moved sooner to what would be called censoring the president. And a lot of people would be saying, so this big tech, this Californian company that nobody elected, is deciding they have the right to censor the president. So they’re going to lose both ways on that one.
Yes, well you could look at it as, here’s a persistent troll, who lies and spreads hate and disinformation, who violated the rules. Not just once, Alan, dozens of times.
No. But this is a problem that all media has faced, from television to The New York Times and The Washington Post. How much do we feel obliged to report this and how much do we feel obliged to —
contradict it, or put interstitials over it. Are we going to use the word, lie, or not? I mean, these are debates that have been going on—
Well, two observations. One, it’s unfettered information that does not, at least, have a “this isn’t true” next to it. And secondly, it’s amplified in a way that is not like a billboard, not like television. It’s more than anything else.
It’s clearly different in an age where somebody can speak to 35 million people, simultaneously. That is a problem that has never existed in humankind before. A problem. It’s an opportunity, it’s a challenge. But if somebody starts to use that ability to speak to 35 million people to incite harm, again, we say, Facebook has got to move much more quickly and have the capability to resource their response to these kinds of unprecedented situations much better than they do, at the moment.
All right. But this cross-check policy seems to be jerry-rigged. I think I’m being nice here, because I know how they work. This is, they would check it again and could overrule the checkers. Correct?
Did you get any insight into this policy?
A bit. Not enough. Again, I think, we asked 46 questions of Facebook and we had answers in the majority. But there were about two questions they didn’t answer properly, and seven where they didn’t give us an answer. So again, we’re chipping away, but we’re not getting as far as we want to.
Do you guys not have a right to have that? Do you not get subpoena power, if you’re the Supreme Court, for example?
Alas, well, mercifully, we’re not the Supreme Court.
Well, he called it that.
He did. But we avoid the phrase, ourselves.
Yes. I would, too. But you did not have the power to compel them to do so?
We don’t. They can come back. And maybe, there’s a bit of a pattern developing, in which there are certain areas where they say, actually, it’s irrelevant to what you do.
Determined by them.
Determined by them, at the moment. But again, I think, having created this body, it’s going to be quite embarrassing for them to be constantly seen not to be complying with it.
It is fair. I think, it’s obviously fair to say that at the moment, we start with a clearly defined remit.
We’re limited to, take it down or leave it up, basically. But, as you saw from the decision yesterday, for instance, one of the things we said is, actually, you should do a full and open review of your own role in the events up to January the 6th.
And publish that.
Which they’ve been doing. Yeah.
So they can say no to that. But if they keep saying no to the things that we suggest, I don’t think that’s going to place them in a particularly good light.
Yeah. Well, Alan, you have to have shame before you’re embarrassed. But that’s my opinion, again. So what were the unanswered questions, as a board member, about the policy? You said, seven, they didn’t answer at all. Could you give any insight to that?
Well, for instance, let’s use the A word, the algorithm. It is interesting to us, and I think, to the wider public, what are the design decisions behind their treatment of Trump? Because this leads into their accountability for the events leading up to January the 6th. So we would like to understand better the decisions that Facebook makes about what gets promoted, what goes viral, who gets to see what. Now, we don’t have the powers to do that, at the moment. But I think, if we keep on asking, it will get increasingly embarrassing. You would say, well, they have no shame.
I know they have no shame. But keep going.
Let’s find out. Let’s find out.
Yeah. So did you have any particular unanswered questions that you felt frustrated, when they wouldn’t answer them? If you had to pick one of them that was, like, are you kidding me.
Well, I do think there’s this question of virality, which is the thing that everyone says about Facebook, and it may well be true. But I don’t personally know. These questions about how Facebook chooses what goes in your feed, and what to promote, and what not to promote, and whether Facebook is engineered in order to create polarized disagreement, because people like that, or people click on it. Now, everyone says that’s true of Facebook, and maybe it is. But I think, probably, our job as a board is to get as far as we can in understanding that. And at the moment, we’re just in the foothills.
A lot of researchers think they should give information out. They should make it clear. You’re the Edward Snowden editor guy, who got a lot of this information out. Do you think this kind of information should be made public, given the importance of this company and the enormous —
— power they have?
Of course. Yeah. Of course I think that. And I would like the board to be able to use its power, to maximum effect, to get it all out. Now, of course, I understand that people are impatient. They want this to happen immediately. And I think it may happen. I don’t think it’s going to happen immediately. But the more that we can establish our bona fides, our credentials, and I think yesterday was a big step towards that, then the more power we have. The more influence we have in trying to act as advocates on behalf of the people, for the information that, I think, Facebook, in the end, is going to have to be more open about.
So, if you were the editor of The Guardian, still, I would imagine you’d say, go get that algorithm, to your reporters. Go find out what they were doing. Go find sources and figure it out. Correct?
Yeah. That would be a great story to get.
But that would be the thing you’d want from them. Correct?
Yes. And maybe there’s a parallel with Edward Snowden, that when we first saw the Edward Snowden documents, and we had general reporters on the case, they were baffling. Because this was an internal language within the intelligence agencies.
And we had to bring in other people, because we had to understand the language in these slides. And it was also true of the people who were in charge of oversight of the intelligence agencies. They often picked people who were retired. And it almost looks as though you’ve been hand-picked because you can barely use your mobile phone, let alone understand how something like the NSA or GCHQ works. And so, of course, the easy thing is to say, let’s see the algorithm. But having the technical capability to understand what you’re being shown, or what even to ask for —
Certainly. But do you have the power to do that, as a board member? Hire people, get in there?
As an editor, you say, let’s do this.
Absolutely. Absolutely, we do. In virtually all the cases, we do. We go to experts. They may be linguistic experts or regional experts. But absolutely, if we need technical expertise, we have the budget and the capability to go out and find it.
And get it from them. Or demand it from them.
Well, we can continue to demand it from them. At the moment, that’s one of the questions they’re not answering.
But I guess, it comes down to, Facebook’s got a bigger, wider problem, in terms of regulation, in terms of people who want to break it up. And I would have thought that at some point, they’re going to have to realize that it’s in their interests to start being more transparent about how they work.
Yeah. OK. So now, you put the ball squarely back in Facebook’s court by giving them six months to, quote, “review this matter, to determine and justify a proportionate response that is consistent with the rules that are applied to other users of the platform.” That’s very lawyerly. But it sounds to me like, I’m thinking Alex Jones, eventually, or Holocaust deniers. I did the very famous interview where Mark said Holocaust deniers don’t mean to lie. And then two years later, he decided differently. Any predictions where this will go?
Yeah. I hope that the function we’re having is just to say, think your way through all these things, because you clearly haven’t thought about them enough. I have observed the kind of statements that Zuckerberg has made over time, that you describe, and clearly, there’s a man who’s doing a lot of catching up from not having thought very much about free speech issues, to having to be much more sophisticated. That, I think, coincided with him saying, actually, I probably don’t have the capability. He’s an engineer. He’s not a moral philosopher. He’s got no background in law, or journalism, or free expression. So if you want to do him credit, you could say, well, actually, this is probably why he set the board up, in the first place.
Again, it’s still his job, nonetheless. But have you talked to him a lot about this, or not?
No. No, no.
Not at all.
I was once in the same room with him and Rupert Murdoch, at a weird —
— occasion in Davos, about 15 years ago. And he was, literally, the 25-year-old geek in the corner of the room. But I’ve never spoken to him.
And your observation, you never spoke to him. You had no encounter, personally.
No. It was kind of fascinating. He was just at the time, it must have been 2009, 2010. So Facebook was just breaking through, and even Rupert Murdoch had begun to realize that this might have an effect on his business. And you had these incredibly powerful press giants, not knowing quite how to speak to this awkward 25-year-old geek, but realizing he could be trouble for them.
Yes, indeed. Rupert’s been onto him for a long time, as you know. So one of the things that you’re saying here is, this is someone who’s just learning on the job. And my very first column for The New York Times was “The Pricey Education of Mark Zuckerberg.” Pricey to us, not to him. What will cause him to become educated? We’re sitting here watching someone keep making drastically damaging mistakes, waiting for him to figure it out. How long will it take them to make this decision? Twitter was slow, for sure, but they could make this decision. And Jack Dorsey has done this. He’s stuck to it. It’s a hard decision. He’s taking the slings and arrows from doing it. Why can’t Facebook?
Well, my impression is that Facebook, it’s got quite a closed culture. I mean, you’ve had much more to do with these companies than I have. My impression is that Google is reasonably open, Twitter, moderately open, and Facebook, not so much. So, I think, there is a culture there in that company, which I can’t pretend to be an expert about, and which is very much directed from the pinnacle, with Zuckerberg at the top. And so, I think, that culture probably has to change. I don’t know how it’s going to change, and I don’t know what role Zuckerberg can, or should, play in that. Again, maybe the oversight board is a sign of a willingness to change. It’s quite a big deal for a company to have done. Rightly or wrongly, you can love it or loathe it, but it’s not an insignificant thing for a company to have done.
And yet, Jack Dorsey did it.
No, I mean, the oversight board.
Oh, the oversight board. Mhm. But Jack Dorsey just made the decision, without having to. I mean, in my mind, a duck is a duck is a duck. And everyone’s looking at a duck and saying, is it a duck? Or is that duck dangerous? I want to get to some of the other things in the board dynamics. The board decision noted a minority of members emphasized that they would have gone further, specifically, in protecting human rights and dignity. This is a dissenting opinion, but it pokes at the conventional wisdom about the board, that it’s overly protective of free speech. In previous decisions, you gave wide latitude to speech. In this one, you did not. Can you talk about this minority group?
As you would expect, in a case like this, some members were much more censorious of Trump, and much less permissive about his right, and would have been much tougher. But they were the minority. But their views are, I think, fairly represented in the report.
Were you in the minority?
I don’t think I want to say.
OK. Why not?
Well, because, I think, the verdict is the verdict. And I think it becomes a bit invidious if you start saying, well, I disagreed or didn’t agree. I’m part of the collective decision.
OK. I’m all for transparency, Alan. Anyway, is there any infighting on the board?
Yeah. It just reminds me of the Liz Cheney secret ballot, that’s all I’m saying. Like, they say they hated Liz Cheney, and then they didn’t vote against her. But that’s OK. So is there any infighting? Talk to me about how the discussions are.
Yeah. OK. So first of all, the heavy lifting on each case is done by five people.
There are panels of five. And with Trump, we came together early, to scope the general issues. We then —
Look through the comments, there were 9,000 comments.
There were 9,000 comments. We got a great staff, who sifted and made sense of them, and highlighted the important ones, but also highlighted trends. And the panel of five, it went through a number of drafts, and those drafts would then be shared with the wider board. And through a period of iteration, which went on, probably, longer than it should have done, but on the other hand, it was an incredibly important and complex case, we finally arrived at a draft that we could all sign off on.
And was there fighting? Because there’s all this stuff going on in newsrooms and companies and Basecamp last week, this idea of what can be said. Was it a relatively cohesive group of people, from your perspective? You’ve run a newsroom.
Yeah. I’ve run a newsroom. And I have seen fights. I mean, not trying to say I’ve seen physical fights in newsrooms. But I’ve certainly seen passionate disagreement. And there have been powerful disagreements, I would say, in this case, but I wouldn’t say it was fighting. Somehow, we have created, maybe because we’ve never met, we’ve created an atmosphere in which people engage powerfully, but in a civilized way. And of course, a lot of it is done through shared documents, so it’s not actually shouting across a crowded Zoom.
Right. Was anyone there worried about being perceived as cancel culture? There’s this debate going on, which I still don’t fully understand, because some of it’s consequence culture. Some of it’s this woke thing, although the opposite of woke is asleep. Were you worried about that idea of being attacked as being censorious?
Well, yeah. I don’t like the word cancel culture, either.
I don’t either. I hate it.
You can imagine, there was a spectrum of people at one extreme, who want to be extremely permissive about speech, especially the speech of the President of the United States. And at the other extreme, there were people who said, but he’s a monster. He was inciting violence. He was destroying democracy. No way, should he ever be allowed back.
All right. So on the minority. It mentioned, quote, “to ensure that users who seek reinstatement after suspension recognize their wrongdoing and commit to observing the rules in the future.” It sounds like you want an apology. Was the minority asking for an apology from Trump?
Well, I think it comes down to, if you’re going to allow him back on, do you set preconditions? Or do you give what, in football, we would call a yellow card? You’re on a warning, you’re on probation. If you offend again, you’ll be out again. So there were some people who said, well, a good precondition would be for him to accept the result of the general election. If you’re prepared to accept that you lost, we’ll allow you back on. Now, we didn’t adopt that as the board. But those are the kinds of decisions that we anticipate Facebook would now want to make.
Well, he’s already doing the big lie thing. I can’t imagine he wouldn’t continue to do the same thing. What was your personal view of the apology requirement?
Well, I think my view, and I’m thinking aloud to some extent, is that I would allow him back. I mean, I can’t see him saying, in order to come back on Facebook, I’m going to accept that the election was not a fraud. I would be more interested in setting out the standards of conduct that we, Facebook, would expect of him. And a bit like people who are on probation, if he breaks them again, then he’s off.
So you would bring him — you would have brought him back on? Or permanently banned him?
I’m not saying that I would want him back on or not. Um, because I think —
But if he were, as sort of a three strikes rule.
It’s probably, I mean, having as a board said that this was Facebook’s decision, I think it’s probably not very helpful for me to start saying what I would do if it were my decision.
Were you in the minority group?
Of, of what?
Of this group that was doing these other things. It doesn’t seem like you were.
Uh — I mean, again, I mean, I — OK, why, why play around? No, I wasn’t.
OK. We’ll be back in a minute. If you like this interview and want to hear others, follow us on your favorite podcast app. You’ll be able to catch up on Sway episodes you may have missed, like my conversation with former Facebook Chief Security Officer, Alex Stamos, and you’ll get new ones delivered directly to you. More with Alan Rusbridger after the break.
All right. So let’s talk about the implications for Trump. So what does the oversight board mean for what happens here, for Donald Trump? He has, of course, just come out with his own, I guess, it’s a blog, “From the Desk of Donald J. Trump.” I’m sure you’ve seen it. How consequential, from your perspective? Were you thinking about this at all? I’m assuming you weren’t.
Well, he’s lost an enormously influential platform, in terms of numbers and audience. And as we said before, you can’t detach Facebook from the political process. It is the political process. Both are the same thing. He’s a cunning and well-resourced guy, and I’m sure he will try and find ways around it. He was doing that yesterday.
Yesterday, on Twitter.
He’s trying to find ways of making announcements, having his supporters put them out, and gaining traction. And he may start his own platform. But he’s going to struggle to create a platform as powerful as Facebook.
Yeah. Well, good luck with that. So Trump responded to the oversight board ruling by reiterating that this deplatforming was a total disgrace. Republicans, you saw all the reactions to the decision, a lot of it was performative, talking about free speech and they’re going to come get you. What were your thoughts on that, that you’re now a mortal enemy of free speech?
Well, I think that was always going to be the reaction. And I guess that is why, as we said earlier, Facebook avoided taking the decision until the last possible moment, to avoid the perception that this was an elite metropolitan Californian company making these calls. So I think they were always bound to say that.
So should the board have the power to deplatform a leader? You didn’t take that power, in this case, which you certainly could have. Do you feel comfortable with this responsibility?
I don’t think we have the power to completely deplatform. We can take down individual pieces of content, and I’m perfectly happy with that responsibility. I think the bigger questions of deplatforming, I would rather were taken by Facebook.
You think that’s their job, and then you can rule on that rule, essentially.
That’s the way I look at it.
How much do you make?
It’s six figures.
What six figures? There’s a lot of different six figures. High? Low? We can play that game.
Six figures, yeah.
OK. Well, in the spirit of transparency, for example, I would appreciate it. But has it been worth it? That’s really what I want to ask.
Well, I mean, personally, I find it absolutely absorbing. I was teaching a bunch of kids in Oxford, this morning. And all conversations lead back to Gutenberg.
They do? At Oxford, they do. They don’t, in the United States. All conversations lead back to Beyonce, here.
The conversations they were having in the 15th century about the implications of movable type, and moving speech out of the purview of a few monks. And the cataclysm that followed from that: who was allowed speech, and who wasn’t allowed speech. So to have a ringside view of the same debates, 500, 600 years later, is fabulously interesting, and I think, incredibly important. And I’ll just say this. I know that people want all this to be solved by next Tuesday. And I know the mistakes Facebook have made; as I said, we could have spent the whole hour talking about how terrible Facebook is. But these are decisions that any company would find impossible. Any company, any institution. Because the trade-offs and balances around free speech have puzzled and troubled people, and led to catastrophic results, over 300, 400 years. So it’s not going to be solved by next Tuesday. And I would rather be part of a board that can take the time and has got the expertise to make judgments that I think will stand the test of time. That seems to me an incredibly important function. And so, to answer your question, yes. It’s worth it.
Yes. All right. Let me just push back on one thing. I know people don’t want an answer by next Tuesday. I wrote a story in 2019 describing this entire thing. Their failure to anticipate consequences is, at this point, negligent, as far as I’m concerned. And I get the free speech arguments and everything else. But the stuff you uncovered, that they don’t have any processes, which we all have long suspected, has been troubling, to say the least.
Yeah. But that’s valuable. If we did nothing else, having got under the hood in the way that we have, and starting to bring this out into the open, will force change. And so, even if we never wrote any judgments, I still think we’re having a good influence. And I happen to think that the judgments are pretty good, too.
All right. I appreciate it. I really appreciate you coming on. I want to talk to you in six months.
I think we’ll have a lot to catch up on. And when this happens in six months, we’ll see how it ends.
I’ll come back and do it.
All right. Thank you so much. “Sway” is a production of New York Times Opinion. It’s produced by Nayeema Raza, Blakeney Schick, Heba Elorbany, Matt Kwong, and Daphne Chen. Edited by Nayeema Raza and Paula Szuchman, with original music by Isaac Jones, mixing by Erick Gomez and Bryson Barnes, and fact checking by Kate Sinclair, Ben Phelan, and Kristin Lin. Special thanks to Shannon Busta and Liriel Higa. If you’re in a podcast app already, you know how to get your podcasts, so follow this one. If you’re listening on the Times website and want to get each new episode of “Sway” delivered to you, with an abject apology from Donald Trump to Mark Zuckerberg and the rest of the nation (as if), download any podcast app, then search for “Sway” and follow the show. We release every Monday and Thursday. Thanks for listening. One more thing. We’ve got an event coming up for Times subscribers. I’ll be debating my fellow hosts from Opinion podcasts, Jane Coaston and Ezra Klein, as well as columnist Farhad Manjoo, about the merits and dangers of cancel culture. Comedian Trevor Noah will be weighing in on the subject, too. It’ll be on Wednesday, May 12. Times subscribers can RSVP at nytimes.com/cancelculture.