
What does the Facebook Oversight Board do?

Published Mar 28, 2022, 10:37 PM

Since its formation, the Facebook Oversight Board has reviewed some important content moderation cases on Facebook. In some cases it upheld Facebook's initial decision; in more cases, the board overturned Facebook's actions. But why does the board exist in the first place, and how does it work? Dex Hunter-Torricke joins the show to explain.

Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are ya? A couple of years ago, Facebook, which of course later became Meta, announced the intention to create an Oversight Board for the purposes of reviewing Facebook's content moderation practices. The company was, and still is, under a ton of pressure to respond to various issues, ranging from disinformation campaigns to the proliferation of hate speech. Unfortunately, Facebook's responses were frequently criticized as not being sufficient, as were the company's policies, which had noticeable gaps and transparency issues. Thus, the creation of the Oversight Board. And the Board, while deriving funds from a trust that was created by Facebook, is independent of Meta itself. I had the chance to speak with Dex Hunter-Torricke, who heads up the communications department at the Oversight Board. We talked about the Board's mission, we talked about its history, and how it tackles the challenge of picking and deciding cases that have the potential to impact millions or potentially even billions of users. And what follows is my conversation with him. Enjoy. Dex, welcome to TechStuff. It is a pleasure to have you on the show. Great. I'm really excited to talk to you, because the Facebook Oversight Board is something that I've talked about a few times since it first began to be a thing a few years ago. But I feel like, and I include myself here, I feel like a lot of people don't have a full understanding of what the board actually does, why it's there, like what necessitated its, um, its creation, and the process it goes through while it's determining the outcome of various cases. So I'm so glad to have you here to kind of pull back the veil of mystery, because I think it's very easy for those of us in the media to, even unconsciously, misrepresent what's happening. Sure.
I mean, when you hear about the board, right, and you think there's a group of experts who are focused on things like content moderation standards and Facebook's community standards, you know, the rules that govern the platforms, it is something that I think a lot of people sort of skip past, right. It's almost like when you're installing an app on your iPhone. You get those disclaimers and nobody reads them. You just click yes, you know. Um, it's something that's incredibly complex, and it sounds quite dry, and it is. And because of that, we naturally gravitate to the more exciting questions about, you know, how should social media be regulated? Um, what are the kind of structural issues within the industry? Um, you know, that we can have lots of points of view on. The truth is, those rules that Facebook, and now Meta, of course, um, you know, have constructed for platforms like Facebook and Instagram, they have an enormous impact on free speech and human rights around the world. They govern almost every aspect of online discourse today on any given topic. So we really see the value of the Board as an institution that is focused on scrutinizing those rules and understanding the impact on people's lives, and then working out how to improve them so that we better defend free expression and human rights. That is an incredibly valuable mission. And I would say, I think the other thing to think about with the Board is really, um, we are only a small part of the solution to the, you know, much larger challenges that, you know, are occurring all across the tech industry and across social media, and in particular with Meta. Um, and I think people often today, where we have such short attention spans and the world is in such crisis, we often think, um, I want a quick solution to all of these problems. And the board is not a quick solution to those problems, and it will only ever be a small part of the overall solutions that we need.
But even though it's only part of the solution, I think it can still be incredibly valuable. So that's how I go about describing the impact of the board and its sort of purpose in the world. Yeah, and when you look at, as you were mentioning, the world in crisis, I mean, there are crises all over the world where we often see social networks being pulled into them, whether you're talking about Myanmar, or Russia and Ukraine, or you talk about India. I mean, there are famous cases, or even here in the United States, um, famous cases all around the world where you have this delicate balancing act of, how do you best serve, uh, the customers of Facebook who are actually using the product, or, depending on your point of view, who are the product of Facebook, uh. And then how do you also make sure that it's not violating any laws to uphold a certain policy like that? That, to me, is probably the most complicated part of the picture, because I think a lot of people can have a knee-jerk reaction when they see maybe a piece of content being removed, or a piece of content staying up when it may seem like it's controversial. But without having the full understanding of the context, uh, you know, you might not have the appreciation of why that move was made. And then, of course, the Oversight Board can step in when you have cases where people have appealed, uh, an action that Facebook has taken, in order to make a judgment over whether or not that was the right call. More or less getting it right? Absolutely. And I think you hit the nail on the head, right, um, which is, these are incredibly challenging issues where there isn't always an easy right answer, um. And without the context, you certainly can't make an informed decision about these issues. So I'll give you a very specific example.
One of the early cases the board took on, when we started operating in twenty twenty-one, was around speech relating to COVID misinformation. So there was a piece of content that had been taken down by then-Facebook from their platform under their COVID misinformation rules, and when the board looked at it, we actually found it was perfectly legitimate speech. It wasn't something that was spreading something that we thought, you know, had a potential of leading to imminent harm. It was something that was a legitimate debate about public health policy and how authorities should be responding to COVID in France. And these are extraordinarily thorny questions, right? If you have something that seems like it might be in the COVID misinformation space, given how sensitive that topic is, given the enormity of the public health emergency, you might as a platform just, you know, err on the side of caution and take something down. But if you do that, and if you do that a lot, and you have that sort of cumulative impact on freedom of expression, we are muzzling the ability of communities to have informed debates about these issues, and we need to be able to have those. It is perfectly legitimate to have debates about how, as a society, we should respond to things such as COVID. Another example, just from the last few months: we had a case that involved someone in Sweden who had posted about sexual abuse of minors. And this is obviously, you know, extraordinarily sensitive, and this piece of content was, you know, taken down, um, and it was found to be, you know, very graphic and something that was deeply unpleasant. At the same time as being, you know, deeply unpleasant and very hard to read, it was something that raised legitimate questions, again, about sentencing and how people, you know, who are involved in these kinds of crimes should be treated as a society.
So again, something that is important for people to be able to have discussions about, in spite of how difficult it is, um, you know, to read some of these things. So the board is constantly looking to navigate these issues, and we judge each piece of content based on a set of several things. How does content interact with Meta's own rules and standards, so their own stated rules that apply to the platforms, um, is the content compatible with that? We're looking at the values that the company says that they abide by. And of course, then you sometimes have tension between the values that the company says they abide by and the actual rules that they've come up with, and we look to clarify that tension. The third piece, which is, I think, the most interesting, you know, and important angle for the board, is how content and the decisions made about that content relate to international human rights standards. So of course, there have been decades of work by all sorts of experts and leaders, you know, across the ecosystem, which have worked to codify human rights standards and norms, you know, how we think it is acceptable to treat people and communities in a way that protects their freedom of expression and their human rights. And the Board is constantly going back to that body of knowledge to try and make decisions in a way that is principled and ultimately puts those people first. It boggles my mind at how complicated this is. See, I love TechStuff. I love doing the technology side, because, I've said this many times, the beauty of technology is either it works or it doesn't, right? Either it's wired properly or it's not. Whereas when we start getting into the application of technology in the real world, gosh, people just make it so much more complicated. Yes, really. I used to be at SpaceX, and you know, my favorite thing about going there from where I previously was, you know, I was working at Facebook before that, was ultimately the rocket launches or it doesn't.
There's no spinning it. You know, either, you know, it puts a satellite up there or it doesn't, and everyone knows when it doesn't. But this isn't like that, and that's what makes it so, I think, challenging and sometimes unsatisfying for a lot of people. You know, there are relatively fewer opportunities for the board to say, ah, we have absolutely nailed it and everyone else is wrong about this. You know, there is the rocket, you know, launching. You know, sometimes there's a lot of debate about whether that's a rocket or not. Dex and I will have more to say about the Facebook Oversight Board after these messages. When you start talking about things like digital rights, I mean, there have been plenty of cases, even well before the Facebook Oversight Board was formed, where you had some really thorny cases where the question was, who is ultimately, um, in the wrong, or more in the wrong, in some of these cases where it became a question of freedom of speech and digital rights versus someone's right to privacy. These are very complicated matters, and I certainly wouldn't want to be the person to have to adjudicate which is the most important. Um, let's backtrack a little bit, though. Let's talk a little bit about the actual formation of the board itself. Can you talk about sort of how that came to be? Yeah, absolutely. So the board was created by Facebook, back when it was still called Facebook, and the first, um, sort of, you know, vision for the board was announced by Mark Zuckerberg back in twenty nineteen, and he wrote a long note, which he posted on his Facebook, outlining the vision for an independent body that could make decisions about some of the most consequential content issues that the company was facing. And the sort of elevator pitch for why you would want to create this sort of body was very simple. Um, the world is of enormous and growing complexity, and there are fewer and fewer decisions that the company can just go and make successfully on its own.
So having an independent oversight body filled with leaders and experts from a lot of different backgrounds was a way of expanding the diversity of perspectives that the company had available to it, and at the same time building a structure that gave people trust in the rules and the way those rules were enforced on these platforms that have become such a big part of our lives. And the design of the board was incredibly complex, something that took quite a long time, because I think the company, to its credit, wanted to get this right. So they went out and they consulted an enormous number of different stakeholders around the world, you know, more than two thousand people, um, you know, in many, many countries. And they spent about a year trekking around, basically having these consultations with civil society, with experts, with lawyers, um, you know, with policymakers, until they came up with a structure which was, you know, I think, really strong, and is the one that we now use. So they built this institution with three components. There's the board, which is what everyone immediately thinks of when they think of the Oversight Board. Right now, we've got twenty members, you know, who come from a lot of different backgrounds and perspectives. Ultimately, the board might grow up to a maximum of forty people. We then have a trust, and the trust has a set of trustees who operate, again, independently from Meta. Their job is to oversee the sort of fiduciary, you know, duties that are needed to run an organization like this, and the real value is to protect the independence of the board. So Meta does fund the board. They put money into that trust. It's an irrevocable trust that's overseen by those trustees, and the trustees are there basically to ensure that all the, you know, sort of messy, you know, conversations about resourcing and so on take place at arm's length from where the decisions are happening.
And trustees, of course, don't have any role in the decision making and the sort of substantive, you know, policy work of the board. You then have the third piece of the board, which is where I sit, which is the full-time team of staff. So behind those board members, you have a full-time team of, you know, other experts and people who help keep the institution running, um, who help steer those cases through their life cycle at the board. And so, um, the board was announced in May twenty twenty, that's when the first twenty members were announced, and then we started accepting cases towards the end of that year, and then the first decisions came out in January twenty twenty-one. So we're just a little bit over a year into the life of the board now. Um, and you know, we've delivered over twenty cases now, over a hundred detailed policy recommendations to the company. Some of these cases have been very, very significant. Others have been very significant but have received less attention. But we're starting to see the impact of the board on the company, starting to see tangible outcomes in terms of how those rules are structured at Meta and how they're being enforced. Um, and we've got, you know, years left to run for the board. So I think it's an extraordinarily exciting experiment, um, you know, which Meta committed to, but is one that is becoming, you know, more real by the day, and less like an experiment and more like an enduring institution that I think will be around for some time. I like that word, experiment, because I feel like that's how a lot of people viewed the board when it was first announced.
It was really a new kind of concept, this idea of an outside entity that was independent of the platform making judgments about whether or not the platform was actually enforcing its own rules, whether those rules were legal in some cases, I'm sure, um. And again, like, the context of each case, I think a lot of people were kind of confused by it, like it was surprising, it was not something they would think of. But then you also have to consider the fact that Meta has such a global reach, right? It's very easy, I think, especially for people in my country, in the United States, it's very easy for us to be so US-centric that we forget the reach and how deep the penetration is in other parts of the world, uh. And so we don't even start to consider the fact that we need to have a different set of rules in order to guide those organizations, because we're used to everything being US. So that, I think, was part of the confusion. Yeah, I think you raise a really important point, Jonathan. I mean, so much of the discourse, you're absolutely right, is driven by folks in the United States, and if not in the United States, in Western Europe. And we are the parts of the world which are super connected, you know. Our experience of connectivity, um, you know, is absolutely phenomenal compared to the experience that a lot of people in other communities have, and, um, I think people often take connectivity for granted in those markets. Um, you know, we've just seen, you know, in the last few days and weeks what it means to be connected, um, you know, in places like Ukraine or in Russia, where people are, you know, struggling to have their voices heard, and the extraordinary impact that social media can have in allowing people to mobilize, and communities to organize and to fight for things like freedom, you know, a struggle that, again, a lot of people in the West have, until fairly recently, sort of taken for granted.
So I do think the fact that the Board operates and thinks very globally is a huge asset. Over half of our cases so far have been from the Global South, and the Board has been very deliberate about going after issues in places like India and Myanmar, um, you know, in Latin America, because we think those areas have been neglected for too long when it comes to looking at content moderation and the discourse about protecting speech online. And, well, I think another thing that really nails home how important this is, for me, is that if we start removing things like, like ethics and/or morality or whatever, and we're looking at things from purely a business standpoint, from Meta's point of view, um, clearly the company gets the majority of its revenue from places like the United States and Western Europe, less so in other places where you have this limited connectivity. And as a result, from that business perspective, again, not taking humanity into account, you start to understand, oh, this is where they're going to dedicate their resources, because this is where the revenue comes from. That's where, uh, an independent board really becomes important, because they don't have that connection to, this is where the revenue comes from, so this is where we need to focus. They're divorced from that, so that they can take that approach where they say, no, let's really look at these areas where traditionally the company itself may not have dedicated a lot of its focus, and make certain that the decisions that are being made are really the right ones and are not neglecting a population simply because they're not driving ad revenue the way other parts of the world are. Yeah, absolutely. I mean, the board doesn't look at things like Meta's public relations, you know, strategy, or the impact of decisions on advertisers, or, you know, Meta's ad revenues. That is absolutely not a rubric that we're interested in. Um.
We're focused on, you know, advancing free speech and human rights, and very much, um, ensuring that the company treats its users better. And I think the way that the board was constructed, bringing together leaders, you know, who have the kinds of backgrounds and perspectives and expertise that allow us to go and focus on that, um, you know, who aren't being sidetracked into, you know, considering, you know, those other issues, I think that's a huge strength for the board. It's an extraordinarily principled set of leaders, um, and they really, you know, want to focus on doing good for the users, um, of Meta's products, but also wider society. It's not just about the users of Facebook and Instagram, because we know that those platforms and, you know, the billions of people who use them also have a deep, deep impact on broader society and the world we all live in. Well then, let's talk a little bit about the process. Can you talk a bit about how the board decides which cases to consider? I assume that they receive far more than they actually go through. Yeah. So, I mean, since we started accepting cases towards the end of twenty twenty, we've now received over a million user appeals from users of Facebook and Instagram, um. That's the primary route by which appeals come to the board. Meta can also refer questions to the board, but we've naturally wanted to focus as much as possible on hearing the appeals that are coming up from the users of those platforms. So we've taken a lot more from users in cases, um. The way we sift through that avalanche of, um, you know, incoming appeals, um, some of it is based on how we, um, you know, categorize, um, those appeals that are coming in. So we have systems and, you know, processes that are designed to sift that down to a smaller number, which is still quite large, which then people are, you know, going through, you know, in a lot of detail.
So the big criteria we use as those cases are coming in are: are these cases that we think have the potential to impact a lot of users of, um, of Meta's platforms? We obviously want to take on, um, you know, cases, with the limited resources of the board, which have the biggest impact for people. So we're looking at things that have the potential to impact thousands or millions of people around the world. We're looking for things that raise important questions about Meta's policies. So you might have a single case, it might, you know, be something about COVID misinformation in France, or it might be something about, you know, um, you know, hate speech, you know, in, uh, in India, that might raise big questions more generally about the policies and the rules and how they're enforced on Meta's platforms. The other one is, um, is this something that's going to have a big impact on freedom of expression, and does it raise big questions about public discourse online? So, um, with each of those cases, really, we're looking for these emblematic cases, things that represent something much, much larger. We aren't really designed, and we've never conceived of ourselves as, a sort of general customer service type of infrastructure. With twenty board members and, you know, about fifty staff, there's no way we could possibly, um, hear every single one of those million-plus appeals, um. It just wouldn't be sustainable. But what we can do is pick up, um, you know, a set of cases every year which then have a much wider area of impact on the company, and that ultimately will then see the impact on, um, potentially millions of other, you know, situations that are playing out every day on Facebook and Instagram.
So it's like setting precedent, almost, where you can say, all right, we take this one case, which is very specific here, but we can generalize the decision, whether we uphold or overturn whatever Facebook's action was, and that in turn sets the precedent where similar cases that follow this should go along the same general pathway. I see the real value of that, especially, again, like, talking about a company as large as Meta is, um. I mean, everyone knows it's just impossible for that company to monitor everything that's posted. That's not practical at all. But being able to set these rules and be able to say, yes, this is a valid application of your policy, and to, uh, send that message to the company so that it can continue to do that, uh, in the knowledge that it's doing the right thing according to the rules that it set up. Or, as is the case, I think, more often than not in the cases that have been heard, we've actually seen an overturning of Facebook's decision, where it's really sending the message of, you don't have this right on this one, you need to re-examine. That's right. I mean, you know, well more than half of the cases we've taken on, we've ended up overturning the company. Um, and I think in the recommendations, um, you know, going back to that, well more than half of the recommendations we've given to the company, they've agreed to those recommendations and they've committed to doing them, or they've already, um, you know, implemented them. And I think, um, that is a powerful sign that the board is working, that we are starting to have an impact on those bigger issues. Um, when we talk about cases, people often gravitate naturally to the part where we overturned the company, um, or we upheld them, the binding aspect of the decision itself on a specific bit of content.
I think a lot of the really interesting, impactful work of the board comes in terms of our recommendations. So there are more than a hundred detailed pieces of guidance that we've given to the company, because that's where, um, you have the chance to shape those broader standards and how they're enforced, to really deliver a lot of very detailed, practical guidance to the company, which of course is a recommendation. They don't have to follow it, but they do have to, um, you know, commit to studying it, you know, for real, and to communicate transparently, um, what they're going to do with those recommendations we've given to them. And the company has been pretty good about doing that up till now. Well, and again, to think back on just the way tech companies grow, from my perspective, generally I see, uh, kind of a, probably a seventy-thirty split of engineers really pushing an idea, which ends up blossoming and perhaps even going to the beloved unicorn status, and then maybe marketing, which is really pushing the next big thing. But, you know, that whole process, you know, it's all about growth, growth, growth, and you eventually hit a point where you have grown larger than what you are easily able to manage, whether it's because you've expanded into other markets, like just going into Europe and the GDPR considerations you have to have. Like, these are all things that I think a lot of people just don't take into account early on. And so I definitely see that value coming in again, because of reaching out and creating this organization where you have representatives from all over the world. I mean, I know that a lot of people have harped on the fact that, I think, the United States has the largest number of representatives on the board as it currently stands, but the board is made up of people from all over, right? That's right. Absolutely.
I think about one quarter of our board currently, um, you know, comes from the United States. Um, you know, I think that reflects partly, you know, global inequities and the fact that you just have a disproportionate number of, you know, experts, and institutions that generate expertise, um, which are located in the West. Um, but every member of the board, I think, has, you know, extraordinary skills and expertise, and, um, certainly in the next round of members, we're looking to continue expanding the diversity of the board. Well, and that's nice too, because, again, in tech companies in particular, diversity is an issue that we have seen come up again and again, where we, or rather a lack of diversity, has frequently been an issue. So making certain that that becomes, uh, a priority is really refreshing, to be able to break free of that very narrow view of the world that some companies can develop due to just a lack of perspectives. It's not consciously ignoring things, but just because of, you know, the actual individual backgrounds. Um, I wanted to talk also about actually the process of considering a case, like, how does the board do that? As I understand it, they have a focus group, essentially, that looks at a case with great scrutiny. That's right. So when a case comes into the board, we convene a panel, and it's five members, um. The membership of these panels is regularly rotating, um, and you have a sort of cross-section of expertise and different backgrounds represented within those panels. You always have at least one member of the panel who's coming from the region where that content, you know, is, um, you know, coming from, or that it implicates, and that panel takes the initial, um, you know, review of the case. They spend a decent amount of time studying these things in depth and trying to reach a sort of provisional decision.
The decision that they come up with on a case then goes before the entire board, so that's another, you know, sort of check within the board to ensure that we are really studying these things with a, you know, three-sixty-degree perspective. Um, every decision is then voted on by the entire board, and decisions have to, you know, receive a majority of support from the board, otherwise they can be sent back, and, you know, we can convene a new panel to look at these things. But I think another very important part of this process, which I'll call out, is the public comment process. So, unusually, you know, I think, for an entity, you know, that's been empowered to take on this role, we also want to ensure that we aren't just limiting our expertise to ourselves. So we do, you know, obviously provide independent oversight of the company. There are many, many more points of view, though, from the world that we thought it was important to reflect in the decision-making process. So with every case and every, um, you know, policy, um, you know, that we're working to review, we are going out to civil society, um, to academics, um, to regular users of these platforms, and saying, if you have a point of view, share it with us. Um, so we have this process called the public comment process, and we get really valuable comments submitted by people all over the world. Um, you know, in cases from the Middle East, um, you know, we've received, you know, really valuable input from grassroots, you know, organizations, um, from countries all over the region. Um, when we had the big Trump case, looking at whether President Trump's access to Facebook and Instagram should be restored, we had over nine thousand comments, you know, delivered from around the world. So we had, you know, everyone from, you know, members of the public, you know, to people who are in the U.S.
Congress submitting very detailed guidance on what they thought were the implications for free speech, um, and human rights. So I think the board has, um, you know, a process that's designed to reflect our own expertise and to, you know, bring in that diversity internally, but we also think about the diversity in terms of the external world. We've got a bit more to talk about with the Facebook Oversight Board after the following break. One of the big problems Meta had was that it failed to have a policy in place that would allow sort of an indefinite ban, and that really what Meta needed to do was either set a firm limit on what the ban was, or to just go ahead and call it a lifelong ban. But it could not exist in this sort of nebulous, vague banning condition. And I remember when that came out, I saw a lot of knee-jerk reactions all across the board, like, in various ways. But to me, it was really nailing home for the first time, in my experience, what the Oversight Board was doing in terms of Meta's policies, in that it's saying, there are cases that are falling outside of your rules, and you have to craft the rules to cover these cases. Otherwise, there's no way to say whether it's fair or not if you don't cover it. That's exactly right. There was a very important principle at stake here, right, which is that the rules should apply to everyone, and everyone also includes the company. So the rules exist to, um, you know, govern the way that users get to use those platforms, but they also govern the way Meta should be serving their users, um. So that, you know, the fact that the company didn't have clear, transparent, defined standards on how to navigate an instance like this, that was a huge gap in, um, you know, the systems that are designed to serve their users. And, um, you know.
Ultimately, it pushed the company to go and rewrite those rules and to create new processes and standards, which I think will serve them much better in future situations where they have to navigate these very, very thorny issues. Ultimately, if you don't write down the rules so everyone can see them, and see them transparently for what they are, it makes it much more likely that a big company like Meta is going to be able to get away with not treating their users fairly. I do think it was an important point to really defend. The board also said that the suspension was correct, so it was the correct move to go and suspend him quickly from access to Facebook and Instagram. But despite that, there were still bigger principles at stake, and the company didn't think through the long-term consequences of how they did it. Right. Yeah, it makes me think, and you may not get this reference, actually, a lot of my listeners might not get this reference, but it makes me think of Calvinball, which was a thing in the comic strip Calvin and Hobbes. It was a game that this little boy named Calvin would play where he literally would make up rules as the game was being played, and there was no way to know how to win the game because he was the one making the rules in real time as the game was being played out. And I have a feeling that that was kind of how we were seeing Meta with some of these policies too. There was obviously this huge external pressure on the company to take action, and the company obviously was already in a space where that external pressure had been building for quite some time, particularly here in the United States. I mean, Zuckerberg had to appear before the Senate a couple of times leading up to this, and so there obviously was this call to action.
But again, until you have those transparent rules that everyone understands and that apply to everyone, taking action on its own isn't enough. It's something that could easily be reversed, because if things swing a different way, then the company can stand accused of unfairly applying rules to certain people. In fact, we've seen that discourse rise as well. Absolutely. I'm going to use that Calvinball analogy now; I mean, that's fantastic. We are basically designed to be the anti-Calvinball mechanism. We do not think Calvinball is a good game to play when it comes to speech for billions of people online. I do confront this narrative all the time, which is: is the board a distraction somehow from regulation? Is this something that is getting in the way of the bigger fix that we need as a society? And I just think that's so misplaced as a concern, and simply the evidence doesn't back it up. Policymakers have been looking at these issues for years, and they've been moving at an incredibly glacial pace. And part of the reason they have been moving at that pace, I don't think it's necessarily due to any ill intent. It's because these are massive, deeply complex issues, and it takes years to align the interests and then to codify legislation and to build a coalition to support that legislation. The board is designed to be something that solves one small part of this overall challenge with social media and to move faster. So we've gone ahead and we've built this institution which is starting to serve the users of Facebook and Instagram and the communities who are impacted by those platforms. Nothing we're doing is getting in the way of policymakers also getting stuck in and having a broader sweep at the systemic and the structural issues that also impact social media.
So we really don't see ourselves in any way as a substitute for all the effort of policymakers and all the work that needs to be done. We're a complement, a very small complement, and we're looking forward to seeing what people come up with in terms of other proposals to improve social media and deliver a healthier social media environment. And yet, even though the board is this small slice, because of its global nature it is, I would argue, better positioned to tackle its particular mission than any regional government would be, simply because, again, you're trying to legislate something that has a global reach, but you don't have global jurisdiction. So you've got these various nations around the world all grappling with similar issues, but unless there's some sort of broad agreement across the world as to how to go about doing this, it's just going to be very messy for a long time. So that's another reason I think it does take a long time to see things on the regulatory side take shape, not to mention that in places like here in the United States, whenever there's a change in administration, there's a massive change in the position on regulations. You can have a regulatory body that completely changes shape from one administration to another, and then any progress you were making on any particular issue might get reset. So I think that having something that's at least addressing part of it, and a very important part, even if it is a small one, is fantastic. It's at least encouraging, because I think otherwise the narrative tends to be that we're kind of stuck in a quagmire waiting for a solution to something that I think everyone recognizes is a problem, even if they may disagree on what that problem specifically is. Yeah, I mean it's interesting, Jonathan.
I mean, this is where there's a real consequence of social media and all the years we've lived in the digital era in how it impacts our thinking. All of our attention spans have become way shorter. I learned something interesting the other day: the average attention span for a goldfish is fifteen seconds, and the average attention span for a person today is about nine seconds. So we literally have these incredibly short attention spans, and because of that, everyone's looking for really simple solutions to complex problems. I think of this as the sort of BuzzFeed approach to solving problems. We all want this one weird trick to solve content moderation, and you're hoping there's that one weird trick that will do everything for you. And I think the experience of the last decade has very clearly shown us that's not the case at all. There is no special piece of regulation, there is no incredibly well-crafted institution, including the Oversight Board, that's going to solve all your problems in their totality, and anyone who's claiming that, either from government or from the tech industry, is just lying. We can all solve bits of that problem, and together we add up to something that's very, very meaningful and will protect users and communities. But we should all be realistic about what we can do and recognize the scale of the challenge. Very well said. I could not agree more. I think it would be incredibly beneficial for listeners out there, if you really want to get an understanding of how complex this is, to simply go and review one of the cases that the board tackled and really read up on all the different factors about it. Because you start to get past that knee-jerk reaction. Right.
You might initially have a feeling of "this should go this way" or "this should go that way," but as you really dive into it and you start to pull back and look at the larger implications, then you might start to understand there isn't a simple on-off switch. The world is not a binary system. We cannot treat it that way. We have to take into consideration all the complexities, including where there may be gaps in policy. And that's why there's this issue: we can't definitively say something is against the rules if there are no rules that govern it. So I think that if people do take that time, which obviously with the nine-second attention span is going to be challenging, they're going to gain that greater appreciation. Because, I mean, I'm guilty of this too. I'll see a story pop up and very quickly make a judgment, and it's only by resisting that urge and engaging in critical thinking and taking those further steps that I can get past that. And I don't do it all the time. I say all the time on the show, the two things I advocate for the strongest are compassion and critical thinking. I think the two of them together are absolutely necessary if you want a better world. But even though I advocate strongly for that approach, I also admit fully that I am not a perfect steward of it. Sometimes I fall victim to snap judgments too. And I really think that by diving in a little further and reading up on these things, people can get a greater appreciation for the complexities that are involved. I certainly do not envy anyone in the board's position. I am fascinated by the process, but I cannot imagine really dedicating the kind of time and attention required to sometimes incredibly emotionally charged situations and determining whether or not a company's decision on that matter was in line with its policy or not.
Absolutely. And this is the challenge, right? All the board members are human beings. They're all looking at the same headlines we're all looking at. These are issues which can be extraordinarily emotive, and there is always an urge today for people to make those snap decisions about these things. And the board has been designed, not just in the extraordinarily impressive people who have been added to the board, but in the mechanisms that go into making that institution, so that we don't make those snap decisions. We provide that thoughtful, measured review of the decisions that Meta has made. They have a responsibility to act first and to act fast on a lot of these issues, but then we're going to come in and take a closer look and say, have you thought about this implication for your users? There used to be that slogan from Facebook which became very popularized and then maligned: move fast and break things. And I always tell my team, half jokingly, only half jokingly, that our slogan, if we had one, is move thoughtfully and defend human rights, because that's a much better way of actually going about doing this. Well, and again it's one of those things where you go from the technical aspect of "let's make something that's really cool" to the application aspect of "how does this interact with the real world." I'd like to wrap this up by asking: are there plans? Right now the Oversight Board is focused primarily upon content moderation policies. Are there plans for that to expand beyond Facebook's content moderation, or is that just going to be the primary focus of the board from here on out? Yeah, you made a very important point there, Jonathan, which was about how people can lose focus as they scale. We think of ourselves in many ways, as an organization, as a startup. We're a very small group taking on a very big mission.
We've evolved enormously over the last couple of years to navigate through those sorts of organizational and strategic challenges. We think the right thing to do is to stay focused. Right now, Meta is a huge challenge. Just getting our arms around the mission of the board today is something that takes up an enormous amount of work, and we want to make sure that we're delivering the maximum impact in terms of that original mission before we look to expand further. Having said that, we absolutely recognize that this is a shared challenge across the industry. All the problems that Meta is dealing with are problems that manifest in different ways for other platforms. I mean, look at Spotify or Netflix over the last few months and all the various controversies and problems that they've been experiencing. So for now, the way we think about it is: focus on getting that core mission done, and share as many of our learnings as widely and as transparently as possible in a way that can be helpful to other companies. Down the road, over the coming years, I think we'll be looking to explore whether there's something else we can deliver for companies. And as the board evolves and we look to expand the things that we're looking at within Facebook and Instagram, there may be other things that then come into focus for the rest of the industry, where they say, hey, actually, maybe the Oversight Board can be helpful for us as we develop our plans. Fascinating. Dex, thank you so much for joining the show and giving us more information about the Oversight Board. I have a much greater appreciation for what it does now than I did before we even started chatting. It was really informative and educational. I hope my listeners enjoyed it as well. Great to be here, Jonathan. Thanks again to Dex for joining the show.
It was really interesting to hear about the board's mission directly from someone who works with the organization, and it gave me a greater appreciation for the scope of the board's job as well as the potential impact of its decisions. It really does highlight the necessity of seeking out a diverse array of perspectives as companies scale up. In many ways, I totally understand how Facebook could find itself in such a complex situation that it required the creation of an external entity. I mean, come on, let's really be real here, cards on the table: Facebook evolved out of a tool that was meant to allow male Harvard students to rate the attractiveness of female Harvard students. That's what the predecessor to Facebook was all about. So that wasn't exactly aiming to become a nexus of global communications. And the process of growth and scaling and expansion happened so quickly that it's not a surprise to me that people at the company didn't necessarily realize they needed a robust set of policies until problems began to pop up. In many ways, Facebook's path could serve as a lesson to other platforms: either create your own independent oversight boards or incorporate departments that are dedicated to the formation and execution of policies, and really staff them with a diverse group of perspectives so that you can best serve your users. Right? If your users are all over the world, then you darn well need to have that diversity of perspective in order to serve them properly. And I'm sure there will be cases where the Oversight Board will make a decision that I will have trouble understanding. There may well be cases where I have a fundamental disagreement with the board's conclusion. But at the same time, I have to account for several facts.
Namely, while I feel strongly about human rights issues, I am by no means an expert. I do not spend the same amount of time and energy researching these cases, nor do I have the background in human rights and digital rights that the board members have. And the conclusion that the board comes to could be nested in a much deeper problem, one where Facebook itself lacks the framework to issue a clear decision on the matter. It might be that the board comes to its decision not because of the specific matters of the case, but because there are no actual rules that govern what Facebook does; and if there are no rules that say Facebook can do this, that's a problem. As Dex indicated, reality is a complicated and messy matter. In the end, I am glad there's an independent group holding Facebook accountable, one that can compel Facebook to reverse decisions that, on close examination, do not appear to follow Facebook's stated rules and goals. I do hope to see a broader application of those principles across the web and the tech industry in general. That kind of consistency is really important. And again, I don't anticipate agreeing with every single one of those decisions, but at least I can be confident that the decisions were made by people who were taking incredible care and consideration when judging the matter, and not just by a moderator who's under intense pressure to look through as many comments or posts as they possibly can, hitting delete or leaving things alone one after the other in order to get through an incredible backlog. My heart really goes out to moderators, too. We've heard some terrible stories about the emotional impact that moderating can have on folks who have to go through all these different types of posts on Facebook that get reported. Anyway, that wraps up this episode.
If you have suggestions for people I should have on the show or topics I should cover, please reach out to me. The best way to do that is on Twitter. The handle for the show is TechStuff HSW, and I'll talk to you again really soon. TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.
