The Daily Signal Podcast is available on Ricochet, Apple Podcasts, Pippa, Google Play, or Stitcher. All of our podcasts can be found at /podcasts. If you like what you hear, please leave a review. You can also leave us a message at 20 or write us at Enjoy the show!

Virginia Allen: I am joined by Michael McConnell, professor and director of the Constitutional Law Center at Stanford Law School and a co-chair of Facebook's new oversight board. Professor McConnell, thank you so much for being here today.

Allen: To begin, can you tell us a little bit about the mission and purpose of Facebook's oversight board and how you came to be one of the four co-chairs of the board?

McConnell: The mission and purpose is that, over the years, Facebook has become the leading platform of communication around the world. And with that have come controversies and problems: What gets posted? What comes down? And the company realized that it was not a good thing for any one entity, even itself, to be making these important free speech decisions. What they decided to do was to create an outside board of independent-minded people with experience in free expression issues to give a second look to the decisions made by the company about content moderation. So if you post on Facebook and Facebook decides to take your message down, then you can come to the oversight board for a second opinion on that. And Facebook has agreed that it will comply with the oversight board's decisions.

Now, as for me, I don't exactly know, nobody really tells you where your own name comes from, but I do teach freedom of speech and religion and press right in Facebook's backyard at Stanford. So I guess in Silicon Valley, when you're talking about issues of that sort, my name would come up pretty quickly. I'm also a former federal judge, so that probably also attracted some favorable attention.

Allen: Well, we're certainly glad to see you on the board. You have said that Facebook has one of the most influential roles to play in deciding what can and can't be said in our culture today. That's a little scary, but I think that you're absolutely right. How does Facebook saying this is or is not something that you can say on our platform threaten free speech in general?

McConnell: Facebook from the beginning has had some restrictions. It is a platform that's supposed to be a good place for families, and so it's had, for example, a quite rigid anti-nudity policy from the beginning. They have what they call the Facebook Community Standards, which is an elaboration of their policies and what can and can't be expressed. You can look it up and read the Community Standards for yourselves. Many of those, though, are, as you would expect, somewhat vague and subject to different kinds of interpretation, so that leads to many controversies.

Now, as a mechanical matter, first there's the use of AI and algorithms to find some kinds of impermissible content. … I'm not much of an empirical guy, but I think something like 80% or 90% of this is elimination of bots, which are artificially generated posts that aren't coming from human beings at all, and AI is pretty good at identifying bots. But in addition to that, Facebook has three different monitoring centers around the world, each of them having about 10,000 employees who review the posts and see whether they comply with Facebook Community Standards.

Then on top of that, someone who doesn't like the decision that Facebook has made has the right to appeal it within the company, and they would get a yes or no answer, but no explanation. …