This transcript was prepared by a transcription service. This version may not be in its final form and may be updated.
Ryan Knutson: Who are you, Jeff? Introduce yourself.
Jeff Horwitz: Oh, well, my name is Jeff Horwitz, and I’m a reporter for The Wall Street Journal. I’ve been covering Facebook for the last two and a half years.
Ryan Knutson: And if a friend asked you what you’ve been working on, what would you tell them?
Jeff Horwitz: So the last few months have been a bit odd. I have had to answer that question with, “I can’t tell you about that.” But the next thing I would say is that we had the opportunity of reviewing what I think is the most significant body of information about how Facebook works, how its executives think, and how it’s trying to fix some of the problems that it knows its systems caused. I think that’s the most significant stuff on that front that I’ve ever seen.
Ryan Knutson: This large number of internal Facebook documents gives us an unparalleled view of how the social media giant operates. The documents were largely written by a group of researchers inside the company whose job it was to study the platform’s problems and try to come up with solutions. And Jeff has spent the last several months going through the documents with a team of Wall Street Journal reporters.
Keach Hagey: I’m Keach Hagey.
Newley Purnell: My name is Newley Purnell.
Sam Schechner: I’m Sam Schechner.
Justin Scheck: I’m Justin Scheck.
Georgia Wells: My name is Georgia Wells.
Justin Scheck: When I got my first look at these documents, I was shocked.
Georgia Wells: I was shocked. I was surprised.
Newley Purnell: All of a sudden the doors were opened.
Sam Schechner: I just kept clicking from one document to the next and I couldn’t believe what I was reading.
Jeff Horwitz: It covers just such an incredible range of kind of how Facebook interacts with society. This vast collection of-
Justin Scheck: Research.
Keach Hagey: Training videos.
Georgia Wells: Many of them are slide decks.
Jeff Horwitz: Back and forth. Versions of documents sent between senior Facebook executives.
Justin Scheck: The documents have names like-
Jeff Horwitz: Understanding the Intersection between Criminal Organizations in Human Trafficking.
Keach Hagey: Project Daisy Launch Discussion: Version Presented To Mark.
Jeff Horwitz: One document is called Apple Escalation.
Keach Hagey: There’s Teen Girls Body Image.
Sam Schechner: Another is called Coordinated Social Harm.
Jeff Horwitz: There was an element of, oh my God, they put that down on paper.
Ryan Knutson: From these documents, we’ve learned new details about Facebook’s algorithm and how it’s weighted.
Kate Linebaugh: We’ve learned how criminals use the platform for human trafficking and how Instagram impacts mental health.
Ryan Knutson: Together, these documents show that Facebook knows it’s causing harm. And in most cases, the company hasn’t taken significant steps to stop it.
Kate Linebaugh: They also reveal that what Facebook has told the public often isn’t the full story.
Speaker 20: You see a theme in all these documents: that Facebook and its top executives know what their problems are, but in many instances can’t or won’t address them, sometimes because it fears hurting the business or growth.
Ryan Knutson: Facebook’s stated ambition has long been to connect people. And it has: Facebook has more than three billion users, more than a third of the world’s population. And the documents show that what happens on the platform has real-world consequences.
Jeff Horwitz: As the company’s platforms become sort of increasingly ingrained in our lives, the decisions they make and the priorities they have are something that affects everybody, even if you don’t use the platform. And so even if you aren’t on Facebook ever, you’re still living in Facebook’s world.
Ryan Knutson: Welcome to The Journal, our show about money, business, and power. I’m Ryan Knutson.
Kate Linebaugh: And I’m Kate Linebaugh. This is the first episode of The Facebook Files, an investigative series that will take an unprecedented look inside the $1 trillion social media giant.
Ryan Knutson: This is Part One: The White List.
Jeff Horwitz: This is something that went deeply wrong and it’s something Facebook knew about. And it’s something they didn’t want to reveal.
Ryan Knutson: Coming up on the show, a secret Facebook program that shields millions of its most powerful users from the platform’s rules.
The Wall Street Journal has reviewed a lot of internal Facebook documents, but right now we’re going to start with just one. It’s a text document on a white page, sans serif font. It looks like any other corporate memo. In large text across the top, the title says, “AC Priv, the White List Problem.”
Jeff Horwitz: It was labeled AC Priv, so attorney-client privileged. So definitely not a thing that is supposed to be just floating around, but yet here it was.
Ryan Knutson: The document is an internal review of a secret Facebook policy known as the white list. The policy exempts certain high-profile people from the social network’s rules. The document starts with a list of bullet points under the letters TL;DR, which stands for “too long; didn’t read.” It’s how Facebook begins a lot of its internal memos. The first bullet point is a quick summary of the program.
Jeff Horwitz: We are exempting certain people in businesses from our policies and standards.
At various stages… (crosstalk) (inaudible)
Ryan Knutson: The next bullet point starts laying out concerns about the policy.
Jeff Horwitz: This undermines our fairness and legitimacy efforts, creates legal and compliance risks for the company. (crosstalk)
Ryan Knutson: Lower down, the document says, quote, that means…
Jeff Horwitz: “For a select few members of our community, we are not enforcing our policies and standards. Unlike the rest of our community, these people can violate our standards without any consequences.”
Ryan Knutson: Wow. So, at Facebook, there is a white list of people who can say whatever they want on the platform, violate the rules and not get automatically kicked off Facebook or have their posts deleted.
Jeff Horwitz: Yeah. The first time I saw the document, it was just kind of a whoa situation.
Ryan Knutson: When Jeff went to Facebook to ask about the white list, a spokesperson acknowledged it exists, but said the company has already identified issues with it and is working to phase out the practice. But why does Facebook have a secret white list in the first place? Well, the company didn’t exactly intend to create one. It just kind of happened over time.
Jeff Horwitz: Facebook didn’t really begin its life with any rules whatsoever.
Ryan Knutson: In Facebook’s early days, aside from a ban on things like nudity and harassment, people could post pretty much whatever they wanted, but as more people joined Facebook, the company started adding rules about content it would take down. And if someone violated Facebook’s rules, they could have their posts deleted, get suspended or even kicked off the platform permanently.
Jeff Horwitz: And these rules kind of grew up and they started building out the team that was going to enforce them. And the problem they ran into is that they make mistakes. They’re human. There are a lot of people on the platform and some of these things are pretty tricky. So, their enforcement mechanisms weren’t particularly reliable.
Ryan Knutson: There were so many posts to watch that Facebook had to use artificial intelligence in addition to human moderators, and every now and then the company would land in hot water when it would take action against a politician or celebrity’s account. For example, in 2014, the singer Rihanna had her account on Instagram, which is owned by Facebook, temporarily shut down after she posted a partially nude photo of herself from the cover of a French magazine. The mistake made headlines, and Facebook had to publicly backtrack.
Jeff Horwitz: So, they needed to have some way to prevent people who were big deals from getting just summarily kicked off the platform or having their content taken down.
Ryan Knutson: Facebook’s answer to this problem was a system that was internally referred to early on as Shielding.
Jeff Horwitz: Shielding, that was supposed to take sensitive accounts and prevent enforcement actions from just taking effect immediately. And athletes, movie stars, politicians, mayors, sometimes academics, like anybody who could really get a lot of attention. They were concerned especially about messing that up. ’Cause messing up on an average Joe, well, it’s unfortunate. You’d like to do better, but no big deal, right? From the point of view of Facebook, this was like all totally understandable. They had to find a solution, and this was the most ready thing at hand.
At some point someone came along and said, shielding fancy people doesn’t sound great, guys. And so they decided they wanted to stop using the name Shielding.
Ryan Knutson: The document we mentioned earlier, AC Priv, the White List Problem, as well as other documents Jeff saw, say that at this point, Facebook rebranded it. Instead of calling it Shielding, Facebook started using the name Crosscheck.
Jeff Horwitz: And so they changed that name to Crosscheck, right? To try to emphasize that this was supposed to be quality control.
Ryan Knutson: Crosscheck, meaning like someone’s going to cross check a decision that a moderator has made about deleting a post.
Jeff Horwitz: Yeah, exactly.
Ryan Knutson: And as time went on, Crosscheck grew more elaborate and more unwieldy.
And so how do people get added to this Crosscheck program?
Jeff Horwitz: For a long time, the answer was however anyone felt like it.
Ryan Knutson: Jeff says that most Facebook employees had the power to add someone to Crosscheck and the program grew so large that so many posts were getting flagged from so many Crosschecked accounts that it became impossible for the company to review them all. And when they did delete posts or disable someone’s account, they were still making mistakes.
Jeff Horwitz: The company said they estimate they mess up around 10% of the time. And so the only way to ensure that they didn’t make bad enforcement calls against powerful people was to not make any enforcement calls at all.
And that became known as white listing. They would exempt people who were considered sensitive for any number of reasons. They were famous, they were athletes, they were powerful or the family members of people who were powerful and they would give those people partial or complete exemption from Facebook’s rules.
Ryan Knutson: Jeff says at least 45 teams across the company were keeping their own individual white lists. According to the documents, the lists include prominent figures like Donald Trump and his son, Donald Trump Jr., conservative author Candace Owens, and Senator Elizabeth Warren. Even Mark Zuckerberg himself is white listed.
At one point, at least 5.8 million accounts were either part of the Crosscheck program or on a white list, meaning they were completely exempted from Facebook’s rules. And these accounts ran the gamut.
Jeff Horwitz: If you were over a certain size and like kind of considered a public figure, you’re going to have it. And so that actually applied to animal influencers.
Ryan Knutson: Really?
Jeff Horwitz: Yeah. So, Doug the Pug, very popular on Instagram, way more popular than you or me, Ryan.
Speaker 21: Hey Doug, wake up. Happy birthday Dougy. (crosstalk)
Ryan Knutson: Also very controversial.
Jeff Horwitz: Yeah, you can never tell what Doug the Pug is going to do next. Right?
Speaker 21: The most questions I get about Doug include what does he eat every day? And does he actually eat pizza? (inaudible) (crosstalk)
Ryan Knutson: But what about high profile accounts for people who are also known to spread misinformation like authoritarian dictators, or people with large followings who just spread conspiracy theories? Were they also included on this list?
Jeff Horwitz: Yeah. If you were over a certain size, it was going to give you some protection.
Ryan Knutson: Because of this protection, powerful users were able to post things like hate speech or incitements to violence and it would stay up.
Can you give me an example of how this has played out for a high profile person?
Jeff Horwitz: Yeah. An athlete who goes by the name of Neymar. He is a Brazilian soccer player. One of the world’s most famous soccer players.
Speaker 8: (inaudible) Blocked by Lindelof, Neymar! And there is the early goal that Paris Saint-Germain were looking for, Neymar providing the finish (inaudible) (crosstalk)
Jeff Horwitz: He’s definitely in the top 20 accounts in the world. He has well over a hundred million followers on Facebook and Instagram.
Ryan Knutson: In 2019, a woman accused Neymar of rape. Neymar, whose full name is Neymar da Silva Santos Junior, denied the allegations and was never charged. In the course of his denial, he live-streamed a video of himself on Instagram and Facebook.
Speaker 9: (foreign language)
Ryan Knutson: He went through messages the woman had sent him, revealing her name and nude photos.
Jeff Horwitz: That is a no-go on Facebook’s platform. That is called nonconsensual nudity, AKA revenge porn, just completely forbidden, and Facebook’s policies on it are very clear, which is that, per their operational guidelines, internal documents, the way you handle that is you immediately take down the content, obviously. The next thing you do is you permanently delete the account that posted it, right? Like zero tolerance is the idea. You just can’t do this to somebody. And here’s the thing: Neymar was crosschecked.
Ryan Knutson: Because Neymar was in the Crosscheck system. Facebook didn’t take the usual actions of deleting the offending post and then deleting his account. What happened instead was detailed in a document called Mistake Prevention Incidents Investigation. The Neymar incident appears alongside more than a dozen other incidents that Facebook employees were keeping track of.
Jeff Horwitz: Someone who worked for Facebook saw that this had been posted and basically said, take it down, right? Tried to delete the post, but they didn’t have the authority to do that.
Ryan Knutson: The document says that a Facebook employee actually tried to delete Neymar’s post on the Saturday it went up, but Facebook’s system blocked them from doing it.
Jeff Horwitz: And for the next 24 hours plus, this video, in which Neymar basically showed the world this woman’s name and nude photographs of her, stayed online and was viewed well north of 50 million times.
Ryan Knutson: Wow.
Jeff Horwitz: What happened in the immediate wake of it, is that first of all, the woman was just harassed, unbelievably on the platform.
Ryan Knutson: The woman was inundated with harassment and bullying online. Facebook removed more than 3,500 accounts of people impersonating her. But Neymar didn’t face many consequences for breaking Facebook’s rules.
Jeff Horwitz: Neymar is not just famous. He’s really famous, and booting really famous, really photogenic social media stars off your platform isn’t really the business Facebook’s in. What happened is that after consulting with senior leadership at the company, and they don’t specify who, they determined that they were just going to take down the post and not actually take down Neymar’s account.
Ryan Knutson: The Wall Street Journal reached out to Neymar for comment. A representative said that Neymar is just a user of Facebook and adheres to the company’s rules like everyone else. Facebook declined to comment on the Neymar incident.
So, this one piece of revenge porn posted by one soccer player got 50 million views. How is that even possible?
Jeff Horwitz: So, one thing, a feature of the Crosscheck program that I think was very significant is that when a piece of content was initially found to be likely violating of Facebook’s rules, Facebook didn’t stop promoting it to other people.
Ryan Knutson: The documents show that people inside Facebook actually tried to figure out how many views these kinds of bad posts were getting, posts that would otherwise be taken down if the accounts weren’t high profile and protected by Crosscheck.
Jeff Horwitz: And the number they found for 2020 was north of 16 billion.
Ryan Knutson: 16 billion views?
Jeff Horwitz: Views.
Ryan Knutson: Of content that should have been taken down?
Jeff Horwitz: Yeah, content that absolutely by Facebook’s final determination after multiple layers of review, definitely violated their rules was viewed 16.4 billion times.
Ryan Knutson: That’s 16.4 billion views of things like hate speech, racism, revenge porn, graphic violence. The number is so large because remember these are high profile accounts with huge numbers of followers. Their posts instantly reach a lot of people.
Jeff Horwitz: If I’m just a random guy and nobody follows me and I want to say some really vitriolic things, it’s probably not a big deal. People have been doing this on the open internet forever, right? The thing is, it starts to matter when an account is important, when it has a following, right?
When there’s a whole network of people that it can influence. And as people inside Facebook noted when discussing this program, they were literally applying a lower standard to the people who, when they misbehaved, it was most dangerous.
Ryan Knutson: After the break, how the existence of this program undercuts what Facebook tells the world.
Mark Zuckerberg: Different policies on this stuff. But at Facebook, we’ve tried to distinguish ourselves as being really strong in favor of giving people a voice and free expression. Certainly (inaudible) (crosstalk)
Ryan Knutson: In May 2020, Mark Zuckerberg appeared on Fox News and defended freedom of speech on the platform. For years, Zuckerberg has publicly emphasized his desire to make Facebook an egalitarian place and a level playing field when it comes to freedom of speech, and he repeated that pledge to host Dana Perino.
Mark Zuckerberg: And I certainly think our policies have distinguished us from some of the other tech companies in terms of being stronger on free expression and giving people a voice than a lot of others out there.
Ryan Knutson: But, he said, there are rules on Facebook, and those rules are applied equally to everyone.
Mark Zuckerberg: Let me just be clear about what our rules are. I don’t think it’s appropriate for Facebook to do fact checking, but we do have clear policies. And if anyone violates them whether you’re a high ranking government official like the President or anyone on our platform, we do have to take action. We will enforce, no matter who you are on the platform, but (inaudible)
Ryan Knutson: What else has Facebook said publicly about its platform being an equal playing field?
Jeff Horwitz: Facebook has talked about the importance of making sure that they’re not reinforcing existing power structures. Letting someone who is a big shot say things on your platform that you literally wouldn’t let a little guy say is obviously, inherently, not that … it’s the exact opposite direction of what Facebook says they’re doing. And they’ve also said consistently for years that, even if they might provide a second level of quality control for high-profile accounts, they said they were doing that sort of to protect the sanctity of speech. Mark has said that holding the powerful to account is supposed to be the point of Facebook.
Mark Zuckerberg: Individuals today have more voice, more ability to affiliate with who they want and stay connected with people, ability to form communities in ways that they couldn’t before. And I know that’s massively empowering to individuals and that’s philosophically kind of the side that I tend to be on. (crosstalk)
Jeff Horwitz: So, Crosscheck is anything but that, because in the end, this isn’t a thing you’re doing to make sure that everyone gets a fair shot. This is something that you’re doing to make sure you don’t upset powerful people.
Ryan Knutson: In 2018, Facebook acknowledged the existence of Crosscheck after an undercover reporter for the British TV station Channel Four discovered the program. But in a press release at the time, Facebook wrote, quote, “We want to make clear that we remove content from Facebook, no matter who posts it, when it violates our standards. There are no special protections for any group, whether on the right or the left.” The press release continued, saying the Crosscheck program, quote, “simply means that some content from certain pages or profiles is given a second layer of review to make sure we’ve applied our policies correctly.” But the company has never talked about the white list, how there’s another layer of protection that exempts certain people from Facebook’s rules almost entirely. And the documents show that some Facebook employees were concerned about the unequal treatment caused by this program.
Jeff Horwitz: This is something that really tortured some of the people who were thinking through this stuff. Nobody thought this was a good look for the company. People argued that they needed it, that they didn’t really have any other option, but this wasn’t what Facebook was ever promising it was going to do, this wasn’t even something Facebook was willing to admit it was doing. And it’s something that I think a lot of people had trouble living with.
Ryan Knutson: So in 2019, Facebook conducted an internal audit of the Crosscheck program, including white listing. Jeff has been able to review that audit.
Jeff Horwitz: By 2019, I think there was enough internal discussion about this that they decided that they needed to at least figure out what the current status of the program was, right? Because people had just been adding names willy-nilly, and there were pretty much no controls, and there were all these different lists, and no one knew who was on it or who was responsible for what. So they did an audit. And that audit was pretty damning in its findings, which was just that the program was totally out of control, that they weren’t even sure they could even find all of the people who’d been exempted from Facebook’s rules, because some of them had just been coded straight into Facebook’s systems.
Ryan Knutson: This finding was spelled out in that document we started with, AC Priv, the White List Problem.
Jeff Horwitz: And just in case anyone was unclear, there’s a section titled, “Why is this a problem?” And the answer is, exempting, AKA white listing, specific people and entities creates numerous legal, compliance, and PR risks for the company and harms our community. And then it basically says that it is allowing violators to stay on the platform, that it creates particular legal and compliance risks. And the line there was that white list and special exemption treatments are not publicly defensible. It creates distrust, fuels the narrative that Facebook is biased, and undermines our goal of building legitimacy with stakeholders and community. Importantly, it is a breach of trust. We are not actually doing what we say we do publicly.
Ryan Knutson: Recently, Facebook has started to make some changes to Crosscheck, one document Jeff saw shows how it’s been going.
Jeff Horwitz: They, in a notable description of their progress in March of this year, declared that they were going to basically just block the addition of any new names to Crosscheck for at least the indefinite future, to, quote unquote, “stop the bleeding.”
Ryan Knutson: Jeff brought his reporting to Facebook earlier this month. This was the company’s response.
Jeff Horwitz: So this is from Facebook’s spokesman, Andy Stone.
Criticisms of our execution are fair, but the Crosscheck system was designed for an important reason: to create an additional step so that we can accurately enforce policies on content that could require more understanding.
Ryan Knutson: The spokesman said that some posts, like from activists trying to raise awareness about violence or journalists posting from a war zone, need additional review. The response from Facebook continued.
Jeff Horwitz: A lot of this internal material is outdated information stitched together to create a narrative that glosses over the most important point, Facebook itself identified the issues with Crosscheck and has been working to address them. We’ve made investments, built a dedicated team and have been redesigning Crosscheck to improve how the system operates.
Ryan Knutson: The documents show that Facebook is trying to eliminate the practice of white listing and set a goal of eliminating total immunity for so-called high severity violations of Facebook’s rules in the first half of 2021.
So, what does this particular story say about Facebook itself as a whole?
Jeff Horwitz: In some ways, Facebook was really visionary in terms of understanding the future and building something that was kind of a product for a new age of the internet. In other respects, it was exceedingly shortsighted in how it built this thing. And in the case of Crosscheck, right, there was no doubt inside the company this was unacceptable, but that’s nonetheless what they built.
Ryan Knutson: Over the next several weeks, we’ll publish more installments of this series in our feed. And tomorrow, part two of The Facebook Files.
Speaker 13: Internally, there’s this growing body of evidence that for many teens and in particular, teen girls, Instagram can be toxic.
Ryan Knutson: The series is part of The Journal podcast, a co-production of Gimlet and The Wall Street Journal. Your hosts are Kate Linebaugh and me, Ryan Knutson. The series was produced by Pia Gadkari, Max Green, and Martin Kessler, with production help from Enrique Perez de la Rosa. This episode was edited by Catherine Brewer, Jarrard Cole, and Annie-Rose Strasser. Special thanks to Colin Campbell, Anthony Galloway, Mitchell Pacelli, Falana Patterson, Lydia Polgreen, Brad Reagan, and Matthew Rose. Our engineer is Griffin Tanner. Our theme music is by So Wylie and remixed by Peter Lehner. Additional music in this episode from Blue Dot Sessions and Audio Network. Fact checking by Nicole Pasulka. Also thanks to the whole Journal team: Priscilla Alabi, Sam Bear, Annie Minoff, Laura Morris (inaudible), Rikki Novetsky, Sarah Platt, Willa Rubin, Matthew Sherman, Matthew Schiltz, and Nathan Singapard.
Thanks for listening. See you tomorrow.