"The Facebook Files"
CountryUnited States
LanguageEnglish
SeriesThe Facebook Files
Genre(s)Long-form journalism
Published inThe Wall Street Journal
Publication typeDigital journalism
PublisherDow Jones & Company
Media type
Publication dateSeptember 2021

The Facebook Files: A Wall Street Journal Investigation is a series of news reports by The Wall Street Journal, first published in mid-September 2021, based on internal documents from Facebook, Inc. (now Meta Platforms) leaked by whistleblower Frances Haugen.

The series reported that, based on internally commissioned studies, the company then known as Facebook was aware of Instagram's negative impact on its teenage users, of the service's contribution to violence in developing countries, and of its platforms' role in spreading false information and promoting anger-provoking posts.

In October 2021, a consortium of news outlets began publishing a series known as the Facebook Papers, based on the leaked documents and additional information.

Background

There were conflicts of interest between what was good for the public and what was good for Facebook. And Facebook, over and over again, chose to optimize for its own interests, like making more money.

Whistleblower Frances Haugen on 60 Minutes, October 3, 2021

In mid-September 2021, The Wall Street Journal began publishing articles on Facebook based on internal documents of unknown provenance. Revelations included reporting of special allowances on posts from high-profile users ("XCheck"), subdued responses to flagged information on human traffickers and drug cartels, a shareholder lawsuit concerning the cost of Facebook CEO Mark Zuckerberg's personal liability protection in resolving the Cambridge Analytica data scandal, an initiative to increase pro-Facebook news within user news feeds, and internal knowledge of how Instagram exacerbated negative self-image in surveyed teenage girls.[1]

Siva Vaidhyanathan wrote in The Guardian that the documents came from a team at Facebook "devoted to social science and data analytics that is supposed to help the company's leaders understand the consequences of their policies and technological designs."[2] Casey Newton of The Verge wrote that the leak was the company's biggest challenge since the Cambridge Analytica data scandal.[3]

The leaked documents include internal research from Facebook that studied the impact of Instagram on teenage mental health.[4] Although Facebook had earlier claimed that its rules apply equally to everyone on the platform, internal documents shared with The Wall Street Journal point to special policy exceptions reserved for VIP users, including celebrities and politicians.[5] After this reporting, Facebook's oversight board said it would review the system.[6][7]

The former Facebook employee behind the leak, Frances Haugen, revealed her identity on 60 Minutes on October 3, 2021.[8]

The series

Beginning October 22, 2021, a group of news outlets began publishing articles based on documents provided by Haugen's lawyers, collectively referred to as "The Facebook Papers".[9][10]

Instagram's harmful effects on teenagers

The Files show that Facebook had conducted internal research over the preceding three years into how Instagram affects its young users. The findings pointed to Instagram being harmful to a large portion of young users, with teenage girls among the most harmed. Researchers within the company reported that "we make body image issues worse for one in three teen girls". Internal research also found that teenage boys were affected by negative social comparison, with 14% of boys in the US reporting this in 2019.[11] Instagram was concluded to contribute to problems specific to its app, such as social comparison among teens.[12]

Violence in developing countries

An internal memo seen by The Washington Post revealed that Facebook was aware of hate speech and calls for violence on its platform in India against groups such as Muslims and Kashmiris, including posts featuring photos of piles of dead Kashmiri bodies with glorifying captions; even so, none of the posters were blocked.[13][14] Documents reveal that Facebook responded to these incidents by removing posts that violated its policies but made no substantial effort to prevent repeat offenses. With 90% of monthly Facebook users now located outside the US and Canada, Facebook claimed that language barriers are one obstacle preventing widespread reform.

Controlling falsehoods about the U.S. elections

The New York Times pointed to internal discussions in which employees raised concerns that Facebook was spreading content about the QAnon conspiracy theory more than a year before the 2020 United States elections. After the election, a data scientist noted internally that 10 percent of all U.S. views of political content were of posts alleging that the election was fraudulent.[15]

Promoting anger-provoking posts

In 2015, in addition to the Like button on posts, Facebook introduced a set of other emotional reaction options: love, haha, yay, wow, sad and angry.[16] The Washington Post reported that for three years, Facebook's algorithms promoted posts that received the 'angry' reaction from users, based on internal analysis showing that such posts led to five times more engagement than posts with regular likes. Facebook's researchers later found that posts with 'angry' reactions were much more likely to be toxic, polarizing, fake or low quality.[17]

In 2018, Facebook overhauled its News Feed, implementing a new algorithm that favored "Meaningful Social Interactions" (MSI). The new algorithm increased the weight of reshared material, a move that aimed to "reverse the decline in comments and encourage more original posting". While the algorithm succeeded on that measure, it also brought consequences such as user reports of declining feed quality and increased anger on the site. Leaked documents reveal that employees presented several potential changes to address these issues; however, the documents claim Mark Zuckerberg rejected the proposed changes out of concern that they might cause fewer users to engage with Facebook. The documents also describe a 2019 study in which Facebook created a fake account based in India and observed what content it was presented and interacted with. Within three weeks, the account's news feed was being shown pornography and had become "filled with polarizing and graphic content, hate speech and misinformation", according to an internal company report.[18]

Employee dissatisfaction

Politico quoted several Facebook staff expressing concerns about the company's willingness and ability to respond to harm caused by the platform. A 2020 post reads: "It’s not normal for a large number of people in the 'make the site safe' team to leave saying, 'hey, we're actively making the world worse FYI.' Every time this gets raised it gets shrugged off with 'hey people change jobs all the time' but this is NOT normal."[19]

Apple's threat to remove Facebook and Instagram

In 2019, following concerns that Facebook and Instagram were being used to trade domestic workers in the Middle East, Apple threatened to remove the company's iOS apps from the App Store.[20]

XCheck

The documents reveal a private program known as "XCheck" or "cross-check" that Facebook employed to whitelist posts from users deemed "high-profile". The system began as a quality-control measure but has since grown to protect "millions of VIP users from the company's normal enforcement process". XCheck has allowed celebrities and other public figures to escape the penalties an average Facebook user would receive for violating policies. In 2019, footballer Neymar posted nude photos of a woman who had accused him of rape; the photos were left up for more than a day. According to The Wall Street Journal, citing Facebook's internal documents, "XCheck grew to include at least 5.8 million users in 2020".[21] The goal of XCheck was "to never publicly tangle with anyone who is influential enough to do you harm".[22]

Collaboration on censorship with the government of Vietnam

In 2020, Vietnam's communist government threatened to shut down Facebook in the country if the company did not cooperate in censoring political content there; Vietnam was the biggest market in the region for Meta (then known as Facebook).[23] The decision to comply was personally approved by Mark Zuckerberg.[24][25]

Suppression of harmful political movements on its platform

In 2021, Facebook developed a new strategy for addressing harmful content on its site, implementing measures designed to reduce and suppress the spread of movements deemed hateful. According to a senior security official at Facebook, the company "would seek to disrupt on-platform movements only if there was compelling evidence that they were the product of tightly knit circles of users connected to real-world violence or other harm and committed to violating Facebook’s rules". These measures included reduced promotion of such movements' posts within users' News Feeds and no longer notifying users of new posts from the affected pages. Specific groups highlighted as affected by Facebook's social-harm policy include the Patriot Party, previously linked to the Capitol attack, as well as a newer German conspiracy group known as Querdenken, which had been placed under surveillance by German intelligence after protests it organized repeatedly “resulted in violence and injuries to the police”.[26]

Facebook's AI concern

According to The Wall Street Journal, documents show that in 2019, Facebook reduced the time human reviewers spent on hate-speech complaints, shifting toward a stronger reliance on its artificial-intelligence systems to handle the problem. However, internal documents from employees claim the AI has been largely unsuccessful, having trouble detecting videos of car crashes and cockfighting as well as understanding hate speech in foreign languages.[27] Internal engineers and researchers at Facebook have estimated that the AI was able to detect and remove only 0.6% of "all content that violated Facebook’s policies against violence and incitement".

Facebook's response

In the Q3 2021 earnings call, Facebook CEO Mark Zuckerberg discussed the recent leaks, characterizing them as a coordinated effort to paint a false picture of his company by selectively leaking documents.[28]

According to a leaked internal email seen by The New York Times, Facebook asked its employees to “preserve internal documents and communications since 2016”, a practice called a legal hold. The email continues: “As is often the case following this kind of reporting, a number of inquiries from governments and legislative bodies have been launched into the company’s operations.”[29]

References

  1. ^ "Facebook Files: 5 things leaked documents reveal". BBC News. September 24, 2021.
  2. ^ Vaidhyanathan, Siva (October 8, 2021). "Facebook has just suffered its most devastating PR catastrophe yet". The Guardian. Retrieved October 8, 2021.
  3. ^ Newton, Casey (September 28, 2021). "Why Facebook should release the Facebook Files". The Verge. Retrieved October 4, 2021.
  4. ^ Gayle, Damien (September 14, 2021). "Facebook aware of Instagram's harmful effect on teenage girls, leak reveals". The Guardian. Retrieved October 10, 2021.
  5. ^ Horwitz, Jeff (September 13, 2021). "Facebook Says Its Rules Apply to All. Company Documents Reveal a Secret Elite That's Exempt". Wall Street Journal.
  6. ^ "Facebook oversight board reviewing 'XCheck' system for VIPs". Associated Press. September 22, 2021. Retrieved October 28, 2021.
  7. ^ "Facebook oversight board reviewing 'XCheck' system for VIPs". Associated Press. September 21, 2021.
  8. ^ Ghaffary, Shirin (October 3, 2021). "Why this Facebook scandal is different". Vox. Retrieved October 24, 2021.
  9. ^ Danner, Chas (October 23, 2021). "What Was Leaked in the Facebook Papers?". Intelligencer. Retrieved October 24, 2021.
  10. ^ Varnham O'Regan, Sylvia; Di Stefano, Mark (October 22, 2021). "New Facebook Storm Nears as CNN, Fox Business and Other Outlets Team Up on Whistleblower Docs". The Information. Retrieved October 25, 2021.
  11. ^ "Facebook documents show how toxic Instagram is for teens, Wall Street Journal reports". CNBC. September 14, 2021.
  12. ^ "Facebook Knows Instagram is Toxic for Teen Girls, Company Documents Show". Wall Street Journal. September 14, 2021.
  13. ^ Cat Zakrzewski; Gerrit De Vynck; Niha Masih; Shibani Mahtani (October 24, 2021). "How Facebook neglected the rest of the world, fueling hate speech and violence in India". The Washington Post. Retrieved October 29, 2021.
  14. ^ "How Facebook neglected the rest of the world, fueling hate speech and violence in India". The Washington Post. October 24, 2021. https://www.washingtonpost.com/technology/2021/10/24/india-facebook-misinformation-hate-speech/
  15. ^ Ryan Mac; Sheera Frenkel (October 22, 2021). "Internal Alarm, Public Shrugs: Facebook's Employees Dissect Its Election Role". New York Times. Retrieved October 29, 2021.
  16. ^ "There is a specific sociological reason why Facebook introduced its new emoji 'reactions'". Business Insider.
  17. ^ The Washington Post. October 26, 2021. https://www.washingtonpost.com/technology/2021/10/26/facebook-angry-emoji-algorithm/
  18. ^ Bloomberg News. October 23, 2021. https://www.bloomberg.com/news/articles/2021-10-23/how-facebook-s-algorithm-led-a-new-india-user-to-fake-news-violence?sref=X1c60Hpu
  19. ^ Hendel, John (October 25, 2021). "'This is NOT normal': Facebook employees vent their anguish". Politico. Retrieved October 29, 2021.
  20. ^ Jon Gambrell; Jim Gomez (October 25, 2021). "Apple once threatened Facebook ban over Mideast maid abuse". AP. Retrieved October 29, 2021.
  21. ^ Horwitz, Jeff (September 13, 2021). "Facebook Says Its Rules Apply to All. Company Documents Reveal a Secret Elite That's Exempt". Wall Street Journal.
  22. ^ Chappell, Bill (October 25, 2021). "The Facebook Papers: What you need to know about the trove of insider documents". NPR. Retrieved October 29, 2021.
  23. ^ "Exclusive: Vietnam threatens to shut down Facebook over censorship requests - source". Reuters. November 20, 2020.
  24. ^ The Washington Post. October 25, 2021. https://www.washingtonpost.com/technology/2021/10/25/mark-zuckerberg-facebook-whistleblower/
  25. ^ "Opinion | Mark Zuckerberg is for free speech when it's convenient". MSNBC.
  26. ^ "Facebook Increasingly Suppresses Political Movements It Deems Dangerous". Wall Street Journal. October 22, 2021.
  27. ^ "Facebook Says AI Will Clean up the Platform. Its Own Engineers Have Doubts". Wall Street Journal. October 17, 2021.
  28. ^ "FB Q3 2021 Earnings Call Transcript" (PDF). Facebook. https://s21.q4cdn.com/399680738/files/doc_financials/2021/q3/FB-Q3-2021-Earnings-Call-Transcript.pdf
  29. ^ Mac, Ryan; Isaac, Mike (October 27, 2021). "Facebook tells employees to preserve all communications for legal reasons". The New York Times.
