Ahead of the 2020 election, Facebook applied safeguards to protect against the spread of misinformation by prioritizing safety over growth and engagement. It rolled back those defenses after the election, allowing right-wing conspiratorial content to fester in the weeks leading up to the January 6 riot at the U.S. Capitol, according to a whistleblower.


© Thomas Trutschel/Photothek via Getty

Frances Haugen, a former Facebook employee, filed at least eight separate complaints with the Securities and Exchange Commission, alleging that the social network "misled investors and the public about its role perpetuating misinformation and violent extremism relating to the 2020 election and January 6th insurrection," including by removing "safety systems" put in place ahead of the 2020 election.

Facebook chose profit over safety, whistleblower says



"And as soon as the election was over, they turned them back off or they changed the settings back to what they were before, to prioritize growth over safety," Haugen said in an interview with "60 Minutes" correspondent Scott Pelley.

Facebook disputes that and says it maintained significant safeguards, adding in a statement that it has "expressly disclosed to investors" that the risk of misinformation and extremism occurring on the platform remains.

In 2019, a year after Facebook changed its algorithm to encourage engagement, its own researchers identified a problem, according to internal company documents obtained from the source.

The company set up a fake Facebook account, under the name "Carol," as a test and followed then-President Trump, first lady Melania Trump and Fox News. Within one day, the algorithm recommended polarizing content. The next day, it recommended conspiracy theory content, and in less than a week, the account received a QAnon recommendation, the internal documents said.

By the second week, the fake account's News Feed was composed "by and large" of misleading or false content. By the third week, "the account's News Feed is an intensifying mix of misinformation, misleading and recycled content, polarizing memes, and conspiracy content, interspersed with occasional engagement bait," the internal documents said.

Facebook says it used research like the test account to make safety improvements and to inform its decision to ban QAnon. The company added that the amount of hate speech users actually encounter has declined in each of the last five quarters.

While speaking to "60 Minutes," Haugen explained how the polarizing content reaches users.

"There were a lotta people who were angry, fearful. So, they spread those groups to more people. And then when they had to choose which content from those groups to put into people's News Feed, they picked the content that was most likely to be engaged with, which happened to be angry, hateful content. And so, imagine you're seein' in your News Feed every day the election was stolen, the election was stolen, the election was stolen. At what point would you storm the Capitol, right?" Haugen said.

"And you'd say, 'How did that happen?' Right? Like, 'Why are we taking these extremely out-there topics? QAnon, right, crazy conspiracies. Why are these the things that Facebook is choosing to show you?' And it's because those things get the highest engagement," Haugen said, comparing it to "gasoline on a fire."

Haugen is testifying before the Senate Commerce Committee on Tuesday. "Facebook, over and over again, has shown it chooses profit over safety," she said.

Facebook statement on the suggestion that it has misled the public and investors:

"As is evident from the facts and our numerous public statements over the past several years, Facebook has confronted issues of misinformation, hate speech, and extremism and continues to aggressively combat them. Unsurprisingly, we expressly disclose to investors that these risks have occurred, do occur, and may in the future occur on our platform."

Claim that removing safety systems after the 2020 election allowed divisive content to spread:

"In phasing in and then adjusting additional measures before, during and after the election, we took into account specific on-platform signals and information from our ongoing, regular engagement with law enforcement. When those signals changed, so did the measures. It is wrong to claim that these steps were the reason for January 6th; the measures we did need remained in place through February, and some, like not recommending new civic or political groups, remain in place to this day. These were all part of a much longer and larger strategy to protect the election on our platform, and we are proud of that work."

Carol's (the fake Facebook account) journey:

"While this was a study of one hypothetical user, it is a perfect example of research the company does to improve our systems and helped inform our decision to remove QAnon from the platform."

Role in January 6th:

"The notion that the January 6 insurrection would not have happened but for Facebook is absurd. The former President of the United States pushed a narrative that the election was stolen, including in person a short distance from the Capitol building that day. The responsibility for the violence that occurred on January 6 lies with those who attacked our Capitol and those who encouraged them. We have a long track record of effective cooperation with law enforcement, including the agencies responsible for addressing threats of domestic terrorism." – Facebook spokesperson

Internal Facebook research that found only 3-5% of hate speech and less than 1% of Violence/ITV speech prompts action from the platform:

"When combating hate speech on Facebook, bringing down its prevalence is the goal. The prevalence of hate speech on Facebook is now 0.05 percent of content viewed and is down by almost 50 percent in the last three quarters, figures that are regrettably being glossed over. We report these figures publicly four times a year and are even opening up our books to an independent auditor to validate our results. This is the most comprehensive, sophisticated and transparent effort to remove hate speech of any major consumer technology company, and there is not a close second."