Facebook bungled efforts to curb hate speech ahead of Capitol attack

23 October 2021

Internal documents have revealed Facebook’s bungled attempt to curb an explosion of hate speech and misinformation ahead of and during the January 6 Capitol riots, causing distress among its employees.

The revelations come after Sheryl Sandberg, Facebook’s chief operating officer, in mid-January downplayed the notion that the social media network played a big part in the events leading to the storming of the US Capitol, arguing that it was “largely” organised on other platforms.

The documents were disclosed to US regulators and provided to Congress in redacted form by Facebook whistleblower Frances Haugen’s legal counsel. A consortium of news organisations, including the Financial Times, has obtained the redacted versions received by Congress.

They are just some of thousands of memos from within Facebook expected to be reported on in the coming days, as the company faces its biggest public relations crisis since the Cambridge Analytica scandal.

Earlier this month, Haugen testified in Congress that Facebook routinely chooses profits over the safety of its users and downplays the harm it can cause to society. She will also appear before the UK parliament on Monday shortly before the company reports its third-quarter earnings.

During the 2020 election, Facebook implemented safeguards to diminish the visibility of content that “delegitimised” the process and promoted violence. These were turned off after the November 3 vote, but on January 6 the company scrambled to “break the glass” and turn the emergency measures back on.

Some may have been too little, too late, however. By 2pm on January 6, as rioters grappled with police on the steps of the Capitol building and broke through the doors and windows of the US Senate, Facebook still had not turned on certain proposed measures, one document showed.

These included preventing groups and pages on the social network from changing their names to terms such as “Stop the Steal”, a movement that ballooned as a core band of connected, conspiracy-minded users sent out mass group invites.

Facebook points out that it did turn on some of these “levers”, such as limiting live videos that its automated systems flagged as election-related.

But in a complaint to the US Securities and Exchange Commission, Haugen argued that some measures were reinstated “only after the insurrection flared up”. 

Another Facebook document, first reported by the news site BuzzFeed, shows that groups such as Stop the Steal were able to achieve “meteoric growth rates”. Some 30 per cent of invites came from just 0.3 per cent of accounts, such as those of the far-right activists Ali Alexander and Amy Kremer. 

Several documents showed employees raising concerns over the failure to flag and prevent co-ordinated efforts such as Stop the Steal. “After the Capitol insurrection we realised that the individual delegitimising groups, pages and slogans did constitute a cohesive movement,” read one. Enforcement was “piecemeal” and “lacked a single source of truth”, researchers lamented.

But in another assessment in the aftermath of the riots, one memo pointed out that, quite apart from the difficulty of identifying dangerous narratives, there were also delays in building and implementing the emergency measures to combat misinformation, partly because employees waited for approval from Facebook’s policy team.

Some Facebook staffers, meanwhile, were up in arms. In a discussion on Facebook’s internal Workplace network with Mike Schroepfer, chief technology officer, after the riot, one employee said: “I’m struggling to match my values to my employment here. I came here hoping to affect change but all I’ve seen is atrophy and abdication of responsibility. I’m tired of platitudes.” Another said: “We’ve been ‘hanging in there’ for years. We must demand more action from leaders.”

“We spent more than two years preparing for the 2020 election with massive investments,” said Joe Osborne, a Facebook spokesperson.

“In phasing in and then adjusting additional measures before, during and after the election, we took into account specific on-platform signals and information from our ongoing, regular engagement with law enforcement. When those signals changed, so did the measures,” he said, adding that it was “wrong to claim” that these steps were to blame for January 6.
