Social media chiefs on defensive at US online child protection hearing



31 January 2024 Technology & Digitalization


The chief executives of five of the top social media platforms, including Meta’s Mark Zuckerberg and Linda Yaccarino of X, were lambasted at a tense and at times emotional hearing in Congress over what lawmakers described as a failure to protect children online. 

The hearing before the Senate judiciary committee — which also featured TikTok’s Shou Zi Chew, Snap’s Evan Spiegel and Discord’s Jason Citron as witnesses — highlighted growing bipartisan concerns that the social media platforms exposed younger users to child predators and provided a marketplace for child pornography.

The committee began by showing an evocative video of victims speaking about sexual abuse on the platforms. The parents of children who had died by suicide after being exploited online also featured.

“Mr Zuckerberg, you and the companies before us — I know you don’t mean it to be so — but you have blood on your hands,” said Lindsey Graham, the South Carolina senator and the committee’s top Republican, in his opening remarks, prompting applause from the crowd in the room. “You have a product that’s killing people.”

Senator Dick Durbin, the committee’s Democratic chair, pointed to statistics showing a rise in the sharing of child sexual abuse material online as well as an increase in reports of the “sextortion” of children. “This disturbing growth in child sexual exploitation is driven by one thing: changes in technology,” he said.

Several proposed pieces of federal legislation targeting the Silicon Valley groups are in train, such as the controversial Kids Online Safety Act, which would require platforms to protect children from online harms.

The Senate and the House have so far failed to find consensus on the precise measures that should be taken. Bills such as the Kids Online Safety Act have faced pushback from technology platforms and the trade groups that represent them. 

During the hearing, the executives remained largely cautious about the current proposals. In his opening testimony, Meta’s Zuckerberg instead called for lawmakers to require Apple’s and Google’s app stores to verify the ages of younger users. He reiterated the company’s long-standing assertion that it had introduced numerous tools and features to protect children. 

In her opening statement, Yaccarino insisted that X, formerly known as Twitter, was “not the platform of choice for children and teens” and “does not have a line of business dedicated to children”. 

Yaccarino, who took the helm of the Elon Musk-owned platform last year, also called for more “collaboration” as advancements in artificial intelligence technology improved offenders’ tactics and capabilities. “You have my commitment that X will be part of the solution,” she said. 

But she also stated that the Kids Online Safety Act “should advance, and we will continue to engage on it to ensure it protects free speech”. 

TikTok’s Chew avoided mention of any specific legislation in his opening statements, but said that the company expected to invest more than $2bn in trust and safety efforts this year.

In a testy exchange, Graham asked Citron, one by one, whether the gaming-focused chat platform supported the various proposed pieces of legislation, such as the Stop CSAM [Child Sexual Abuse Material] Act. Citron avoided answering in the affirmative, prompting Graham to state: “If you’re waiting on these guys to solve the problem, we’re going to die waiting.”

Ahead of the session, lawmakers released internal documents and emails showing that Zuckerberg had rejected requests from global affairs head Nick Clegg in 2021 to increase staffing to bolster the company’s child safety efforts.

Meta said in a statement that the emails showed requests to expand existing wellbeing teams, adding: “The cherry-picked documents do not provide the full context of how the company operates or what decisions were made.”

Meta has been singled out recently, with the US state of New Mexico filing a lawsuit in December arguing the company failed to remove child sexual abuse material from its platforms and was a “prime location for predators”. The accusations followed a months-long undercover investigation in which the attorney-general’s office created “decoy accounts” posing as children aged 14 and under. 

A Wall Street Journal investigation also found its algorithms facilitated the creation of a network to buy and sell underage sex content. Meta said at the time that it had improved its proactive detection of potentially suspicious groups. 

X, meanwhile, faced fresh scrutiny over the weekend after it was forced to block searches for Taylor Swift when sexually explicit images of the pop star that were created using artificial intelligence proliferated on the platform.