Pinterest admits site was not safe before death of British teen Molly Russell
A senior executive at Pinterest has admitted to a British inquest that the social media site was not safe at the time a 14-year-old girl took her own life after viewing harmful posts about self-harm and suicide.
Jud Hoffman, Pinterest’s head of community operations, said it was “correct” that the social media site was not safe when Molly Russell took her life in November 2017, adding there was “clearly content that should have been removed that was not removed”.
He also told North London Coroner’s Court on Thursday that there was “absolutely a risk that . . . content that violates our policies” remained on the site despite updates to make it safer. Asked by senior coroner Andrew Walker if Pinterest was safe today, Hoffman said it was “imperfect and we strive every day to make it safer and safer”.
Russell, from Harrow, London, ended her life after consuming a large volume of content linked to self-harm, depression and suicide on social media sites including Pinterest and Meta-owned Instagram.
On the third day of the hearing, Russell’s father, Ian Russell, said social media algorithms had pushed his daughter towards graphic and disturbing posts and blamed tech sites for contributing to her death.
“I believe that social media helped kill my daughter and I believe that too much of that content is still there,” he said.
Giving evidence, Hoffman said: “I deeply regret that she was able to access some of the content,” adding, when pressed by barrister Oliver Sanders, KC, representing the Russell family, that he would not show some of the posts to his own children.
Hoffman said Pinterest’s guidance was “when in doubt, lean towards . . . lighter content moderation” at the time when Russell ended her life. He said Pinterest’s policy in 2017 was to hide images that “may be considered disturbing” but not remove them. The company only removed images or posts that promoted harmful behaviour or self-harm.
“It’s not what is in place now, but it’s what was in place at the time,” Hoffman said.
The inquest also heard on Thursday that Pinterest had recommended content to Molly Russell that included an image of a bloody razor. According to Sanders, the platform emailed her “10 depression pins you might like”, and after her death her account continued to receive emails from Pinterest, including one entitled “new ideas for you in depression”.
The hearing is shining a light on the role social media algorithms play in pushing distressing content towards users who seek it out, and has ignited debate about the need for better regulation of technology companies.
Hoffman said Pinterest’s community guidelines and enforcement tools were tightened in 2019 to proactively remove harmful content from the site, adding that the “technology that we have in place now was simply not available to us then”.
On Thursday morning, Ian Russell told the inquest that Pinterest had clearly made improvements to its content moderation, but argued that tech platforms needed to go much further to keep users safe. “If a 14-year-old can find a way of locating that content . . . I find it really hard to believe that some of the most powerful global brands in the world . . . can’t find a way to . . . help prevent the content reaching vulnerable people,” he said.
Hoffman and Elizabeth Lagone, head of health and wellbeing at Meta, have been called to appear in person after the coroner said they were not allowed to testify via remote link.
The high-profile hearing comes as the passage through parliament of the online safety bill, which aims to compel internet companies to keep their platforms safe, has been paused. Liz Truss, the new prime minister, is said to be considering relaxing a controversial clause that would make platforms responsible for removing content that was “legal but harmful”, such as bullying.