Episode details

Over 500 hours of video are posted to YouTube every minute. Over 4 million photos are uploaded to Instagram every hour. Around 500 million posts are made to X (formerly Twitter) every single day. These numbers are growing by the second. How do you even begin to monitor and police such a relentless avalanche of information?

In this new series, Zoe Kleinman journeys into the world of the online content moderators. Big social media platforms rely on automation for much of the work, but they also need an army of human moderators to screen out harmful content. Many moderators spend their days looking at graphic imagery, including footage of killings, war zones, torture and self-harm. We hear many stories about what happens when this content falls through the net, but we don't hear much about the people trying to contain it. This is their story.

The battle against harmful online content is hitting the headlines more every day, even as AI moderation gathers pace. Ironically, AI moderation needs moderating itself.

In the second episode of this series, a former Facebook content moderator reveals the impact this work can have on the mental health of employees. Zoe hears what the day-to-day life of a moderator is like, and the challenges of working out what to keep and what to remove. She also finds out from former moderators and Silicon Valley reporters how this tech landscape is changing today.

Presenter: Zoe Kleinman
Producer: Tom Woolfenden
Assistant Producer: Reuben Huxtable
Executive Producer: Rosamund Jones
Sound Designer: Dan King
Series Editor: Kirsten Lass

A Loftus Media production for BBC Radio 4