Ex-YouTube employee reveals how the site’s recommendation AI creates a ‘toxic’ cycle to promote extreme or inappropriate content

YouTube has found itself embroiled in more than a few controversies in recent years thanks to its ‘Recommended for You’ feature, which has been criticized for promoting violent and extremist content and, most recently, inappropriate videos of children.

It’s a problem that the firm has been scrambling to correct – but also one that could have been anticipated, according to an engineer who worked on the system.

In an op-ed for Wired, former Google employee Guillaume Chaslot says the root of the issue lies in the design of the recommendation algorithm itself.

The system, Chaslot explains, is built to predict and curate content geared toward the user’s specific interests – be those innocent or nefarious – and gets better and better at its job the more you engage with it.
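The dynamic Chaslot describes can be illustrated with a toy sketch. This is not YouTube's actual system; the video names, watch-time numbers, and learning rate below are all illustrative assumptions. The point is the feedback loop: the system recommends whatever it predicts will be watched longest, observes the resulting engagement, and reinforces that prediction, regardless of whether the content is innocent or nefarious.

```python
# Toy engagement-maximizing recommender (illustrative only, not
# YouTube's real algorithm). Scores estimate expected watch time.

def recommend(scores):
    """Return the video id with the highest predicted engagement."""
    return max(scores, key=scores.get)

def update(scores, video, watched_fraction, lr=0.5):
    """Nudge the engagement estimate toward observed watch behavior."""
    scores[video] += lr * (watched_fraction - scores[video])

# Hypothetical catalog: the 'extreme' video happens to hold attention longer.
scores = {"mainstream": 0.5, "extreme": 0.5}
true_watch = {"mainstream": 0.4, "extreme": 0.9}

# Feedback loop: each recommendation generates engagement data that
# reinforces whatever the user lingers on.
for _ in range(10):
    pick = recommend(scores)
    update(scores, pick, true_watch[pick])

# After a few rounds, the higher-engagement video dominates the ranking.
```

Run repeatedly, the loop settles on recommending the video that maximizes engagement, which is the self-reinforcing cycle the article describes.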
