
How YouTube leads viewers down a rabbit hole of extremism

Two related articles about YouTube caught my eye over the past few days. The first, Zeynep Tufekci’s YouTube, the Great Radicalizer, explains how YouTube’s recommendation algorithm almost always leads people toward conspiracy theory videos:

It seems as if you are never “hard core” enough for YouTube’s recommendation algorithm. It promotes, recommends and disseminates videos in a manner that appears to constantly up the stakes. Given its billion or so users, YouTube may be one of the most powerful radicalizing instruments of the 21st century. […]

What we are witnessing is the computational exploitation of a natural human desire: to look “behind the curtain,” to dig deeper into something that engages us. As we click and click, we are carried along by the exciting sensation of uncovering more secrets and deeper truths. YouTube leads viewers down a rabbit hole of extremism, while Google racks up the ad sales.

This is bad enough, but then there’s James Cook’s article, YouTube suggested conspiracy videos to children using its Kids app, in which he explains that not even the YouTube Kids app is immune to this:

YouTube’s app specifically for children is meant to filter out adult content and provide a “world of learning and fun,” but Business Insider found that YouTube Kids featured many conspiracy theory videos which make claims that the world is flat, that the moon landing was faked, and that the planet is ruled by reptile-human hybrids.

I try not to be too quick to call technology evil, but this is definitely not an “all technology is neutral” situation. Product managers and developers have the power to stop this kind of escalation from happening.