YouTube’s Recommendations Seem to Favor Fringe Content


It starts with an innocent search of the day's headlines on YouTube. Things seem normal at first, but the site eventually pulls you further and further away from mainstream content and toward fringe content designed, above all, to keep you watching. Before you know it, you have fallen down a rabbit hole of videos from anti-vaxxers, flat-earthers and other conspiracy theorists claiming, for example, that the government is behind the latest mass shooting.

“YouTube is something that looks like reality, but it is distorted to make you spend more time online,” former YouTube engineer Guillaume Chaslot says. “The recommendation algorithm is not optimizing for what is truthful, or balanced, or healthy for democracy.”

“Watch time was the priority,” Chaslot says. “Everything else was considered a distraction.”
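
To make Chaslot's point concrete, here is a minimal, purely hypothetical sketch of what a recommender that optimizes only for watch time might look like. The names (`Video`, `predicted_watch_minutes`, `factual_accuracy`) are illustrative assumptions, not a description of YouTube's actual system; the point is simply that a quality signal can exist in the data and still never enter the ranking.

```python
# Hypothetical sketch: ranking candidate videos purely by predicted watch time.
# These names and fields are illustrative, not YouTube's real internals.
from dataclasses import dataclass


@dataclass
class Video:
    title: str
    predicted_watch_minutes: float   # model's guess at how long the user will watch
    factual_accuracy: float          # 0.0-1.0 quality signal, never used by the ranker


def rank_by_watch_time(candidates: list[Video], k: int = 20) -> list[Video]:
    """Return the top-k candidates ordered only by expected watch time.

    factual_accuracy never enters the sort key, which is the criticism:
    engagement is the objective, not truthfulness or balance.
    """
    return sorted(candidates, key=lambda v: v.predicted_watch_minutes, reverse=True)[:k]


if __name__ == "__main__":
    pool = [
        Video("Balanced news recap", predicted_watch_minutes=4.0, factual_accuracy=0.9),
        Video("Shocking conspiracy exposed!", predicted_watch_minutes=11.0, factual_accuracy=0.1),
    ]
    for video in rank_by_watch_time(pool):
        print(video.title)  # the high-watch-time video ranks first, regardless of accuracy
```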

Chaslot says he tried multiple times to get the platform to stop creating what he called "filter bubbles": streams of recommended videos that only reinforce people's existing views. YouTube fired him in 2013, citing what the company says were performance issues; Chaslot maintains he was let go because he kept pushing the company to change the way it did things.

He went on to found Algotransparency.org, a website that Chaslot says is dedicated to exposing the bias toward extreme, divisive content in YouTube's algorithm.

One of Chaslot's most notable findings is that YouTube's algorithm may not have been neutral during the 2016 presidential election between Donald Trump and Hillary Clinton. Chaslot claims the algorithm predominantly drove viewers toward negative content about Hillary Clinton and positive content about Trump.

In his conclusion, Chaslot wrote: "We have examined the network of YouTube A.I. recommendations and seen that YouTube's A.I. mostly recommended Trump before the election."

The tilt toward Trump was not the only bias his analysis revealed, Chaslot says. "Analyzing the hundreds of recommendations led to observing numerous type of bias: asymmetry of recommendations towards candidates, fake news, anti-media resentment, etc… The most prevalent bias seemed to me that recommendations were divisive. The asymmetry towards one candidate might be explained by the fact that he was more divisive than his opponent," he said.

Little is publicly known about YouTube's algorithm, but Chaslot claims its most important output is the twenty or so videos served up in the recommended "watch next" section. Those videos are meant to entice viewers and keep them on the site longer. The problem, he says, is that they very often represent left- or right-wing fringe opinions.
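
A filter bubble of this kind can emerge from nothing more exotic than favoring candidates that resemble what a viewer has already watched. The sketch below is a hypothetical illustration under that assumption; the topic vectors, the `watch_next` function and the similarity measure are invented for the example and are not drawn from YouTube's actual ranking features.

```python
# Hypothetical sketch of a "watch next" shelf that favors similarity to past views.
# Topic vectors and helper names are illustrative assumptions, not real ranking features.
import math


def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two topic vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0


def watch_next(history_topics: list[float],
               candidates: dict[str, list[float]],
               k: int = 20) -> list[str]:
    """Pick the k candidates whose topics most resemble what the user
    already watches -- each pick pulls the next session deeper into the
    same bubble, since it becomes part of the history in turn."""
    scored = sorted(candidates.items(),
                    key=lambda item: cosine_similarity(history_topics, item[1]),
                    reverse=True)
    return [title for title, _ in scored[:k]]


if __name__ == "__main__":
    history = [0.9, 0.1]  # a viewer who mostly watches topic A
    pool = {
        "More of topic A": [0.95, 0.05],
        "Opposing viewpoint (topic B)": [0.10, 0.90],
    }
    print(watch_next(history, pool, k=1))  # -> ['More of topic A']
```

Under this toy model, content that challenges the viewer's existing interests never makes the shelf, which is exactly the reinforcement effect Chaslot describes.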

YouTube CEO Susan Wojcicki, responding to criticism of the platform’s algorithm in an interview, acknowledged that some content creators have tried to take advantage of YouTube’s formula. “Some have tried to take advantage of our services,” she said. “It’s incredibly important to me and to everyone at YouTube that we grow responsibly. There isn’t a playbook for how open platforms operate at our scale.”

She also said the company's focus this year is "violative content," with a team of 10,000 employees dedicated to addressing it by the end of the year. Another change Wojcicki touted: videos that carry advertisements will now be reviewed by humans to ensure they follow the platform's community guidelines.

“This may signal that major brands believe that YouTube’s significant efforts and investment to curb content violators are working,” marketing intelligence firm MediaRadar said in a statement.

Photo by Rego Korosi via Flickr
