YouTube Plans Changes to Curb Conspiracy, “Clickbaity” Videos


Video portal YouTube will make it harder to find videos that espouse conspiracy theories or use misleading "clickbait" to fool viewers into clicking on them, the company announced last month.

“When recommendations are at their best, they help users find a new song to fall in love with, discover their next favorite creator, or learn that great paella recipe. That’s why we update our recommendations system all the time—we want to make sure we’re suggesting videos that people actually want to watch,” the company wrote in a blog post.

“To that end, we’ll begin reducing recommendations of borderline content and content that could misinform users in harmful ways—such as videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11.”

The company stressed that the changes won't affect the availability of videos on the site, as long as they comply with its Community Guidelines; only recommendations of such videos will be limited.

The company says the change will be gradual and will initially affect only videos in the United States. The changes will roll out to more countries as the systems that generate recommendations for viewers become more accurate.

YouTube has been under pressure for some time to change the criteria it uses to recommend videos to its users. Former employees have heavily criticized the tech giant for prioritizing the length of time users spend on the site watching videos, at the expense of the information – sometimes harmful – that those videos promulgate.

“YouTube is something that looks like reality, but it is distorted to make you spend more time online,” former YouTube engineer Guillaume Chaslot says. “The recommendation algorithm is not optimizing for what is truthful, or balanced, or healthy for democracy.”

“Watch time was the priority,” Chaslot says. “Everything else was considered a distraction.”

Chaslot says he was fired in 2013 after he tried several times to get the site to stop creating what he called "filter bubbles": video recommendations that only reinforce people's existing views. He went on to found Algotransparency.org, a website dedicated to exploring the bias toward extreme, divisive content in YouTube's algorithm.

YouTube says Chaslot’s termination was performance-related.

One of Chaslot's biggest findings is that YouTube's algorithm may not have been neutral during the 2016 presidential election. Chaslot claims the algorithm predominantly drove viewers toward content that was negative about Hillary Clinton and positive about Donald Trump.

Chaslot says, however, that the inherent bias in YouTube's algorithm is toward division rather than toward a particular candidate or candidates.

“Analyzing the hundreds of recommendations led to observing numerous type of bias: asymmetry of recommendations towards candidates, fake news, anti-media resentment, etc…” Chaslot said.

“The most prevalent bias seemed to me that recommendations were divisive. The asymmetry towards one candidate might be explained by the fact that he was more divisive than his opponent.”

YouTube's algorithm is a secret, but Chaslot says the most important aspect of the formula is the twenty or so videos served up in the recommended "watch next" section. Those videos are meant to keep viewers on the site longer. The problem is that the information in them very often represents conspiracy theories and fringe opinions.

The company says that last month’s initiative was only one in a series it plans to implement to address the issue. “It’s just another step in an ongoing process, but it reflects our commitment and sense of responsibility to improve the recommendations experience on YouTube,” the company said.

Photo by Andrew Perry via Flickr
