YouTube’s Algorithm Doesn’t Care if You ‘Thumbs Down’ Videos


YouTube has already stopped displaying the number of dislikes a video has received, but apparently giving a video a thumbs down doesn’t change how many similar videos the platform recommends to you.
Photo: Wachiwit (Shutterstock)

My YouTube recommendations are full of old reruns of Gordon Ramsay’s Kitchen Nightmares. It might be partly my mistake for getting drunk one night and watching a full episode. Let me tell you, if there’s one thing I don’t want anymore on my feed it’s the famous blowhard Brit tearing down another chef while the world’s most obnoxious sound effects (braaa-reeeee) shuffle through in the background. I’ve disliked plenty of these videos, but now I’ve got Hell’s Kitchen showing up on my page, and I’m feeling more and more like a “raw” steak that Ramsay is prodding and berating.

But apparently I’m not alone with my YouTube recommendation woes. A report from the Mozilla Foundation released Monday claims, based on a survey and crowdsourced data, that the “dislike” and “don’t recommend channel” feedback tools do not actually change video recommendations.

Well, there are two points here. One is that users constantly feel like the controls Google-owned YouTube provides don’t actually make a difference. Two, based on data gleaned from users, the controls have a “negligible” impact on recommendations, meaning “most unwanted videos still slip through.”

The foundation relied on data from its own RegretsReporter browser plugin tool that lets users block select YouTube videos from appearing on their feed. The report says it based its analysis on 2,757 survey respondents and 22,722 people who allowed Mozilla access to more than 567 million video recommendations collected from the tail end of 2021 through June 2022.

Though the researchers admit the survey respondents are not a representative sample of YouTube’s vast and diverse audiences, a third of those surveyed said that using YouTube’s controls didn’t seem to change their video recommendations at all. One user told Mozilla they would report videos as misleading or spam and they would be back in their feed later on. Respondents often said blocking one channel would only lead to recommendations from similar channels.

YouTube’s algorithm recommends users videos they don’t want to see, and it’s often worse than just old Ramsay cable. A 2021 report by Mozilla, again based on crowdsourced user data, claimed that folks surfing the video platform are regularly being recommended violent content, hate speech, and political misinformation.

In this latest report, Mozilla researchers found that rejecting a video, such as a Tucker Carlson screed, would often just result in another video from the Fox News YouTube channel being recommended. Based on a review of 40,000 video pairs, they found that when one channel was blocked, the algorithm would simply recommend very similar videos from similar channels. Using the “Dislike” or “Not interested” buttons prevented only 12% and 11% of unwanted recommendations, respectively, compared to a control group. The “Don’t recommend channel” and “Remove from watch history” buttons were more effective at correcting users’ feeds, but only by 43% and 29%, respectively.

“In our analysis of the data, we determined that YouTube’s user control mechanisms are inadequate as tools to prevent unwanted recommendations,” Mozilla researchers wrote in their study.

YouTube spokesperson Elena Hernandez told Gizmodo in an email statement that “Our controls do not filter out entire topics or viewpoints, as this could have negative effects for viewers, like creating echo chambers.” The company has said it doesn’t prevent all content from related topics from being recommended, but it also claims to push “authoritative” content while suppressing “borderline” videos that come close to violating content moderation policies.

In a 2021 blog post, Cristos Goodrow — YouTube’s VP of Engineering — wrote their system is “constantly evolving” but that providing transparency on their algorithm “isn’t as simple as listing a formula for recommendations” since their systems take into account clicks, watch time, survey responses, sharing, likes, and dislikes.

Of course, just like every social media platform out there, YouTube has struggled to create systems that can fight the whole breadth of bad or even predatory content being uploaded to the site. One upcoming book shared exclusively with Gizmodo said that YouTube came close to yanking billions of dollars in ad revenue to deal with the strange and disturbing videos being recommended to kids.

While Hernandez claimed the company has expanded its data API, the spokesperson added, “Mozilla’s report doesn’t take into account how our systems actually work, and therefore it’s difficult for us to glean many insights.”

But this is a critique that Mozilla also lays at Google’s feet, saying that the company does not provide enough access to let researchers assess what shapes YouTube’s secret sauce, AKA its algorithms.

