20 Sep
Tech

YouTube’s Dislike Button Rarely Shifts Video Recommendations, Researchers Say

For YouTube viewers dissatisfied with the videos the platform has recommended to them, pressing the “dislike” button may not make a big difference, according to a new research report.

YouTube has said users have numerous ways to indicate that they disapprove of content and do not want to watch similar videos. But, in a report published on Tuesday, researchers at the Mozilla Foundation said all of those controls were relatively ineffective. The result was that users continued receiving unwanted recommendations on YouTube, the world’s largest video site.

Researchers found that YouTube’s “dislike” button reduced similar, unwanted recommendations by only 12 percent, according to their report, titled “Does This Button Work?” Pressing “Don’t recommend channel” was 43 percent effective in reducing unwanted recommendations, pressing “not interested” was 11 percent effective and removing a video from one’s watch history was 29 percent effective.
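The article does not define the effectiveness metric, but a plausible reading, and only an assumption here, is that “X percent effective” means a relative reduction in the rate of unwanted recommendations compared with a baseline in which no controls were used. A minimal sketch under that assumption:

```python
# Illustrative only: assumes "effectiveness" means the relative reduction in
# the rate of unwanted recommendations versus a baseline with no controls.
def effectiveness(baseline_rate: float, control_rate: float) -> float:
    """Fractional reduction in unwanted recommendations after using a control."""
    return 1 - control_rate / baseline_rate

# Hypothetical numbers: if 10% of baseline recommendations were unwanted and
# the rate fell to 5.7% after pressing "Don't recommend channel", the control
# would be 43 percent effective, matching the figure reported above.
print(f"{effectiveness(0.10, 0.057):.0%}")  # prints 43%
```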

The researchers analyzed more than 567 million YouTube video recommendations with the help of 22,700 participants. They used RegretsReporter, a tool Mozilla developed to study YouTube’s recommendation algorithm, which collected data on participants’ experiences on the platform.
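For a concrete sense of what such a tool might record per recommendation, here is a purely hypothetical sketch; the field names and structure are invented for illustration and are not RegretsReporter’s actual schema:

```python
# Purely hypothetical sketch of a per-recommendation record a study extension
# might collect; every field name here is invented, not the tool's real schema.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class RecommendationEvent:
    participant_id: str          # pseudonymous study ID, not a YouTube account
    video_id: str                # the recommended video
    source_video_id: str         # video being watched when the recommendation appeared
    control_used: Optional[str]  # e.g. "dislike", "not_interested", or None
    seen_at: datetime            # when the recommendation was shown

event = RecommendationEvent(
    participant_id="participant-0001",
    video_id="vid-recommended-42",
    source_video_id="vid-watched-07",
    control_used="dislike",
    seen_at=datetime.now(timezone.utc),
)
```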

Jesse McCrosky, one of the researchers who conducted the study, said YouTube should be more transparent and give users more influence over what they see.

“Maybe we should actually respect human autonomy and dignity here, and listen to what people are telling us, instead of just stuffing down their throat whatever we think they’re going to eat,” Mr. McCrosky said in an interview.

One research participant asked YouTube on Jan. 17 not to recommend content like a video about a cow trembling in pain, which included an image of a discolored hoof. On March 15, the user received a recommendation for a video titled “There Was Pressure Building in This Hoof,” which again included a graphic image of the end of a cow’s leg. Other examples of unwanted recommendations included videos of guns, violence from the war in Ukraine and Tucker Carlson’s show on Fox News.

The researchers also detailed an episode of a YouTube user expressing disapproval of a video called “A Grandma Ate Cookie Dough for Lunch Every Week. This Is What Happened to Her Bones.” For the next three months, the user continued seeing recommendations for similar videos about what happened to people’s stomachs, livers and kidneys after they consumed various items.

“Eventually, it always comes back,” one user said.

Ever since it developed a recommendation system, YouTube has shown each user a personalized version of the platform that surfaces videos its algorithms determine viewers want to see based on past viewing behavior and other variables. The site has been scrutinized for sending people down rabbit holes of misinformation and political extremism.

In July 2021, Mozilla published research finding that YouTube itself had recommended 71 percent of the videos that participants reported as featuring misinformation, hate speech and other unsavory content.

YouTube has said its recommendation system relies on numerous “signals” and is constantly evolving, so providing transparency about how it works is not as easy as “listing a formula.”

“A number of signals build on each other to help inform our system about what you find satisfying: clicks, watch time, survey responses, sharing, likes and dislikes,” Cristos Goodrow, a vice president of engineering at YouTube, wrote in a corporate blog post last September.
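To make the quotation concrete, the sketch below shows one toy way such signals could be combined into a single satisfaction score. It is not YouTube’s system: the real ranker is a large learned model, and the weights, structure and names here are invented for illustration.

```python
# Toy illustration of combining the signals Goodrow names (clicks, watch time,
# survey responses, sharing, likes, dislikes) into one satisfaction score.
# YouTube's real ranker is a large learned model; these weights are invented.
SIGNAL_WEIGHTS = {
    "click_rate": 1.0,
    "watch_time_minutes": 0.5,
    "survey_satisfaction": 2.0,   # explicit survey feedback weighted heavily
    "shares": 1.5,
    "likes": 1.0,
    "dislikes": -1.0,             # negative feedback pulls the score down
}

def satisfaction_score(signals: dict) -> float:
    """Weighted sum of engagement signals (toy example, not YouTube's formula)."""
    return sum(SIGNAL_WEIGHTS.get(name, 0.0) * value
               for name, value in signals.items())

# A disliked video can still score well if other signals are strong:
video = {"click_rate": 0.8, "watch_time_minutes": 6.0, "dislikes": 1.0}
print(satisfaction_score(video))  # 0.8 + 3.0 - 1.0 = 2.8
```

Under any such weighting, a single dislike can be outweighed by strong engagement elsewhere, which would be consistent with the pattern the Mozilla researchers observed.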
