YouTube’s ‘Dislike’ and ‘Not Interested’ Buttons Barely Work, Study Finds

Even when users tell YouTube they aren’t interested in certain types of videos, similar recommendations keep coming in, according to new research from Mozilla.

Using video recommendation data from more than 20,000 YouTube users, Mozilla researchers found that buttons such as "not interested," "dislike," "don't recommend channel," and "remove from watch history" are largely ineffective at preventing similar content from being recommended. Even at their most effective, these buttons still let through more than half of the recommendations similar to what a user said they weren't interested in, according to the report. At worst, the buttons barely made a dent in blocking similar videos.

To collect data from real videos and users, Mozilla researchers recruited volunteers who used the foundation's RegretsReporter, a browser extension that overlays a general "stop recommending" button on YouTube videos viewed by participants. On the back end, users were randomly assigned to one of several groups, so a different signal was sent to YouTube each time they clicked the button Mozilla placed: dislike, not interested, don't recommend channel, remove from watch history, or a control group for which no feedback was sent to the platform at all.
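The report doesn't publish the extension's assignment code, but the design it describes is a standard randomized experiment. Below is a minimal Python sketch of that setup; the names (`SIGNALS`, `assign_group`, `send_feedback`) are illustrative assumptions, not the extension's actual API:

```python
import hashlib

# Experiment arms described in the report; "control" sends no feedback.
SIGNALS = ["dislike", "not_interested", "dont_recommend_channel",
           "remove_from_history", "control"]

def assign_group(user_id: str) -> str:
    """Deterministically map a participant to one experiment arm.

    Hashing the user ID keeps each volunteer in the same arm for the
    whole study while spreading participants evenly across the arms.
    """
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    return SIGNALS[int(digest, 16) % len(SIGNALS)]

def send_feedback(video_id: str, signal: str) -> None:
    """Stub for whatever call the extension makes to pass the signal on."""
    print(f"sending {signal!r} for video {video_id}")

def on_button_click(user_id: str, video_id: str) -> None:
    """Handle a click on the overlaid 'stop recommending' button."""
    group = assign_group(user_id)
    if group != "control":  # the control arm sends nothing to YouTube
        send_feedback(video_id, signal=group)
```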

Using data collected from more than 500 million recommended videos, research assistants created more than 44,000 video pairs, each consisting of one "rejected" video plus one video subsequently recommended by YouTube. The researchers then assessed the pairs themselves or used machine learning to decide whether the recommendation was too similar to the video a user had rejected.
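In outline, that pairing step joins each rejected video with the recommendations YouTube served to the same user afterwards, then scores each pair for similarity. A rough sketch, assuming hypothetical record fields and a placeholder `similarity` function standing in for the human raters and the machine-learning model:

```python
from dataclasses import dataclass
from typing import Callable, Iterable

@dataclass
class Event:
    user_id: str
    video_id: str
    timestamp: float

def build_pairs(rejections: Iterable[Event],
                recommendations: Iterable[Event]) -> list[tuple[str, str]]:
    """Pair each rejected video with every later recommendation to the same user."""
    recs = list(recommendations)
    return [(r.video_id, rec.video_id)
            for r in rejections
            for rec in recs
            if rec.user_id == r.user_id and rec.timestamp > r.timestamp]

def is_bad_recommendation(rejected_id: str, recommended_id: str,
                          similarity: Callable[[str, str], float],
                          threshold: float = 0.8) -> bool:
    """Flag a pair when the recommended video is too similar to the rejected
    one; the 0.8 threshold is an invented placeholder, not Mozilla's."""
    return similarity(rejected_id, recommended_id) >= threshold
```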

Compared to the baseline control group, sending the "dislike" and "not interested" signals was only "marginally effective" at preventing bad recommendations, stopping 12% and 11% of them, respectively. The "don't recommend channel" and "remove from history" buttons were slightly more effective, preventing 43% and 29% of bad recommendations, but the researchers say the tools offered by the platform are still insufficient to steer away unwanted content.
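The report's "prevented X% of bad recommendations" phrasing most naturally reads as a reduction in the bad-recommendation rate relative to the control group, though Mozilla's exact formula isn't quoted here. A sketch of that reading, with the rates below invented purely for illustration:

```python
def prevented_fraction(arm_bad_rate: float, control_bad_rate: float) -> float:
    """Share of bad recommendations prevented, measured against the control arm."""
    return 1 - arm_bad_rate / control_bad_rate

# Invented numbers: if the control arm saw bad recommendations 10% of the
# time and the "don't recommend channel" arm saw them 5.7% of the time,
# the button prevented about 43% of bad recommendations.
print(f"{prevented_fraction(0.057, 0.10):.0%}")  # -> 43%
```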

“YouTube should respect user feedback about their experience, treating them as meaningful signals about how people want to spend their time on the platform,” the researchers write.

YouTube spokeswoman Elena Hernandez says these behaviors are intentional because the platform doesn’t try to block all content related to a topic. But Hernandez criticized the report, saying it failed to take into account the design of YouTube’s controls.

"It's important to note that our controls do not filter out entire topics or viewpoints, as this could have negative effects for viewers, such as creating echo chambers," Hernandez told The Verge. "We welcome academic research on our platform, which is why we recently expanded access to the Data API through our YouTube Researcher Program. Mozilla's report doesn't take into account how our systems actually work, and therefore it's difficult for us to glean many insights."

Hernandez says Mozilla's definition of "similar" fails to consider how YouTube's recommendation system works. The "not interested" option removes a specific video, and the "don't recommend channel" button prevents the channel from being recommended in the future, Hernandez says. The company says it does not aim to stop all recommendations of content related to a topic, opinion, or speaker.

Besides YouTube, other platforms like TikTok and Instagram have increasingly introduced feedback tools that let users train the algorithm, supposedly, to show them relevant content. But users often complain that even when they signal that they don't want to see something, similar recommendations persist. According to Mozilla researcher Becca Ricks, it isn't always clear what the various controls actually do, and platforms aren't transparent about how feedback is taken into account.

"I think in the case of YouTube, the platform is balancing user engagement with user satisfaction, which is ultimately a trade-off between recommending content that gets people to spend more time on the site and content the algorithm thinks people will like," Ricks told The Verge by email. "The platform has the power to alter which of these signals carries the most weight in its algorithm, but our research suggests user feedback isn't always the most important one."
