
Mozilla researchers have found that user controls on YouTube videos have little effect on the platform's recommender algorithms, The Verge reports.
The researchers analyzed data from more than 20,000 users of the service who installed the RegretsReporter browser extension, which lets users flag content and hide unwanted videos.
According to Mozilla, the “Dislike,” “Not interested,” and “Don’t recommend channel” buttons have little to no effect on the recommendation system: users continued to encounter similar content.
The study's authors argue that, at best, the algorithms fail at the task, leaving more than half of the unwanted videos in users' feeds; at worst, the buttons have barely any effect at all.
On average, the “Dislike” option affected recommendations in 12% of cases and “Not interested” in 11%. The “Don’t recommend channel” and “Remove from watch history” buttons proved more effective, at 43% and 29%, respectively.
The researchers believe the video service should address the problem and recommend fewer unwanted videos.
“YouTube should respect user feedback on their experience as meaningful signals about how people want to spend their time on the platform,” they said.
YouTube criticized the study's findings, saying Mozilla failed to account for many factors that shape how the algorithms work.
According to company spokesperson Elena Hernandez, the system deliberately surfaces some unwanted videos so that users do not end up in a filter bubble.
“It’s important that our controls don’t filter out entire topics or points of view, as this can have negative repercussions for viewers,” she said.
Hernandez noted that the video and channel feedback buttons affect specific videos and creators rather than topics in general. The video service will not stop recommending all content related to a topic, opinion, or speaker, she added.
Recall that in September 2022, researchers reported an increase in user trust in AI moderators.
In October 2021, a former Meta employee accused Facebook of deliberately using algorithms to incite hate for profit.
Subscribe to Cryplogger news in Telegram: Cryplogger AI – all the news from the world of AI!