A new study by Mozilla found that when users tell YouTube they are "not interested" in certain types of videos, the platform continues to serve those same users similar recommendations.
Mozilla researchers, drawing on recommendation data from more than 20,000 YouTube users, found that controls like "Not interested," "Dislike," "Don't recommend channel," and "Remove from watch history" were largely ineffective at preventing similar content from being recommended. Even at their best, the report found, these controls still let through more than half of recommendations similar to what a user had said they weren't interested in.
Research assistants reviewed more than 44,000 pairs of videos, each consisting of a rejected video and a video that YouTube later recommended, drawn from more than 500 million recommended videos.
Compared with a baseline control group, sending "Dislike" and "Not interested" signals was only "marginally" effective at blocking bad recommendations, preventing 12 percent and 11 percent of them, respectively. The "Don't recommend channel" and "Remove from history" buttons were more effective, blocking 43 percent and 29 percent of bad recommendations, but researchers say the platform's tools remain inadequate for steering users away from unwanted content.
"YouTube should respect the feedback users share about their experience, treating them as meaningful signals about how people want to spend their time on the platform," the researchers wrote.
YouTube spokeswoman Elena Hernandez said these behaviors are intentional because the platform does not attempt to block all content related to a topic, and criticized the report, saying it “does not take into account how YouTube’s controls are designed.”
"Importantly, our controls do not filter out entire topics or viewpoints, as this could have negative effects for viewers," Hernandez added. "We welcome academic research on our platform, which is why we recently expanded access to the Data API through our YouTube Researcher Program… Mozilla's report doesn't take into account how our systems actually work, and so it's difficult for us to glean many insights."
Other platforms such as TikTok and Instagram have given users more feedback tools to train the algorithm to show them relevant content, but users often complain that even after reporting that they don't want to see something, similar recommendations persist.