It’s easy to get hooked on YouTube as it recommends an endless array of videos, each giving you more of the same type of content. But the recommendations aren’t always on target. Sometimes the process goes awry and you end up watching something you are not interested in. Mozilla is curious about why this happens and has created a browser extension to track YouTube’s recommendations.
Identify the problem
Mozilla introduced a new browser extension this week: RegretsReporter. Its goal is to collect reports of YouTube’s “regrettable recommendations”. The hope is that users will gain a better understanding of how the algorithm works and that the reports will reveal details about the patterns behind it.
Mozilla began collecting stories about regrettable YouTube recommendations last year. One example it collected was a user searching for Viking videos and getting white supremacist videos recommended to them. Another YouTube user searched for “fail” videos and received recommendations for horrific fatal car crashes.
Mozilla VP of Advocacy and Engagement Ashley Boyd said that so far there has been no comparable large-scale effort to understand YouTube’s recommendations.
“A lot of attention goes to Facebook – and rightly so – when it comes to disinformation,” Boyd said. “But there are other parts of the digital ecosystem that have been overlooked, and YouTube was one of them. We started looking at what YouTube was saying, how they were organizing the content, and noticed that they were addressing concerns about the algorithm and said they were making progress. But there was no way to verify their claims.”
YouTube doesn’t seem too happy about Mozilla digging into its business and reviewing its recommendations, although a YouTube spokesperson said in a statement that the company remains interested in research on its algorithm.
“However, it is difficult to draw general conclusions from anecdotal examples, and we update our recommendation systems on an ongoing basis to improve the user experience,” the spokesperson said. The spokesperson also noted that over the past year, YouTube has launched “over 30” different changes to reduce recommendations of borderline content.
The video platform has promised to change its algorithm on several occasions. Boyd notes that those promises came even as the platform’s executives knew that videos containing hate speech and conspiracy theories were being recommended.
Mozilla’s browser extension to track YouTube recommendations
Mozilla hopes that the browser extension will make YouTube’s algorithm more transparent. The company wants to know what types of recommended videos lead to racist, violent, or conspiratorial content, and hopes to identify patterns in how unsafe content gets recommended.
“I would love to see people take more interest in how AI, and in this case recommender systems, is affecting their lives,” Boyd added. “It doesn’t have to be a mystery, and we can be clearer on how you can control it.”
If you are concerned about your privacy when Mozilla collects your YouTube browsing information: the data collected will be linked to a randomly generated user ID, not to your YouTube account. Only Mozilla will have access to the raw data, Boyd explains. The extension will not collect data from private browser windows, and when Mozilla shares its results, it will minimize the risk of identifying users.
However, YouTube finds the method proposed by Mozilla “questionable”. For example, the company was not able to review Mozilla’s definition of “regrettable”.
Mozilla plans to collect information for six months and then present its findings to users and to YouTube.
“We believe [YouTube] is committed to this problem,” Boyd said. “We would love for them to learn something more from our research and make viable changes, working toward more reliable systems for recommending content.”
One question that remains unanswered, however, is why Mozilla has taken such an interest in YouTube and its recommendations.
If you’re also concerned about tracking when using Firefox, check out these two simple and effective Firefox add-ons that stop sites from tracking you.