Mozilla wants your help to fix terrible YouTube recommendations

YouTube’s recommendations algorithm can lead you down some strange rabbit holes, suggesting videos that feel uncannily personal and wildly off-target at the same time. Today, Mozilla will introduce a new browser extension, RegretsReporter, that aims to crowdsource research about users’ “regrettable recommendations,” to help users better understand how YouTube’s recommendation algorithm works and to report on any patterns it discovers.
Mozilla started gathering stories from users last year about the videos that YouTube recommended to them; one user searched for videos about Vikings and was recommended content about white supremacy; another searched for “fail” videos and started getting recommendations for grisly videos of fatal car wrecks.
But there hasn’t really been a large-scale, independent effort to track YouTube’s recommendation algorithm to understand how it determines which videos to recommend, said Ashley Boyd, Mozilla’s vice president of advocacy and engagement.
“So much attention goes to Facebook — and deservedly so — when it comes to misinformation,” Boyd said. “But there are other elements in the digital ecosystem that have been under-attended to, and YouTube was one of those. We started to look at what YouTube said, how they curated content, and noticed that they responded to concerns about the algorithm and said they were making progress. But there was no way to verify their claims.”
A YouTube spokesperson said in a statement to The Verge that the company is always interested to see research on its recommendation system. “However it’s hard to draw broad conclusions from anecdotal examples and we update our recommendations systems on an ongoing basis, to improve the experience for users,” the spokesperson said, adding that over the past year, YouTube has launched “over 30 different changes to reduce recommendations of borderline content.”
The Google-owned video platform has promised on numerous occasions to tweak the algorithm, Boyd points out, even as company executives were aware that it was recommending videos containing hate speech and conspiracy theories.

The browser extension will send data to Mozilla about how often you use YouTube, but without collecting information about what you’re searching for or watching unless you specifically offer it. You can send a report via the extension to provide more detail about any “regrettable” video you encounter in the recommendations, which will allow Mozilla to collect information about the video you’re reporting and how you got there.
Mozilla is hoping the extension will make the “how” of YouTube’s recommendation algorithm more transparent: what types of recommended videos lead to racist, violent, or conspiratorial content, for instance, and whether there are patterns in how often harmful content is recommended.
“I would love for people to get more interested in how AI and in this case, recommendation systems, touch their lives,” Boyd said. “It doesn’t have to be mysterious, and we can be clearer about how you can control it.”
Boyd stressed that user privacy is protected throughout the process. The data Mozilla collects from the extension will be linked to a randomly generated user ID, not to a user’s YouTube account, and only Mozilla will have access to the raw data. The extension will not collect data in private browser windows, and when Mozilla shares the results of its research, it will do so in a way that minimizes the risk of users being identified, Boyd said.
Mozilla does not have a formal arrangement with Google or YouTube for its research into the recommendation algorithm, but Boyd says they’ve been in communication with the company and are committed to sharing information.
YouTube, however, said the methodology Mozilla was proposing seemed “questionable,” adding that it wasn’t able to properly review how “regrettable” is defined, among other things.
Mozilla plans to spend six months collecting information from the extension, after which it will present its findings to users and to YouTube. “We believe they are committed to this issue,” Boyd said of YouTube. “We would love it if they could learn anything additional from our research and make some viable changes to work toward building more trustworthy systems for recommending content.”