YouTube Keeps Recommending Regrettable Videos

A report from the Mozilla Foundation claimed on Wednesday that YouTube users are still being recommended “regrettable” videos, thanks to Google-built algorithms that show users other videos that they might be interested in.

The report reveals instances of YouTube’s algorithm automatically encouraging people to watch videos that the platform itself would typically not want people to see. The term “regrettable” – the word users chose to describe videos they did not want to see – can cover anything from misinformation and so-called “hate speech” to violence.

For some people, this is a huge deal. For others, who actually value free speech, perhaps the only thing worth worrying about here is that YouTube’s algorithms could be encouraging people of all ages – including children – to watch videos that promote or display violence.

The Mozilla Foundation conducted a 10-month investigation using crowdsourced data gathered through its Firefox web browser. A special extension was also built for Google’s Chrome browser, allowing users to more easily report content they believe should not be shared or recommended.

According to the researchers, 71% of the videos that users reported and flagged as “regrettable” came specifically from YouTube’s recommendations. Interestingly, the data also showed that YouTube’s algorithms were promoting objectionable or negative content precisely because those videos were being watched by more people.

Brandi Geurkink, a senior manager at Mozilla, said that YouTube should admit that its algorithms are actively designed in a way that “harms” and “misinforms” users.

It seems pretty shocking that “hate speech” would be promoted by YouTube, given how easy it is to get banned from the platform these days, but it looks like YouTube could be letting some things slip through.

But not for much longer if Mozilla has anything to do with it…