YouTube executives ignored warnings and still don’t properly manage toxic videos

YouTube leaders ignored proposals to modify the recommendation system to stamp out toxic videos and to tackle the conspiracy theories that seem ever-present whenever YouTube's practices are discussed. A recent report says executives were more concerned with keeping viewers engaged.

Bloomberg spoke with YouTube and Google employees, who say they have raised concerns about the “mass of false, incendiary and toxic content” over the last few years. Some went so far as to suggest that certain executives be replaced, or that the company track the prevalence and popularity of toxic videos to show senior management just how big the problem was. Unfortunately, engagement metrics proved far more important, and the case was closed.

YouTube employees have battled executives over their approach to the platform’s problems. Five senior staff members have left the company over the last couple of years, citing YouTube’s long-standing “inability to tame extreme, disturbing videos” as the reason they departed.

Why YouTube doesn’t act on banning disturbing videos

The short answer appears to be engagement: outlandish and oftentimes inappropriate content keeps viewers watching. Although we haven’t seen any official numbers, earnings from video sharing on the platform are estimated to pull in north of $16 billion per year.

A spokeswoman disputed some of the claims in the report, including the suggestion that CEO Susan Wojcicki turns a blind eye to some of the issues and controversies the company faces. Wojcicki has previously said of the company:

“We’re really more like a library,” staking out a familiar position as a defender of free speech. “There have always been controversies, if you look back at libraries.”

The service has been focused on finding solutions for some of its “toughest content challenges” over the last two years. Reports say that employees have been trying to tackle the issues on their own. One employee wanted to flag troubling videos, which fell just short of the hate speech rules, and stop recommending them to viewers. Another wanted to track these videos in a spreadsheet to chart their popularity. A third, fretful of the spread of “alt-right” video bloggers, created an internal vertical that showed just how popular they were. But all of these measures were met with the same dismissive approach from superiors.

YouTube says its efforts include “updating our recommendations system to prevent the spread of harmful misinformation, improving the news experience on YouTube, bringing the number of people focused on content issues across Google to 10,000, investing in machine learning to be able to more quickly find and remove violative content, and reviewing and updating our policies.”

Long before YouTube pledged to stop recommending conspiracy theory videos, a privacy engineer suggested that videos skirting the edges of the site’s content policies shouldn’t be included in any of the platform’s recommendations. His proposal was rejected at the time, according to the report, though YouTube eventually adopted the idea this January. YouTube has since added other measures to combat false content, including information panels on video pages and search results that offer factual information on sensitive or controversial topics.

Meanwhile, Motherboard reports that white nationalist and neo-Nazi propaganda videos remain freely available on YouTube. The publication shared some examples with the service, which removed advertising from them, added content warnings and made sure they didn’t appear in recommendations. But the videos are still on YouTube and can still be found via search results.