Why YouTube must act to tackle ‘undesirable’ content

The start of 2018 was greeted by a scandal that quickly received international condemnation. Logan Paul, a former Vine star turned YouTuber, had uploaded a video as part of his Japan tour exploring the infamous Aokigahara ‘suicide’ forest, in which he stumbles upon a recently deceased individual. Filming in close proximity to the corpse, he blurs only the deceased’s face, leaving the rest of the body on graphic display. Worse still, Paul insensitively makes crude jokes such as “have you never seen a dead body before?” in response to a crew member’s disgust at the scene. Such behaviour sat uneasily with his original defence of the video: that its intent was suicide prevention.

The international condemnation was immediate, with #LoganPaul trending at number one on Twitter. On YouTube, Paul’s video trended at a similar position until he chose to take it down himself. YouTube (who acknowledged their delayed response) eventually reprimanded Paul by removing him from Google Preferred, and his appearances on the platform’s subscription service have been halted until further notice.

YouTube’s statement promised further action: “We know that the actions of one creator can affect the entire community, so we’ll have more to share soon on steps we’re taking to ensure a video like this is never circulated again.” The impression given is that this was an isolated incident in YouTube’s long reign as a video platform. Yet Paul’s insensitive video is only another deposit into the dark, murky waters of the platform.

The ascent of PewDiePie, YouTube’s most famous creator, had been glorious. Felix Kjellberg, who started off with a gaming channel focused on his ‘hilarious’ reactions to horror games, really took off in the community: he currently has over 60 million subscribers, a figure which continues to grow. Kjellberg’s recent career, however, has been a hotbed of scandals, insensitivity and a veer towards alt-right shenanigans. It can be argued that Kjellberg began to propagate Nazi imagery from late 2016 onwards – in one instance, he released a video in which clips of Hitler speaking feature alongside Kjellberg, who wore a military uniform. Like internet trolls and the alt-right, Kjellberg continued to test boundaries, culminating in his paying $5 for two Indian men to dance and laugh while holding a sign reading “Death to all Jews.”

The platform has become a cesspool for some truly undesirable content

In February 2017, as a result of this video, PewDiePie was quickly dropped from Google Preferred, and was dropped by Disney after pressure from a Wall Street Journal inquiry. Yet he still had immense support from the YouTube community, and this support endured when he was condemned again for using the ‘n-word’ during a livestream. These blatantly anti-Semitic and racist actions were explained away as ‘satire’ and ‘just a joke’ by a community which holds Kjellberg as its ‘hero’. In their eyes, his success represents the advent of a new platform and a new celebrity; he is just an ordinary guy who ‘worked hard’ and made it. He has, however, also become a ‘hero’ in the eyes of the alt-right. The neo-Nazi and white supremacist website The Daily Stormer has notoriously claimed that he is ‘their guy’ in normalising their vitriolic ideology.

Putting these individual YouTubers aside, it is also important to focus on the YouTube community as a whole. Prank channels have been favoured by YouTube’s algorithms since the beginning of the platform. However, the genre’s content has taken a more sinister turn towards sexual harassment, exploitation, and mental trauma. Among the most harrowing examples was “KILLING BEST FRIEND PRANK” by Sam Pepper, in which he kidnaps a man and forces him to watch another friend get ‘murdered’, the victim screaming and sobbing “we’re just kids”. Pepper was eventually run off the platform after being subjected to intense scrutiny from both the YouTube and international community. Prior to this, his videos regularly graced the trending list and his ‘social experiments’ went largely uncontested, decried only by certain feminist groups who noticed his tendency towards harassment early on.

Appallingly, YouTube has also had an issue with an underground paedophile community. This was discovered by YouTubers, notably Pyrocynical and H3H3productions, who brought it to the platform’s attention. The videos did not show children nude, but depicted them in suggestive positions. The comments on these videos were vile, clearly left by individuals hoping to pervert these otherwise innocuous videos. There was even evidence of an underground trade between possible paedophiles: “Videos to exchange, message me” is one such example, with many commenters replying below asking for more.

The ‘it’s just a joke’ mantra… has particularly gained traction with the rise of angry-white-male identity politics, whose adherents have defended their ‘hateful’ content under free speech and troll culture

Simply put, the platform has become a cesspool for some truly undesirable content. YouTube has often argued that its ‘community guidelines’ and its commitment to employing over 10,000 people to regulate content are good enough. Recent statements from YouTube’s Robert Kyncl show the platform absolving itself with the defence that: “We [YouTube] are not content creators; we’re a platform that distributes the content”. The argument is that their noble commitment to an open platform has necessarily incurred such costs. Seemingly, YouTube is willing to ‘tolerate’ problematic content as long as it doesn’t raise flags among its advertisers.

Undesirable content is largely a cultural problem. The ‘it’s just a joke’ mantra is not a sudden development, but a sinister notion that has developed over the years. It has particularly gained traction with the rise of angry-white-male identity politics, whose adherents have defended their ‘hateful’ content under free speech and troll culture. If YouTube is serious about regulation, they ought to take a long, hard look at the creators they promote and allow to propagate. They should crack down on hateful and exploitative content to make clear that their platform is not for the deplorable. Unless YouTube is willing to get off its high horse, it will continue to enable undesirable content to be shared. This has detrimental ramifications not only online, but also socio-politically for the international community as a whole.