YouTube Is Much to Blame for Logan Paul's 'Suicide Forest' Video
Paul, who has over 15 million subscribers and is part of YouTube’s Red subscription service, later deleted the footage and posted an apology online. Although the video was taken down less than 24 hours after posting, more than six million people had already seen it.
It has since emerged that the video was apparently given the green light by YouTube’s moderation team, the group responsible for reviewing content and deeming each video acceptable or not.
The revelation surfaced after a member of the platform’s own content assessment team posted a screenshot showing the approval of Paul’s video. It appears to have been approved on January 1, shortly after being flagged by concerned viewers.
Image via YouTube
Come to think of it, the very person responsible for letting unsuitable content onto the platform is the same person paid to keep such content from being posted. Irony at its finest, so to speak.
The footage showed a hanging body (which also served as the video’s thumbnail) and was titled “We found a dead body in the Japanese Suicide Forest.” Despite the disturbing title and scenes, it still passed YouTube’s moderation process, and it quickly ranked as one of the platform’s trending videos at the time.
What is worse is that most of Paul’s subscribers are children under the age of 18. YouTube’s own guidelines state that “it’s not okay to post violent or gory content that’s primarily intended to be shocking, sensational, or disrespectful.” Paul’s video clearly violated this.
Image via NYMag
The YouTube star has since apologized – for the second time – following the outrage. Unfortunately for the video platform, attention has now turned to how and why the footage was not removed immediately.
A YouTube spokesperson recently released a statement (via TechCrunch) concerning the issue:
Our hearts go out to the family of the person featured in the video. YouTube prohibits violent or gory content posted in a shocking, sensational or disrespectful manner. If a video is graphic, it can only remain on the site when supported by appropriate educational or documentary information and in some cases it will be age-gated. We partner with safety groups such as the National Suicide Prevention Lifeline to provide educational resources that are incorporated in our YouTube Safety Center.
It cannot be denied that YouTube’s supposedly strict moderation system is broken. The platform has pledged to invest further in artificial-intelligence moderation tools in hopes of bolstering its moderation capacity.