YouTube has broken some direct business ties with one of its biggest stars, Logan Paul, 10 days after Paul uploaded “We found a dead body in the Japanese Suicide Forest…” to his 15 million subscribers. The video, which Paul initially called “a moment in YouTube history,” quickly drew sharp criticism for its offensive, voyeuristic treatment of a dead body hanging in a tree, and was removed by Paul on Jan. 1. After apologizing for the video, Paul announced he would temporarily suspend his channel to “take time to reflect.”
YouTube issued this statement on Jan. 10: “In light of recent events, we have decided to remove Logan Paul’s channels from Google Preferred. Additionally, we will not feature Logan in season 4 of ‘Foursome’ and his new Originals are on hold.” (Google Preferred is a premium advertising program that aggregates “the top 5 percent of YouTube content” targeting users between 18 and 34.)
An earlier statement from YouTube on Jan. 2 condemned Paul’s video, expressed sympathies with the family of the dead person seen in the video, and reiterated that the video violated YouTube’s community standards regarding “violent or gory content posted in a shocking, sensational or disrespectful manner.” And on Jan. 9, YouTube tweeted the following from its official account:
An open letter to our community: Many of you have been frustrated with our lack of communication recently. You’re right to be. You deserve to know what’s going on. Like many others, we were upset by the video that was shared last week. Suicide is not a joke, nor should it ever be a driving force for views. As Anna Akana put it perfectly: “That body was a person someone loved. You do not walk into a suicide forest with a camera and claim mental health awareness.” We expect more of the creators who build their community on @YouTube, as we’re sure you do too. The channel violated our community guidelines, we acted accordingly, and we are looking at further consequences. It’s taken us a long time to respond, but we’ve been listening to everything you’ve been saying. We know that the actions of one creator can affect the entire community, so we’ll have more to share soon on steps we’re taking to ensure a video like this is never circulated again.
This, however, is the first concrete action YouTube has taken against Paul because of the video. Notably, Paul has not been suspended or removed from YouTube, or from its ordinary monetization network. (Google Preferred attracts premium advertisers and pays at a higher rate than standard monetization.) Paul was issued a strike for violating YouTube’s community standards; three strikes are typically required for a user to be ejected from the platform outright.
A YouTube spokesperson also emphasized that the company’s decision to cut these particular business ties was specific to this case, and did not indicate a change in YouTube’s policies regarding controversial content. Enforcement of those policies continues to be handled separately.
Furthermore, this may not be YouTube’s only response. Its efforts to “ensure a video like this is never circulated again” are not limited to Paul’s removal from the Preferred network or his projects with YouTube Red. A YouTube spokesperson said that the platform was always seeking to improve its efforts at moderating and removing offensive and inappropriate videos.
Google took similar steps against PewDiePie last year when it was revealed that the YouTube star had posted videos with anti-Semitic comments.
Jason Kint, CEO of Digital Content Next, was unimpressed. “Google’s response was meaningless,” he said. “I’m interested in where they ultimately go on this. On one side, they have their normal safe-for-Google strategy to tune the dials on the algorithm. On the other side, they have 80,000 employees [and] $35 billion in profit to come up with real solutions … A commitment to resources, both human and engineering, to better flag and prevent virality of garbage.”
YouTube has two problems: what to do in specific cases like PewDiePie’s or Logan Paul’s, where a featured entertainer has proved toxic to the YouTube brand, and how to police the sheer scale of its platform, with millions of videos across the site. How can it be a source of trusted entertainment for its audience and advertisers, but also an open platform and the main video repository for the web?
Videos on Preferred channels go through the same process of content moderation in cases like Paul’s as any other video on YouTube: an algorithmic process or a community member flags the content, and a human reviewer confirms or denies that flag; sometimes, in ambiguous cases, a higher-level reviewer is called in. The video can be removed, or it can be allowed to stay on the platform but demonetized. (A move to more aggressively demonetize YouTube content deemed potentially offensive was dubbed “the Adpocalypse.”)
In the case of Paul’s video, YouTube viewers flagged it as violating community standards, but a human reviewer initially approved the video. According to a YouTube spokesperson, “Sometimes videos go through more than one review to ensure our community guidelines are being accurately enforced. These decisions can be difficult and nuanced, which is why we have escalation paths in place to ensure correct decisions are made. As we noted, this video did violate our community guidelines. Had Logan Paul not taken it down, we would have.”
It’s not clear whether more tech or more people would have kept Paul’s video off the platform for a day. YouTube has so far opted to handle these cases with as light a touch as it can, withholding future money but not escalating its punishment system, NFL-style, to respond to criticism by making an example of bad actors. And its global changes, as in the case of the Adpocalypse, seem equally prone to criticism for overreaching. Tweaked guidelines and uncomfortable apologies, rather than anything that would radically upset the platform, seem like the most likely future.
By Tim Carmody