Amid the furore over the company's failure to police children's content, there is a solution
Given that a vlogger named Logan Paul recently exposed his 15 million subscribers to an image of a newly deceased suicide victim, there’s been plenty of talk about YouTube’s responsibilities as a publisher this week.
Paul, who hails from Ohio, reportedly makes around a million dollars a month from his exploits and describes himself, on his YouTube page, as a “22 year old kid in Hollywood making crazy daily Vlogs!” In general, this involves him doing dumb stuff like teasing his own dog with food tied to a drone (side note: there is a special place in Hell for people who are mean to dogs) and, like, eating really hot tortilla chips? Or something? He shouts a lot, has Trumpian hair and, like a lot of YouTube stars, reminds you of the kind of inane celebrities that emerged in the mid-noughties reality TV boom.
In his now-deleted, most controversial upload, entitled ‘We found a dead body in the Japanese Suicide Forest…’, Paul and his goons visited the Aokigahara forest at the base of Mount Fuji, Japan. The site is a notorious spot for suicides, and the vlogger stumbled upon a dead body. A man had hanged himself from a tree. Before he removed the video, 6.5 million viewers watched Paul joke about the body (“What, you never stand next to a dead guy!?”).
How did the video even find its way to Logan’s young audience? Because the channel has a ‘publish first, ask questions later’ policy. As online culture expert Sarah T. Roberts, an assistant professor of information studies at the University of California, has said: “YouTube is absolutely complicit in these kinds of things, in the sense that their entire economic model… for revenue creation is created fundamentally on people like Logan Paul.”
Back in November, writer James Bridle published an excellent, impassioned blog post about the weird, unpredictable, unregulated nature of YouTube’s algorithmic set-up. On both the main service and its supposedly child-friendly app YouTube Kids, sketchy videos are being served up alongside content designed for young audiences. These can be either a) adult parodies of children’s programmes or b) much uglier Frankenstein’s monsters spliced together by anonymous content creators farming search terms for parts.
The writer cites a video named ‘Wrong Heads Disney Wrong Ears Wrong Legs Kids Learn Colors Finger Family 2017 Nursery Rhymes’. In the video, various Disney characters – Aladdin, The Genie and more – are detached from their own heads. The heads then rotate above them, landing on the wrong bodies, before eventually relocating the correct one. When it’s incorrect, Agnes, the little girl from Despicable Me, begins to cry. When it’s correct, she cheers. In both instances, the sound is that of an actual baby gurgling, which in this context is creepy in the extreme.
Bridle notes of this queasy, unsettling mash-up: “I have no idea where the “Wrong Heads” trope originates, but I can imagine … that somewhere there is a totally original and harmless version that made enough kids laugh that it started to climb the algorithmic rankings until it made it onto the word salad lists, combining with Learn Colors, Finger Family, and Nursery Rhymes… not merely as words but as images, processes, and actions — to be mixed into what we see here.”
Who is making these videos? We don’t know. The channel responsible has only a Gmail account as its contact information and is not accountable in the way that, say, Disney or Cartoon Network or CBBC are. Parents enter into a safe space when their kids watch shows made by these heritage studios, but with YouTube there is no assurance that content is appropriate for young viewers. Bridle found another, much grimmer video that sees well-loved children’s cartoon characters navigate sets from Grand Theft Auto as they murder and bury each other. The video, removed since his post, is called ‘BURIED ALIVE Outdoor Playground Finger Family Song Nursery Rhymes Animation Education Learning Video’.
YouTube CEO Susan Wojcicki has penned a blog post pledging that the company will take a harder line on this issue. “We will continue the significant growth of our teams into next year,” she writes, “with the goal of bringing the total number of people across Google working to address content that might violate our policies to over 10,000 in 2018.”
The problem, though, as many have noted, is that millions of – potentially young – viewers could have watched a violent, exploitative or creepy video before complaints result in its removal. That’s precisely what happened with Logan Paul. As the children’s actor and presenter Ed Petrie (who has worked on shows for CBBC and Nickelodeon) explained on Twitter: “On their YouTube kids app they should vet every video before it’s posted, instead of taking things down in retrospect after they get complaints. It will cost them more money. But they make A LOT of money. And with that comes responsibility.”
Many successful tech companies seem to believe that, because they’re innovative – or ‘disruptive’, to use their self-aggrandising Silicon Valley terminology – they needn’t play by the same rules of responsibility or accountability as everyone else. Uber lost its licence to operate in London back in September, in light of news that the firm had not dealt properly with instances of sexual assault against female customers; TfL fairly claimed that “Uber’s approach and conduct demonstrate a lack of corporate responsibility.” The tech company countered that this “shows the world that, far from being open, London is closed to innovative companies.”
This is the same as Apple and Amazon thinking they’re above paying tax. But it doesn’t matter how forward-thinking you believe you are. If you actively take more away from society – be that unpaid taxes or safe spaces for customers and viewers – than you contribute, you’re welcome to go the way of Logan Paul’s career.
Ed Petrie is correct. YouTube should act like any major broadcaster and take responsibility for the content that it brings to young audiences. It won’t be cheap, but the reportedly $75bn company can afford it.