According to the NSPCC, some metaverse apps are “dangerous by design” after a researcher posing as a 13-year-old girl witnessed grooming, sexual material, racist insults and a rape threat in metaverse platform VRChat.
According to the BBC's report, the researcher visited virtual-reality rooms via an Oculus Quest 2 headset, where avatars were simulating sex. She was shown sex toys and condoms, and was approached by numerous adult men.
Although VRChat has a minimum age rating of 13, the researcher was reportedly able to set up her account with a fake Facebook profile – her real identity was never checked. She downloaded the app via Facebook's Meta Quest headset app store.
Writing about her experience, the researcher said that VRChat “definitely felt more like an adult’s playground than a child’s. Everything about the rooms feels unnerving. There are characters simulating sex acts on the floor in big groups, speaking to one another like children play-acting at being adult couples.”
“It’s very uncomfortable, and your options are to stay and watch, move on to another room where you might see something similar, or join in – which, on many occasions, I was instructed to do,” she added.
Metaverse app allows kids into virtual strip clubs https://t.co/PCLMUip9lc
— BBC News Technology (@BBCTech) February 23, 2022
The NSPCC's head of online child safety policy, Andy Burrows, said the findings were "extraordinary", adding: "It's children being exposed to entirely inappropriate, really incredibly harmful experiences."
“This is a product that is dangerous by design, because of oversight and neglect. We are seeing products rolled out without any suggestion that safety has been considered,” he said.
In response, VRChat told the BBC it was “working hard to make itself a safe and welcoming place for everyone. Predatory and toxic behaviour has no place on the platform”.
Meta's product manager for VR integrity, Bill Stillwell, said in a statement: "We want everyone using our products to have a good experience and easily find the tools that can help in situations like these, so we can investigate and take action. For cross-platform apps, we provide tools that allow players to report and block users. We will continue to make improvements as we learn more about how people interact in these spaces."