When a new technology surprises us enough, we have a habit of observing that it’s “like something from science fiction”. We also have a tendency to forget, as we use that phrase, that most science fiction tends towards the bleak.
Earlier this month it was revealed that UK police were investigating their first sexual assault case related to conduct in the metaverse – the much-hyped virtual reality-led next generation of social media.
According to the reports, which do not name which of the various metaverse apps the alleged assault took place in, an underage teenage girl was “gang raped” within the online universe. While this is the first such instance to be investigated by UK authorities, it is far from the first to have taken place. Grimly, it will not be the last.
It is important, then, to think about what has actually taken place – abstracted out from this case, about which little is known. The first important thing to stress is that obviously no physical harm can come about directly as the result of sexual violence in the metaverse – a virtual reality headset shows you a virtual world and lets you hear the real voices of other users, but there is no physical sensation alongside it.
An attack in the metaverse, then, more closely resembles a greatly intensified version of online abuse or harassment. We should not understate how serious this can be: a VR headset takes up your whole field of vision and, just as importantly, lets you hear the real voices of the people attacking you.
If someone has previously regarded an online community as a safe one, or somewhere integral to their life, it becoming unsafe is as much of a loss as losing access to a real-world community – and far less invasive forms of online abuse have been shown to cause serious trauma in their targets. This is not a minor matter.
SumOfUs, a corporate accountability non-profit advocacy organisation, conducted research into assaults in the metaverse, publishing a paper in 2022 focused on the largest single metaverse community: Horizon Worlds, run by Meta (the company previously known as Facebook).
Despite being the largest established metaverse, Horizon Worlds is something of a flop: at its peak it claimed 300,000 monthly users, and this was falling rather than rising last time any figures were reported. Immersive metaverse communities were hailed by the tech elite as the future, but the general populace clearly feels differently.
That seems to have made such communities risky for anyone entering them with a female-presenting avatar. When SumOfUs sent researchers in, several quickly found themselves in sexually aggressive situations; one reported that a group of men imitated group sex with her avatar, all without her consent, while passing around a virtual bottle of vodka.
Footage was posted to accompany the report, and watched on mute it is more surreal than disturbing: the camera bounces around while a floating bottle of vodka obscures much of the view. Characters in Horizon Worlds are cartoonish floating torsos whose clothing cannot be removed, so all that really happens on screen is some weird cartoon jumping. The difference becomes clear when you can hear the real men's voices accompanying the nonsensical action. It is obvious that the men are enjoying the idea, and not hard to see how anyone subjected to such an attack could come to believe those men would wish to do the same in the real world.
This leaves us in a strange and uncomfortable position. We have not solved any of the problems of existing social media, interacted with through the relatively safe medium of a screen, with easy blocking tools to hand. Abuse, misinformation and targeted harassment on most such platforms are getting worse, not better.
Given that, launching new versions of these networks where such assaults feel so much more real and invasive seems as reckless as it is stupid. If such communities are to grow, they need to be policed, and they need agreed rules of the road. As things stand, it feels as if most of us have voted with our wallets and passed on the metaverse altogether.
Meta insists that Horizon Worlds, like most metaverse applications, is intended strictly for adults, so in theory no underage teenagers should be there in the first place. But, as is evident, there is very little stopping teens from faking their ages and getting in.
As to what can be done now, in practical terms it is worth asking those who spend the most time covering and thinking about technology what works in protecting children and vulnerable internet users.
One veteran tech editor – on the beat for 30 years or more – is a parent himself, and often expresses amazement that most parents let their children (and even younger teens) use the internet unsupervised. The rule in his house was that any screen that connected to the internet had to stay in the family living room, and could never go upstairs.
His view is that allowing a child to browse entirely unsupervised is like allowing them to wander a strange town unaccompanied. It is a compelling one.
It is not clear whether the UK’s current legal regime on sexual offences is up to the task of managing assaults such as the one authorities are currently investigating. That said, it is not clear whether the UK’s laws on sexual offences are up to the task of tackling anything.
The temptation is to call for a new law, but is one actually needed? What would be useful is for some kind of expert taskforce to be convened to consider such issues and make recommendations – can new guidance be issued on how to apply current laws to these situations? Or is a new law essential?
The UK’s criminal justice system is failing, but it’s not doing so through any shortage of new laws being passed. Prosecutors are, if anything, spoiled for choice. It is better to think before we act, and go for what works over gimmicks.
More importantly still, it is worth remembering where most online harms happen. Fewer than 500,000 people a month globally are using metaverse apps, while almost five billion of us use some form of social media. To protect most people, we should be keeping our eyes where the users are: Facebook, Insta, X and TikTok. The metaverse can, by and large, wait.