
This article was published on April 18, 2018

9 ethical problems with VR we still have to solve

VR could be the next big thing, but not until we figure out how actual-reality rules should apply.



Virtual reality (VR) has made some impressive progress in the past few years. Despite a false start a few years back, VR headset sales are improving, with more than a million units shipped in a single quarter for the first time at the end of 2017. Sony, HTC, and Oculus are seeing increased sales, and VR developers are moving beyond gamers to target a broader market.

So far, most VR engineers and developers have been focused on solving problems like how to make a more comfortable, portable headset, and how to decrease the costs of production so headsets are more affordable for the general population. But there are bigger, higher-level problems that still need to be solved in the world of VR — and how we solve them could have a major impact on the future of the industry.

Rather than focusing solely on profitability or user adoption, VR developers should be spending more time navigating these all-too-important ethical dilemmas. This isn't an exhaustive list of the ethical questions VR raises, but here are the nine issues I believe are most pressing and that we need to resolve as soon as possible.


1. User protection

Users may require at least some physical protection, depending on the nature of the headset. Deprived of real-time sensory feedback, users could end up walking into walls or failing to recognize key dangers in their immediate surroundings. There are already some proposed solutions, including steering users along a circular walking arc to simulate straight-line walking without ever crossing an intended boundary, but these still need time to mature.
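That "circular walking arc" approach is generally known as redirected walking: the system injects a small rotation into the virtual scene for every step the user takes, so a straight virtual path curves into a circle that stays inside the physical play area. Here is a minimal conceptual sketch of the idea; the room radius and step length are illustrative assumptions, not figures from any shipping headset.

import math

# Conceptual sketch of redirected walking (not any vendor's API):
# as the user walks forward, the virtual scene is rotated by a small
# angle proportional to the distance walked, so a straight virtual path
# maps onto a circular arc inside the physical play area.

REAL_ARC_RADIUS_M = 7.5  # hypothetical radius of the physical walking circle

def redirect_heading(virtual_heading_rad: float, step_length_m: float) -> float:
    """Return the new virtual heading after injecting a curvature gain.

    Rotating the world by (arc length / radius) radians per step bends the
    user's real-world path into a circle of radius REAL_ARC_RADIUS_M while
    their virtual path stays straight.
    """
    injected_rotation = step_length_m / REAL_ARC_RADIUS_M
    return (virtual_heading_rad + injected_rotation) % (2 * math.pi)

if __name__ == "__main__":
    heading = 0.0
    for step in range(5):
        heading = redirect_heading(heading, step_length_m=0.7)
        print(f"step {step + 1}: injected heading offset = {math.degrees(heading):.1f} deg")

The rotation per step is small enough that, in practice, users tend not to notice it, which is what lets a bounded room stand in for an unbounded virtual space.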

2. User isolation and social effects

We've already seen the rise of technologies capable of fostering genuine addiction. Though rare, some individuals become so consumed by social media and/or video games that they isolate themselves from society to an unhealthy degree. When entire immersive worlds are available to explore, whose job will it be to prevent that from happening?

3. Pornographic content

There’s already some evidence that excessive exposure to pornography could influence harmful behavior toward women. If users engage with pornographic content in an even more realistic environment, with a first-person style of interaction, what effects could that have on violent crime? The problem becomes even more complicated when you introduce the possibility of simulated interactions with real-world people, or the possibility of virtual sexual acts that are illegal in the real world.

4. Virtual crimes

Speaking of crimes, how are we going to handle crimes committed in a virtual world? Today's video game culture is kept at a distance by the veil of screens and controllers; titles like Grand Theft Auto may allow a person's avatar to kill and steal, but using thumb gestures to control an onscreen character is very different from executing a stabbing motion or pulling the trigger yourself in a hyper-realistic environment.

5. Real-world applications

After spending too much time in a virtual environment, it may be difficult for users to return to the real world and behave the same way they did before the virtual experience. They may be desensitized to certain types of violence or interactions, which could damage their social relationships. They may also overestimate their physical abilities, attempting a jump they can’t make or trying a skill they’ve only perfected in a VR environment.

6. In-game trauma

It may not be necessary to live through an event in physical reality to suffer the effects of post-traumatic stress disorder (PTSD). In games that demand tough moral decisions, or experiences that simulate a harrowing ordeal, participants may be left to deal with lasting psychological consequences. How are VR developers going to prevent this, or manage it when it occurs?

7. VR as torture

What if you could inflict trauma on someone in a virtual environment? Would that count as torture? The answer isn’t black-and-white, but it’s a question we need to explore — and one that’s been raised by philosophers.

Military personnel may view VR as a kind of ethical alternative to torture, putting people through horrible experiences without ever inflicting any physical harm. You could make an easy case that this is immoral behavior, but who’s responsible for controlling or stopping it?

8. Virtual travel

VR could help people explore the world, introducing them to new countries and locations they might otherwise never get to visit. But what about sites that severely restrict visitors? Is it ethical to allow someone to remotely visit a site that’s considered holy? Or allow someone to peek around an ex’s apartment? What kind of limits are we going to impose for virtual travel?

9. User privacy

As with most new technologies, we also need to think about user privacy. Users will be able to take more actions and interact with more types of content than ever before, engaging in behaviors they might avoid in the real world, and all of that activity generates data. Who is responsible for ensuring users' privacy, and how could this data be used? Should it be shared with advertisers, or remain under the individual's control?

So without any clear, objectively “correct” answers, how are VR engineers supposed to go about answering these questions?

I propose three simple strategies:

1. Consult: No matter how simple some of these questions might seem, VR developers aren't going to have all the answers on their own. Developers and investors need to work alongside psychologists and philosophers to reach and support their conclusions.

2. Invest: For every dollar we spend on VR tech, we should be spending a dollar on research into the effects of the technology. New studies are necessary to learn how VR could shape our minds and behaviors.

3. Safeguard: If a VR developer doesn't understand the ramifications of a certain feature, they should take measures to protect users just in case; for example, they can downplay the realism of a traumatic scene, offer detailed warnings about the potential effects of a new feature, or build in ways to proactively detect aberrant user behavior.

Until we have a better understanding of the long-term effects of VR, we need VR developers to prioritize these three important steps.
