While there’s plenty of noise about VR in the tech industry, it’s hard to get a handle on how the technology is evolving and what it will mean for gamers in the years to come.
To learn more, I spoke to Frank Soqui, Intel’s GM for virtual reality gaming, about the future of VR and interactive entertainment at this year’s Intel Extreme Masters esports tournament in Katowice, Poland.
The company has been investing in developing numerous technologies for advancing VR experiences, including physics engines and AI, as well as Project Alloy – a reference design for an all-in-one headset that cuts the cord and packs all the necessary computing firepower into a wearable device that could be less cumbersome than those from the current generation.
Soqui, a 35-year veteran at the company, believes that the revolution will begin with the spectator experience before it changes the way we actually play.
TNW: Where does VR fit into the world of esports?
Soqui: Primarily, and perhaps the easiest way of getting there, would be from an audience perspective. So the question is, how do you get audiences to consume online the experience they want to see in esports? Take a look at Sliver.TV’s tech for 360-degree video coverage at live esports events, for example. There are a lot of such technologies in the works – drawing the audience closer to the live experience, as well as into the game.
The other side of it is VR esports. VR is still fairly new; I don’t think the traditional esports players will move immediately to competitive VR games, but rather we’ll likely see a whole new audience for esports that’s interested in a whole new area of gaming. It’s going to be a whole lot more social and collaborative.
TNW: I have my reservations about that; for one thing, I’m not entirely sold on 360-degree video technologies that merely give you more control over seeing the space in which esports competitions are being held (like at a stadium, where you can see the audience and the players). I am, however, interested in tech that brings spectators into the actual game environment.
My second concern has to do with the roughly 45-minute limit on how long you can physically wear a VR headset, both as a player and a spectator. Do you see that changing in the near future?
Soqui: When there’s a compelling experience, I think that the time spent wearing a headset becomes less relevant. Industrial design is going to become much more important as VR tech advances, in order to make the equipment more comfortable to wear over extended periods of time.
I heard about a headset that LG showed off at GDC which looks interesting from an ergonomics perspective (the front of the headset flips up, so you don’t have to take it off to see things around you); Sony’s PlayStation VR headset is also designed for comfort. I think people are improving rapidly on what’s already out there.
The first generation of headsets was rushed out to get the experience into people’s hands, to test whether it’s immersive enough (we’re starting to prove that it is), and how long we can sustain it – and that speaks to both the content available and the wearability of the hardware. Every quarter or so, I’m seeing some new advancement in ergonomics, so you’re not as fatigued. And on the content front, people are developing a wide range of software, from snack-sized experiences that take only 10-20 minutes to games you can play for as long as you want. It’s up to users at the end of the day. And what’s great about this space is that people are extremely vocal; if something’s not working, we hear about it immediately and you can be sure it’s going to be fixed.
TNW: Have you seen any VR games that bring in some level of competition that might develop into full-blown esports titles?
Soqui: Survios’ Raw Data is a collaborative game. I’ve seen early versions of it that feature player-vs-player modes, as well as cooperative play and multi-team play. So those are possibly the beginnings of competitive games in VR.
TNW: PC-based VR experiences are presently much more immersive than those available on mobile devices, thanks to better graphics and a larger field of view. What’s being done to take things to the next level?
Soqui: There’s a lot of effort being put into developing wireless devices that free you from being tethered to a PC, better headset display panels for improved viewing, better audio integrated with the headset, voice-based control schemes, and eye-tracking tech (which is already on notebooks) that will let my physical interactions direct the VR experience more naturally.
Another thing is using more cameras to do things like allowing you to see the physical world around you without taking off the headset (say, if you pause an engrossing horror game that’s starting to feel like a bit much), and also to allow you to use your hands for interacting with the VR space.
TNW: Last year, you talked about how the future of VR is in haptics (among other things). Can you talk about what developments you’ve seen in that space since, and what you’re expecting on that front?
Soqui: I’m seeing simpler experiments so far, like hand-and-glove controls, and vibration feedback for various parts of your body (a bullet to the chest, say). I’m expecting to see sensors embedded in clothing, for two reasons. The first application of that would be to feel sensations – like when, for example, someone taps you on the shoulder. That kind of thing can make you feel like you’re really in the space you’re seeing.
The second application, the bidirectional one, would be – I’ll use a simplistic example: Say you’re under a zombie attack and your heart’s actually racing. The game can sense your heart rate and perhaps reduce the number of zombies that spawn in that level. Or, say there’s a horde of zombies and your heart’s not racing at all – we could put in a bunch more zombies in that case.
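The biofeedback loop Soqui describes – reading the player’s heart rate and scaling enemy spawns up or down to keep the tension in a target band – can be sketched in a few lines. This is only an illustrative sketch; the function name and the BPM thresholds are hypothetical, not anything Intel has shipped.

```python
# Hypothetical sketch of heart-rate-driven difficulty adjustment.
# Thresholds are illustrative assumptions, not real product values.

RESTING_BPM = 70   # assumed calm baseline
TARGET_BPM = 110   # assumed "heart racing" level the game aims for

def adjust_spawn_count(base_spawns: int, heart_rate_bpm: int) -> int:
    """Scale the number of zombies to spawn based on measured heart rate.

    If the player's heart is already racing, thin the horde;
    if they're unfazed, add more zombies to raise the tension.
    """
    if heart_rate_bpm >= TARGET_BPM:
        # Player is already stressed: halve the horde (at least one zombie).
        return max(1, base_spawns // 2)
    if heart_rate_bpm <= RESTING_BPM:
        # Player is calm: double the horde.
        return base_spawns * 2
    # In the target band: leave the difficulty alone.
    return base_spawns
```

Because the input (the player’s reaction) feeds back into the output (what the game throws at them), no two playthroughs need ever be the same – which is the point Soqui makes next.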
This is where I see things going with haptics: not only introducing new sensations based on what’s happening in the VR experience, but also having my reactions in the game translate into how the game interacts with me. And the reason I like that kind of use of artificial intelligence in games is that it keeps things fresh. My reactions change the way the game plays, so it’ll never be the same a second time around.
However, games need to be built from the ground up to support such features; you can’t retrofit them onto existing games and expect them to work well. I’ve seen smaller companies try that, and it doesn’t translate well.