Better than Locomotion - Scopes in VR

Since the dawn of modern VR in 2013, there has been no shortage of FPS games: mostly ports of existing PC titles, piped into a headset but still using WASD or a gamepad to move players at breakneck speeds. The first Oculus Rift backers were certainly convinced that FPS would work perfectly in VR, only to discover the nauseating truth about locomotion and simulation sickness. The now-infamous Steam Dev Days conference of 2014 heralded new advances in framerates, low-persistence screens, and positional tracking, all developed to minimize the well-documented side effects of the Oculus DK1.

Just as notable were the appeals to developers to follow a newly-minted bible of best practices. Innovative hardware research means nothing if game designers break the rules, so these points were repeated for emphasis, and then published in the Oculus Rift Best Practices Guide.

First on the chopping block were the FPS games we all expected to define virtual reality. The dominant form of 3D gaming for decades, First Person Shooters fall apart in VR. Locomotion, aiming, jumping, climbing stairs, HUD, inventory...these critical elements proved to be pitfalls with no easy workarounds. Developers were asked to find other kinds of content, or at the very least to strap players inside a cockpit where motion sickness was somewhat less likely.

The singular achievement in bringing First Person Shooters closer to reality turned out to be motion controllers, specifically the wands for the HTC Vive and the Touch controllers for the Oculus Rift. Holding a gun naturally in your hand, aiming down iron sights, and pulling a trigger on the bottom of the controller proved to be a disturbingly natural interaction. Innovative reload mechanics started to blur the line between gaming and simulation, but an old limitation reared its head once again.

Locomotion.

Even the most advanced room-scale VR on the planet still confines players to less than 15 feet in any direction; hardly enough to leave the starting pen of a Counter-Strike match. Realistically, the space most users have in their homes is barely enough for two steps, so nothing short of an untethered arena can translate room-scale into the map sizes players have grown accustomed to.

With a few exceptions, teleportation became the go-to mechanic for getting around a map. It provides a means of exploration with minimal motion sickness, but sadly breaks immersion almost every time the player blinks across the map. Given the choice, many developers opted to keep the player confined to a single area for the duration of the game. Thus you will find a large and rapidly growing list of “wave shooters” where the player is attacked by oncoming hordes...often zombies...but must hold their ground because movement is not allowed. Screen resolution and player aiming skill only allow accuracy up to a few dozen feet away, so the “waves” of enemies spawn at a distance and slowly (or quickly) advance on the player until they are in range.

This is where the Sniper Scope comes in.

In April of 2016, I attended a VR Hackathon with Victor Brodin with the simple goal of throwing together a fast new prototype for the HTC Vive and scouting for some local talent to add to our VR team (mission accomplished!). The night before the competition, inspiration struck, and we instantly agreed that a sniper game was something we could pull off in 36 hours. We set to work, and “The Nest” was born.

The reception to our prototype and subsequent Steam release continues to be the biggest surprise of my career. Virtually everyone comes away thrilled with how natural the scope feels. The immersion is beyond what most people expect, due in part to the physical “rifle stock” that one of our team members assembled out of a 2x4, cardboard and some duct tape. Players hold this “stock” with both hands, and almost universally close one eye to aim down the adjustable zoom scope. The extra effort we put into the zoom controls resulted in the ability to magnify enemies who are hundreds of feet away until they take up almost your entire field of view, before squeezing the trigger and going for a kill.

The realization was crystal clear. This is a VR locomotion system.

What is gained from the zoom scope is the ability to view distant objects with amazing detail, and interact with enemies who are only a few pixels tall at normal resolution. Moving to that enemy would require breaking immersion or risking motion sickness, while having that enemy move to you would add to the already saturated market of wave-based shooters. Using a scope, players train themselves to survey the wide-view of the environment through their non-dominant eye, while panning and zooming with the scope to track targets, investigate interesting objects, and obviously to kill robots.
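One way to sketch that scope behavior is to render the scene a second time through a narrow-FOV camera and draw the result onto the lens geometry. The relationship between magnification and the scope camera's field of view is simple trigonometry (the function below is my own illustration, not code from The Nest or any particular engine):

```python
import math

def scope_fov(base_fov_deg, magnification):
    """Field of view for a scope's render camera.

    An M-power scope makes targets appear M times larger, which
    corresponds to shrinking the tangent of the half-angle by 1/M:
        tan(scope_fov / 2) = tan(base_fov / 2) / M
    """
    half = math.radians(base_fov_deg) / 2.0
    return math.degrees(2.0 * math.atan(math.tan(half) / magnification))

# A 10x zoom through a 90-degree eye view needs roughly
# an 11.4-degree scope camera.
narrow = scope_fov(90.0, 10.0)
```

Feed that angle to the scope camera each frame as the player adjusts zoom, and an enemy hundreds of feet away can fill the lens while the surrounding world stays at normal scale.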

In terms of raw immersion, the prop rifle stock is actually much less important than the image in the scope. I call this the “second screen effect,” and theorize that we spend so much time looking at interactive screens that our brains are very willing to accept small screens inside VR. Job Simulator has a replica PC where the player empties a fake email inbox, controlled by a fake mouse on the desk. Looking at the low-res monochrome CRT on the desk and moving a virtual mouse has a noticeable “presence multiplier,” due to how accustomed we are to such mundane tasks.

Another fan favorite is Budget Cuts, slated to launch as a full game later this year. They have a novel teleportation mechanic that turns your hand into a “preview” of the location that you are about to teleport to, and this “preview” is basically a second screen, showing a video feed of where you are headed. Players fire a projectile that sets the destination, and at the click of a button the “preview” image expands until you are now fully in the new space.

So how do you take advantage of this in your VR experiences?

I don’t expect a sudden influx of sniper games, though I have heard a great deal of enthusiasm for big game hunting, WWII bell-tower defenses, and counter-terrorism simulations. The most common (and distasteful) suggestion is a level set in Dallas in 1963, which will most likely surface as a mod if a developer doesn’t deliberately build a game around it. All of these are inevitable, as they are literal uses for an actual rifle scope.

However, these are not the only areas that can take advantage of a zoom scope. A talented designer could make a game where you are a paparazzo trying to catch the perfect shot of a celebrity, a detective game where you need to gather evidence of some sinister plot, or a submarine game where a functioning periscope is your only way to see above the waves. Telescopes, binoculars, camcorders; these can all take advantage of variable zoom levels to let stationary players explore distant environments.

Games and non-gaming experiences alike should take advantage of the ubiquity of cell phones and put them into the game as mini-maps, UI consoles, or chat windows. If we just pretend that cell phones have good zoom capabilities, players can look into the distance through the small screen without being teleported from place to place, and presence stays intact.

There is no one-size-fits-all locomotion solution for VR right now. Some players have cast-iron stomachs and can play existing FPS ports in VR with no nausea at all. Comfort turning, instant acceleration, vignetting screen edges; these are all ways to reduce some instances of simulator sickness...but not all. There are successful “dash” mechanics where the player quasi-teleports, but can actually see a brief suggestion of the forward movement, and less successful experiments with 3rd-person swap where the player pilots an avatar to a new location before the camera teleports there.

As long as players are standing, the least dangerous method seems to be teleportation. It breaks immersion, but it accomplishes the utility of moving the player without causing nausea. My recommendation, however, is to augment this method with a player-controlled screen that gives a detailed view of distant objects. These in-VR screens are a presence multiplier, and players feel much less stationary when they can zoom across the map to see an objective, or an enemy.

In our case, we are thrilled with the reception to The Nest, and are working to blend the scope with other locomotion as we expand the broader game. The Sniper mechanic feels almost perfect, but we have only scratched the surface, and hope other VR designers will try it as well.