This article is aimed at designers and developers of Virtual Reality (VR) and 3D configurators who build room-scale VR experiences but rarely get the chance to demo them to many users.

We had the opportunity to set up a fully equipped Vive with controllers in our Unity office and run lots of demos for a variety of interested people. So far, more than 100 people have had their first room-scale VR experience with us. It is a lot of fun to see how people react and what they like best. Along the way we observed some consistent behaviours, optimization potentials, and recurring points of user confusion, which we would like to report on here.

First of all we would like to give you a short summary:

  • Don't force users to do anything before they have oriented themselves. Give them time to look around, just like in real life.
  • Always explain unusual button assignments. Otherwise, users will think something's wrong – they won't keep experimenting with the controllers until they figure out the controls.
  • Use animated text or audio cues to explain interactions, button assignments, and gameplay. Don't use static text unless it's huge.
  • People don't have a perfect sense of their physical self, and VR amplifies that. Drawing a straight line is hard in 2D and even harder in 3D. It's also surprisingly difficult to know where your arm really ends.
  • Give users feedback that they are being tracked and everything is still working. Haptic feedback is helpful here.
  • Try out all the new, crazy things you can think of. VR is a blank page, intuitive user interfaces aren't obvious, and now is the time to think on a larger scale.

The demos we're going to refer to.

Here you will find a list of apps that are used quite often in practice:

  • Tilt Brush: A 3D drawing app.
  • Job Simulator: A game in which you perform mundane everyday tasks.
  • TheBlu: An underwater simulation.
  • Fantastic Contraption: The VR version of the classic 2D construction game.
  • Aperture Robot Repair: A staged cinematic experience with game elements set in the Portal universe.
  • The Gallery: Six Elements: An RPG puzzle/adventure game.
  • Final Approach: Another VR remake, this time of the classic air traffic control game.
  • Chunks: A Minecraft variant with a roller coaster.

Caveats.

Here are some points to keep in mind before you start reading:

  • Due to time constraints, we will only offer anecdotes from a UX perspective here: these are not hard facts.
  • We will not explain the basics of VR UX here. Terms like proprioception and usability should not be alien to you.

Important observations.

Give users time to orient themselves.

Users are usually too busy orienting themselves in their new environment to notice or do anything specific for at least the first 10 seconds, so don't force them to dive straight into the action. Oculus Story Studio mentioned this phenomenon in their talk at SIGGRAPH 2015. For their first short film, Lost, they designed something they called an “in” to recreate the acclimatization that naturally happens when watching movies or TV, such as the lights dimming.

The more accustomed you become to VR, the less time you need to orient yourself, but we suspect the effect never completely goes away. Every new environment demands a certain amount of orientation time, especially when the user has taken on a different body or entered a particularly striking scene.
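
To make this concrete, here is a minimal Unity (C#) sketch of such a settling-in gate, since Unity is the environment we demo in. The component name, the 10-second default, and the onSettled event are our own illustrative choices, not taken from any of the demos above:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Events;

// Illustrative helper: hold back tutorial prompts until the user
// has had a moment to look around the new environment.
public class SettleInDelay : MonoBehaviour
{
    [Tooltip("Rough orientation time observed with first-time users.")]
    public float settleSeconds = 10f;

    // Hook your tutorial start (narration, signs, etc.) in here.
    public UnityEvent onSettled = new UnityEvent();

    IEnumerator Start()
    {
        yield return new WaitForSeconds(settleSeconds);
        onSettled.Invoke();
    }
}
```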

A large percentage of users simply don't read static instructions.

No big surprise – users never read anything. Still, it's a little surprising to see how many people simply ignore instructions, considering how prominent they are in VR.

Fantastic Contraption has handy signs with clear instructions on how to play the game. Tilt Brush goes one step further and includes helpful labels that point directly at your controller and tell you exactly what to do. You literally can't change a brush or a color in Tilt Brush without looking at your hands, yet most people somehow look right past them.

In my experience, there are two types of people: those who read all these instructions and then start creating, and those who don't. The latter group is much, much larger than the first. Those who read tend to get going faster and have more fun. My advice, then, is not to remove the help text, but to figure out how to get users to read it. Animating text, fading it in, or adding larger arrows are all possible options. I'm also a fan of audio cues, which work remarkably well – more on that later.
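
As one way to make the "fade it in" idea concrete, here is a minimal Unity sketch, assuming the instructions live on a standard CanvasGroup; the component name and timing values are illustrative:

```csharp
using System.Collections;
using UnityEngine;

// Fades an instruction panel in over time instead of showing it
// statically; motion draws the eye where static text gets ignored.
[RequireComponent(typeof(CanvasGroup))]
public class InstructionFadeIn : MonoBehaviour
{
    public float delay = 2f;      // let users settle in first
    public float fadeTime = 1.5f; // a slow fade reads as motion

    IEnumerator Start()
    {
        var group = GetComponent<CanvasGroup>();
        group.alpha = 0f;
        yield return new WaitForSeconds(delay);

        for (float t = 0f; t < fadeTime; t += Time.deltaTime)
        {
            group.alpha = t / fadeTime;
            yield return null;
        }
        group.alpha = 1f;
    }
}
```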

Clear instructions are especially important if you use unusual button mappings. Users will forgive having to click around experimentally, but if they haven't noticed a button, they won't try to use it in VR. Instead, they will think that the game is broken, that they did something wrong, or that the game is just lame.

A general caveat: everyone notices a lot of text overlaid on the environment. Whether the message actually sinks in, however, is by no means guaranteed.

The use of VR controllers is often very tricky.

The Vive controllers are new and unusual: they don't feel like an Xbox or PlayStation controller, and they don't have a joystick.

Tilt Brush uses the thumb pads for secondary actions, such as resizing the brush stroke (right hand) and rotating the cube menu (left hand), but in general people only discover that the thumb pads do anything by accident – even though Tilt Brush has clear signs pointing directly at the touchpad and explaining what it does.

In addition, every game has the ability to fully customize the appearance of the controller. This goes beyond changing the button layout, which is what most games can do today: you can actually change how the controllers look in VR. They can appear as weapons, wands, horses, 20-sided dice, or entire life-size houses. In most cases, however, apps keep at least the general shape of the controller, or a simplified version of it, with visible buttons in places equivalent to the real hardware. The other commonly used option is disembodied hands.
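
A minimal sketch of such a visual swap, assuming the SteamVR 1.x Unity plugin (where a SteamVR_RenderModel component draws the stock controller); the ControllerSkin name and the custom child mesh are illustrative assumptions:

```csharp
using UnityEngine;

// Swaps the default controller render model for a custom prop
// (a wand, a tool, whatever fits the app). Assumes the SteamVR 1.x
// Unity plugin, where SteamVR_RenderModel draws the stock controller.
public class ControllerSkin : MonoBehaviour
{
    public SteamVR_RenderModel defaultModel; // stock controller visual
    public GameObject customModel;           // e.g. a paintbrush mesh

    public void UseCustomSkin(bool custom)
    {
        defaultModel.gameObject.SetActive(!custom);
        customModel.SetActive(custom);
    }
}
```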

In The Gallery: Six Elements you get two disembodied gloved hands with which you grab things. Here the grab action is mapped to the grip buttons on the side – the only demo on my list that uses the grip buttons for a primary action. Nobody found this out instinctively, not least because the grip buttons are hard to feel; most people don't even realize they're there. Instead, they try the trigger or the thumb pad, and then conclude that either the controllers are broken, they're doing it wrong, or things in the game simply can't be interacted with.

Chunks uses the small secondary button to switch between controller modes. This quickly proves really useful, but it is unusual.

In Aperture Robot Repair, you have to click the touchpad to grab things remotely. Most people ignore the huge blue flashing button the touchpad has turned into and try the trigger instead, fail, and once again assume that the demo isn't working properly or that those particular drawers simply aren't meant to be opened.

It's also worth noting that long selection beams quickly get shaky when you select or manipulate objects remotely: tiny wrist rotations translate into large movements at the far end of the beam. Try to limit interactions across large distances. Aperture Robot Repair solves this by quickly snapping the object to your hand, much like Ocarina of Time's Hookshot. Gaze-based selection can also work, but it becomes less and less useful at greater distances.
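
A bare-bones version of such a selection beam in Unity might look like the following; the LaserPointer name and the 10-meter cap are illustrative, and the snap-to-hand behavior itself is left out:

```csharp
using UnityEngine;

// A simple laser-pointer selector: cast a ray from the controller,
// draw a beam, and report what it hits. At long range, tiny wrist
// rotations sweep the far end of the beam across large distances,
// which is exactly why distant manipulation gets shaky.
public class LaserPointer : MonoBehaviour
{
    public float maxDistance = 10f; // keep interactions close
    public LineRenderer beam;       // a two-point line renderer

    void Update()
    {
        Vector3 end = transform.position + transform.forward * maxDistance;

        RaycastHit hit;
        if (Physics.Raycast(transform.position, transform.forward,
                            out hit, maxDistance))
        {
            end = hit.point;
            // hit.collider is the selection candidate; snap it toward
            // the hand rather than dragging it around at range.
        }

        beam.SetPosition(0, transform.position);
        beam.SetPosition(1, end);
    }
}
```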

Given that no tracking setup works perfectly yet, it is important to give a little visual, audio, or haptic feedback to confirm to users that they are still being tracked. Apply it liberally and consistently. Haptic and audio feedback in particular feel good and directly contribute to the sense of presence.

In all cases like these, a quick introductory tour of the customized controllers pays off.
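
On the haptic feedback point above, here is a minimal sketch using the SteamVR 1.x Unity plugin's controller API; the component name and pulse length are our own choices:

```csharp
using UnityEngine;

// Gives a short haptic tick whenever a trigger press actually
// registers, so the user knows the system saw it without having
// to look at the controller. Assumes the SteamVR 1.x Unity plugin
// (SteamVR_TrackedObject / SteamVR_Controller).
[RequireComponent(typeof(SteamVR_TrackedObject))]
public class HapticConfirm : MonoBehaviour
{
    SteamVR_TrackedObject trackedObj;

    void Awake()
    {
        trackedObj = GetComponent<SteamVR_TrackedObject>();
    }

    void Update()
    {
        var device = SteamVR_Controller.Input((int)trackedObj.index);

        if (device.GetPressDown(SteamVR_Controller.ButtonMask.Trigger))
            device.TriggerHapticPulse(700); // duration in microseconds
    }
}
```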

Most users' sense of their own movement is somewhat unreliable. VR makes this even clearer.

The two-dollar word we are looking for here is proprioception, and it concerns how we normally gauge distance and the position of our bodies in space. For example, you know how long your arm is: you can easily pick up a glass of water without having to inspect it from all sides.

Usually our proprioception is good – perhaps not precise, but good enough for everyday life. For small, precise movements such as writing or drawing, people are used to 2D surfaces physically supported by a table, desk, or the like. In real life we can't draw sculptures that defy gravity, and even when we can see our movements persist over time – such as the trail left by a sparkler – we generally can't move around them fast enough to see how far off our actions are along the Z-axis.

This gap between what you are doing and what you think you are doing becomes apparent very quickly in Tilt Brush, where attempts to write your name or draw a house turn into Lichtenstein-style sculptures as soon as you step three meters to the side and see what your drawing looks like from a different angle.

In Fantastic Contraption it's less obvious: about a third of users reach for an object with the trigger and miss. This almost never happens in Job Simulator, where the objects are only a few feet away. There may be a small perceptual hiccup in our brains somewhere around the five-foot mark.

In environments where almost everything is interactive, such as Job Simulator, it's fine to let users figure things out for themselves. But in content like The Gallery: Six Elements, where only certain objects can be manipulated, it makes sense for designers to highlight objects once you get close enough to grab them.
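
A proximity highlight like that can be sketched in a few lines of Unity; all names and the half-meter reach below are illustrative assumptions, not taken from The Gallery:

```csharp
using UnityEngine;

// Tints a grabbable object when a hand comes within reach, so users
// can tell interactive props apart from scenery.
public class GrabHighlight : MonoBehaviour
{
    public Transform hand;             // controller transform
    public float reachDistance = 0.5f; // roughly arm's reach, in meters
    public Color highlight = Color.yellow;

    Renderer rend;
    Color baseColor;

    void Start()
    {
        rend = GetComponent<Renderer>();
        baseColor = rend.material.color;
    }

    void Update()
    {
        bool inReach = Vector3.Distance(hand.position, transform.position)
                       < reachDistance;
        rend.material.color = inReach ? highlight : baseColor;
    }
}
```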

Users seem to learn best from audio cues.

On a certain level, that makes sense. Game tutorials are often delivered by someone telling you what to do: in quest form (the old man who shows you how to use your sword), as a narrative device (the boss on the intercom warning you about the poison), or simply as a HUD prompt telling you to press the X or Y button the first time you head into battle.

If immersion is your goal in VR, it can be difficult to justify stopping the action, and it is tempting to simplify the UX so far that no user confusion is possible. Instead, I would recommend trying audio cues in the form of a narrator, a character, or a god-like, all-seeing AI.

Aperture Robot Repair uses its snobbish British robot to guide you through the room and tell you what to do next. It doesn't tell you how to do it, but at least you know you're on the right track when you get audio feedback.

Final Approach takes a similar narrative approach, this time with a cheerful British narrator who breezily explains the basics of selecting a plane and landing it on the tropical island. His instructions are accompanied by large visual elements such as flashing green squares and arrows, but people don't notice them at first.

As for the rest of the demos, we usually provide the audio cues ourselves. For example, in Tilt Brush there is a large sign pointing directly at the thumb pad: “Swipe here”, with an arrow animating from left to right. In my experience, maybe one person ever noticed that sign on their own. The rest discover it by chance or not at all. But as soon as I tell them what to do, they learn it and don't forget it.
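
In code, a stand-in for that spoken nudge could be a simple inactivity timer; this Unity sketch uses illustrative names and assumes you call ActionPerformed() from your interaction logic:

```csharp
using UnityEngine;

// Plays a spoken hint if the user hasn't performed the taught action
// within a grace period – a stand-in for the demo host leaning over
// and saying "swipe the touchpad".
[RequireComponent(typeof(AudioSource))]
public class AudioHint : MonoBehaviour
{
    public AudioClip hintClip;       // e.g. "Swipe here to change color"
    public float graceSeconds = 10f; // how long to wait before hinting

    float idleTime;
    bool played;

    // Call this from your interaction code when the action happens.
    public void ActionPerformed() { idleTime = 0f; played = false; }

    void Update()
    {
        if (played) return;
        idleTime += Time.deltaTime;
        if (idleTime >= graceSeconds)
        {
            GetComponent<AudioSource>().PlayOneShot(hintClip);
            played = true;
        }
    }
}
```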

We look forward to seeing what new, clever ways emerge for teaching game interactions in VR. But so far, a clear, disembodied narrator who introduces the space and the initial tasks seems to be the most useful way to orient the user without breaking immersion.

Encourage users to move as much as possible.

First-time users don't move very much: they're either afraid of stumbling over cables or they simply feel a little strange in a room that doesn't correspond to reality. To counteract this, Skyworld forces users to step off a solid platform into “thin air”, which feels strange but teaches the mechanic well. Fantastic Contraption is impossible to play without moving around to build your contraptions, and the UX is so seamless that users don't notice they are being made to move.

Aperture Robot Repair earns a gold star here above all. By maneuvering the user into a specific position, the demo makes it look as though events unfold procedurally based on the user's position. It feels uncanny, as if the robots know exactly where you are – but that's not the case.

The big open question, of course, is what to do when the user is sitting at a desk, with a high probability of knocking something hot onto something expensive and electronic. In this case, the ability to grab and move objects, or the whole scene, becomes very important. Oculus Medium lets you move and scale your project very quickly simply by grabbing it with both triggers. It feels great and lets you manipulate objects of any size – imagine taking something the size of a city block, shrinking it down to a foot, and rotating it. That saves you a walk around the block. This is the magic of VR.
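
A two-handed grab-to-scale in that spirit can be sketched as follows in Unity; this is not Oculus Medium's implementation, just a minimal illustration with made-up names, with the trigger wiring left to the caller:

```csharp
using UnityEngine;

// While both triggers are held, scale the target by the ratio of the
// current hand separation to the separation when the grab started.
public class TwoHandScale : MonoBehaviour
{
    public Transform leftHand;
    public Transform rightHand;
    public Transform target; // the object or scene root to scale

    float startSeparation;
    Vector3 startScale;
    bool scaling;

    // Call when both triggers go down / when either is released.
    public void BeginScale()
    {
        startSeparation = Vector3.Distance(leftHand.position,
                                           rightHand.position);
        startScale = target.localScale;
        scaling = true;
    }

    public void EndScale() { scaling = false; }

    void Update()
    {
        if (!scaling) return;
        float separation = Vector3.Distance(leftHand.position,
                                            rightHand.position);
        target.localScale = startScale * (separation / startSeparation);
    }
}
```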

Conclusions.

You may have noticed that we haven't talked much about heads-up displays, peripheral interfaces, floating inputs, or the like. The reason is that hardly any of these demos make use of such techniques.

I am reluctant to draw any firm conclusions from that, though. It could simply be that VR is so new that it makes sense to start with the familiar and introduce users to new idioms gradually.

If that's the case, something like Fantastic Contraption's toolbox seems a natural way to test the waters. The toolbox is a floating, purring cat that puffs out pink clouds. It's enchanting, familiar yet strange, introduced at just the right time, and easily one of the most adorable details in any VR game out there.

Like everyone else, I can't wait to see where VR is going, and now is the time to try everything to see what works, what doesn't, and what might work later once users are ready for it. Eventually this behavior will be codified, but VR can and should actively avoid hardening into overblown UI guidelines like the classic “users don't scroll below the fold” stereotypes of the early 2000s.

VR developers have one big advantage: the cost of prototyping new user interfaces, inputs, and mechanics is quite low compared to the cost of designing a new physical mouse, keyboard, or touchscreen. In VR, the same physical controllers can become pistols, lightsabers, laser pointers, slingshots, grappling hooks, spray cans, or anything else you can imagine. And unlike with the keyboard, mouse, and phone, there are almost no fixed conventions for selection and movement.

That's crazy, great, and exciting – so if you do reach for unusual controllers and button assignments, back them up with audio cues or text instructions.

Try it out for yourself and form your own picture.