Introduction to Best Practices
VR is an immersive medium. It creates the sensation of being entirely transported into a virtual (or real, but digitally reproduced) three-dimensional world, and it can provide a far more visceral experience than screen-based media. Enabling the mind’s continual suspension of disbelief requires particular attention to detail. It can be compared to the difference between looking through a framed window into a room, versus walking through the door into the room and freely moving around.
Overview
If VR experiences ignore fundamental best practices, they can lead to simulator sickness—a combination of symptoms clustered around eyestrain, disorientation, and nausea. Historically, many of these problems have been attributed to sub-optimal VR hardware variables, such as system latency. The Oculus Rift represents a new generation of VR devices, one that resolves many issues of earlier systems. But even with a flawless hardware implementation, improperly designed content can still lead to an uncomfortable experience.
Because VR has been a fairly esoteric and specialized discipline, there are still aspects of it that haven’t been studied enough for us to make authoritative statements. In these cases, we put forward informed theories and observations and indicate them as such. User testing is absolutely crucial for designing engaging, comfortable experiences; VR as a popular medium is still too young to have established conventions on which we can rely. Although our researchers have testing underway, there is only so much they can study at a time. We count on you, the community of Oculus Rift developers, to provide feedback and help us mature these evolving VR best practices and principles.
Note: As with any medium, excessive use without breaks is not recommended for developers, end-users, or the device.
Rendering
- Use the Oculus VR distortion shaders. Approximating your own distortion solution, even when it “looks about right,” is often discomforting for users.
- Get the projection matrix exactly right and use the default Oculus head model (see the sketch after this list). Any deviation from the optical flow that accompanies real-world head movement creates oculomotor issues and bodily discomfort.
- Maintain VR immersion from start to finish—don’t affix an image in front of the user (such as a full-field splash screen that does not respond to head movements), as this can be disorienting.
- The images presented to each eye should differ only in terms of viewpoint; post-processing effects (e.g., light distortion, bloom) must be applied consistently to both eyes and rendered correctly in z-depth to create a properly fused image.
- Consider supersampling and/or anti-aliasing to remedy low apparent resolution, which will appear worst at the center of each eye’s screen.
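As a concrete reference for the projection guidance above, here is a minimal sketch assuming the 1.x-era C API (function names, header locations, and the final parameter vary across SDK versions; consult the headers shipped with your SDK). The point is to take the per-eye projection directly from the SDK's asymmetric FOV description rather than hand-rolling a symmetric frustum that "looks about right":

```cpp
#include <OVR_CAPI.h>
#include <Extras/OVR_CAPI_Util.h>

// DefaultEyeFov encodes the (asymmetric) per-eye frustum the optics expect;
// feeding it to the SDK's projection helper keeps rendering and distortion
// consistent with real-world optical flow.
ovrMatrix4f GetEyeProjection(ovrSession session, ovrEyeType eye)
{
    ovrHmdDesc hmdDesc = ovr_GetHmdDesc(session);
    return ovrMatrix4f_Projection(hmdDesc.DefaultEyeFov[eye],
                                  0.1f,     // near plane, meters
                                  1000.0f,  // far plane, meters
                                  ovrProjection_None);
}
```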
Minimizing Latency
- Your code should run at a frame rate equal to or greater than the Rift display refresh rate, v-synced and unbuffered. Lag and dropped frames produce judder, which is discomforting in VR.
- Ideally, target 20ms or less motion-to-photon latency (measurable with the Rift’s built-in latency tester). Organize your code to minimize the time from sensor fusion (reading the Rift sensors) to rendering.
- Game loop latency is not a single constant and varies over time. The SDK uses some tricks (e.g., predictive tracking, TimeWarp) to shield the user from the effects of latency, but do everything you can to minimize variability in latency across an experience.
- Use the SDK’s predictive tracking, making sure you feed an accurate time parameter into the function call (see the sketch after this list). The predictive tracking value varies based on application latency and must be tuned per application.
- Consult the OculusRoomTiny source code as an example for minimizing latency and applying proper rendering techniques in your code.
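Feeding an accurate time into predictive tracking usually means asking the SDK when the frame will actually reach the display. A minimal sketch, assuming the 1.x-era C API (earlier SDKs expose equivalent ovrHmd_* calls):

```cpp
#include <OVR_CAPI.h>

// Query the pose predicted for the moment this frame will be scanned out,
// rather than the pose "right now" (absTime = 0), so prediction matches the
// app's real motion-to-photon latency.
ovrPosef GetPredictedHeadPose(ovrSession session, long long frameIndex)
{
    double displayTime = ovr_GetPredictedDisplayTime(session, frameIndex);
    ovrTrackingState ts = ovr_GetTrackingState(session, displayTime,
                                               ovrTrue /* set latency marker */);
    return ts.HeadPose.ThePose;
}
```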
Optimization
- Decrease eye-render buffer resolution to save video memory and increase frame rate.
- Although dropping display resolution can seem like a good method for improving performance, the resulting benefit comes primarily from its effect on eye-render buffer resolution.
- Dropping the eye-render buffer resolution while maintaining display resolution can improve performance with less of an effect on visual quality than doing both.
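A sketch of that decoupling, assuming the 1.x-era C API (the quality value is a hypothetical app setting, not an SDK parameter name):

```cpp
#include <OVR_CAPI.h>

// pixelsPerDisplayPixel = 1.0 maps render pixels 1:1 to display pixels at the
// center of the view; values below 1.0 shrink the eye-render buffer (saving
// fill rate and video memory) while the display resolution stays unchanged.
ovrSizei GetEyeBufferSize(ovrSession session, ovrEyeType eye, float quality)
{
    ovrHmdDesc hmdDesc = ovr_GetHmdDesc(session);
    return ovr_GetFovTextureSize(session, eye,
                                 hmdDesc.DefaultEyeFov[eye],
                                 quality); // e.g., 0.8f as a performance mode
}
```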
Head-tracking and Viewpoint
- Avoid visuals that upset the user’s sense of stability in their environment. Rotating or moving the horizon line or other large components of the user’s environment in conflict with the user’s real-world self-motion (or lack thereof) can be discomforting.
- The display should respond to the user’s movements at all times, without exception. Even in menus, when the game is paused, or during cutscenes, users should be able to look around (see the sketch after this list).
- Use the SDK’s position tracking and head model to ensure the virtual cameras rotate and move in a manner consistent with head and body movements; discrepancies are discomforting.
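Structurally, this means pausing the simulation must never pause tracking or rendering. A sketch of that loop shape (all helpers here are hypothetical app functions, not SDK calls):

```cpp
bool appRunning = true;
bool gamePaused = false;

void UpdateHeadTracking(); // hypothetical: sample the Rift sensors
void UpdateSimulation();   // hypothetical: advance world logic
void RenderFrame();        // hypothetical: render both eyes and present

void AppLoop()
{
    while (appRunning)
    {
        UpdateHeadTracking();   // always: menus and cutscenes still track
        if (!gamePaused)
            UpdateSimulation(); // world logic may freeze...
        RenderFrame();          // ...but the view renders every frame
    }
}
```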
Positional Tracking
- The rendered image must correspond directly with the user's physical movements; do not manipulate the gain of the virtual camera’s movements. A single global scale on the entire head model is fine (e.g., to convert feet to meters, or to shrink or grow the player), but do not scale head motion independently of interpupillary distance (IPD).
- With positional tracking, users can now move their viewpoint to look places you might not have expected them to, such as under objects, over ledges, and around corners. Consider your approach to culling, backface rendering, and so on.
- Under certain circumstances, users might be able to use positional tracking to clip through the virtual environment (e.g., put their head through a wall or inside objects). Our observation is that users tend to avoid putting their heads through objects once they realize it is possible, unless they see an opportunity to exploit your game design by doing so. Regardless, developers should plan for how to handle the cameras clipping through geometry. One approach is to trigger a message telling users they have left the camera’s tracking volume (though they may technically still be inside the camera frustum).
- Provide the user with warnings as they approach (but well before they reach) the edges of the camera’s tracking volume, as well as feedback on how they can re-position themselves to avoid losing tracking.
- We recommend you do not leave the virtual environment displayed on the Rift screen once the user leaves the camera’s tracking volume and positional tracking is disabled. It is far less discomforting to fade the scene to black or otherwise attenuate the image (such as by dropping brightness and/or contrast) before tracking is lost. Be sure to provide the user with feedback that indicates what has happened and how to fix it.
- Augmenting or disabling position tracking is discomforting. Avoid doing so whenever possible, and darken the screen or at least retain orientation tracking using the SDK head model when position tracking is lost.
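A sketch of the fade-out approach, assuming the tracking status flags of the 1.x-era C API (FadeToBlack and FadeIn are hypothetical app helpers):

```cpp
#include <OVR_CAPI.h>

void FadeToBlack(float seconds); // hypothetical app helper
void FadeIn(float seconds);      // hypothetical app helper

// Attenuate the image as position tracking drops out instead of letting the
// world lurch; orientation tracking (via the SDK head model) continues.
void HandleTrackingStatus(ovrSession session, double displayTime)
{
    ovrTrackingState ts = ovr_GetTrackingState(session, displayTime, ovrTrue);
    if ((ts.StatusFlags & ovrStatus_PositionTracked) == 0)
        FadeToBlack(0.25f);
    else
        FadeIn(0.25f);
}
```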
Accelerations
- Acceleration creates a mismatch among your visual, vestibular, and proprioceptive senses; minimize the duration and frequency of such conflicts. Make accelerations as short (preferably instantaneous) and infrequent as you can (see the sketch after this list).
- Remember that “acceleration” does not just mean speeding up while going forward; it refers to any change in the motion of the user. Slowing down or stopping, turning while moving or standing still, and stepping or getting pushed sideways are all forms of acceleration.
- Have accelerations initiated and controlled by the user whenever possible. Shaking, jerking, or bobbing the camera will be uncomfortable for the player.
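In code, "as short as possible, preferably instantaneous" typically means stepping velocity in a single frame rather than easing toward it. A hypothetical fragment (Vec3 is a stand-in for your engine's vector type):

```cpp
struct Vec3 { float x, y, z; };

// Step to the commanded velocity in one frame rather than ramping toward it;
// a visual acceleration sustained over many frames, with no matching
// vestibular signal, is the source of the sensory conflict.
Vec3 UpdateVelocity(bool moveHeld, const Vec3& commanded)
{
    if (moveHeld)
        return commanded;           // instantaneous start at full speed
    return Vec3{0.0f, 0.0f, 0.0f};  // and an equally instantaneous stop
}
```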
Movement Speed
- Viewing the environment from a stationary position is most comfortable in VR; however, when movement through the environment is required, users are most comfortable moving through virtual environments at a constant velocity. Real-world speeds will be comfortable for longer—for reference, humans walk at an average rate of 1.4 m/s (see the sketch after this list).
- Teleporting between two points instead of walking between them is worth experimenting with in some cases, but can also be disorienting. If using teleportation, provide adequate visual cues so users can maintain their bearings, and preserve their original orientation if possible.
- Movement in one direction while looking in another direction can be disorienting. Minimize the necessity for the user to look away from the direction of travel, particularly when moving faster than a walking pace.
- Avoid vertical linear oscillations, which are most discomforting at 0.2 Hz, and off-vertical-axis rotation, which is most discomforting at 0.3 Hz.
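A sketch of constant-velocity locomotion pinned to that walking pace (Vec3 is a stand-in for your engine's vector type):

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

const float kWalkSpeed = 1.4f; // m/s, average human walking pace

// Displacement for one frame at constant speed: position changes linearly,
// with no easing curves that would introduce acceleration.
Vec3 MoveStep(Vec3 dir, float dt)
{
    float len = std::sqrt(dir.x * dir.x + dir.y * dir.y + dir.z * dir.z);
    if (len > 0.0f) { dir.x /= len; dir.y /= len; dir.z /= len; }
    return Vec3{dir.x * kWalkSpeed * dt,
                dir.y * kWalkSpeed * dt,
                dir.z * kWalkSpeed * dt};
}
```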
Cameras
- Zooming in or out with the camera can induce or exacerbate simulator sickness, particularly if it causes head and camera movements to fall out of 1-to-1 correspondence with each other. We advise against using “zoom” effects until further research and development finds a comfortable and user-friendly implementation.
- For third-person content, be aware that the guidelines for accelerations and movements still apply to the camera regardless of what the avatar is doing. Furthermore, users must always have the freedom to look all around the environment, which can add new requirements to the design of your content.
- Avoid using Euler angles whenever possible; quaternions are preferable (see the sketch after this list). Try looking straight up and straight down to test your camera; it should always be stable and consistent with your head orientation.
- Do not use “head bobbing” camera effects; they create a series of small but uncomfortable accelerations.
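A minimal sketch of the quaternion approach, using the SDK's OVR::Quatf math type (the Extras header path follows the 1.x SDK layout; it differs in earlier versions):

```cpp
#include <OVR_CAPI.h>
#include <Extras/OVR_Math.h>

// Compose body yaw (from controller input) with the sensor-reported head
// orientation as quaternions; unlike accumulated Euler angles, this stays
// stable when the user looks straight up or straight down.
OVR::Quatf CameraOrientation(const OVR::Quatf& bodyYaw, const ovrPosef& head)
{
    return bodyYaw * OVR::Quatf(head.Orientation);
}
```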
Managing and Testing Simulator Sickness
- Test your content with a variety of unbiased users to ensure it is comfortable for a broader audience. As a developer, you are the worst test subject: repeated exposure to and familiarity with the Rift and your content makes you less susceptible to simulator sickness or content distaste than a new user.
- People’s responses and tolerance to sickness vary, and visually induced motion sickness occurs more readily in virtual reality headsets than with computer or TV screens. Your audience will not “muscle through” an overly intense experience, nor should they be expected to do so.
- Consider implementing mechanisms that allow users to adjust the intensity of the visual experience (see the sketch after this list). This will be content-specific, but adjustments might include movement speed, the size of accelerations, or the breadth of the displayed FOV. Any such settings should default to the lowest-intensity experience.
- Users may want to change settings related to simulator sickness on the fly (for example, as they become accustomed to VR or as they become fatigued). Whenever possible, allow them to change these settings in-game without restarting.
- An independent visual background that matches the player’s real-world inertial reference frame (such as a skybox that does not move in response to controller input but can be scanned with head movements) can reduce visual conflict with the vestibular system and increase comfort (see Tracking).
- High spatial frequency imagery (e.g., stripes, fine textures) can enhance the perception of motion in the virtual environment, leading to discomfort. Use—or offer the option of—flatter textures in the environment (such as solid-colored rather than patterned surfaces) to provide a more comfortable experience to sensitive users.
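One way to make those intensity adjustments concrete is a single settings block that can be changed at runtime. The following is a hypothetical sketch; all fields and defaults are illustrative, not SDK-defined:

```cpp
// Defaults to the lowest-intensity experience; every field should be
// adjustable in-game, without a restart.
struct ComfortSettings
{
    float moveSpeed        = 1.4f;  // m/s; real-world walking pace
    float accelTime        = 0.0f;  // seconds to reach full speed (0 = step)
    float fovScale         = 0.8f;  // fraction of full rendered FOV
    bool  staticBackground = true;  // inertial-reference skybox on
    bool  flatTextures     = true;  // low-spatial-frequency surfaces on
};
```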
Degree of Stereoscopic Depth (“3D-ness”)
- For individualized realism and a correctly scaled world, use the middle-to-eye separation vectors supplied by the SDK from the user’s profile (see the sketch after this list).
- Be aware that depth perception from stereopsis is sensitive up close, but quickly diminishes with distance. Two mountains miles apart in the distance will provide the same sense of depth as two pens inches apart on your desk.
- Although increasing the distance between the virtual cameras can enhance the sense of depth from stereopsis, beware of unintended side effects. First, this will force users to converge their eyes more than usual, which could lead to eye strain if you do not move objects farther away from the cameras accordingly. Second, it can give rise to perceptual anomalies and discomfort if you fail to scale head motion equally with eye separation.
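A minimal sketch of pulling eye separation from the SDK rather than hard-coding it, assuming the 1.x-era C API (the member is named HmdToEyeViewOffset in earlier SDKs and HmdToEyePose in later ones):

```cpp
#include <OVR_CAPI.h>

// Take per-eye offsets from the SDK—these reflect the profile IPD—instead of
// hard-coding a camera separation.
void GetEyeOffsets(ovrSession session, ovrVector3f outOffset[2])
{
    ovrHmdDesc hmdDesc = ovr_GetHmdDesc(session);
    for (int eye = 0; eye < 2; ++eye)
    {
        ovrEyeRenderDesc desc = ovr_GetRenderDesc(session, (ovrEyeType)eye,
                                                  hmdDesc.DefaultEyeFov[eye]);
        outOffset[eye] = desc.HmdToEyeOffset; // half-IPD translation per eye
    }
}
```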
User Interface
- UIs should be a 3D part of the virtual world and sit approximately 2-3 meters away from the viewer, even if the UI is simply drawn onto a flat polygon, cylinder, or sphere that floats in front of the user.
- Don’t require the user to swivel their eyes in their sockets to see the UI. Ideally, your UI should fit inside the middle third of the user’s viewing area; otherwise, the user should be able to examine it with head movements.
- Use caution for UI elements that move or scale with head movements (e.g., a long menu that scrolls or moves as you move your head to read it). Ensure they respond accurately to the user’s movements and are easily readable without creating distracting motion or discomfort.
- Strive to integrate your interface elements as intuitive and immersive parts of the 3D world. For example, ammo count might be visible on the user’s weapon rather than in a floating HUD.
- Draw any crosshair, reticle, or cursor at the same depth as the object it is targeting; otherwise, it can appear as a doubled image when it is not at the plane of depth on which the eyes are converged.
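A sketch of depth-matched reticle placement, using the SDK's math types (RaycastScene is a hypothetical app helper, not an SDK call):

```cpp
#include <Extras/OVR_Math.h>
using OVR::Vector3f;

// Hypothetical app helper: distance to the first scene hit along a ray.
bool RaycastScene(const Vector3f& origin, const Vector3f& dir, float* outDist);

// Draw the reticle at the depth of whatever it targets, so it fuses correctly
// when the eyes converge on that object; fall back to the 2-3 m UI range.
Vector3f ReticlePosition(const Vector3f& headPos, const Vector3f& gazeDir)
{
    float dist;
    if (!RaycastScene(headPos, gazeDir, &dist))
        dist = 2.5f;
    return headPos + gazeDir * dist;
}
```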
Controlling the Avatar
- User input devices can't be seen while wearing the Rift. Allow the use of familiar controllers as the default input method. If a keyboard is absolutely required, keep in mind that users will have to rely on tactile feedback (or trying keys) to find controls.
- Consider using head movement itself as a direct control or as a way of introducing context sensitivity into your control scheme.
Sound
- When designing audio, keep in mind that the output source follows the user’s head movements when they wear headphones, but not when they use speakers. Allow users to choose their output device in game settings, and make sure in-game sounds appear to emanate from the correct locations by accounting for head position relative to the output device.
- Presenting NPC (non-player character) speech over a central audio channel or left and right channels equally is a common practice, but can break immersion in VR. Spatializing audio, even roughly, can enhance the user’s experience.
- Keep positional tracking in mind with audio design; for example, sounds should get louder as the user leans towards their source, even if the avatar is otherwise stationary.
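A minimal sketch of head-relative attenuation; the inverse-distance falloff model and constants here are illustrative, not part of any SDK:

```cpp
#include <algorithm>

// Attenuate by distance from the *tracked head position*, not the avatar
// origin, so leaning toward a source makes it louder. Gain is clamped so it
// never exceeds 1.0.
float SourceGain(float headToSourceDist, float referenceDist = 1.0f)
{
    return std::min(1.0f, referenceDist / std::max(headToSourceDist, 0.01f));
}
```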
Content
- For recommendations related to distance, one meter in the real world corresponds roughly to one unit of distance in Unity.
- The optics of the DK2 Rift make it most comfortable to view objects that fall within a range of 0.75 to 3.5 meters from the user’s eyes. Although your full environment may occupy any range of depths, objects at which users will look for extended periods of time (such as menus and avatars) should fall in that range.
- Converging the eyes on objects closer than the comfortable distance range above can cause the lenses of the eyes to misfocus, making clearly rendered objects appear blurry and leading to eyestrain.
- Bright images, particularly in the periphery, can create noticeable display flicker for sensitive users; if possible, use darker colors to prevent discomfort.
- A virtual avatar representing the user’s body in VR can have pros and cons. On the one hand, it can increase immersion and help ground the user in the VR experience, when contrasted to representing the player as a disembodied entity. On the other hand, discrepancies between what the user’s real-world and virtual bodies are doing can lead to unusual sensations (for example, looking down and seeing a walking avatar body while the user is sitting still in a chair). Consider these factors in designing your content.
- Consider the size and texture of your artwork as you would with any system where visual resolution and texture aliasing are issues (e.g., avoid very thin objects).
- Unexpected vertical accelerations, like those that accompany traveling over uneven or undulating terrain, can create discomfort. Consider flattening these surfaces or steadying the user’s viewpoint when traversing such terrain (see the sketch after this list).
- Be aware that your user has an unprecedented level of immersion, and frightening or shocking content can have a profound effect on users (particularly sensitive ones) in a way past media could not. Make sure players receive warning of such content in advance so they can decide whether or not they wish to experience it.
- Don’t rely entirely on the stereoscopic 3D effect to provide depth to your content; lighting, texture, parallax (the way objects appear to move in relation to each other when the user moves), and other visual features are equally (if not more) important to conveying depth and space to the user. These depth cues should be consistent with the direction and magnitude of the stereoscopic effect.
- Design environments and interactions to minimize the need for strafing, back-stepping, or spinning, which can be uncomfortable in VR.
- People will typically move their heads/bodies if they have to shift their gaze and hold it on a point farther than 15-20° of visual angle away from where they are currently looking. Avoid forcing the user to make such large shifts to prevent muscle fatigue and discomfort.
- Don’t forget that the user is likely to look in any direction at any time; make sure they will not see anything that breaks their sense of immersion (such as technical cheats in rendering the environment).
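For the terrain guideline above, steadying the viewpoint can be as simple as low-pass filtering the camera's vertical position. A hypothetical sketch with illustrative constants:

```cpp
#include <algorithm>

// Smooth the vertical component of the viewpoint so bumps in the terrain do
// not become abrupt vertical accelerations; constants are illustrative.
float SteadyEyeHeight(float terrainHeight, float currentEyeHeight, float dt)
{
    const float kEyeHeight = 1.7f; // assumed standing eye height, meters
    const float kRate      = 2.0f; // smoothing rate, 1/s
    float target = terrainHeight + kEyeHeight;
    float t      = std::min(1.0f, kRate * dt);
    return currentEyeHeight + (target - currentEyeHeight) * t;
}
```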
Avatar Appearance
- When creating an experience, you might choose to have the player experience it as a ghost (no physical presence) or in a body that is very different from his or her own. For example, you might have a player interact with your experience as a historical figure, a fictional character, a cartoon, a dragon, a giant, an orc, an amoeba, or any other of a multitude of possibilities. Any such avatars should not create issues for users as long as you adhere to best practices guidelines for comfort and provide users with intuitive controls.
- When the avatar is meant to represent the players themselves inside the virtual environment, it can detract from immersion if the player looks down and sees a body or hands that are very different from his or her own. For example, a woman’s sense of immersion might be broken if she looks down and sees a man’s hands or body. If you are able to allow players to customize their hands and bodies, this can dramatically improve immersion. If this adds too much cost or complexity to your project, you can still take measures to minimize contradictions between VR and reality. For example, avoid overtly masculine or feminine bodily features in visible parts of the avatar. Gloves and unisex clothing that fit in the theme of your content can also serve to maintain ambiguity in aspects of the avatar’s identity, such as gender, body type, and skin color.
Health and Safety
- Carefully read and implement the warnings that accompany the Rift (see Health and Safety Warnings) to ensure the health and safety of both you, the developer, and your users.
- Refrain from using any high-contrast flashing or alternating colors that change with a frequency in the 1-30 Hz range. This can trigger seizures in individuals with photosensitive epilepsy.
- Avoid high-contrast, high-spatial-frequency gratings (e.g., fine, black-and-white stripes), as they can also trigger epileptic seizures.