Unity XR Setup
This guide will help you set up a blank Unity project for VR development.
Compatible Unity Version
Please make sure you have version 2021.3.29f1 or 2021.3.30f1. These are the versions most likely to work correctly with Unity XR.
Part 1: Setup Unity for VR
Create a new project
Open Unity Hub and create a new "3D Core" project. You can reference the steps from HW 1 if you need a refresher.
Make sure you name the project, then click "Create Project" to open it in the Unity Editor.
Install Packages
In order to make your project work with VR, you'll need Unity's "XR Plugin Management" and "XR Interaction Toolkit".
Info
Remember what XR stands for? Extended Reality! Unity's XR plugin system supports AR, AV, and VR. It allows us to create apps for the Meta Quest, Windows Mixed Reality, Magic Leap, and more.
Configure Project for Quest 2
The first thing you need to do is make sure the project will work with the Meta Quest 2 headset. There are a few steps you'll need to configure, so pay attention!
In the top menu, go to Window → Package Manager. In the Package Manager, click the Packages drop-down in the upper-left and select Unity Registry.
In the list on the left, scroll to the bottom and select XR Plugin Management. Click the Install button on the right, and keep this window open. Once that's done, scroll up a bit until you find the OpenXR Plugin. Install this one as well.
Once it's done installing, open Edit → Project Settings... and scroll down to select XR Plug-in Management. On the right, click the tab corresponding to Android. Check the box for OpenXR.
Info
OpenXR is an open standard that allows for cross-platform AR/VR/MR development across multiple devices and OSes, including the Meta Quest and even desktop VR. Any device that implements the OpenXR standard can run apps developed for OpenXR. We have dumped Meta's proprietary Oculus Integration Toolkit for something that should be much better! Take that, corporate greed!
Once OpenXR finishes installing, it should open the Project Validation page, which is underneath the XR Plug-in Management section. If it does not show up, navigate the menu on the left to find Project Validation underneath XR Plug-in Management. (If you are missing this, you likely have an older version of Unity. Please upgrade to one of the recommended versions and try again!)
In this window, make sure you are on the Android tab and click Fix All (you may need to press it twice). This should fix everything except a warning about "interaction profiles". (Sometimes a fix takes a script recompilation to disappear. Go ahead and continue; when you come back to Project Validation, it may be resolved.)
Go ahead and click on Edit next to that item. We need to select the controllers to use for our VR app! Under the Interaction Profiles section, click the plus (+) symbol and select Oculus Touch Controller Profile in the menu that appears.
Now, check Meta Quest Support in the OpenXR Feature Groups section.
Close the Project Settings and go to File → Build Settings... in the top menu. In the Platform list on the left, select Android, then click Switch Platform in the lower-right.
That's it! Your project is now set up for the Meta Quest 2.
Part 2: Import the XR Interaction Toolkit
We need to include the XR Interaction Toolkit, which helps us add locomotion and interaction to the scene.
In the top menu, go to Window → Package Manager. Inside the Package Manager, click the Packages drop-down in the upper-left and select Unity Registry.
In the list on the left, scroll to the bottom and select XR Interaction Toolkit. Click the Install button in the lower-right.
XR Interaction Toolkit Version
Make sure the XR Interaction Toolkit is version 2.4.0 or above! Otherwise, some things in the guide below may look different. You can use the Package Manager to update the toolkit as long as your Unity version is also up-to-date.
Wait for the toolkit to install. If asked to restart to enable the new input backends, select "Yes".
Once Unity starts back up, open the Package Manager again. Find the XR Interaction Toolkit and select it. On the right, expand the Samples item. Import the Starter Assets, XR Device Simulator, and Tunneling Vignette samples. (In version 2.5.x, the Tunneling Vignette sample is included in Starter Assets.)
Once that's done, close the window.
Add XR Origin to Scene
Let's replace the Main Camera that's in the Hierarchy with a prefab from the XR Interaction Toolkit that works with VR. Select the existing Main Camera and delete it (press the Delete key or use right-click → Delete).
In the Project View, search for XR Interaction Setup (or go to Assets/Samples/XR Interaction Toolkit/<VERSION>/Starter Assets/Prefabs/). Add that prefab to your scene by dragging and dropping it into an empty space in your Hierarchy. Expand the XR Interaction Setup in your Hierarchy and select the XR Origin (XR Rig) object.
In the Inspector, you should see an XR Origin component. Change the Tracking Origin Mode to Floor. This will make the position of the floor in the virtual world match the floor in the real world.
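If you ever need to set this from a script instead of the Inspector, a minimal sketch looks like the following (assuming XR Interaction Toolkit 2.x, where the XR Origin component lives in the Unity.XR.CoreUtils namespace; the class name is illustrative):

```csharp
using Unity.XR.CoreUtils;
using UnityEngine;

// Illustrative helper: attach to the XR Origin (XR Rig) object to force
// floor-level tracking at startup, mirroring the Inspector setting above.
public class FloorTrackingSetup : MonoBehaviour
{
    void Start()
    {
        // XROrigin is the same component you just edited in the Inspector.
        var origin = GetComponent<XROrigin>();
        origin.RequestedTrackingOriginMode = XROrigin.TrackingOriginMode.Floor;
    }
}
```

Setting it in the Inspector is the normal workflow; a script like this is only useful if you build rigs procedurally.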
Using the XR Device Simulator
First, we need to activate the simulator. In Edit → Project Settings..., expand the XR Plug-in Management section and select the XR Interaction Toolkit option underneath it. Then, on the right, check Use XR Device Simulator in scenes.
Now, when you click the play button at the top, a little window should pop up showing you the controls for the simulator.
The basic controls are similar to the Scene View in Unity; use the WASD keys to move, and move the mouse to rotate the current device. Click and drag the right mouse button to look around. Pressing the Tab key will switch through the controllers, allowing you to control them and simulate their button and thumbstick presses. You can also hold Left Shift to control the left controller and hold Space to control the right controller.
See the Unity Documentation for more details.
When you're done, stop Play Mode (by toggling the play button at the top) and return to the Scene View.
Part 3: Add Locomotion
Great! Now you have both locomotion and interaction set up. Aren't prefabs convenient? By default, the left controller is set to use direct movement when moving the thumbstick, and the right controller is set to use teleportation when pushing up on the thumbstick. You also have a number of different interaction techniques set up (including a "poke", which is triggered by a little finger attached to the controller model).
You can see all the locomotion and interaction pieces expanded under the XR Interaction Setup.
If you click on the Left Controller and then go to the Inspector, you can see the Action Based Controller Manager. This is responsible for mapping the controls from the physical controllers to the actions in the app. Generally, we don't want to change the Interactors or Controller Actions. Under the Locomotion Settings section, you can see an option called Smooth Motion Enabled. Hover your mouse over this option to see a description of what it does.
The Smooth Motion Enabled option determines whether the controller will use teleportation or avatar/continuous movement. The Smooth Turn Enabled option determines whether pushing the thumbstick left or right will instantaneously rotate the view in increments or rotate continuously while the stick is held (to avoid vection, do NOT enable smooth turning). Feel free to enable/disable these checkboxes to pick a locomotion technique that is suitable for your app (also see the subsections below).
The GameObjects underneath the Locomotion System in the Hierarchy contain different parameters for the turning, moving, and teleportation systems. Feel free to click on each of these and explore the options in the Inspector by hovering your mouse over them.
Create a plane in your scene by right-clicking on an empty area in your Hierarchy and going to 3D Object → Plane. Reset the transform and resize the plane to be big enough to fill your world. This will represent our basic ground for the tutorials. Later on, you may want to delete it depending on how your scene is set up.
Next, set up a locomotion method by following the guides below.
Smooth Locomotion
If you choose smooth movement, then enable the Smooth Motion Enabled option on both controllers. Make sure that all the objects in your scene have "colliders" of some kind (including the floor); otherwise, your player will "clip through" them.
If you wish to change how the player is affected by gravity, locate the Continuous Move Provider and change its settings. The documentation is here: Continuous Move Provider.
(Optional) Comfort/Vignette Movement
Right now, the continuous movement with the left controller is vection-inducing. We can add a vignette to reduce the field of view when moving.
First, expand the objects in the Hierarchy so that you can see the Main Camera underneath Camera Offset. Next, in the Project View, search for TunnelingVignette. You should find a few results, one of which is TunnelingVignette.prefab. This is the one we want. You can click each item in the Project View to see its full path at the bottom.
We want to add this prefab as a child of the Main Camera. Go ahead and drag the prefab from the Project View and drop it onto the Main Camera GameObject.
Select the TunnelingVignette object you just added in the Hierarchy. In the Inspector, scroll down until you find the Tunneling Vignette Controller. This contains settings to configure the size of the vignette, how fast it appears, and its color! You can adjust these later if you'd like.
To make the vignette appear when the user uses continuous movement, we have to add it as a "locomotion provider". In the Locomotion Vignette Providers section, click the plus (+) symbol to create a new entry. To the right of Locomotion Provider, click the circle with a dot in the middle to open the object selector. In the object selector, select the Move object.
That's it! Enter Play Mode and test out the movement again (remember to use the Tab key to switch between the headset, left controller, and right controller). You should see a vignette appear when you move!
Setup Teleportation
Right now, you can't teleport anywhere. Why? There are no teleport destinations set!
It would be great if we could teleport anywhere on that plane. Right now, the controllers are only set up to teleport to areas that are on the Teleport interaction layer. What this means is that we can put objects in these different "layers" to tell Unity what is and is not teleportable.
Make sure to stop Play Mode by using the play button at the top. In the Hierarchy, expand the Left Controller. You will see a Teleport Interactor under it.
Select the Teleport Interactor. In the Inspector, you should see an XR Ray Interactor component. When the user wants to teleport somewhere, they hold up on the thumbstick and a line appears from the controller showing them where they will teleport to. This ray interactor represents that line. At the top, you can see that the Interaction Layer Mask is set to the Teleport layer.
Don't see the Teleport layer here?
If you aren't seeing the Teleport layer (and it says something like Mixed), please click the dropdown and select Add Layer. In the Inspector on the right, you should see the Interaction Layer Settings. Expand the Interaction Layers list, then in User Layer 1, type in Teleport. This will create the new layer.
Now, go back to the Teleport Interactor and change its Interaction Layer Mask. First select the Nothing option in the dropdown to remove all the other layers. Then, select Teleport in the dropdown to check only that layer.
Under Raycast Configuration, you can see that the Line Type is set to Projectile Curve, which means the teleport line will follow an arc. You can change the line type to suit your needs. Typically, free teleportation uses projectile curves, while node-based teleportation uses straight lines.
If you scroll down, you can see the XR Interactor Line Visual, which actually draws the line on the screen. You can change the width and color of the line here, as well as the "landing point"/reticle that shows up at the teleport destination.
Free Teleportation
For free teleportation, you will need to set up some Teleportation Areas. A teleportation area denotes geometry in the world that the user can teleport to. All children of a GameObject with a Teleportation Area component are considered part of the area, and the user can freely teleport to anywhere within it.
Colliders Are Required
Teleportation areas require "colliders" in order to function. Typically, 3D objects come with their own colliders (usually a mesh collider or a box collider), but you can add your own. Unity has different colliders to choose from: Colliders.
For now, let's go ahead and select the Plane object in the Hierarchy.
In the Inspector, scroll to the bottom and click Add Component. Search for Teleportation Area and add it to this object.
We need to set our plane to the correct layer. In the Teleportation Area component, click on the drop-down next to Interaction Layer Mask. First select the Nothing option to remove it from the Default layer. Next, click the drop-down again and this time select the Teleport layer. It should be the only option selected.
You have now created a teleportation area! The right controller can be used to teleport to any part of this area.
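The same setup can be done from a script, which is handy if your app spawns floors at runtime. A sketch assuming XRI 2.x (the Teleport interaction layer must already be defined, and the class name is illustrative):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Illustrative: attach to any object with a collider to make it teleportable.
public class MakeTeleportable : MonoBehaviour
{
    void Start()
    {
        // The plane's built-in MeshCollider satisfies the collider requirement.
        var area = gameObject.AddComponent<TeleportationArea>();
        area.interactionLayers = InteractionLayerMask.GetMask("Teleport");
    }
}
```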
You can test this with the simulator. Enter Play Mode (by pressing the play button at the top) and try testing out the teleport by simulating the thumbsticks on the controllers. When you're done, make sure to exit Play Mode (by pressing the play button again).
Node-based Teleportation
A teleportation anchor represents node-based teleportation. When trying to teleport to a GameObject with a Teleportation Anchor component, the user will be teleported to the center of the GameObject or a predefined position, rather than wherever they choose. You can set up a teleportation anchor the same way as a teleportation area. I recommend using short little cylinders for your teleportation anchors; they work well as "nodes".
Note: If you set up your XR Ray Interactor earlier to work with the Teleport layer, you will also need to set each anchor to the Teleport layer.
You can also configure the settings to change the anchor location or direction. For more info, see the Unity Documentation.
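For reference, spawning a node programmatically looks something like this (a sketch assuming XRI 2.x; the position, scale, and class name are illustrative):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Illustrative: spawns a short cylinder "node" the player can teleport to.
public class SpawnTeleportNode : MonoBehaviour
{
    void Start()
    {
        // CreatePrimitive gives us a mesh plus the collider the anchor needs.
        var node = GameObject.CreatePrimitive(PrimitiveType.Cylinder);
        node.transform.position = new Vector3(2f, 0.1f, 2f);
        node.transform.localScale = new Vector3(1f, 0.1f, 1f); // short cylinder

        var anchor = node.AddComponent<TeleportationAnchor>();
        anchor.interactionLayers = InteractionLayerMask.GetMask("Teleport");
    }
}
```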
Part 4: Build and Test on the Headset
Now let's try testing it on the headset! Each headset comes with a USB-C to USB-C cable. If you do not have a USB-C port on your computer, there should be a small USB-C to USB Type A adapter in the box. PLEASE DO NOT LOSE THESE!
Import Dr. Wang's VR Build Menu
I've created a helpful utility to help make deploying to the headset easier. Download this package: buildhelper.unitypackage.
Import it into your project. Remember, you can do so by going to the Project View, right-clicking the Assets folder, then selecting Import Package → Custom Package.... Find the package you just downloaded and open it. When a window pops up showing the package contents, make sure all of the files are selected and then click Import.
This will add a new menu to Unity called VR Build. The menu contains options to assist with manual deployment.
Add Scenes to Build
Open the Build Settings window by going to File → Build Settings.... Click Add Open Scenes to add your current scene to the build.
If you have other scenes in your project, find them in the Project View and then drag and drop them to this list of scenes.
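If you prefer, the scene list can also be set from an editor script. A sketch, assuming a hypothetical scene path and menu name (place the file in an Editor folder):

```csharp
using System.Collections.Generic;
using UnityEditor;

// Illustrative editor utility: appends a scene to the build list from code.
public static class BuildSceneSetup
{
    [MenuItem("VR Build Extras/Add Locomotion Scene")] // hypothetical menu name
    static void AddScene()
    {
        var scenes = new List<EditorBuildSettingsScene>(EditorBuildSettings.scenes);
        // The path is an example; substitute the path of your own scene asset.
        scenes.Add(new EditorBuildSettingsScene("Assets/Scenes/Locomotion.unity", true));
        EditorBuildSettings.scenes = scenes.ToArray();
    }
}
```

Dragging scenes into the Build Settings window does exactly the same thing.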
Keep the build window open for the next step.
(Optional) Rename Your App
Click on the Player Settings... button in the lower-left of the Build Settings window, or go to Edit → Project Settings... and select the Player section. This will open the Player Settings.
Here, you can change the Product Name to whatever you'd like.
(Important) Turn Off the XR Device Simulator When Building
For some reason, the XR Device Simulator will make it so that your app no longer works on the headset. Thus, we must turn off the simulator before building!
You can quickly get to this setting by going to VR Build → Open XR Device Simulator Settings.... Then, on the right, uncheck Use XR Device Simulator in scenes.
You can also reach it by going to Edit → Project Settings..., expanding the XR Plug-in Management section, and then selecting the XR Interaction Toolkit option underneath it.
Turn it back on when you need to test your app.
Build and Deploy to the Headset
There are two different ways to build and run an APK on the headset. You can use the Build and Run option, which requires a headset to be plugged in, OR you can build an APK and then deploy it manually.
Build and Run
Use the cable (and adapter) to connect the headset to your computer. Next, put on the headset. You should see an acknowledgement of some kind asking you if you would like to allow USB debugging. Go ahead and click "Allow" (or "Always allow from this computer").
Open the Build Settings window (using either the File or VR Build menu). Now, next to Run Device, click Refresh. Then, in the drop-down menu, look for an Oculus Quest 2 device and select it. (If you do not see it, refresh and try again. If that still doesn't work, skip down to the manual deployment section.)
Once you have selected the headset, click Build and Run. Unity will ask you for a location to save the "apk" file (which contains the built application). Create a folder in your project called Build and save the apk inside as locomotion.apk. This will build the app and automatically deploy it to the headset.
Manual Build
First, you need to build an apk and save it to a folder. You can do so by going to File → Build Settings... and clicking Build, or by going to VR Build → Build APK Only....
This will ask you for a location to save the "apk" file (which contains the built application). Create a folder in your project called Build and save the apk inside as locomotion.apk.
This may take a few minutes the first time. Once your app is built, you can follow the Manual Deployment Steps below when you have access to a headset.
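Under the hood, both of these menu items call Unity's build API. A minimal sketch of an editor script that produces the same apk (scene path and menu name are illustrative; place the file in an Editor folder):

```csharp
using UnityEditor;

// Illustrative: builds an Android apk for the Quest from an editor menu item.
public static class ApkBuilder
{
    [MenuItem("VR Build Extras/Build locomotion.apk")] // hypothetical menu name
    static void Build()
    {
        // Example scene path; use the scenes in your own build list.
        var scenes = new[] { "Assets/Scenes/Locomotion.unity" };
        BuildPipeline.BuildPlayer(scenes, "Build/locomotion.apk",
            BuildTarget.Android, BuildOptions.None);
    }
}
```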
Manual Deployment Steps
Once the apk is built, connect the headset to your computer. Click VR Build → View Connected Devices. You should see a device listed. If the device says "unauthorized", put on the headset. You should see an acknowledgement of some kind asking you if you would like to allow USB debugging. Go ahead and click "Allow" (or "Always allow from this computer").
Now, go to VR Build → Deploy APK to Headset.... In the file picker that opens, select the apk file you built previously. This will install your application onto the connected headset.
If the above steps do not work, you may need to run ADB (Android Debug Bridge) manually. ADB is a command-line application that allows us to manage Android devices. You can find where the adb executable is installed by going to VR Build → Show ADB Location. Then, using a terminal/command prompt, you can connect to a device with adb devices and install an app with adb install <PATH TO APK>. Please feel free to ask for help with this step!