Lab 3: Locomotion Adventure
Revised 9/14/2023 12 PM
The goal of this lab is to give you experience working with VR in Unity, setting up locomotion, and creating triggers to transition between scenes.
Part 1 will take you through the setup process and create a simple scene to build and deploy on the headset.
Part 1: Setup Unity for VR
We will be using the built-in Unity XR framework for creating our app today. Open Unity Hub and create a new project. This time, scroll down and select VR Core in the list of templates. Click on the download button that appears on the right.
After the template is downloaded, go ahead and name your project and select a location to save it. I recommend naming your project <YOUR NAME>_Locomotion_Adventure. Once you are happy with the name and location, click Create Project.
When the editor opens, it will welcome you to the template project. Say "thank you" and then click Close. On the right should be a Tutorials tab. Right-click on it and select Close Tab. We don't need no stinkin' tutorials!
Configure Project for Quest 2
The first thing you need to do is make sure the project will work with the Meta Quest 2 headset. There are a few steps you'll need to configure, so pay attention!
In the top menu, go to Edit → Project Settings... Scroll down until you find XR Plug-in Management and select it. On the right, click the Android tab. Check the box for OpenXR.
Info
OpenXR is an open standard that allows for cross-platform AR/VR/MR development across multiple devices and OSes, including the Meta Quest and even desktop VR. Any device that implements the OpenXR standard is able to run apps developed for OpenXR. We have dumped Meta's proprietary Oculus Integration Toolkit for something that should be much better! Take that, corporate greed!
Once OpenXR finishes installing, it should open up the Project Validation page, which is underneath the XR Plug-in Management section. Make sure you are on the Android tab and click Fix All. This should fix everything except a warning about "interaction profiles".
Go ahead and click on Edit next to that item. We need to select the controllers to use for our VR app! Under the Interaction Profiles section, click on the plus (+) symbol. Select Oculus Touch Controller Profile in the menu that appears.
Now, check Meta Quest Support in the OpenXR Feature Groups section.
Close the Project Settings, and go to File → Build Settings... in the top menu. In the Platform list on the left, select Android. Now click on Switch Platform in the lower-right.
That's it! Your project is now set up for the Meta Quest 2.
Import the XR Interaction Toolkit
We need to include the XR Interaction Toolkit, which helps us add locomotion and interaction to the scene.
In the top menu, go to Window → Package Manager. Inside the Package Manager, in the upper-left, click the Packages drop-down and select Unity Registry:
In the list on the left, scroll to the bottom and select XR Interaction Toolkit. Click the Install button in the lower-right.
Wait for the toolkit to install. If asked to restart to enable the new input backends, select "Yes".
Once Unity starts back up, open the Package Manager again. Find the XR Interaction Toolkit and select it. Now on the right, expand the Samples item. Import the Starter Assets, XR Device Simulator, and Tunneling Vignette samples.
Once that's done, close the window.
There is one tiny bug with the VR template. In addition, it doesn't support any locomotion or interaction. So let's replace the XRRig that's in the Hierarchy with one from the XR Interaction Toolkit. Select the existing XRRig and delete it (press the delete key or use right-click → Delete). Also delete the XRPlatformControllerSetup and the XR Interaction Manager.
In the Project View, search for XR Interaction Setup (or go to Assets/Samples/XR Interaction Toolkit/<VERSION>/Starter Assets/Prefabs/). Add that prefab to your scene by dragging and dropping it in an empty space in your Hierarchy. Expand the XR Interaction Setup in your Hierarchy and select the XR Origin (XR Rig) object.
In the Inspector, you should see an XR Origin component. Change the Tracking Origin Mode to Floor. This will make the position of the floor in the virtual world match the floor in the real world.
Using the XR Device Simulator
First, we need to activate the simulator. In Edit → Project Settings..., expand the XR Plug-in Management section. Select the XR Interaction Toolkit option underneath that. Then, on the right, check Use XR Device Simulator in scenes.
Now, when you click the play button at the top, a little window should pop up showing you the controls for the simulator.
The basic controls are similar to the Scene View in Unity; use the WASD keys to move, and move the mouse to rotate the current device. Click and drag the right mouse button to look around. Pressing the Tab key will switch through the controllers, allowing you to control them and simulate their button and thumbstick presses. You can also hold Left Shift to control the left controller and hold Space to control the right controller.
See the Unity Documentation for more details.
Click the play button again to stop Play Mode and return to the Scene View.
Part 2: Add Locomotion and Interaction
Great! Now you have both locomotion and interaction set up. Aren't prefabs convenient? By default, the left controller is set to use direct movement when moving the thumbstick, and the right controller is set to use teleportation when pushing up on the thumbstick. You also have a number of different interaction techniques set up (including a "poke", which is triggered by a little finger attached to the controller model). We'll use these in the future!
You can see all the locomotion and interaction pieces expanded under the XR Interaction Setup:
If you click on the Left Controller and then go to the Inspector, you can see the Action Based Controller Manager. This is responsible for mapping the controls from the physical controllers to the actions in the app. Generally, we don't want to change the Interactors or Controller Actions. Under the Locomotion Settings section, you can see an option called Smooth Motion Enabled. Hover your mouse over this option to see a description of what it does.
The Smooth Motion Enabled option determines whether the controller will use teleportation or avatar/continuous movement. The Smooth Turn Enabled option determines whether pushing the thumbstick left or right will instantaneously rotate the view by increments or continuously rotate while the stick is pushed (to avoid vection, do NOT enable smooth turning).
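These two options can also be flipped at runtime. The sketch below assumes the Starter Assets' ActionBasedControllerManager script exposes smoothMotionEnabled and smoothTurnEnabled properties, as it does in recent XR Interaction Toolkit versions (the script's namespace varies by toolkit version, so check the sample source if the compiler complains).

```csharp
// Sketch: toggling the locomotion style in code. Assumes the Starter Assets'
// ActionBasedControllerManager (drag the Left Controller into the field).
using UnityEngine;

public class LocomotionToggle : MonoBehaviour
{
    [SerializeField] ActionBasedControllerManager controllerManager;

    public void UseSmoothMotion(bool useSmooth)
    {
        // true = continuous movement, false = teleportation.
        controllerManager.smoothMotionEnabled = useSmooth;
        // Leave smooth turning off to avoid vection, per the note above.
        controllerManager.smoothTurnEnabled = false;
    }
}
```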
The GameObjects underneath the Locomotion System in the Hierarchy contain different parameters for the turning, moving, and teleportation systems. Feel free to click on each of these and explore the options in the Inspector by hovering your mouse over them.
Creating a Teleportation Area
Click the play button and try testing the locomotion systems. To move with the left controller, press Tab once to select the left controller. Then, you can use the WASD keys to simulate pressing the thumbstick up, left, down, and right. To teleport with the right controller, press Tab again to switch to the right controller. Then, you can aim the controller with the mouse and use the W key to simulate pressing up on the thumbstick to teleport.
Right now, you can't teleport anywhere. Why? There are no teleport destinations set! We have a large plane in the scene. It would be great if we could teleport anywhere on that plane. Currently, the controllers are only set up to teleport to areas that are on the Teleport interaction layer. What this means is that we can put objects on these different "layers" to tell Unity what is and is not teleportable.
Make sure to stop Play Mode by using the play button at the top. In the Hierarchy, expand the Left Controller. You will see a Teleport Interactor under it.
Select the Teleport Interactor. In the Inspector, you should be able to see an XR Ray Interactor component. When the user wants to teleport somewhere, they hold up on the thumbstick and a line appears from the controller showing them where they will teleport to. This ray interactor represents that line. At the top, you can see how the Interaction Layer Mask is set to the Teleport layer.
Don't see the Teleport layer here?
If you aren't seeing the Teleport layer (and it says something like Mixed), please click the dropdown and select Add Layer. In the Inspector on the right, you should see the Interaction Layer Settings. Expand the Interaction Layers list, then in User Layer 1, type in Teleport. This will create the new layer.
Now, you can go back to the Teleport Interactor and change its Interaction Layer Mask. First select the Nothing option in the dropdown to remove all the other layers. Then, select Teleport in the dropdown to check only that layer.
Under Raycast Configuration, you can see that the Line Type is set to Projectile Curve, which means that the teleport line will follow an arc. You can change the line type to suit your needs. Typically, free teleportation uses projectile curves, while node-based teleportation uses straight lines.
If you scroll down, you can see the XR Interactor Line Visual, which actually draws the line on the screen. You can change the width and color of the line here, as well as the "landing point"/reticle that shows up at the teleport destination.
For now, let's go ahead and select the Plane object in the Hierarchy.
In the Inspector, scroll to the bottom and click Add Component. Search for Teleportation Area and add it to this object.
We need to set our plane to be on the correct layer. In the Teleportation Area component, click on the drop-down next to Interaction Layer Mask. First select the Nothing option to remove it from the Default layer. Next, click the drop-down again and this time select the Teleport layer. It should be the only option selected:
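The same setup can be done from a script, for example to make objects teleportable as they are spawned. This sketch assumes the Teleport interaction layer already exists (created in the steps above).

```csharp
// Sketch: adding a Teleportation Area in code and putting it on only the
// Teleport interaction layer, matching the Teleport Interactor's mask.
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

public class MakeTeleportable : MonoBehaviour
{
    void Start()
    {
        // Add a Teleportation Area to this GameObject (e.g. the Plane)...
        var area = gameObject.AddComponent<TeleportationArea>();
        // ...and restrict it to the Teleport interaction layer.
        area.interactionLayers = InteractionLayerMask.GetMask("Teleport");
    }
}
```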
You have now created a teleportation area! The right controller can be used to teleport to any part of this area. Enter Play Mode (by pressing the play button at the top) and try testing out the teleport. When you're done, make sure to exit Play Mode (by pressing the button again).
Teleportation Areas vs. Anchors
A teleportation area denotes geometry in the world that the user can teleport to. All children of a GameObject with a Teleportation Area component are considered part of this area. The user can freely teleport to anywhere in the area.
On the other hand, a teleportation anchor represents node-based teleportation. When trying to teleport to a GameObject with a Teleportation Anchor component, the user will instead be teleported to the center of the GameObject or a predefined position, as opposed to wherever they choose.
For more info on locomotion in the XR Interaction Toolkit, see the Unity Documentation.
Making Continuous Movement (More) Comfortable
Right now, the continuous movement with the left controller is vection-inducing. We can add a vignette to reduce the field of view when moving.
First, expand the objects in the Hierarchy so that you can see the Main Camera, underneath Camera Offset. Next, in the Project View, search for TunnelingVignette. You should find a few results, one of which is TunnelingVignette.prefab. This is the one we want. You can click each item in the Project View to see its full path at the bottom.
We want to add this prefab as a child of the Main Camera. Go ahead and drag the prefab from the Project View and drop it onto the Main Camera GameObject.
Select the TunnelingVignette object you just added in the Hierarchy. In the Inspector, scroll down until you find the Tunneling Vignette Controller. This contains settings to configure the size of the vignette, how fast it appears, and the color! You can adjust these later if you'd like.
To make the vignette appear when the user uses continuous movement, we have to add it as a "locomotion provider". In the Locomotion Vignette Providers section, click the plus (+) symbol. This will create a new entry. To the right of Locomotion Provider, click the circle with a dot in the middle to open the object selector.
In the object selector, we want to select the Move object:
That's it! Enter Play Mode and test out the movement again (remember to use the Tab key to switch between the headset, left controller, and right controller). You should see a vignette appear when you move!
Part 3: Build and Test on the Headset
Now let's try testing it on the headset! Each headset comes with a USB-C to USB-C cable. If you do not have a USB-C port on your computer, there should be a small USB-C to USB Type A adapter in the box. PLEASE DO NOT LOSE THESE!
Import Dr. Wang's VR Build Menu
I've created a helpful utility to help make deploying to the headset easier. Download this package: buildhelper.unitypackage.
Import it into your project. Remember, you can do so by going to the Project View, right-clicking the Assets folder, then selecting Import Package → Custom Package... Find the package you just downloaded and open it. When a window pops up showing the package contents, make sure all of the files are selected and then click Import.
This will add a new menu to Unity called VR Build. The menu contains options to assist with manual deployment.
Add Scenes to Build
Open the Build Settings window by going to File → Build Settings... Click Add Open Scenes to add your current scene to the build.
Keep the build window open for the next step.
(Optional) Rename Your App
Click on the Player Settings... button in the lower-left of the Build Settings window, or go to Edit → Project Settings... and select the Player section. This will open up the Player Settings.
Here, you can change the Product Name to whatever you'd like.
Turn Off the XR Device Simulator
For some reason, the XR Device Simulator will make it so that your app no longer works on the headset. Thus, we must turn off the simulator before building!
You can quickly get to this setting by going to VR Build → Open XR Device Simulator Settings... Then, on the right, uncheck Use XR Device Simulator in scenes.
You can also reach it by going to Edit → Project Settings..., expanding the XR Plug-in Management section, and then selecting the XR Interaction Toolkit option underneath that.
Turn it back on when you need to test your app.
Build and Deploy to the Headset
First, you need to build an apk and save it to a folder. You can do so from File → Build Settings... and then clicking on Build, or instead go to VR Build → Build APK Only...
This will ask you for a location to save the "apk" file (which contains the built application). Create a folder in your project called Build and save the apk inside as locomotion.apk.
This may take a few minutes the first time. Once your app is built, you can follow the Manual Deployment Steps below when you have access to a headset.
Building and Running
Use the cable (and adapter) to connect the headset to your computer. Next, put on the headset. You should see an acknowledgement of some kind asking you if you would like to allow USB debugging. Go ahead and click "Allow" (or "Always allow from this computer").
Open the Build Settings window (use the File and VR Build menus). Now, next to Run Device, click Refresh. Then, in the drop-down menu, look for an Oculus Quest 2 device and select it. (If you do not see it, refresh and try again. If that doesn't work, skip down to the manual deployment section.)
Once you have selected the headset, click Build and Run. Unity will ask you for a location to save the "apk" file (which contains the built application). Create a folder in your project called Build and save the apk inside as locomotion.apk. This will build the app and automatically deploy it on the headset.
Manual Deployment Steps
Once the apk is built, connect the headset to your computer. Click VR Build → View Connected Devices. You should see a device listed. If the device says "unauthorized", put on the headset. You should see an acknowledgement of some kind asking you if you would like to allow USB debugging. Go ahead and click "Allow" (or "Always allow from this computer").
Now, go to VR Build → Deploy APK to Headset... In the file picker that opens, select the apk file you built previously. This should now install your application onto the connected headset.
If the above steps do not work, you may need to run ADB (Android Debug Bridge) manually. ADB is a command-line application that allows us to manage Android devices. You can find where the adb executable is installed by going to VR Build → Show ADB Location. Then, using a terminal/command prompt, you can run adb commands to list connected devices with adb devices, and install an app with adb install <PATH TO APK>. Please feel free to ask for help with this step!
Submission
There is no submission for this lab. Credit will be based on attendance.