Introduction to VR with Unity

Create an immersive Virtual Reality experience on iPhone/Android Cardboard or a VR device with Unity.

Last updated 2022-01-10 | 4.5


What you'll learn

  • Create a 3D VR project targeting a device as simple as an iOS/Android Cardboard.
  • Create immersive VR experiences with panoramic videos.
  • Create interactive VR gameplay with advanced Unity features, including ray casting and navigation (pathfinding).
  • Create interactive heads-up 3D user interfaces.
  • Add support for game controllers and the Cardboard "screen touch" button.
  • Take advantage of Ambisonic audio files.
  • Use Unity Remote to test things in the Editor.
  • Bypass the Unity XR SDKs and use the gyroscope to test things in the Editor with Unity Remote.
  • Take advantage of Unity's Events to trigger actions on interactive objects, including loading scenes.
  • Use Unity's Animator state machine along with trigger colliders to trigger animations when passing by.

Requirements

  • Unity 2017.3 or later.
  • If targeting iPhone: an Apple iOS Developer account, a Mac computer, and Xcode.
  • If targeting Android: the Android SDK (free).
  • Optionally, if targeting iPhone: an MFi (Made For iPhone) Bluetooth game controller.
  • Optionally, if targeting Android: an Android Bluetooth game controller.
  • If targeting Windows/Mac: a supported VR device (Oculus Rift, HTC Vive, Microsoft HoloLens).

Description

This course aims to help anyone who wants to learn Unity to create VR experiences.

No previous programming experience is required, and most of the principles covered in the course will help future programmers wrap their head around programming basics.

It features a self-learning approach: every topic comes in on a need-to-know basis.

Most of the course examples can be done with the simplest hardware.

Whether you want to experiment with a simple Android or iPhone cardboard, add a remote game controller, or go for pro hardware, the principles, techniques and code you'll take away from this course will help you deliver a full VR experience, fast!

Who this course is for:

  • VR enthusiasts willing to learn Unity
  • Unity developers willing to learn VR features

Course content

18 sections • 108 lectures

Introduction Preview 01:53

A short introduction of the author and a brief overview of the content.

Requirements Preview 03:05

VR is spreading fast and Unity's doing its best to make it as generic as possible.

VR is somewhat like a recipe, with mandatory and optional ingredients.

To follow this course, you'll need:

  • A Windows PC or Mac computer matching the minimum requirements to run Unity.
  • Unity 2017.3 or a later version. You can use the free version; every feature we'll use is available in it.
  • An iPhone or Android phone with a Cardboard system, or a VR device you can develop with. Don't plan on using a PSVR unless you have a Sony Developer account.
    • If you want to develop on iPhone, you'll need at least an iPhone 5 running iOS 8 or later. You'll also need an Apple iOS Developer account, a Mac computer, and the latest version of Xcode.
    • If you'd rather target Android, you'll need a phone running Android 4.1 or later with a gyroscope, compatible with Google Cardboard or Daydream (see Google's Daydream hardware requirements). You will also have to install the Android SDK for Windows or macOS.
    • If you plan on developing for a VR device, such as Gear VR, Oculus Rift, HTC Vive, or MS HoloLens, make sure the hardware is supported on your development platform. Most of this hardware is not yet fully supported on Mac.
  • For Cardboard users, I'll also touch on using a wireless controller to implement interactivity and navigation.
    • If you develop on iPhone, make sure to get an MFi (Made for iPhone) game controller.
    • If you have a non-MFi controller, such as a cheap remote or an iCade device, I'll touch on hacking these with iOS.
    • If you develop on Android, make sure the controller is supported.
  • Any other VR-related device can be used. I'll do my best to touch on them as I get to test them. First on the list is the 3D Rudder, a foot controller.

I'll keep a list of hardware for reference here.

ROAD MAP Preview 00:55

An overview of current and upcoming content.

Project Setup & Editor Overview. Preview 06:08

Let's begin with creating a new project and have a quick overview of Unity Editor's interface.

(iPhone/Android/PC/Mac) Build & Player Settings. Preview 04:35

One of the first things to set up in a project is the target platform in the Build Settings, as well as the project's publisher info in the Player Settings.

Creating and placing Game Objects. Preview 07:29

Let's begin the creation of our VR Lobby scene with placing the camera and adding a few objects.

Creating and assigning Materials. Preview 03:03

To differentiate objects, let's assign them a few materials with different colours.

Importing textures. Preview 02:34

Let's now import a texture to use as a panoramic background.

Creating and assigning a SkyBox Material. Preview 04:13

To use the texture as a background, we need to create a special material, called a Skybox.

Adding XR SDKs to use head tracking and stereo camera rendering. Preview 02:11

To take advantage of automatic head tracking and stereo rendering, we simply need to add XR SDKs to the Player Settings.

(iPhone) Building and running the VR test scene on the device. Preview 03:19

Everything's now set up, so let's build our project with Xcode and test it on an iPhone.

(Android) Build & Run. Preview 06:02

To build your project as an "apk" you can push to your Android device, you need to:

  1. Go to Unity's Preferences, then External Tools
  2. In the Android section, click on Download next to the SDK path, and install Android Studio
  3. Then browse for the SDK root folder you just installed.
  4. Then install jdk1.8.0_161 from the link provided here in the resources. (Not using the download button).
  5. Then browse to /Library/Java/JavaVirtualMachines/jdk1.8.0_161.jdk/Contents/Home.
  6. You shouldn't have to, but you can also install android-ndk-r13b using the download button, and browse to its location.
  7. Then run Android Studio, and click "Configure".
  8. Under SDK Tools, expand the SDKs and uncheck 25.0.3, then click on Apply.

Unity should now be able to build the project.

(Windows/Mac) Build & Run. Preview 01:02

Everything's now set up; let's build and run our project on Windows or macOS.

Project Setup Review. Preview 01:32

If you've skipped the intro videos, this video will update you as to what we've done.

Importing Video Files. Preview 01:37

Let's begin with importing 360 video files.

Adding a Video Player to playback video content. Preview 03:23

Unity's built-in Video Player allows us to playback video files over different objects.

Creating a Render Texture to *link* the Video Player to the Skybox Material. Preview 04:24

To render the video in our Skybox background, we're going to need an intermediate texture, called a RenderTexture.
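
A rough sketch of the wiring in code form (in the course this is mostly done in the Editor; field names and the skybox material setup below are illustrative):

```csharp
using UnityEngine;
using UnityEngine.Video;

public class PanoramicPlayer : MonoBehaviour
{
    public VideoPlayer player;      // the Video Player component
    public RenderTexture target;    // the intermediate texture
    public Material skyboxMaterial; // a skybox material using that texture

    void Start()
    {
        // The Video Player writes each frame into the Render Texture...
        player.renderMode = VideoRenderMode.RenderTexture;
        player.targetTexture = target;
        // ...and the skybox material reads from that same texture.
        skyboxMaterial.mainTexture = target;
        RenderSettings.skybox = skyboxMaterial;
        player.Play();
    }
}
```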

Objectives. Preview 02:26

A VR experience very much relies on what you're looking at, or pointing at with a remote controller.

In this section, we're going to introduce Unity scripting and C# standards, to have objects react to their position relative to the screen or camera's position.

Scene Setup. Preview 02:36

Let's begin by adding a few text objects above our selection items.

Using Prefabs. Preview 07:18

Using Prefabs allows us to manage similar objects within and across several scenes.

Creating a new C# script and setting up our development environment. Preview 06:23

Let's create our 1st C# script and setup MonoDevelop for proper C# standards.

Overview of the MonoBehaviour and C# scripting. Preview 05:31

"using", "public", "class", "MonoBehaviour", "void", what does it all mean ?

Accessing and modifying another component. Preview 08:14

Let's begin with accessing the object's Renderer component to change the colour of its Material.

Using a Gradient object. Preview 08:00

Let's use a Gradient property instead of a simple Color.

Using C# Properties to optimise the refreshing of values. Preview 09:06

Calling methods such as gradient.Evaluate() on every frame isn't optimal performance-wise. Let's use a C# property to call it only when the position changes.
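
A minimal sketch of the idea in Unity C# (class and field names here are illustrative, not the course's exact code):

```csharp
using UnityEngine;

public class FocusColor : MonoBehaviour
{
    public Gradient gradient;

    Renderer rend;
    float focus = -1f; // cached value; -1 forces the first refresh

    // The property setter only re-evaluates the gradient when the value changes.
    public float Focus
    {
        get { return focus; }
        set
        {
            if (Mathf.Approximately(focus, value)) return;
            focus = value;
            rend.material.color = gradient.Evaluate(focus);
        }
    }

    void Awake() { rend = GetComponent<Renderer>(); }
}
```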

Referencing another object's (Transform) component. Preview 12:04

In this lecture, we're going to reference another object's Transform component, that we'll use later to measure the object's relative position.

Using Code Snippets (Templates) to easily and quickly add more properties. Preview 08:57

Often, there are things we do all the time. Templates are great to keep our code consistent and type it faster.

Using MonoDevelop Tasks. Preview 03:38

To avoid losing track of what's left to do, Tasks allow us to quickly log remaining work with comments using special keywords.

Vectors and Visual Debug features. Preview 12:40

To have a better understanding of Vector maths, we're going to use some visual debug features and draw lines in the Scene View.

More maths, with inverse linear interpolations. Preview 05:49

If 10 means 1 and 30 means 0… Wait! It's easy: Unity's Mathf helper class does it all for you.
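
For instance, Mathf.InverseLerp maps a value from a range back to [0, 1] (the variable name here is just for illustration):

```csharp
// Mathf.InverseLerp(a, b, value) returns (value - a) / (b - a), clamped to [0, 1].
// With a = 30 and b = 10: a distance of 30 gives 0, and 10 gives 1.
float t = Mathf.InverseLerp(30f, 10f, distance);
// e.g. distance = 20 gives t = 0.5
```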

Using Unity Events to drive any other value. Preview 13:34

We're now going to add Unity Events to the Object Focus component, so that we can connect its values to other values in the Editor.

Using Animation Curves to ease in and out the linear interpolation. Preview 08:21

Our interpolation so far is linear. To ease it in and out, we're going to use Unity's built-in Animation Curves, which not only make it easier but also provide more control.

Object Focus Manager. Preview 04:31

To easily trigger actions on the object that is the most in focus, we're going to create a new manager script to look after all items.

Implementing the manager as a Singleton. Preview 09:18

A Singleton is a design pattern where a component has a unique instance that can easily be accessed at any time.
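
A minimal sketch of the pattern for a MonoBehaviour (the class name is illustrative):

```csharp
using UnityEngine;

public class FocusManager : MonoBehaviour
{
    // The single shared instance, readable from anywhere.
    public static FocusManager Instance { get; private set; }

    void Awake()
    {
        if (Instance != null && Instance != this)
        {
            Destroy(gameObject); // enforce uniqueness
            return;
        }
        Instance = this;
    }
}
```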

Static methods. Preview 10:59

Methods and members declared static are shared amongst all instances and a lot easier to access from other components.

Focus Events. Preview 13:45

Let's add events to change the display of the object currently in focus.

Setting up the Progress Bar. Preview 05:30

Now that we have access to the currently highlighted object, we're going to trigger actions on it. Let's begin with adding a Progress Bar image.

Handling touch screen inputs. Preview 11:17

If you're using the Cardboard with no game controller attached, then a single touch on the screen, using the Cardboard button, is our only option to trigger an event.
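
In its simplest form, detecting that touch can look like this sketch (the action to trigger is left as a placeholder):

```csharp
using UnityEngine;

public class CardboardButton : MonoBehaviour
{
    void Update()
    {
        // The Cardboard button registers as a single touch on the screen.
        if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
        {
            // trigger the action on the currently focused item here
            Debug.Log("Screen touched");
        }
    }
}
```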

Action Trigger. Preview 03:26

Let's make the connections now between the Input touch and the currently highlighted item.

Cancelling the Trigger. Preview 07:39

And to wrap this up, we need to properly cancel the trigger when the user looks away.

Loading a Scene. Preview 04:08

Now that the input system is in place, let's use it to load another scene.
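
The core of it is a single call to Unity's SceneManager (the class and method names below are illustrative wrappers):

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

public class SceneLoader : MonoBehaviour
{
    // Call this from an event (e.g. the focus trigger) to switch scenes.
    // The scene must be listed in the Build Settings.
    public void Load(string sceneName)
    {
        SceneManager.LoadScene(sceneName);
    }
}
```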

Handling wireless gamepads. Preview 06:07

Let's begin with making sure our gamepad is properly connected and recognised by Unity, using Unity Remote along with the Game Pad.

Using Keycodes. Preview 05:26

Unity supports monitoring keyboard keys and gamepad buttons in two different ways: using key codes or the Input Settings.

Let's add support for key codes first.
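
As a quick sketch, gamepad buttons are exposed as KeyCode.JoystickButton0, 1, 2 and so on (which physical button each code maps to varies per controller and platform):

```csharp
using UnityEngine;

public class GamePadButtons : MonoBehaviour
{
    void Update()
    {
        // GetKeyDown fires once, on the frame the button is pressed.
        if (Input.GetKeyDown(KeyCode.JoystickButton0))
        {
            Debug.Log("Joystick button 0 pressed");
        }
    }
}
```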

Using abstract classes to gather common mechanics. Preview 17:15

Whenever several classes share a fair amount of mechanics, Object-Oriented Programming, and especially inheritance, allows us to gather the common code in a parent class. This not only makes it easier to implement the mechanics in different components, but also easier to maintain.

Adding events to the GamePad Manager. Preview 03:50

With all the input mechanics now part of a parent class, implementing it in the Game Pad controller component becomes very simple.

Using Unity's Input class to support VR Device controllers. Preview 04:50

Sometimes it's easier to use Unity's built-in Input class than key codes. Again, using our parent class makes it super easy to implement a different version.

Recentering the sight when a scene is loaded. Preview 09:45

Unless we're using room-scale tracking, we have no idea which cardinal direction the gyroscope will be initialised in. We're going to use XR features and Scene Manager events, and make the object persistent, so that we "recenter" the orientation on every scene load.
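
Put together, that can look like this sketch (using the XR API as it stood around Unity 2017/2018; the class name is illustrative):

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;
using UnityEngine.XR;

public class RecenterOnLoad : MonoBehaviour
{
    void Awake()
    {
        DontDestroyOnLoad(gameObject);            // survive scene loads
        SceneManager.sceneLoaded += OnSceneLoaded; // subscribe to the load event
    }

    void OnSceneLoaded(Scene scene, LoadSceneMode mode)
    {
        InputTracking.Recenter(); // reset the tracked "forward" direction
    }
}
```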

Using the Gyroscope instead of XR tracking when on Cardboard in the Editor. Preview 09:00

When using the Cardboard, you can't preview the XR tracking in the Editor. To work around this and be able to preview things with Unity Remote, we're going to use Unity's Gyroscope native support instead.
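
A rough sketch of driving the camera from the gyroscope directly (the attitude conversion below is a common recipe for mapping the gyro's right-handed frame into Unity's left-handed one, not the course's exact code):

```csharp
using UnityEngine;

public class GyroCamera : MonoBehaviour
{
    void Start()
    {
        Input.gyro.enabled = true; // the gyroscope is off by default
    }

    void Update()
    {
        Quaternion q = Input.gyro.attitude;
        // Convert the right-handed gyro rotation into Unity's coordinate space.
        transform.localRotation =
            Quaternion.Euler(90f, 0f, 0f) * new Quaternion(q.x, q.y, -q.z, -q.w);
    }
}
```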

Stereoscopic Rendering in the Editor when using the Cardboard. Preview 00:22

When using the Cardboard, the current XR SDKs don't provide any feedback in the Editor. Now that we have enabled tracking using the Gyroscope, let's go a little further and add support for stereoscopic rendering.

Different ways to navigate a 3D world. Preview 02:43

Let's begin with a quick overview of the ways we can navigate a 3D world with Unity.

Importing the basecamp scene from the Asset Store. Preview 02:21

To experiment with scene navigation, we're going to use an environment from the Asset Store.

Scene setup. Preview 05:57

Let's begin with fixing issues and cleaning the scene a bit.

Navigation strategy. Preview 04:11

Before we dash into coding, let's have a look at our strategy.

Looking forward... Preview 06:40

Now, let's begin with a simple vector looking forward..

Using Ray Casting to find an intersection. Preview 11:18

Now that we have the heading direction, we're going to find the intersection with the first Collider we hit in the scene.
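
The basic call is a sketch like this (distance and names are illustrative):

```csharp
using UnityEngine;

public class Sight : MonoBehaviour
{
    void Update()
    {
        RaycastHit hit;
        // Cast from the camera along its forward vector, up to 100 units.
        if (Physics.Raycast(transform.position, transform.forward, out hit, 100f))
        {
            // hit.point is where the ray met the first collider
            Debug.DrawLine(transform.position, hit.point, Color.green);
        }
    }
}
```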

Ignoring intersections with set layers. Preview 04:17

Layers allow us to ignore sets of objects so that we can look through some categories of objects, such as characters, vehicles, etc.

Looking through trigger colliders. Preview 02:42

Colliders, when set as Triggers are usually not meant to block navigation. Let's let the ray cast look through them.
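
This is an overload of the same call; the extra argument is what changes (values shown are illustrative):

```csharp
using UnityEngine;

public class SightThroughTriggers : MonoBehaviour
{
    void Update()
    {
        RaycastHit hit;
        // QueryTriggerInteraction.Ignore makes the cast pass through
        // colliders that are set as Triggers.
        if (Physics.Raycast(transform.position, transform.forward, out hit,
                            100f, Physics.DefaultRaycastLayers,
                            QueryTriggerInteraction.Ignore))
        {
            // only solid (non-trigger) colliders are reported here
            Debug.DrawLine(transform.position, hit.point, Color.red);
        }
    }
}
```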

Bouncing off walls.. Preview 07:02

We're now going to "bounce" off walls and obstacles to avoid going to close to them.

More Ray Casting ... Preview 06:11

Let's use Ray Casting again to find a position on the ground.

Baking a NavMesh. Preview 08:38

Navigation, part of "AI" features of game development, allows us to navigate a 3D world using pre computed navigation data to find the shortest, easiest route from point A to point B. Setting it up in Unity is fairly simple.

Finding a position within the NavMesh. Preview 08:07

With the NavMesh now available, we're going to locate the closest position within the NavMesh, from the ground position we have.
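
NavMesh.SamplePosition does that lookup; here is a sketch (the 2-unit search radius and method name are illustrative):

```csharp
using UnityEngine;
using UnityEngine.AI;

public class GroundToNavMesh : MonoBehaviour
{
    // Returns true when a NavMesh position exists near groundPosition.
    public bool TryFindDestination(Vector3 groundPosition, out Vector3 destination)
    {
        NavMeshHit navHit;
        // Look for the closest point on the NavMesh within 2 units.
        if (NavMesh.SamplePosition(groundPosition, out navHit, 2f, NavMesh.AllAreas))
        {
            destination = navHit.position;
            return true;
        }
        destination = Vector3.zero;
        return false;
    }
}
```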

Creating a 3D reticle object. Preview 16:34

In this lecture we are going to use Pro Builder (free extension available from the Asset Store) to create a 3D cone object.
You can import the Reticle-Cone package if you want to skip this part.

Then we'll code a custom shader to display a 2D reticle texture with alpha, slightly off its render position. This is known as Decal rendering.
You can also use the shader file provided in the resources.

Positioning the reticle. Preview 06:44

Let's now position the reticle object at the estimated destination, and deactivate it when we don't have one.

Setting up the Nav Agent. Preview 07:33

We're now going to set up a NavMesh Agent, and give it the target location for destination, with a method we can easily test in the Editor.
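
Once the agent component is on the object, moving it is a single call; a sketch (class and method names are illustrative):

```csharp
using UnityEngine;
using UnityEngine.AI;

public class Walker : MonoBehaviour
{
    NavMeshAgent agent;

    void Awake() { agent = GetComponent<NavMeshAgent>(); }

    // Call this with the reticle's position; the agent computes the path
    // and steers itself along the NavMesh.
    public void GoTo(Vector3 destination)
    {
        agent.SetDestination(destination);
    }
}
```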

Setting up the navigation with XR devices. Preview 08:24

With this new component and the previous components we've made in earlier chapters, we're now going to be able to put together a First Person Camera Navigation system.

Using VR "Nodes" (controllers). Preview 00:45

When using a VR device, such as the HTC Vive, we can track controllers positions and rotations. Let's use this to aim for a target position with the controller instead of the Camera.

Disabling XR Camera position tracking. Preview 00:23

Sometimes we want to force-disable room-scale tracking. This is useful in car racing games, for example: you don't want the player's head to come out of the car's rooftop.

Opening doors as we get close. Preview 09:43

Now that we can navigate our environment, it'd be nice to open doors instead of just walking through them. And this is going to be fairly simple with Animator, and Trigger Colliders.

Trigger detection. Preview 04:12

Let's begin with detecting when an object enters and leaves the Trigger Collider.

Animator parameter change. Preview 03:13

It's now as simple as changing the state of the Animator parameter when an object enters and leaves the Trigger.
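
A sketch of the component (the "Open" parameter name is an assumption; use whatever boolean parameter your Animator defines):

```csharp
using UnityEngine;

public class DoorTrigger : MonoBehaviour
{
    public Animator animator; // the door's Animator, assigned in the Inspector

    // "Open" is an assumed Animator parameter name.
    void OnTriggerEnter(Collider other) { animator.SetBool("Open", true); }
    void OnTriggerExit(Collider other)  { animator.SetBool("Open", false); }
}
```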

Filtering with tags. Preview 02:22

As we may have other objects in the scene, we may want to filter the trigger so that it only triggers the Animation with a given object. We can easily do this with tags.
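
The filter is one line at the top of the trigger callback; a sketch assuming a "Player" tag:

```csharp
void OnTriggerEnter(Collider other)
{
    // CompareTag avoids allocating a string, unlike comparing other.tag == "Player".
    if (!other.CompareTag("Player")) return;
    animator.SetBool("Open", true); // "Open" is an assumed parameter name
}
```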

Applying changes to all doors, and add a few optimisations. Preview 07:09

Let's add a few optimisations to make the component even more generic, and apply the changes to all doors.

Triggering Events. Preview 00:15

We can also use Unity Events, to trigger any action.

Creating the Video Gallery UI. Preview 01:01

Creating a vertical list of buttons to trigger playback of different videos.

Populating the UI, and adding listeners to Buttons' onClick events. Preview 01:08

We're now going to populate the UI with a button for every clip, and add actions to their onClick events to play a video.

Loading clips from Resources. Preview 00:56

Loading resources dynamically can save a lot of painful assignment in the Editor.

Streaming videos from Local Storage. Preview 01:49

Let's now handle streaming from local storage, so that we don't have to "carry" those video files in the build, and can add more later without recompiling.

How do I turn on VR (XR) support ? Preview 00:11

The first thing you want to do in a VR project, is to turn on support for XR SDKs.

How do I playback a 360 video background? Preview 00:31

Everything's been made in Unity for you to playback 360 video backgrounds. It takes a little bit of setup, but in the end, it's easy.

How do I default to Non VR, then switch VR on later ? Preview 00:51

Jumping right into a VR world is fine for prototyping. However, you often want to offer the user/player the choice to turn VR on and off.
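
With the XR API of the Unity 2017/2018 era, the switch can be sketched like this (the device name string depends on the SDK; "cardboard" is an example):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR;

public class VRToggle : MonoBehaviour
{
    // Switching devices takes a frame, hence the coroutine.
    public IEnumerator EnableVR(string deviceName) // e.g. "cardboard"
    {
        XRSettings.LoadDeviceByName(deviceName);
        yield return null; // wait one frame for the device to load
        XRSettings.enabled = true;
    }
}
```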

How do I use VR head tracking, but disable Stereoscopic rendering? Preview 00:11

You may not always want to use stereoscopic rendering along with head tracking, especially on handheld devices.

How do I preview Head Tracking with Unity Remote? Preview 00:25

Working on handheld and tired of building every time you change something? It's time to use Unity Remote. Yes, it can be used, it just takes a few lines of code, and it's worth all the effort!

How do I track a VR device head or hand controller? Preview 00:15

Along with head tracking, you can also track a VR controller, and apply its position and/or rotation to an object's Transform.

This is also useful to apply head tracking to Cinemachine Virtual Cameras.

How do I disable head tracking on the Camera? Preview 00:19

You may need to disable default head tracking for the main camera. In particular when using Cinemachine.

What locations can I load or stream 360 videos from? Preview 00:59

You were not seriously going to build a Player app that contains all your videos, right?

A few free assets worth checking out. Preview 00:00

A few assets, free on the Asset Store.

Upgrading the Project to Unity 2018.x Preview 00:36

If you've started the course on an earlier version of Unity and want to upgrade it to 2018.x or later, this quick guide will take you through the major steps.

Upgrading Cinemachine to Package Manager version. Preview 00:15

If you had installed Cinemachine from the Asset Store, and want to use the latest version available from the Package Manager, this quick guide will show you how to properly upgrade your project.

Upgrading Pro Builder to Package Manager version. Preview 00:29

If you had installed Pro Builder from the Asset Store, and want to use the latest version available from the Package Manager, this quick guide will show you how to properly upgrade your project.

Multi-pass VS Single Pass Rendering. Preview 00:07

Stereo rendering can be done using a multi-pass or single-pass method. The former is fully compatible with all post-processing effects; the latter brings better performance.

Chapter Preview. Preview 00:28

Here's a quick preview video of what you can expect from this chapter/section.

Importing assets. Preview 00:16

Importing the free assets.

Clearing the environment. Preview 00:25

Let's make some room in the Hangar...

Bringing in the Fork Loader and adding the Camera. Preview 00:23

Bringing in the Fork Loader prefab and positioning a Camera so the VR view is aligned.

Test Build. Preview 00:15

Testing the Camera behaviour on the device.

Enhancing the Camera experience. Preview 01:17

Enhancing the Camera's positioning without Room Scale tracking.

Scripting the Fork Lifting. Preview 01:10

Let's add a simple script to control the lifting action.

Setting up a wheeled vehicle. Preview 01:41

Wheeled vehicles can be real fun, and are quite easy to set up with Unity's built-in Physics.

Resetting the vehicle when it falls... Preview 00:29

We can't always guarantee safety, so let's put the vehicle back on its wheels when it lands on its side.

Setting up other Physics objects to interact with. Preview 01:32

Properly interacting with Physics objects requires a few adjustments.

Have fun with other resources! Preview 00:13

Seated VR experiences are full of possibilities.

Working around 3DOF (degrees of freedom) limitations on devices (like the Oculus Go). Preview 00:13

3DOF (degrees of freedom) is a bit limiting on devices such as the Oculus Go. But all is not lost.

Chapter Preview Preview 00:54

This is a short video showing what you can expect in this chapter.

Preparing the assets. Preview 01:04

Let's prepare the assets and XR tracking for the Helicopter Simulator.

Setting up flying vehicle physics. Preview 00:48

Let's now set up the Helicopter's Physics and Controls.

Shooting targets. Preview 01:57

Let's now have some fun shooting at the environment.

Applying damage. Preview 00:33

Let's add some damage and blow things up!

Bonus lecture. Preview 00:17

Going further.