Oculus, XR

TL;DR:

This post covers the technical implementation of movement in different VR experiences I’ve tried over the last 2 months. All of them delivered great, immersive experiences without breaking presence, and I wanted to share my analysis of their techniques. Part 2 will analyze environment interactions. The experiences I discuss showcase different hardware (Vive, Touch, Omni, and gamepad) and genres.

Teleportation is one of the best techniques to use for movement. If you don’t want to use teleportation, use constant forward movement. If you want to implement strafing, make sure it’s smooth and continuous, and try limiting the directions of movement.

Reviews/Technical Analysis

The past 3 months have had me traveling around the U.S., letting me take part in many amazing gaming and VR conferences: PAX West, GaymerX4, and Oculus Connect 3. I wanted to use this post to talk about some of my experiences with the different games and hardware, as well as dive into the unique technical solutions that each of these experiences implements. Most of what I cover revolves around player movement and environment interaction, two of the most common areas where presence is broken.

Velocibeasts by Churroboros – HTC Vive Multiplayer

Technical Highlights: Attack and Movement Controls

Game Description:

“Have you ever wanted to throw something at someone? VELOCIBEASTS can make that dream a reality. Pick a weapon, throw it, teleport, and kill your friends.

Battle as a variety of animals equipped with floating mecha suits in this fast paced multiplayer battle arena VR title.”

-Churroboros

Review:

I managed to get a pretty in-depth demo at GaymerX4 this year. The highlight of this game is its attack and movement controls. In VR, player movement is one of the fastest ways to break a user’s sense of presence. So why am I impressed? I’ll explain. In the game you are a mecha-suited beast in a large arena, trying to kill your opponent.

You attack by throwing your weapon toward your enemy, similar to casting a fishing line. Pressing and holding the trigger grips your weapon, and releasing the trigger throws it. The interesting part of the gameplay, however, is what happens when you press the trigger again: you instantly teleport to your weapon’s location. The mix of coordinating attacks and movement simultaneously creates a fun, fast-paced experience that immerses players.
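To make the mechanic concrete, here is a minimal Unity sketch of the throw-then-teleport loop. This is not Churroboros’s actual code; the “Trigger” button name, the throw velocity, and the grip handling (elided here) are all illustrative assumptions.

    using UnityEngine;

    public class ThrowTeleport : MonoBehaviour
    {
        public Transform playerRig;   // root of the player's tracking space
        public Rigidbody weapon;      // the throwable weapon
        private bool weaponInFlight;

        void Update()
        {
            if (!weaponInFlight && Input.GetButtonUp("Trigger"))
            {
                // Releasing the trigger throws the gripped weapon.
                weapon.isKinematic = false;
                weapon.velocity = transform.forward * 10f; // stand-in for controller velocity
                weaponInFlight = true;
            }
            else if (weaponInFlight && Input.GetButtonDown("Trigger"))
            {
                // Pressing the trigger again teleports you to the weapon.
                playerRig.position = weapon.position;
                weaponInFlight = false;
            }
        }
    }

The key design point is that the same input drives both attack and locomotion, so the two never compete for the player’s attention.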

Now, in general I’m not prone to motion sickness in VR. However, in most first-person shooter/fighting games, falling and strafing cause the most motion sickness for me. Velocibeasts avoids both: your beast floats (because it’s in a mecha suit), which eliminates falling, and teleportation takes the place of strafing. The floating mechanism also gives users full six degrees of freedom for moving around the arena.

I’m impressed because many games use the teleportation technique, but few integrate it so well into gameplay. The movement controls were also very easy to use; it only took a few throws to get the timing and rhythm down. Below are some pictures of me playing the game, getting really into it.

Links

@ChurroborosVR
https://www.facebook.com/ChurroborosVR/

World War Toons by Reload Studios – Omni, Gamepad, Rift, PSVR

Technical Highlights: Full FPS-style gameplay, good use of strafing controls, and Omni integration

Game Description:

“World War Toons is a cartoony first-person shooter where you never know if you’re going to turn the corner and see a rocket rushing towards you, grand pianos falling from the sky, or a massive tank staring you in the face.”

– Reload Studios

Technical Review and First Opinions:

I played this game at PAX West and got the opportunity to try it with the Rift and gamepad, as well as the Rift with the Omni. It was a very polished game, and the mechanics played like a traditional FPS, which isn’t always the best thing in VR. World War Toons is one of the few games I’ve played that has strafing (lateral movement independent of camera position) in VR. The reason VR experiences shy away from this? Users get sick, really, really quickly.

Now, despite the strafing, I only felt nauseous a few times during gameplay: specifically, when my character was falling off the side of a wall, and when being launched into the air by trampolines.

The creators limited movement to just the four d-pad directions (left, right, forward, backward) to reduce discomfort while players were strafing, as sketched below.
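Here is a minimal Unity sketch of that idea, constraining movement to one cardinal direction at a time. The axis names and speed are illustrative; this is my reading of the technique, not Reload Studios’ code.

    using UnityEngine;

    public class CardinalMove : MonoBehaviour
    {
        public float speed = 3f; // illustrative movement speed

        void Update()
        {
            float h = Input.GetAxisRaw("Horizontal");
            float v = Input.GetAxisRaw("Vertical");
            if (h == 0f && v == 0f) return;

            // Snap input to a single cardinal direction (no diagonals),
            // favoring the stronger axis, so lateral motion stays predictable.
            Vector3 dir = Mathf.Abs(h) > Mathf.Abs(v)
                ? new Vector3(Mathf.Sign(h), 0f, 0f)
                : new Vector3(0f, 0f, Mathf.Sign(v));

            transform.Translate(dir * speed * Time.deltaTime);
        }
    }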

However, when playing the game on the Omni, I had no issues with nausea. The hardware made a huge difference when my character was launched around the arena or fell off drops. It was also far more immersive than full gamepad controls.

Links

http://voicesofvr.com/455-vr-first-person-shooters-esports-with-world-war-toons/

roqovan.com

@StudioRoqovan

Eagle Flight by Ubisoft – Rift, Gamepad, PSVR

Technical Highlights: Player Movement, Gamepad, Multiplayer

Description:

“50 years after humans vanished from the face of the Earth, nature reclaimed the city of Paris, leaving a breathtaking playground. As an eagle, you soar past iconic landmarks, dive through narrow streets, and engage in heart-pounding aerial dogfights to protect your territory from opponents.”

-Ubisoft

Technical Review and First Opinions:

I first saw this game at GDC this year, and at Oculus Connect 3 I was finally able to play it: a 3v3 multiplayer capture-the-flag game. My team won 7-0. YAY!

Game start: In the opening of the game you can see your fellow teammates as eagles in a lobby for matchmaking. You are able to look around at your teammates’ eagle bodies, whose head movements correspond to the players’ actual head movements. I mention this because these little details increase player immersion. Once everyone is ready in the lobby, the game begins.

Game Play: When the game finally starts, the camera fades in on the scene. You, as the eagle, are already in flight. In VR, if you want the player to move around your scene (without teleportation), there are only a few ways to do it without making them sick. One of them is to keep a semi-constant speed and always move forward in the direction the player is looking. Eagle Flight employs this technique, using head tilts to turn left and right. However, I did still feel some discomfort while getting used to the controls of moving around Paris.
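A minimal Unity sketch of that scheme might look like the following. The speed, turn rate, and roll-to-yaw mapping are all assumptions for illustration, not Ubisoft’s implementation.

    using UnityEngine;

    public class ConstantFlight : MonoBehaviour
    {
        public Transform head;       // the HMD camera transform
        public float speed = 8f;     // semi-constant forward speed
        public float turnRate = 60f; // degrees per second at full head tilt

        void Update()
        {
            // Always move forward in the direction the player is looking.
            transform.position += head.forward * speed * Time.deltaTime;

            // Use head roll (tilt) to steer; flip the sign if your rig differs.
            float roll = Mathf.DeltaAngle(0f, head.localEulerAngles.z);
            float turn = Mathf.Clamp(-roll / 90f, -1f, 1f);
            transform.Rotate(0f, turn * turnRate * Time.deltaTime, 0f);
        }
    }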

The other thing Ubisoft does to help immerse the player is add a beak to the player’s view. There have been VR studies showing that adding a nose, or some other grounding element, to the player’s view helps immerse them faster and alleviates motion sickness. I hadn’t seen any games employ this technique, though, until this one.
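Implementing a grounding element like this can be as simple as parenting a mesh to the eye camera so it stays fixed in view. A tiny sketch, with an invented beak prefab and offset:

    using UnityEngine;

    public class BeakOverlay : MonoBehaviour
    {
        public Transform hmdCamera;   // the player's eye camera
        public GameObject beakPrefab; // hypothetical beak mesh

        void Start()
        {
            // Parent the beak to the camera so it never moves in the view.
            GameObject beak = Instantiate(beakPrefab, hmdCamera);
            beak.transform.localPosition = new Vector3(0f, -0.1f, 0.15f); // illustrative offset
            beak.transform.localRotation = Quaternion.identity;
        }
    }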

The third technique Ubisoft uses for movement and immersion is vignetting the player’s view as speed increases, similar to tunnel vision. I’ve seen this technique a few times when player movement speeds up, and I like it because it eases my motion sickness by limiting the amount of visual input.
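A minimal sketch of speed-linked vignetting, assuming a full-screen vignette material with an “_Intensity” property (both the property name and the speed thresholds are invented here):

    using UnityEngine;

    public class SpeedVignette : MonoBehaviour
    {
        public Material vignette;    // hypothetical full-screen vignette material
        public float minSpeed = 5f;  // speed where the vignette starts
        public float maxSpeed = 20f; // speed of maximum tunnel vision
        private Vector3 lastPos;

        void Start()
        {
            lastPos = transform.position;
        }

        void LateUpdate()
        {
            // Estimate speed from the distance moved since last frame.
            float speed = (transform.position - lastPos).magnitude / Time.deltaTime;
            lastPos = transform.position;

            // Narrow the usable field of view as speed increases.
            float t = Mathf.InverseLerp(minSpeed, maxSpeed, speed);
            vignette.SetFloat("_Intensity", t);
        }
    }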

Eagle Flight is an Oculus Rift gamepad game, and it’s also coming to PSVR. I usually dislike gamepad games in VR because I think they take away from presence; however, this game only uses a few buttons, for firing, shield, and speed controls. If you are going to use a gamepad for your first-person VR game, I suggest simplifying the controls, keeping them as intuitive as possible, and considering a third-person style for your game.

You can see some of the gameplay from E3 here:

Links

https://www.ubisoft.com/en-US/game/eagle-flight

@UbisoftVR

Summary

Figuring out which technique your users will use to explore your virtual world is important. Take into consideration the limits of the hardware and the style of gameplay when making your decision. In Velocibeasts, I doubt I would have enjoyed the gameplay as much if I’d had to alternate between teleporting and fighting my opponent, given the game’s fast-paced flow. Eagle Flight had to center its gameplay around constant movement since the players are birds; it would have felt super disconnected if our birds were teleporting everywhere instead of peacefully gliding.

Teleportation is one of the best techniques to use for movement. If you don’t want to use teleportation, use constant forward movement. If you want to implement strafing, make sure it’s smooth and continuous, and try limiting the directions of movement.

Now that I’m done traveling, more videos and posts about how to implement all the awesome techniques I talked about are on the way =)

Happy Hacking!

– The Napping Kat

Bots, Unity, XR

TL;DR

It works! I managed to get HoloLens inputs working with the LUIS integration I did before in Unity. The project melds phrase recognition with dictation from the HoloLens Academy GitHub example and then pings the LUIS API.

All the code is here: Github/KatVHarris

LUIS + UNITY + JSON post is here

LUIS 101 post is here


Okay, so this is a very short post. Most of the harder parts were completed beforehand in my LUIS post, and the HoloLens Academy code helped a lot. Here I’ll just mention some of the pains I went through and how it all works.

Phrase Recognition vs. Dictation

HoloLens has 3 main types of user input: gaze, gesture, and voice. This application focuses on the last one. Voice input uses the Speech library for Windows:

using UnityEngine.Windows.Speech;

This library allows the HoloLens to use phrase recognition to trigger specific actions or commands in your project. Yet fixed phrases alone defeat the point of natural language processing; to interact with LUIS, we need to capture whatever the user is actually saying. To do this, I integrated the Communicator class from the HoloLens example into my project. This class handles the project’s phrase recognition, but it also handles dictation, enabling natural language to be captured from the user. My Communicator is slightly different from the HoloLens version because of the LUIS interactions, as well as some reworking of the code to allow multiple dictation requests.

Dictation itself is activated by phrase recognition commands, so there’s no need to tap to activate.
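The hand-off looks roughly like this. The sketch follows the general pattern of the HoloLens Academy Communicator rather than my exact class; the “ally” keyword matches the Dev Notes below, and the rest is illustrative. Note that the keyword and dictation recognizers can’t run at the same time, so phrase recognition has to be shut down first.

    using UnityEngine;
    using UnityEngine.Windows.Speech;

    public class VoiceInput : MonoBehaviour
    {
        private KeywordRecognizer keywordRecognizer;
        private DictationRecognizer dictationRecognizer;

        void Start()
        {
            keywordRecognizer = new KeywordRecognizer(new[] { "ally" });
            keywordRecognizer.OnPhraseRecognized += OnKeyword;
            keywordRecognizer.Start();
        }

        private void OnKeyword(PhraseRecognizedEventArgs args)
        {
            // Shut down phrase recognition before starting dictation;
            // the two systems cannot run simultaneously.
            keywordRecognizer.Stop();
            PhraseRecognitionSystem.Shutdown();

            dictationRecognizer = new DictationRecognizer();
            dictationRecognizer.DictationResult += OnDictationResult;
            dictationRecognizer.Start();
        }

        private void OnDictationResult(string text, ConfidenceLevel confidence)
        {
            // Captured natural language is sent on to the LUIS API from here.
            Debug.Log("Heard: " + text);
        }
    }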

Dev Notes

I did have some speech/phrase recognition trouble. The original phrase to activate dictation was “Allie” (the character from The 100, which my original bot project is based on); however, the recognizer doesn’t recognize that spelling of her name. Changing it to “ally” caused the recognizer to trigger. The DictationRecognizer is similar to the PhraseRecognizer in that it also doesn’t recognize the spelling of many names; for example, I would say “Tell me about Clarke.” and the dictation recognizer would write “tell me about clark.”. To fix the dictation errors, I used Regex to replace the spellings before querying the LUIS API. One could also change their LUIS model to accept the speech recognition spellings, but because multiple bots and applications are connected to my LUIS API, I couldn’t implement that solution.
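A minimal sketch of that Regex fix-up, assuming a small hard-coded name list (the helper name matches the Checknames call in the snippet below):

    using System.Text.RegularExpressions;

    private string Checknames(string text)
    {
        // Restore spellings the dictation recognizer gets wrong.
        text = Regex.Replace(text, @"\bclark\b", "Clarke", RegexOptions.IgnoreCase);
        text = Regex.Replace(text, @"\bally\b", "Allie", RegexOptions.IgnoreCase);
        return text;
    }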

    private void DictationRecognizer_DictationResult(string text, ConfidenceLevel confidence)
    {
        // Check to see if dictation is spelling the names correctly 
        text = Checknames(text);

        // 3.a: Append textSoFar with latest text
        textSoFar.Append(text + ". ");

        // 3.a: Set DictationDisplay text to be textSoFar
        DictationDisplay.text = textSoFar.ToString();
    }

Anyway, that’s all there is to it. All the major code is in the repo linked above (Github/KatVHarris).

Hope this helps, and let me know on Twitter @KatVHarris if you have any questions.

Happy Hacking

– TheNappingKat

Unity, XR

Hi All,

I solved the NO HMD DETECTED, Tracker Connected error you get when trying to run an extended screen while using the headset with the Unity Editor.

NoHMD

Well, I got it working, if not necessarily solved.

Specs:

  • Lenovo X1 Carbon
  • Intel Core i7-3667U CPU @ 2.0 GHz
  • 8 GB RAM
  • Windows 8.1
  • 64-bit
  • Intel HD Graphics 4000
  • Oculus DK2
  • SDK 5.0.1

To start, I detached my Oculus from the computer, reattached it, and made sure it was working in the normal Direct HMD Access mode. It was.

1) I hit “Windows + P” and made sure my projection setting was on Extend.

2) I switched the mode to Extended in the Oculus Configuration Utility.

3) On the desktop, I right-clicked and selected Screen Resolution.

4) I selected the second screen and hit “Detect”; a window came up saying “Another display not detected.”

5) I made the Oculus screen primary, then switched back to making the main computer display primary, and it worked. The screen now appeared in the Oculus, but the orientation was off, so I adjusted it in the Screen Resolution window.

DisplaySettings

Now the Oculus Configuration Utility looks like this, but it works.

AttachedNoTracker

In the Unity Editor I can now move the Game tab to the headset screen and maximize it. I can still see the awkward black rim around the screen, but it’s better than nothing. Hopefully the Oculus team can fix this soon.

OculusExtendedScreenFull

Hope this helps, Happy Coding!

-TheNappingKat

Oculus, Unity, XR

Setting Up

GREAT NEWS! In 2015 you no longer need Unity Pro Edition to integrate Oculus into your projects. YAY!

Things you’ll need for integration:

  • An Oculus (you don’t really need one to develop, but how else will you test it out?)
  • Oculus SDK
  • Oculus Runtime
  • Unity 4 Integration

Okay, so the first thing you’ll need is your game all set up and running in Unity. If you’ve been following my blog, you should have the bulk of the game running.

Cool. Next we need to grab the Oculus pieces from their site.

https://developer.oculus.com/downloads/

Now if you have a Mac or Linux download from those links.

WebsiteDownloads

After the downloads finish, install the runtime, then restart your computer.

Integrating into Unity

Extract the files from the Unity Integration package you downloaded. In Unity, go to Assets > Import Package > Custom Package.

Find where you extracted the files and navigate to the Unity Plugin.

ImportingPackage2

Then hit import.

ImportingPackage3

Now you should have a new folder in your Assets called OVR.

AssetsOVR

Cool, now that it’s integrated, let’s start using the Oculus cameras in the game.

Using Oculus Cameras

Using the Oculus package is super easy. Oculus has already created prefabs for developers to use: one with just the camera rig, and one with the rig plus a character motor.

OVRPrefabs

To use them, just do what you would normally do with prefabs: click and drag one into your scene. I created a test scene called OVRTest to make sure everything would work without worrying about the infinite platforms generating around me.

I placed the OVRPlayerController at 0, 2, 0.

OVRinScene
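If you prefer to do the placement from a setup script instead of the Inspector, a tiny sketch might look like this (the position matches the one above; the script name is made up):

    using UnityEngine;

    public class PlacePlayer : MonoBehaviour
    {
        void Start()
        {
            // Matches the editor placement above: 0, 2, 0.
            transform.position = new Vector3(0f, 2f, 0f);
        }
    }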

Cool. Now try running the game. You should have something that looks like this:

OVRGame

YAY! See, super easy. The double-circle screen is what will be fed to your Oculus; through the lenses in the headset it becomes one image with a three-dimensional feel.

Now that you have the basic character installed, you can add it to the main game scene and try it with the infinite platforms.

Happy Hacking!

-TheNappingKat