Unity/VR Capstone Walk-in-Place Mechanic

The major development work for my Udacity VR Developer Nanodegree capstone was developing a walk-in-place mechanic for the HTC Vive. I had previously tried an arm swing mechanic that I did not care for and that induced nausea in my husband. To begin, I did one of the things I do best: collect data! (I will always be a scientist at heart.) I had decided that I did not want to require the controllers for walking; this would free them up for other uses. To collect data on the headset's movement, I wrote a simple reporting script:

using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using System.IO;

public class Reporting : MonoBehaviour {

    string path = "Assets/Logs.txt";
    Vector3 current;
    Vector3 previous;
    Vector3 delta;

    // Use this for initialization
    void Start () {
        // Write the CSV header to the log file
        StreamWriter writer = new StreamWriter(path, true);
        writer.WriteLine("transform x, transform y, transform z, delta x, delta y, delta z");
        writer.Close();

        current = transform.localPosition;
        previous = transform.localPosition;
    }

    // Update is called once per frame
    void Update () {
        // Append this frame's position, and its change since the last frame, to the log file
        StreamWriter writer = new StreamWriter(path, true);
        current = transform.localPosition;
        delta = current - previous;
        writer.WriteLine(current.x + "," + current.y + "," + current.z + "," + delta.x + "," + delta.y + "," + delta.z);
        writer.Close();

        previous = transform.localPosition;
    }
}

This collected a ton of data. I had several people wear the headset and walk in place. Then I started analyzing. I graphed movement on each axis against each other axis:

[Scatter plots: X vs Y, X vs Z, Y vs Z]

Then I graphed each position vs time:

[Position vs time plots: X, Y, Z]

I decided that I didn’t want to use the raw transform data, as it was fairly volatile, especially if the player wasn’t walking exactly in place. Fortunately, I had also collected the change in position between subsequent frames (the delta values):

[Plots vs time: Delta X, Delta Y, Delta Z]

Based on this data, delta x seemed like the best measurement on which to base a step detection algorithm. I created an algorithm that detects when delta x crosses from a negative value to a positive one; each such crossing is assumed to be a step. What I actually found when I implemented this was that it worked beautifully… unless the player turned their head rapidly from side to side. Since I wanted users to be able to look around during the experience, this wasn’t going to work. Fortunately, the same approach did work when applied to the delta y variable. This had the added advantage of allowing users to simply bob their heads if they didn’t want to actually walk, while still allowing them to look around.
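As a rough illustration, here is a minimal sketch of that zero-crossing detector, not the exact capstone code: it assumes the script sits on the headset camera, and the class name, the OnStep hook, and the default threshold value are all hypothetical.

using UnityEngine;

// Minimal sketch of the delta-y zero-crossing step detector (illustrative).
public class StepDetector : MonoBehaviour {

    // Minimum upward delta y needed to count a crossing, so small sensor
    // jitter while standing still is ignored. This value is a placeholder
    // and would ideally be calibrated per user.
    public float threshold = 0.002f;

    float previousY;
    float previousDeltaY;

    void Start () {
        previousY = transform.localPosition.y;
    }

    void Update () {
        float deltaY = transform.localPosition.y - previousY;

        // A step is counted when delta y flips from negative to positive:
        // the head has reached the bottom of its bob and started rising.
        if (previousDeltaY < 0f && deltaY > threshold) {
            OnStep();
        }

        previousDeltaY = deltaY;
        previousY = transform.localPosition.y;
    }

    void OnStep () {
        // Hook for the movement logic (see the sketch further below).
        Debug.Log("Step detected");
    }
}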

The second part of walking was moving the player each time a step was detected. I opted to use one of the controllers as a forward pointer, allowing the player to look around while walking. The movement of the player is a bit jerky, but I found that this was less likely to cause VR sickness in users than smoothed movement. In a future iteration I would like both to calibrate the step detection threshold for each user and to let the user control the degree to which movement is smoothed.
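Here is a matching sketch of that movement half under the same caveats: a detector like the one above would call Step() once per detected step, and the field names and step distance are illustrative, not the capstone's actual values.

using UnityEngine;

// Illustrative sketch: each detected step translates the play area a fixed
// distance along the pointing controller's forward direction.
public class StepMovement : MonoBehaviour {

    public Transform pointerController;  // controller used as the forward pointer
    public float stepDistance = 0.4f;    // metres moved per step (placeholder)

    // Called by the step detector each time a step is registered.
    public void Step () {
        Vector3 forward = pointerController.forward;
        forward.y = 0f;  // flatten so the player stays on the ground plane
        transform.position += forward.normalized * stepDistance;
    }
}

Because each step produces a single discrete translation, this kind of logic naturally yields the slightly jerky, unsmoothed movement described above.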
