Robot Dog Project

Some Background on Redirected Walking

  • Redirected walking is a locomotion technique that lets a VR user explore a 3D virtual environment (VE) on their own two feet. It works by slightly rotating the virtual world around the user during head, eye, or other body rotations, keeping the injected rotation under a perceptual threshold so the distortion is imperceptible. (A rough sketch of the core idea appears after this list.)

  • Much of my work tries to extend its functionality and practical applicability.
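
To make the rotation-gain mechanic concrete, here is a minimal per-frame sketch. It is an illustration only, not our actual implementation; the function names, the example gain, and the threshold value are placeholders I chose for clarity.

    // Minimal sketch of the core redirected walking idea: while the user
    // turns their head, the virtual world is counter-rotated by a small
    // gain, clamped so the injected rotation stays below a perceptual
    // threshold. Names and numbers here are illustrative placeholders.

    #include <algorithm>

    struct RedirectionState {
        double worldYawOffsetDeg = 0.0;   // extra yaw applied to the virtual world
    };

    // Called once per frame with the user's measured head yaw rate (deg/s).
    void ApplyRotationGain(RedirectionState& state,
                           double headYawRateDegPerSec,
                           double dtSec,
                           double gain,                 // e.g. 1.2 while steering
                           double maxInjectedDegPerSec) // perceptual threshold
    {
        // The redirection is the *difference* between the gained rotation and
        // the real rotation; clamp it so the distortion stays imperceptible.
        double injected = (gain - 1.0) * headYawRateDegPerSec;
        injected = std::clamp(injected, -maxInjectedDegPerSec, maxInjectedDegPerSec);
        state.worldYawOffsetDeg += injected * dtSec;
    }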

Redirected Walking for Visually-Impaired Users

  • Advisor: Prof. Ming Lin, UMD College Park

  • Goal: Since redirected walking (RDW) lets users walk naturally through a virtual environment of any size, it is a promising option for VR applications for visually-impaired users. However, because we must stay under perceptual thresholds and VR tracking spaces are fairly small, we need some way to distract the user while redirection is applied. Our goal is therefore to adapt RDW for audio-only applications. (A rough sketch of the distractor-triggering idea appears after this list.)

  • Current progress: We showed that RDW can work imperceptibly, with comparable performance and quality, using audio alone (see Completed Projects), but we still need a distractor that is more natural for this scenario (see the following robot dog entries).
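
A minimal sketch of when a distraction-based steering loop might trigger a distractor, assuming a square tracked space centered at the origin. The names, lookahead, and margin values are placeholders, not parameters from our system; for audio-only users the distractor would be a spatialized sound rather than a visual object.

    // When the user's predicted position approaches the physical boundary,
    // activate a distractor so extra rotation can be injected while their
    // attention (and head) turns toward it. Illustrative values only.

    #include <cmath>

    struct Vec2 { double x, y; };

    bool ShouldActivateDistractor(const Vec2& userPos, const Vec2& userVel,
                                  double halfRoomSize, double lookaheadSec,
                                  double safetyMargin)
    {
        // Predict where the user will be shortly and compare against the
        // physical bounds, minus a safety margin.
        Vec2 predicted { userPos.x + userVel.x * lookaheadSec,
                         userPos.y + userVel.y * lookaheadSec };
        double limit = halfRoomSize - safetyMargin;
        return std::fabs(predicted.x) > limit || std::fabs(predicted.y) > limit;
    }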

Summary of These Projects

GAMMA talk (10/1)
  • Goal: Formalize the robot dog problem by generalizing the problem of robots in virtual environments and figuring out where the robot fits with respect to other problems in the area of haptic proxies.

  • Current progress: Mostly done; we are working on the submission. More info is at its own site.

SUB-STUDY: Designing a robot guide dog for VR training simulations for visually-impaired navigation

  • Goal: We want to create a VR simulation that helps train visually-impaired children to navigate a large virtual environment such as a city. The simulation needs real walking with free roaming in a large, open space, so redirected walking is required. We hypothesize that haptics will improve response time, which we plan to study.

  • Current progress: See the next two entries, which are sub-projects of this one.

  • Technology that I'm using: Unreal Engine 4 for the entire real-time implementation (Blueprint and C++). Vive Tracker 2.0 to track the dog and leash. Wireless Vive Pro with Lighthouse 2.0 for head tracking. Blender for editing and optimizing meshes. Leap Motion for hand tracking. Elegoo 4-wheeled Arduino robot with an HC-06 Bluetooth adapter. (A rough sketch of the robot's Bluetooth command loop appears after this list.)
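
For context on the robot side, here is a minimal Arduino-style sketch of how commands could be received over the HC-06 and turned into motor output. The pin numbers, wiring through SoftwareSerial, and the one-character command protocol are assumptions for illustration, not the protocol our system actually uses.

    // Minimal sketch for the Elegoo robot, assuming the HC-06 is on a
    // SoftwareSerial port and a common dual H-bridge motor driver.

    #include <SoftwareSerial.h>

    SoftwareSerial bt(10, 11);          // RX, TX pins for the HC-06 (assumed wiring)
    const int LEFT_PWM = 5, LEFT_DIR = 4, RIGHT_PWM = 6, RIGHT_DIR = 7;

    void drive(int leftSpeed, int rightSpeed) {
      digitalWrite(LEFT_DIR,  leftSpeed  >= 0 ? HIGH : LOW);
      digitalWrite(RIGHT_DIR, rightSpeed >= 0 ? HIGH : LOW);
      analogWrite(LEFT_PWM,  abs(leftSpeed));
      analogWrite(RIGHT_PWM, abs(rightSpeed));
    }

    void setup() {
      pinMode(LEFT_PWM, OUTPUT);  pinMode(LEFT_DIR, OUTPUT);
      pinMode(RIGHT_PWM, OUTPUT); pinMode(RIGHT_DIR, OUTPUT);
      bt.begin(9600);             // HC-06 default baud rate
    }

    void loop() {
      if (bt.available()) {
        char cmd = bt.read();     // one command character sent from the PC
        switch (cmd) {
          case 'F': drive(180, 180);   break;  // forward
          case 'B': drive(-180, -180); break;  // backward
          case 'L': drive(-120, 120);  break;  // turn left in place
          case 'R': drive(120, -120);  break;  // turn right in place
          default:  drive(0, 0);       break;  // stop on anything else
        }
      }
    }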

SUB-STUDY: "A Walk to the Park": Virtual Locomotion with Robot-Guided Haptics and Persistent Distractors

  • Goal: Extend current distraction-based redirected walking systems so that a robot tethered to the user can provide haptic feedback, making the approach feasible for training scenarios. (A rough sketch of how the robot and the redirection could be kept consistent appears after this list.)

  • Current progress: We succeeded, and now we need to run the full user study. Check out the pending preprint below.
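
One way to think about keeping the tethered robot consistent with redirected walking is to map the virtual dog's position back through the inverse of the accumulated redirection to find the physical point the robot should drive toward. The sketch below shows only that coordinate mapping, under the simplifying assumption that the redirection is a single yaw offset about the tracked-space origin; it is an illustration of the idea, not our actual controller.

    // Convert a virtual position to physical space by undoing the yaw the
    // redirection has applied to the virtual world. Illustrative only.

    #include <cmath>

    struct Vec2 { double x, y; };

    // worldYawOffsetRad is the extra rotation redirected walking has applied
    // to the virtual world; undoing it converts a virtual position back to
    // the physical tracked space, where the robot actually drives.
    Vec2 VirtualToPhysical(const Vec2& virtualPos, double worldYawOffsetRad) {
        double c = std::cos(-worldYawOffsetRad), s = std::sin(-worldYawOffsetRad);
        return { c * virtualPos.x - s * virtualPos.y,
                 s * virtualPos.x + c * virtualPos.y };
    }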

A_Walk_to_the_Park__Robot_Based_Active_Haptics_for_Virtual_Navigation_SIGGRAPH.pdf

[full talk will be uploaded here after SIGGRAPH '20 ends]

"Walk a Robot Dog in VR!" -- SIGGRAPH '20 Immersive Pavilion Demo

  • Goal: Let SIGGRAPH attendees try the robot-guided haptic system from "A Walk to the Park" by walking a (virtual and physical) robot dog in VR.

  • Current progress: The system works; the full user study is still to come. Check out the preprint below.

walktopark.pdf

SUB-STUDY: Creating a Simulated User for Virtual Locomotion Simulation

  • Goal: Create a synthetic user that can be used in simulations of redirected walking, motion compression, etc. to accurately simulate how a real user would behave. It is important to capture randomness at both the micro level (twitchiness) and the macro level (swaying, overshooting the target, etc.) in both movement and decision-making.

  • Current progress: We built two models of simulated users: one that computes the velocity needed to reach a target, and one that accelerates/decelerates toward the target instead. Their biomechanics are derived from real observed users (the audio and vision groups from the 2019 paper), and they then complete the same trials the real users did. The acceleration model works quite well, and it resembles the real users closely in many of the scenarios; the velocity model seems better at translation in some cases. We can mix ideas from each model to create an even better representation. (A rough sketch of the acceleration-based model appears after this list.)

  • Previous version: In our "Walk to the Park" paper, we successfully simulated a SYNTHETIC user good enough to produce results similar to those obtained from REAL users in Razzaque's fire drill scene, so we're getting there! See the middle part of the video for our simulation.
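
Here is a minimal 1D sketch of the acceleration-based simulated user: each step it accelerates toward the current target, brakes as it gets close, and adds small noise for micro-level twitchiness. The gains, speed limits, and noise levels are placeholders I chose for illustration, not the parameters fit from the 2019 study data.

    // Simple acceleration-based synthetic walker with micro noise.

    #include <cmath>
    #include <random>

    struct SimUser1D {
        double pos = 0.0, vel = 0.0;
        std::mt19937 rng{42};
        std::normal_distribution<double> twitch{0.0, 0.02}; // micro-scale noise

        void Step(double target, double dt) {
            double maxAccel = 0.8;                 // m/s^2, assumed
            double maxSpeed = 1.4;                 // m/s, typical walking speed
            // Accelerate toward the target, but brake early enough to stop
            // on it (bang-bang control with a stopping-distance check).
            double dist = target - pos;
            double stopDist = vel * vel / (2.0 * maxAccel);
            double accel = (std::abs(dist) > stopDist)
                               ? (dist > 0 ? maxAccel : -maxAccel)
                               : (vel > 0 ? -maxAccel : maxAccel);
            vel += accel * dt;
            if (vel >  maxSpeed) vel =  maxSpeed;
            if (vel < -maxSpeed) vel = -maxSpeed;
            pos += vel * dt + twitch(rng) * dt;    // macro motion + micro noise
        }
    };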

automated_user_studies_2column (1).pdf

Our Prior Work

Evaluating the Effectiveness of Redirected Walking with Auditory Distractors for Navigation in Virtual Environments (2019)

In this project, we show for the first time that RDW with distractors is suitable for audio-only users. We design a distractor that successfully steers users around obstacles and through different types of scenes with various tasks. We also run a three-group study showing that audio-only users' immersion and performance are not significantly worse than those of users with vision. In fact, the audio-only users did much better on a few metrics, and the distractor was less active for them. See the attached videos for more information.

Evaluating_the_Effectiveness_of_Redirected_Walking_with_Auditory_Distractors_for_Navigation_in_Virtual_Environments__Revision_ (2).pdf