DOI

10.17077/etd.c2ot-rd8t

Document Type

Dissertation

Date of Degree

Spring 2019

Access Restrictions

Access restricted until 07/29/2021

Degree Name

PhD (Doctor of Philosophy)

Degree In

Computer Science

First Advisor

Kearney, Joseph

First Committee Member

Plumert, Jodie M.

Second Committee Member

Thomas, Geb W.

Third Committee Member

Hourcade, Juan Pablo

Fourth Committee Member

Chipara, Octav

Abstract

A fundamental difference between interaction in real-world and virtual environments is that natural interactions with the physical world are temporally synchronous, whereas interactions in a virtual environment always involve latency due to the time it takes to synthetically simulate cause-and-effect reactions. This inevitable delay in VR systems may impair user performance by creating a discrepancy between sensory systems. In contrast, we perceive our interactions with the real world as seamless, coherent, and integrated, despite a delay of more than a hundred milliseconds between our visual and motor command systems; our multisensory system is well adapted to such cross-sensory temporal misalignments. This raises questions about how we perform in the presence of visuomotor latency and whether we can adapt to it.

Whether we can adapt to visuomotor delay remains controversial. Some preliminary studies concluded that users cannot adapt to visuomotor delay: instead, users change their strategy to act-and-wait to mitigate the effect of latency and consequently do not adapt to it. In contrast, newer studies have shown that users are quite capable of adapting to visuomotor latency. However, while many studies have focused on visuomotor delay adaptation in ballistic movements (e.g., pushing a button or reaching for a stationary object), only a few have explored visuomotor delay adaptation in continuous tasks (e.g., riding a bike or driving a vehicle). These studies suggest that users adapt to visuomotor latency when a constant pace of interaction is imposed.

We examined the effect of visuomotor latency on user steering performance in a continuous task that involved riding a bike in a virtual environment. In a preliminary experiment, we investigated three factors involved in user performance: the pace of the interaction (i.e., the speed of the bike), the amount of visuomotor latency, and the complexity of the task (i.e., the turning angle). Our preliminary results indicated that users adapted to different constant speeds and different levels of latency very quickly. In addition, we found that the size of the turn angle greatly affected user performance. In the main experiment, we examined how users adapt to visuomotor latency in two conditions: 1) when the bike traveled at a constant speed, and 2) when the user controlled the speed of the bike by pedaling and braking. We found that users adapted rapidly to the imposed visuomotor latency and readapted when the latency was removed. In addition, we showed that users performed better when they had control over the speed of the bike. Users adjusted the speed of the bike to the complexity of the path, slowing down as they approached turning points and speeding up once they passed them. Finally, we found that users gradually increased their speed as they adapted to the level of latency and gained better control. We hypothesize that this represents a process of balancing speed and error dependent on controllability, which we call “error homeostasis.” These findings support the idea that users can adapt to visuomotor latency during a steering task in a virtual environment. In addition, our study suggests that visuomotor latency adaptation can occur whether or not users control the speed of self-movement.

Pages

xi, 68 pages

Bibliography

Includes bibliographical references (pages 65-68).

Copyright

Copyright © 2019 Pooya Rahimain

Available for download on Thursday, July 29, 2021