Abstract

Author: Niklas Hypki

Following rapid technological developments in recent years, head-mounted displays (HMDs) can now simultaneously track the movement trajectories of the head, feet, hands, joints and even our eyes. The ability to rearrange and manipulate virtual stimuli based on our behaviour makes virtual reality (VR) a potentially interesting method for experiments in psychophysics. This thesis presents three empirical, peer-reviewed studies exploring this approach.

To obtain a fair assessment of the latency of eye tracking sensors in HMDs, we developed a method to measure it using simultaneous electrooculography (EOG) recording. Delays (the time from an eye movement to the availability of the corresponding data) ranged from 15 ms to 52 ms, while end-to-end latencies (the time from an eye movement to a corresponding change on the display) ranged from 45 ms to 81 ms.
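For illustration, such a latency estimate could be obtained by cross-correlating the EOG trace with the HMD's gaze signal and reading off the lag of the correlation peak. The sketch below assumes both signals have been resampled to a common rate and time-synchronised at recording onset; the function name and the synthetic example are illustrative, not the implementation used in the thesis.

```python
import numpy as np

def estimate_latency_ms(eog, tracker, fs):
    """Estimate sensor delay by cross-correlating an EOG trace with the
    HMD eye tracker's horizontal gaze signal (both sampled at fs Hz).
    Returns the lag (in ms) at which the tracker best matches the EOG."""
    # Normalise both signals so the correlation reflects shape, not amplitude
    eog = (eog - eog.mean()) / eog.std()
    tracker = (tracker - tracker.mean()) / tracker.std()
    # Full cross-correlation; a positive lag means the tracker lags the EOG
    corr = np.correlate(tracker, eog, mode="full")
    lags = np.arange(-len(eog) + 1, len(tracker))
    return 1000.0 * lags[np.argmax(corr)] / fs

# Synthetic example: a saccade-like step, with the tracker delayed by 30 ms
fs = 1000  # Hz
t = np.arange(0, 1.0, 1 / fs)
eog = (t > 0.5).astype(float)
tracker = (t > 0.53).astype(float)  # same step, 30 ms later
print(estimate_latency_ms(eog, tracker, fs))  # ~30.0
```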

Based on HMD gaze data, we then predicted future waypoints. To achieve this, 18 participants performed tasks such as walking along curved paths, avoiding or approaching objects, and searching, while we recorded their position, orientation and eye tracking data. Segments of 2.5 s of data were used to train a long short-term memory (LSTM) model that predicts the position the user reaches 2.5 s later. The prediction was fairly accurate, with an average error of 66 cm. The extent to which eye movement data improved the model varied with task and environment. Overall, eye movements enabled more accurate prediction of locomotion behaviour, especially in situations with varying walking speeds.
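As an illustration of this modelling setup, the following minimal sketch maps a 2.5 s window of per-frame tracking features to a future floor position. The frame rate (90 Hz), feature count, and network size are assumptions made for the example; they are not the architecture reported in the thesis.

```python
import torch
import torch.nn as nn

class WaypointLSTM(nn.Module):
    """Illustrative model: a 2.5 s window of per-frame tracking features
    (e.g. position, head orientation, gaze direction) -> position 2.5 s ahead."""
    def __init__(self, n_features=9, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2)  # predicted (x, z) floor position

    def forward(self, x):
        # x: (batch, time, n_features); use the final hidden state
        _, (h_n, _) = self.lstm(x)
        return self.head(h_n[-1])

# Assumed 90 Hz tracking: a 2.5 s window is 225 frames
model = WaypointLSTM()
window = torch.randn(32, 225, 9)  # batch of 32 input segments
pred = model(window)              # (32, 2) predicted future positions
```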

Finally, we analysed the eye and head movements of 68 participants as they searched for salient and less salient targets in spatially grouped sets inside and beyond the initial field of view (FOV). Salient targets within the initial FOV facilitated search. Salient stimuli that appeared to ‘pop out’ from the periphery at the beginning of a search, however, no longer exhibited this property once a head movement brought them into the FOV during an ongoing search. This suggests that the influence of stimulus salience differs between search on a static display and scenarios in which observers can explore their surroundings with head turns.
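For illustration, classifying whether a target lies inside the current FOV reduces to comparing the angle between the head's forward vector and the direction to the target against half the horizontal FOV. The threshold below is an assumed value; actual HMD FOVs vary.

```python
import numpy as np

def in_fov(head_forward, target_dir, half_fov_deg=55.0):
    """True if target_dir lies within an assumed symmetric FOV around
    head_forward; both arguments are 3D unit vectors in world space."""
    cos_angle = np.clip(np.dot(head_forward, target_dir), -1.0, 1.0)
    return np.degrees(np.arccos(cos_angle)) <= half_fov_deg

# A target 90 degrees to the right of the head's forward direction
print(in_fov(np.array([0.0, 0.0, 1.0]), np.array([1.0, 0.0, 0.0])))  # False
```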

The general discussion addresses current limitations of VR and presents approaches for reducing eye tracking sensor latency in future devices, followed by ideas for improving locomotion prediction. Finally, gaze behaviour is discussed in relation to the typical time course of top-down and bottom-up guidance.

Altogether, this thesis shows that VR can be used as a method in psychophysics. Our results demonstrate that experiments in virtual environments (VEs) have the potential to contribute both to our understanding of perception and to the advancement of VR.

Keywords

Visual Perception, Virtual Reality, Eye Movements, Eye Tracking, Gaze, Head Movements, Visual Search, Locomotion, Prediction, Salience