Jacobs, O., Anderson, N., Bischof, W. F., & Kingstone, A. (2020). Into the unknown: head-based selection is less dependent on peripheral information than gaze-based selection in 360-degree virtual reality scenes. PsyArXiv.
People naturally move both their head and eyes to attend to information. Yet little is known about how the head and eyes coordinate in attentional selection, owing to the relative scarcity of past work that has measured head and gaze behaviour simultaneously. In the present study, participants viewed fully immersive 360-degree scenes using a virtual reality headset with built-in eye tracking. Participants viewed these scenes through a small moving window that was yoked either to their head movements or to their gaze. We found that limiting peripheral information via the head- or gaze-contingent windows affected head and gaze movements differently. Compared with free viewing, gaze-contingent viewing was more disruptive than head-contingent viewing, indicating that gaze-based selection relies more on peripheral information than head-based selection does. These data dovetail with the nested effectors hypothesis, which proposes that people prefer to use their head to explore non-visible space while using their eyes to exploit visible or semi-visible areas of space. This suggests that real-world orienting may be more head-based than previously thought. Our work also highlights the utility, ecological validity, and future potential of unconstrained head and eye tracking in virtual reality.