Assumptions about the positioning of virtual stimuli affect gaze direction estimates during Augmented Reality-based interactions.

Nicola Binetti, Tsu-Jui Cheng, I Mareschal, Duncan Brumby, S Julier, N Bianchi-Berthouze
Journal article in Sci Rep


We investigated gaze direction determination in dyadic interactions mediated by an Augmented Reality (AR) head-mounted display. In AR, virtual content is overlaid on the real-world scene, offering unique data visualization and interaction opportunities. A drawback of AR, however, is the uncertainty it introduces about the AR user's focus of attention in social-collaborative settings: an AR user looking in our direction might be paying attention either to us or to augmentations positioned somewhere in between. In two psychophysical experiments, we assessed how assumptions concerning the positioning of virtual content attended by an AR user affect other people's sensitivity to the user's gaze direction. In the first experiment, we found that gaze discrimination was better when participants were aware that the AR user was focusing on stimuli positioned on their own depth plane, as opposed to stimuli positioned halfway between the AR user and the participant. In the second experiment, we found that this modulatory effect was explained by participants' assumptions about which plane the AR user was focusing on, irrespective of whether those assumptions were correct. We discuss the significance of reduced gaze discrimination in AR-mediated social-collaborative settings, as well as theoretical implications regarding the impact of this technology on social behaviour.