Shared gaze in co-present XR affects embodied information behavior

Some of the real-world impact created at xREZ this year: the Innovation Research Methods class, taught by xREZ Director Ruth West in our lab, resulted in a paper published in Imaging Journal and presented at the Engineering Reality of Virtual Reality 2022 conference. The grad students also collaborated with renowned embodied information behavior experts Dr. Christopher Lueg and Dr. Michael Twidale from the University of Illinois Urbana-Champaign iSchool.

Our interdisciplinary classes bring together students from design, computer science, information science, anthropology, and more to gain hands-on research experience and create real-world impact. This course enables students to take their work all the way through the peer-reviewed publication process. We’re excited to be hosting these courses in the lab and can’t wait to see what our future grads will accomplish!

Title: A state of the art and scoping review of embodied information behavior in shared, co-present extended reality experiences


Kathryn Hays¹, Arturo Barrera¹, Lydia Ogbadu-Oladapo¹, Olumuyiwa Oyedare¹, Julia Payne¹, Mohotarema Rashid¹, Jennifer Stanley¹, Lisa Stocker¹, Christopher Lueg², Michael Twidale², Ruth West¹

¹ University of North Texas, Denton, Texas, USA; ² University of Illinois Urbana-Champaign, Champaign, Illinois, USA


We present a state of the art and scoping review of the literature to examine embodied information behaviors, as reflected in shared gaze interactions, within co-present extended reality experiences. The recent proliferation of consumer-grade head-mounted XR displays, situated at multiple points along the Reality-Virtuality Continuum, has increased their application in social, collaborative, and analytical scenarios that utilize data and information at multiple scales. Shared gaze represents a modality for synchronous interaction in these scenarios, yet there is a lack of understanding of the implementation of shared eye gaze within co-present extended reality contexts. We use gaze behaviors as a proxy to examine embodied information behaviors. This review examines the application of eye tracking technology to facilitate interaction in multiuser XR by sharing a user’s gaze, identifies salient themes within existing research in this context since 2013, and identifies patterns within these themes relevant to embodied information behavior in XR. We review a corpus of 50 research papers investigating the application of shared gaze and gaze tracking in XR, generated using the SALSA framework and searches in multiple databases. The publications were reviewed for study characteristics, technology types, use scenarios, and task types. We construct a state of the field and highlight opportunities for innovation and challenges for future research directions.


xREZ Art + Science Lab

Imagine. Discover. Create.
xREZ Art+Science Lab forges productive paths that harness the unique creative forces found only at the intersection of the arts, sciences, and humanities to open new portals of imagination, knowledge, and communication. Our goal: alter the landscape of human endeavor to create transformative impact on our world and our future.
