NIH Research Festival
At the Advanced Visualization Branch of the National Institute of Nursing Research, we use immersive virtual reality (IVR) as a technical platform to study how people carry out instrumental activities of daily living, such as grocery shopping. We conducted a pilot test to evaluate the performance of eye-tracking technology within our IVR grocery store and to validate what study participants view on a nutrition label. Our eye-tracking system was built on commercial software modified for the purposes of this test. Lab staff and associates (n=8) served as testers and entered a custom IVR experience in which they viewed a virtual nutrition label. We created digital boundaries around each nutrient value on the label so that the eye-tracking system could detect eye-gaze locations. Testers then looked at different nutrients, and the eye-tracker's performance was assessed by its rate of detection of the expected nutrient value. Overall, eye-tracking detected the true eye-gaze location in 98.5% of observations, but the average duration of detection varied by nutrient. These preliminary results indicate that our eye-tracking system can detect the location of eye-gaze, though whether this detection is consistent and accurate enough to be reliable for our future study remains an open question. The difference in performance between testers with and without prescription eyewear also warrants further exploration. Our preliminary data provide enough information to refine the design process and to further explore the performance of eye-tracking in our IVR environments.
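The abstract's detection metric can be illustrated with a minimal sketch: gaze samples tested against rectangular digital boundaries (areas of interest) drawn around each nutrient value, with the detection rate computed as the fraction of samples landing in the expected boundary. This is an assumption-laden illustration, not the actual system: the `AOI` class, the 2D coordinate model, and the `detection_rate` function are all hypothetical names invented here, and the real IVR system works in 3D with commercial software.

```python
from dataclasses import dataclass

@dataclass
class AOI:
    """Hypothetical axis-aligned digital boundary around one nutrient value."""
    name: str
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        # True if a gaze sample falls inside this boundary.
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max


def detection_rate(samples, aois, expected):
    """Fraction of gaze samples (x, y) that land inside the AOI named
    `expected` -- the nutrient the tester was instructed to look at."""
    if not samples:
        return 0.0
    hits = sum(
        1
        for (x, y) in samples
        if any(a.name == expected and a.contains(x, y) for a in aois)
    )
    return hits / len(samples)


# Illustrative use with made-up coordinates: two nutrient boundaries,
# four gaze samples while the tester is asked to look at "sodium".
aois = [AOI("sodium", 0.0, 0.0, 1.0, 1.0), AOI("sugar", 2.0, 0.0, 3.0, 1.0)]
samples = [(0.5, 0.5), (0.9, 0.2), (2.5, 0.5), (0.1, 0.1)]
rate = detection_rate(samples, aois, "sodium")  # 3 of 4 samples hit -> 0.75
```

A per-nutrient breakdown of dwell duration (how long gaze stays inside a boundary) would follow the same pattern, which is where the abstract's observed nutrient-to-nutrient variation would surface.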
Scientific Focus Area: Social and Behavioral Sciences
This page was last updated on Monday, September 25, 2023