
Eye tracking in VR: Everything you need to know

Advancements in eye tracking technology could make VR headsets more useful. Still, products need to address cybersickness and other factors that affect the user experience.

Eye tracking technology uses special sensors to follow what our eyes look at. Various flavors of the technology have been around for decades in medical and behavioral research, workplace safety, interface design studies, automotive attention tracking and computer gaming interfaces. HTC first incorporated eye tracking into its VR headset in 2017, and Apple included the tech in the Vision Pro spatial computing platform that debuted in early 2024. Other headset vendors, including Sony, HP, Pico and Pimax, are in the preliminary stages of adopting dedicated eye tracking hardware and chips from Tobii, an eye tracking pioneer known particularly for its work in computer gaming.

Innovation in eye tracking in virtual reality (VR) could result in improved graphics, new user experiences, better research and more adaptive training programs. It also raises new privacy concerns over its capabilities to track attention more precisely and to gather in-depth medical information.

How does eye tracking in VR work?

Eye movements are among the fastest and most delicate movements of the human body. When we look out at the world, we normally perceive a relatively static scene. But beneath the surface, the eyes are in constant motion: slow, smooth pursuit movements; jumpy saccades; and delicate microsaccades and tremors that flit across the textures of things many times per second. Eye tracking tools that monitor these movements can help identify what we are looking at and what is happening within our minds and bodies.
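
The distinction between fixations and saccades is what most eye trackers classify first. A minimal sketch of the common velocity-threshold approach (often called I-VT) is below; the 30 deg/s threshold and the sample data are illustrative assumptions, not any vendor's actual values.

```python
# Velocity-threshold (I-VT) classification: intervals where gaze moves
# faster than a threshold are labeled saccades, the rest fixations.

def classify_gaze(samples, threshold_deg_per_s=30.0):
    """samples: list of (time_s, x_deg, y_deg) gaze points.
    Returns a label per inter-sample interval: 'fixation' or 'saccade'."""
    labels = []
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        dt = t1 - t0
        # Angular distance travelled between consecutive samples
        dist = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
        velocity = dist / dt if dt > 0 else 0.0
        labels.append("saccade" if velocity > threshold_deg_per_s else "fixation")
    return labels

# A steady fixation followed by a rapid 5-degree jump, sampled at 500 Hz
samples = [(0.000, 0.0, 0.0), (0.002, 0.01, 0.0), (0.004, 5.0, 0.0)]
print(classify_gaze(samples))  # ['fixation', 'saccade']
```

Real pipelines add smoothing and minimum-duration rules, but the core idea is this velocity cut.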

As far back as ancient Greece, Plato opined in Timaeus that seeing is an active process in which we grip objects in the world with our attention. These days, researchers are more interested in what we are looking at, the mechanics of looking, and what it might mean inside the body and our minds.

Early eye tracking technologies employed complicated and invasive techniques using uncomfortable magnetic contact lenses or sensors that listened to our eye muscles firing. These days, most eye trackers bounce infrared light off the eye and track reflections using multiple cameras in a process called video oculography.

The performance of these trackers is measured by sampling rate in hertz (samples per second), while fidelity is measured in degrees of resolution and accuracy. Resolution is the smallest change in gaze position that's detectable, while accuracy indicates how closely the reported gaze point corresponds to where we are actually looking. At the high end, devices like SR Research's EyeLink II support 500 Hz, 0.01 degrees of resolution and 0.5 degrees of accuracy. Such equipment is relatively expensive, however, and at present too large for use in headsets.
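
To get a feel for what these angular figures mean in practice, a back-of-the-envelope conversion from angular accuracy to positional error at a given viewing distance is sketched below; the distances chosen are illustrative.

```python
import math

def gaze_error_mm(accuracy_deg, distance_mm):
    """Worst-case offset on a surface at distance_mm
    for a given angular accuracy in degrees."""
    return distance_mm * math.tan(math.radians(accuracy_deg))

# 0.5 degrees of accuracy, looking at a surface 2 m away
print(round(gaze_error_mm(0.5, 2000), 1))  # ~17.5 mm of positional error
```

So even a high-end 0.5-degree tracker can only localize gaze to within a couple of centimeters at arm's-length-plus distances, which is why UI targets for gaze input tend to be generously sized.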

Most virtual reality and augmented reality products outside Apple use Tobii's eye tracking chips. Tobii does not report its VR headset chip specs, but, for comparison, its latest head-mounted glasses for researchers support up to 100 Hz and 0.6 degrees of accuracy.

New user experiences

For now, the top use case of eye tracking is dynamic foveated rendering, which concentrates image quality on the small area you see with the fovea, the most sensitive part of the retina. The rest of the retina has a far lower density of the cone cells responsible for sharp, detailed vision and mostly supports peripheral vision.

Innovations in eye tracking could also usher in a new user interface paradigm as transformational as innovations like the mouse and touchscreens were in their day. Zhiwei Zhu, principal scientist in SRI's computer vision technologies lab, said that an eye tracking system with a frequency higher than 200 Hz and an accuracy of approximately 0.5 degrees is sufficient to support new VR experiences for most users with normal eyesight. Apple already uses the tech in Vision Pro, letting you gaze at virtual objects and tap your fingers together to interact with the virtual world.

Thomas Dexmier, associate vice president of business development and enterprise solutions at HTC VIVE, said new user experiences depend on the use case. VIVE's new eye and head tracker can attain 120 Hz, a sweet spot to support current and near-future use cases. For example, a lot of today's consumer research focuses on relatively close-up actions, such as looking at how shelves are stacked, how clothes are laid out or how someone behaves in front of an audience they address virtually. Even examining complex 3D models is done at a short distance. "This means there doesn't need to be the ability to track millimeter precision from ten meters away," Dexmier said.

Dexmier sees the biggest obstacle to creating new experiences as simply making the technology available to designers and researchers. "We've found that once they see what's possible, they embrace it quickly," he said.

On a technical level, developers need access to extensive SDKs to build future eye tracking-based applications. It's also important to simplify the landscape of 3D content and eye tracking file and data formats so developers can create new experiences more easily.
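
One interaction pattern such SDKs commonly expose is dwell-based selection: an object counts as selected once the gaze rests on it long enough. The sketch below is hypothetical; the function names, the 0.5-second dwell threshold and the per-frame sampling interval are illustrative assumptions, not any vendor's actual API.

```python
# Dwell-based gaze selection: return the first object the user
# fixates continuously for at least dwell_s seconds.

def dwell_select(gaze_targets, dwell_s=0.5, sample_dt=0.1):
    """gaze_targets: per-frame sequence of object ids (or None when the
    gaze hits nothing). sample_dt is the time between frames in seconds."""
    current, held = None, 0.0
    for target in gaze_targets:
        if target is not None and target == current:
            held += sample_dt            # gaze still on the same object
            if held >= dwell_s:
                return target
        else:
            current = target             # gaze moved; restart the timer
            held = sample_dt if target is not None else 0.0
    return None

print(dwell_select(["menu"] * 5))                    # menu
print(dwell_select(["a", None, "a", None, "a"]))     # None
```

Dwell thresholds are a design trade-off: too short and users trigger actions just by looking around (the classic "Midas touch" problem); too long and the interface feels sluggish, which is one reason Vision Pro pairs gaze with a finger tap instead of dwell alone.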

Further improvements may also allow you to cut and paste text with your eyes and eliminate the need for a mouse. But Jared Timmins, vice president of innovation for media solutions at Diversified, an IT services and consulting firm, believes it may take a few more iterations in eye tracking tech to catch up with the fluidity of traditional mouse and touch interactions. "While these advancements are impressive, further progress in areas such as improving accuracy and precision across a wider field of view, reducing cost and enhancing user comfort are important developments still to come," he said.

Obstacles in eye tracking

For broader adoption of eye tracking in new user experiences, research and training, several obstacles must be addressed.

Zhu points to the need for low latency eye tracking systems capable of operating effectively in diverse lighting conditions and varied application environments. Also, users wearing eyeglasses or with different eye anatomies will need to be accommodated. On the UI side, the primary challenge is to design interfaces that facilitate natural and intuitive interactions, utilizing eye movements without causing discomfort or fatigue. These common disorienting effects are typically described as cybersickness or VR sickness.
Adam Gross, CEO and co-founder at HarmonEyes and RightEye, believes more work will be required to improve the interoperability of eye tracking models and applications across eye tracking capable devices. "Currently, manual validation and testing is required to transfer and generalize eye tracking models and apps to any new or different eye tracking device," he said.

His two companies are developing eye tracking data aggregation and translation tools that use AI and machine learning to correlate results from different trackers with various models for tracking attention, assessing the impacts of concussions, and improving sports performance and training programs.
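
A translation layer of the kind Gross describes typically starts by normalizing vendor-specific sample formats into one common schema before any shared model sees the data. The sketch below is illustrative; the vendor names, field names and unit conventions are assumptions, not real device formats.

```python
import math

def normalize_sample(vendor, raw):
    """Map a vendor-specific gaze sample dict to a common schema:
    {'t': seconds, 'x': degrees, 'y': degrees}."""
    if vendor == "tracker_a":   # hypothetical: reports milliseconds, degrees
        return {"t": raw["ts_ms"] / 1000.0, "x": raw["gx"], "y": raw["gy"]}
    if vendor == "tracker_b":   # hypothetical: reports seconds, radians
        return {"t": raw["time"],
                "x": math.degrees(raw["yaw"]),
                "y": math.degrees(raw["pitch"])}
    raise ValueError(f"unknown vendor: {vendor}")

print(normalize_sample("tracker_a", {"ts_ms": 1500, "gx": 1.0, "gy": -2.0}))
```

The hard part in practice is not the unit conversion but validating that models trained on one device's noise and sampling characteristics still generalize after translation, which is the manual testing burden Gross points to.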

It's important to address the privacy and ethics implications of new eye tracking tools. Consumers already cautious about how their clicks and likes are used may be even more concerned about platforms that track what they look at in real and virtual worlds. Businesses and customers will need to have guardrails and measures in place to address those concerns.

"Much like the concerns surrounding the use of AI, robust data protection practices and top-down governing policies will be the required due diligence for all companies in which eye tracking is employed," Timmins said.

Future of eye tracking technology in VR

Cybersickness and fatigue are two of the factors that have so far hampered device adoption. Researchers are still trying to understand what causes cybersickness, which varies from person to person. Products that are able to mitigate some of these problems could lead users to comfortably extend their time on a device.

Better eye tracking could also broaden VR's medical use cases, such as with remote diagnostics for conditions such as Parkinson's disease, Lyme disease, mild cognitive impairment and Alzheimer's disease. Drug development, clinical trial services and telehealth could also benefit from eye tracking.

Other short-term use cases could include enhanced user experiences and improved performance: adaptive gaming based on an individual's state; improved skill acquisition during training tasks; reading co-pilots that enhance student learning; visual search features that confirm a user is looking at the right place at the right time; and remote patient monitoring, Gross said.

Dexmier said one of the most interesting enterprise applications of VR is improved design. "When showing design concepts to stakeholders in VR, designers can gain valuable feedback by analyzing where the subject's gaze and attention is drawn to most," he said. Not only can this help validate areas of a design or 3D model prototype, but it can also highlight areas with less focus, making iteration even faster.

Zhu predicts the first big opportunity will be using eye gaze as an alternative input for controlling the device, alongside improving graphics through foveated rendering and driving interaction with avatars. Monitoring users' cognitive load and where they look could also be useful for targeted ads.

George Lawton is a journalist based in London. Over the last 30 years, he has written more than 3,000 stories about computers, communications, knowledge management, business, health and other areas that interest him.

