Duke University Presents EyeSyn – The “Virtual Eyes” That Mimic Human Eye Movement

The metaverse promises to be the next big thing after the mobile internet revolution. Connecting people is not enough – ambitious companies want to bring them together in a virtual world that perfectly replicates the real one. Avatars will interact with each other, do business and have fun just like in the physical world.

Getting there requires a steady supply of datasets to train AI and to develop metaverse platforms. Duke University has just supplied an important piece of the puzzle: a team of its computer engineers recently developed EyeSyn, “virtual eyes” that mimic the movement and focus of real human eyes.

Why Focus on Developing Virtual Eyes?

The development of the metaverse depends on how users interact with the virtual world and what they focus on. Human eye movement is the most relevant measure of what users find interesting, intriguing, and worth looking at. The eyes reveal more about a person’s feelings, interests, preferences, and biases than any other form of non-verbal communication.

“If you’re interested in detecting whether a person is reading a comic book or advanced literature by looking at their eyes alone, you can do that,” said Maria Gorlatova, the Nortel Networks Assistant Professor of Electrical and Computer Engineering at Duke University, explaining the concept behind the EyeSyn program. “Where you’re prioritizing your vision says a lot about you as a person, too. It can inadvertently reveal sexual and racial biases, interests that we don’t want others to know about, and information that we may not even know about ourselves.”

The Importance of Eye Movement in Developing the Metaverse

The effort put into EyeSyn, a platform that simulates human eye movement, is well justified. By learning where users look, AR and VR content creators and developers can:

  • Serve them tailored content;
  • Reduce peripheral vision resolution to save computing power – a technique commonly known as foveated rendering (see the sketch after this list);
  • Allow users to customize their interactions with virtual assets according to their preferences.

Image: Overview of EyeSyn. Source: “EyeSyn: Psychology-inspired Eye Movement Synthesis for Gaze-based Activity Recognition”
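
The second bullet describes foveated rendering: rendering at full resolution only around the point the user is actually looking at and degrading quality towards the periphery. As a rough illustration only – the function name, thresholds, and falloff below are invented for this sketch and are not part of EyeSyn – a renderer could scale resolution by distance from the gaze point:

```python
# Illustrative sketch of gaze-driven foveated rendering (not EyeSyn code).
# All names and thresholds here are assumptions made for the example.
import math

def resolution_scale(pixel, gaze, full_res_radius=100, falloff=400):
    """Return a render-resolution scale (1.0 = full) for a pixel,
    based on its distance in pixels from the current gaze point."""
    dist = math.hypot(pixel[0] - gaze[0], pixel[1] - gaze[1])
    if dist <= full_res_radius:
        return 1.0  # foveal region: keep full resolution
    # Linearly reduce resolution out to the falloff distance, floored at 25%.
    return max(0.25, 1.0 - (dist - full_res_radius) / falloff)

# Example: gaze fixed near the centre of a 1920x1080 frame.
gaze_point = (960, 540)
for px in [(960, 540), (1200, 600), (1900, 1000)]:
    print(px, round(resolution_scale(px, gaze_point), 2))
```

The better the eye-movement data used to build and test such a system, the more aggressively the periphery can be degraded without the user noticing.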

With the virtual eyes developed by the team of researchers at Duke University, companies that work on building the metaverse can train their AR and VR platforms and software without having to access real users’ eye-tracking data.

EyeSyn Virtual Eyes Open Opportunities for Smaller Companies

At the same time, the virtual eyes give smaller content creators access to valuable eye-tracking data without the expense of running tests with real-life users.

“We wanted to develop software that … allows smaller companies who don’t have those levels of resources to get into the metaverse game,” said Gorlatova. “Smaller companies can use it rather than spending the time and money of trying to build their own real-world datasets (with human subjects).”

How Accurate Are the Virtual Eyes?

The Duke University team tested EyeSyn on videos of Dr. Anthony Fauci during press conferences and compared the results with the actual eye movements of human viewers. The results indicate a close match between the focus of the virtual eyes and that of real people.

“The synthetic data alone isn’t perfect, but it’s a good starting point,” Gorlatova said. “If you give EyeSyn a lot of different inputs and run it enough times, you’ll create a data set of synthetic eye movements that is large enough to train a (machine learning) classifier for a new program.”
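
Gorlatova’s point can be made concrete with a minimal sketch of how synthetic gaze data might feed a classifier. Assume, purely for illustration, that a generator like EyeSyn yields gaze traces as arrays of (x, y) samples and that the task is distinguishing “reading” from “video watching”; the feature choices, labels, and fake-trace generator below are assumptions made for this example, not EyeSyn’s actual pipeline.

```python
# Minimal sketch: training an activity classifier on synthetic gaze traces.
# Trace format, features, labels, and the fake-trace generator are all
# illustrative assumptions, not part of EyeSyn.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def gaze_features(trace):
    """Summarise a gaze trace (N x 2 array of x, y samples) with simple statistics."""
    steps = np.diff(trace, axis=0)               # sample-to-sample gaze shifts
    amplitudes = np.linalg.norm(steps, axis=1)   # saccade-like movement sizes
    return [amplitudes.mean(), amplitudes.std(),
            trace[:, 0].std(), trace[:, 1].std()]

# Stand-in for the synthetic traces a generator such as EyeSyn would provide.
rng = np.random.default_rng(0)
def fake_trace(horizontal_bias):
    jitter = rng.normal(0, 5, size=(300, 2))
    drift = np.cumsum(rng.normal([horizontal_bias, 0.0], 1.0, size=(300, 2)), axis=0)
    return jitter + drift

X = [gaze_features(fake_trace(b)) for b in [2.0] * 50 + [0.0] * 50]
y = ["reading"] * 50 + ["video"] * 50

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X, y)
print(clf.predict([gaze_features(fake_trace(2.0))]))  # expected: ['reading']
```

With many varied synthetic traces in place of the toy generator above, the same training loop scales to the kind of classifier Gorlatova describes, without ever recording a real user.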

Using EyeSyn Addresses Privacy Concerns

Companies have yet another reason to turn to the virtual eyes developed at Duke University. Recording real people’s eye movements means collecting personal data – currently a very sensitive issue.

The virtual eyes do not belong to any real person, so creating and using datasets in this manner involves no risk of a data privacy breach.

The full report on EyeSyn virtual eyes will be presented by the research team at the International Conference on Information Processing in Sensor Networks, which will take place virtually on May 4-6.
