Look me in the eyes, avatar: Social interaction in the metaverse

By the Viewpointsystem editorial team

The metaverse may not be much more than a vision today, but it is one of the most exciting and promising future tech projects of our time. The impact of this 3D virtual space on the way we work, communicate and experience things together is likely to be enormous. Many companies are committed to shaping the metaverse, and so are we: eye tracking and, in particular, our Digital Iris technology make it possible to emulate authentic human gaze behavior on avatars.

Why is eye contact so important to our social interaction, whether in real life or in the metaverse? We talked to Alejandro Gloriani, R&D Senior Developer in the Advanced Technology Team at Viewpointsystem, about this.

ALEJANDRO, WHAT ARE THE ADVANTAGES OF EYE TRACKING TECHNOLOGY WHEN IT COMES TO BUILDING THE METAVERSE?  

First of all, we see the metaverse as a virtual space, or many interconnected spaces, where you can work, create and explore with other people who are not in the same physical space as you. This means that social interactions, especially person-to-person ones, will play a key role.

Eye tracking, the process of tracking a person’s eye movements or absolute point of gaze, enables a more realistic and engaging virtual person-to-person interaction. The technology developed by Viewpointsystem makes it possible to create digital twins of the user’s eyes, emulating the user’s real eye movements on their avatar instead of presenting fixed, unmoving eyes. Avatars without natural eye movements will always seem strange or even creepy to us. Eye tracking offers a way to avoid this so-called “uncanny valley” phenomenon, and to mitigate or even completely eliminate people’s negative reaction to otherwise lifelike avatars.
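Viewpointsystem’s Digital Iris pipeline itself is proprietary, but the general idea of mirroring tracked gaze onto an avatar can be sketched. The class below is a hypothetical illustration only: it lightly smooths small gaze changes (fixational jitter from the tracker) while following large jumps (saccades) immediately, so the avatar’s eyes look alive rather than noisy or sluggish. The class name, thresholds and smoothing factor are assumptions, not part of the actual product.

```python
class AvatarEyeDriver:
    """Drive an avatar's eye rotation (yaw/pitch, in degrees) from
    tracked gaze samples.

    Hypothetical sketch: smooth small changes to suppress tracker
    noise, but follow large changes (saccades) without delay.
    """

    def __init__(self, smoothing=0.3, saccade_threshold_deg=2.0):
        self.smoothing = smoothing                    # low-pass factor for fixations
        self.saccade_threshold = saccade_threshold_deg
        self.yaw = 0.0
        self.pitch = 0.0

    def update(self, measured_yaw, measured_pitch):
        """Return the avatar eye angles for the latest gaze sample."""
        dy = measured_yaw - self.yaw
        dp = measured_pitch - self.pitch
        if max(abs(dy), abs(dp)) > self.saccade_threshold:
            # Large jump: treat as a saccade and follow it immediately.
            self.yaw, self.pitch = measured_yaw, measured_pitch
        else:
            # Small change: smooth to filter out sensor jitter.
            self.yaw += self.smoothing * dy
            self.pitch += self.smoothing * dp
        return self.yaw, self.pitch
```

For example, a sudden 10° jump is passed through unchanged, while a subsequent 0.5° wobble is damped toward the previous angle.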

In addition, with a new approach developed by Viewpointsystem, it is possible to detect eye contact between two avatars and to automatically trigger an action – such as voice interaction, opening the chat box, or the option to leave your strictest privacy level/safety bubble.
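In principle, such eye-contact detection can work along geometric lines: each avatar’s gaze ray is tested against the other avatar’s head position within a small tolerance cone, and mutual eye contact holds when both tests pass. This is a minimal sketch of that idea, not Viewpointsystem’s actual method; the function names and the 5° tolerance are illustrative assumptions.

```python
import math

def normalize(v):
    """Scale a 3D vector to unit length."""
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def looks_at(eye_pos, gaze_dir, target_pos, max_angle_deg=5.0):
    """True if the gaze ray from eye_pos points at target_pos
    within a tolerance cone of max_angle_deg."""
    to_target = normalize(tuple(t - e for t, e in zip(target_pos, eye_pos)))
    gaze = normalize(gaze_dir)
    cos_angle = sum(a * b for a, b in zip(gaze, to_target))
    return cos_angle >= math.cos(math.radians(max_angle_deg))

def mutual_eye_contact(a_pos, a_gaze, b_pos, b_gaze, max_angle_deg=5.0):
    """Eye contact: each avatar's gaze falls on the other's position."""
    return (looks_at(a_pos, a_gaze, b_pos, max_angle_deg)
            and looks_at(b_pos, b_gaze, a_pos, max_angle_deg))
```

A caller would evaluate this every frame and fire the trigger (voice prompt, chat box) only once the condition has held for a short time, to filter out accidental glances.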

WHY IS EYE CONTACT WITH OTHERS SO IMPORTANT FOR US HUMANS?

Human eyes are different from the eyes of other primates. We have a larger and very visible sclera (the whites of the eyes) combined with a colored iris. The resulting contrast makes it easy to focus on the eyes when engaging in face-to-face contact, allowing humans to tell which direction someone’s eyes are pointing.

As Homo sapiens, we evolved to “read” emotions from other human beings. From a young age, we learn to interpret people’s faces, and this is a key skill for socializing and avoiding conflicts. Studies have shown that in face-to-face interaction, the facial area to which we pay by far the most attention is the eyes, followed by the mouth.

It is an evolutionary mechanism that human beings have a bias to detect faces, even where there are none, and to read the emotions of those faces. We also build social interactions and relationships through the eyes. Someone looks at you, you return the look, a conversation begins. Or you look away if you are not interested in making contact. That’s how social behavior works in real life, and that’s how it needs to happen for an authentic metaverse experience: without waving hands or controllers around, without clicks, just as authentic and socially accepted as in real life.

WHY IS THE INTEGRATION OF EYE TRACKING IN XR DEVICES SO CHALLENGING?

The main challenge is the development of an eye tracking system that is robust, small, and efficient enough to be integrated into consumer devices, such as AR and VR glasses. Among other things, such a system must work reliably with different faces; with long eyelashes or mascara; with eyelids that partially occlude the eye; for users with refractive errors who wear contact or correction lenses; under different lighting conditions; when the pupils are very small or very large; with diverse iris colors; and in the presence of eye disorders such as lazy eye or strabismus.

In typical eye tracking for research studies, the factors above are controlled: exclusion criteria are applied, or customized solutions are developed. When we talk about eye tracking in XR devices, however, we are talking about an eye tracking solution that works for all users and in all scenarios. The challenge is to bring this one-size-fits-all solution as close to 100 percent as possible – to make it as close to perfect as we can.

So what we need are eye tracking solutions for real-world conditions – something we at Viewpointsystem are particularly experienced in, thanks to the use of our system in practical work on the factory floor and outdoors, among other places.

TO WRAP UP THIS TALK, COULD YOU SAY WHAT THE CHALLENGES ARE IN SETTING UP SOCIAL INTERACTION THROUGH EYE CONTACT IN THE METAVERSE?

A big challenge is to distinguish different types of eye contact in order to trigger the right action. What is fleeting interest? What is a request for conversation? When does someone desire privacy? In-depth knowledge of human gaze behavior is required here.
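One common heuristic for telling these cases apart is dwell time: how long mutual eye contact has been held. The toy classifier below illustrates that idea only; the thresholds and labels are invented placeholders, and a real system would, as described here, also weigh situational context and cultural and individual factors.

```python
def classify_eye_contact(dwell_ms):
    """Map the duration of mutual eye contact (milliseconds) to a
    coarse social intent. Thresholds are illustrative placeholders."""
    if dwell_ms < 150:
        return "glance"            # fleeting interest: no action
    if dwell_ms < 800:
        return "attention"         # e.g. show an availability indicator
    return "conversation_request"  # e.g. offer to open voice chat
```

The returned label would then drive the triggered action, with the privacy settings of both users always taking precedence.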

Also, the situational context as well as cultural and individual factors must be taken into account. Based on our expertise, we have found potential approaches and the technology to draw the right conclusions and trigger appropriate actions – such as voice interaction, opening the chat box, or even the option to block a contact.

REFERENCES

Meta, “Building the Metaverse Responsibly”, 2021.

IEEE Spectrum, “What Is the Uncanny Valley?”, 2019.

Majaranta, P., & Bulling, A., “Eye Tracking and Eye-Based Human–Computer Interaction”, 2014.

Kobayashi, H., & Kohshima, S., “Unique morphology of the human eye and its adaptive meaning: Comparative studies on external morphology of the primate eye”, Journal of Human Evolution, 40(5), 419–435, 2001.

