Summary: A new study reveals the precise moment the brain detects gaze direction, advancing our understanding of social interactions and conditions like autism and Alzheimer’s. Researchers used EEG and machine learning to analyze brain activity as participants viewed avatars with different head and eye directions.
They found that the brain processes head orientation before eye direction, and that task context affects the accuracy of gaze detection. This breakthrough could aid early diagnosis and treatment of autism and Alzheimer’s.
Key Facts:
- Hierarchical processing: The brain first detects head orientation starting at 20 ms, then eye direction starting at 140 ms.
- Task influence: The accuracy of detecting gaze direction improves when attention is focused on the gaze.
- Diagnostic potential: Findings may contribute to early diagnosis and treatment of autism and Alzheimer’s.
Source: University of Geneva
Gaze plays a central role in everyday social interactions. Our ability to communicate directly depends on the brain’s ability to detect and interpret the direction of others’ gaze. How does our brain detect gaze direction, and what factors influence the process?
In a recent study published in the journal NeuroImage, a team from the University of Geneva (UNIGE) has succeeded in determining with unprecedented precision the exact moment at which the direction of gaze is detected.
These findings significantly expand our knowledge of autism spectrum disorders and may offer therapeutic opportunities for people with Alzheimer’s disease.
Human faces are the most common and consistent visual stimuli we encounter from the moment we are born. Our brains have developed the expertise to remember and recognize faces, and to interpret the messages they convey. For example, direct eye contact signals a desire to engage in social interaction, while avoiding eye contact conveys the opposite message. But how quickly can our brains understand the gaze of others?
This topic has been extensively studied. However, existing publications have mainly examined the eye region in isolation, neglecting other factors such as head orientation.
Cerebral analysis of gaze
A team from UNIGE presented the study participants with 3D avatars, each with different head and gaze directions. In the first task, the volunteers were asked to indicate the orientation of the head, while in the second task they had to identify the direction of the eyes.
By analyzing brain activity using an electroencephalogram, the research team discovered that these two processes can be reliably decoded independently.
“The experiment also shows a certain hierarchy in the processing of these two pieces of information. The brain first perceives the more global visual signals, i.e. the orientation of the head, starting at 20 milliseconds, before focusing on the more local information, i.e. the eyes, starting at 140 milliseconds.
“This hierarchical organization then allows for the integration of information about the eye region and head orientation, to ensure accurate and effective assessment of gaze direction,” explains Domilė Tautvydaitė, a postdoctoral researcher and associate professor at the Faculty of Psychology and Educational Sciences at UNIGE, and the first author of the study.
The study also shows that gaze direction decoding was significantly more accurate when participants were specifically asked to attend to the gaze of the presented faces. This indicates that task context influences gaze perception and understanding.
“These results show that in everyday life, people can recognize the intentions of others better and faster when they are actively involved in a ‘social mode’,” explains Nicolas Burra, associate professor at the Faculty of Psychology and Educational Sciences and director of the Experimental Social Cognition Laboratory (ESClab) at UNIGE, who led this research.
An advanced method
The method used yields extremely accurate results for these two mechanisms. By combining the analysis of neural activity recorded with electroencephalography (EEG) and machine learning techniques, the research team was able to decode gaze and head direction from brain signals even before the participants became consciously aware of them.
“This method represents a major technical innovation in this field and allows a much more precise analysis than previously possible,” adds Nicolas Burra.
In people with autism spectrum disorders, the decoding of this information may be impaired and avoidance of eye contact may become preferential. This is also the case in Alzheimer’s disease, where memory problems during the progression of the disease impoverish the person’s relationships with others and often lead to social withdrawal. It is therefore essential to understand the neural mechanisms involved in detecting gaze direction.
The research results and the method used make a concrete contribution to the early diagnosis of autism spectrum disorders in children. As for Alzheimer’s disease, one of the most striking symptoms as the disease progresses is the inability to recognize faces, even those of family members.
This study therefore paves the way for a better understanding of the neural mechanisms associated with impaired social interaction and memory for faces – a topic currently being studied by Dr. Tautvydaitė at McGill University in Canada. UNIGE’s ESClab laboratory research will continue in this area by analyzing these processes during real-life social interactions.
About this visual neuroscience research news
Author: Antoine Guenot
Source: University of Geneva
Contact: Antoine Guenot – University of Geneva
Image: The image is attributed to Neuroscience News
Original research: Open access.
“The timing of gaze direction perception: ERP decoding and task modulation” by Nicolas Burra et al. NeuroImage
Abstract
The timing of gaze direction perception: ERP decoding and task modulation
Being able to distinguish the direction in which another person’s gaze is directed is incredibly important in everyday social interaction, as it provides crucial information about the attention and therefore intentions of that person.
The temporal dynamics of gaze processing were investigated using event-related potentials (ERPs) recorded using electroencephalography (EEG).
However, the timing at which our brains distinguish gaze direction (GD), regardless of other facial cues, remains unclear. To address this question, the aim of the current study was to investigate the time course of gaze direction processing, using an ERP decoding approach based on the combination of a support vector machine and error-correcting output codes.
We recorded EEG in young healthy subjects, 32 of whom performed GD detection and 34 performed face orientation tasks. Both tasks presented 3D realistic faces, each with five different head and gaze orientations: 30° or 15° to the left or right, and 0°.
While classical ERP analyses showed no clear GD effects, ERP decoding analyses revealed that discrimination of GD, regardless of head orientation, began at 140 ms in the GD task and at 120 ms in the face orientation task. GD decoding accuracy was higher in the GD task than in the face orientation task and was highest for direct gaze in both tasks.
These findings suggest that brain pattern decoding is influenced by task relevance, altering the latency and accuracy of GD decoding.