In a step forward for treating people with autism, schizophrenia and depression, doctors at the All India Institute of Medical Sciences (AIIMS), Delhi, have developed a new tool for detecting human emotions, the Facial Toolbox for Emotion Recognition (AFTER) database.
According to the doctors who participated in the study, the tool can help in understanding the emotion recognition capability of individuals with various neuropsychiatric conditions such as autism, schizophrenia and depression. “The clinical utility of the AFTER database could be in ascertaining the emotion recognition capability of individuals with various neuropsychiatric conditions, assessing change in emotion perception when patients move from an acute to a remitted state, or predicting the likelihood of relapse,” said Dr Rohit Verma, Assistant Professor in the Department of Psychiatry at AIIMS.
The tool, he said, will be helpful for both patients and doctors. “Being a computerised version, the tool may find use in developing emotion-recognition-related tasks in brain imaging studies of emotion recognition with techniques such as functional magnetic resonance imaging, functional near infra-red spectroscopy, quantitative electroencephalography and eye-tracking studies, with a more culturally relevant stance,” he added.
The study was conducted with 15 volunteers selected from the National School of Drama, who were instructed to pose emotional expressions at high and low intensity. A total of 240 pictures were captured in a brightly lit room against a common, light background. The artists displayed seven facial expressions: neutral, happiness, anger, sadness, disgust, fear, and surprise. Each picture was validated independently by 19 mental health experts and two professional teachers of dramatic art. Apart from recognition of the emotional quality, each emotion was rated on a five-point Likert scale along three dimensions: intensity, clarity, and genuineness. Results were reported in terms of mean scores on all four parameters.
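The per-image aggregation described above can be sketched in a few lines. This is a minimal illustration on invented rating data (the dictionary keys and values below are hypothetical, not taken from the study):

```python
from statistics import mean

# Hypothetical ratings for one image: each rater scores three dimensions
# on a 5-point Likert scale and records whether the intended emotion
# was recognised.
ratings = [
    {"recognised": True,  "intensity": 4, "clarity": 5, "genuineness": 4},
    {"recognised": True,  "intensity": 3, "clarity": 4, "genuineness": 5},
    {"recognised": False, "intensity": 2, "clarity": 3, "genuineness": 3},
]

# Mean score per parameter, the form in which results were reported.
summary = {
    "recognition_rate": mean(1 if r["recognised"] else 0 for r in ratings),
    **{dim: mean(r[dim] for r in ratings)
       for dim in ("intensity", "clarity", "genuineness")},
}
print(summary)
```

Repeating this over all 240 images would yield the mean scores on all four parameters discussed in the study.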
The researchers also included contempt as an emotion, but it was later dropped because it was not easily distinguishable. The low-intensity images were likewise excluded from the final database, as expert consensus on recognising low-intensity emotions was very low.
As per the study, the database would be useful in the Indian context for conducting research in the field of emotion recognition. “Such a culturally sensitive database may be useful to capture the perception of emotion from an ethnic perspective. AFTER has been validated in a cohort of experts and is found to have good inter-rater reliability. This database shows promise for use in research settings and needs to be validated in the general population,” says the study, which was sponsored through a Cognitive Science Research Initiative (CSRI) grant of the Department of Science and Technology (DST).
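Inter-rater reliability of the kind the study reports is commonly quantified with statistics such as Fleiss' kappa, which measures agreement among multiple raters beyond what chance would produce. A minimal sketch on invented toy data follows (the study does not state which reliability statistic was used):

```python
def fleiss_kappa(counts):
    """Fleiss' kappa for a table of rating counts.

    `counts` has one row per image; each row lists how many raters
    assigned the image to each emotion category. Every row must sum
    to the same number of raters.
    """
    n_items = len(counts)
    n_raters = sum(counts[0])
    n_categories = len(counts[0])
    # Overall proportion of assignments falling in each category.
    p_j = [sum(row[j] for row in counts) / (n_items * n_raters)
           for j in range(n_categories)]
    # Observed agreement for each item.
    p_i = [(sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
           for row in counts]
    p_bar = sum(p_i) / n_items        # mean observed agreement
    p_e = sum(p * p for p in p_j)     # agreement expected by chance
    return (p_bar - p_e) / (1 - p_e)

# Toy example: 3 raters label 3 images as one of two emotions,
# agreeing perfectly on every image, so kappa is 1.
print(fleiss_kappa([[3, 0], [0, 3], [3, 0]]))  # -> 1.0
```

Values near 1 indicate strong agreement; values near 0 indicate agreement no better than chance.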
The study adds that processing faces and facial expressions is crucial to all forms of social communication, and that the interpretation of emotion is culturally dependent. Most existing databases are based on Caucasian, East Asian (Chinese, Japanese, Korean), or African-American faces, and databases containing Indian faces are limited.
Dr Verma said this is the first part of the tool, and work is now under way on the next phase, which will focus on videos, text and audio. “They all will be combined for a dynamic tool. Reading and listening are other aspects of emotion recognition that reflect real-life scenarios. The current study reports a static image database. However, the final overall emotional toolbox will comprise these other facets of emotion recognition,” he added.
Dr Verma adds that facial expressions have been called the universal language of emotion. The concept holds that all humans communicate six basic internal emotional states (happiness, anger, sadness, surprise, fear, and disgust) using similar facial movements, by virtue of their biological and evolutionary origins.
“Many recent researchers have opposed the notion and suggested that perception and interpretation of emotion are culture-dependent. While classic studies demonstrated that emotion recognition was above-chance even for individuals from disparate cultures, they also mentioned that the recognition was more accurate when the emotions were both expressed and perceived by the members of similar cultures. The facial stimuli in existing databases tend to vary substantially in terms of facial feature characteristics and expression of emotions, depending on the representative culture from which the database was built,” Dr Verma said.