IITGn Develops Gaze Sensitive, VR-Based Social Interactive System for Individuals with Autism
The Indian Institute of Technology, Gandhinagar (IITGn), has developed a virtual reality-based system that can deliver individualised feedback based on the dynamic gaze patterns of individuals with autism, a neurodevelopmental disorder.
The system, described in the work ‘Towards design of multi-user virtual reality-based interactive social communication platform for individuals with Autism and its implications on eye gaze’ and developed under the guidance of Prof Uttama Lahiri, associate professor in the Electrical Engineering Department, relies on eye-tracking measures, which are considered essential because individuals with Autism Spectrum Disorder (ASD) are often distracted and have shorter attention spans. The work has already been patented and published in a few journals.
According to paediatric neurologists, persons with ASD have difficulty looking at a communicator’s face during a social conversation (referred to as an atypical gaze pattern), which leads to difficulties in comprehending emotions and social gaze, and even affects their ability to maintain appropriate social proximity.
Pradeep Raj K B, the main researcher of the new system, said, “Be it a school, home, shop or office environment, persons with ASD face difficulties in engaging with social partners since they cannot understand the essence of others’ behaviour and emotional preferences while taking part in a collaborative task or teamwork, thereby failing to adapt to their social partners’ needs. These challenges call for alternative technology-assisted solutions that can possibly ensure easily accessible, quantifiable and individualised services.”
Pradeep completed his PhD in the Cognitive Science discipline. He is now pursuing post-doctoral research at the Pratt School of Engineering, Duke University, in North Carolina.
In the new system, Pradeep said, they used composite estimates of looking pattern, pupillary dilation and blink rate to gauge one’s affective state and thereby offer individualised training. “The system has a facility to give information on one’s affective state, which can help even inexperienced therapists or parents modulate their training modules, offering different levels of task difficulty based on the child’s affective state,” he said.
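The idea of fusing several eye-tracking measures into one affective-state estimate can be illustrated with a minimal sketch. The weights, thresholds and labels below are hypothetical placeholders for exposition, not the published or patented model.

```python
def affective_state(face_fixation_ratio, pupil_dilation, blink_rate):
    """Return a coarse affective label from three normalised gaze measures.

    face_fixation_ratio: fraction of time spent looking at the partner's face
    pupil_dilation:      normalised change in pupil diameter (0 to 1)
    blink_rate:          normalised blinks per minute; high values are treated
                         here as a sign of fatigue, so they lower the score

    All weights and cut-offs are illustrative assumptions.
    """
    score = 0.5 * face_fixation_ratio + 0.3 * pupil_dilation - 0.2 * blink_rate
    if score > 0.6:
        return "engaged"
    if score > 0.3:
        return "neutral"
    return "disengaged"
```

A therapist-facing dashboard could surface such a label in real time, so that even an inexperienced caregiver has a quantified cue for adjusting the session.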
The study, conducted with 18 individuals with ASD who were exposed to the system, indicated that with multiple exposures the participants improved their communication ability, in terms of understanding others’ emotions, proximity, eye gaze and preferences, and also showed improvement in looking pattern compared to their first exposure.
In the system, multi-user VR-based platforms have been integrated with eye-tracking technology, which is not susceptible to communication vulnerabilities and can help quantify one’s subtle affective states while interacting with the task platform, enabling an individualised training facility. Additionally, the system can vary the difficulty level on its own, switching between tasks based on individual needs in the absence of therapists or parents.
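The autonomous difficulty switching described above can be sketched as a simple controller that steps the task level up or down from an estimated affective state. This is an assumed illustration of the general scheme, not the system’s actual implementation.

```python
def next_difficulty(state, level, min_level=1, max_level=3):
    """Step the task difficulty up when the participant seems engaged,
    down when disengaged, and hold steady otherwise.

    state: a coarse affective label ("engaged", "neutral" or "disengaged");
           the labels and the three-level scale are illustrative assumptions.
    """
    if state == "engaged":
        return min(level + 1, max_level)
    if state == "disengaged":
        return max(level - 1, min_level)
    return level  # neutral: keep the current task level
```

Run once per task block, such a rule lets the platform adapt without a therapist in the loop, while the clamping keeps the difficulty within a safe range.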
Pradeep said such a multi-user VR-based platform has the potential to serve as a complementary tool for interventionists, providing them with insights into participants’ performance and affective states, quantified in terms of gaze-related indices, so they can make faster decisions while tuning the intervention paradigm. The platform, he added, if made available in various special needs schools and homes, could ensure better accessibility for individuals with ASD and help them lead a productive social and community life.
Taking this forward, Pradeep said, they are looking for resources in the form of a partner or company with whom the idea can be developed on a larger scale, with wider applicability.