Autism and AR: Building AR Solutions for Autism Spectrum Disorder
Despite affecting more than 3.5 million Americans and an estimated 1 in 68 children born in the US, Autism Spectrum Disorder (ASD) remains poorly understood by the general public. At its core, ASD is a complex developmental disorder that affects a person’s ability to interact and communicate socially. Symptoms look very different from individual to individual, and their severity can range from moderate to profound. The goal of treatment for ASD has always been to improve quality of life, and this typically takes the form of training and practice in social cues and communication.
One of the biggest difficulties with this sort of treatment is that it’s almost impossible to administer in real time: life happens quickly, and many people living with ASD need more time to process cues that seem second nature to unaffected individuals. Developing adaptive strategies has therefore proven a major challenge for the researchers and clinicians working to mitigate the effects of the disorder.
Although still at a relatively early stage, augmented reality solutions seem poised to revolutionize many aspects of how ASD is treated, especially in children and adolescents. One such solution is the Empower Me suite of apps, developed by Cambridge, MA startup Brain Power. Empower Me was designed and built for Google Glass, the wearable augmented reality glasses best known to the general public as a gigantic commercial flop. Repurposed as a medical device, though, Glass offers huge advantages for apps with real-world applications such as “facilitat[ing] life skills such as language, emotional understanding, eye contact, control of behaviors, conversation skills, meltdown prevention, social connection, self-confidence, perspective-taking, and more.”
Empower Me wrapped an enormously successful Indiegogo campaign in January, but it grew out of relatively small beginnings at Harvard and MIT. Dr. Ned Sahin, founder of Brain Power, has spoken at length about his intentional choice of Glass as the platform and about how the system works: “Our applications are gamified and engaging, and run on smart glasses. Unlike with a tablet or phone, the person is looking up, and our software encourages social interaction with other people.”
The Indiegogo campaign offered a brief example of one of the suite’s augmented reality games, Emotion Charades. Difficulty matching facial expressions to the emotions they convey is one of the most common symptoms of ASD, so in the game, a user works with a partner to build this skill: the app presents two emotion choices, one on either side of the partner’s face, as you can see below:
Facial recognition algorithms in the Empower Me software already ‘know’ which emotion the partner’s expression corresponds to, so the platform can rapidly generate a correct and an incorrect answer without any manual setup. Afterward, responses, timing data, and measures of stress and anxiety (detected through sensors in the Glass hardware) are aggregated into a dashboard that the user, families, and healthcare professionals can review to track progress over time, while built-in rewards and achievements help the user stay motivated and keep building skills.
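To make the game loop concrete, here is a minimal sketch of how an Emotion Charades-style round could be structured: pairing the detected emotion with a distractor, scoring each response, and aggregating a session into dashboard-style metrics. All names and structures here are illustrative assumptions, not Brain Power’s actual implementation.

```python
import random

# Hypothetical emotion labels; a real system would use whatever
# categories its facial-expression recognizer outputs.
EMOTIONS = ["happy", "sad", "angry", "surprised", "afraid", "disgusted"]

def make_round(detected_emotion):
    """Pair the emotion the recognizer detected with a random distractor,
    shuffled so the correct answer isn't always on the same side."""
    distractor = random.choice([e for e in EMOTIONS if e != detected_emotion])
    choices = [detected_emotion, distractor]
    random.shuffle(choices)
    return {"answer": detected_emotion, "choices": choices}

def score_response(round_info, selected, response_seconds):
    """Record whether the selection was correct and how long it took --
    the kind of per-round data a progress dashboard could aggregate."""
    return {
        "correct": selected == round_info["answer"],
        "response_seconds": response_seconds,
    }

def summarize(session):
    """Aggregate a session's rounds into simple dashboard metrics."""
    total = len(session)
    correct = sum(1 for r in session if r["correct"])
    avg_time = sum(r["response_seconds"] for r in session) / total
    return {"accuracy": correct / total, "avg_response_seconds": avg_time}
```

A session would simply collect `score_response` records round by round and run `summarize` over them; stress and anxiety measures from hardware sensors could be folded into the same per-round records.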
A pilot study of Brain Power’s system was reported in Frontiers in Pediatrics, and other research groups, like the one behind the Autism Glass Project at the Stanford University School of Medicine, are conducting large-scale trials of similar AR technology on heads-up displays. On Medium, autism educator Craig Smith has outlined several product concepts for how Apple’s ARKit SDK could be used to build a host of responsive, engaging apps that address the various deficits that typically present in autism spectrum disorders.
In his seminal New Yorker profile of Temple Grandin, “An Anthropologist on Mars,” Oliver Sacks wrote at length about the Austrian researcher Hans Asperger, whose work on autism was not translated into English until 1991. It’s worth quoting Sacks at length, because what he chose to emphasize about Asperger’s work beautifully captures the particular intelligence and abilities that come along with what is often portrayed as an exclusively devastating disorder:
Asperger brought out other striking features, stressing, “They do not make eye contact . . . they seem to take in things with short, peripheral glances. . . . There is a poverty of facial expressions and gestures. . . . The use of language always appears abnormal, unnatural. . . . The children totally follow their own impulses, regardless of the demands of the environment [but] there can be excellent ability of logical abstract thinking.” While [a fellow seminal researcher] seemed to see it as an unmitigated disaster, Asperger felt that it might have certain positive or compensating features—a “particular originality of thought and experience, which may well lead to exceptional achievements in later life.”
Asperger seemed to see potential where most of his contemporaries saw only deficits, and using cutting-edge technology to address those deficits represents a use case for augmented reality that is so much more than the novel ‘Wow!’ factor of many current AR implementations. Augmented and virtual reality systems are at their very best when they enhance the user’s experience in a way that would previously have been impossible without the technology. For individuals living with autism, social communication training with the help of AR has huge potential to mitigate the fear, discomfort, and difficulty associated with many aspects of everyday life.