When Rebecca Ramnauth talks about AI and robots, she doesn’t speak about whether they will take over human tasks or make teaching and learning more efficient. Instead, she discusses the ways in which AI and robots can help us better connect with one another as humans.
A doctoral candidate working at the Yale Social Robotics Lab in Connecticut, Ramnauth has dedicated her career to building robots as tools for understanding how people work. A key part of her research is studying how robots and AI can be used to support people in general, and people with autism in particular.
Ramnauth discussed her work and this promising branch of AI and robotics research as the keynote speaker at the NYC Schools Tech Summit 2024 in March. In a follow-up interview, she shares what inspired her interest, the success the work has had so far, and the implications for educators.
Using AI and Robots To Help Students With Autism: A Personal Connection
Ramnauth’s interest in this field was inspired by her younger sister, who has been diagnosed with autism.
“I saw how she grew up, and some of the difficulties that she faced,” Ramnauth says. “Making friends looks different for her. Things that we take for granted, like engaging in small talk, or going on a date, or being interrupted by an alarm, or an ambulance running across the street — these are everyday situations, but she sees the world very differently in each of those contexts.”
Inspired by her sister, Ramnauth decided to study how socially assistive robots might be developed as educational tools. This was an ambitious undertaking, as most assistive technology research has focused on physical assistance. Ramnauth and her colleagues, however, are making great strides when it comes to social assistance.
A Pilot Study Full of Promise
One area Ramnauth decided to examine was eye contact, which many people with autism struggle to maintain.
In a small pilot study, Ramnauth and colleagues loaned robots to children with autism and their parents in Connecticut. “The robot did something very simple, it just looked at the child, waited until the child made eye contact, and then looked at the parent and then looked back at the child,” Ramnauth says. “The idea was that the robot would be modeling attention sharing, and that, hopefully, the child would look at the robot and then turn to look at their parent and be willing to engage in some sort of conversation.”
Ramnauth and her fellow researchers found that the children who received the robots engaged in more natural and spontaneous conversations with their families, and that the increase in these interactions was statistically significant.
This was remarkable, Ramnauth says, given that the robots were fairly basic: essentially screens with eyes that could swivel. The robots also used only basic AI technology for facial recognition.
Next Steps For Research
This pilot study was small, with only 30 participants, so it’s hard to make sweeping generalizations, yet the results are compelling enough to suggest future areas of research.
“It’s a large enough sample size to say something about how we should design technology and that there is a potential for this technology to be useful,” Ramnauth says. “These are interesting enough outcomes that there are clinicians who are starting to pay attention to this technology.”
She adds that after the success of this project, the team has heard from various institutions interested in using the robots, including a pediatric center that wants them to interact with kids experiencing significant stress, as well as a school in Brooklyn working with children who have a range of needs. “They’re building a sensory room for their ICT classroom,” she says, which will be a space where students can go and decompress from the traditional learning environment. “But [they said], ‘We want something there that’s not just toys. We want something that engages them and kind of lights their brain up in a way that toys or books just can’t.’”
Other Opportunities For Robots In Schools
The educational implications of AI-powered robots that connect socially with students and facilitate learning extend beyond Ramnauth’s specific research. Previous research has found that children respond to robots much as they would in other social interactions, as long as the robot has eyes and moves physically in the world, Ramnauth says. Robots that meet these requirements can support a wide range of social interactions.
“We’ve done studies that range from teaching deaf infants sign language to teaching English to non-native English speakers, or just basic classroom instruction, like teaching math skills or reading skills,” Ramnauth says.
Additionally, robots can facilitate more than just subject matter learning. “The thing that I find more interesting is the social implications,” Ramnauth says. “It can encourage the quietest kid in the classroom to speak up more when there’s a robot in her group. Or there’s a lot more teamwork. We’ve seen students be far more likely to ask for help when they see a robot ask for help.”