This project is part of our National Science Foundation Expedition in Computing for Socially Assistive Robotics (NSF IIS-1139148). Human teaching is a multimodal and interactive process: teachers weigh verbal and nonverbal cues along with performance signals to assess a student's learning progress and offer quality feedback. Personalized and interactive education has been shown to improve learning, especially for students with special needs. However, most robotic tutoring systems follow rigidly structured feedback models and do not make use of the full range of multimodal features available.

This research project focuses on designing computational models that recognize the affective learning state of children with Autism Spectrum Disorder (ASD), so that the system can respond with personalized feedback and keep learning moving at an optimal pace for each child. We aim to build this machine learning framework using multimodal interaction data, including visual, audio, and performance features, collected from over 15 month-long in-home deployments conducted through the Expedition project. Eventually, this framework will be integrated into a Socially Assistive Robot learning companion that tracks learners' emotions and guides them through their learning journey with a positive mindset.
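To make the idea of fusing visual, audio, and performance features for affect recognition concrete, the sketch below shows one possible early-fusion setup in Python with scikit-learn. The feature groups, the label set, and the simple concatenate-then-classify pipeline are illustrative assumptions on our part, not the project's actual models or data.

```python
# Minimal illustrative sketch of multimodal affect-state classification.
# All feature names, labels, and the early-fusion + logistic-regression
# pipeline are assumptions for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Stand-in features per interaction window (real features would come from
# video, audio, and task-performance logs of the tutoring sessions).
n_windows = 200
visual = rng.normal(size=(n_windows, 8))        # e.g., gaze / facial-expression features
audio = rng.normal(size=(n_windows, 6))         # e.g., prosody features
performance = rng.normal(size=(n_windows, 4))   # e.g., accuracy, response time

# Hypothetical affect-of-learning labels: 0 = disengaged, 1 = engaged, 2 = frustrated
labels = rng.integers(0, 3, size=n_windows)

# Early fusion: concatenate modalities into one feature vector per window.
X = np.hstack([visual, audio, performance])

X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.25, random_state=0
)

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")

# A predicted state could then drive feedback selection, e.g., slowing the
# pace or offering encouragement when a "frustrated" state is detected.
```

In a deployed system, the classifier's output would feed the feedback policy of the robot learning companion rather than simply being printed.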