Affect in Motion: Comparative Effectiveness of Virtual and Physical Data Collections


Overview

As robots increasingly find applications in socially assistive domains, they will need to effectively embody and broadcast socially meaningful signals, such as affect, through body language and facial expression. Understanding how humans perceive robot motion is the first step toward programming robots to produce these displays. Confounding this pursuit are the plethora of form factors and degrees of freedom on modern robot platforms and the difficulty of gathering large amounts of data through human studies. One potential solution is to gather data virtually over the web through services such as Amazon's Mechanical Turk. It is unclear, however, to what degree humans interpret these signals the same way when shown a physical embodiment versus a virtual one. This study aims to assess the degree to which virtual data collections produce valid results from which to build control systems for physical robots. It will test static facial expressions, dynamic facial expressions (with a time-varying component), and abstract body motion.

Support

This work is supported by the NSF Expeditions in Computing grant for "Socially Assistive Robotics" (NSF IIS-1139148). Project website here.

Contact

Brian Kohan: bkohan at usc dot edu