Automated Proxemic Behavior Recognition and Production


Overview

This research investigates proxemics in human-robot interaction (HRI). Proxemics is the study of the dynamic process by which people position themselves in face-to-face social encounters. This process is governed by sociocultural norms that, in effect, determine the overall sensory experience of each interacting participant. To facilitate situated and mobile HRI, this research seeks to develop functional and socially appropriate probabilistic computational models of proxemics for the purposes of both autonomous proxemic behavior recognition (of one or many people) and autonomous proxemic behavior production (by a sociable robot). This work is related to the broader research objective of developing social primitives for situated and embodied interactive agents.

Proxemics and Speech/Gesture Recognition

A lack of high-resolution metrics limited previous proxemic behavior recognition efforts to coarse analyses in both space and time. Using modern advances in sensor technology (specifically, the Microsoft Kinect), a system was developed for the autonomous real-time annotation of proxemic behavior in multi-person social encounters. The system is non-invasive to participants, is readily deployable in a variety of environments (ranging from an instrumented workspace to a mobile robot platform), and does not interfere with the social interaction itself. The automated annotations are inspired by a set of validated metrics from the social sciences and include individual, attentional, interpersonal, and physiological factors that contribute to human-human social spacing. The understanding, modeling, and automation of these behaviors are necessary for the development of socially aware autonomous robots. A more detailed discussion of this work can be found here.
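As a concrete illustration of the kind of metrics involved (a minimal sketch, not the lab's implementation), the following Python fragment computes two standard proxemic features, interpersonal distance and relative body orientation, from Kinect-style skeleton data; the joint names, coordinate conventions, and data layout are assumptions made for the example.

    import math

    # Minimal sketch (not the lab's implementation): compute two common proxemic
    # metrics -- interpersonal distance and relative body orientation -- from
    # Kinect-style skeleton data. A "skeleton" is assumed to be a dict mapping
    # joint names ("torso", "left_shoulder", "right_shoulder") to (x, y, z)
    # positions in meters, with y up and the ground plane spanned by x and z.

    def ground_position(skeleton):
        """Project the torso joint onto the ground plane."""
        x, _, z = skeleton["torso"]
        return (x, z)

    def body_heading(skeleton):
        """Facing direction (radians) estimated from the shoulder line."""
        lx, _, lz = skeleton["left_shoulder"]
        rx, _, rz = skeleton["right_shoulder"]
        sx, sz = rx - lx, rz - lz
        # The body faces perpendicular to the shoulder line; the sign of the
        # rotation is a convention chosen for this example.
        return math.atan2(-sx, sz)

    def interpersonal_distance(skel_a, skel_b):
        """Torso-to-torso distance on the ground plane (meters)."""
        (ax, az), (bx, bz) = ground_position(skel_a), ground_position(skel_b)
        return math.hypot(bx - ax, bz - az)

    def relative_orientation(skel_a, skel_b):
        """Signed angle (radians) between A's facing direction and the direction to B."""
        (ax, az), (bx, bz) = ground_position(skel_a), ground_position(skel_b)
        to_b = math.atan2(bz - az, bx - ax)
        diff = to_b - body_heading(skel_a)
        return math.atan2(math.sin(diff), math.cos(diff))  # wrap to [-pi, pi]

    # Example usage with two hypothetical tracked people.
    alice = {"torso": (0.0, 1.0, 0.0), "left_shoulder": (-0.2, 1.4, 0.0), "right_shoulder": (0.2, 1.4, 0.0)}
    bob = {"torso": (1.2, 1.0, 0.6), "left_shoulder": (1.0, 1.4, 0.6), "right_shoulder": (1.4, 1.4, 0.6)}
    print(interpersonal_distance(alice, bob), relative_orientation(alice, bob))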

Automatic recognition of proxemic behavior was supported primarily by the NSF IIS-1117279 grant.

Proxemics and Speech/Gesture Production

Drawing upon insights gained in the above work on automated proxemic behavior recognition and analysis, a novel method for human-robot proxemic behavior production was developed. This method utilizes a probabilistic framework for spatial interaction that considers the sensory experience of each agent (human or robot) in a social encounter. In preliminary work, the robot attempted to keep a set of human body features (e.g., the head and arms) within its camera field of view. This methodology is the first to address the functional aspects of proxemic behavior in human-robot interaction, and it provides an elegant connection among previous approaches. A more detailed discussion of this work can be found here.
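To give a flavor of sensory-experience-based positioning (a simplified deterministic sketch under assumed parameters, not the published probabilistic framework), the following Python fragment scores candidate robot poses by how many human body features fall within an assumed camera field of view and range, then picks the best pose from a coarse ring of candidates around the person.

    import math

    # Illustrative sketch only (not the published framework): choose a robot
    # standing pose by scoring candidate positions on how many human body
    # features (e.g., head and hands) fall inside the camera's field of view.
    # All parameters below -- feature positions, field of view, sensor range,
    # and the ring of candidate poses -- are assumptions for the example.

    CAMERA_HALF_FOV = math.radians(28.5)  # assumed horizontal half field of view
    MIN_RANGE, MAX_RANGE = 0.8, 4.0       # assumed usable sensor range (meters)

    def visible(feature, robot_xy, robot_heading):
        """True if a ground-plane feature is within range and the horizontal FOV."""
        dx, dy = feature[0] - robot_xy[0], feature[1] - robot_xy[1]
        dist = math.hypot(dx, dy)
        bearing = math.atan2(dy, dx) - robot_heading
        bearing = math.atan2(math.sin(bearing), math.cos(bearing))  # wrap to [-pi, pi]
        return MIN_RANGE <= dist <= MAX_RANGE and abs(bearing) <= CAMERA_HALF_FOV

    def score(features, robot_xy, robot_heading):
        """Fraction of the person's features that the robot can observe."""
        return sum(visible(f, robot_xy, robot_heading) for f in features) / len(features)

    def choose_pose(features, person_xy, radii=(1.0, 1.5, 2.0), n_angles=36):
        """Search a ring of candidate poses around the person, always facing the person."""
        best = None
        for r in radii:
            for k in range(n_angles):
                a = 2 * math.pi * k / n_angles
                xy = (person_xy[0] + r * math.cos(a), person_xy[1] + r * math.sin(a))
                heading = math.atan2(person_xy[1] - xy[1], person_xy[0] - xy[0])
                s = score(features, xy, heading)
                if best is None or s > best[0]:
                    best = (s, xy, heading)
        return best  # (score, position, heading)

    # Example: a person at the origin with head and hand features (ground-plane x, y).
    person = (0.0, 0.0)
    features = [(0.0, 0.0), (-0.3, 0.1), (0.3, 0.1)]  # head, left hand, right hand
    print(choose_pose(features, person))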

Automatic production of proxemic behavior was supported primarily by the NSF National Robotics Initiative (NRI) IIS-1208500 grant.

Publications
Support

This work is supported in part by an NSF Graduate Research Fellowship, NSF grants IIS-1208500, IIS-1117279, and CNS-0709296, ONR MURI grant N00014-09-1-1031, and the Willow Garage PR2 Beta Program.

Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation, the Office of Naval Research, or Willow Garage.

Contact

Ross Mead