Activity Modeling and Understanding

We are interested in developing computational models of user activity. Our research examines approaches to analyzing human activity data across multiple modalities, as well as novel applications of user modeling in human-robot interaction (HRI) and human-computer interaction (HCI). Using the latest sensor technologies, we extract body movement, speech, and physiological signals during human-machine interactions in order to model and recognize user beliefs, desires, and intentions. We are enabling robots to detect spatially situated social cues, such as transitions into, during, and out of an interaction. We are also developing methods for providing feedback to the user during exercise or other task-performance sessions based on physiological data such as heart rate, respiration rate, skin temperature, and accelerometry. Our methods employ a variety of machine learning and signal processing algorithms toward robust activity recognition and understanding that supports autonomous decision-making by socially situated agents in HRI and HCI contexts.
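To give a sense of the kind of pipeline this involves, the sketch below shows a minimal, hypothetical example of activity recognition from accelerometry: sliding-window feature extraction followed by an off-the-shelf classifier. It is not the lab's implementation; the sampling rate, window length, hand-crafted features, synthetic data, and choice of a random forest are all illustrative assumptions.

# Minimal sketch: sliding-window activity recognition from accelerometry.
# Assumptions (illustrative, not from the project): a 3-axis accelerometer
# sampled at 50 Hz, 2-second windows, and a random forest as a stand-in model.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

FS = 50          # sampling rate in Hz (assumed)
WINDOW = 2 * FS  # samples per 2-second window

def window_features(accel):
    """Compute simple per-window statistics from an (n_samples, 3) accelerometer trace."""
    n_windows = accel.shape[0] // WINDOW
    feats = []
    for i in range(n_windows):
        w = accel[i * WINDOW:(i + 1) * WINDOW]
        mag = np.linalg.norm(w, axis=1)              # per-sample acceleration magnitude
        feats.append(np.concatenate([
            w.mean(axis=0), w.std(axis=0),           # per-axis mean and std
            [mag.mean(), mag.std(), mag.max()],      # magnitude statistics
        ]))
    return np.array(feats)

# Synthetic stand-in data: label 0 = "rest" (low variance), 1 = "exercise" (high variance).
rng = np.random.default_rng(0)
rest = rng.normal(0.0, 0.05, size=(100 * WINDOW, 3))
exercise = rng.normal(0.0, 0.8, size=(100 * WINDOW, 3))
X = np.vstack([window_features(rest), window_features(exercise)])
y = np.array([0] * 100 + [1] * 100)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))

In practice, the same windowed-feature structure extends to the other signals mentioned above (heart rate, respiration rate, skin temperature), and the classifier output would feed the agent's decision-making rather than a printed score.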

Current Projects
Past Projects