Hardware & Software Systems

Humanoid Simulations
Deictic Gesture Validation System
Motion-Tracking Systems
Stage & Gazebo
Retired Systems

Humanoid Simulations

Bandit II Simulation

Bandit II humanoid simulation using the ROS visualization environment (RViz)


Deictic Gesture Validation System

For experiments in deictic gesture perception and production, we constructed a 12x8-foot freestanding screen of transparent acrylic. The screen serves as a planar target on which we measure both the actual target of a given deictic gesture and the target location perceived by an observer. We have conducted experiments with both robot and human gesturers, with study variables including gesture modality and distance. In addition, we have produced a set of markers of various sizes that can be affixed to the screen to study the effects of saliency on gesture perception. When a participant is placed 3 feet from the screen, it covers a horizontal field of view of 120 degrees and a vertical field of view of about 70 degrees.
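As a rough check of those numbers, the geometry works out as follows, assuming the participant's viewpoint is centered on the screen's width and level with its base (both assumptions; the exact eye position is not specified):

    import math

    def fov_deg(extent_ft, distance_ft, centered=True):
        # Angular coverage of a planar extent viewed from a given distance.
        if centered:
            # Viewpoint faces the midpoint of the extent.
            return 2 * math.degrees(math.atan((extent_ft / 2) / distance_ft))
        # Viewpoint level with one edge of the extent.
        return math.degrees(math.atan(extent_ft / distance_ft))

    print(fov_deg(12, 3))                 # horizontal: ~126.9 degrees
    print(fov_deg(8, 3, centered=False))  # vertical: ~69.4 degrees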


Screen for gesture validation

Motion-Tracking Systems

Overhead Interaction Toolkit
Motion Capture Suit

Overhead Interaction Toolkit (OIT)

OIT is an overhead 2D visual tracking package intended for tracking humans and robots in indoor settings. It works with any ROS-compatible camera. Research involving the OIT concerns detecting and interpreting the behavior of people as they interact with robots.

The OIT was developed in C++ and Python, but it is compatible with any language supported by ROS. It was developed on x86 hardware and should be compatible with any POSIX-compliant OS and hardware.
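A minimal sketch of a client node that consumes overhead tracks, written with rospy; the topic name /oit/tracks and the PoseArray message type are illustrative assumptions, not OIT's documented interface:

    #!/usr/bin/env python
    # Sketch of a ROS node listening for overhead tracks. The topic
    # name and message type below are assumptions for illustration.
    import rospy
    from geometry_msgs.msg import PoseArray

    def on_tracks(msg):
        # Each pose is one tracked person or robot in the overhead frame.
        for i, pose in enumerate(msg.poses):
            rospy.loginfo("track %d: x=%.2f y=%.2f", i,
                          pose.position.x, pose.position.y)

    if __name__ == "__main__":
        rospy.init_node("oit_listener")
        rospy.Subscriber("/oit/tracks", PoseArray, on_tracks)
        rospy.spin()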

The Overhead Interaction Toolkit


Available as a ROS stack in the usc-ros-pkg repository:

> svn co https://usc-ros-pkg.svn.sourceforge.net/svnroot/usc-ros-pkg/trunk/OIT



IMU-Based Motion Capture Suit

A wireless motion tracking system consisting of 3-axis inertial measurement units (IMUs) that measure acceleration, angular rate, and magnetic field strength. The system uses an onboard wearable computer to store, process, and transmit these data to a local laptop or PC, and is an updated version of the motion suit developed previously in the lab.
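As an illustration of how such measurements are typically fused, here is a minimal single-axis complementary filter that blends the gyroscope's smooth short-term integral with the accelerometer's drift-free gravity reference; the blend factor is an illustrative assumption, not the suit's actual processing pipeline:

    import math

    def complementary_filter(angle, gyro_rate, ax, az, dt, alpha=0.98):
        # angle:     previous tilt estimate (rad)
        # gyro_rate: angular rate about the tilt axis (rad/s)
        # ax, az:    accelerometer components in the tilt plane (m/s^2)
        gyro_angle = angle + gyro_rate * dt   # smooth but drifts over time
        accel_angle = math.atan2(ax, az)      # noisy but drift-free
        return alpha * gyro_angle + (1 - alpha) * accel_angle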

This system is currently being used in a robot-based rehabilitation framework. The data from the sensors can be used to assess quality of motion as participants interact with the robot and perform various motor tasks.

In the past, the system has also been used for other projects, including teleoperation of the NASA Robonaut.

Stage & Gazebo

Stage
Gazebo

Stage

Stage simulates mobile robots moving in and sensing a two-dimensional bitmapped environment, controlled through Player. Player provides a powerful, flexible interface to ActivMedia Pioneer 2 robots; Stage provides virtual Player robots. Various sensor models are provided, including sonar, a scanning laser rangefinder, a pan-tilt-zoom camera with color blob detection, and odometry. Several controllers designed in Stage have been demonstrated to work on real robots.


Gazebo

Gazebo is a mobile robot simulator much like Stage, but with a focus on outdoor environments. Like Stage, it is capable of simulating a population of robots, sensors, and objects, but it does so in a three-dimensional world. It generates both realistic sensor feedback and physically plausible interactions between objects (it includes an accurate simulation of rigid-body physics).


Retired Systems

General Kinematic Simulation
Humanoid Simulation
Physics-Based Simulation
Player
Vision-Based Tracking
Mezzanine
Laser-Based Motion-Tracking System
Volume-Based Motion Capture

General Kinematic Simulation

General kinematic humanoid simulator (developed by Marcelo Kallmann)



Humanoid Simulation

Humanoid simulation testbed (using Vortex libraries)



Physics-Based Simulation

Physics-based humanoid simulation: Adonis (developed by Jessica Hodgins).



Player

Player is a multi-threaded robot device server. It gives you simple and complete control over the physical sensors and actuators on your mobile robot. (Player is currently implemented specifically for the ActivMedia Pioneer 2-DX mobile robot running Linux, but the extensible server architecture should facilitate porting to other platforms.) When Player is running on your robot, your client control program connects to it via a standard TCP socket, and communication is accomplished by sending and receiving messages drawn from a small set of simple message types. In addition to talking to your hardware, Player can interface to the robot simulator Stage.
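A minimal sketch of such a client, using the Python bindings generated for libplayerc; the exact module and constant names vary across Player releases, so treat the details below as illustrative:

    from playerc import *

    client = playerc_client(None, "localhost", 6665)  # default Player port
    if client.connect() != 0:
        raise RuntimeError(playerc_error_str())

    pos = playerc_position2d(client, 0)
    if pos.subscribe(PLAYERC_OPEN_MODE) != 0:
        raise RuntimeError(playerc_error_str())

    pos.set_cmd_vel(0.3, 0.0, 0.1, 1)  # forward at 0.3 m/s, turning slowly
    for _ in range(10):
        client.read()                  # wait for the next batch of data
        print("pose: %.2f %.2f %.2f" % (pos.px, pos.py, pos.pa))

    pos.unsubscribe()
    client.disconnect()

Because Stage exposes the same interfaces, the same client program can drive either a real Pioneer or its simulated counterpart.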


Vision-Based Tracking

An overhead vision system for tracking our robots and other lab activities (created by Paolo Pirjanian).


A motion tracking system for recording human torso movement (created by Stefan Weber).



Mezzanine

Mezzanine is an overhead 2D visual tracking package intended primarily for use as a mobile robot metrology system. It uses a camera to track objects marked with color-coded fiducials and infers the pose (position and orientation) of these objects in world coordinates. Mezzanine will work with most color cameras (even the really cheap ones) and can correct for the barrel distortion produced by wide-angle lenses.

Mezzanine is language and architecture neutral: object poses are written into a shared memory-mapped file which other programs can read and make use of as they see fit. Mezzanine itself is written in pure C and was developed under Linux on x86 hardware.
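A sketch of how a client might read poses out of that shared file; the file path and record layout here are invented for illustration, since the real layout is defined by Mezzanine's C headers:

    import mmap
    import struct

    # Hypothetical layout: each tracked object stored as three doubles
    # (x, y, theta). Mezzanine's actual structure differs; this only
    # illustrates the memory-mapped-file IPC idea.
    RECORD = struct.Struct("ddd")
    NUM_OBJECTS = 4

    with open("/tmp/mezzanine.ipc", "rb") as f:  # path is an assumption
        buf = mmap.mmap(f.fileno(), NUM_OBJECTS * RECORD.size,
                        prot=mmap.PROT_READ)
        for i in range(NUM_OBJECTS):
            x, y, theta = RECORD.unpack_from(buf, i * RECORD.size)
            print("object %d: x=%.2f y=%.2f theta=%.2f" % (i, x, y, theta))
        buf.close()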



Laser-Based Motion-Tracking System

One of our research directions involves activity tracking and modeling. We use multiple planar laser rangefinders to track robots and people in the lab. The lasers are automatically calibrated relative to one another using mesh relaxation. A background model representing the positions of static objects is maintained; range readings that do not fit this model are assumed to come from moving objects. Foreground points are then clustered into groups representing the boundaries of objects to be tracked. We have used two approaches for tracking targets:

  1. Kalman filters: Each cluster is tracked by a Kalman filter-based object tracker. Foreground clusters are matched to one or more clusters from the previous scan, and a cluster inherits the Kalman filter of its matched cluster. New object trackers are instantiated for clusters that are not matched with existing clusters. Trackers persist for a duration even when no clusters are associated with them, enabling tracking of temporarily occluded objects (a sketch of such a tracker appears after this list).
  2. Particle filters: Each cluster is assigned to a filter (consisting of 2000 particles) based on its distance to the particle mean. If no particle filter exists within a threshold distance, a new one is instantiated. Filters that are not associated with any clusters for an extended period of time are removed.
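A minimal sketch of the per-cluster tracker from the first approach, using a constant-velocity model; the noise magnitudes are illustrative assumptions, not the values used in the lab:

    import numpy as np

    class ClusterTracker:
        # Constant-velocity Kalman filter for one foreground cluster.
        def __init__(self, x, y, dt=0.1):
            self.state = np.array([x, y, 0.0, 0.0])  # [x, y, vx, vy]
            self.P = np.eye(4)                       # state covariance
            self.F = np.eye(4)
            self.F[0, 2] = self.F[1, 3] = dt         # position += velocity*dt
            self.H = np.eye(2, 4)                    # we observe position only
            self.Q = 0.01 * np.eye(4)                # process noise
            self.R = 0.05 * np.eye(2)                # measurement noise

        def predict(self):
            self.state = self.F @ self.state
            self.P = self.F @ self.P @ self.F.T + self.Q
            return self.state[:2]

        def update(self, z):
            # Fold in the centroid of a matched foreground cluster.
            y = z - self.H @ self.state               # innovation
            S = self.H @ self.P @ self.H.T + self.R
            K = self.P @ self.H.T @ np.linalg.inv(S)  # Kalman gain
            self.state = self.state + K @ y
            self.P = (np.eye(4) - K @ self.H) @ self.P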
The figure shows a snapshot of 3 lasers tracking two people walking in the lab. The people appear as the two clusters of particles; the blue edges are the background model.

For more information on our laser tracking projects, visit Helen Yan's webpage.



Volume-Based Motion Capture

Volume-based markerless motion capture is a motion tracking system that estimates both a kinematic model and joint angles from volume data extracted by multiple cameras.