Robotics Research Lab
Robots

Humanoid Robots
Dragonbot
Humanoid Simulations
Deictic Gesture Validation System
Motion-Tracking Systems
Mobile Remote Presence Robots
Mobile Robots
Player, Stage, Gazebo
Sony Dog Robots
Lab Library

Humanoid Robots

Bandit I
Bandit II
ESRA
Nao
PR2
iRobot Ava
Sparky
Maki
Hermes
MyKeepon
Romo

Bandit I

A user-friendly humanoid robot torso platform developed by Living Creatures.

 


Bandit II

Bandit II is an updated version of Bandit I, designed and fabricated by BlueSky Robotics. Our lab currently maintains six of these expressive humanoid platforms.

 


ESRA

ESRA II from Robodyssey Systems LLC.


Nao

The Interaction Lab currently has 11 Nao robots. The Nao is a full-body humanoid robot by Aldebaran Robotics. It is approximately 26" tall, with 26 degrees of freedom in the neck, shoulders, elbows, hands, hips, knees, wrists, and ankles. It has a number of sensors, including an IMU, sonar, cameras, microphones, and force sensors, and it has two speakers and colored LEDs in the eyes for expressive output.


PR2

The PR2 is a mobile manipulation platform, designed and manufactured by Willow Garage.


iRobot Ava

The Ava is a fully holonomic mobile robotics platform designed and manufactured by iRobot. It features integrated autonomous navigation supported by an onboard suite of sensors.


Sparky

Minimatronic Figure (TM) courtesy of Walt Disney Imagineering Research and Development.

 


Maki

Maki is an entirely 3D-printed platform developed by Hello-Robo. It measures 342mm (13.5") tall and has 6 degrees of freedom in its head.

 


Hermes

Hermes is a SARCOS Humanoid, a hydraulic full-body humanoid robot with 44 degrees of freedom, an active vision system with two moving eyes, actuated hands, a built-in Kinect 3D camera, and various touch and IMU sensors. The robot is part of the USC Human Centered Robotics lab for research in robot control, autonomous learning, and human-robot interaction.


MyKeepon

MyKeepon is a consumer-grade version of the Keepon robotic platform. It has a microphone, located in the nose, and can sense when it is being touched on its head or its sides. MyKeepon comes pre-programmed for dancing and reacting to touch, but the platform is easily hacked for more advanced robotics programming.

 


Romo

Romo is a consumer robot by Romotive. It consists of a mobile base and an animated face that runs on either an iPhone or an iPod Touch.

Dragonbot

The Dragonbot is a five-degree-of-freedom dragon robot designed by the MIT Personal Robotics Group. The aesthetics of the robot were designed by Viva La Puppet.

Humanoid Simulations

Bandit II Simulation
General Kinematic Simulation
Humanoid Simulation
Physics-based Simulation

Bandit II Simulation

Bandit II humanoid simulation using the ROS visualization environment (RViz)



General Kinematic Simulation

General kinematic humanoid simulator (developed by Marcelo Kallman)



Humanoid Simulation

Humanoid simulation testbed (using Vortex libraries)



Physics-based Simulation

Physics-based humanoid simulation: Adonis (developed by Jessica Hodgins).


Mobile Remote Presence Robots

The Giraff is a remotely controlled, mobile, human-height physical avatar integrated with a videoconferencing system (including a camera, display, speaker, and microphone). It is powered by motors that can propel and turn the device in any direction, including backwards. The Giraff can turn in place to face a person or object, and its display can be tilted up and down to extend the vertical field of view.

Deictic Gesture Validation System

For experiments in deictic gesture perception and production, we constructed a 12 x 8 foot freestanding screen of transparent acrylic. The screen is used as a planar target on which we measure both the actual target of a given deictic gesture and the target location perceived by an observer. We have conducted experiments with both robot and human gesturers, with study variables including gesture modality and distance. In addition, we have produced a set of markers of various sizes that can be affixed to the screen to study the effects of saliency on gesture perception. When a participant is placed 3 feet from the screen, it covers a horizontal field of view of about 120 degrees and a vertical field of view of about 70 degrees.


screen for gesture validation


Motion-Tracking Systems

Motion Capture from IMU Suit
Vision Based Tracking
Overhead Interaction Toolkit
Mezzanine
Laser-Based Motion-Tracking System
Volume-Based Motion Capture

Overhead Interaction Toolkit (OIT)

OIT is an overhead 2D visual tracking package intended for tracking humans and robots in indoor settings. OIT will work with any camera compatible with the ROS project. Research involving the OIT concerns detecting and interpreting the behavior of people as they interact with robots.

The OIT was developed in C++ and Python, but it is compatible with any language supported by the ROS project. It was developed on x86 hardware, but should be compatible with any POSIX-compliant OS and hardware.

The overhead interaction toolkit


Available as a ROS stack in the usc-ros-pkgs repository:

> svn co https://usc-ros-pkg.svn.sourceforge.net/svnroot/usc-ros-pkg/trunk/OIT
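
As a minimal sketch of consuming OIT output from another ROS node, the C++ listener below subscribes to a tracked-pose topic; the topic name "/oit/tracks" and the use of geometry_msgs/PoseArray are illustrative assumptions, not necessarily the interface OIT actually publishes.

    // Minimal roscpp listener for 2D tracks from an overhead tracker.
    // NOTE: the topic name and message type are assumptions for this sketch.
    #include <ros/ros.h>
    #include <geometry_msgs/PoseArray.h>

    void tracksCallback(const geometry_msgs::PoseArray::ConstPtr& msg)
    {
      ROS_INFO("Received %zu tracked poses", msg->poses.size());
      for (size_t i = 0; i < msg->poses.size(); ++i)
        ROS_INFO("  track %zu: x=%.2f y=%.2f", i,
                 msg->poses[i].position.x, msg->poses[i].position.y);
    }

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "oit_listener");
      ros::NodeHandle nh;
      ros::Subscriber sub = nh.subscribe("/oit/tracks", 10, tracksCallback);
      ros::spin();
      return 0;
    }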



Motion Capture from IMU Suit

A wireless motion tracking system consisting of 3-axis IMUs that measure acceleration, angular rate, and magnetic field strength. The system uses an onboard wearable computer to store, process, and transmit these data to a local laptop or PC, and is an updated version of the motion suit developed previously in the lab.
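
As an illustration of how streams of this kind are commonly processed (not necessarily the pipeline used by this suit), the sketch below fuses accelerometer and gyroscope readings with a simple complementary filter to estimate a single pitch angle; the ImuSample struct and filter gain are assumptions for the example.

    // Illustrative complementary filter for one rotation axis, computed
    // from 3-axis IMU samples. Not the suit's actual processing code.
    #include <cmath>

    struct ImuSample {
      double ax, ay, az;   // accelerometer (m/s^2)
      double gx, gy, gz;   // gyroscope (rad/s)
    };

    // Blend the drift-free (but noisy) accelerometer estimate with the
    // smooth (but drifting) integrated gyro rate.
    double updatePitch(double pitch, const ImuSample& s, double dt,
                       double alpha = 0.98)
    {
      double accelPitch = std::atan2(-s.ax,
                                     std::sqrt(s.ay * s.ay + s.az * s.az));
      double gyroPitch  = pitch + s.gy * dt;
      return alpha * gyroPitch + (1.0 - alpha) * accelPitch;
    }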

This system is currently being used in a robot-based rehabilitation framework. The data from the sensors can be used to assess quality of motion as participants interact with the robot and perform various motor tasks.

In the past, the system has also been used for other projects, including teleoperation of the NASA Robonaut.

Robonaut


Vision Based Tracking

An overhead vision system for tracking our robots and other lab activities (created by Paolo Pirjanian).


A motion tracking system for recording human torso movement (created by Stefan Weber).



Mezzanine

Mezzanine is an overhead 2D visual tracking package intended primarily for use as a mobile robot metrology system. It uses a camera to track objects marked with color-coded fiducials and infers the pose (position and orientation) of these objects in world coordinates. Mezzanine will work with most color cameras (even the really cheap ones) and can correct for the barrel distortion produced by wide-angle lenses.

Mezzanine is language and architecture neutral: object poses are written into a shared memory-mapped file which other programs can read and make use of as they see fit. Mezzanine itself is written in pure C and was developed under Linux on x86 hardware.
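
As a minimal sketch of how an external program might pick up poses from that shared memory-mapped file, the reader below mmaps a file and interprets it as an array of pose records; the file path, table size, and record layout are assumptions for illustration, not Mezzanine's actual format (consult the Mezzanine headers for the real interface).

    // Illustrative reader for a pose table exposed via a memory-mapped file.
    // Path, table size, and struct layout are assumed, not Mezzanine's own.
    #include <cstdio>
    #include <fcntl.h>
    #include <sys/mman.h>
    #include <unistd.h>

    struct ObjectPose { double x, y, theta; };   // hypothetical record layout

    int main()
    {
      const char* path = "/tmp/mezzanine.ipc";   // assumed location
      const int kMaxObjects = 16;                // assumed table size

      int fd = open(path, O_RDONLY);
      if (fd < 0) { perror("open"); return 1; }

      void* mem = mmap(NULL, kMaxObjects * sizeof(ObjectPose),
                       PROT_READ, MAP_SHARED, fd, 0);
      if (mem == MAP_FAILED) { perror("mmap"); close(fd); return 1; }

      const ObjectPose* poses = static_cast<const ObjectPose*>(mem);
      for (int i = 0; i < kMaxObjects; ++i)
        printf("object %d: x=%.2f y=%.2f theta=%.2f\n",
               i, poses[i].x, poses[i].y, poses[i].theta);

      munmap(mem, kMaxObjects * sizeof(ObjectPose));
      close(fd);
      return 0;
    }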



Laser-based motion-tracking system

One of our research directions involves activity tracking and modeling. We use multiple planar laser range finders to track robots and people in the lab. The lasers are automatically calibrated relative to each other using mesh relaxation. A background model representing the positions of static objects is maintained. Range readings that do not fit this model are assumed to come from moving objects. Foreground points are then clustered into groups representing the boundaries of objects to be tracked. We have used two approaches for tracking targets:

  1. Kalman filters: Each cluster is tracked by a Kalman filter-based object tracker. Foreground clusters are matched to one or more clusters from the previous scan. A cluster inherits the Kalman filter from its matched cluster. New object trackers are instantiated for clusters that are not matched with existing clusters. Trackers persist for a duration even when no clusters are associated with them, enabling tracking of temporarily occluded objects.
  2. Particle filters: Each cluster is assigned to a filter (consisting of 2000 particles) based on its distance to the particle mean. If no particle filter exists within a threshold distance, then a new one is instantiated. Filters that are not associated with any clusters for an extended period of time are removed.

The figure shows a snapshot of 3 lasers tracking two people walking in the lab. The people appear as the two clusters of particles; the blue edges are the background model.
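
As a rough sketch of the cluster-to-tracker association step described above (simplified, and not the lab's exact implementation), each foreground cluster is matched to the nearest existing tracker within a gating distance, unmatched clusters spawn new trackers, and all trackers are aged so that stale ones can later be removed.

    // Simplified nearest-neighbor association of foreground clusters to
    // trackers; the gating threshold and Tracker fields are illustrative.
    #include <cmath>
    #include <vector>

    struct Point   { double x, y; };
    struct Tracker { Point mean; int framesSinceUpdate; };

    void associate(std::vector<Tracker>& trackers,
                   const std::vector<Point>& clusters,
                   double gate = 0.5)                    // metres, assumed
    {
      for (Tracker& t : trackers) ++t.framesSinceUpdate; // age every tracker

      for (const Point& c : clusters) {
        Tracker* best = NULL;
        double bestDist = gate;
        for (Tracker& t : trackers) {
          double d = std::hypot(c.x - t.mean.x, c.y - t.mean.y);
          if (d < bestDist) { bestDist = d; best = &t; }
        }
        if (best) {                    // cluster inherits the matched tracker
          best->mean = c;              // stands in for the Kalman update
          best->framesSinceUpdate = 0;
        } else {                       // unmatched cluster: new tracker
          Tracker fresh = { c, 0 };
          trackers.push_back(fresh);
        }
      }
    }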

For more information on our laser tracking projects, visit Helen Yan's webpage.



Volume-Based Motion Capture

Volume-based markerless motion capture is a motion tracking system that estimates both a kinematic model and joint angles from volume data extracted from multiple cameras.


Mobile Robots

The TurtleBot is a small non-holonomic mobile robot, fully integrated with ROS, that includes a netbook and a Kinect for onboard computation and navigation.

Our constantly growing collection of robots started with 20 R1s (affectionately dubbed the Nerd Herd and named after things that come out of a toaster; now retired) and 4 R2s (affectionately dubbed the Socially Mobile and named Don Quixote, Don Corleone, Donna E. Mobile, The Donald, and Don' Work; now also retired). The R1s and the R2s were generously given to the Interaction Lab by Prof. Rodney A. Brooks from the MIT AI Lab. We then added two Pioneers to the family, Benjamin and Mae (the little Volens), then a few more indoor and outdoor versions, and are now, as part of the USC Robotics Lab, up to 13 Pioneer robots (9 Pioneer 2 DXs, 2 Pioneer 1s, and 2 Pioneer 1 ATs), and continue to grow that part of the robot family. We also have two Sony Dog Robots we call Laika and Dogmatix, four tiny Khepera robots, a lawn-mowing robot, and a Phoenix robot from Johuco. Evolution Robotics has also provided us with six ER1 robots, one ER1 SDK, and one Scorpion.

Robot Family (2000)

Robot Family (2003)

Three Evolution Robotics mobile robots.
From left to right: ER1, ER1 SDK, ER Scorpion

Player, Stage, Gazebo

Player is a multi-threaded robot device server. It gives you simple and complete control over the physical sensors and actuators on your mobile robot (Player is currently implemented specifically for the ActivMedia Pioneer 2-DX mobile robot running Linux, but the extensible server architecture should facilitate porting to other platforms). When Player is running on your robot, your client control program connects to it via a standard TCP socket, and communication is accomplished by sending and receiving messages drawn from a small set of simple message types. In addition to talking to your hardware, Player can interface to the robot simulator Stage.

Stage simulates mobile robots moving in and sensing a two-dimensional bitmapped environment, controlled through Player. Player provides a powerful, flexible interface to ActivMedia Pioneer 2 robots; Stage provides virtual Player robots. Various sensor models are provided, including sonar, a scanning laser rangefinder, a pan-tilt-zoom camera with color blob detection, and odometry. Several controllers designed in Stage have been demonstrated to work on real robots.
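
As a minimal sketch of a Player client, the program below uses the libplayerc++ client library to connect to the server over TCP, read the sonar, and drive forward until an obstacle is close; the host, port, device indexes, and speeds are assumptions for the example. Because Stage exposes the same devices through Player, the same client can drive either a real Pioneer or a simulated one.

    // Minimal Player client sketch: connect, read sonar, creep forward.
    // Host/port and device indexes are illustrative defaults.
    #include <libplayerc++/playerc++.h>

    int main()
    {
      using namespace PlayerCc;

      PlayerClient    robot("localhost", 6665);  // TCP connection to Player
      Position2dProxy position(&robot, 0);       // wheels / odometry
      SonarProxy      sonar(&robot, 0);          // sonar ring

      for (;;)
      {
        robot.Read();                            // fetch latest sensor data

        // Index 0 is used for illustration; which sonar faces forward
        // depends on the robot's sonar layout.
        if (sonar[0] < 0.5)
          position.SetSpeed(0.0, 0.0);           // obstacle near: stop
        else
          position.SetSpeed(0.2, 0.0);           // 0.2 m/s forward, no turn
      }
      return 0;
    }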

Gazebo is a mobile robot simulator much like Stage with a focus on outdoor environments. Like Stage, it is capable of simulating a population of robots, sensors and objects, but does so in a three-dimensional world. It generates both realistic sensor feedback and physically plausible interactions between objects (it includes an accurate simulation of rigid-body physics).

For more information, please see this page.

Sony Dog Robots
Laika
Dogmatix

Lab Library

The library includes conference proceedings, journals, and other reading material available in the lab.