Our research group is interested in perception and motion planning for mobile robots.
Perception: A core problem in robotics is simultaneous localization and mapping (SLAM). We have researched estimation techniques such as particle filtering, Kalman filtering, and smoothing and mapping (SAM). We also have experience with a variety of sensors (sonar, LIDAR, vision, acoustics, radar, and inertial). We have applied these methods in a number of different domains: speech source tracking, marine vehicle navigation, camera localization, and humanoid state estimation.
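As a flavor of the estimation techniques mentioned above, here is a minimal sketch of a one-dimensional Kalman filter. The function name and noise parameters are illustrative assumptions, not drawn from any specific project of ours.

```python
# Minimal 1D Kalman filter sketch: the state is modeled as a random
# walk, so the predict step only inflates uncertainty, and the update
# step blends prediction and measurement by the Kalman gain.
# (q and r are assumed noise parameters, chosen for illustration.)

def kalman_step(x, P, z, q=0.01, r=0.5):
    """One predict/update cycle for a scalar random-walk state.

    x: state estimate, P: estimate variance,
    z: new measurement, q: process noise, r: measurement noise.
    """
    # Predict: mean unchanged, variance grows by the process noise.
    P = P + q
    # Update: Kalman gain weights the measurement against the prediction.
    K = P / (P + r)
    x = x + K * (z - x)
    P = (1.0 - K) * P
    return x, P

# Usage: filter noisy measurements of a roughly constant signal.
x, P = 0.0, 1.0
for z in [1.2, 0.9, 1.1, 1.0, 0.95]:
    x, P = kalman_step(x, P, z)
```

The same predict/update structure underlies the multivariate filters used in practice; smoothing and mapping (SAM) approaches instead solve for the whole trajectory at once.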
Motion Planning: Humanoid robots typically have many more degrees of freedom (DOF) than typical fixed-base robots. This redundancy is an advantage: a given goal can be reached in many different ways. However, it also brings complexity, as the high number of DOFs (e.g. 30) and the requirement to remain balanced mean that typical planning approaches designed for low-dimensional robots (e.g. 7 DOF) do not scale. We are interested in hybrid planning approaches that avoid this issue.
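To illustrate why sampling-based methods are attractive in high-DOF spaces, here is a minimal RRT sketch in an n-dimensional unit cube. It is obstacle-free and ignores balance constraints; all names and parameters (step size, goal bias, tolerance) are illustrative assumptions, not our actual planner.

```python
# Minimal RRT sketch: grow a tree from the start configuration toward
# random samples, stepping a fixed distance from the nearest node each
# iteration, until a node lands near the goal. Works for any DOF count.
import math
import random

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def rrt(start, goal, dof, step=0.1, goal_tol=0.15, iters=2000, seed=0):
    rng = random.Random(seed)
    nodes = [tuple(start)]
    parent = {tuple(start): None}
    for _ in range(iters):
        # Sample a random configuration (goal-biased 10% of the time).
        sample = tuple(goal) if rng.random() < 0.1 else tuple(
            rng.random() for _ in range(dof))
        # Extend the nearest tree node a fixed step toward the sample.
        near = min(nodes, key=lambda n: dist(n, sample))
        d = dist(near, sample)
        if d == 0:
            continue
        new = tuple(n + step * (s - n) / d for n, s in zip(near, sample))
        nodes.append(new)
        parent[new] = near
        if dist(new, goal) < goal_tol:
            # Walk back to the root to recover the path.
            path = [new]
            while parent[path[-1]] is not None:
                path.append(parent[path[-1]])
            return path[::-1]
    return None

path = rrt(start=[0.1] * 6, goal=[0.9] * 6, dof=6)
```

Because the planner only ever touches sampled configurations, its cost does not blow up with a grid over all 30 dimensions; the hard part a real humanoid planner must add is rejecting samples that violate balance and collision constraints.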
Current Research Projects:
- Humanoid Motion Planning
- Humanoid State Estimation
- Dense 3D Vision
- NASA/NSF Space Robotics Challenge (upcoming)
- Long Term Visual Mapping
Previous Research Projects:
- DARPA Robotics Challenge
- RGB-D Localization
- AUV Target Relocalization
- Assistive Technology
- Man Portable Mapping (2011-2012)
- Cooperative Underwater Vehicle Navigation (2008-2011)
- Mapping of Large Scale Complex Marine Structures (2010)
- Acoustic Speech Source Tracking (PhD work, 2004-2008)