CMU Planning and Autonomy Lab
As part of the CMU Robotics Institute, research at the Planning and Autonomy Lab
focuses on motion planning, perception, and
control for autonomous humanoid robots, quadrupeds, mobile manipulators and UAVs. Our core research involves
developing efficient new techniques for navigation planning, footstep planning, autonomous grasping and
manipulation planning, trajectory planning with dynamic constraints, and integrated perception and planning.

Humanoid Interaction Team
The Digital Human Research Center is part of the
National Institute of Advanced Industrial Science and Technology (AIST),
an innovation hub for scientific research in Japan.
The Humanoid Interaction Team is responsible
for developing practical humanoid robotic platforms and software technology.

Graphical Simulation of Robotic Systems
The goal of this research is to create graphical simulation software for
complex robots such as humanoids. Simulated control, 3D perception,
motion planning for obstacle avoidance, and algorithms for integrating
vision and planning can then be developed and tested safely and at low
cost.

Motion Planning for Humanoids
I am interested in developing algorithms to automatically generate motion for
tasks such as navigation and footstep planning, object grasping and manipulation,
as well as tasks that require full-body, dynamically stable motion planning.
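As a rough illustration of the footstep-planning piece (and not one of the planners
developed in this work), the Python sketch below runs an A* search over a small,
hypothetical set of relative footstep displacements with a placeholder circular
obstacle; all action values, tolerances, and names are illustrative.

```python
# A minimal footstep-planning sketch: A* search over a small set of discrete
# relative footstep displacements in the plane.  The action set, obstacle test,
# and goal tolerance are illustrative placeholders.
import heapq
import math

ACTIONS = [(0.3, 0.0), (0.2, 0.2), (0.2, -0.2), (0.0, 0.3), (0.0, -0.3), (-0.2, 0.0)]

def collides(x, y):
    """Hypothetical obstacle test: a circular obstacle at (1.0, 0.0)."""
    return math.hypot(x - 1.0, y) < 0.4

def plan_footsteps(start, goal, tol=0.15, max_expansions=20000):
    """A* over discretized footstep placements; returns a list of (x, y) steps."""
    def key(p):  # discretize to avoid revisiting near-identical placements
        return (round(p[0], 2), round(p[1], 2))

    open_list = [(math.dist(start, goal), 0.0, start, [start])]
    seen = set()
    while open_list and max_expansions > 0:
        max_expansions -= 1
        _, cost, (x, y), path = heapq.heappop(open_list)
        if math.dist((x, y), goal) < tol:
            return path
        if key((x, y)) in seen:
            continue
        seen.add(key((x, y)))
        for dx, dy in ACTIONS:
            nx, ny = x + dx, y + dy
            if not collides(nx, ny):
                g = cost + math.hypot(dx, dy)
                h = math.dist((nx, ny), goal)
                heapq.heappush(open_list, (g + h, g, (nx, ny), path + [(nx, ny)]))
    return None

if __name__ == "__main__":
    print(plan_footsteps((0.0, 0.0), (2.0, 0.0)))  # steps that skirt the obstacle
```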
Self-Collision Detection for Complex Articulated Structures
This research aims to develop
algorithms for detecting and preventing self-collisions, which
occur when a link of an articulated robot or character model collides with
another link.
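As a toy sketch of the basic test (the research itself targets far more efficient
and exact methods for detailed link geometry), the following Python example
approximates each link by a bounding sphere and checks every pair of non-adjacent
links; the Link class and all geometry here are made up for illustration.

```python
# A minimal self-collision-check sketch: approximate each link of an articulated
# model by a bounding sphere and test all pairs of non-adjacent links.
from dataclasses import dataclass
from itertools import combinations
import math

@dataclass
class Link:
    name: str
    center: tuple        # world-frame center of the link's bounding sphere
    radius: float
    parent: str = ""     # directly connected links are allowed to touch

def self_collisions(links):
    """Return pairs of non-adjacent links whose bounding spheres overlap."""
    hits = []
    for a, b in combinations(links, 2):
        if a.parent == b.name or b.parent == a.name:
            continue  # skip links joined directly by a joint
        if math.dist(a.center, b.center) < a.radius + b.radius:
            hits.append((a.name, b.name))
    return hits

if __name__ == "__main__":
    arm = [
        Link("torso", (0.0, 0.0, 1.0), 0.25),
        Link("upper_arm", (0.3, 0.0, 1.0), 0.15, parent="torso"),
        Link("forearm", (0.1, 0.0, 1.0), 0.15, parent="upper_arm"),
    ]
    print(self_collisions(arm))  # the forearm/torso overlap is reported
```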
Classical Path Planning
The goal of this research is to develop practical and efficient algorithms for
solving path planning problems in high dimensions. Applications include
robotics, assembly analysis, virtual prototyping, pharmaceutical drug design,
manufacturing, and computer
animation.
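One standard family of techniques for this problem is sampling-based planning.
The Python sketch below is a minimal probabilistic roadmap (PRM) in a
d-dimensional unit cube with a placeholder collision checker; it is meant only to
illustrate the idea, not the algorithms developed in this research.

```python
# A minimal PRM sketch: sample configurations, connect nearby collision-free
# samples into a roadmap, then search the roadmap from start to goal.
import math
import random
from collections import deque

def collision_free(q):
    """Hypothetical obstacle: a ball of radius 0.25 centered in the unit cube."""
    return math.dist(q, [0.5] * len(q)) > 0.25

def edge_free(a, b, steps=20):
    return all(collision_free([x + (y - x) * t / steps for x, y in zip(a, b)])
               for t in range(steps + 1))

def prm_plan(start, goal, d=6, n_samples=300, k=10):
    nodes = [list(start), list(goal)] + \
            [[random.random() for _ in range(d)] for _ in range(n_samples)]
    nodes = [q for q in nodes if collision_free(q)]
    adj = {i: [] for i in range(len(nodes))}
    for i, q in enumerate(nodes):   # connect each sample to its k nearest neighbors
        near = sorted(range(len(nodes)), key=lambda j: math.dist(q, nodes[j]))[1:k + 1]
        for j in near:
            if edge_free(q, nodes[j]):
                adj[i].append(j)
                adj[j].append(i)
    prev, frontier = {0: None}, deque([0])   # BFS from start (0) to goal (1)
    while frontier:
        i = frontier.popleft()
        if i == 1:
            path = []
            while i is not None:
                path.append(nodes[i])
                i = prev[i]
            return path[::-1]
        for j in adj[i]:
            if j not in prev:
                prev[j] = i
                frontier.append(j)
    return None

if __name__ == "__main__":
    print(prm_plan([0.1] * 6, [0.9] * 6))
```

The appeal in high dimensions is that the roadmap only ever queries the collision
checker at sampled configurations and along candidate edges, rather than building
an explicit representation of the free space.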
Autonomous Animated Characters
In this research, we explore techniques for creating animated characters whose
motion is generated automatically from high-level task commands. Applications
include virtual reality, video games, web avatars, desktop movie studios, and
other real-time virtual human simulations.

Haptic (Force-Feedback) Virtual Xylophone
This project explores the applications and design of a graphical
virtual instrument that uses a haptic force-feedback Phantom device as part of the
user interface.
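A common way to render contact on a device like the Phantom is a penalty
(spring-damper) force applied when the stylus penetrates a virtual surface. The
sketch below illustrates that idea for striking a bar; it is not this project's
actual controller, and the gains and geometry are placeholders.

```python
# An illustrative penalty-based haptic rendering rule for a virtual xylophone bar:
# when the stylus tip penetrates the bar's top surface, push back with a
# spring-damper force.  Gains and geometry are placeholders.
def contact_force(tip_z, tip_vz, bar_top_z=0.0, k=800.0, b=2.0):
    """Return the upward force (N) to command; zero when not in contact."""
    penetration = bar_top_z - tip_z
    if penetration <= 0.0:
        return 0.0
    return k * penetration - b * tip_vz   # spring pushes out, damper resists motion

if __name__ == "__main__":
    # Sample stylus states: above the bar, then at two penetration depths.
    for tip_z, tip_vz in [(0.002, -0.3), (-0.001, -0.3), (-0.003, 0.1)]:
        print(tip_z, contact_force(tip_z, tip_vz))
```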
Automatic Animation of Human Arm Motions
This research project is an initial attempt to develop methods for
automatically calculating arm motions for animated human characters given a
high-level description of a task and geometric models of the character and
environment.
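As a toy illustration of the kind of computation involved (the project itself
deals with full human arm models and task-level input), here is the classical
closed-form inverse kinematics of a two-link planar arm; the link lengths and
function names are made up.

```python
# Closed-form inverse kinematics for a two-link planar arm: map a desired hand
# position to shoulder and elbow angles.  Link lengths are illustrative only.
import math

def two_link_ik(x, y, l1=0.3, l2=0.25, elbow_up=True):
    """Return (shoulder, elbow) angles in radians reaching (x, y), or None."""
    d2 = x * x + y * y
    c2 = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)   # law of cosines for the elbow
    if not -1.0 <= c2 <= 1.0:
        return None                                    # target out of reach
    elbow = math.acos(c2) * (1.0 if elbow_up else -1.0)
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow

def forward(shoulder, elbow, l1=0.3, l2=0.25):
    """Forward kinematics, used here to verify the solution."""
    x = l1 * math.cos(shoulder) + l2 * math.cos(shoulder + elbow)
    y = l1 * math.sin(shoulder) + l2 * math.sin(shoulder + elbow)
    return x, y

if __name__ == "__main__":
    q = two_link_ik(0.35, 0.2)
    print(q, forward(*q))   # forward(*q) should reproduce (0.35, 0.2)
```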
Motion Planning with Dynamic Constraints
The goal of this research is to solve motion planning problems involving systems with constraints on the velocities induced by the system
dynamics (such as hovercraft, satellites, and space robots).
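To make the constraint concrete: the sketch below is a minimal kinodynamic planner
for a 1-D double integrator (a point mass with bounded acceleration) rather than a
hovercraft or space robot. It grows a randomized tree by propagating sampled
controls through the dynamics, so every edge automatically respects the velocity
and acceleration limits; all bounds and tolerances are illustrative.

```python
# A minimal kinodynamic-planning sketch: an RRT-style randomized tree over the
# (position, velocity) state space of a 1-D double integrator.
import math
import random

A_MAX, V_MAX, DT = 1.0, 1.0, 0.2

def propagate(state, accel, dt=DT):
    """Integrate the double-integrator dynamics for one step."""
    x, v = state
    v = max(-V_MAX, min(V_MAX, v + accel * dt))
    return (x + v * dt, v)

def kinodynamic_rrt(start, goal, iters=5000, tol=0.1):
    tree = {start: None}                       # child state -> (parent, control)
    for _ in range(iters):
        sample = (random.uniform(-2, 2), random.uniform(-V_MAX, V_MAX))
        nearest = min(tree, key=lambda s: math.dist(s, sample))
        accel = random.uniform(-A_MAX, A_MAX)  # sampled control, not a straight line
        new = propagate(nearest, accel)
        tree[new] = (nearest, accel)
        if math.dist(new, goal) < tol:
            controls = []
            while tree[new] is not None:
                new, a = tree[new]
                controls.append(a)
            return controls[::-1]              # accelerations to apply from the start
    return None

if __name__ == "__main__":
    plan = kinodynamic_rrt((0.0, 0.0), (1.0, 0.0))
    print("no plan found" if plan is None else f"{len(plan)} control steps")
```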
Multibody Dynamics Package (Physically Based Simulation)
In this work, I collaborated with Brian Mirtich to develop an efficient library
implementation of Featherstone's algorithm for computing the dynamics of
tree-like structures of rigid bodies in linear time.

Quadruped Walking Robot (Titan VII)
This project involved designing, building, and writing software to control a
quadruped walking robot suitable for construction tasks. The robot was developed
by researchers at the Tokyo Institute of Technology, with whom I had the chance
to work in Japan.

Robot-Assisted 3D Model Acquisition with a Laser Rangefinder
I worked on this project as part of a graphics class. At the time, laser
rangefinders were just emerging as a new technology for model acquisition.

Copper K-Shell Ionization Experiment at SLAC
As an undergraduate, I worked at the Stanford Linear Accelerator Center (SLAC)
as part of a high-energy atomic physics research project.