We present a framework for online imitation of human motion by the humanoid robot HRP-2. We introduce a representation of human motion, the humanoid-normalized model, and a Center of Mass (CoM) anticipation model that prepares the robot in case the human lifts a foot. Our proposed motion representation encodes operational-space and geometric information. Whole-body robot motion is computed using a task-based prioritized inverse kinematics solver. By setting the human motion model as the target and assigning high priority to maintaining the robot's CoM, we can achieve a large range of motion imitation. We present two scenarios of motion imitation: one in which the humanoid mimics a human dancing motion, and another in which it balances on one foot. Our results show that we can effectively transfer a large range of motion from the human to the humanoid. We also evaluate the tracking errors between the original and imitated motion, and discuss the restrictions on the range of transferable human motions under this approach.
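As a rough illustration of the prioritized scheme described above (not the authors' actual solver), the sketch below shows one velocity-level step of a standard two-level task-priority inverse kinematics update: the CoM task is solved first, and the imitation task is resolved only in its null space so that it cannot disturb balance. All function names, the damping parameter, and the two-task structure are illustrative assumptions.

```python
import numpy as np

def damped_pinv(J, lam=1e-4):
    """Damped least-squares pseudoinverse for numerical robustness near singularities."""
    return J.T @ np.linalg.inv(J @ J.T + lam * np.eye(J.shape[0]))

def prioritized_ik_step(J_com, err_com, J_imit, err_imit):
    """One step of two-level prioritized inverse kinematics.

    J_com, err_com   : Jacobian and task error of the CoM task (highest priority).
    J_imit, err_imit : Jacobian and task error of the imitation task (lower priority).
    Returns the joint-velocity update dq.
    """
    n = J_com.shape[1]
    # Highest priority: track the CoM reference.
    J_com_pinv = damped_pinv(J_com)
    dq = J_com_pinv @ err_com
    # Null-space projector of the CoM task.
    N = np.eye(n) - J_com_pinv @ J_com
    # Lower priority: imitate the human motion only inside that null space.
    dq += damped_pinv(J_imit @ N) @ (err_imit - J_imit @ dq)
    return dq
```

With this ordering, the imitation target is followed as closely as the balance constraint allows, which matches the paper's strategy of giving CoM maintenance a higher priority than motion mimicry.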
Citation
Montecillo-Puente, F., Sreenivasa, M. N. & Laumond, J. (2010). On Real-time Whole-body Human to Humanoid Motion Transfer. In J. Filipe, J. A. Cetto & J. Ferrier (Eds.), Proceedings of the 7th International Conference on Informatics in Control, Automation and Robotics - Volume 2 (pp. 22-31). SciTePress.
Parent title
ICINCO 2010 - Proceedings of the 7th International Conference on Informatics in Control, Automation and Robotics