We present a framework for online imitation of human motion by the humanoid robot HRP-2. We introduce a representation of human motion, the humanoid-normalized model, and a Center of Mass (CoM) anticipation model that prepares the robot for the moment the human lifts a foot. The proposed motion representation encodes both operational-space and geometric information. Whole-body robot motion is computed with a task-based prioritized inverse kinematics solver: by setting the human motion model as the target and assigning high priority to maintaining the robot's CoM, we achieve a large range of motion imitation. We present two imitation scenarios: in the first, the humanoid mimics a human dancing motion; in the second, it balances on one foot. Our results show that a large range of motion can be effectively transferred from the human to the humanoid. We also evaluate the tracking errors between the original and imitated motions, and discuss the restrictions this approach places on the range of transferable human motions.
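The task-based prioritized inverse kinematics mentioned above can be sketched as a standard two-level nullspace projection: the high-priority task (here, CoM maintenance) is satisfied exactly when feasible, and the secondary task (tracking the imitation target) is solved only within its nullspace. This is a generic formulation under stated assumptions; the function and variable names are illustrative, not taken from the paper's implementation.

```python
import numpy as np

def prioritized_ik(J1, dx1, J2, dx2):
    """Two-level prioritized differential IK via nullspace projection.

    J1, dx1 -- Jacobian and desired velocity of the primary task
               (e.g. keeping the CoM over the support polygon).
    J2, dx2 -- Jacobian and desired velocity of the secondary task
               (e.g. tracking the humanoid-normalized motion target).
    Returns joint velocities dq that achieve task 1 exactly (when
    feasible) and task 2 as well as possible in task 1's nullspace.
    """
    J1p = np.linalg.pinv(J1)
    N1 = np.eye(J1.shape[1]) - J1p @ J1      # nullspace projector of task 1
    dq1 = J1p @ dx1                          # velocity solving the primary task
    # Solve the secondary task restricted to the nullspace of task 1.
    z = np.linalg.pinv(J2 @ N1) @ (dx2 - J2 @ dq1)
    return dq1 + N1 @ z
```

With enough redundancy (more joints than total task dimensions), both tasks are met exactly; otherwise the secondary task degrades gracefully while the primary one is preserved, which is what allows balance to be held while imitating aggressive motions.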