Walking to grasp: Modeling of human movements as invariants and an application to humanoid robotics
Abstract
Concurrent advancements in mechanical design and motion planning algorithms allow state-of-the-art humanoid robots to exhibit complex and realistic behavior. In the face of this added complexity and the need for humanlike behavior, research has begun to look toward studies in human neuroscience to better organize and guide humanoid robot motion. In this paper, we present one such method of generating anthropomorphic motion by building the "invariants" of human movements and applying them as kinematic tasks. Whole-body motion of 14 healthy participants was recorded during a walking and grasping task. The recorded data were statistically analyzed to extract the invariants that best described the observed motion. These invariants were expressed as a set of rules that were used to synthesize the stereotypy in human motion. We propose an algorithm that reproduces the key parameters of motion, taking into account the knowledge from human movement and the limitations of the target anthropomorph. The results are then generalized such that we can generate motion for targets that were not originally recorded. The algorithmic output is applied in a task-based prioritized inverse kinematics solver to generate dynamically stable and realistic anthropomorphic motion. We illustrate our results on the humanoid HRP-2 by making it walk to and grasp objects at various positions. Our approach complements classical optimization- or motion-planning-based methods and provides interesting perspectives toward the use of human movements for deducing effective cost functions in optimization techniques or heuristics for planning algorithms.
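
The abstract refers to a task-based prioritized inverse kinematics solver. As a rough illustration of that general technique only (not the authors' implementation), the sketch below resolves a low-priority task in the null space of a high-priority one using damped pseudoinverses; the Jacobians, task errors, and dimensions are hypothetical placeholders.

    # Minimal sketch of prioritized (null-space) inverse kinematics.
    # Assumption: two tasks, e.g. a high-priority balance/feet task and a
    # low-priority hand-reaching task, each given by a Jacobian and an error.
    import numpy as np

    def prioritized_ik_step(J1, e1, J2, e2, damping=1e-6):
        """Return a joint-velocity update dq for one prioritized IK step."""
        def dls_pinv(J):
            # Damped least-squares pseudoinverse, robust near singularities.
            return J.T @ np.linalg.inv(J @ J.T + damping * np.eye(J.shape[0]))

        J1_pinv = dls_pinv(J1)
        dq1 = J1_pinv @ e1                        # satisfy the primary task
        N1 = np.eye(J1.shape[1]) - J1_pinv @ J1   # null-space projector of task 1
        # The secondary task acts only through the redundancy left by task 1.
        dq2 = dls_pinv(J2 @ N1) @ (e2 - J2 @ dq1)
        return dq1 + N1 @ dq2                     # re-project for safety with damping

    # Toy usage with random placeholders standing in for robot Jacobians.
    rng = np.random.default_rng(0)
    J1, J2 = rng.standard_normal((3, 30)), rng.standard_normal((6, 30))
    e1, e2 = rng.standard_normal(3), rng.standard_normal(6)
    dq = prioritized_ik_step(J1, e1, J2, e2)

Because the secondary update is projected through N1, it cannot disturb the primary task; this is the basic mechanism by which a prioritized solver can keep a humanoid dynamically stable while still tracking lower-priority, human-derived kinematic tasks.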
Publication Details
Sreenivasa, M., Souères, P. & Laumond, J. (2012). Walking to grasp: Modeling of human movements as invariants and an application to humanoid robotics. IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans, 42 (4), 880-893.