Research is underway to explore the feasibility of reconstructing human manipulation skills in complex constrained motion by tracing and learning the manipulation performed by an operator. The approach consists of two major steps. In the first step, the constraints acquired from the operator's trajectory are generalised as manipulation skills. In the second step, the manipulation skills are transformed into a robotic trajectory to perform the task. The operator's trajectory is recorded in a haptic-rendered virtual environment. A proxy algorithm, used for haptic collision detection, has been developed and applied to the physical model of the process. In this work, a six degree-of-freedom (6DOF) haptic device, the PHANToM Premium 1.5, and the haptic rendering package Reachin API, together with VRML and Python, are used to construct the virtual haptic manipulation. The concept is studied on a gear assembly process, which represents a typical constrained-motion, force-sensitive manufacturing task with the attendant issues of jamming, tight clearances, and the need for quick mating times. In the developed system, a human operator demonstrates both good and bad examples of the desired behaviour in the haptic virtual environment. Position, contact force, and torque data generated in the virtual environment, combined with a priori knowledge about the task, are used to identify and learn the skills in the newly demonstrated task. The captured data are pre-processed offline to remove noise, which primarily consists of wrong or irrelevant manipulation steps; this noise is removed during analysis. The optimum data are then generalised, and locally weighted regression (LWR) is employed to discover the constraints of the operator's skill. The robot evaluates the operator's performance and thus learns the best way to reproduce that behaviour. The concept behind the project is described, and the approach developed and the results obtained are reported.
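The abstract names locally weighted regression (LWR) as the method for discovering constraints in the demonstration data. As a rough illustration of the technique only (not the paper's implementation: the Gaussian kernel, the bandwidth value, and the locally linear model are all assumptions here), LWR fits a separate weighted linear model around each query point, so that nearby demonstration samples dominate the local fit:

```python
import numpy as np

def lwr_predict(X, y, x_query, bandwidth=0.3):
    """Locally weighted regression: fit a weighted linear model
    around x_query and return its prediction there.

    X: (n, d) training inputs, y: (n,) targets.
    The Gaussian kernel and bandwidth are illustrative choices,
    not details taken from the paper.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    xq = np.asarray(x_query, dtype=float)

    # Gaussian kernel weights: samples near the query count more.
    d2 = np.sum((X - xq) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))

    # Augment with a bias column and solve the weighted normal equations.
    A = np.hstack([X, np.ones((X.shape[0], 1))])
    W = np.diag(w)
    beta, *_ = np.linalg.lstsq(A.T @ W @ A, A.T @ W @ y, rcond=None)
    return float(np.append(xq, 1.0) @ beta)

# Hypothetical 1-D example: four demonstration samples on a line.
X = [[0.0], [1.0], [2.0], [3.0]]
y = [0.0, 1.0, 2.0, 3.0]
print(lwr_predict(X, y, [1.5]))  # prints a value very close to 1.5
```

In the demonstration-learning setting described above, the inputs would be quantities such as position along the mating trajectory and the targets the recorded contact forces or torques, with one local model evaluated per query state.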