Learning kernel parameters is important for kernel-based methods because these parameters have a significant impact on the generalization ability of such methods. Besides cross-validation and leave-one-out estimation, minimizing upper bounds on the generalization error, such as the radius-margin bound, has also been proposed as a more efficient way to learn optimal kernel parameters. In this paper, a class separability criterion is proposed for learning kernel parameters: the optimal kernel parameters are taken to be those that maximize the class separability in the induced feature space. With this criterion, learning the kernel parameters of an SVM avoids solving the quadratic programming problem. The relationship between this criterion and the radius-margin bound is also explored. Both theoretical analysis and experimental results show that the class separability criterion is effective for learning kernel parameters in SVMs.
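The abstract does not spell out the criterion itself. A minimal sketch, assuming the commonly used kernel-space scatter ratio tr(S_b)/tr(S_w) (between-class to within-class scatter, computable entirely from the kernel matrix) as the separability measure, and an RBF kernel with width parameter `gamma` selected by grid search; all names and data here are illustrative, not the paper's:

```python
import numpy as np

def rbf_kernel(X, gamma):
    # Gaussian (RBF) kernel from pairwise squared Euclidean distances.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-gamma * d2)

def separability(K, y):
    # Ratio tr(S_b)/tr(S_w) of between- to within-class scatter in the
    # feature space, using only inner products stored in K (kernel trick).
    n = len(y)
    m_norm2 = K.sum() / n**2                      # ||overall mean||^2
    tr_sb, tr_sw = 0.0, 0.0
    for c in np.unique(y):
        idx = np.flatnonzero(y == c)
        n_c = len(idx)
        Kc = K[np.ix_(idx, idx)]
        mc_norm2 = Kc.sum() / n_c**2              # ||class-c mean||^2
        mc_dot_m = K[idx, :].sum() / (n_c * n)    # class mean . overall mean
        tr_sb += n_c * (mc_norm2 - 2 * mc_dot_m + m_norm2)
        tr_sw += np.trace(Kc) - n_c * mc_norm2
    return tr_sb / tr_sw

# Toy two-class data: choose the gamma maximizing feature-space separability,
# with no SVM training (and hence no QP) in the selection loop.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (30, 2)), rng.normal(3, 1, (30, 2))])
y = np.array([0] * 30 + [1] * 30)
gammas = [0.01, 0.1, 1.0, 10.0]
best_gamma = max(gammas, key=lambda g: separability(rbf_kernel(X, g), y))
```

The selected `best_gamma` can then be fixed and a single SVM trained once, which is the efficiency argument: the parameter search itself requires only kernel-matrix statistics, not repeated quadratic programming solves.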