Abstract
The support vector machine (SVM), a well-known classifier, has several parameters associated with its various kernel functions. With the radial basis function (RBF) kernel, the most commonly used, two parameters must be optimized. Optimizing these parameter values is known in the literature as model selection, and its outcome strongly influences the classifier's performance. Another factor affecting classification performance is the feature subset. These two factors are interdependent and must therefore be dealt with simultaneously. Following the multiobjective formulation of feature selection, we apply a multiobjective genetic algorithm, NSGA-II, to optimize the feature subset and the model parameters simultaneously. A comparison of the proposed approach with the grid algorithm and a GA-based method shows that the MOGA-based approach outperforms the grid algorithm and performs as well as the GA-based approach. Moreover, it yields multiple solutions rather than a single one, so users can choose one feature subset over another according to their requirements and available resources.
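For context, the RBF kernel mentioned above is typically written as K(x, y) = exp(-γ‖x − y‖²), where γ is one of the two parameters tuned during model selection (the other being the SVM penalty term). The following minimal sketch (an illustration, not code from the paper) shows how this kernel value is computed:

```python
import math

def rbf_kernel(x, y, gamma):
    """RBF kernel: K(x, y) = exp(-gamma * ||x - y||^2).

    gamma controls the kernel width; together with the SVM
    penalty parameter it forms the pair tuned in model selection.
    """
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-gamma * sq_dist)

# Identical points always give the maximum similarity of 1.
print(rbf_kernel([1.0, 2.0], [1.0, 2.0], gamma=0.5))  # → 1.0
```

Larger γ makes the kernel decay faster with distance, which is why γ must be tuned jointly with the feature subset: changing the features changes the distances the kernel sees.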