Support Vector Machine (SVM)

© 2005 - 2013 JATIT LLS. All rights reserved. ISSN: 1992-8645, E-ISSN: 1817-3195, www.jatit.org

In this experiment it can be seen that the SVM requires a larger number of features to perform the classification well. The maximum accuracy on the training data, 83.02 at CV = 5, was obtained at log2 C = -2 with d = 5. These parameters were then applied to the whole training data, and testing was performed to validate the resulting model.

SVM with RBF kernel

Scenario 3 was conducted using the RBF kernel with log2 C varied over the ranges 0 to 5, -2 to 7, and -5 to 10, and log2 G over the ranges -5 to 0, -7 to 2, and -10 to 5. The ranges were widened in this way to search for the maximum accuracy over as large a region as possible.

Figure 3: Comparison of accuracy over C and Gamma for the RBF kernel

Figure 3 shows that the third scenario obtains its best result at log2 C = 3 and log2 Gamma = -1, with an accuracy of 90.48 at CV = 5. The effect of the Gamma parameter on the classification is clearly visible in the consistency of this parameter at log2 Gamma = -1; Gamma is in turn influenced by the parameter C, whose effect starts from log2 C = 0. These parameters were then applied to the whole training data, and testing was performed to validate the resulting model.

SVM with polynomial kernel

This experiment used the polynomial kernel with the same three variations of C and G as the RBF kernel, with one additional parameter, the degree d, taking values from 1 to 6.

Figure 4: Comparison of accuracy over C and Gamma for the polynomial kernel

Figure 4 shows that the third scenario reaches its maximum accuracy, 88.80 at CV = 5, at log2 C = 4 and log2 Gamma = -3 with the default degree d = 3.
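A grid search of this kind over log2-spaced C and Gamma values with 5-fold cross-validation can be sketched with scikit-learn. This is an illustrative reconstruction, not the paper's actual code, and the generated data merely stands in for the 2DPCA eye features:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Placeholder data standing in for the 2DPCA eye features (assumption).
X, y = make_classification(n_samples=300, n_features=20, random_state=0)

# Coarse log2-spaced grids mirroring the widest ranges in the text:
# log2 C in [-5, 10] and log2 Gamma in [-10, 5], stepped by 2 to keep
# the sketch fast.
param_grid = {
    "C": 2.0 ** np.arange(-5, 11, 2),
    "gamma": 2.0 ** np.arange(-10, 6, 2),
}

# 5-fold cross-validation matches the CV = 5 protocol used in the text.
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 4))
```

Narrower or shifted ranges, as in the three scenarios above, only change the `np.arange` bounds.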
In this experiment the influence of the Gamma parameter is quite clearly visible in the range -5 to 0, although the results are also affected by the parameter C in a specific pattern. The parameter values obtained from the grid search were then applied again to find the best degree. Table 2 reports the resulting accuracy for d = 1, 2, 3, 4, 5, and 6.

Table 2: Accuracy of results with different degree values (d = 1 to 6)

The results in Table 2 show that accuracy initially increases as the degree d of the polynomial kernel is raised, but a degree that is too large makes it difficult for the algorithm to find a consistent hyperplane, so d = 3 can be taken as the best value. These parameters were then applied to the whole training data, and testing was performed to validate the resulting model.

Comparison of SVM classification results

The parameters obtained in the trials were applied to the entire SVM training and testing data to validate the model, in order to obtain the best accuracy on the whole training and testing data for different values of d. The comparison of the resulting SVM accuracies is shown in Figure 5, where the accuracy on the training data is much greater than on the testing data: the model overfits the training data, and accuracy degrades significantly when it is applied to the testing data. This can be caused by variation between the training and testing images, with parts of the eye missing or the eye appearing in several different positions due to the slope of the face or the shape of the eye.
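The degree sweep behind Table 2, together with the train/test comparison of Figure 5, can be sketched as follows (again with placeholder data in place of the eye features; the fixed C and Gamma mirror the grid-search optimum reported above):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.svm import SVC

# Placeholder data standing in for the 2DPCA eye features (assumption).
X, y = make_classification(n_samples=300, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Sweep polynomial degree d = 1..6 at the fixed C and Gamma found by
# the grid search (log2 C = 4, log2 Gamma = -3).
for d in range(1, 7):
    clf = SVC(kernel="poly", degree=d, C=2.0 ** 4, gamma=2.0 ** -3)
    cv_acc = cross_val_score(clf, X_train, y_train, cv=5).mean()
    clf.fit(X_train, y_train)
    # Comparing train and test accuracy exposes the overfitting gap
    # discussed above.
    print(d, round(cv_acc, 3),
          round(clf.score(X_train, y_train), 3),
          round(clf.score(X_test, y_test), 3))
```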

3.2 Comparison of RBF SVM Classification with Euclidean Distance

The best SVM classification results, obtained using the RBF kernel, were compared with the 2DPCA validation results obtained using Euclidean distance. The comparison in Figure 6 shows that classification using SVM gives much better results than using Euclidean distance alone.
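The Euclidean-distance baseline amounts to nearest-neighbour matching on the feature vectors. A minimal sketch of the comparison, assuming scikit-learn and placeholder data in place of the 2DPCA features:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Placeholder data standing in for the 2DPCA eye features (assumption).
X, y = make_classification(n_samples=300, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Euclidean-distance matching: 1-nearest-neighbour with the Euclidean metric.
euclid = KNeighborsClassifier(n_neighbors=1, metric="euclidean").fit(X_tr, y_tr)
# RBF-kernel SVM with the best grid-search parameters reported above.
svm = SVC(kernel="rbf", C=2.0 ** 3, gamma=2.0 ** -1).fit(X_tr, y_tr)

print("euclidean:", round(euclid.score(X_te, y_te), 3))
print("svm-rbf:  ", round(svm.score(X_te, y_te), 3))
```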

3.3 Optimization using GA

After the experiments in the previous stage, an optimization process was carried out using a GA on the RBF kernel. The GA was initialized with the following parameters:

Number of bits per variable = 10
Population size = 5
Lower limit of variable C = 1
Upper limit of variable C = 8
Crossover probability = 0.8
Mutation probability = 0.1
Lower limit of variable G = 0.25
Upper limit of variable G = 2

Figure 7: Comparison of fitness over GA iterations

Figure 7 shows that the GA performed the optimization well: the optimal result it found, an accuracy of 90.12 at CV = 5 with Gamma = 0.5009 and C = 4.8201, is almost the same as that of the grid search in the previous experiments. In this experiment the GA did not contribute very significantly to the accuracy obtained, but the parameters it found agree with the tentative conclusion that the best SVM accuracy with the RBF kernel is reached at log2 G = -1 and log2 C ≥ 0. These parameters were then applied to the whole training data, and testing was performed to validate the resulting model. The results are presented in Table 3.

Table 3: Testing results of GA parameter optimization for the SVM RBF kernel

Data       TP     TN     FN    FP
Training   1023   3200   0     1
Testing    196    594    64    6

The testing results show that the accuracy obtained by applying the training parameters to the testing data is much lower than the accuracy obtained on the training data. In this experiment the GA successfully performed SVM parameter optimization with the RBF kernel for the eye detection system.
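A minimal GA of the kind described, with 10-bit encoding per variable, a population of 5, crossover probability 0.8, mutation probability 0.1, and cross-validated SVM accuracy as the fitness function, could be sketched as follows. This is a hypothetical reconstruction on placeholder data; the elitist selection scheme is an assumption, since the paper does not specify one:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Placeholder data standing in for the 2DPCA eye features (assumption).
X, y = make_classification(n_samples=200, n_features=20, random_state=0)

BITS, POP, P_CROSS, P_MUT = 10, 5, 0.8, 0.1   # settings from the text
BOUNDS = {"C": (1.0, 8.0), "gamma": (0.25, 2.0)}

def decode(bits, lo, hi):
    """Map a 10-bit string to a real value in [lo, hi]."""
    return lo + int("".join(map(str, bits)), 2) / (2 ** BITS - 1) * (hi - lo)

def fitness(chrom):
    """Fitness = mean 5-fold cross-validated accuracy of the decoded SVM."""
    C = decode(chrom[:BITS], *BOUNDS["C"])
    gamma = decode(chrom[BITS:], *BOUNDS["gamma"])
    return cross_val_score(SVC(kernel="rbf", C=C, gamma=gamma), X, y, cv=5).mean()

pop = rng.integers(0, 2, size=(POP, 2 * BITS))
for generation in range(10):
    scores = np.array([fitness(c) for c in pop])
    order = np.argsort(scores)[::-1]
    parents = pop[order[:2]]
    children = [parents[0]]                       # elitism: keep the best
    while len(children) < POP:
        a, b = parents[0].copy(), parents[1].copy()
        if rng.random() < P_CROSS:                # one-point crossover
            cut = rng.integers(1, 2 * BITS)
            a[cut:], b[cut:] = b[cut:].copy(), a[cut:].copy()
        for child in (a, b):
            child ^= rng.random(2 * BITS) < P_MUT  # bit-flip mutation
            if len(children) < POP:
                children.append(child)
    pop = np.array(children)

best = max(pop, key=fitness)
print("C =", round(decode(best[:BITS], *BOUNDS["C"]), 4),
      "gamma =", round(decode(best[BITS:], *BOUNDS["gamma"]), 4))
```

With the bounds above, any decoded C lies in [1, 8] and any decoded gamma in [0.25, 2], matching the initialization limits listed in the text.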

4. Conclusion

The eye detection model using 2DPCA feature extraction and SVM classification produces its highest accuracy with the RBF kernel, at log2 Gamma = -1 and log2 C ≥ 0, with an accuracy of 99.97 on the training data and 88.16 on the testing data.
