Support Vector Machine (SVM)

XXst Month 2013. Vol. x No.x © 2005 - 2013 JATIT LLS. All rights reserved.
ISSN: 1992-8645   www.jatit.org   E-ISSN: 1817-3195
data and testing to validate the model that has been obtained.
In this experiment, it can be seen that the SVM requires a larger number of features to perform the classification process well. The maximum accuracy on the training data was obtained at log2 C = -2 with d = 5, reaching 83.02% at CV = 5. These parameter values were then applied to the whole training and testing data to validate the model that has been obtained.
SVM with RBF Kernel

Scenario 3 was conducted using the RBF kernel with three ranges of parameter values: log2 C from 0 to 5, from -2 to 7, and from -5 to 10, and log2 G from -5 to 0, from -7 to 2, and from -10 to 5. This was done to search a wider range of values for the maximum possible accuracy.
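The grid search over powers of two described above can be sketched as follows. This is a hypothetical reconstruction: the paper does not state which SVM toolkit was used, so scikit-learn and the placeholder data are assumptions; only the exponent ranges (log2 C in [-5, 10], log2 Gamma in [-10, 5]) and CV = 5 come from the text.

```python
# Hypothetical sketch of the Scenario-3 grid search with an RBF kernel.
# scikit-learn and the synthetic data are assumptions; the exponent
# ranges and 5-fold cross-validation follow the text.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Placeholder data standing in for the eye-image feature vectors.
X, y = make_classification(n_samples=200, n_features=20, random_state=0)

# Widest range from the text: log2 C in [-5, 10], log2 Gamma in [-10, 5].
param_grid = {
    "C": 2.0 ** np.arange(-5, 11),
    "gamma": 2.0 ** np.arange(-10, 6),
}

search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5)  # CV = 5 as in the text
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 4))
```

The best `(C, gamma)` pair found here would then be refit on the full training data, as the text describes.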
Figure 3: Comparison of C and Gamma Accuracy Results, RBF Kernel
Figure 3 shows that the third scenario obtained the best result, at log2 C = 3 and log2 Gamma = -1 with d = 3, with an accuracy of 90.48% at CV = 5. The effect of the Gamma parameter on the classification process is quite visible in the consistency of this parameter at log2 Gamma = -1. The Gamma parameter is influenced by the C parameter, which varies starting from log2 C = 0. These parameter values were then applied to the whole training and testing data to validate the model that has been obtained.

SVM with Polynomial Kernel
The experiment was conducted using the polynomial kernel with the same three variations of C and G as those used for the RBF kernel, but with one additional parameter: the degree d, whose value was varied from 1 to 6.

Figure 4: Comparison of C and Gamma Accuracy Results, Polynomial Kernel

Figure 4 shows that the third scenario obtained the maximum experimental accuracy, 88.80% at CV = 5, at log2 C = 4 and log2 Gamma = -3 with the default degree d = 3. In this experiment, the influence of the Gamma parameter is clearly seen in the range of -5 to 0, but the results are also affected by the C parameter in a specific pattern.
The parameter values obtained from the Grid Search were then applied again to find the best degree parameter. Table 2 below shows the resulting accuracy for d = 1, 2, 3, 4, 5, and 6.
Table 2: Accuracy results with different degree values (degree d = 1 to 6 vs. accuracy)
The results in Table 2 show that increasing the degree parameter d of the polynomial kernel tends to increase the accuracy. However, a degree d that is too large also makes it difficult for the algorithm to find a consistent hyperplane, so d = 3 can be considered a good choice. These parameter values were then applied to the whole training and testing data to validate the model that has been obtained.
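The degree sweep described above can be sketched as follows. Again this is a hypothetical reconstruction with scikit-learn and placeholder data; the fixed values C = 2^4 and Gamma = 2^-3 mirror the grid-search optimum reported in the text, and d runs over 1 to 6 as in Table 2.

```python
# Hypothetical sketch: fix C and gamma at the grid-search optimum and
# vary the polynomial degree d = 1..6, scoring each with 5-fold CV.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Placeholder data standing in for the eye-image feature vectors.
X, y = make_classification(n_samples=200, n_features=20, random_state=0)

scores = {}
for d in range(1, 7):  # d = 1..6 as in Table 2
    clf = SVC(kernel="poly", C=2.0 ** 4, gamma=2.0 ** -3, degree=d)
    scores[d] = cross_val_score(clf, X, y, cv=5).mean()  # CV = 5

best_d = max(scores, key=scores.get)
print(scores, best_d)
```

On the paper's data this sweep selects d = 3; on other data the best degree will of course differ.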
Comparison of SVM Classification Results
The parameter values obtained in the trials were applied to the entire SVM training and testing data to validate the model that has been obtained. This experiment was conducted to find the best accuracy when the model is applied to the whole training and testing data with different values of d. The comparison of the SVM accuracy results is shown in Figure 5.
Figure 5 shows that the accuracy obtained on the training data is much greater than that obtained on the testing data. Overfitting occurs: the model fits the training data, but when it is applied to the testing data there is a significant degradation in accuracy. This can be caused by variations between the training and testing images of the eye region, or by the eye appearing in several different positions due to the tilt of the face or the shape of the eye.
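The train-versus-test accuracy gap described above can be illustrated with a hypothetical sketch (scikit-learn and synthetic noisy data assumed; the kernel settings echo the polynomial optimum from the text): train on a split, then compare accuracy on the training and held-out sets.

```python
# Hypothetical sketch of the overfitting check: compare accuracy on the
# training split against accuracy on a held-out testing split.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Noisy placeholder data (flip_y adds label noise, making overfitting visible).
X, y = make_classification(n_samples=300, n_features=20, flip_y=0.2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = SVC(kernel="poly", degree=3, C=2.0 ** 4, gamma=2.0 ** -3).fit(X_tr, y_tr)
train_acc = clf.score(X_tr, y_tr)
test_acc = clf.score(X_te, y_te)
print(train_acc, test_acc)  # a large train-minus-test gap indicates overfitting
```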