a. Normality Test

The Jarque–Bera (JB) test of normality is an asymptotic, or large-sample, test. It is also based on the OLS residuals. The test first computes the skewness and kurtosis measures of the OLS residuals and then forms the following test statistic:

JB = n [ S²/6 + (K − 3)²/24 ] ~ χ²(2)    (3.7)

in which:
n = sample size
S = skewness coefficient
K = kurtosis coefficient
H0: the residuals are normally distributed
Ha: the residuals are not normally distributed

Under the null hypothesis of normality, the JB statistic follows a chi-square distribution with 2 degrees of freedom. If the computed p value of the JB statistic in an application is sufficiently low, which happens when the value of the statistic is very different from zero, the null hypothesis of normality can be rejected. If the p value is reasonably high, which happens when the value of the statistic is close to zero, the normality assumption is not rejected. In short, the JB test combines the skewness and kurtosis measures: if the JB statistic is smaller than the chi-square critical value from the table, the residuals are normally distributed (Firmansyah, 2000).
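As a concrete illustration, the following minimal sketch in Python evaluates Eq. (3.7) on a vector of OLS residuals and reports its p value against the chi-square distribution with 2 degrees of freedom. The helper name jarque_bera and the randomly generated residuals are illustrative assumptions only, not part of the data or procedure used in this research.

```python
# Minimal sketch of the Jarque-Bera normality test on OLS residuals (Eq. 3.7).
import numpy as np
from scipy import stats

def jarque_bera(residuals):
    """Compute JB = n * (S^2/6 + (K - 3)^2/24) and its chi-square(2) p value."""
    e = np.asarray(residuals, dtype=float)
    n = e.size
    s = stats.skew(e)                     # skewness coefficient S
    k = stats.kurtosis(e, fisher=False)   # (non-excess) kurtosis coefficient K
    jb = n * (s**2 / 6.0 + (k - 3.0)**2 / 24.0)
    p_value = 1.0 - stats.chi2.cdf(jb, df=2)  # JB ~ chi-square with 2 d.f. under H0
    return jb, p_value

# Hypothetical residuals (random normal data used purely for illustration)
rng = np.random.default_rng(0)
resid = rng.normal(size=200)
jb, p = jarque_bera(resid)
print(f"JB = {jb:.3f}, p value = {p:.3f}")
# Decision rule: do not reject H0 (normality) when the p value is high,
# i.e. when the JB statistic is below the chi-square(2) critical value.
```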

b. Multicollinearity Test

Multicollinearity refers to a linear relationship, exact or nearly exact, among the independent variables (Gujarati, 2003); multicollinearity problems arise when the independent variables are correlated with one another. Besides reducing the ability of the model to explain and to predict, multicollinearity also makes the t tests of the coefficients unreliable indicators. One of the assumptions of the classical linear regression model is that there is no multicollinearity among the explanatory variables. The purpose of the multicollinearity test is to find out whether the independent variables in the regression model are linearly correlated with one another. If multicollinearity occurs, the variances of the estimates become larger, so the estimators remain unbiased but are no longer efficient.

The term multicollinearity is due to Ragnar Frisch. Originally it meant the existence of a "perfect", or exact, linear relationship among some or all explanatory variables of a regression model. For the k-variable regression involving the explanatory variables X1, X2, …, Xk (where X1 = 1 for all observations to allow for the intercept term), an exact linear relationship is said to exist if the following condition is satisfied:

λ1X1 + λ2X2 + … + λkXk = 0    (3.8)

where λ1, λ2, …, λk are constants that are not all zero simultaneously.

The consequences of multicollinearity are as follows: if there is perfect collinearity among the X's, their regression coefficients are indeterminate and their standard errors are not defined. If collinearity is high but not perfect, estimation of the regression coefficients is possible but their standard errors tend to be large. As a result, the population values of the coefficients cannot be estimated precisely. However, if the objective is to estimate linear combinations of these coefficients, the estimable functions, this can be done even in the presence of perfect multicollinearity.

The speed with which variances and covariances increase can be seen with the variance-inflating factor (VIF), which is defined as:

VIF_j = 1 / (1 − R_j²)    (3.9)

where R_j² is the coefficient of determination obtained when regressor X_j is regressed on the remaining regressors. The VIF shows how the variance of an estimator is inflated by the presence of multicollinearity. As R_j² approaches 1, the VIF approaches infinity. That is, as the extent of collinearity increases, the variance of an estimator increases, and in the limit it can become infinite. As can readily be seen, if there is no collinearity between X2 and X3, the VIF will be 1. It may also be noted that the inverse of the VIF is called the tolerance (TOL). That is:

TOL_j = 1 / VIF_j = 1 − R_j²    (3.10)

When R_j² = 1 (i.e., perfect collinearity), TOL_j = 0, and when R_j² = 0 (i.e., no collinearity whatsoever), TOL_j = 1. Because of the intimate connection between the VIF and TOL, the two can be used interchangeably.

In this research the multicollinearity test is carried out through auxiliary regressions, in which each explanatory variable x_i is regressed on the remaining explanatory variables. The criterion is that if the R² of the main regression equation is higher than the R² of every auxiliary regression, there is no multicollinearity in the model. The F statistic associated with each auxiliary regression is:

F_i = [ R²_{xi·x2 x3 ⋯ xk} / (k − 2) ] / [ (1 − R²_{xi·x2 x3 ⋯ xk}) / (n − k + 1) ]    (3.11)

where R²_{xi·x2 x3 ⋯ xk} is the coefficient of determination of the auxiliary regression of x_i on the remaining explanatory variables, k is the number of variables in the model (including the intercept term), and n is the sample size.
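The sketch below illustrates, under stated assumptions, how the auxiliary-regression diagnostics of Eqs. (3.9)–(3.11) can be computed: each regressor is regressed on the remaining regressors, and the resulting R², VIF, TOL, and F statistic are reported. The DataFrame X, the helper auxiliary_diagnostics, and the simulated variables are hypothetical; note that k in the code counts the regressors excluding the constant, so the degrees of freedom (k − 1, n − k) correspond to (k − 2, n − k + 1) in the notation of Eq. (3.11), where k includes the intercept term.

```python
# Minimal sketch of the auxiliary-regression multicollinearity check (Eqs. 3.9-3.11),
# assuming a pandas DataFrame X whose columns are the explanatory variables.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def auxiliary_diagnostics(X: pd.DataFrame) -> pd.DataFrame:
    """Regress each regressor on the others; report R^2, VIF, TOL, and the F statistic."""
    n, k = X.shape  # k = number of regressors, excluding the constant
    rows = []
    for col in X.columns:
        y_aux = X[col]
        X_aux = sm.add_constant(X.drop(columns=[col]))
        r2 = sm.OLS(y_aux, X_aux).fit().rsquared            # R_j^2 of the auxiliary regression
        vif = 1.0 / (1.0 - r2)                              # Eq. 3.9
        tol = 1.0 - r2                                      # Eq. 3.10
        f_stat = (r2 / (k - 1)) / ((1.0 - r2) / (n - k))    # auxiliary-regression F test (cf. Eq. 3.11)
        rows.append({"variable": col, "R2_aux": r2, "VIF": vif, "TOL": tol, "F": f_stat})
    return pd.DataFrame(rows)

# Hypothetical data with deliberately correlated regressors
rng = np.random.default_rng(1)
x1 = rng.normal(size=100)
x2 = 0.8 * x1 + 0.2 * rng.normal(size=100)  # strongly related to x1
x3 = rng.normal(size=100)
X = pd.DataFrame({"x1": x1, "x2": x2, "x3": x3})
print(auxiliary_diagnostics(X))
```

In such output, a high auxiliary R² (and hence a high VIF, low TOL, and significant F statistic) for a variable would indicate that it is largely explained by the other regressors, which is the symptom of multicollinearity described above.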

c. Autocorrelation Test