b. Statistical Normality Test
A graphical normality test can be misleading if it is not examined carefully. It is therefore recommended to complement the graphical test with a statistical normality test (Ghozali, 2011:163). In addition to inspecting the normal P-plot curve, normality can also be tested with the Kolmogorov-Smirnov test. In the Kolmogorov-Smirnov test the hypotheses are:
H0 = the sample is drawn from a normally distributed population.
Ha = the sample is drawn from a population that is not normally distributed.
In this test, if sig. < 0,05 the data are not normally distributed; however, if sig. > 0,05 the data are normally distributed (Santoso, 2011:193-196).
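The decision rule above can be sketched with the Kolmogorov-Smirnov test in scipy. This is a minimal illustration only: the sample data, variable names, and seed are hypothetical, not taken from the thesis.

```python
import numpy as np
from scipy import stats

# Hypothetical sample data (the thesis would use the actual
# questionnaire scores here).
rng = np.random.default_rng(seed=0)
sample = rng.normal(loc=50, scale=10, size=200)

# Standardize the sample, then compare it against the standard
# normal distribution with the one-sample Kolmogorov-Smirnov test.
z = (sample - sample.mean()) / sample.std(ddof=1)
statistic, p_value = stats.kstest(z, "norm")

# Decision rule from the text: sig. (p-value) < 0,05 -> reject H0
# (data not normal); sig. > 0,05 -> fail to reject H0 (data normal).
is_normal = p_value > 0.05
```

In practice SPSS (as cited from Santoso) reports the same statistic and significance value; the rule for reading them is identical.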
2. Multicollinearity Test
The multicollinearity test examines whether the regression model exhibits correlation among the independent variables: service quality, sales promotion, and customer satisfaction. In a good regression model there should be no correlation among the independent variables (Ghozali, 2011:105).
If the independent variables are correlated, they are not orthogonal. Orthogonal variables are independent variables whose correlations with the other independent variables equal zero. The presence or absence of multicollinearity in a regression model can be detected as follows:
a. The R² produced by the empirical estimate of the regression model is very high, yet individually many of the independent variables do not significantly affect the dependent variable.
b. Analyze the correlation matrix of the independent variables. If the correlation between independent variables is quite high (generally above 0,90), this is an indication of multicollinearity. The absence of high correlation between the independent variables does not, however, guarantee freedom from multicollinearity, which may arise from the combined effect of two or more independent variables.
c. Multicollinearity can also be assessed from: (1) the tolerance value and its reciprocal; (2) the Variance Inflation Factor (VIF). Both measures indicate how much of each independent variable is explained by the other independent variables. In simple terms, each independent variable in turn is treated as a dependent variable and regressed on the other independent variables. Tolerance measures the variability of a selected independent variable that is not explained by the other independent variables. A low tolerance value therefore corresponds to a high VIF, since VIF = 1/Tolerance. The cutoff commonly used to indicate multicollinearity is a tolerance value ≤ 0,10, or equivalently VIF ≥ 10 (Ghozali, 2011:106).
3. Heteroskedasticity