b. Reliability Test
The reliability test measures the stability and consistency of respondents' answers to the questions related to the constructs; the questions represent dimensions of a variable and are arranged in the form of a questionnaire. An instrument is said to be reliable if a respondent's answers to its questions are consistent or stable over time. The reliability test is used to verify that the variables used are free of measurement error, so that they produce consistent results even when tested several times. Reliability is calculated using the Cronbach's Alpha statistical test. A construct or variable is said to be reliable if its Cronbach's Alpha value exceeds 0.60 (Nunnally, in Ghozali, 2005).
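The Cronbach's Alpha criterion above can be sketched in Python with NumPy. The function name cronbach_alpha and the small Likert-scale dataset are illustrative assumptions, not data from the study:

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's Alpha for a (respondents x items) matrix of questionnaire scores."""
    k = scores.shape[1]                         # number of items
    item_vars = scores.var(axis=0, ddof=1)      # sample variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of respondents' total scores
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

# Hypothetical answers of 5 respondents to 3 Likert-scale questions.
scores = np.array([
    [4, 4, 5],
    [3, 3, 4],
    [5, 4, 5],
    [2, 3, 3],
    [4, 5, 4],
])

alpha = cronbach_alpha(scores)
print(f"Cronbach's Alpha = {alpha:.3f}")  # 0.857 > 0.60, so the construct is deemed reliable
```

Here the alpha of roughly 0.857 clears the 0.60 threshold, so this hypothetical construct would be judged reliable under the rule cited above.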
E. Classical Assumption Test
1. Normality Test
The normality test aims to test whether, in the regression model, the confounding or residual variable has a normal distribution (Ghozali, 2005: 110). Data that are good and fit for use in research are those with a normal distribution. Normality of the data can be assessed in several ways, including by examining the normal probability plot (p-plot). A variable is said to be normally distributed if the data points spread around the diagonal line and follow its direction.
2. Multicollinearity Test
According to Ghozali (2005: 91), the multicollinearity test aims to test whether the regression model contains correlation among the independent variables. A good regression model should have no correlation between its independent variables. If the independent variables are correlated, they are not orthogonal; orthogonal variables are independent variables whose correlations with one another are zero. The presence or absence of multicollinearity in a regression model can be detected as follows:
a. The R2 value produced by the empirical regression model is very high, yet individually many of the independent variables do not significantly affect the dependent variable.
b. Analyzing the correlation matrix of the independent variables. A fairly high correlation between variables (generally above 0.90) is an indication of multicollinearity. The absence of high correlations between the independent variables, however, does not mean the model is free of multicollinearity, which can also arise from the combined effect of two or more independent variables.
c. Multicollinearity can also be assessed from (1) the tolerance value and its counterpart (2) the variance inflation factor (VIF). Both measurements indicate how much of each independent variable is explained by the other independent variables. In simple terms, each independent variable is in turn treated as a dependent variable and regressed against the other independent variables. Tolerance measures the variability of a selected independent variable that is not explained by the other independent variables, so a low tolerance value corresponds to a high VIF value, because VIF = 1 / tolerance. The cut-off value commonly used to indicate multicollinearity is a tolerance value below 0.10, or equivalently a VIF value above 10. Each investigator must decide what level of collinearity can be tolerated; for example, a tolerance value of 0.10 corresponds to a collinearity level of 0.95. Although multicollinearity can be detected with the tolerance and VIF values, we still cannot tell which independent variables are mutually correlated.
3. Heteroscedasticity Test
According to Ghozali (2005: 105), the heteroscedasticity test aims to test whether, in the regression model, the variance of the residuals differs from one observation to another. If the variance of the residuals remains constant across observations, this is called homoscedasticity; if it differs, it is called heteroscedasticity. A good regression model is homoscedastic, that is, heteroscedasticity does not occur. Most cross-sectional data contain heteroscedastic situations, because such data pool observations of various sizes (small, medium, and large).
One way to detect the presence or absence of heteroscedasticity is to examine a scatterplot of the predicted values of the dependent variable (ZPRED) against the residuals (SRESID), where the Y axis is the predicted Y and the X axis is the residual (predicted Y minus actual Y) that has been studentized. If the points form a specific pattern, such as a regular wave that widens and then narrows, heteroscedasticity is indicated; if there is no clear pattern and the points spread above and below 0 on the Y axis, heteroscedasticity does not occur.
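The classical assumption diagnostics described above — tolerance/VIF for multicollinearity and residual inspection for heteroscedasticity — can be sketched with NumPy. The helper names (r_squared, tolerance_vif) and the simulated data are illustrative assumptions; a real analysis would normally use a statistics package (e.g. statsmodels' variance_inflation_factor) rather than hand-rolled OLS:

```python
import numpy as np

def r_squared(y, X):
    """R^2 of an ordinary least squares regression of y on X (intercept added)."""
    Xc = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xc, y, rcond=None)
    resid = y - Xc @ beta
    tss = ((y - y.mean()) ** 2).sum()
    return 1.0 - (resid ** 2).sum() / tss

def tolerance_vif(X):
    """Regress each independent variable on the others: tolerance = 1 - R^2, VIF = 1 / tolerance."""
    result = []
    for j in range(X.shape[1]):
        others = np.delete(X, j, axis=1)
        tol = 1.0 - r_squared(X[:, j], others)
        result.append((tol, 1.0 / tol))
    return result

# Simulated predictors: x3 is nearly a copy of x1, creating severe collinearity.
rng = np.random.default_rng(42)
x1 = rng.normal(size=100)
x2 = rng.normal(size=100)
x3 = x1 + 0.01 * rng.normal(size=100)
X = np.column_stack([x1, x2, x3])

diagnostics = tolerance_vif(X)
for name, (tol, vif) in zip(["x1", "x2", "x3"], diagnostics):
    flag = "multicollinear" if tol < 0.10 else "ok"   # tolerance < 0.10 <=> VIF > 10
    print(f"{name}: tolerance={tol:.4f}  VIF={vif:.1f}  -> {flag}")

# Heteroscedasticity check: fit the full model, then in practice scatter-plot
# the (studentized) residuals against the (standardized) fitted values and
# look for a pattern such as a widening or narrowing band.
y = 1.0 + 2.0 * x1 - x2 + rng.normal(size=100)
Xc = np.column_stack([np.ones(100), X])
beta, *_ = np.linalg.lstsq(Xc, y, rcond=None)
fitted = Xc @ beta
residuals = y - fitted
```

In this sketch x1 and x3 each get a very large VIF (they explain one another almost perfectly), while x2 stays near the minimum of 1, matching the tolerance < 0.10 / VIF > 10 cut-off discussed above.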
F. Multiple Regression Analysis