their influence on the dependent variable, namely stock price. Multiple regression determines the relationship between the dependent and independent variables, as well as the direction, degree, and strength of that relationship. Multiple regression is the most sophisticated extension of correlation analysis and is used to explore the predictive ability of a set of independent variables on a dependent variable. Four hypotheses are then generated, which give direction to the assessment of the statistical relationship between the dependent and independent variables. To obtain the best research model, the researcher should first perform pre-tests: the classical assumption tests, followed by regression analysis, which comprises the hypothesis tests.
1. Classical Assumption Test
a. Multicollinearity Test
The multicollinearity test aims to detect whether there is correlation among the independent variables in a regression model (Santoso, 2010: 203). A good regression model should not exhibit correlation among the independent variables (Santoso, 2010: 204). If the independent variables correlate with one another, they are not orthogonal (Ghozali, 2006: 96). An orthogonal variable is an independent variable whose correlation with the other independent variables equals zero (Ghozali, 2006: 96).
To detect whether multicollinearity occurs in a regression model, Ghozali (2006: 96) suggests that the researcher consider the following:
The R² value of the estimated empirical regression model is high, but individually the independent variables do not significantly influence the dependent variable.
Analyzing the correlation matrix among the independent variables. A high correlation (generally above 0.90) between independent variables indicates multicollinearity. However, a relatively modest correlation among the independent variables does not guarantee its absence, since multicollinearity can also arise from the combined effect of two or more independent variables.
Multicollinearity can also be detected from (1) the tolerance value and its reciprocal, (2) the variance inflation factor (VIF). Both measures indicate how much of each independent variable is explained by the other independent variables. In simple terms, each independent variable is treated in turn as a dependent variable and regressed on the other independent variables. Tolerance measures the variability of the chosen independent variable that is not explained by the other independent variables. Therefore, a small tolerance score corresponds to a high VIF score, because VIF = 1 / Tolerance. A commonly used cut-off indicating multicollinearity is a tolerance score ≤ 0.10, which is equivalent to a VIF score ≥ 10. Each researcher should determine the level of collinearity that can be tolerated.
In addition, a regression model can be considered free from multicollinearity if the correlation coefficients among the independent variables are lower than 0.5; if the correlation is stronger than this, multicollinearity exists. Furthermore, if multicollinearity occurs, Santoso (2010: 207) suggests:
Dropping one of the variables. For instance, if independent variables A and B are strongly correlated with each other, the researcher may decide whether variable A or variable B should be dropped from the regression model.
Using an advanced method, such as Bayesian regression or ridge regression.
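The ridge remedy mentioned above can be sketched as follows: the L2 penalty shrinks and stabilizes coefficient estimates when two predictors are strongly correlated. This is a minimal illustration on simulated data using scikit-learn; the variables A and B and the penalty strength are assumptions.

```python
# Sketch of ridge regression as a remedy for multicollinearity:
# two strongly correlated predictors (A and B), simulated data.
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge

rng = np.random.default_rng(2)
n = 100
a = rng.normal(size=n)
b = a + rng.normal(scale=0.05, size=n)          # A and B strongly correlated
X = np.column_stack([a, b])
y = a + b + rng.normal(scale=0.5, size=n)

ols = LinearRegression().fit(X, y)
ridge = Ridge(alpha=1.0).fit(X, y)              # alpha is the L2 penalty weight
print("OLS coefficients:  ", ols.coef_)         # can be large and unstable
print("Ridge coefficients:", ridge.coef_)       # shrunk, more stable
```

By construction the ridge solution never has a larger coefficient norm than the ordinary least-squares solution, which is what stabilizes the estimates under collinearity.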