
Analyzing the correlation matrix among the independent variables: a very high correlation (usually above 0.90) between independent variables indicates the presence of multicollinearity. However, a relatively moderate correlation among independent variables does not necessarily mean there is no multicollinearity, because it can also arise from the combined effect of two or more independent variables. Multicollinearity can also be detected from (1) the tolerance value and (2) its reciprocal, the variance inflation factor (VIF). Both measures indicate how much of each independent variable is explained by the other independent variables. In a simple interpretation, each independent variable is in turn treated as a dependent variable and regressed on the other independent variables. Tolerance measures the variability of the chosen independent variable that is not explained by the other independent variables; therefore, a small tolerance value corresponds to a high VIF value, because VIF = 1/Tolerance. A commonly used cut-off indicating multicollinearity is a tolerance value ≤ 0.10, which is equivalent to a VIF value ≥ 10. Every researcher should determine the level of collinearity that can be tolerated. In addition, a regression model can be said to be free from multicollinearity if the correlation coefficients among the independent variables are lower than 0.5; if the correlation is very strong, multicollinearity exists. If multicollinearity occurs, Santoso (2010: 207) suggests:

- Dropping one of the variables. For instance, if independent variables A and B are strongly correlated with each other, the researcher may decide whether A or B should be dropped from the regression model.
- Using an advanced method, such as Bayesian regression or ridge regression.
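The tolerance and VIF calculation described above can be sketched in plain NumPy: each independent variable is regressed on the remaining ones, and tolerance is the unexplained share of its variance. The function name `vif_and_tolerance` and the simulated data are illustrative, not part of the cited procedure.

```python
import numpy as np

def vif_and_tolerance(X):
    """For each column of X, regress it on the remaining columns and
    return (tolerance, VIF), where tolerance = 1 - R^2 and VIF = 1/tolerance."""
    n, k = X.shape
    results = []
    for j in range(k):
        y = X[:, j]
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(n), others])   # add an intercept
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        r2 = 1 - (resid @ resid) / ((y - y.mean()) ** 2).sum()
        tol = 1 - r2
        results.append((tol, 1 / tol))
    return results

# Example with a deliberately collinear third predictor
rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = rng.normal(size=200)
x3 = x1 + 0.05 * rng.normal(size=200)   # nearly a copy of x1
X = np.column_stack([x1, x2, x3])
for name, (tol, vif) in zip(["x1", "x2", "x3"], vif_and_tolerance(X)):
    print(f"{name}: tolerance={tol:.3f}, VIF={vif:.1f}")
```

Here x3 is almost a duplicate of x1, so both fall below the tolerance ≤ 0.10 (VIF ≥ 10) cut-off, while the independent x2 does not.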

b. Autocorrelation Test

The autocorrelation test aims to test whether, in a linear regression model, there is correlation between the disturbance in period t and the disturbance in period t-1 (the previous period) (Santoso, 2010: 213). If such correlation occurs, there is an autocorrelation problem. It arises because sequential observations over time are related to one another. The problem appears because the residual (disturbance) is not independent from one observation to the next (Ghozali, 2006: 99). It is often found in time-series data because the disturbance of an individual or group tends to influence the disturbance of the same individual or group in the following period (Ghozali, 2006: 100). In cross-sectional data, the autocorrelation problem occurs relatively rarely because the disturbances of different observations come from different individuals or groups (Ghozali, 2006: 99). A good regression model is one that is free from autocorrelation (Santoso, 2010: 213). This research uses the Durbin-Watson test as suggested by Santoso (2010). To detect autocorrelation, the following decision rules are used:

- A D-W value lower than -2 indicates positive autocorrelation.
- A D-W value between -2 and +2 indicates no autocorrelation.
- A D-W value greater than +2 indicates negative autocorrelation.
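For reference, the Durbin-Watson statistic is conventionally computed from the regression residuals as the sum of squared successive differences divided by the sum of squared residuals, with a value near 2 suggesting no autocorrelation. A minimal sketch (the function name `durbin_watson` and the simulated residuals are illustrative):

```python
import numpy as np

def durbin_watson(residuals):
    """Durbin-Watson statistic: sum of squared successive residual
    differences divided by the sum of squared residuals."""
    e = np.asarray(residuals, dtype=float)
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

# Independent (uncorrelated) residuals give a statistic close to 2
rng = np.random.default_rng(1)
print(round(durbin_watson(rng.normal(size=500)), 2))
```

Constant residuals (perfect positive autocorrelation) drive the statistic toward 0, while sign-alternating residuals (negative autocorrelation) drive it toward 4.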

c. Heteroscedasticity Test

The heteroscedasticity test aims to test whether there is a difference in variance between the residual of one observation and that of another observation (Santoso, 2010: 207). If the variance remains constant, it is called homoscedasticity; if it changes or differs, it is called heteroscedasticity (Santoso, 2010: 207). Most cross-