
There are several considerations that can be used as a guide to the choice of a particular k value: the stability of the system as k is increased, reasonable absolute values and signs of the estimated coefficients, and a reasonable variance of the regression as indicated by the residual sum of squares.

Several researchers have applied ridge regression to economic models. Brown and Beattie (1975) improved the estimation of the parameters of the Cobb-Douglas function by the use of ridge regression. They estimated Ruttan's Cobb-Douglas function to measure the effect of irrigation on the output of irrigated cropland in 1956. Overall, their findings indicate that ridge estimation appears to have promise for estimating Cobb-Douglas production functions (p. 31). Watson and White (1976) applied ridge regression to forecasting the demand for money under a changing term structure of interest rates in the USA. Their results indicated that ridge regression can be a better predictor than OLS in the presence of multicollinearity, and that ridge regression is useful not only in estimating the term structure of interest rates but also in many applications in larger-scale econometric models.

Sedlacek and Brooks (1976) postulated non-cognitive variables that are predictive of minority students' academic success. Tracey and Sedlacek (in press) developed a brief questionnaire, the Non-Cognitive Questionnaire (NCQ), to assess these variables and found eight non-cognitive factors to be highly predictive of grades and enrollment status for both whites and blacks, above and beyond SAT scores alone. However, these variables were also found to share a high degree of variance with the SAT scores, so there was a fairly high degree of multicollinearity. It was felt that these ten variables (SATV, SATM, and the eight non-cognitive factors) would be an ideal application for ridge regression. Tracey, Sedlacek, and Miars (1983) compared ridge regression and least squares regression in predicting freshman-year cumulative grade point average (GPA) from SAT scores and high school GPA. They found that ridge regression resulted in cross-validated correlations similar to those obtained with ordinary least squares regression. The failure of ridge regression to yield less shrinkage than OLS regression was postulated to have been due to the relatively low ratio of the number of predictors (p) to the sample size (n) used in the study. Faden (1978) found that the key condition under which ridge regression proved superior to OLS regression was a high p/n ratio.

III. Design of The Experiment

This study uses the formulation of Goldfeld's conventional demand-for-money function. The model is estimated using quarterly data for the sample period 1995:1 – 2006:3. The initial specification is derived from a partial adjustment model, so that in functional notation:

M = f(Y, M(t-1), IDP3, IR) ........................ (6)

where M is the real money stock at 2000 prices, narrowly defined as M1 plus time deposits; Y is real GNP; IDP3 is the three-month deposit interest rate; and IR is the short-term interest rate, the interbank call money rate (ICMR). The variables are as defined by Goldfeld and are replaced by their logarithms for estimation. This study also adds the lending rate (ICWL) as one more interest-rate variable in equation (6) to see the effect of adding a predictor.

The assumption in equation (6) is that all independent variables are exogenous. Monetarists, in particular, are likely to argue that this is not the case. Nevertheless, since the purpose here is to demonstrate a statistical procedure and not to enter the monetarist debate, this study retains the specification of equation (6) and hypothesizes positive coefficients on the money and income variables and negative coefficients on the interest rates.

The predictability of the model in the forecast period is tested by the forecast root mean square error (RMSE), defined simply as:

RMSE = sqrt[ (1/T) Σ ((M_t − M̂_t) / M_t)² ] ........................ (7)

where M_t is actual money demand, M̂_t is forecast money demand, and T is the number of observations in the forecast period.
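To make the estimation and forecast-evaluation steps concrete, the sketch below shows one way equation (6) can be fitted by OLS and by ridge regression for a given k, and the holdout forecasts scored with the RMSE of equation (7). It is an illustration only, not the authors' code: the data file, the column names, the eight-quarter holdout, and the use of scikit-learn's Ridge as a stand-in for the paper's ridge estimator are all assumptions.

```python
# Illustrative sketch (not the authors' code): fit equation (6) by OLS and by ridge
# regression with a chosen k, then score the holdout forecasts with the percentage
# RMSE of equation (7). File name, column names, and holdout length are assumed.
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression, Ridge

raw = pd.read_csv("money_demand_quarterly.csv")          # hypothetical quarterly data
df = np.log(raw[["M", "GNP", "IDP3", "ICMR", "ICWL"]])   # natural logs, as in the paper
df["M_lag"] = df["M"].shift(1)                           # M(t-1) from the partial adjustment model
df = df.dropna()

X_cols = ["GNP", "M_lag", "IDP3", "ICMR", "ICWL"]
train, test = df.iloc[:-8], df.iloc[-8:]                 # assumed 8-quarter forecast period

ols = LinearRegression().fit(train[X_cols], train["M"])
ridge = Ridge(alpha=0.02).fit(train[X_cols], train["M"]) # k = 0.02, the Model 1 value

def forecast_rmse(actual, forecast):
    """Percentage forecast RMSE in the spirit of equation (7)."""
    return np.sqrt(np.mean(((actual - forecast) / actual) ** 2))

for name, model in [("OLS", ols), ("Ridge, k = 0.02", ridge)]:
    print(name, forecast_rmse(test["M"].to_numpy(), model.predict(test[X_cols])))
```

Note that scikit-learn's Ridge penalizes the regressors on their raw scale, whereas the ridge estimator discussed in the paper is conventionally applied to standardized variables, so the sketch should be read as a schematic of the workflow rather than a replication.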

IV. Results

The summary of the correlations between the variables is presented in Table 1. The correlations among the interest rates are generally high, close to 1. GNP and the previous money stock M(t-1) are also highly correlated, close to 1.

Table 1. Correlations Between Variables and Descriptive Statistics (in Natural Logarithms)

            M        IDP3     ICMR     ICWL     GNP      M(t-1)
M           1.000   -0.488   -0.435   -0.382    0.980    0.997
IDP3       -0.488    1.000    0.936    0.955   -0.563   -0.535
ICMR       -0.435    0.936    1.000    0.921   -0.490   -0.492
ICWL       -0.382    0.955    0.921    1.000   -0.488   -0.438
GNP         0.980   -0.563   -0.490   -0.488    1.000    0.977
M(t-1)      0.997   -0.535   -0.492   -0.438    0.977    1.000

Mean       12.816    2.685    2.596    2.954   12.651   12.799
St. Dev.    0.597    0.500    0.694    0.248    0.624    0.593

From the table above it is clear that the independent variables are intercorrelated. Intercorrelation between independent variables this close to 1 will yield a greater variance of the estimators (Gujarati, 2003).

Table 2. Estimated Coefficients and t-values (t-values in parentheses)

             Model 1             Model 2             Model 3             Model 4
Variable     OLS      K=0.02     OLS      K=0.02     OLS      K=0.03     OLS      K=0.01
IDP3        -0.050   -0.046                                             -0.185   -0.124
           (-1.064) (-1.480)                                           (-1.952) (-1.889)
ICMR        -0.020   -0.039     -0.027   -0.045                         -0.190   -0.154
           (-0.697) (-2.062)   (-0.996) (-2.917)                       (-3.937) (-3.833)
ICWL         0.275    0.319      0.200    0.231      0.123    0.134      1.113    0.859
            (2.543)  (5.880)    (2.441)  (5.517)    (4.435)  (4.697)    (6.845)  (7.104)
GNP          0.265    0.374      0.300    0.401      0.213    0.364      0.968    0.947
            (4.160) (26.543)    (4.027)  (26.67)    (4.389) (16.340)    (42.97)  (42.72)
M(t-1)       0.714    0.591      0.731    0.550      0.772    0.605
           (11.020) (36.590)   (11.626) (36.527)   (32.200)  (27.46)
Intercept   -1.732   -0.187     -0.226    0.153      0.097    0.098     -1.732   -0.98
           (-1.178)  (0.739)   (-0.991)  (0.736)   (16.300)  (0.481)   (-4.282) (-2.254)
R²           0.996    0.998      0.996    0.997      0.996    0.997      0.985    0.992
RMSE         1.537    0.039      0.037    0.031      0.37     0.042      0.087    0.078

Dependent variable: Real M1 Plus Time Deposit. Independent variables: GNP: Real GNP; IDP3: Interest Deposits 3 Months; ICMR: Interest Call Money Rate; ICWL: Lending Rate.

Table 2 shows that the estimators in the ridge regressions predominantly have lower variance than those in the OLS regressions, as indicated by t-values that are higher in the ridge regressions than in OLS, except in model 4, which drops M(t-1). The RMSE results show that ridge regression can be a better predictor than OLS in the presence of multicollinearity: in all cases, the RMSE of the ridge regression is lower than that of OLS.

Figure 1. Relationship Between Beta Coefficients and k Value

Figure 1 shows a representative ridge trace for model 1. The trace shows the path of each coefficient as k increases. The ridge trace clearly shows one aspect of the multicollinearity problem: when two independent variables are highly correlated, the sum of their coefficients is likely to have a lower variance than either individual coefficient. This is reflected in the traces for GNP and M(t-1), and among the interest rates. At levels of k near zero, the coefficients move in opposite directions, revealing a fairly constant sum. The procedure for finding an optimal k from the ridge trace is to search over values of k greater than zero until the major instabilities of the coefficients have disappeared.
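A ridge trace like the one in Figure 1 can be produced with a short script. The sketch below is illustrative only, not the authors' code: the data file and column names are assumptions, and scikit-learn's Ridge on standardized regressors stands in for the paper's ridge estimator. It computes the standardized coefficients over a grid of k values and plots each coefficient's path against k.

```python
# Illustrative ridge trace in the spirit of Figure 1 (not the authors' code).
# Standardized coefficients are computed for a grid of k values and plotted against k.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.linear_model import Ridge
from sklearn.preprocessing import StandardScaler

raw = pd.read_csv("money_demand_quarterly.csv")          # hypothetical quarterly data
df = np.log(raw[["M", "GNP", "IDP3", "ICMR", "ICWL"]])   # natural logs, as in the paper
df["M_lag"] = df["M"].shift(1)
df = df.dropna()

X_cols = ["GNP", "M_lag", "IDP3", "ICMR", "ICWL"]
X = StandardScaler().fit_transform(df[X_cols])           # ridge is conventionally applied to standardized data
y = df["M"].to_numpy()

ks = np.linspace(0.001, 0.20, 50)                        # grid of k (ridge penalty) values
trace = np.array([Ridge(alpha=k).fit(X, y).coef_ for k in ks])

for i, name in enumerate(X_cols):                        # one line per coefficient, as in a ridge trace
    plt.plot(ks, trace[:, i], label=name)
plt.xlabel("k")
plt.ylabel("standardized coefficient")
plt.legend()
plt.show()
```

With such a trace in hand, an operational k is the smallest value at which the coefficient paths have settled down, following the rule described above.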
A Wald test on the coefficients is conducted for each model to test whether each parameter from the ridge regression differs statistically from the corresponding OLS estimate. Table 3 shows the Wald coefficient tests for each model:

Table 3. Wald Coefficient Tests

Variable    Model 1            Model 2            Model 3            Model 4
IDP3        F-stat = 0.008     -                  -                  F-stat = 0.411
            Do not reject H0                                         Do not reject H0
ICMR        F-stat = 0.138     F-stat = 0.432     -                  F-stat = 0.559
            Do not reject H0   Do not reject H0                      Do not reject H0
ICWL        F-stat = 0.779     F-stat = 0.152     F-stat = 0.069     F-stat = 2.427
            Do not reject H0   Do not reject H0   Do not reject H0   Do not reject H0
GNP         F-stat = 2.741     F-stat = 2.44      F-stat = 9.253     F-stat = 0.877
            Do not reject H0   Do not reject H0   Reject H0          Do not reject H0
M(t-1)      F-stat = 3.652     F-stat = 4.33      F-stat = 11.71     -
            Do not reject H0   Reject H0          Reject H0

where H0: βi = βj, with βi the beta coefficient from the OLS result and βj the corresponding beta coefficient from the ridge regression; significance level = 5%. A dash (-) indicates that the variable is not included in that model.

From Table 3 it can be seen that in model 1 and model 4 all coefficients of the ridge regression are statistically no different from the OLS results. Model 2 and model 3, with fewer predictors, have one and two estimated coefficients, respectively, that differ statistically from the OLS results. Consistent with Faden (1978), this research finds that ridge regression proves superior to OLS regression when the ratio of the number of predictors (p) to the sample size (n), the p/n ratio, is high; the bias between the ridge and OLS estimates seems to disappear as the p/n ratio increases. Rozeboom (1979) demonstrated that ridge regression may enhance prediction if the conditions are right, but if not, decreased accuracy will result. These conditions are: (a) when there is a high degree of multicollinearity, which was the case here; (b) when the samples on which the equations are based are small; and (c) when the p/n ratio is relatively large (Darlington, 1979; Dempster, Schatzoff, and Wermuth, 1977; Faden, 1978).
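One way a Wald comparison of the kind reported in Table 3 can be implemented is sketched below. This is an assumption about the procedure, not the authors' code: for each regressor, the unrestricted OLS fit is tested against the restriction that its coefficient equals the corresponding ridge point estimate, here taken from the Model 1, K=0.02 column of Table 2; the data file and column names are again hypothetical.

```python
# Illustrative Wald test (not the authors' code): test H0 that each OLS coefficient
# equals the corresponding ridge point estimate, using the unrestricted OLS fit.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

raw = pd.read_csv("money_demand_quarterly.csv")          # hypothetical quarterly data
df = np.log(raw[["M", "GNP", "IDP3", "ICMR", "ICWL"]])
df["M_lag"] = df["M"].shift(1)
df = df.dropna()

ols_res = smf.ols("M ~ GNP + M_lag + IDP3 + ICMR + ICWL", data=df).fit()

# Ridge point estimates to test against (the Model 1, K=0.02 column of Table 2)
ridge_beta = {"GNP": 0.374, "M_lag": 0.591, "IDP3": -0.046, "ICMR": -0.039, "ICWL": 0.319}

for name, beta in ridge_beta.items():
    result = ols_res.wald_test(f"{name} = {beta}", use_f=True)   # F test of one linear restriction
    print(name, result)
```

Each printed F statistic can then be compared with the 5% critical value, mirroring the reject / do-not-reject decisions reported in Table 3.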

V. Conclusion