Under the null hypothesis H0, the G-statistic follows a chi-square distribution with p degrees of freedom (Hosmer and Lemeshow 1989, in Sutisna 2002), so a chi-square test serves as the test of fit of the model. The Wald test is used to test the significance of each parameter β_i, where i = 1, 2, ..., p, partially. This test is used when the null hypothesis H0 for the G-statistic is rejected. The formula of the Wald test is:
W_i = β̂_i / SE(β̂_i),

where β̂_i is the estimator of the coefficient β_i and SE(β̂_i) is its standard error. The null hypothesis that a regression coefficient is zero is rejected if |W_i| > W_(α/2). Under the null hypothesis, the Wald statistic follows the standard normal distribution (Hosmer and Lemeshow 1989, in Sutisna 2002).
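As a minimal sketch, the Wald test above can be computed directly from a coefficient estimate and its standard error; the numbers used below are hypothetical illustrative values, not estimates from this study:

```python
from statistics import NormalDist

def wald_test(beta_hat, se, alpha=0.05):
    """Wald test for H0: beta_i = 0.

    W_i = beta_hat / SE(beta_hat); reject H0 when |W_i| > z_(alpha/2),
    since W_i is standard normal under the null hypothesis.
    """
    w = beta_hat / se
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)      # two-sided critical value
    p_value = 2 * (1 - NormalDist().cdf(abs(w)))      # two-sided p-value
    return w, p_value, abs(w) > z_crit

# Hypothetical coefficient estimate and standard error
w, p, reject = wald_test(beta_hat=0.85, se=0.30)
```

For a two-sided test at α = 0.05 the critical value is about 1.96, so a ratio of 0.85/0.30 ≈ 2.83 leads to rejection of the null hypothesis.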
2.1.3. Model Interpretation
SPSS offers a variety of statistical tests. Usually, though, overall significance is tested using what SPSS calls the Model Chi-square, which is derived from the likelihood of observing the actual data under the assumption that the fitted model is accurate. It is convenient to use −2 times the natural logarithm of this likelihood, called −2LL. The difference between −2LL for the best-fitting (full) model and −2LL for the null-hypothesis model (the initial chi-square, in which all the β values are set to zero) is distributed as chi-square, with degrees of freedom equal to the number of predictors; this difference is the Model Chi-square that SPSS refers to. Very conveniently, the difference between −2LL values for models with successive terms added also has a chi-square distribution, so a stepwise procedure can use chi-square tests to find out whether adding one or more extra predictors significantly improves the fit of the model (Department of Psychology, University of Exeter 1997).
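The Model Chi-square comparison of −2LL values can be sketched as follows; the −2LL figures and the predictor count below are hypothetical, and the closed-form survival function used here applies only to an even number of degrees of freedom:

```python
from math import exp, factorial

def chi2_sf_even_df(x, df):
    """Survival function P(X > x) of a chi-square variable with even df.

    For df = 2k there is a closed form: exp(-x/2) * sum_{i<k} (x/2)^i / i!.
    """
    if df % 2 != 0 or df < 2:
        raise ValueError("closed form requires an even df >= 2")
    k = df // 2
    return exp(-x / 2) * sum((x / 2) ** i / factorial(i) for i in range(k))

def model_chi_square(neg2ll_null, neg2ll_full, n_predictors):
    """Model Chi-square: the drop in -2LL from the null to the fitted model,
    compared against chi-square with df = number of predictors."""
    g = neg2ll_null - neg2ll_full
    p_value = chi2_sf_even_df(g, n_predictors)
    return g, p_value

# Hypothetical -2LL values for a null model and a two-predictor model
g, p = model_chi_square(neg2ll_null=138.6, neg2ll_full=120.2, n_predictors=2)
```

A drop of 18.4 on 2 degrees of freedom gives a very small p-value, so the two predictors jointly improve the fit.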
The Model Chi-square measures the improvement in fit that the explanatory variables provide compared to the null model. It is a likelihood ratio test which reflects the difference between the error when the independents are not known (initial chi-square) and the error when the independents are included in the model (deviance). When the probability of the Model Chi-square is ≤ 0.05, we reject the null hypothesis that knowing the independents makes no difference in predicting the dependent in logistic regression (CHASS-NCSU 2006). The coefficient of the logit model can be formulated as β_i = g(x+1) − g(x). The parameter β_i depicts the change in the logit function g(x) for a one-unit change of the independent variable x, and is called the log odds (Hosmer and Lemeshow 1989, in Amanati 2001). For the significance of individual predictors, SPSS also offers what it calls the Wald statistic, together with a corresponding significance level. This form of the Wald statistic also has a chi-square distribution (Department of Psychology, University of Exeter 1997): the squared ratio of the logistic coefficient β to its standard error SE equals the Wald statistic. If the Wald statistic is significant (i.e., its p-value is less than 0.05), then the parameter is significant in the model. Coefficient interpretation itself is done for the significant predictors by examining the value of each coefficient. If a coefficient is positive, the tendency toward Y = 1 is greater at X = 1 than at X = 0. According to Hosmer and Lemeshow (1989), in Amanati (2001), the coefficient of the logit model can also be written β_i = logit π(x+1) − logit π(x).
The parameter β_i depicts the change in the logit of π(x) for a one-unit change of the independent variable X and is related to the odds ratio. The log odds is the difference between two values of the logit, and is noted as:

ln[ψ(a, b)] = g(x=a) − g(x=b) = β_i (a − b)

One measure of risk level is the odds ratio (Freeman 1987, in Amanati 2001). For dichotomous variables, the estimator of the odds ratio is:

ψ = [π(1) / (1 − π(1))] / [π(0) / (1 − π(0))]
ln ψ = g(1) − g(0)
ln ψ = β_i (1 − 0)
ψ = exp(β_i)

As a result, if a − b = 1, then ψ = exp(β_i). This odds ratio can be interpreted as the tendency of Y = 1 at x = 1 being ψ times that at x = 0 (Amanati 2001).
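The odds-ratio relations above can be checked numerically; the probabilities π(1) and π(0) used below are hypothetical illustrative values:

```python
from math import exp, log

def odds_ratio_from_probs(p1, p0):
    """psi = [pi(1)/(1 - pi(1))] / [pi(0)/(1 - pi(0))] for a dichotomous X."""
    return (p1 / (1 - p1)) / (p0 / (1 - p0))

def odds_ratio_from_coef(beta_i, a=1, b=0):
    """psi = exp(beta_i * (a - b)); for a - b = 1 this reduces to exp(beta_i)."""
    return exp(beta_i * (a - b))

# Hypothetical success probabilities at X = 1 and X = 0
psi = odds_ratio_from_probs(0.6, 0.3)           # odds ratio from probabilities
beta_i = log(0.6 / 0.4) - log(0.3 / 0.7)        # beta_i = g(1) - g(0)
psi_from_beta = odds_ratio_from_coef(beta_i)    # exp(beta_i) recovers psi
```

Both routes give the same value, illustrating that exponentiating the logistic coefficient of a dichotomous predictor yields its odds ratio.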
2.1.4. Logistic coefficients and correlation