Significance Tests for Parameter Predictor

Model parameters can be estimated by maximum likelihood, iteratively reweighted least squares, and discriminant analysis (Hosmer and Lemeshow, 1989 in Amanati, 2001). Testing the parameters of a logistic regression is based on the assumption that each parameter β_i is normally distributed (Freeman, 1987 in Amanati, 2001). In this study, the maximum likelihood method is used to estimate the parameters β_i.
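As an illustration of the maximum likelihood approach described above, the following sketch estimates logistic regression coefficients by minimizing the negative log-likelihood numerically. The data, variable names, and optimizer choice are illustrative assumptions, not taken from the study.

```python
# Hypothetical sketch: maximum likelihood estimation of logistic regression
# coefficients beta_i by minimizing the negative log-likelihood.
# The simulated data and true coefficients are illustrative only.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + one predictor
true_beta = np.array([-0.5, 1.2])
p = 1.0 / (1.0 + np.exp(-(X @ true_beta)))
y = rng.binomial(1, p)

def neg_log_likelihood(beta, X, y):
    # log L(beta) = sum_i [ y_i * eta_i - ln(1 + exp(eta_i)) ], eta_i = x_i' beta,
    # which equals sum_i [ y_i ln pi_i + (1 - y_i) ln(1 - pi_i) ]
    eta = X @ beta
    return -np.sum(y * eta - np.log1p(np.exp(eta)))

res = minimize(neg_log_likelihood, x0=np.zeros(2), args=(X, y), method="BFGS")
beta_hat = res.x
print(beta_hat)  # should lie near the true values used to simulate the data
```

With a reasonably large sample, the maximized likelihood yields estimates close to the coefficients used to generate the data, which is the consistency property the maximum likelihood method relies on.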

2.1.2. Significance Tests for Parameter Predictor

There are several models in logistic regression: a constant-only (intercept-only) model that includes no predictors, called the null model; an incomplete model that includes the constant plus some predictors; a full model that includes the constant plus all predictors; and a perfect, hypothetical model that would provide an exact fit of expected frequencies to observed frequencies if only the right set of predictors were measured (Tabachnick and Fidell, 2001).

In SPSS, two model steps are produced by default. The first, called Step 0, includes no predictors and only the intercept; it is also called the null model. The second is the model with the predictors in it; in this case, it is the full model specified in the logistic regression command.

When there is no reason to assign some predictors higher priority than others, statistical criteria can be used to determine the order of entry in preliminary research. That is, if a reduced set of predictors is wanted but there is no preference among them, a stepwise method can be used to obtain the reduced set (Tabachnick and Fidell, 2001). This study uses the Forward Stepwise (Likelihood Ratio) method in SPSS to obtain that statistical ordering. SPSS allows a logistic regression model to have different steps; the difference between the steps is the set of predictors that are included. This is similar to blocking variables into groups and then entering the groups into the equation one at a time. In a stepwise analysis, variables are added to (or eliminated from) the model in an iterative process, and the fit of the model is tested after each step to ensure that it still adequately fits the data. When no more variables can be added or eliminated, the analysis is complete.

The parameters β_0, β_1, …, β_p of the logistic regression model can be estimated using the maximum likelihood method.
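The forward stepwise procedure described above can be sketched in code. This is a minimal illustration, not the SPSS implementation: it starts from the null model (Step 0) and, at each step, enters the candidate predictor with the largest likelihood-ratio (G) statistic, provided its chi-square p-value passes an assumed entry threshold. The data, threshold, and helper names are hypothetical.

```python
# Hypothetical sketch of forward stepwise selection by likelihood ratio,
# in the spirit of SPSS's "Forward Stepwise (Likelihood Ratio)" option.
# The entry threshold (alpha_enter) and simulated data are assumptions.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import chi2

def fit_logit(X, y):
    """Return the maximized log-likelihood of a logistic model (MLE via BFGS)."""
    def nll(beta):
        eta = X @ beta
        return -np.sum(y * eta - np.log1p(np.exp(eta)))
    res = minimize(nll, np.zeros(X.shape[1]), method="BFGS")
    return -res.fun

rng = np.random.default_rng(1)
n = 400
Z = rng.normal(size=(n, 3))              # three candidate predictors; only Z[:, 0] matters
eta = -0.2 + 1.5 * Z[:, 0]
y = rng.binomial(1, 1 / (1 + np.exp(-eta)))

selected, remaining, alpha_enter = [], [0, 1, 2], 0.05
ll_current = fit_logit(np.ones((n, 1)), y)   # Step 0: intercept-only (null) model
while remaining:
    trials = []
    for j in remaining:
        cols = [np.ones(n)] + [Z[:, k] for k in selected + [j]]
        ll = fit_logit(np.column_stack(cols), y)
        trials.append((2 * (ll - ll_current), j, ll))  # G for adding predictor j
    G, j, ll = max(trials)
    if chi2.sf(G, df=1) < alpha_enter:           # enter the best candidate if significant
        selected.append(j); remaining.remove(j); ll_current = ll
    else:
        break
print(selected)
```

Because only the first simulated predictor actually influences the outcome, it should be the first variable entered; the loop then stops once no remaining candidate improves the likelihood significantly.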
The likelihood function of the model is given by (Sutisna, 2002):

L(\beta) = \prod_{i=1}^{n} \pi(x_i)^{y_i} \, [1 - \pi(x_i)]^{1 - y_i}

For simplicity, the likelihood function can be written in log-likelihood form:

\ln L(\beta) = \sum_{i=1}^{n} \left\{ y_i \ln \pi(x_i) + (1 - y_i) \ln [1 - \pi(x_i)] \right\}

Once an adequate model has been obtained, the next step is to test the significance of the parameter estimates. Two types of test can be used: the G-test, a likelihood-ratio-based test statistic, and the Wald test.

The G-test is used to test the significance of all parameters in the model jointly. The G statistic is

G = -2 \ln \left( \frac{L_0}{L_p} \right)

where L_0 is the likelihood of the model without independent variables and L_p is the likelihood of the model with independent variables, with the hypotheses:

H_0: \beta_1 = \beta_2 = \beta_3 = \dots = \beta_p = 0
H_1: at least one \beta_i is not equal to zero.

Under the null hypothesis H_0, the G statistic follows a chi-square distribution with p degrees of freedom (Hosmer and Lemeshow, 1989 in Sutisna, 2002), so a chi-square test serves as the test of the fit of the model.

The Wald test is used to test the significance of each parameter \beta_i, i = 1, 2, 3, …, p, partially. This test is used when the null hypothesis H_0 of the G-test is rejected. The Wald statistic is

W_i = \frac{\hat{\beta}_i}{SE(\hat{\beta}_i)}

where \hat{\beta}_i is the estimator of the coefficient \beta_i and SE(\hat{\beta}_i) is its standard error. The null hypothesis that the regression coefficient \beta_i is zero is rejected if |W_i| > W_{\alpha/2}. Under the null hypothesis, the Wald statistic follows a standard normal distribution (Hosmer and Lemeshow, 1989 in Sutisna, 2002).
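The two tests above can be computed directly from fitted null and full models. The following sketch is illustrative: the data are simulated, and the standard errors for the Wald statistics are assumed to come from the inverse of the observed information matrix, which is the usual maximum likelihood approach.

```python
# Hypothetical sketch: the G (likelihood-ratio) test for the model as a whole
# and Wald tests for individual coefficients. Simulated data; standard errors
# from the inverse observed information matrix (an assumed, standard choice).
import numpy as np
from scipy.optimize import minimize
from scipy.stats import chi2, norm

rng = np.random.default_rng(2)
n = 300
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
y = rng.binomial(1, 1 / (1 + np.exp(-(X @ np.array([0.3, 1.0, 0.0])))))

def nll(beta, X, y):
    eta = X @ beta
    return -np.sum(y * eta - np.log1p(np.exp(eta)))

res_full = minimize(nll, np.zeros(3), args=(X, y), method="BFGS")
res_null = minimize(nll, np.zeros(1), args=(X[:, :1], y), method="BFGS")

# G = -2 ln(L0 / Lp) = 2 (ln Lp - ln L0); res.fun is the negative log-likelihood
G = 2 * (res_null.fun - res_full.fun)
p_value_G = chi2.sf(G, df=2)          # chi-square with p = 2 degrees of freedom

# Wald: W_i = beta_hat_i / SE(beta_hat_i), compared with the standard normal
beta_hat = res_full.x
pi_hat = 1 / (1 + np.exp(-(X @ beta_hat)))
info = X.T @ (X * (pi_hat * (1 - pi_hat))[:, None])   # observed information matrix
se = np.sqrt(np.diag(np.linalg.inv(info)))
W = beta_hat / se
p_values_W = 2 * norm.sf(np.abs(W))
print(G, p_value_G, W, p_values_W)
```

Because one simulated slope is large and one is zero, the joint G-test rejects H_0, after which the Wald statistics indicate which individual coefficient drives that rejection, mirroring the two-stage procedure described in the text.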

2.1.3. Model Interpretation