D. Analysis Method
The analysis method used in this research is multiple linear regression analysis, performed with the SPSS 18 for Windows software. Data analysis was carried out through Descriptive Statistical Analysis, the Classical Assumption Tests, the Hypothesis Tests, and the Coefficient of Determination (R²) Test. The classical assumption tests comprise the Multicollinearity Test, Heteroscedasticity Test, Autocorrelation Test, and Normality Test. The hypothesis tests comprise the Partial Significance Test (t-test) and the Simultaneous Significance Test (F-test).
1. Descriptive Statistical Analysis
Descriptive analysis is used to provide an overview of the study variables. The descriptive statistics used include the mean, median, minimum, maximum, and standard deviation (Ghozali, 2013:19). Statistical analysis was used to test the quality of the data and to test the hypotheses; the statistical analyses performed were the classical assumption tests and the hypothesis tests. The data in this study were first analyzed with descriptive statistics.
Descriptive statistical testing in this research is essentially the process of transforming the research data into tabulated form so that they are easier to understand and interpret. Tabulation is generally used by researchers to obtain information about the characteristics of the primary variables in the research. The measurement applied in this descriptive statistical testing depends on the scale of measurement. Descriptive statistical testing gives a picture of, or describes, the data in terms of the median, mean, mode, standard deviation, variance, maximum, and minimum.
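Although the study computes these measures in SPSS, the same summary can be sketched with Python's standard `statistics` module. The sample values below are hypothetical, purely for illustration:

```python
import statistics

# Hypothetical sample of a study variable, e.g. ROA values (illustrative data only)
roa = [0.041, 0.055, 0.032, 0.060, 0.048, 0.051, 0.037, 0.044]

summary = {
    "mean": statistics.mean(roa),
    "median": statistics.median(roa),
    "min": min(roa),
    "max": max(roa),
    "std_dev": statistics.stdev(roa),       # sample standard deviation (n - 1)
    "variance": statistics.variance(roa),   # sample variance (n - 1)
}
print(summary)
```

For this illustrative sample the mean and median both equal 0.046; in the actual study these figures would come from the SPSS descriptive-statistics output.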
2. Classical Assumption Test
The classical assumption test aims to determine the relationships between the variables in the data. Before conducting the regression analysis, the classical assumptions must first be tested to determine whether such relationships exist. The Classical Assumption Test includes the Multicollinearity Test, Heteroscedasticity Test, Autocorrelation Test, and Normality Test.
a. Normality Test
The normality test aims to test whether, in the regression model, the residual (confounding) variable has a normal distribution. There are two ways to detect whether the residuals are normally distributed: graph analysis and statistical tests (Ghozali, 2013:160). The normality test can use a statistical tool such as the one-sample Kolmogorov-Smirnov (1-Sample K-S) Z test, with the following basis for decision-making (Ghozali, 2013:164):
1. If the Asymp. Sig. (2-tailed) value is less than 0.05, then H0 is rejected. This means that the residuals are not normally distributed.
2. If the Asymp. Sig. (2-tailed) value is more than 0.05, then H0 is accepted. This means that the residuals are normally distributed.
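The K-S D statistic behind this SPSS procedure can be sketched in stdlib Python. This is only an illustration of the statistic itself, not the SPSS output: SPSS additionally reports the Asymp. Sig. (2-tailed) p-value that the decision rules above compare to 0.05, and the residuals below are hypothetical:

```python
import math

def normal_cdf(x, mu, sigma):
    """CDF of the normal distribution, via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def ks_statistic(sample):
    """One-sample Kolmogorov-Smirnov D against a normal distribution
    fitted to the sample (mean and sample standard deviation)."""
    n = len(sample)
    mu = sum(sample) / n
    sigma = (sum((x - mu) ** 2 for x in sample) / (n - 1)) ** 0.5
    data = sorted(sample)
    d = 0.0
    for i, x in enumerate(data):
        cdf = normal_cdf(x, mu, sigma)
        # Largest gap between the empirical CDF and the fitted normal CDF,
        # checked on both sides of each jump
        d = max(d, (i + 1) / n - cdf, cdf - i / n)
    return d

# Hypothetical regression residuals (illustrative data only)
residuals = [-0.8, 0.3, -0.2, 1.1, 0.5, -0.6, 0.1, -0.4, 0.9, -0.9]
print(round(ks_statistic(residuals), 4))
```

A small D means the empirical distribution of the residuals stays close to the fitted normal curve; the formal accept/reject decision still rests on the p-value that SPSS computes from D.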
b. Multicollinearity Test
The multicollinearity test aims to test whether the regression model finds a correlation between the independent variables (Ghozali, 2013:105). A good regression model should have no correlation between the independent variables. The presence or absence of multicollinearity in the regression model can be detected from the tolerance value and the Variance Inflation Factor (VIF): multicollinearity is indicated by a tolerance value ≤ 0.10 or a VIF ≥ 10. Both measurements indicate how much of each independent variable is explained by the other independent variables.
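In the special case of exactly two predictors, the R² from regressing one predictor on the other equals their squared Pearson correlation, so tolerance and VIF follow directly. The sketch below uses that shortcut with hypothetical board-size data; the general multi-predictor case (as in SPSS) requires a full auxiliary regression per variable:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient (stdlib-only, illustrative)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical data for two predictors (illustrative only)
bod = [4, 5, 6, 5, 7, 8, 6, 9]   # board of directors size
boi = [2, 2, 3, 3, 3, 4, 3, 5]   # independent board size

# With two predictors: R^2 of the auxiliary regression = r^2
r2 = pearson_r(bod, boi) ** 2
tolerance = 1.0 - r2             # share of bod NOT explained by boi
vif = 1.0 / tolerance            # Variance Inflation Factor
print(round(tolerance, 3), round(vif, 3))
```

Against the rule of thumb from the text, a tolerance ≤ 0.10 or a VIF ≥ 10 would flag multicollinearity.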
c. Autocorrelation Test
The autocorrelation test aims to test whether, in a linear regression model, there is a correlation between the error in period t and the error in period t-1 (the previous period) (Ghozali, 2013:110). Autocorrelation is diagnosed by testing the Durbin-Watson (DW) value (Ghozali, 2013:111), with the following basis for decision-making:
1) If 0 < DW < dL, there is positive autocorrelation.
2) If dL ≤ DW ≤ dU, or 4-dU ≤ DW ≤ 4-dL, the conclusion is uncertain.
3) If dU < DW < 4-dU, there is no autocorrelation.
4) If 4-dL < DW < 4, there is negative autocorrelation.
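The DW statistic itself is the ratio of the sum of squared successive residual differences to the sum of squared residuals, which is straightforward to sketch (the residuals below are hypothetical; the dL and dU bounds used in the decision rules come from the Durbin-Watson tables, not from this formula):

```python
def durbin_watson(residuals):
    """Durbin-Watson statistic: sum of squared successive differences
    of the residuals divided by the sum of squared residuals.
    Ranges from 0 to 4; values near 2 suggest no autocorrelation."""
    num = sum((residuals[t] - residuals[t - 1]) ** 2
              for t in range(1, len(residuals)))
    den = sum(e ** 2 for e in residuals)
    return num / den

# Hypothetical regression residuals (illustrative data only)
e = [0.5, -0.3, 0.2, -0.4, 0.1, 0.3, -0.2, 0.4, -0.1, 0.2]
dw = durbin_watson(e)
print(round(dw, 3))  # close to 2, consistent with little autocorrelation
```

The alternating signs in this toy series push DW above 2, toward the negative-autocorrelation side of the table.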
d. Heteroscedasticity Test
According to Ghozali (2013:139), the aim of the heteroscedasticity test is to test whether, in the regression model, the variance of the residuals is unequal from one observation to another. If the variance of the residuals is constant from one observation to another, it is called homoscedasticity; if it differs, it is called heteroscedasticity.
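One simple numeric way to screen for unequal residual variance is a Goldfeld-Quandt-style comparison: order the residuals by fitted value, split them in half, and compare the variances of the halves. This is an illustrative alternative to the scatterplot inspection typically used in SPSS, with hypothetical residuals:

```python
import statistics

def variance_ratio(residuals):
    """Split residuals (assumed ordered by fitted value) into halves and
    compare their variances; a ratio far above 1 hints that the spread of
    the residuals grows with the fitted values (heteroscedasticity).
    A Goldfeld-Quandt-style sketch, not the graphical SPSS check."""
    mid = len(residuals) // 2
    first, second = residuals[:mid], residuals[mid:]
    return statistics.variance(second) / statistics.variance(first)

# Hypothetical residuals whose spread widens with the fitted values
resid = [0.1, -0.1, 0.2, -0.2, 0.3, -0.5, 0.8, -0.9, 1.1, -1.2]
print(round(variance_ratio(resid), 2))
```

A ratio near 1 is consistent with homoscedasticity; the deliberately widening toy series above yields a ratio well above 1.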
3. Coefficient of Determination (R²)
The coefficient of determination (R²) essentially measures how far the independent variables [the Size of the Board of Directors (BOD), the Size of the Board of Independents (BOI), Managerial Ownership (MO), and Institutional Ownership (IO)] can explain the dependent variable [Firm Performance (ROA)]. The coefficient of determination lies between 0 (zero) and 1 (one); a value near 1 (one) means that the independent variables provide nearly all the information required to predict the dependent variable (Ghozali, 2013:97).
The fundamental weakness of the coefficient of determination is its bias towards the number of independent variables included in the model: each added variable increases R², regardless of whether that variable significantly influences the dependent variable. Therefore, this study uses the Adjusted R² value, which may rise or fall when an independent variable is added to the model.
If the Adjusted R² value is 1, it means that the fluctuations of the dependent variable are fully explained by the independent variables and that no other factors cause fluctuations in the dependent variable. Adjusted R² ranges between 0 and 1. A value approaching 1 means a stronger ability of the independent variables to explain the dependent variable; conversely, an Adjusted R² value closer to 0 means a weaker ability of the independent variables to explain the fluctuations of the dependent variable (Ghozali, 2013:97). The criteria of correlation according to Sarwono (2014:100) are:
Table 3.1 Criteria of Correlation Coefficient
VALUE           INFORMATION
0               No correlation between variables
> 0 – 0.25      Very weak correlation
> 0.25 – 0.5    Fairly strong correlation
> 0.5 – 0.75    Strong correlation
> 0.75 – 0.99   Very strong correlation
1               Perfect correlation
Source: Sarwono (2014:100)
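Both R² and its adjusted form can be sketched directly from their definitions: R² = 1 − SS_res/SS_tot, and Adjusted R² applies a penalty for the number of predictors. The observed and fitted values below are hypothetical, and k = 4 mirrors the four independent variables of this study:

```python
def r_squared(y, y_hat):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean_y = sum(y) / len(y)
    ss_res = sum((a - b) ** 2 for a, b in zip(y, y_hat))   # unexplained variation
    ss_tot = sum((a - mean_y) ** 2 for a in y)             # total variation
    return 1.0 - ss_res / ss_tot

def adjusted_r_squared(r2, n, k):
    """Adjusted R^2 penalises extra predictors: n observations, k predictors."""
    return 1.0 - (1.0 - r2) * (n - 1) / (n - k - 1)

# Hypothetical observed ROA and fitted values from a 4-predictor model
y     = [0.04, 0.05, 0.03, 0.06, 0.05, 0.04, 0.06, 0.03]
y_hat = [0.042, 0.048, 0.033, 0.057, 0.049, 0.041, 0.058, 0.032]

r2 = r_squared(y, y_hat)
adj = adjusted_r_squared(r2, n=len(y), k=4)
print(round(r2, 3), round(adj, 3))
```

As the text notes, the adjusted value is always at most the raw R² and falls further behind it as predictors are added without explanatory gain.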
4. Multiple Regression Analysis