IR_t   = SBI interest rate in period t
IR_t-1 = SBI interest rate in period t-1
D. Data Analysis Technique
Regression analysis will be used to test the hypotheses formulated for this study. Three independent variables were entered: inflation, interest rate, and exchange rate. Multiple regression will determine the significance, direction, degree, and strength of the relationship between the dependent and independent variables (Sekaran, 2006). Multiple regression is the most sophisticated extension of correlation and is used to explore the predictive ability of a set of independent variables on a dependent variable (Pallant, 2001). Three hypotheses were generated; each gives direction for assessing the statistical relationship between the dependent and independent variables. The conventional significance level of 5% (p = 0.05) is used as evidence of a statistical association between the dependent and independent variables.
To obtain the best research model, the researcher must first perform several pre-tests. The tests are: the normality test, the classical assumption tests (heteroscedasticity test, autocorrelation test, and multicollinearity test), and the hypothesis test.
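As a minimal sketch of the estimation step described above, the following Python fragment fits a multiple regression of a dependent series on the study's three predictors. The data here are synthetic placeholders (the variable names and coefficient values are assumptions for illustration, not the study's actual data), and ordinary least squares is computed directly with numpy rather than a dedicated econometrics package.

```python
import numpy as np

# Illustrative only: synthetic data standing in for the study's three
# predictors (inflation, interest rate, exchange rate) and a dependent series.
rng = np.random.default_rng(0)
n = 60
inflation = rng.normal(5.0, 1.0, n)
interest_rate = rng.normal(7.0, 0.5, n)
exchange_rate = rng.normal(9000.0, 300.0, n)
y = (2.0 + 0.5 * inflation - 0.3 * interest_rate
     + 0.001 * exchange_rate + rng.normal(0.0, 0.2, n))

# Design matrix with an intercept column; OLS coefficients via least squares.
X = np.column_stack([np.ones(n), inflation, interest_rate, exchange_rate])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)  # [intercept, b_inflation, b_interest, b_exchange]
```

With well-behaved data, the estimated coefficients recover the values used to generate the series; in the actual study the significance of each coefficient would then be judged against the 5% level.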
1. Normality Test
In statistics, normality tests are used to determine whether a data set is well-modeled by a normal distribution or not, or to compute how likely an
underlying random variable is to be normally distributed. An informal approach to testing normality is to compare a histogram of the residuals to a normal
probability curve. The actual distribution of the residuals (the histogram) should be bell-shaped and resemble the normal distribution.
There are several methods to detect whether data is normally distributed: the histogram of residuals, the normal probability plot, and the Jarque-Bera test. In this research, the researcher uses the test proposed by Jarque and Bera, commonly known as the Jarque-Bera test, to obtain the most accurate assessment of the data.
a. Jarque-Bera Test of Normality
The JB test of normality is an asymptotic, or large-sample, test. It is also based on the OLS residuals (Gujarati, 2004). This test first computes the skewness and kurtosis measures of the OLS residuals and uses the following test statistic:
JB = n [ S^2/6 + (K - 3)^2/24 ]    (3.8)
where:
n = sample size
S = skewness coefficient
K = kurtosis coefficient
For a normally distributed variable, S = 0 and K = 3. Therefore, the JB test of normality is a test of the joint hypothesis that S and K are 0 and 3, respectively. In that case the value of the JB statistic is expected to be 0.
Regarding this, the hypotheses of the Jarque-Bera test are stated as follows:
H_0 : Data is normally distributed
H_a : Data is not normally distributed
To detect whether the variable is normally distributed, one compares the value of the Jarque-Bera statistic with the chi-square (X^2) table value with 2 degrees of freedom, as follows:
a. If JB statistic > X^2 table value, the data is not normally distributed, and thus we reject H_0.
b. If JB statistic < X^2 table value, the data is normally distributed, and thus we do not reject H_0.
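The decision rule above can be sketched directly from equation (3.8). The function below computes the JB statistic from a residual series using the population skewness and kurtosis coefficients; the two sample arrays are illustrative stand-ins, not the study's residuals.

```python
import numpy as np

def jarque_bera(residuals):
    """Jarque-Bera statistic: JB = n * (S^2/6 + (K-3)^2/24), eq. (3.8)."""
    r = np.asarray(residuals, dtype=float)
    n = r.size
    m = r.mean()
    s2 = ((r - m) ** 2).mean()               # population variance
    S = ((r - m) ** 3).mean() / s2 ** 1.5    # skewness coefficient
    K = ((r - m) ** 4).mean() / s2 ** 2      # kurtosis coefficient
    return n * (S ** 2 / 6.0 + (K - 3.0) ** 2 / 24.0)

rng = np.random.default_rng(1)
normal_sample = rng.normal(0.0, 1.0, 1000)
skewed_sample = rng.exponential(1.0, 1000)

# Compare each JB value with the chi-square critical value with
# 2 degrees of freedom at the 5% level (about 5.99).
print(jarque_bera(normal_sample))  # small for near-normal data
print(jarque_bera(skewed_sample))  # large, so normality is rejected
```

For the exponential sample, S is near 2 and K near 9, so the JB statistic is very large and H_0 is rejected; for the normal sample both terms are close to zero.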
2. Classical Assumption Test
The Gaussian, standard, or classical linear regression model CLRM, which is the cornerstone of most econometric theory, makes 10 assumptions
underlying of Ordinary Least Square method Gujarati, 2004, p.65. This research will focus on its 6 basic assumption in context of the two-variable regression
model.
Assumption 1 : Linear regression model. The regression model is linear in the parameters.
Assumption 2 : X values are fixed in repeated sampling. Values taken by the regressor X are considered fixed in repeated samples. More technically, X is assumed to be nonstochastic.
Assumption 3 : Zero mean value of disturbance u_i. Given the value of X, the mean, or expected, value of the random disturbance term u_i is zero. Technically, the conditional mean value of u_i is zero. Symbolically, we have
E(u_i | X_i) = 0    (3.9)
Assumption 4 : Homoscedasticity or equal variance of u_i. Given the value of X, the variance of u_i is the same for all observations. That is, the conditional variances of u_i are identical. Symbolically, we have
var(u_i | X_i) = σ^2    (3.10)
Assumption 5 : No autocorrelation between the disturbances. Given any two X values, X_i and X_j (i ≠ j), the correlation between any two u_i and u_j (i ≠ j) is zero. Symbolically, we have
cov(u_i, u_j | X_i, X_j) = 0    (3.11)
Assumption 6 : Zero covariance between u_i and X_i, or E(u_i X_i) = 0. By assumption,
cov(u_i, X_i) = E(u_i X_i) = 0    (3.12)
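Two of these assumptions can be illustrated numerically. In the sketch below (synthetic data, assumed for illustration only), the fitted OLS residuals have essentially zero mean and zero sample covariance with the regressor, which is the sample analogue of Assumptions 3 and 6 and follows mechanically from the least-squares normal equations.

```python
import numpy as np

# Illustrative check of Assumptions 3 and 6 on OLS residuals:
# by construction of least squares, fitted residuals sum to zero
# and are orthogonal to every column of the design matrix.
rng = np.random.default_rng(2)
n = 200
x = rng.uniform(0.0, 10.0, n)
u = rng.normal(0.0, 1.0, n)      # homoscedastic disturbances
y = 1.0 + 2.0 * x + u

X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
residuals = y - X @ beta

print(residuals.mean())            # ~ 0 (Assumption 3)
print(np.cov(residuals, x)[0, 1])  # ~ 0 (Assumption 6)
```

Note that these quantities are zero only for the fitted residuals, not the true disturbances; the assumptions themselves concern the unobservable u_i and are tested indirectly through the diagnostic tests listed earlier.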
As noted earlier, given the assumptions of the classical linear regression model, the least-squares estimates possess some ideal or optimum properties. These properties are contained in the well-known Gauss-Markov theorem. To understand this theorem, we need to consider the best linear unbiasedness property of an estimator. The OLS estimator is said to be a best linear unbiased estimator (BLUE) if the following hold (Brooks, 2002):
1. It is linear, that is, a linear function of a random variable, such as the