According to Duwi Priyatno (2010: 90), the decision on whether an item can be used is usually based on a significance test of the correlation coefficient with a minimum correlation of 0.30, meaning that an item is considered valid if its correlation with the total score is greater than 0.30. The validity and reliability tests were conducted by distributing questionnaires to 60 respondents residing in North Jakarta; the questionnaire contained 39 question statements to be answered by the respondents, and the data were then processed using the software Statistical Product and Service Solution (SPSS) 20 for Windows.
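As an illustration of the 0.30 item-total criterion, the following is a minimal sketch in Python with made-up responses (the study itself ran this in SPSS 20 on 60 respondents and 39 statements; the data below are hypothetical):

```python
import numpy as np

# Hypothetical item scores: 6 respondents x 4 Likert items (illustrative only).
items = np.array([
    [5, 4, 5, 2],
    [4, 4, 4, 3],
    [2, 2, 1, 5],
    [3, 3, 3, 1],
    [5, 5, 4, 2],
    [1, 2, 2, 4],
], dtype=float)

total = items.sum(axis=1)  # total score per respondent

# Item-total correlation for each item; valid if it exceeds 0.30.
r_values = [np.corrcoef(items[:, j], total)[0, 1] for j in range(items.shape[1])]

for j, r in enumerate(r_values, start=1):
    print(f"item {j}: r = {r:.2f} -> {'valid' if r > 0.30 else 'dropped'}")
```

In this toy data the fourth item moves opposite to the total score, so it fails the 0.30 criterion while the first items pass.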
3. Classical Assumption Tests
To determine whether the regression equation obtained is linear and can be used for forecasting, the following classical assumption tests must be performed:
a. Normality
The normality test aims to test whether the residual (confounding) variable in the regression model has a normal distribution (Imam Ghozali, 2007). The methods used in this research are the graphical method and the one-sample Kolmogorov-Smirnov test. A variable is said to be normal if the data points in the plot spread around the diagonal line and follow its direction. According to Duwi Priyatno (2012: 147), the one-sample
Kolmogorov-Smirnov test is used to determine whether data follow a normal, Poisson, uniform, or exponential distribution. The residual has a normal distribution if the significance value is more than 0.05.
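The one-sample Kolmogorov-Smirnov check can be sketched as follows with SciPy, using simulated residuals in place of the study's actual regression residuals (which were tested in SPSS 20):

```python
import numpy as np
from scipy import stats

# Simulated residuals standing in for the regression residuals (illustrative).
rng = np.random.default_rng(42)
residuals = rng.normal(loc=0.0, scale=1.0, size=60)

# Standardize, then compare against the standard normal distribution.
z = (residuals - residuals.mean()) / residuals.std(ddof=1)
stat, p_value = stats.kstest(z, "norm")

print(f"KS statistic = {stat:.3f}, p = {p_value:.3f}")
print("residuals look normal" if p_value > 0.05 else "normality rejected")
```

A significance (p) value above 0.05 means the hypothesis of normally distributed residuals is not rejected, matching the decision rule in the text.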
b. Multicollinearity
According to Malhotra (2007), a multicollinearity test is used to show whether there is a linear correlation among the independent variables in the regression model. A good regression model should have no correlation between the independent variables; if one independent variable is perfectly correlated with another, this is perfect multicollinearity. Duwi Priyatno (2012: 151) states that multicollinearity is a condition in which the regression model contains a perfect or near-perfect correlation between the independent variables (a correlation of 1 or close to 1). Common multicollinearity test methods are to look at the tolerance and Variance Inflation Factor (VIF) values in the regression model, or to compare the individual coefficients of determination (r²) with the simultaneous coefficient of determination (R²). A regression model is free from multicollinearity when the VIF is less than 10 and the tolerance is more than 0.1.
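The VIF and tolerance rule can be sketched in plain NumPy: each predictor is regressed on the others, and VIF = 1 / (1 − R²). The predictors below are simulated stand-ins for the study's three independent variables:

```python
import numpy as np

# Simulated predictors (illustrative; the study computed VIF in SPSS 20).
rng = np.random.default_rng(1)
n = 60
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)   # moderately correlated with x1
x3 = rng.normal(size=n)
X = np.column_stack([x1, x2, x3])

def vif(X, j):
    """VIF_j = 1 / (1 - R^2) from regressing column j on the other columns."""
    y = X[:, j]
    others = np.delete(X, j, axis=1)
    A = np.column_stack([np.ones(len(y)), others])   # include an intercept
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    r2 = 1 - resid.var() / y.var()
    return 1.0 / (1.0 - r2)

for j in range(X.shape[1]):
    v = vif(X, j)
    print(f"X{j + 1}: VIF = {v:.2f}, tolerance = {1 / v:.2f}")
```

With only moderate correlation between x1 and x2, all VIF values stay well below 10 (tolerance above 0.1), so the rule in the text would declare the model free from multicollinearity.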
c. Heteroscedasticity
According to Imam Ghozali (2007), the heteroscedasticity test aims to test whether, in the regression model, the variance of the residuals is unequal from one observation to another. If the residual variance remains the same from one observation to another, it is called homoscedasticity; if it differs, it is called heteroscedasticity. To analyze whether heteroscedasticity is present, several methods can be used:
1) Observe the plot of the predicted value of the dependent variable (ZPRED) against its studentized residual (SRESID). The presence or absence of heteroscedasticity can be detected from the presence or absence of a certain pattern in the scatter plot between SRESID and ZPRED, where the Y axis is the predicted Y and the X axis is the studentized residual (Y predicted minus Y actual).
2) The basis of the analysis: if there is a certain pattern, such as points that form a regular shape (wavy, widening, then narrowing), this indicates that heteroscedasticity has occurred; if there is no clear pattern and the points spread above and below zero on the Y axis, there is no heteroscedasticity.
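The ZPRED and SRESID quantities that SPSS plots can be computed by hand, as sketched below with simulated data (the hat-matrix formula for internally studentized residuals is standard; the data and coefficients are made up):

```python
import numpy as np

# Simulated regression data (illustrative only).
rng = np.random.default_rng(7)
n = 60
X = rng.normal(size=(n, 3))
A = np.column_stack([np.ones(n), X])                 # design matrix with intercept
y = 1.0 + X @ np.array([0.5, 0.3, 0.2]) + rng.normal(scale=0.5, size=n)

beta, *_ = np.linalg.lstsq(A, y, rcond=None)
fitted = A @ beta
resid = y - fitted

# ZPRED: standardized predicted values.
zpred = (fitted - fitted.mean()) / fitted.std(ddof=1)

# SRESID: internally studentized residuals via the hat matrix H = A (A'A)^-1 A'.
H = A @ np.linalg.inv(A.T @ A) @ A.T
h = np.diag(H)
s2 = resid @ resid / (n - A.shape[1])
sresid = resid / np.sqrt(s2 * (1 - h))

# In SPSS one would scatter-plot these two series and look for a pattern;
# as a rough numerical stand-in, check the correlation of |SRESID| with ZPRED.
r = np.corrcoef(np.abs(sresid), zpred)[0, 1]
print(f"corr(|SRESID|, ZPRED) = {r:.3f}  (near 0 suggests homoscedasticity)")
```

Plotting `sresid` against `zpred` and seeing no wave, fan, or funnel shape corresponds to the "no clear pattern" decision rule above.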
4. Regression Analysis
In this research, the researcher uses multiple regression analysis in order to bring in additional predictor variables (X1, X2, X3) alongside one criterion variable (Y). The goal is still the same: to construct a regression equation to estimate values of the criterion variable, but now from several predictor variables, namely customer value, customer satisfaction, and trust in brand, while still measuring the closeness of the estimated relationship. The objective of introducing the additional variables is basically to improve predictions of the criterion variable. Since the model becomes more complicated, the researcher needs notation for a regression model with three predictor variables, which can be written as:
Y = a + b1X1 + b2X2 + b3X3 + e

Where:
Y = Customer Loyalty (criterion variable)
a = intercept parameter in the multiple regression equation
b1, b2, b3 = regression coefficients
X1 = Customer Value (predictor)
X2 = Customer Satisfaction (predictor)
X3 = Trust in Brand (predictor)
e = standard error term in the prediction of Y from the predictor variables X1, X2, X3
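Estimating this model can be sketched with ordinary least squares in NumPy; the data below are simulated stand-ins for the 60 respondents, and the true coefficients are invented for illustration (the study fitted the model in SPSS 20):

```python
import numpy as np

# Simulated data: columns are X1 (value), X2 (satisfaction), X3 (trust).
rng = np.random.default_rng(3)
n = 60
X = rng.normal(size=(n, 3))
y = 2.0 + X @ np.array([0.6, 0.4, 0.3]) + rng.normal(scale=0.5, size=n)

A = np.column_stack([np.ones(n), X])     # intercept column gives the parameter a
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
a, b1, b2, b3 = coef

resid = y - A @ coef
r_squared = 1 - resid.var() / y.var()

print(f"Y = {a:.2f} + {b1:.2f} X1 + {b2:.2f} X2 + {b3:.2f} X3")
print(f"R^2 = {r_squared:.3f}")
```

The fitted coefficients recover values near the invented ones, and R² measures the closeness of the estimated relationship mentioned above.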
The following explanations relate to the problem above:
a. The Coefficient of Correlation Test